1) Do you believe that 0 is a real value for an int type? Because
precisely the argument you're using to say that null is the equivalent
of an unassigned value (for reference types) applies to 0 for int
variables - it just happens to be the default value.
Back when I started using C, Kernighan and Ritchie C, that is, which is the
"purest" form, when you declared an integer without assigning it, it didn't
have a value of 0. It had the value of whatever was in the memory location
that the compiler assigned it to. This was because C was not initially
designed to "fix things up" for inexperienced developers.
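A minimal C sketch of what that looked like (illustrative only; reading an
uninitialized local is formally undefined behavior in modern C, and current
compilers will warn about it):

#include <stdio.h>

int main(void)
{
    int x;   /* declared but never assigned */

    /* Classic K&R-era compilers simply reused whatever bits were already
       sitting at x's memory location, so this printed "garbage". */
    printf("x = %d\n", x);
    return 0;
}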
The fact that the C# compiler gives an integer a default value is a
convenience built into the language. So, again, we're talking English
semantics here rather than reality. In reality "null" signifies "nothing."
Now, a computer cannot represent nothing without using some number, so 0 (\0
in C) has traditionally been the "value" used to signify nothing, which is
the source of the confusion. Take a look at the dictionary definition of
"null" -
http://dictionary.reference.com/search?q=null.
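To make that concrete, here is a small C sketch of my own (the names are
purely illustrative): the machine has no way to store "nothing," so it
spells it with zero, whether as a zero-initialized global, a null pointer,
or the NUL character.

#include <stdio.h>

int zero_by_default;   /* file-scope objects are zero-initialized by the language */

int main(void)
{
    char *nothing = NULL;     /* the null pointer: refers to no object */
    char terminator = '\0';   /* the NUL character, numerically zero */

    printf("zero_by_default = %d\n", zero_by_default);   /* prints 0 */
    printf("nothing == NULL: %d\n", nothing == NULL);    /* prints 1 */
    printf("terminator == 0: %d\n", terminator == 0);    /* prints 1 */
    return 0;
}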
With the advent of higher-level programming languages, more abstraction away
from numbers was added to the mix. For example, in C, 0 is false, and
*anything else* is true. In C, you can write:
while (a + b) { ... }
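Fleshed out into a complete program, that looks something like this (the
starting values of a and b are my own, picked only so the loop eventually
stops):

#include <stdio.h>

int main(void)
{
    int a = 1;
    int b = 1;

    while (a + b) {     /* any nonzero sum counts as "true" */
        printf("a + b = %d, so the loop keeps going\n", a + b);
        b = b - 1;      /* once a + b reaches 0, the condition is "false" */
    }
    return 0;
}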
Logically, this makes perfect sense. Since true is not false, and false is
not true, if 0 is false, what is 2? Since 2 is not 0, it is true. However,
the abstraction introduced by these higher-level languages also muddied the
water. These languages are all compiled down to machine instructions, which
operate on numbers, so everything they express ultimately reduces to numeric
operations. The abstraction proves to be useful, but
at a price, which is confusion. Today, false is not treated as a number by
the developer, and the concepts of true and false have been abstracted to a
higher level which ultimately must be processed numerically by the computer.
The same holds true for "null." The current abstraction level of programming
languages, C# in this discussion, can cause confusion when talked about in
human language, because the abstractions only approximate human ideas. So,
again, we're just having a semantic argument here, which I'm not sure is
useful.
So, I will stand behind my statement that null signifies nothing, which
means that it signifies the absence of a value. One cannot signify nothing
with nothing; it must be represented by something. Hence, the confusion.
But, to put it more colloquially, how about trying to understand what I'm
saying, rather than picking apart my words? We all have different ways of talking
about things. But it is not the words we use that matter, as much as the
ideas they represent. Human language is not a programming language like C#,
where every token always means exactly the same thing, and there is no room
for interpretation. On the contrary, human language is all about
interpretation, nuance, and gist.
I only hope the OP isn't entirely befuddled by this point! ;-)
--
HTH,
Kevin Spencer
Microsoft MVP
Printing Components, Email Components,
FTP Client Classes, Enhanced Data Controls, much more.
DSI PrintManager, Miradyne Component Libraries:
http://www.miradyne.net