Larry David
Ok, first of all, let's get the obvious stuff out of the way. I'm an idiot. So please indulge me for a moment. Consider it an act of "community service"....
What does "64bit" mean to your friendly neighborhood C# programmer? The standard answer I get from computer sales people is: "It means that the CPU can process 64 bits of data at a time instead of 32." Ok... I guess I *kind* of understand what that means at an intuitive level, but what does it mean in practice? Consider the following code:
long l = 1;
for (int i = 0; i < 5; i++) {
    Console.WriteLine("Emo says " + l);
    l += 1;
}
How would this code run differently on a 64 bit processor as opposed to a 32 bit processor? Will it run twice as fast since the instructions are processed "64 bits at a time"? Will the 64 bit (long) variable 'l' be incremented more efficiently since now it can be done in a single processor instruction?
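For what it's worth, the only concrete difference I've managed to observe myself is that I can ask the runtime what mode my process is running in -- a pointer is 4 bytes in a 32 bit process and 8 bytes in a 64 bit one. (This little check is just my own experiment, not anything official:)

```csharp
using System;

class BitnessCheck {
    static void Main() {
        // IntPtr.Size is 4 in a 32 bit process and 8 in a 64 bit process.
        Console.WriteLine("Pointer size: " + IntPtr.Size + " bytes");
        Console.WriteLine("Running as 64 bit? " + (IntPtr.Size == 8));
    }
}
```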
Now I want to ask about memory. I think this is the one benefit of 64bit computing that I DO understand. In a 32bit system, a memory pointer can only address 2^32 bytes (4 GB) of process memory, versus 2^64 bytes (wow!) in a 64bit system. I can see how this would be a major advantage for databases like SQL Server, which could easily allocate over 4 gigs of memory -- but is this a real advantage for a typical C# application?
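Just to sanity-check my own arithmetic here -- 2^32 bytes works out to 4 GB, which is why a 32bit pointer tops out there:

```csharp
using System;

class AddressSpace {
    static void Main() {
        long bytes32 = 1L << 32;  // 2^32
        Console.WriteLine(bytes32);                                 // 4294967296
        Console.WriteLine(bytes32 / (1024 * 1024 * 1024) + " GB");  // 4 GB
        // 2^64 doesn't fit in a long as an unsigned byte count, but as a
        // double it's roughly 1.8e19 bytes -- about 16 exabytes.
        Console.WriteLine(Math.Pow(2, 64));
    }
}
```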
Finally, I want to ask about interoperability. If I compile a 32bit C# app, will the ADO.NET code that it contains be able to communicate with the 64bit version of SQL Server?
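In case it matters, here's the kind of code I mean -- plain ADO.NET over a connection string (the server and database names below are made up for illustration). My guess is that the client talks to SQL Server over a network protocol, so the bitness of each side shouldn't matter, but I'd love confirmation:

```csharp
using System;
using System.Data.SqlClient;

class AdoNetQuestion {
    static void Main() {
        // "MyServer" and "MyDb" are placeholders, not a real instance.
        string connStr = "Server=MyServer;Database=MyDb;Integrated Security=true";
        using (SqlConnection conn = new SqlConnection(connStr)) {
            conn.Open();
            // @@VERSION reports the server's edition, including 32/64 bit.
            SqlCommand cmd = new SqlCommand("SELECT @@VERSION", conn);
            Console.WriteLine(cmd.ExecuteScalar());
        }
    }
}
```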
Thanks for helping a newbie,
Larry