Is there a difference between the C# "int" and System.Int32 ???

  • Thread starter: C Newby
That is as I thought.

But I was wondering ... so far as I remember, there was an intrinsic integer
type in Java. Is there any reason to think that the .NET implementation of
integers would suffer from performance problems (in comparison to Java),
even if only slightly? Or is the Java implementation in fact the same and
I'm just remembering it incorrectly?

TIA//
 
C Newby said:
That is as I thought.

But I was wondering ... so far as I remember, there was an intrinsic
integer type in Java. Is there any reason to think that the .NET
implementation of integers would suffer from performance problems (in
comparison to Java), even if only slightly? Or is the Java
implementation in fact the same and I'm just remembering it
incorrectly?

No. System.Int32 is still treated specially by the compiler and has dedicated IL opcodes, so there's no performance penalty. Each type in .NET is a "proper" type though, including value types.
 
Thanks Jon. I looked around a little on MSDN as to how Int32 is handled by
the runtime, but I didn't see anything. Could you either elaborate on the
subject some more or direct me to a reference? Thanks again//
 
C Newby said:
Thanks Jon. I looked around a little on MSDN as to how Int32 is handled by
the runtime, but I didn't see anything. Could you either elaborate on the
subject some more or direct me to a reference? Thanks again//

Download the CLI spec (ECMA-335) from www.ecma-international.org and
look at the various opcodes which deal with 32-bit integers
specifically.

Alternatively, compile a C# program which does things like adding ints
together, and then run ildasm on it and look at the IL :)
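As a quick sketch of what that experiment looks like: the program below adds two ints, and the comment shows roughly the IL a debug build produces (exact opcodes vary with compiler version and optimization settings, so treat it as illustrative):

```csharp
// AddInts.cs -- compile with csc, then inspect the output with ildasm
class AddInts
{
    static void Main()
    {
        int a = 2;       // C# "int" is an alias for System.Int32
        int b = 3;
        int c = a + b;   // compiles to dedicated 32-bit integer opcodes
        System.Console.WriteLine(c);
    }
}

// The body of Main disassembles to roughly:
//   ldc.i4.2      // push the 32-bit constant 2
//   stloc.0
//   ldc.i4.3      // push the 32-bit constant 3
//   stloc.1
//   ldloc.0
//   ldloc.1
//   add           // native integer addition -- no method call, no boxing
//   stloc.2
//   ...
```

The point is that the addition is a single `add` instruction on the evaluation stack, not a call into some Int32 class, which is why value-type arithmetic is as fast as a Java primitive.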
 
C Newby said:
That is as I thought.

But I was wondering ... so far as I remember, there was an intrinsic integer
type in Java. Is there any reason to think that the .NET implementation of
integers would suffer from performance problems (in comparison to Java),
even if only slightly? Or is the Java implementation in fact the same and
I'm just remembering it incorrectly?

TIA//

Just to elaborate...

Java has a strict distinction between primitive types and non-primitive
types. You cannot create new primitive types, and you cannot cast a
primitive type to an object reference.

In .NET the type system contains both value types (which perform like
primitive types, are passed on the stack and have value semantics) and
reference types, which behave like Java objects.

In doing this, .NET gets better performance for things like Decimal and
database-specific types which can be implemented as value types, and
eliminates the annoying manual boxing in Java where each primitive type has
an object wrapper type.
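For comparison, here is the manual boxing David mentions, as it looked in the Java of that era (before autoboxing arrived in Java 5, the round trip through the wrapper class had to be written out explicitly):

```java
public class BoxingDemo {
    public static void main(String[] args) {
        int primitive = 42;                          // intrinsic primitive type

        // Manual boxing: wrap the primitive in an Integer object
        // so it can be used where an object reference is required.
        Integer boxed = Integer.valueOf(primitive);

        // Manual unboxing: extract the primitive value again.
        int unboxed = boxed.intValue();

        System.out.println(unboxed);                 // prints 42
    }
}
```

In .NET the same conversion is a boxing operation built into the runtime for every value type, so no hand-written wrapper classes are needed.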

David
 