typedef

Mr.Tickle

I want to give an alias for a type in C# so I can say SOMETYPEID refers to
int etc. How can I do this in C#?

Something like typedef long SOMETYPEID; in C++.
 
Hi Mr. Tickle,

you can use the *using* directive:

using SOMETYPEID = System.Int32;

Because C# doesn't have anything like header files, and these aliases are
not embedded as metadata in the assemblies, you can use the alias only in
the file where you put this declaration.
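For example, here's a minimal sketch of a file-scoped alias (the names are just placeholders):

```csharp
// The alias is visible only in this source file.
using SOMETYPEID = System.Int32;

SOMETYPEID id = 42;   // SOMETYPEID is just int under another name
int plain = id;       // no conversion needed; they are the same type
System.Console.WriteLine(id == plain);  // True
```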

HTH
B\rgds
100
 
Hi Mr. Tickle,

Mr.Tickle said:
I want to give an alias for a type in C# so I can say SOMETYPEID refers to
int etc. How can I do this in C#?

Something like typedef long SOMETYPEID; in C++.

There is no typedef per se. You could do something like this, though:

using SOMETYPEID = System.Int64;

The primary drawback to the above is that it must be included in every
code file where you would want to use SOMETYPEID.

Regards,
Dan
 
Mr.Tickle,

You can do this using the "using" directive, like so:

// Indicate SOMETYPEID is an alias for int.
using SOMETYPEID = int;

Or:

using SOMETYPEID = System.Int32;

However, this is different from a typedef in C++, as it doesn't create
new type information. It is just an alias for the type.
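A quick way to see that no new type is created (a small sketch):

```csharp
using SOMETYPEID = System.Int32;

// The compiler erases the alias entirely; reflection sees only System.Int32.
System.Console.WriteLine(typeof(SOMETYPEID) == typeof(int));  // True
```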

Hope this helps.
 
ok, thanks.

Another one: is it better to use int or System.Int32 in code? Using the
built-in keyword or the library type?

Kind of like using int or INT in C++.

Nicholas Paldino said:
Mr.Tickle,

You can do this using the "using" directive, like so:

// Indicate SOMETYPEID is an alias for int.
using SOMETYPEID = int;

Or:

using SOMETYPEID = System.Int32;

However, this is different from a typedef in C++, as it doesn't create
new type information. It is just an alias for the type.

Hope this helps.


--
- Nicholas Paldino [.NET/C# MVP]
- nick(dot)paldino=at=exisconsulting<dot>com



Mr.Tickle said:
I want to give an alias for a type in C# so I can say SOMETYPEID refers to
int etc. How can I do this in C#?

Something like typedef long SOMETYPEID; in C++.
 
Mr. Tickle,

It really doesn't matter, as it is just an alias. They are the same thing.

-Nicholas Paldino [.NET/C# MVP]
-nick(dot)paldino=at=exisconsulting<dot>com

ok, thanks.

Another one: is it better to use int or System.Int32 in code? Using the
built-in keyword or the library type?

Kind of like using int or INT in C++.

 
Thus spake Mr.Tickle:
Anotherone, is it better to use int or System.Int32 in code? using
the built in keyword or the library type?

You should use the alias defined by your language and let the compiler
handle the mapping to the appropriate type within the Framework.
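For instance, the keyword and the framework type are interchangeable with no cast (a small sketch):

```csharp
int a = 5;
System.Int32 b = a;  // no conversion: int *is* System.Int32
System.Console.WriteLine(typeof(int).FullName);  // System.Int32
```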
 
Hi, Mr.Tickle
Another one: is it better to use int or System.Int32 in code? Using the
built-in keyword or the library type?

You can use either. *int* is just a language keyword that serves as a
shortcut for System.Int32.
Using *int* will make the code nicer and easier to read for any C#
programmer.
You may consider using System.Int32, though, if you share the code with
someone who works in another language. That person may be used to a
different keyword for integer values, or in some languages *int* may mean
something else (System.Int64, for example), and this may cause confusion.
Any .NET programmer has to know what System.Int32 is, though.

There are other examples of advantages and disadvantages of using *int*
versus System.Int32.
I think you should use the one that makes your work and the work of the
other members of your team easier and safer.

HTH
B\rgds
100
 
Mr.Tickle said:
Another one: is it better to use int or System.Int32 in code? Using the
built-in keyword or the library type?

I personally use int unless it's part of a member name, where I'll use
the framework name, as it could be invoked from a different language.

For instance, I might have:

int ReadInt32();
float ReadSingle();

etc
 
You can't inherit from a base type in C#, but you can from almost any other
type.

public class al : ArrayList {}

With this, you could just use 'al' in place of 'ArrayList' when declaring
that type. Other than that, I don't believe there's an equivalent.
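Something like this, assuming all you want is the shorter name (this works for reference types only):

```csharp
using System.Collections;

al list = new al();   // 'al' can stand in for 'ArrayList'
list.Add(1);
System.Console.WriteLine(list.Count);  // 1

// An empty subclass used purely as a shorter name.
class al : ArrayList {}
```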

Chris
 
Chris LaJoie said:
You can't inherit from a base type in C#, but you can from almost any other
type.

Could you clarify what you mean by a "base type" here? I suspect you
mean "value type" but it's not terribly clear.
 
Yes, "base type" was a poor choice of words. A value type, such as int,
long, char, etc., is what I meant.

Chris
 
Chris LaJoie said:
You can't inherit from a base type in C#, but you can from almost any other
type.

public class al : ArrayList {}

With this, you could just use 'al' in place of 'ArrayList' when declaring
that type. Other than that, I don't believe there's an equivalent.

The problem is that this approach does not work on value types.
But I never understood why they didn't allow inheritance of value types.
 
codymanix said:
The problem is that this approach does not work on value types.
But I never understood why they didn't allow inheritance of value types.

Consider if you could derive from value types. Suddenly they wouldn't
have a fixed size - an array declared as:

MyStruct[] x = new MyStruct[100];

couldn't just allocate 100*(size of MyStruct) because it could hold
MyDerivedStruct elements which were larger than MyStruct ones.

Now propagate that problem everywhere else - parameter passing, stack
sizing, etc.
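The fixed-size point can be sketched with a built-in value type (sizeof on built-in types is allowed in safe code):

```csharp
// Every element of a value-type array is exactly the declared type,
// so the runtime can compute the payload size at allocation time.
long[] vals = new long[100];
System.Console.WriteLine(sizeof(long) * vals.Length);  // 800

// A reference-type array stores references, so elements may be derived types:
object[] objs = { "a string", 42 };  // different runtime types, same slot size
System.Console.WriteLine(objs[1].GetType());  // System.Int32 (boxed)
```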
 
C-style macros in C# would be nice. Or at least a replacement for them.

#define ascii System.Text.ASCIIEncoding.ASCII

They could also be used to create "functions within functions". Ahh, the
good ol' days ;)
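For what it's worth, C# does have #define, but it only creates a symbol for #if; there is no token replacement. A sketch:

```csharp
#define TRACING  // defines a conditional-compilation symbol, nothing more

bool tracing = false;
#if TRACING
tracing = true;  // compiled in only because TRACING is defined above
#endif
System.Console.WriteLine(tracing);  // True
```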

Chris


 
Chris LaJoie said:
C-style macros in C# would be nice. Or at least a replacement for them.

#define ascii System.Text.ASCIIEncoding.ASCII

They could also be used to create "functions within functions". Ahh, the
good ol' days ;)

Precisely the reason they should be avoided at all costs, IMHO.
A function is a function; why do we need a way to provide the services of a
function (even pseudo-services) that can't be shared among assemblies?
Chris


 
Daniel O'Connell said:
Precisely the reason they should be avoided at all costs, IMHO.
A function is a function; why do we need a way to provide the services of a
function (even pseudo-services) that can't be shared among assemblies?

Agreed. Macros are a good way of destroying readability, IMO.
 
Consider if you could derive from value types. Suddenly they wouldn't
have a fixed size - an array declared as:

MyStruct[] x = new MyStruct[100];

couldn't just allocate 100*(size of MyStruct) because it could hold
MyDerivedStruct elements which were larger than MyStruct ones.

Now propagate that problem everywhere else - parameter passing, stack
sizing, etc.


There would be no problem. C/C++, Pascal, and most other languages allow
inheritance of value types. Where could there be a problem?

1) A cast from MyStruct[] to MyDerivedStruct[] is not allowed.
2) When you pass a MyDerivedStruct as a parameter where a MyStruct was
expected, a downcast is performed.
3) The same goes for assignment.
4) Upcasts for structs are never allowed, because the members of the derived
class are lost when a downcast occurs, so you can't recreate them when you
perform an upcast.

I would consider forbidding virtual methods in structs, because they could
lead to accessing members which aren't there:

struct A
{
    public virtual void foo() {}
}

struct B : A
{
    int x;
    public override void foo() { x = 0; }
}

B b = new B();
A a = b;
a.foo(); // the struct A does not contain x

I see no reason why inheritance of structs should not be allowed.
 
Agreed. Macros are a good way of destroying readability, IMO.

That's a matter of opinion, though. I think they improve readability by
eliminating a bunch of commonly used blocks of code.
 