mySqlBytes.Buffer is getting converted to big-endian, even though both SQL
Server 2005 and the CLR function are on the same machine, where
BitConverter.IsLittleEndian == true.
In T-SQL: select * from dbo.MY_CLR_FUNCTION(cast(1024 as binary(4)))

The function signature: public static int MY_CLR_FUNCTION(SqlBytes mySqlBytes)

In the debugger, mySqlBytes.Buffer now shows big-endian:
[0] 0 byte
[1] 0 byte
[2] 4 byte
[3] 0 byte
The data stays little-endian if the integer is hard-coded in C#:

int myint = 1024;
byte[] b = BitConverter.GetBytes(myint);

In the debugger, b shows little-endian:
[0] 0 byte
[1] 4 byte
[2] 0 byte
[3] 0 byte
In all cases BitConverter.IsLittleEndian is true, so why does an integer
hard-coded in T-SQL arrive big-endian when it reaches the CLR function?