mySqlBytes.buffer is getting converted to BigEndian even though both SQL Server 2005 and the CLR function are on the same machine

DR

mySqlBytes.buffer is getting converted to big-endian even though both SQL
Server 2005 and the CLR function are on the same machine, which shows
BitConverter.IsLittleEndian == true.

In T-SQL: select * from dbo.MY_CLR_FUNCTION(cast(1024 as binary(4)))

public static int MY_CLR_FUNCTION(SqlBytes mySqlBytes)

In the debugger, binaryData.buffer now shows big-endian!!
[0] 0 byte
[1] 0 byte
[2] 4 byte
[3] 0 byte

The data stays little-endian if the integer is hard-coded in C#:

int myint = 1024;
byte[] b = BitConverter.GetBytes(myint);

In the debugger, the byte array b shows little-endian:
[0] 0 byte
[1] 4 byte
[2] 0 byte
[3] 0 byte


In all cases BitConverter.IsLittleEndian shows true, so why is the integer
hard-coded in T-SQL getting converted to big-endian when it arrives in a CLR
function?
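For what it's worth, here is a minimal sketch of one way to read that value back on a little-endian box. It is a plain console app rather than the actual CLR function, the class and variable names are mine, and the byte values are hard-coded to mirror the debugger output above; inside the real function the same reverse-then-convert step would apply to a copy of the SqlBytes buffer.

using System;

class EndianSketch
{
    static void Main()
    {
        // Bytes as delivered by cast(1024 as binary(4)), per the debugger output above.
        byte[] fromSql = { 0, 0, 4, 0 };

        // BitConverter uses the machine's own byte order (little-endian here),
        // so reverse a copy of the buffer before converting.
        byte[] copy = (byte[])fromSql.Clone();
        if (BitConverter.IsLittleEndian)
        {
            Array.Reverse(copy);
        }

        Console.WriteLine(BitConverter.ToInt32(copy, 0));   // prints 1024
    }
}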
 
in debugger binaryData.buffer now shows bigendian!!
[0] 0 byte
[1] 0 byte
[2] 4 byte
[3] 0 byte

That is little-endian, as the most significant byte is last. This is typical
of x86 processors, and is used by Windows for storage.
in debugger binaryData.buffer shows littleendian
[0] 0 byte
[1] 4 byte
[2] 0 byte
[3] 0 byte

This is big-endian, as the most significant byte is the first.

The BitConverter's IsLittleEndian property indicates the endianness of the
computer architecture. So, it will not change, except on a different
computer.

--
HTH,

Kevin Spencer
Chicken Salad Surgeon
Microsoft MVP

 
in debugger binaryData.buffer now shows bigendian!!
[0] 0 byte
[1] 0 byte
[2] 4 byte
[3] 0 byte

That is little-endian, as the most significant byte is last. This is typical
of x86 processors, and is used by Windows for storage.

No, the most significant byte is first: 1024=0*2^24 + 0*2^16 + 4*2^8 +
0*2^0

Hence it's big-endian.

Short test app for BitConverter to demonstrate:

using System;
using System.Collections.Generic;

class Program
{
    static void Main(string[] args)
    {
        byte[] bytes = BitConverter.GetBytes(1024);
        for (int i = 0; i < bytes.Length; i++)
        {
            Console.WriteLine("bytes[{0}]={1}", i, bytes[i]);
        }
        Console.WriteLine("LittleEndian? {0}", BitConverter.IsLittleEndian);
    }
}

On my box that shows:
bytes[0]=0
bytes[1]=4
bytes[2]=0
bytes[3]=0
LittleEndian? True

So the pattern quoted at the top of this post (where [1]=0, [2]=4) is
big endian.

Jon
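Building on Jon's point, here is another minimal sketch (the names and the hard-coded buffer are mine) that reconstructs the value with explicit shifts, so it gives the same answer regardless of the host's byte order:

using System;

class ShiftSketch
{
    static void Main()
    {
        // The buffer as it arrives from cast(1024 as binary(4)): most significant byte first.
        byte[] buf = { 0, 0, 4, 0 };

        // Spelling out the shifts bakes the intended (big-endian) order into the code,
        // so BitConverter.IsLittleEndian never comes into it.
        int value = (buf[0] << 24) | (buf[1] << 16) | (buf[2] << 8) | buf[3];

        Console.WriteLine(value);   // prints 1024
    }
}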
 
Jon is right; the original post was correct about the endianness. But the
question still stands: why does every Windows product use little-endian,
except that SQL Server's convert to varbinary uses big-endian?
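Whatever the reason for the choice, the order SQL Server hands over is the same as network byte order (big-endian), so one more option, sketched here with names of my own and the buffer hard-coded, is to let IPAddress.NetworkToHostOrder do the swap:

using System;
using System.Net;

class NetworkOrderSketch
{
    static void Main()
    {
        // Buffer as delivered by cast(1024 as binary(4)): big-endian, i.e. network byte order.
        byte[] buf = { 0, 0, 4, 0 };

        // On a little-endian host BitConverter reads this as 262144 (0x00040000);
        // NetworkToHostOrder swaps it back to 1024. On a big-endian host it is a no-op.
        int raw = BitConverter.ToInt32(buf, 0);
        int value = IPAddress.NetworkToHostOrder(raw);

        Console.WriteLine(value);   // prints 1024
    }
}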


 
I'm thinking this is a bug in SQL Server to use big-endian. Hopefully future
versions of SQL Server will consistently use little-endian, per the Microsoft
standard.

 
DR said:
I'm thinking this is a bug in SQL Server to use big-endian. Hopefully future
versions of SQL Server will consistently use little-endian, per the Microsoft
standard.

While it's certainly slightly odd that it uses big-endian (without
specifically documenting it), I certainly hope MS has more concern for
backward compatibility than to change the conversion at this stage.
 
Sorry, Jon. I used a calculator to check my figures, and I misread the
results. I do a lot of work with files that may be big- or little-endian,
and wrote the routines to read them correctly, but that was some time ago.
Put it down to early morning syndrome. My math skills improve as the day
wears on!

--

Kevin Spencer
Chicken Salad Surgeon
Microsoft MVP
