nightwatch77
Today I stumbled upon a very interesting 'feature' of .NET 2.0.
My application uses the generic SortedList class for storing a sorted
list of objects keyed on a DateTime - that is, SortedList<DateTime,
object>. However, the program kept crashing, and when I finally
debugged it I found out that it crashes on inserting data into the
sorted list. The problem was that I was inserting two items with the
same date (the same key).
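
A minimal repro of what I was doing (the names here are just for
illustration, not my real code):

    using System;
    using System.Collections.Generic;

    class SortedListCrash
    {
        static void Main()
        {
            SortedList<DateTime, object> messages = new SortedList<DateTime, object>();
            DateTime stamp = DateTime.Now;

            messages.Add(stamp, "first message");

            // Adding a second item with the same key is what blows up:
            // SortedList<TKey, TValue>.Add throws an ArgumentException
            // because an entry with the same key already exists.
            messages.Add(stamp, "second message");
        }
    }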
Hey, what do you think - should a sorted list just crash when there is
a duplicate value? Normal lists are expected to store duplicate values,
but some great inventor at Microsoft has made it 'better'. They have
documented this behavior, but why do they assume sorting shouldn't work
for duplicate entries in the list? Normal sorting algorithms handle
that just fine.
Of course, I had to use the non-generic ArrayList instead, with a
custom IComparer for sorting.
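
Roughly like this (assuming a simple message class with a Timestamp
field - my actual types are different, but the idea is the same):

    using System;
    using System.Collections;

    // Illustrative message type - the real one doesn't matter here.
    class Message
    {
        public DateTime Timestamp;
        public string Text;

        public Message(DateTime timestamp, string text)
        {
            Timestamp = timestamp;
            Text = text;
        }
    }

    // Custom comparer that orders ArrayList entries by their timestamp.
    // Usage: messages.Sort(new TimestampComparer()); duplicates are allowed.
    class TimestampComparer : IComparer
    {
        public int Compare(object x, object y)
        {
            return ((Message)x).Timestamp.CompareTo(((Message)y).Timestamp);
        }
    }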
And there came another surprise - ArrayList.Sort implements an
unstable sort algorithm. That is, for equal objects it does not
guarantee that they will keep their original order after sorting. Of
course this is not documented at all, and of course it causes
problems - in my case it would make sequential messages with the same
timestamp come out in the wrong order.
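
One way around it is to tag each item with its original position and
break ties on that, so equal timestamps keep their insertion order no
matter what the sort does internally - something like this sketch:

    using System;
    using System.Collections;

    // Wrap each item with its original index so that ties on the
    // timestamp fall back to insertion order.  This makes the result
    // deterministic even under an unstable sort.
    class IndexedMessage
    {
        public DateTime Timestamp;
        public int OriginalIndex;
        public object Payload;

        public IndexedMessage(DateTime timestamp, int originalIndex, object payload)
        {
            Timestamp = timestamp;
            OriginalIndex = originalIndex;
            Payload = payload;
        }
    }

    class StableTimestampComparer : IComparer
    {
        public int Compare(object x, object y)
        {
            IndexedMessage a = (IndexedMessage)x;
            IndexedMessage b = (IndexedMessage)y;

            int byTime = a.Timestamp.CompareTo(b.Timestamp);
            if (byTime != 0)
                return byTime;

            // Equal timestamps: preserve the original insertion order.
            return a.OriginalIndex.CompareTo(b.OriginalIndex);
        }
    }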
I don't know which sort algorithm Microsoft used, but even a simple
merge sort implementation is stable and has no problem with sorting
duplicate values. When Microsoft recruits a developer, they ask him to
implement some list-sorting algorithm during the job interview, so I
thought all developers at Microsoft were capable of implementing one
correctly.
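
For the record, a stable sort really isn't hard to write - here is a
minimal merge sort over an ArrayList, just as an illustration (nothing
production-grade):

    using System;
    using System.Collections;

    class StableSort
    {
        // Top-down merge sort.  Merge sort is stable: when two elements
        // compare as equal, the one that appeared first in the input
        // comes out first.
        public static ArrayList MergeSort(ArrayList list, IComparer comparer)
        {
            if (list.Count <= 1)
                return list;

            int mid = list.Count / 2;
            ArrayList left = MergeSort(list.GetRange(0, mid), comparer);
            ArrayList right = MergeSort(list.GetRange(mid, list.Count - mid), comparer);

            ArrayList merged = new ArrayList(list.Count);
            int i = 0, j = 0;
            while (i < left.Count && j < right.Count)
            {
                // "<= 0" is what keeps equal elements in left-first order.
                if (comparer.Compare(left[i], right[j]) <= 0)
                    merged.Add(left[i++]);
                else
                    merged.Add(right[j++]);
            }
            while (i < left.Count) merged.Add(left[i++]);
            while (j < right.Count) merged.Add(right[j++]);
            return merged;
        }
    }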
Best regards,
RG