It's not especially important, but I always like to know the best way of doing
things for when I encounter a case where performance becomes a factor... say I
have a string array and I want to get a distinct list, that is, a list without
duplicates. In SQL this is easy because it's done for you. In code, what would
be the best practice? Array.Sort(stringarray) followed by n comparisons,
collecting a value each time it changes? Or perhaps adding each string as a key
to a collection or hashtable and letting the duplicate-key rejection do the
work? I have always wondered: are there circumstances where catching errors in
quantity like this degrades performance? Is that approach common, or is it
avoided?
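For what it's worth, here is a rough sketch of the two approaches I have in
mind (assuming .NET's Array.Sort and Hashtable; the class and method names are
just mine for illustration):

    using System;
    using System.Collections;
    using System.Collections.Generic;

    class DistinctDemo
    {
        // Approach 1: sort a copy, then walk it once and keep a value only
        // when it differs from the previous one (O(n log n) overall).
        static string[] DistinctBySort(string[] input)
        {
            string[] copy = (string[])input.Clone();
            Array.Sort(copy);

            List<string> result = new List<string>();
            for (int i = 0; i < copy.Length; i++)
            {
                if (i == 0 || copy[i] != copy[i - 1])
                    result.Add(copy[i]);
            }
            return result.ToArray();
        }

        // Approach 2: use a Hashtable keyed on the string. Here I test
        // ContainsKey first instead of calling Add and catching the
        // duplicate-key exception, so no error handling is involved
        // (roughly O(n) on average).
        static string[] DistinctByHashtable(string[] input)
        {
            Hashtable seen = new Hashtable();
            List<string> result = new List<string>();
            foreach (string s in input)
            {
                if (!seen.ContainsKey(s))
                {
                    seen.Add(s, null);
                    result.Add(s);
                }
            }
            return result.ToArray();
        }

        static void Main()
        {
            string[] data = { "apple", "pear", "apple", "plum", "pear" };
            Console.WriteLine(string.Join(", ", DistinctBySort(data)));
            Console.WriteLine(string.Join(", ", DistinctByHashtable(data)));
        }
    }

Note the hashtable version above checks ContainsKey rather than catching the
exception a duplicate Add would throw; the exception-per-duplicate variant is
exactly the pattern I'm asking about.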
Bob