Eran
Hi,
I have a huge data structure, which I previously stored in a
Dictionary<int, MyObj>.
MyObj is relatively small (2 ints, 1 DateTime, 1 bool).
Each dictionary is quite large (25,000 entries), and I have 500 such
dictionaries.
What I've noticed is that the total memory consumed grew to over 1 GB.
When I changed the implementation to List<MyObj> or SortedList<int,
MyObj>, the memory consumption dropped drastically (to 500-600 MB).
Of course, I need O(1) lookups, so the lists are not a very good
workaround.
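
For reference, here is roughly how I set things up and measure the
usage. The MyObj layout below is a simplified stand-in for the real
one, and the field values are made up just to fill the structure:

    using System;
    using System.Collections.Generic;

    // Simplified stand-in for the real MyObj (2 ints, 1 DateTime, 1 bool).
    struct MyObj
    {
        public int A;
        public int B;
        public DateTime When;
        public bool Flag;
    }

    class MemoryCheck
    {
        static void Main()
        {
            // Force a full collection so the baseline is stable.
            long before = GC.GetTotalMemory(true);

            // 500 dictionaries of 25,000 entries each, as described above.
            var dicts = new List<Dictionary<int, MyObj>>();
            for (int d = 0; d < 500; d++)
            {
                var dict = new Dictionary<int, MyObj>(25000);
                for (int i = 0; i < 25000; i++)
                {
                    dict[i] = new MyObj { A = i, B = -i,
                                          When = DateTime.Now,
                                          Flag = (i % 2 == 0) };
                }
                dicts.Add(dict);
            }

            long after = GC.GetTotalMemory(true);
            Console.WriteLine("Approx. managed heap growth: {0} MB",
                              (after - before) / (1024 * 1024));
            GC.KeepAlive(dicts);
        }
    }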
What I am asking is: is this normal overhead?
I can hardly understand a 2x overhead between hash tables and lists. Am
I wrong here?
Do you know of a more efficient implementation of a Dictionary object?
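
One workaround I have been considering, in case it helps frame the
question: if the int keys could be assumed dense and bounded (which I
haven't verified for my data), a plain array indexed by key would keep
O(1) lookup without a Dictionary's per-entry hash/bucket overhead. A
minimal sketch under that assumption:

    using System;

    // Sketch of an array-backed map; only valid if keys are known to
    // fall in [0, maxKeyExclusive), which is an assumption on my part.
    class DenseIntMap<TValue>
    {
        private readonly TValue[] values;
        private readonly bool[] present;

        public DenseIntMap(int maxKeyExclusive)
        {
            values = new TValue[maxKeyExclusive];
            present = new bool[maxKeyExclusive];
        }

        public void Add(int key, TValue value)
        {
            values[key] = value;
            present[key] = true;
        }

        public bool TryGetValue(int key, out TValue value)
        {
            if (key >= 0 && key < values.Length && present[key])
            {
                value = values[key];
                return true;
            }
            value = default(TValue);
            return false;
        }
    }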