.NET 2.0 Dictionary<> overhead

Eran

Hi,

I have a huge data structure which I previously stored in a
Dictionary<int, MyObj>.
MyObj is relatively small (2 ints, 1 DateTime, 1 bool).

Each dictionary is quite large (25,000 entries), and I have 500 such
dictionaries.
What I've noticed is that the total memory consumed grew to over 1 GB.
When I changed the implementation to List<MyObj> or SortedList<int,
MyObj>, the memory consumption dropped drastically (to 500-600 MB).
Of course, I need O(1) lookups, so the lists are not a very good
work-around.

What I am asking is: is this normal overhead?
I find it hard to understand a 2x overhead between hash tables and lists.
Am I wrong here?
Do you know of a more efficient implementation of a Dictionary object?
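
Here is roughly what the setup looks like, as a minimal sketch; the type,
field and variable names are made up for illustration and MyObj is declared
as a class here, not taken from my real code:

    using System;
    using System.Collections.Generic;

    class MyObj                  // 2 ints, 1 DateTime, 1 bool, as described
    {
        public int A;
        public int B;
        public DateTime When;
        public bool Flag;
    }

    class MemoryRepro
    {
        static void Main()
        {
            List<Dictionary<int, MyObj>> tables =
                new List<Dictionary<int, MyObj>>(500);
            for (int i = 0; i < 500; i++)
            {
                // Pre-sizing avoids repeated grow-and-rehash steps as items
                // are added one at a time.
                Dictionary<int, MyObj> d = new Dictionary<int, MyObj>(25000);
                for (int key = 0; key < 25000; key++)
                {
                    MyObj o = new MyObj();
                    o.A = key;
                    o.B = i;
                    o.When = DateTime.Now;
                    o.Flag = true;
                    d[key] = o;
                }
                tables.Add(d);
            }
            Console.WriteLine("Bytes in use: " + GC.GetTotalMemory(true));
        }
    }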
 
"Eran" <[email protected]> a écrit dans le message de (e-mail address removed)...


How are you measuring the memory? Don't forget that the backing storage for
a list tends to be reallocated at double its previous size each time the
list outgrows its current capacity.
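
To see the doubling in action, here is a quick sketch (the final Capacity
figure assumes the usual grow-by-doubling policy of List<T>):

    using System;
    using System.Collections.Generic;

    class GrowthDemo
    {
        static void Main()
        {
            List<int> list = new List<int>();
            for (int i = 0; i < 25000; i++)
                list.Add(i);
            Console.WriteLine(list.Count);     // 25000
            Console.WriteLine(list.Capacity);  // 32768: the last doubling overshoots
        }
    }

Dictionary<,> grows its internal arrays in a similar stepwise fashion, to
roughly double the previous size each time it fills up.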

Then do a quick count of the memory required: 500 x 25,000 x the instance
size of your object, plus 500 x 25,000 x the size of an int key, plus the
instance size of the 500 Dictionary<,> objects themselves. Then factor in
the heap fragmentation caused by adding items one at a time, with the
memory manager reallocating the internal arrays every time the current
capacity is exceeded.
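
As a rough back-of-the-envelope, assuming a 32-bit process and MyObj
declared as a class (the per-entry layout below is my assumption about the
Dictionary<,> internals, not documented behaviour):

    using System;

    class Estimate
    {
        static void Main()
        {
            // Per item, roughly:
            //   MyObj instance: ~8-byte object header + 2 ints + DateTime
            //                   + bool (padded)               ~= 32 bytes
            //   dictionary entry: hash code + next index + int key
            //                   + object reference            ~= 16 bytes
            //   bucket slot: one int                           =  4 bytes
            long items = 500L * 25000L;          // 12.5 million items
            long perItem = 32 + 16 + 4;
            Console.WriteLine(items * perItem / (1024 * 1024) + " MB");  // ~620 MB
            // Spare capacity from the doubling growth plus heap fragmentation
            // can plausibly push that towards the 1 GB you observed.
        }
    }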

No, I don't think it is particularly excessive, compared with List<>.

Joanna
 