DataSet Memory Usage

Guest

I'm working with a large ADO.Net DataSet and my application uses quite a bit
of memory.

According to the memory profiler I am using, there are about 120,000
DataRows, and they take up about 5 Megabytes. These DataRows are contained in
a total of about 70 tables. However, in addition there are about 8,000 int32[]
array objects that altogether take up about 50 Megabytes. I think that most
of these are instantiated from within the System.Data namespace.

My application inserts, deletes or modifies about 10 percent of the
DataRows. It also uses DataViews and sorting.

Is this a known problem of some kind?
 
Um, it's a little late so I can't do the math, but off the top of my
head, those numbers look approximately correct. Remember, all of that data is
getting cached locally - so that sounds right. The easy way to check is with
a trace: see how much RAM is being used before you call Fill, and then
afterward.
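A minimal sketch of that before/after measurement (the connection string and table name are illustrative, not from the original post):

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class FillFootprint
{
    static void Main()
    {
        var dataSet = new DataSet();
        using (var connection = new SqlConnection("...your connection string..."))
        using (var adapter = new SqlDataAdapter("SELECT * FROM SomeLargeTable", connection))
        {
            // Force a full collection first so the "before" number isn't
            // inflated by garbage left over from earlier work.
            long before = GC.GetTotalMemory(true);

            adapter.Fill(dataSet);

            long after = GC.GetTotalMemory(true);
            Console.WriteLine("Fill added roughly {0:N0} bytes", after - before);
        }
    }
}
```

Note this only measures the managed heap; the process working set reported by Task Manager will differ.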
 
Starting with the dataset partially filled (1.0M fields), filling it with
2.7M more fields increases the memory footprint of my application from 38 to
62 MB. This is 24 Mbytes for 2.7M fields, or about 9 bytes per field, which
might not be too unreasonable. But then, after much updating, deleting, and
adding, the number of fields is almost unchanged but the memory footprint
goes up to 252 MB. Is this a memory leak, or some kind of non-scaling
behavior of DataSet?

Unfortunately my memory profiler has not been too useful so far because it
slows the application down too much.
 
> Starting with the dataset partially filled (1.0M fields), filling it with
> 2.7M more fields increases the memory footprint of my application from 38 to
> 62 MB. This is 24 Mbytes for 2.7M fields, or about 9 bytes per field, which
> might not be too unreasonable. But then, after much updating, deleting, and
> adding, the number of fields is almost unchanged but the memory footprint
> goes up to 252 MB. Is this a memory leak, or some kind of non-scaling
> behavior of DataSet?

These numbers by themselves are virtually meaningless. What's your system
free memory? Maybe you've got a lot of memory which the garbage collector
could collect if it wanted to, but it doesn't feel it needs to do it.

How is your paging activity, especially hard faults (page faults which cause
a page to be read from disk)?

A large virtual memory size may be alarming, but by itself it doesn't mean
much at all.
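One way to test whether the footprint is live data or merely uncollected garbage is to force a collection and compare heap sizes. A sketch:

```csharp
using System;

class CollectCheck
{
    static void Main()
    {
        // Heap size without forcing a collection...
        long beforeCollect = GC.GetTotalMemory(false);

        // ...and after forcing everything collectible to be collected,
        // including objects waiting on finalizers.
        GC.Collect();
        GC.WaitForPendingFinalizers();
        GC.Collect();
        long afterCollect = GC.GetTotalMemory(true);

        Console.WriteLine("Heap before forced collect: {0:N0} bytes", beforeCollect);
        Console.WriteLine("Heap after forced collect:  {0:N0} bytes", afterCollect);

        // If the second number is much smaller, the large footprint was
        // mostly garbage the GC hadn't bothered to collect; if it stays
        // high, the DataSet really holds that much live data.
    }
}
```

Forcing collections like this is a diagnostic technique only, not something to leave in production code.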

John Saunders
 
The numbers I gave are not virtually meaningless.

Our application occasionally throws out of memory errors (a total of two
times in one month of sporadic production running on six different servers).
That is why I am working on this problem.

I have been able to investigate garbage collector activity only in the first
phase of the behavior I described in my previous post, while the memory
footprint is still around 70 MB. The garbage collector is very active (1,300
level 0 GCs, 290 level 1 GCs and 14 level 2 GCs in one hour). Unfortunately
the memory profiler I am using (.NET Memory Profiler 2.0) slows the
application down so much that it never reaches the point where the footprint
starts to grow.

By the time the application (without profiling) reaches 250 MB memory use on
my development system, which has 500 MB of RAM, the system is almost out of
memory. I have observed the application grow to 800 MB running on a server;
the servers have 2 GB of memory.
 
> The numbers I gave are not virtually meaningless.
>
> Our application occasionally throws out of memory errors (a total of two
> times in one month of sporadic production running on six different servers).
> That is why I am working on this problem.

You didn't say that before. It's context. Keep telling us those things.
> I have been able to investigate garbage collector activity only in the first
> phase of the behavior I described in my previous post, while the memory
> footprint is still around 70 MB.

You didn't mention your investigation, either. You just quoted numbers.
> The garbage collector is very active (1,300 level 0 GCs, 290 level 1 GCs
> and 14 level 2 GCs in one hour). Unfortunately the memory profiler I am
> using (.NET Memory Profiler 2.0) slows the application down so it never
> reaches the point where the footprint starts to grow.

> By the time the application (without profiling) reaches 250 MB memory use on
> my development system, which has 500 MB of RAM, the system is almost out of
> memory.

250MB physical or virtual? What was the size of your system cache?
> I have observed the application grow to 800 MB running on a server;
> the servers have 2 GB of memory.

Have the servers run out of memory as well?

You may well be going about this the right way, but you haven't said so in
your posts. You've got all the information about what's going on - please
share it with the newsgroup!


John Saunders
 
Hi,

First of all, I would like to confirm my understanding of your issue. From
your description, I understand that you get some memory errors when using a
large DataSet to cache data. If there is any misunderstanding, please feel
free to let me know.

Could you give us more information on the exceptions? For example, the kind
of exception and the exception messages. Could you also explain more about
your project? If the DataSet is too large, it will be hard to manage the
data in memory and we will encounter a performance hit. I think we can try
to find a way to put less data in memory to free up more resources.

Kevin Yu
=======
"This posting is provided "AS IS" with no warranties, and confers no
rights."
 
One factor that will contribute to this sort of memory usage is the use of
DataViews. When a DataView has sort or filter parameters set, the DataTable's
internal indexes ArrayList gets an Index object added, which contains an int
array holding the row numbers of the rows the DataView refers to. The same
thing happens when you use the DataTable's Select method.
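A minimal sketch of that behavior (the table and column names are illustrative): clearing a DataView's sort, or disposing the view, lets those internal index arrays become collectible.

```csharp
using System;
using System.Data;

class DataViewIndexSketch
{
    static void Main()
    {
        // Illustrative table; the real application has about 70 of these.
        var table = new DataTable("Orders");
        table.Columns.Add("CustomerId", typeof(int));
        for (int i = 0; i < 1000; i++)
            table.Rows.Add(i % 37);

        var view = new DataView(table);
        view.Sort = "CustomerId ASC"; // builds an Index (an int[] of row numbers)

        // ... use the sorted view ...

        view.Sort = String.Empty; // releases the sort index
        view.Dispose();           // detaches the view from the table

        // Select builds (and caches) the same kind of index internally:
        DataRow[] matches = table.Select("CustomerId = 5");
        Console.WriteLine(matches.Length);
    }
}
```

So an application with many long-lived sorted or filtered DataViews over 120,000 rows could plausibly accumulate the thousands of int32[] arrays described in the first post.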
 