Jitu
I'm really stumped on this one and would appreciate some
pointers...
I have an app that imports certain data into a DataSet (at which point
it's all in memory) and then uses BinaryFormatter to store the DataSet,
along with some other objects, in a binary file that serves as my
application's data file. This all works fine with smaller files, but
some of the datasets can be as large as 100 MB when loaded in memory.
The import operation works fine, but when I try to serialize my app's
data in such a case using BinaryFormatter to a FileStream, I get an
OutOfMemoryException. The machine has about 512 MB of RAM. Only once
was I able to write the file successfully without getting the OOM
exception, and I noticed that the peak memory usage for the process
went up to over a gigabyte. I have no idea why, and I have tried a
number of different things, such as explicitly specifying the target
stream as File in the StreamingContext for the BinaryFormatter, but I
don't seem to be able to avoid the exception.
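For reference, my save path is essentially the following. This is a
simplified sketch; MyAppData and its fields are placeholders, not my
real class names:

using System;
using System.Data;
using System.IO;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Formatters.Binary;

// Placeholder for my real container class: the DataSet plus the
// "more stuff" that gets saved alongside it.
[Serializable]
public class MyAppData
{
    public DataSet Data;        // can be ~100 MB once loaded
    public object OtherStuff;   // placeholder for the extra objects
}

public static class Persistence
{
    public static void Save(MyAppData appData, string path)
    {
        using (FileStream fs = new FileStream(path, FileMode.Create))
        {
            // One of the things I tried: telling the formatter the
            // destination is a file via the streaming context.
            BinaryFormatter formatter = new BinaryFormatter(
                null, new StreamingContext(StreamingContextStates.File));

            // This is where the OutOfMemoryException is thrown for
            // the ~100 MB DataSet.
            formatter.Serialize(fs, appData);
        }
    }
}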
Any thoughts on how I could handle this so that the data can be
written out and read back from the file? I do want to support 100 MB
datasets in memory, so using SQL Server, etc. is not an option. This
strictly has to be a file-based dataset.
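The read side is just the mirror image, something like this (again a
simplified sketch with the same placeholder names):

public static MyAppData Load(string path)
{
    using (FileStream fs = new FileStream(path, FileMode.Open))
    {
        BinaryFormatter formatter = new BinaryFormatter(
            null, new StreamingContext(StreamingContextStates.File));
        return (MyAppData)formatter.Deserialize(fs);
    }
}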
Thanks in advance.
Jitu