I have been looking around a bit and I can't seem to find a way to calculate
the memory usage of a DataSet/DataTable...
I would like to get both the potential memory usage and the actual in-memory
size of a specific DataTable/DataSet.
That will be a very complex calculation.
A single data table will contain at least (you will need to
check all the data structures in Reflector for the exact numbers):
  number of columns * average size of metadata per column
+ number of rows * (size of reference + object overhead for each row object)
+ number of rows * number of columns * (size of reference + object
  overhead + average size of data)
The size of the data will depend on the data type:
  integers/floating point - 2/4/8 bytes depending on the specific type
  strings - 2 * actual character length of the value (.NET strings
  are UTF-16, so 2 bytes per character)
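
Here is a minimal C# sketch of that back-of-envelope formula. The overhead
constants (reference size, object header, per-column metadata) are my own
round-number assumptions for 64-bit .NET, not measured values:

using System;

class DataTableSizeEstimator
{
    const int ReferenceSize = 8;        // assumed: 64-bit object reference
    const int ObjectOverhead = 24;      // assumed: object header plus padding
    const int ColumnMetadataSize = 256; // rough guess per DataColumn

    // rows: row count; avgCellDataSizes: estimated payload bytes per column
    public static long Estimate(long rows, int[] avgCellDataSizes)
    {
        long size = avgCellDataSizes.Length * (long)ColumnMetadataSize;
        size += rows * (ReferenceSize + ObjectOverhead);   // one row object per row
        foreach (int dataSize in avgCellDataSizes)         // per cell: reference + overhead + data
        {
            size += rows * (ReferenceSize + ObjectOverhead + dataSize);
        }
        return size;
    }

    static void Main()
    {
        // example: 100000 rows, an int column (4 bytes) and a ~20 char string column (40 bytes)
        Console.WriteLine(Estimate(100000, new[] { 4, 40 }));
    }
}

Treat the result as an order-of-magnitude estimate only.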
With only a few rows the overhead will probably be a lot bigger
than the data itself.
With many rows the data will matter more.
With many rows and very long strings (or BLOBs) the
data will completely dwarf the overhead.
It is probably easier to do some measurements on a specific
data table than to try to calculate it.
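
For example, you can compare GC.GetTotalMemory before and after filling a
representative table. A rough sketch (the schema and row count here are
just placeholders):

using System;
using System.Data;

class MeasureDataTable
{
    static void Main()
    {
        long before = GC.GetTotalMemory(true);   // force a full collection first

        DataTable table = new DataTable();
        table.Columns.Add("Id", typeof(int));
        table.Columns.Add("Name", typeof(string));
        for (int i = 0; i < 100000; i++)
        {
            table.Rows.Add(i, "name " + i);
        }

        long after = GC.GetTotalMemory(true);
        Console.WriteLine("Approx. table size: {0} bytes", after - before);
        GC.KeepAlive(table);                     // keep the table alive until measured
    }
}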
Or you can simply assume that:
  data table size in memory < 3 * raw data size
I am confident that will hold for most realistic data.
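
For example: 100000 rows of roughly 100 bytes of raw data each is about
10 MB raw, so by that rule of thumb the table should stay under roughly
30 MB in memory.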
Arne