Dataset disconnected arch and memory

Thread starter: Guest
Hi,

When I load a DataSet with data obtained from a database, the DataSet is stored in the memory of the client system. AFAIK the size of the DataSet (which implicitly limits the number of rows that can be inserted into it) is bounded by the client system's memory.

What happens when the DataSet exceeds this limit? I suppose Windows will start using virtual memory, but what will happen if even that is exhausted? Is there any way to know beforehand how much memory I can use, so that I use only that much?

Thanks a lot,

Benny
 
Hi Benny,

How big will your dataset be?

It would be impractical to keep a huge dataset in memory: not only does it consume
a lot of resources on the client machine, but working with it is also
resource intensive. Remember that a database engine is an optimized tool for
handling large quantities of data and performing operations on them.
In this case it would be more efficient to hit the DB several times and fetch
only the subset needed at the moment than to get all the data and manipulate
it in memory.
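One way to fetch only the subset you need is to page through the table with the ADO.NET `Fill` overload that takes a start record and a maximum row count. A minimal sketch, assuming a SQL Server back end; the connection string, table, and column names are placeholders for illustration:

```csharp
using System.Data;
using System.Data.SqlClient;

class PagedLoad
{
    static void Main()
    {
        // Placeholder connection string -- substitute your own.
        string connString = "Server=myServer;Database=myDb;Integrated Security=true";
        int pageSize = 500;

        using (SqlConnection conn = new SqlConnection(connString))
        {
            SqlDataAdapter adapter = new SqlDataAdapter(
                "SELECT OrderID, CustomerID FROM Orders ORDER BY OrderID", conn);

            DataSet ds = new DataSet();
            int startRecord = 0;
            int rowsFetched;
            do
            {
                ds.Tables.Clear();
                // Fill at most pageSize rows, starting at startRecord.
                rowsFetched = adapter.Fill(ds, startRecord, pageSize, "Orders");
                // ... process ds.Tables["Orders"] here ...
                startRecord += rowsFetched;
            } while (rowsFetched == pageSize);
        }
    }
}
```

One caveat: this `Fill` overload still executes the full query and discards the skipped rows on the client, so for genuinely large tables it is better to push the paging into the SQL itself (e.g. `TOP` with a key predicate, or `ROW_NUMBER()`), so the server only returns each page.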


Cheers,

--
Ignacio Machin,
ignacio.machin AT dot.state.fl.us
Florida Department Of Transportation

Benny said:
Hi,

When I load a DataSet with data obtained from a database, the DataSet is stored
in the memory of the client system. AFAIK the size of the DataSet (which
implicitly limits the number of rows that can be inserted into it) is
bounded by the client system's memory.
What happens when the DataSet exceeds this limit? I suppose Windows will
start using virtual memory, but what will happen if even that is
exhausted? Is there any way to know beforehand how much memory I can use,
so that I use only that much?
 
This is right. But nonetheless, I was just wondering what the result would be. Any ideas?

Benny

----- Ignacio Machin ( .NET/ C# MVP ) wrote: -----

Hi Benny,

How big will your dataset be?

It would be impractical to keep a huge dataset in memory: not only does it consume
a lot of resources on the client machine, but working with it is also
resource intensive. Remember that a database engine is an optimized tool for
handling large quantities of data and performing operations on them.
In this case it would be more efficient to hit the DB several times and fetch
only the subset needed at the moment than to get all the data and manipulate
it in memory.


Cheers,

--
Ignacio Machin,
ignacio.machin AT dot.state.fl.us
Florida Department Of Transportation

Benny said:
When I load a DataSet with data obtained from a database, the DataSet is stored
in the memory of the client system. AFAIK the size of the DataSet (which
implicitly limits the number of rows that can be inserted into it) is
bounded by the client system's memory. What happens when the DataSet exceeds
this limit? I suppose Windows will start using virtual memory, but what will
happen if even that is exhausted? Is there any way to know beforehand how much
memory I can use, so that I use only that much?
 
I'd expect the Fill method to throw an exception when you go to fill it with
GBs of data. In a perfect world maybe you'd get OutOfMemoryException, but
who knows. It should be easy to test: create a db table with a ridiculous
number of rows and fat columns and then Fill a dataset with the whole table.
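That test might be sketched as below, assuming SQL Server; the connection string and table name are placeholders, and whether you actually see `OutOfMemoryException` or the process simply dies depends on the runtime and how much virtual memory is available:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class FillStressTest
{
    static void Main()
    {
        // Placeholder connection string and table -- point these at a
        // deliberately huge table to provoke the failure.
        string connString = "Server=myServer;Database=myDb;Integrated Security=true";

        using (SqlConnection conn = new SqlConnection(connString))
        {
            SqlDataAdapter adapter = new SqlDataAdapter(
                "SELECT * FROM HugeTable", conn);
            DataSet ds = new DataSet();
            try
            {
                adapter.Fill(ds, "HugeTable");
                Console.WriteLine("Loaded {0} rows", ds.Tables["HugeTable"].Rows.Count);
            }
            catch (OutOfMemoryException)
            {
                // The likely failure once physical + virtual memory are exhausted.
                Console.WriteLine("DataSet exceeded available memory.");
            }
        }
    }
}
```

Note that catching `OutOfMemoryException` is rarely something you can recover from gracefully; the point here is only to observe what happens, not to suggest a production pattern.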
 