Bernie Yaeger
This is a question for sql server 2000 and vb .net and xp pro, so I am
posting to all ng's.
Is there any way of knowing or approximating the amount of memory consumed
in retaining a large disconnected (ADO .NET) table/dataset? If a .dbf file
were in memory, I could identify the size of the file directly, but with SQL
Server 2000 and the swapping technologies of Windows XP Pro, I only know the
size of the .mdf, which is of no value. Say the .mdf is 4.6 gig and it
contains 160 tables, 200 sp's, a few views, triggers, etc. One table - a
large one - has 25 cols and 1.2 million rows. Its physical size is nowhere
designated. I load this into a dataset in a VB .NET app. It takes a long
time, obviously.
But once loaded, approximately how much memory has been used? Did it write
XML, and is it continually swapping out to get the appropriate XML and then
display it, allow changes to it, etc.?
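(One rough way I could imagine measuring this - just a sketch, not anything built into ADO .NET, and the connection string and table name below are placeholders - is to compare GC.GetTotalMemory before and after the Fill:)

```vb
' Sketch: approximate the managed memory a filled DataSet occupies by
' comparing total GC heap size before and after the Fill. The connection
' string and table name are placeholders - substitute your own.
Imports System
Imports System.Data
Imports System.Data.SqlClient

Module MemoryEstimate
    Sub Main()
        ' Force a full collection first so the baseline is as clean as possible.
        Dim before As Long = GC.GetTotalMemory(True)

        Dim ds As New DataSet()
        Dim cn As New SqlConnection("Server=.;Database=MyDb;Integrated Security=SSPI")
        Dim da As New SqlDataAdapter("SELECT * FROM MyLargeTable", cn)
        ' Fill opens and closes the connection itself.
        da.Fill(ds, "MyLargeTable")

        Dim after As Long = GC.GetTotalMemory(True)
        Console.WriteLine("Approx. bytes held by the DataSet: {0}", after - before)
    End Sub
End Module
```

This only approximates managed-heap growth attributable to the load, but it should give a ballpark figure.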
Tx for any help.
Bernie Yaeger