Quick question: how does a DataReader store its data?

Hi,

Where does the DataReader store its data in this scenario:
Create a SqlConnection,
Create a SqlCommand - the query specified here could return 1000 rows,
SqlCommand.ExecuteReader returns a SqlDataReader,
use DataReader.Read to iterate through the rows.

As far as I understand, all 1000 rows in this case will initially be
brought to the client and consume memory; they won't stay on the DB server
and be retrieved one by one with each Read.
Is this correct?
Or will they somehow stay on the DB server and be retrieved one by one,
since a DataReader requires an open connection the whole time?

Thank you

Vadim
 
If you were using a DataSet they would be kept in memory, but not with a
DataReader. The rows are read as you iterate through them, and yes, you
need your connection to stay open for the whole duration. Both have their
advantages; it depends on what you're trying to do.
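
For illustration, a minimal sketch of the scenario in C# (the connection
string, table and column names are made up): each Read() advances to the
next row, and the connection has to stay open the whole time you iterate.

using System;
using System.Data.SqlClient;

class ReaderLoop
{
    static void Main()
    {
        // Hypothetical connection string and query, for illustration only.
        using (SqlConnection conn = new SqlConnection(
            "Server=.;Database=MyDb;Integrated Security=SSPI;"))
        using (SqlCommand cmd = new SqlCommand(
            "SELECT Id, Name FROM MyTable", conn))
        {
            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                // Read() advances the cursor one row at a time; only the
                // current row is materialized on the client.
                while (reader.Read())
                {
                    Console.WriteLine("{0}: {1}",
                        reader.GetInt32(0), reader.GetString(1));
                }
            }
        } // the connection was needed for the whole loop; it closes here
    }
}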
 
John,

Thank you for the reply. So essentially, if my query returns thousands of
records and I want to read them one by one and save them into a text file,
I don't have to worry about memory consumption, because on each Read only
one row will be pulled from the database and saved into the file, so the
data is not accumulated in the server's memory.
Can you please verify this?
Thank you

Vadim
 
You're welcome :)

Yes, your data will be fetched as you read from the DataReader (it's a
"connected", lightweight, quick and forward-only [firehose cursor] way to
read data), with very little memory overhead, but it keeps the DB connection
open a little longer (usually not a problem). So essentially, you are
correct. DataSets, by contrast, are a "disconnected" way to read data that
caches a lot of extra information and is much more demanding on resources
(memory/CPU), but they work better in some cases (I avoid them as much as
possible, i.e. I don't use one unless I need its "extra" features).
DataReaders are definitely faster/lighter, but don't forget to close your
connection afterwards (you don't want to "leak" DB connections), perhaps by
using ExecuteReader(CommandBehavior.CloseConnection) so the connection
closes when you close your DataReader (proper error handling can't hurt
either). There are good ADO.Net books out there, and LOTS of very good
online/free resources (MSDN, .Net sites, tutorials, blogs, etc.) that can
get you more familiar with ADO.Net :)
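
A sketch of the CommandBehavior.CloseConnection pattern John mentions (the
class, query and helper name are hypothetical): it is most useful when a
method returns the reader and the caller never sees the connection, because
closing the reader then closes the connection as well.

using System.Data;
using System.Data.SqlClient;

class CustomerData
{
    // Hypothetical helper: the caller never sees the connection, so
    // CommandBehavior.CloseConnection ties its lifetime to the reader.
    // (Real code should also close the connection if ExecuteReader throws.)
    public static SqlDataReader GetCustomers(string connectionString)
    {
        SqlConnection conn = new SqlConnection(connectionString);
        SqlCommand cmd = new SqlCommand("SELECT Id, Name FROM Customers", conn);
        conn.Open();
        return cmd.ExecuteReader(CommandBehavior.CloseConnection);
    }
}

On the caller's side, a using block around the returned reader then
disposes both:

using (SqlDataReader reader = CustomerData.GetCustomers(connStr))
{
    while (reader.Read()) { /* process the row */ }
} // the reader *and* its connection are closed here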
 
John,

Do you have any idea how much extra memory a DataTable of 1000 rows uses,
without any relations or anything like that?

The way you write it, I get the impression it is something like 10 MB
extra. My feeling is that it is much less than what you consume as soon as
you start writing your own classes that aren't built on the DataTable
principle.

A DataTable consists of DataRows, which have very little overhead in them;
the only extras are a reference to the table they belong to and the
RowState. Besides that, there are the DataColumns that keep the description
of every column, and then there is of course the DataTable's own
description.

I assume that as soon as you start to build your own class, with every
description at the lowest level, you end up consuming much more memory.

For procedures such as straight reading and printing, the DataReader
consumes less memory.

Just my thought,

Cor
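
One rough way to answer Cor's question empirically: fill a DataTable with
1000 rows and compare GC.GetTotalMemory before and after. A crude sketch
(the two-column schema is invented; the number will vary with schema, data
and runtime):

using System;
using System.Data;

class MemorySketch
{
    static void Main()
    {
        long before = GC.GetTotalMemory(true); // collect first, then measure

        DataTable table = new DataTable("Test");
        table.Columns.Add("Id", typeof(int));
        table.Columns.Add("Name", typeof(string));
        for (int i = 0; i < 1000; i++)
        {
            table.Rows.Add(i, "Row number " + i);
        }

        long after = GC.GetTotalMemory(true);
        Console.WriteLine("Approx. bytes for 1000 rows: {0}", after - before);
        GC.KeepAlive(table); // keep the table alive until after measuring
    }
}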
 
Hi Cor,
My goal is to build a CSV file from SQL Server data and send it to a
client's browser, so essentially I will be reading the data row by row,
accumulating it in a file, and sending it to the browser when it's done. I
want to do it this way in order to consume as little memory as possible,
and my question was whether a DataReader keeps just one row in memory at a
time or accumulates them somewhere. I think the answer is one at a time,
and I guess memory consumption would be much higher with a DataSet; BTW, I
am using .NET 1.1.
I would be happy to avoid using files as buffers for security reasons, but
I'm not sure how else this could be done.

Thank you

Vadim
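
For what it's worth, assuming this runs as an ASP.NET handler (Vadim
mentions sending to the browser), one way to avoid the file buffer entirely
is to write each row straight to the response stream, so neither a temp
file nor a full in-memory copy is needed. A sketch with an invented
connection string, query and columns:

using System.Data;
using System.Data.SqlClient;
using System.Web;

public class CsvExport : IHttpHandler
{
    public bool IsReusable { get { return false; } }

    public void ProcessRequest(HttpContext context)
    {
        HttpResponse response = context.Response;
        response.ContentType = "text/csv";
        response.AddHeader("Content-Disposition",
            "attachment; filename=export.csv");

        SqlConnection conn = new SqlConnection("..."); // placeholder
        SqlCommand cmd = new SqlCommand("SELECT Id, Name FROM MyTable", conn);
        conn.Open();
        using (SqlDataReader reader =
            cmd.ExecuteReader(CommandBehavior.CloseConnection))
        {
            while (reader.Read())
            {
                // One row at a time goes straight to the client; nothing
                // accumulates on the web server. Real code should quote or
                // escape fields that may contain the delimiter.
                response.Output.Write(reader.GetInt32(0));
                response.Output.Write(',');
                response.Output.WriteLine(reader.GetString(1));
            }
        }
    }
}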
 
Thank you, Cor.
Cor Ligthert said:
Vadim,

Didn't I in fact answer that in my previous message? If you write the CSV
rows instead of printing lines, then in my view you are doing what I
described, and the DataReader is your route to go.

http://msdn2.microsoft.com/en-us/library/t5s055f0.aspx

And to build the string to write:

http://msdn.microsoft.com/library/d...ml/frlrfsystemtextstringbuilderclasstopic.asp

(Be aware that a CSV file is culture dependent; in many non-English
countries the delimiter is a ";".)

I hope this helps,

Cor
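
Putting Cor's two links together, a small sketch: build each CSV line with
a StringBuilder and take the delimiter from the current culture's list
separator instead of hard-coding a comma (the helper and its inputs are
made up):

using System.Globalization;
using System.Text;

class CsvLine
{
    // Builds one CSV line from a row's values, using the culture's list
    // separator ("," in English locales, ";" in many others).
    public static string Build(object[] values)
    {
        string delimiter = CultureInfo.CurrentCulture.TextInfo.ListSeparator;
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < values.Length; i++)
        {
            if (i > 0) sb.Append(delimiter);
            sb.Append(values[i]); // real code should also escape/quote fields
        }
        return sb.ToString();
    }
}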
 
Yes I do (I'm no MVP but I still code for a living). It's not 10 MB
(unless you have a table with a ridiculous number of columns or large
amounts of data in some columns). But between storing 1000 average rows in
memory and using a DataReader, there is still a significant difference,
and that overhead is useless IMHO; unless you actually need RowState or
the like, it is better avoided (as in this case). From what I've
experienced, the DataReader-based code will also scale better (with some
"light" Business Objects). If RowState, schema or other "advanced"
features are needed, then DataSets are definitely the way to go, but I
tend to get a bit "ticked off" by people who just blindly use DataSets for
everything (out of laziness perhaps, and that includes returning them from
Web Services). When they're required/needed/preferable I use them, but
often they're not the best choice.
 
John,

Written this way, there is not one word of yours I disagree with.

Have a look at my second answer to Vadim to verify that, if you want.

:-)

Cor
 