How do you read data from a DataReader without waiting for ExecuteReader to finish first?

  • Thread starter: GG

I have millions of rows of data in a table and need to archive it to a
txt file. However, the app crashes after a while and the memory maxes
out.
How can I issue reader = command.ExecuteReader() without the app
caching all the data locally on the PC, so that I can start using the rows in a
while (reader.Read()) {
// use the rows to write the txt file
}
?

Thanks
 
GG said:
I have millions of rows of data in a table and need to archive it to a
txt file. However, the app crashes after a while and the memory maxes
out.
How can I issue reader = command.ExecuteReader() without the app
caching all the data locally on the PC, so that I can start using the rows in a
while (reader.Read()) {
// use the rows to write the txt file
}
?

A data reader should not cache the data locally.

What provider are you using?

Arne
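
To make Arne's point concrete, here is a minimal sketch of the streaming pattern a data reader is meant to support: each row goes straight from the reader to the file, so memory use stays flat no matter how many rows there are. The SQL Server provider types, the table name BigTable, and the tab-separated output format are all assumptions for illustration; substitute your own provider's connection, command, and reader classes.

using System.Data.SqlClient;
using System.IO;

class ArchiveTable {
    static void Main() {
        string connString = "...";              // your connection string
        string sql = "SELECT * FROM BigTable";  // hypothetical table name

        using (SqlConnection connection = new SqlConnection(connString))
        using (SqlCommand command = new SqlCommand(sql, connection))
        using (StreamWriter writer = new StreamWriter("archive.txt")) {
            connection.Open();
            using (SqlDataReader reader = command.ExecuteReader()) {
                while (reader.Read()) {
                    // Build one tab-separated line per row; nothing is kept
                    // in memory after the line is written.
                    string[] fields = new string[reader.FieldCount];
                    for (int i = 0; i < reader.FieldCount; i++)
                        fields[i] = reader.IsDBNull(i)
                            ? ""
                            : reader.GetValue(i).ToString();
                    writer.WriteLine(string.Join("\t", fields));
                }
            }
        }
    }
}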
 
Hello GG,
I have millions of rows of data in a table and need to archive it to a
txt file. However, the app crashes after a while and the memory maxes
out.
How can I issue reader = command.ExecuteReader() without the app
caching all the data locally on the PC, so that I can start using the rows in a
while (reader.Read()) {
// use the rows to write the txt file
}
?

Thanks

The DataReader should do exactly that, so there might be something else going
wrong in your code. Are you writing directly to the file, or are you reading
the contents into memory before writing?

Also, if you're reading from SQL Server, have a look at SSIS (SQL Server
Integration Services), which is designed to load and extract data to and
from SQL Server databases.
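
For what it's worth, here is a sketch of the buffering anti-pattern that question is probing for (SQL Server types and the table name BigTable are assumed for illustration): Fill() lands every row in RAM before a single byte reaches the file, which is exactly what exhausts memory on a multi-million-row table.

using System.Data;
using System.Data.SqlClient;

class BufferedAntiPattern {
    // Loads the ENTIRE result set into memory before the caller can write
    // anything to disk; avoid this for very large tables.
    static DataTable LoadEverything(string connString, string sql) {
        using (SqlConnection connection = new SqlConnection(connString))
        using (SqlDataAdapter adapter = new SqlDataAdapter(sql, connection)) {
            DataTable table = new DataTable();
            adapter.Fill(table);   // all rows land in RAM here
            return table;          // only afterwards would the file be written
        }
    }
}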
 
Where is the database server? Local?

Are you attempting to create a file on your local PC with millions of rows
of data? How are you handling the writing to the file?

It is probably a good idea to give us the code example in full, as there are too
many variables. The likelihood is that it is nothing to do with the DataReader but
something you are doing within your loop.
 
Try to fix that in your Transact-SQL code.

A simple method is to use a WHERE clause that slices the data in the way
you want.

Cor
 
This is the code that I am running. I am not even trying to save to a
txt file and the app still crashes.

int cnt = 0;
using (AseConnection connection = new AseConnection(aseConnString)) {
    AseCommand command = new AseCommand(sqlSel, connection);
    connection.Open();
    AseDataReader reader = command.ExecuteReader();
    while (reader.Read()) {
        cnt++;
        Console.WriteLine(cnt.ToString("#,0"));
    }
    reader.Close();
}

Ase is Sybase's ADO.NET data provider. Stepping through the code, the
debugger stops at
AseDataReader reader = command.ExecuteReader();
and then I start to see the memory increasing, and after a while the app
crashes. It never even gets to the while loop. I am trying to create an
alternative to bcp out from tables so we can use xcopy deployment.


Thanks
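
If ExecuteReader really is buffering the whole result set on the client, one standard knob worth trying is the CommandBehavior overload of ExecuteReader. This is only a sketch: the flags are standard ADO.NET, but whether Sybase's ASE provider honors them is up to the provider (the Sybase.Data.AseClient namespace is assumed here to match GG's types).

using System;
using System.Data;
using Sybase.Data.AseClient;   // assumed namespace for the ASE provider

class StreamingAttempt {
    static void CountRows(string aseConnString, string sqlSel) {
        int cnt = 0;
        using (AseConnection connection = new AseConnection(aseConnString))
        using (AseCommand command = new AseCommand(sqlSel, connection)) {
            connection.Open();
            // Going through IDbCommand guarantees this overload exists even
            // if the provider does not expose it directly. SequentialAccess
            // asks for forward-only, column-by-column reading; SingleResult
            // tells the provider to expect exactly one result set.
            using (IDataReader reader = ((IDbCommand)command).ExecuteReader(
                       CommandBehavior.SequentialAccess | CommandBehavior.SingleResult)) {
                while (reader.Read()) {
                    cnt++;
                }
            }
        }
        Console.WriteLine(cnt.ToString("#,0"));
    }
}

If that makes no difference, the provider may also offer a connection-string or command option to use a server-side cursor instead of buffering client-side; the ASE documentation would be the place to check.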
 
GG,

What I meant is that you do this in a loop, including a key-range condition in
your SELECT.

Let's say you know your keys run from 1 to 1000; then each pass of the loop
can fetch the next 100 rows, and you build this around your existing code, as in
the sketch below.

You could also use a database cursor, but that is not as simple as the above.

Cor
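
A sketch of that slicing loop, reusing the Ase types from GG's post and assuming an integer key column named id running from 1 to 1000 on a hypothetical table BigTable (all names are illustrative, not from the thread):

using System.Data;
using Sybase.Data.AseClient;

class SlicedArchive {
    static void Archive(string aseConnString) {
        const int batchSize = 100;
        using (AseConnection connection = new AseConnection(aseConnString)) {
            connection.Open();
            for (int lo = 1; lo <= 1000; lo += batchSize) {
                int hi = lo + batchSize - 1;
                // Safe to format known ints directly; for anything
                // user-supplied, use command parameters instead.
                string sql = string.Format(
                    "SELECT * FROM BigTable WHERE id BETWEEN {0} AND {1}", lo, hi);
                using (AseCommand command = new AseCommand(sql, connection))
                using (IDataReader reader = command.ExecuteReader()) {
                    while (reader.Read()) {
                        // write this slice of rows to the txt file here
                    }
                }
            }
        }
    }
}

Each pass holds at most one slice in flight, so even if the provider buffers an entire result set, it only ever buffers batchSize rows at a time.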
 
I agree that there are workarounds for one table by chopping things up.
However, for a general-purpose "bcp out" that will not work, since each
table is different and would need more code. I was reading the help for the
DataReader, and it says:
"The DataReader provides an unbuffered stream of data that
allows procedural logic to efficiently process results from
a data source sequentially. The DataReader is a good choice
when retrieving large amounts of data because the data is
not cached in memory."
I should probably contact Sybase and ask how they have implemented it.

Thanks
 
I would be pointing my finger at the Sybase implementation of the data provider,
personally.

It is most likely their implementation of the DataReader that is causing your
issues. I would check their support site.
 