Randy Collins
We need to retrieve approximately 200,000 records containing about 100
megabytes of data across the network from a SQL 2000 DB server to a VB.NET
app that then cursors through this data, does some processing, and outputs
it to several destination tables. What is the optimal way to handle this
large amount of data? Should we retrieve it from SQL Server using an
ADO.NET server-side cursor, which takes care of buffering, etc.? I'm
concerned about the 2 hours or so that the db connection would have to stay
open. We could add error handling to re-execute the query and figure out
how to continue where we left off, but this sounds kludgy. We could
retrieve all 100 megabytes of data into a client-side recordset and keep it
in RAM. However, this would not scale should this data increase
dramatically in size in the future. Or we could retrieve the data to the
client, save it to disk, and then read it from there.
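For what it's worth, the "continue where we left off" option doesn't have to be kludgy if the source table has a usable ordered key. One way to sketch it is keyset-based batching: issue a short query per batch and checkpoint the last key seen, so a dropped connection costs at most one batch. Below is a minimal illustration in Python, with an in-memory list standing in for the SQL table (the `fetch_batch` helper and the batch size of 200 are made up for illustration, not from the original post):

```python
# Keyset-based batching: re-issue a short query per batch instead of
# holding one connection open for a two-hour run.
# The in-memory "table" below is a stand-in for the real SQL Server table.

def fetch_batch(table, last_key, batch_size):
    """Stand-in for: SELECT TOP batch_size ... WHERE key > last_key ORDER BY key."""
    rows = [r for r in table if r["key"] > last_key]
    rows.sort(key=lambda r: r["key"])
    return rows[:batch_size]

def process_all(table, batch_size=200):
    processed = []
    last_key = -1  # resume point; persist this to disk to survive a crash
    while True:
        batch = fetch_batch(table, last_key, batch_size)
        if not batch:
            break
        for row in batch:
            processed.append(row["key"])  # real code would transform and write out
        last_key = batch[-1]["key"]       # checkpoint: a restart continues from here
    return processed

if __name__ == "__main__":
    table = [{"key": k} for k in range(1000)]
    print(len(process_all(table)))  # all rows processed across many short batches
```

Because each batch is its own short-lived query, the connection never needs to stay open for the full run, and the checkpointed key makes a restart after failure trivial.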
Ideas?