Copy data between two data adapters


Thomas Pagel

Hi,

We want to build an application that copies all data from the source to the destination without changing anything.

We use data adapters for this process: we open both adapters, call Fill on the source, and call Update on the destination.

The problem is that the table we want to copy is quite large, so we see heavy memory usage. We therefore decided to move the data in blocks of 1000 records at a time. We use dataadapter.Fill(dataset, 0, 1000, tablename) to read the first records, then do an Update, then the next Fill (dataadapter.Fill(dataset, 1001, 2000, tablename)), and loop until EOF.
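Simplified, our loop looks roughly like this (adapter and table names are placeholders; we set AcceptChangesDuringFill to False so the destination Update sees the rows as new):

sourceAdapter.AcceptChangesDuringFill = False  ' rows stay in the Added state, so Update issues INSERTs

Dim offset As Integer = 0
Dim rowsRead As Integer
Do
    Dim ds As New DataSet
    rowsRead = sourceAdapter.Fill(ds, offset, 1000, "MyTable")  ' skips "offset" rows, then reads up to 1000
    If rowsRead = 0 Then Exit Do
    destinationAdapter.Update(ds, "MyTable")
    offset += rowsRead
Loop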

We see that the program moves the first records quite quickly but gets slower and slower over time. Our guess is that the data adapter takes more and more time to skip the leading records of the source before it reaches the real "payload".

Is there anything we can do better? We simply want to move the first 1000
records, then the next 1000, and so on...

Thanks,



Thomas
 
Thomas,

If you have an indexed ID column in your table, I'd recommend
rewriting the SELECT on your source dataadapter to include WHERE id >
@minrecord AND id <= @maxrecord.

Then, before executing your dataadapter.Fill, set:

dataadapter.SelectCommand.Parameters("@minrecord").Value = 0
dataadapter.SelectCommand.Parameters("@maxrecord").Value = 1000

The next time through, increment them by 1000, and so on.
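Wiring it up looks roughly like this (assuming a SqlDataAdapter and an integer id column; connection and table names are illustrative):

Dim da As New SqlDataAdapter( _
    "SELECT * FROM MyTable WHERE id > @minrecord AND id <= @maxrecord", _
    sourceConnection)
da.SelectCommand.Parameters.Add("@minrecord", SqlDbType.Int)
da.SelectCommand.Parameters.Add("@maxrecord", SqlDbType.Int)

Dim min As Integer = 0
Do
    da.SelectCommand.Parameters("@minrecord").Value = min
    da.SelectCommand.Parameters("@maxrecord").Value = min + 1000
    Dim ds As New DataSet
    If da.Fill(ds, "MyTable") = 0 Then Exit Do
    ' ... Update the destination from ds here ...
    min += 1000
Loop

This way the WHERE clause does the skipping on the server, instead of the adapter reading and discarding rows on the client.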

Good luck,

Jonathan
 
Jonathan,

We would be happy if we had an ID... We have a primary key, but it's a string...

Thanks,


Thomas
 
Hi,

The problem is that we have a rather "special" ODBC driver as the source (SQL Server is the destination). We tried to use it with SQL Server DTS and it's not very reliable. Getting data from it with a data adapter works fine, but we run into performance problems. A bulk copy would make sense if we could find a way to get the data from that "special" ODBC source into a text file, but I don't see a way to do that...

Thanks,


Thomas

Bin Song said:
Hi, Thomas,

For copying a large amount of data, I don't see a reason to use a dataset. What are your databases on the source and destination sides? Doing a bulk copy in the database will be much faster than using a dataset.
 
Hi, Thomas

So which operation takes more time: filling the ODBC data into the dataset, copying the first dataset to the second dataset, or updating the data into SQL Server?
If the update is the problem, I have another idea:
1. Fill the ODBC data into a dataset:
da.Fill(ds)

2. Get the XML of the dataset:
strXml = ds.GetXml()

3. Create a stored procedure for the update:
Create Proc MySP(@doc varchar(8000)) -- max length of varchar is 8000; you might consider text for larger data
AS
DECLARE @idoc int
EXEC sp_xml_preparedocument @idoc OUTPUT, @doc
INSERT INTO YourTable
SELECT *
FROM OPENXML (@idoc, '/NewDataSet/YourTable', 2) WITH YourTable -- flag 2 = element-centric, matching ds.GetXml() output
EXEC sp_xml_removedocument @idoc

4. Update using a SqlCommand:
cmd.ExecuteNonQuery()

I am not sure whether this will work for your case; hope it helps.

Bin Song, MC
 
Thomas,

Does your source database support SELECT TOP n and ORDER BY on the primary key?
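If it does, a string key pages just as well as a numeric one; something along these lines (table and column names are illustrative):

SELECT TOP 1000 *
FROM MyTable
WHERE pk > @lastKeySeen
ORDER BY pk

After each batch, set @lastKeySeen to the last key value you copied; the source then seeks straight to the next block instead of re-skipping everything before it.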

-- J
 
Jonathan,

Yes, that's no problem...

We just changed our application to read the source record by record... This gives us better performance on big recordsets...
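Roughly what we do now (column and table names are placeholders; error handling omitted):

Dim readCmd As New OdbcCommand("SELECT col1, col2 FROM SourceTable", odbcConnection)
Dim insertCmd As New SqlCommand( _
    "INSERT INTO DestTable (col1, col2) VALUES (@col1, @col2)", sqlConnection)
insertCmd.Parameters.Add("@col1", SqlDbType.VarChar, 50)
insertCmd.Parameters.Add("@col2", SqlDbType.VarChar, 50)

Dim reader As OdbcDataReader = readCmd.ExecuteReader()
While reader.Read()
    insertCmd.Parameters("@col1").Value = reader(0)
    insertCmd.Parameters("@col2").Value = reader(1)
    insertCmd.ExecuteNonQuery()   ' one INSERT per source row; memory use stays flat
End While
reader.Close()

Because the reader only streams forward, nothing is skipped and nothing accumulates in memory.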

Thanks,


Thomas
 