copy query result to different DB

  • Thread starter: suja
Hi
I've got 2 databases on 2 different SQL servers.
I need to copy the result of a stored procedure on SQL1 into a table in a
database on SQL2.
What would be the most efficient way of doing this? I'm slightly concerned
about performance, since the query result can easily be 500,000 records.

Any suggestions?

thanks

Sue
 
Hi,

1) DTS

2) A stored procedure on the destination SQL server, which opens a remote
connection to the source SQL server and runs the stored proc -
something like a linked server.

Could any experts comment on which is better?

Kalpesh
 
Thanks for your prompt reply.
I should have mentioned it :-)
I can't use DTS. The whole data transfer has to happen in the background
within my application.
 
Hi,

Assume
Server A = Source SQL Server
Server B = Destination SQL Server

Use T-SQL statements that let you run queries against a remote SQL server:

<example>
-- Run on Server B (the destination). The server, database, table,
-- and procedure names here are placeholders.
EXEC sp_addlinkedserver
    @server = N'ServerA',
    @srvproduct = N'SQL Server'
GO

-- Pull the remote stored procedure's result set into a local table
INSERT INTO dbo.DestTable
EXEC ServerA.SourceDb.dbo.usp_GetData
GO

</example>

I am not an expert on this, but this will be better done on the servers
rather than on the .NET end.

HTH
Kalpesh
 
suja said:
Hi
I've got 2 databases on 2 different SQL servers.
I need to copy the result of a stored procedure on SQL1 into a table
in a database on SQL2.
What would be the most efficient way of doing this? I'm slightly
concerned about performance, since the query result can easily be
500,000 records.

It depends on how beefy the servers are. Assuming that you have to
pull the records from sql1 onto a local machine, here's what I would do.
I'd have one thread reading the results from sql1, and sticking them
into an intermediate object. Every 1000 records or so, stick the object
into a threadsafe queue.
Have one or more writer objects polling the queue. They each pull an
intermediate object out of the queue and insert the data into sql2 in
parallel, with each one holding its own database connection.
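The reader/writer pipeline above can be sketched as follows. This is a minimal illustration in Python (the real app would be .NET); the batch size, writer count, and the fake row source are all made up for the demo, and the `written` list stands in for the actual inserts into sql2.

```python
import queue
import threading

BATCH_SIZE = 1000      # records per intermediate object
NUM_WRITERS = 3        # tune this number (see caveat 3)
SENTINEL = None        # tells a writer there is no more work

# Bounded queue: if the writers fall behind, the reader blocks on put()
# instead of buffering unbounded batches in memory.
batches = queue.Queue(maxsize=10)

def reader(rows, out):
    """Read rows from the source, pushing batches of BATCH_SIZE."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) == BATCH_SIZE:
            out.put(batch)
            batch = []
    if batch:
        out.put(batch)
    for _ in range(NUM_WRITERS):  # one stop signal per writer
        out.put(SENTINEL)

written = []                      # stands in for the destination table
written_lock = threading.Lock()

def writer(inq):
    """Each writer would hold its own connection to the destination."""
    while True:
        batch = inq.get()
        if batch is SENTINEL:
            break
        with written_lock:        # stands in for the INSERT
            written.extend(batch)

threads = [threading.Thread(target=writer, args=(batches,))
           for _ in range(NUM_WRITERS)]
for t in threads:
    t.start()
reader(range(5000), batches)      # fake source of 5,000 rows
for t in threads:
    t.join()
print(len(written))               # 5000
```

The bounded queue is what keeps the fast reader from swamping the slow writers, which is the point of caveat 1 below.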

There are a number of things to be aware of.
1) Reading's a LOT faster than writing. You need to make sure that your
reader doesn't use up all of your memory reading from SQL1 and storing
the data into the queue.
2) If you're using .NET 2.0, you should look into the SqlBulkCopy
class. This can make inserts a lot faster. If you're also using SQL
Server 2005, I believe that it lets you use arrays as SQL parameters.
Oracle does this, and it's a fairly big win for bulk data.
3) You'll need to tune the number of threads that you use. On a great
big Oracle server (24 processors, 12 GB RAM), 7 threads can write to
the database as fast as one. On our dinky test SQL Server (512 MB RAM,
one processor), I was getting serious timeout errors with 6 threads.
4) Beware insert triggers. They can hurt performance. On Oracle, we got
a noticeable (over 500,000 rows) performance increase by replacing our
row-level insert trigger with a table-level trigger.
5) Be careful with transactions. If you're doing this inside
transactions, it's possible to fill the log, and force a rollback.
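On that last point, the usual fix is to commit once per batch rather than wrapping the whole 500,000-row copy in one transaction. A sketch using Python's DB-API, with an in-memory SQLite table standing in for the destination; the table and column names are made up:

```python
import sqlite3

def write_batches(conn, batches):
    """Insert each batch in its own transaction, so the log only ever
    holds one batch of uncommitted work instead of the whole copy."""
    for batch in batches:
        conn.executemany(
            "INSERT INTO dest_table (name, id) VALUES (?, ?)", batch)
        conn.commit()  # one commit per batch, not one for 500,000 rows

# Demo: an in-memory SQLite table stands in for the table on sql2.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dest_table (name TEXT, id INTEGER)")
batches = [[("row%d" % i, i) for i in range(j, j + 1000)]
           for j in range(0, 5000, 1000)]
write_batches(conn, batches)
count = conn.execute("SELECT COUNT(*) FROM dest_table").fetchone()[0]
print(count)  # 5000
```

Smaller transactions also mean that a failure partway through only rolls back the current batch, so you need a way to resume or deduplicate rather than restart from scratch.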
 