suja said:
Hi
I've got two databases on two different SQL Servers.
I need to copy the result of a stored procedure on SQL1 into a table
in a database on SQL2.
What would be the most efficient way of doing this? I'm slightly
concerned about performance,
since the query result can easily be 500,000 records.
It depends on how beefy the servers are. Assuming that you have to
pull the records from sql1 onto a local machine, here's what I would do.
I'd have one thread reading the results from sql1, and sticking them
into an intermediate object. Every 1000 records or so, stick the object
into a threadsafe queue.
Have one or more writer objects polling the queue. They pull an
intermediate object out of the queue and insert the data into sql2 in
parallel, with each one holding its own database connection.
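The pipeline above can be sketched as follows. This is a minimal sketch in Python, not the actual data-access code: the source iterable and the `sink` list are hypothetical stand-ins for reading from sql1 and inserting into sql2; the one-reader, batched-queue, multiple-writer structure is the point.

```python
import queue
import threading

BATCH_SIZE = 1000
NUM_WRITERS = 4
SENTINEL = None  # tells a writer thread to shut down

def reader(source_rows, batches):
    """Read rows, accumulate them, and enqueue a batch every BATCH_SIZE."""
    batch = []
    for row in source_rows:          # stand-in for iterating the sql1 result set
        batch.append(row)
        if len(batch) >= BATCH_SIZE:
            batches.put(batch)       # blocks if the queue is full (see point 1)
            batch = []
    if batch:
        batches.put(batch)           # flush the final partial batch
    for _ in range(NUM_WRITERS):     # one shutdown sentinel per writer
        batches.put(SENTINEL)

def writer(batches, sink, sink_lock):
    """Each writer would hold its own connection and insert batches into sql2."""
    while True:
        batch = batches.get()
        if batch is SENTINEL:
            break
        with sink_lock:              # stand-in for an INSERT / bulk copy on sql2
            sink.extend(batch)

def copy_rows(source_rows):
    batches = queue.Queue(maxsize=10)   # bounded: caps memory held by the reader
    sink, sink_lock = [], threading.Lock()
    threads = [threading.Thread(target=reader, args=(source_rows, batches))]
    threads += [threading.Thread(target=writer, args=(batches, sink, sink_lock))
                for _ in range(NUM_WRITERS)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sink

rows = copy_rows(range(5000))
print(len(rows))  # prints 5000
```

The bounded queue is what keeps the fast reader from racing ahead of the slow writers; the sentinels give the writers a clean way to exit once the reader is done.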
There are a number of things to be aware of.
1) Reading's a LOT faster than writing. You need to make sure that your
reader doesn't use up all of your memory reading from SQL1 and storing
the data into the queue; a bounded queue that blocks the reader when
it's full takes care of this.
2) If you're using .NET 2.0, look into the SqlBulkCopy class. It can
make inserts a lot faster. If you're also using SQL Server 2005, I
believe it lets you use arrays as SQL parameters. Oracle does this, and
it's a fairly big win for bulk data.
3) You'll need to tune the number of threads that you use. On a great
big Oracle server (24 processors, 12 GB RAM), 7 threads can write to
the database as fast as one. On our dinky test SQL Server (512 MB RAM,
one processor), I was getting serious timeout errors with 6 threads.
4) Beware insert triggers. They can hurt performance. On Oracle, we got
a noticeable performance increase (measured over 500,000 rows) by
replacing our row-level insert trigger with a statement-level trigger.
5) Be careful with transactions. If you're doing all the inserts inside
one transaction, it's possible to fill the transaction log and force a
rollback; committing in batches avoids this.
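Point 5's commit-in-batches pattern can be sketched like this. sqlite3 (from Python's standard library) stands in for SQL Server here purely so the example runs; the table name and batch size are made up, and only the commit-every-N-rows structure is what carries over to the real job.

```python
import sqlite3

COMMIT_EVERY = 1000

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER)")  # hypothetical target table

pending = 0
for i in range(5000):                    # stand-in for rows pulled from sql1
    conn.execute("INSERT INTO target (id) VALUES (?)", (i,))
    pending += 1
    if pending >= COMMIT_EVERY:          # keep each transaction small so the
        conn.commit()                    # log never holds more than one batch
        pending = 0
conn.commit()                            # flush the final partial batch

count = conn.execute("SELECT COUNT(*) FROM target").fetchone()[0]
print(count)  # prints 5000
```

The trade-off: if a batch fails, only that batch rolls back, so you need a way to restart from the last committed batch rather than from scratch.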