SQLBulkCopy memory issue question


SteveB

Hi All,

Using SqlBulkCopy for big CSV files consumes almost all of the memory on the
computer. While googling the problem I found a post describing how to resolve
it, but no code was included. The author claimed that he flushed all temporary
working tables every 100K records to avoid filling up the memory.

If anyone has dealt with this, please post the code.

Thanks,

Steve
 
Here is how I did it. There are two options:

1. If the file resides on the same computer as the SQL Server, run the 'BULK
INSERT' command through ADO (see the sketch below).
2. If the file is not on the same computer, run 'bcp.exe', which comes with
the SQL Server client tools.
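For option 1, the BULK INSERT statement can be issued from ADO.NET like any
other command. A minimal sketch, assuming a hypothetical dbo.Target table and
a file path that the SQL Server service can see (both names are made up for
illustration):

using System;
using System.Data.SqlClient;

class BulkInsertJob
{
    static void Main()
    {
        // Hypothetical names: adjust the connection string, target table, and CSV path.
        // The path must be visible to the SQL Server service, not to this client process.
        const string connectionString = "Server=.;Database=MyDb;Integrated Security=SSPI;";
        const string sql = @"
            BULK INSERT dbo.Target
            FROM 'C:\data\bigfile.csv'
            WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', BATCHSIZE = 100000)";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            command.CommandTimeout = 0;   // big loads easily exceed the 30-second default
            connection.Open();
            command.ExecuteNonQuery();    // the server reads the file itself; the client stays small
        }
    }
}

For option 2, bcp.exe exposes the same knobs on the command line: -b sets the
batch size and -t/-r set the field and row terminators.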

George.
 
Hi George,

I have many CSV files with 24 million records, and the import of the files,
one after another, is fully automated; working with .NET just works great for
me. I need to find a way to avoid filling up the memory when dealing with big
files.

How do I flush out the temp working tables, say, after every 100K records that
have been processed?

Please advise.

Regards,

Steve
 
Look up the batch size parameter in SQL Server Books Online for both of those
commands.
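For SqlBulkCopy itself, the same idea works without BULK INSERT or bcp: set
BatchSize and feed the copy in fixed-size chunks, clearing the chunk after
each WriteToServer call so the client never holds more than about 100K rows at
once. A minimal sketch, assuming a hypothetical dbo.Target table with two
string columns and a naive comma-split parser (no quoted fields or header
row):

using System;
using System.Data;
using System.Data.SqlClient;
using System.IO;

class ChunkedBulkCopy
{
    // Hypothetical names: adjust the connection string, table, columns, and path.
    const string ConnectionString = "Server=.;Database=MyDb;Integrated Security=SSPI;";
    const string TargetTable = "dbo.Target";
    const int ChunkSize = 100000;   // rows held in client memory at any one time

    static void Main()
    {
        var buffer = new DataTable();
        buffer.Columns.Add("Col1", typeof(string));
        buffer.Columns.Add("Col2", typeof(string));   // match your real schema here

        using (var connection = new SqlConnection(ConnectionString))
        {
            connection.Open();
            using (var bulk = new SqlBulkCopy(connection))
            using (var reader = new StreamReader(@"C:\data\bigfile.csv"))
            {
                bulk.DestinationTableName = TargetTable;
                bulk.BatchSize = ChunkSize;    // send rows to the server in batches as well
                bulk.BulkCopyTimeout = 0;

                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    string[] fields = line.Split(',');
                    buffer.Rows.Add(fields[0], fields[1]);

                    if (buffer.Rows.Count == ChunkSize)
                    {
                        bulk.WriteToServer(buffer);   // push the current chunk
                        buffer.Clear();               // then drop it, keeping memory flat
                    }
                }

                if (buffer.Rows.Count > 0)            // flush whatever is left at end of file
                    bulk.WriteToServer(buffer);
            }
        }
    }
}

Client memory then stays flat at roughly one chunk of rows regardless of how
large the file is.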
 