Text files


Ruslan Shlain

I have a task to get data from comma-delimited text files and write it into
the DB. I have it working where I read each line and put it into the DB. The
problem is there could be thousands of lines, and it takes around 40 seconds
to process a file that big. Are there things I can do to get it working
faster? Any ideas or links would be very helpful.
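
The row-at-a-time pattern being described probably looks something like the
sketch below. This is a hypothetical reconstruction in Python with pyodbc;
the connection string, table, and columns are placeholders, since the post
doesn't show any code.

import csv
import pyodbc

# Placeholder DSN, table, and column names; the post doesn't show any of them.
conn = pyodbc.connect("DSN=MyDatabase")
cursor = conn.cursor()

with open("input.csv", newline="") as f:
    for row in csv.reader(f):
        # One INSERT and one round trip per line -- with thousands of lines,
        # this per-row overhead is usually what makes the import slow.
        cursor.execute(
            "INSERT INTO Orders (col1, col2, col3) VALUES (?, ?, ?)",
            row[0], row[1], row[2],
        )
        conn.commit()  # committing every row adds still more overhead

conn.close()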
 
* "Ruslan Shlain said:
I have a task to get data from comma delimited text files and write that in
to the DB. I have it working where I read each line and put it in to the DB.
Problem is there could me thousands of lines and it takes around 40 sec to
process file that big. Is there are things that I can do to get it working
faster. Any ideas or links would be very helpful.

<http://www.connectionstrings.com>
-> "Text" (the text-file connection strings; they let you query the CSV
directly instead of parsing it line by line)
 
Ruslan Shlain said:
I have a task to get data from comma-delimited text files and write it into
the DB. I have it working where I read each line and put it into the DB. The
problem is there could be thousands of lines, and it takes around 40 seconds
to process a file that big. Are there things I can do to get it working
faster? Any ideas or links would be very helpful.

40 seconds for thousands of records? That doesn't sound so bad to me.

I'm guessing, of course, but it seems like you need to determine which part of
the process is using up the time. If the delay is in the database server, the
solution would be significantly different than if the problem is in reading
the text file.

It also seems that the range of solutions is limited by your particular
business need. For instance, what happens if a record (or multiple records)
fails? Do you proceed with the remaining records, stop, or roll back?

Tom
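
Tom's rollback question maps directly onto how the inserts are transacted, and
a single transaction around a batched insert is also the usual first fix for
the speed problem. A minimal sketch, again assuming Python with pyodbc and
placeholder DSN/table names (none of these details appear in the thread):

import csv
import pyodbc

conn = pyodbc.connect("DSN=MyDatabase", autocommit=False)
cur = conn.cursor()

with open("input.csv", newline="") as f:
    batch = [tuple(row) for row in csv.reader(f)]

try:
    # One prepared INSERT run against the whole batch inside a single
    # transaction, instead of thousands of individual commits.
    cur.executemany(
        "INSERT INTO Orders (col1, col2, col3) VALUES (?, ?, ?)", batch
    )
    conn.commit()       # keep everything...
except pyodbc.Error:
    conn.rollback()     # ...or throw the whole file away if any record fails
    raise
finally:
    conn.close()

If the target happens to be SQL Server, a dedicated bulk-load path (bcp, BULK
INSERT, or SqlBulkCopy where the .NET version has it) is usually faster still,
but the thread never says which database is in use.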
 