Text File Import Performance


Mike Schumann

I am trying to import a 200 MB delimited text file into a new Access
database. The Import Wizard has been running for over an hour, sucking up
100% CPU on my laptop, and has yet to open a window.

I've imported much smaller files before without any problems. Is this
normal for such a large file? Does anyone have any idea how long this is
going to take? (I have 1 GB of RAM and a three-year-old Centrino-based
Sony laptop.)

Thanks,
 
Hi Mike,

A 200 MB delimited file could run you up against Access's maximum database
size (1 GB in Access 97; 2 GB in Access 2000 and later) once the data is
stored and indexed.

Over an hour is unusual. The longest import I can recall (that didn't
crash) took about 40 minutes. Of course, a 200 MB file is rather unusual.

I would strongly recommend that you look into a SQL database for that kind
of storage. MS SQL Server is a good choice; MySQL is an economical
alternative.

Question - do you actually need all those records, or did you plan on
extracting a small subset of the data that meets your criteria? If you
planned on weeding out this rather large file anyway, you might consider
reading the file one line at a time, testing each row against some of your
criteria, and appending only the rows that meet the specification to your
table.
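
For example, here is a minimal Python sketch of that pre-filtering
approach. It assumes a comma-delimited file with a header row; the file
names and the column being tested are made up for illustration:

    import csv

    INPUT_FILE = "mailing_list.txt"      # hypothetical source file
    OUTPUT_FILE = "filtered_list.txt"    # smaller file to import into Access

    with open(INPUT_FILE, newline="") as src, \
         open(OUTPUT_FILE, "w", newline="") as dst:
        reader = csv.reader(src)
        writer = csv.writer(dst)
        writer.writerow(next(reader))    # copy the header row through
        for row in reader:
            # Keep only rows that meet the criterion (here: column 3 = "MN")
            if row[2] == "MN":
                writer.writerow(row)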

As an alternative, I might chop that file into 5 or 10 chunks and import
them one at a time.
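
A rough Python sketch of that chunking approach, again assuming a header
row that each chunk should repeat (file names are made up):

    CHUNK_COUNT = 10
    INPUT_FILE = "mailing_list.txt"      # hypothetical source file

    with open(INPUT_FILE) as src:
        header = src.readline()          # keep the header for every chunk
        lines = src.readlines()

    chunk_size = len(lines) // CHUNK_COUNT + 1
    for i in range(CHUNK_COUNT):
        chunk = lines[i * chunk_size:(i + 1) * chunk_size]
        if not chunk:
            break
        with open(f"chunk_{i + 1}.txt", "w") as dst:
            dst.write(header)            # repeat the header in every chunk
            dst.writelines(chunk)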

Hope this helps,
Gordon
 
Problem solved. The file I was reading used a bare newline (LF) as the
end-of-line delimiter. I was initially using Access 97, which doesn't
recognize a lone LF as an end-of-line marker. It ran for about six hours
before it decided that the record length was too long.
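
In case anyone else hits this with Access 97: converting the bare LF line
endings to the CRLF that it expects would be another way out. A quick
Python sketch (file names are made up):

    # Convert bare LF line endings to the CRLF that Access 97 expects.
    with open("mailing_list.txt", "rb") as src:
        data = src.read()

    # Strip existing CRs first so none get doubled, then rewrite LF as CRLF.
    data = data.replace(b"\r\n", b"\n").replace(b"\n", b"\r\n")

    with open("mailing_list_crlf.txt", "wb") as dst:
        dst.write(data)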

When I tried the same thing using Access 2000 it worked fine. The file was
about 250,000 records, and the import took a couple of minutes.

I am going to extract about 2,000 records from this database for a mailing,
so SQL is really overkill.

Thanks for the help,

Mike Schumann
