Hi Mark,
Supplementing what Joe has said: I've imported CSV files with more than
two million records. The following precautions seem wise:
- Create the table in Access first with the correct field types. Make
sure there is no primary key and no indexes: if Access has to index
the data as it imports, the process is much slower, and there is a
greater chance of exceeding the 2GB limit or of things going wrong
generally. After importing, compact the database and then create the
indexes you need (there's a sketch of this after the list).
- Have both the text file and the database on a local drive; importing
across a network is much slower and increases the chance of problems.
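
For the first point, here's a minimal VBScript sketch of what I mean,
using ADO against a Jet database. The path, table name, field names and
index names are all made up for illustration; substitute your own.

    ' Sketch: create the table up front with explicit field types and
    ' no primary key or indexes; add the indexes only after the import
    ' and compact. Paths and names here are purely illustrative.
    Dim conn
    Set conn = CreateObject("ADODB.Connection")
    conn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
              "Data Source=C:\data\import.mdb"

    ' Explicit field types, no PRIMARY KEY clause, no indexes.
    conn.Execute "CREATE TABLE tblOrders (" & _
                 "OrderID LONG, CustomerID TEXT(10), " & _
                 "OrderDate DATETIME, Amount CURRENCY)"
    conn.Close

    ' ... run the import, then compact the database ...

    ' Only now build the indexes you need.
    Set conn = CreateObject("ADODB.Connection")
    conn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
              "Data Source=C:\data\import.mdb"
    conn.Execute "CREATE UNIQUE INDEX idxOrderID ON tblOrders (OrderID)"
    conn.Execute "CREATE INDEX idxCustomer ON tblOrders (CustomerID)"
    conn.Close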
Remember that most problems with importing text files are caused by
irregularities in the file or the data (e.g. incorrect line breaks, or
a missing or superfluous quote mark). I've often had to write Perl or
VBScript scripts to scan a text file for glitches like these so they
can be corrected.
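
A typical scanner looks something like the sketch below: it flags lines
with an odd number of quote marks or an unexpected number of fields.
The file name and the expected field count (12) are just assumptions
for illustration.

    ' Sketch: report lines whose quote marks don't pair up, or whose
    ' field count looks wrong. Run with cscript to see the output.
    Dim fso, ts, line, n, quotes, i
    Set fso = CreateObject("Scripting.FileSystemObject")
    Set ts = fso.OpenTextFile("C:\data\big.csv", 1)   ' 1 = ForReading
    n = 0
    Do Until ts.AtEndOfStream
        line = ts.ReadLine
        n = n + 1
        ' Count the quote marks on this line.
        quotes = 0
        For i = 1 To Len(line)
            If Mid(line, i, 1) = """" Then quotes = quotes + 1
        Next
        If quotes Mod 2 <> 0 Then
            WScript.Echo "Line " & n & ": odd number of quote marks"
        End If
        ' Crude field count: only reliable when no field itself
        ' contains a comma inside quotes.
        If UBound(Split(line, ",")) + 1 <> 12 Then
            WScript.Echo "Line " & n & ": unexpected field count"
        End If
    Loop
    ts.Close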
Finally, it sometimes helps to pre-process the file before importing it
into Access (e.g. to drop records or fields you don't want, or to split
it into manageable chunks). Tools for this include Perl, VBScript, and
the GNU textutils (Windows versions downloadable from
http://unxutils.sourceforge.net/).
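
As one example of the splitting idea, here's a sketch that breaks a
large CSV into chunks of 500,000 data rows each, repeating the header
line at the top of every chunk. The file names and chunk size are
assumptions; adjust to taste.

    ' Sketch: split big.csv into big_part1.csv, big_part2.csv, ...
    ' with the header line repeated in each part.
    Const CHUNK = 500000
    Dim fso, ts, out, header, n, part
    Set fso = CreateObject("Scripting.FileSystemObject")
    Set ts = fso.OpenTextFile("C:\data\big.csv", 1)   ' 1 = ForReading
    header = ts.ReadLine
    n = 0 : part = 0
    Do Until ts.AtEndOfStream
        If n Mod CHUNK = 0 Then
            If part > 0 Then out.Close
            part = part + 1
            Set out = fso.CreateTextFile("C:\data\big_part" & part & ".csv", True)
            out.WriteLine header
        End If
        out.WriteLine ts.ReadLine
        n = n + 1
    Loop
    If part > 0 Then out.Close
    ts.Close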