Neil,
The fact that it's a CSV file may complicate matters. Is it the kind of
CSV file in which text fields are delimited (qualified) by quote marks
and may contain commas?
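To see why that matters: any tool that treats every comma as a separator will mangle a quoted field containing one. A quick sketch with a made-up record (the data and field number are purely illustrative):

```shell
# A quote-delimited record whose second field contains a comma:
printf '1,"Smith, John",42\n' | cut -d, -f2
# cut splits on every comma, so this prints "Smith  (quote included),
# not the whole name "Smith, John"
```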
If not - i.e. if every comma is a field separator - just download the
GNU utilities from http://unxutils.sourceforge.net. They include 'cut',
which can extract specified fields from a delimited or fixed-width file,
and 'grep', which can extract lines that match a pattern and dump the
rest. Most of them have no limit on file size.
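A quick sketch of those two tools on a made-up comma-only file (the filenames, field numbers and pattern are just examples):

```shell
# Sample comma-only CSV (no quoted fields), purely for illustration:
printf 'id,name,dept,score\n1,alice,eng,90\n2,bob,ops,75\n' > sample.csv

# 'cut': keep only fields 1, 2 and 4 (1-based, comma-delimited)
cut -d, -f1,2,4 sample.csv

# 'grep -v': drop every line matching a pattern (here, the 'ops' rows)
grep -v ',ops,' sample.csv
```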
'cut' doesn't handle CSV files that may have commas embedded in the data.
One easy way to extract fields from these is with Perl (a free download
from www.activestate.com, among other places): the standard installation
includes the Text::ParseWords module, which can handle most CSV files.
Something like this executed at the Windows command prompt will do the
job:
C:\Temp>perl -wln -MText::ParseWords -e "print join ',', (parse_line ',', 1, $_)[0..5,29,40..80]" "Input.txt"
The list in the square brackets specifies which fields you want: 0 is
the first field, 0..5 is the first through sixth fields, and so on.
Once you're happy with the output you can either redirect it to a file
(e.g. append > "Output.txt" to the command) or have Perl edit the file
in place, leaving a backup copy of the original, by inserting -i.bak
before the -e.
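Both routes can be sketched with the GNU tools as well (filenames made up; GNU sed's -i.bak option behaves like Perl's):

```shell
# Made-up input file:
printf '1,alice\n2,bob\n' > Input.txt

# Route 1: redirect the command's output to a new file
cut -d, -f2 Input.txt > Output.txt

# Route 2: edit in place, leaving the original as Input.txt.bak
# (GNU sed's -i.bak works like Perl's -i.bak)
sed -i.bak 's/,/;/' Input.txt
```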
If it's a really gnarly file - e.g. one with linebreaks and/or escaped
quotes inside quoted fields - things get more complicated, but there's
probably a free Perl module available to handle it (Text::CSV, for
instance, is designed for exactly that).
John, thanks for that - it pretty much confirms my fears.
The text file does have dozens of fields of null or duff data, which is
really of no interest to me. Trouble is, I don't know how to strip this out
of the text file so that it can be imported. In one of life's cruel twists,
I can filter it out or delete it once I have the data in Access!
Can you recommend a text editor or a different database I can pre-edit
the data with?