Importing a wide .csv file

I have a comma separated text file with more than 256 commas in each row.
Since this exceeds the column/field limit in Excel/Access, I'm unable to
import the entire file. Do you have any tips to get around that? Thanks.
 
You'll either have to split the file coming in, or else use VBA to import
the file line by line, and store the data appropriately.
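Here's a rough sketch of the line-by-line approach in VBA. It assumes the file is at C:\wide.csv (adjust the path); what you do with each value depends on where you want the data to end up:

Sub ImportWideCsv()
    Dim f As Integer
    Dim rawLine As String
    Dim fields() As String
    Dim i As Long

    f = FreeFile
    Open "C:\wide.csv" For Input As #f      ' path is just an example
    Do While Not EOF(f)
        Line Input #f, rawLine              ' read one line of the file
        fields = Split(rawLine, ",")        ' one array element per comma-separated value
        For i = LBound(fields) To UBound(fields)
            ' store fields(i) appropriately, e.g. write it to a table or worksheet
        Next i
    Loop
    Close #f
End Sub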
 

There's a good chance that importing it straight into more than 256 columns is not
the best way to go.
Most likely you'd end up with a database that isn't properly normalized.
Are you familiar with normalization? If not, you should familiarize yourself
with it.
It may be better to split your data into several tables, or to use a single tall
table with three columns like this:

ID (autonumber)
recordID (number) 'indicates line from text file
myvalue (number/text) 'the value between the commas in your file.

This way you're only using 3 columns but a lot of rows - which is the way a
database should be :-)
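If you go that route, a VBA sketch along these lines could fill the table. It assumes the table is called tblValues, the file is C:\wide.csv, and you're running it from Access with a DAO reference set; adjust the names to suit:

Sub LoadTallTable()
    Dim db As DAO.Database
    Dim rs As DAO.Recordset
    Dim f As Integer
    Dim rawLine As String
    Dim fields() As String
    Dim recNum As Long
    Dim i As Long

    Set db = CurrentDb
    Set rs = db.OpenRecordset("tblValues")  ' table name is just an example
    f = FreeFile
    Open "C:\wide.csv" For Input As #f      ' path is just an example
    Do While Not EOF(f)
        Line Input #f, rawLine
        recNum = recNum + 1                 ' line number in the text file
        fields = Split(rawLine, ",")
        For i = LBound(fields) To UBound(fields)
            rs.AddNew                       ' ID (autonumber) fills itself in
            rs!recordID = recNum
            rs!myvalue = fields(i)          ' the value between the commas
            rs.Update
        Next i
    Loop
    Close #f
    rs.Close
End Sub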
 
Hi Peter,

As others have said, the file can almost certainly be normalised into
two or more narrower tables which can be imported into Access.

If you have, or can install, Perl on your computer I can send you a
program that will convert your wide file into a tall narrow one that can
then be imported into normalised tables. (Perl is a free download from
www.activestate.com.)

Alternatively, it's possible to use standard text file tools from the
Unix world. For instance, the "cut" utility will extract selected fields
from one text file into another. Windows versions of these can be
downloaded free from http://unxutils.sourceforge.net.
 