Text file import error

  • Thread starter: Bob

Bob

I'm using Access 2000 and trying to import a .txt file
that's more than 2 GB in size. But I get an error message
that says the file has no data. Is this because the .txt
file is too big? And if so, is there a way I can break it
apart and bring it in as separate tables in pieces?
 
You may need more than one database.... Access 2000 limits a database to 2 GB
total size. If your data is more than 2 GB, you won't be able to import it into
a single database.

You may need to look at using SQL Server or some other, more robust, option.
 
Bob said:
I'm using Access 2000 and trying to import a .txt file
that's more than 2 GB in size. But I get an error message
that says the file has no data. Is this because the .txt
file is too big? And if so, is there a way I can break it
apart and bring it in as separate tables in pieces?

There are a variety of ways. The simplest is to open the file in WordPad and
delete nine-tenths or so of it at an end-of-record delimiter, then save it as
db1.txt.
Open it again and keep only the second tenth; save this as db2.txt, and so on.

As mentioned, this will probably be too large to import. I say probably only
because it might be fixed-length with a lot of empty space.
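The hand-splitting described above can also be scripted, which avoids WordPad's need to load the whole file. A minimal Python sketch, assuming the db1.txt/db2.txt naming above and an arbitrary chunk size of 500,000 lines; it streams one line at a time, so memory use stays flat regardless of file size:

```python
# Split a large text file into numbered chunks (db1.txt, db2.txt, ...)
# without loading it into RAM: read one line at a time, switch output
# files every lines_per_chunk lines.

def split_file(path, lines_per_chunk=500_000):
    chunk = 0
    out = None
    with open(path) as src:
        for i, line in enumerate(src):
            if i % lines_per_chunk == 0:
                if out:
                    out.close()
                chunk += 1
                out = open(f"db{chunk}.txt", "w")
            out.write(line)
    if out:
        out.close()
    return chunk  # number of chunk files written
```

Each chunk keeps whole lines, so records are never cut mid-field as they might be when deleting by eye in an editor.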
 
Hi Bob,

I'm pretty sure that this is because the file is too big for Access's
import routines to handle. Depending on what's in the file, it may even
contain more data than can be contained in an Access database (max size
2GB, including tables, indexes and all).

But most big text files contain a lot of wasted space and/or redundant
data, and can be munged at the textfile level into something more
compact and importable. For instance, if it's a fixed-width file with
mostly empty fields, converting to a delimited format can dramatically
reduce its size. If you can get it down to, say, 1 GB total you can then
think in terms of splitting that into a dozen or more smaller files
that can be imported one at a time, giving you a big but not totally
unmanageable table. Minimise the number of indexes on this table while
you're importing.
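The fixed-width-to-delimited conversion suggested above can be done in one streaming pass. A sketch in Python rather than the Perl mentioned below; the column layout in FIELDS is invented for illustration and would have to match the real record layout:

```python
import csv

# Hypothetical fixed-width layout: (start, end) character positions of
# each field. Replace with the actual layout of the file.
FIELDS = [(0, 10), (10, 40), (40, 42), (42, 60)]   # id, name, state, notes

def fixed_to_csv(src_path, dst_path):
    """Stream a fixed-width file into CSV, stripping the padding spaces
    that account for most of the bulk of a sparse fixed-width file."""
    with open(src_path) as src, open(dst_path, "w", newline="") as dst:
        writer = csv.writer(dst)
        for line in src:
            line = line.rstrip("\n")
            writer.writerow(line[a:b].strip() for a, b in FIELDS)
```

Because it reads sequentially, a 2.8 GB input is no problem; only disk space for the (much smaller) output matters.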

Also, few big text files are fully normalised; if there are repeating
fields (e.g. a group of fields for each month of the year) or redundant
data (e.g. the same contact details being repeated in multiple records)
it's possible to normalise them into a separate table, further reducing
the amount of data.

In fact it's possible to normalise data at the textfile stage, into
multiple tables contained in separate smaller textfiles which can more
easily be imported.
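Normalising at the textfile stage can be sketched as a single pass that writes two smaller files: one holding each distinct contact once, and one holding the rest of each record plus a small contact key. A Python illustration; the field positions and file names are assumptions, not from the thread, and the dict holds only the distinct contacts, not the whole file:

```python
# One streaming pass: deduplicate a repeated field (e.g. contact details)
# into contacts.txt, and write the remainder of each record to facts.txt
# with an integer key linking the two.

def normalise(src_path, facts_path, contacts_path, contact_slice=slice(10, 40)):
    contact_ids = {}  # distinct contact -> key; only uniques kept in memory
    with open(src_path) as src, \
         open(facts_path, "w") as facts, \
         open(contacts_path, "w") as contacts:
        for line in src:
            line = line.rstrip("\n")
            contact = line[contact_slice].strip()
            if contact not in contact_ids:
                contact_ids[contact] = len(contact_ids) + 1
                contacts.write(f"{contact_ids[contact]},{contact}\n")
            rest = line[:contact_slice.start] + line[contact_slice.stop:]
            facts.write(f"{contact_ids[contact]},{rest}\n")
```

The two output files then import as two related tables, with the key as the join field.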

Tools for the above:
-Perl (http://www.activestate.com) for just about any kind of textfile
processing, including conversion from fixed width to csv

-Unix textutils such as "split" (one file into a number of shorter
ones), "cut" (extract a subset of fields), "nl" (add line numbers for a
primary key), "sort", and "uniq" (drop duplicated lines); Windows
versions at http://unxutils.sourceforge.net

Post back with more information about the file (but please don't attach
it to your message<g>).
 
It's a fixed width file, and I've tried to open it with
WordPad but to no avail. The file is actually 2.8 GB in size,
so nothing I've tried so far will open it. If I could get
it open, I'd probably do what someone suggested -- take the
part out of it I'm most interested in, delete the rest, and
save that as its own table. But I can't get it open. ... I
used to be familiar with a text editor program that could
open anything and save out a selection of about any size,
but I've forgotten its name.
-----Original Message-----
<snip>
--
John Nurick [Microsoft Access MVP]

Please respond in the newsgroup and not by email.
 
I've opened some pretty big files with Multi-Edit. They have a fully
functional demo that you could try:

http://www.multieditsoftware.com/demo.php

--
Cheryl Fischer
Law/Sys Associates
Houston, TX

<snip>
 
Bob,

If you're just interested in some of the records and can define them,
one or other of the textfile tools I mentioned above can extract them to
another file. Because they just read the file sequentially the size
isn't a problem as long as you have enough disk space.

For instance, to extract lines 500,000 to 600,000 to another file all it
takes is:

perl -ne"print if $. >= 500000 and $. <= 600000" file.txt > new.txt

Or suppose there's a State field in columns 80 and 81 of the fixed width
record; you can extract records for the west coast with something like

perl -ne"print if m/^.{79}(CA|OR|WA)/" file.txt > new.txt
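For anyone without Perl to hand, the same sequential filters can be written as a short Python sketch; it mirrors the two one-liners above under the same assumed layout (state code in columns 80 and 81, i.e. line[79:81]):

```python
def extract(src_path, dst_path, keep):
    """Copy to dst_path the lines for which keep(line_number, line) is
    true. Streams the file, so size is limited only by disk space."""
    with open(src_path) as src, open(dst_path, "w") as dst:
        for n, line in enumerate(src, start=1):
            if keep(n, line):
                dst.write(line)

# Lines 500,000 to 600,000:
#   extract("file.txt", "new.txt", lambda n, line: 500_000 <= n <= 600_000)
# West-coast records (state in columns 80-81):
#   extract("file.txt", "new.txt", lambda n, line: line[79:81] in ("CA", "OR", "WA"))
```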

<snip>

--
John Nurick [Microsoft Access MVP]
 
http://www.ultraedit.com/
is pretty good.

--
Joe Fallon
Access MVP



<snip>
 
I think I'm going to have to go back to the source and ask
them to chop up the file into smaller chunks and deal with
those separately, extract the records I need, then
reassemble them into my own table. ... My guess is there
are at least 5 million records, each with about 50 fields. I
found a text editor (Professional File Editor) that will
open it, but it chokes at just over 1 million records (of
course, this is a RAM problem, and though I have 512 MB of
RAM, it's apparently not enough). The records weren't
sorted prior to creation of the file. If they were, I could
do as someone suggested and just go in and get the specific
records I need, rows X through Y. But I can't do that until
they are sorted. And someone elsewhere suggested I use the
module function in Access to do the same thing, but again I
can't do that because the records aren't yet in order. ...
Thanks for all your help, but I'll just go back to the
vendor and reorder the data and ask that it be chopped up
into smaller files.
-----Original Message-----
<snip>

On Wed, 29 Oct 2003 19:02:30 -0800, "Bob"
<snip>

--
John Nurick [Microsoft Access MVP]

Please respond in the newsgroup and not by email.
 