How to import fast

  • Thread starter: PF

I am reading a huge text file and transferring it into Access,
into five tables.

I read a line, parse it, then create the record with a recordset
(opened with no type parameter, so I suppose it is a dbOpenTable
recordset) using commands like rst.AddNew.

It is slow. Is there a faster way, like a SQL statement or another
type of connection?

I am writing to a regular Access .mdb.

I cannot import with TransferText, since the file is a printer dump.
 
Hi PF,

It depends on the contents of the file. As you say "printer dump" I
assume it's got page headers and/or footers, column headers, blank
lines, and formfeeds. If the actual data lines are all in the same
format, it's usually possible to filter out all the non-data lines at
the textfile stage (most quickly done with Perl, awk or whatever),
leaving a fixed-width textfile which TransferText can handle.
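For instance, assuming a made-up dump where every real data line starts with a numeric record ID while headers, blank lines, and totals do not (the layout here is invented purely for illustration), a one-line awk filter could do the job:

```shell
# Keep only data lines from a hypothetical printer dump: real records
# start with a numeric ID; the page header, blank line, and totals do not.
# (The pattern is illustrative -- adjust it to the real dump layout.)
printf '%s\n' 'ACME CORP   PAGE 1' '' '00123 WIDGET  9.99' '00124 GADGET 19.99' 'TOTAL        29.98' > dump.txt

awk '/^[0-9]+ /' dump.txt > data.txt   # fixed-width lines, ready for TransferText
cat data.txt                           # prints the two 00123/00124 records
```

The result is a clean fixed-width file that an import specification can describe, so TransferText does the rest.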

On the other hand if the textfile contains groupings that you have to
preserve (e.g. if it's a printout of invoices or orders, each with
several header lines, a variable number of detail lines, and footer
lines for tax and totals) there's no escape from reading the file line
by line and parsing each line. In this situation it's sometimes worth
using textfile tools to parse the file and re-assemble it into two or
more simple delimited files ready for Access to import, but if you've
already written working VBA code probably the best thing to do is start
it running and head for the pub.
 
Thanks for the answer.

I think in my case parsing is the best approach; what I want to know
about is the point where the data is added to Access.

Is there something faster than rst.AddNew (adding to a recordset)?

Is there a way to insert, say, 1000 records in one INSERT?

Before, I was generating a flat text file and then importing it.

Now I don't generate the text file, since I build the tables in Access
directly while normalising them, using Seek to find already-transferred
data, etc. It takes a long time and I would like to improve the speed.

Regards

pf
 
AFAIK the only way you can insert a bunch of records into a table in a
single operation is with an append query - which requires you to have
the data in tables already.

An alternative to creating a record with rst.AddNew and then setting the
values of each field is to build and execute a single-record append
query
INSERT INTO target [(field1[, field2[, ...]])]
VALUES (value1[, value2[, ...]])
for each record.
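A rough VBA sketch of that per-record append (the table, field, and variable names here are invented for illustration; Replace requires Access 2000 or later):

Dim db As DAO.Database
Dim strSQL As String
Dim lngPartNo As Long, strDescr As String, curPrice As Currency

Set db = CurrentDb()
' Build one append query per parsed line; Replace doubles any
' embedded single quotes so the text value stays valid SQL.
strSQL = "INSERT INTO tblTarget (PartNo, Descr, Price) " & _
         "VALUES (" & lngPartNo & ", '" & _
         Replace(strDescr, "'", "''") & "', " & curPrice & ")"
db.Execute strSQL, dbFailOnError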

I don't know whether this would be faster than recordset operations.



 
I don't really see any way to improve the speed using DAO. dbOpenTable
is the fastest of the recordset types, and Seek is the fastest method
to access the data. It sounds like you are checking for the existence
of the data first (which is a GOOD thing) before importing it, so that
has to take some time too.

If you are not already using a With block:

With rs
    .AddNew
    ' (set the field values, then)
    .Update
End With

that might speed things up a bit.

Also, if the tables are heavily indexed, you might experiment with
removing the indexes (aside from the primary key and whatever you are
Seeking on), importing the data, then rebuilding the indexes. That is
sometimes faster.
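In DAO that drop-and-rebuild could look roughly like this (the table and index names are placeholders, and the indexed field is assumed to be a single text column):

Dim db As DAO.Database
Dim td As DAO.TableDef
Dim idx As DAO.Index

Set db = CurrentDb()
Set td = db.TableDefs("tblTarget")

' Drop a secondary index before the bulk load...
td.Indexes.Delete "idxDescr"

' ...run the import here...

' ...then rebuild the index afterwards.
Set idx = td.CreateIndex("idxDescr")
idx.Fields.Append idx.CreateField("Descr")
td.Indexes.Append idx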

Lastly, I've heard claims that ADO is faster, but I cannot verify that.
Anybody else know?

--
--Roger Carlson
www.rogersaccesslibrary.com
Reply to: Roger dot Carlson at Spectrum-Health dot Org

PF said:
Thanks for the answer.

I think in my case parsing is the best approach; what I want to know
about is the point where the data is added to Access.

Is there something faster than rst.AddNew (adding to a recordset)?

Is there a way to insert, say, 1000 records in one INSERT?

Before, I was generating a flat text file and then importing it.

Now I don't generate the text file, since I build the tables in Access
directly while normalising them, using Seek to find already-transferred
data, etc. It takes a long time and I would like to improve the speed.

Regards

pf
 
I'll try the With block. Good idea, thanks.



Roger Carlson said:
I don't really see any way to improve the speed using DAO. dbOpenTable
is the fastest of the recordset types, and Seek is the fastest method
to access the data. It sounds like you are checking for the existence
of the data first (which is a GOOD thing) before importing it, so that
has to take some time too.

If you are not already using a With block:

With rs
    .AddNew
    ' (set the field values, then)
    .Update
End With

that might speed things up a bit.

Also, if the tables are heavily indexed, you might experiment with
removing the indexes (aside from the primary key and whatever you are
Seeking on), importing the data, then rebuilding the indexes. That is
sometimes faster.

Lastly, I've heard claims that ADO is faster, but I cannot verify that.
Anybody else know?
 