Text files to Excel to ACCESS DB

  • Thread starter: jay
Hi folks,

I hope this forum can take me through a situation I need to get resolved in
under 2 weeks (11/1/04).

I am new to VB (and VBA) so please bear with me. Additionally I just
purchased VB 6 Professional so I can get familiar with the Object Oriented
way of doing things and use it in this application.

Now... my situation.

I have 200 text delimited files that will be updated weekly. These files
define 100 sets, each made up of 2 text files.

I need to automatically call these files from a PC directory using Excel,
one set (2 text files) at a time. Once the set is imported I need to
generate an Excel file from each file and write it to a PC directory.

I then need to automatically call these Excel files from a PC directory into
ACCESS, one set (2 Excel files) at a time. Once the set is imported I need
to generate a database from these files. My guess at this point is the
database to be generated will be about 60 fields wide and maybe 60 records
deep. Once the new database has been constructed from the 2 Excel files, I
need to write the new database file to a new directory on the PC. Once the
main database for each of these sets has been created, I will then need
to use these 100 new databases to create one master database.

As noted above I have VB 6.0 so if the control needed to go from text files
to one master database can be done under VB 6, great! I do not want to
construct a database within VB 6 as I have been told it can be tough to
handle.

This is a lot to ask but I am really in a bind and any assistance with this
would be greatly appreciated.


Jay
Southern California
(e-mail address removed)
 
Hi Jay,

What you propose seems very convoluted. It appears that every week you
will receive 100 pairs of text files, and you intend to use them to
generate 200 Excel workbooks and 100 Access databases. Is this right? If
so, it's a *very* unusual requirement. At the end of a year you will
have 5200 databases, which is not something a sane person normally
wants.

If the goal is simply to be able to analyse the data from any of these
pairs of text files, the usual approach would be to create an
application that would read the data directly from the text files and
then process it. Depending on the size of the files and the kind of
analysis required, this could be done with Access, Excel, VB6, or many
other tools.

On the other hand, to store and analyse data from all the sets of files,
one would usually import all the data into a single database.

Without knowing more about your data there's not much more I can say.
 
Hi John,

It may appear convoluted but it's not. ACCESS has a problem reading files:
it scans only the first 20 or so records and assumes the rest of the file
is the same. Well, most of my files have few fields in the first few
records and data flows in fields 16-60 after line 25. I have taken the
files, pushed them through Excel, then into ACCESS, and it works every time
with the varying field widths. There is a thread out here about that problem
with no resolution except adding a dummy record 1 with 60 fields to each
text file, and that's not easy on 200 text files per week!

The quantity of databases (history) is very important. Our line of work
requires us to keep traceability, and this provides an ideal way of tracking
the week-to-week changes in the raw data. The ability to automatically
highlight a change in the raw data is very important, since those changes
are not documented in a way that would assist me and the team.

So, in a nutshell: 200 TEXT FILES --> 200 EXCEL FILES --> 100 ACCESS FILES
would be the flow. The good thing is that each file is really small.

Jay


 
Hi Jay,

If I understand correctly - Excel is only for 'correctly' converting the
data for importing into Access.

Also - do you really need 100 Access files?

I'm with John, so based upon what I gathered, here's another approach:

1) Create/copy a table that the raw data is imported into (only done once).

2) When importing, import into the 'RawData' table.
3) Then append this data to a single table WITH additional fields that
identify this batch (key data elements).
4) Then delete the contents (not the table) of the 'RawData' table.
5) Now you're ready to do it again.

This will result in 1 table with all your history to query and report on.
If you really require a single table for each week, then at #3, use a make
table query instead of an append query and use some key data element in the
table name.
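
As a rough illustration of steps 3 and 4, the code you run after each import
might look something like this. tblRawData, tblHistory and the field names
are hypothetical; substitute your own (and dbFailOnError needs a DAO
reference):

Dim strSQL As String
Dim strBatch As String

'key data element identifying this week's batch (here just the import date)
strBatch = Format(Date, "yyyy-mm-dd")

'step 3: append the freshly imported rows, tagged with the batch identifier
strSQL = "INSERT INTO tblHistory ( BatchID, Field1, Field2 ) " & _
         "SELECT '" & strBatch & "' AS BatchID, Field1, Field2 FROM tblRawData;"
CurrentDb.Execute strSQL, dbFailOnError

'step 4: clear the staging table ready for the next file
CurrentDb.Execute "DELETE FROM tblRawData;", dbFailOnError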

Hope that helps
SteveD


 
Hi Jay,

Adding a dummy record is very easy. Create a text file containing the
dummy record and call it DummyRecord.txt. Then in the loop of your code
that imports each text file, include something like this, where
strFileSpec specifies the file you want to import:

Dim strTempFile As String

strTempFile = Environ("TEMP") & "\MyData$$$.txt"

'copy dummy record plus textfile to temp file
Shell "CMD.exe /C COPY ""D:\Folder\DummyRecord.txt"" + """ _
    & strFileSpec & """ """ & strTempFile & """"

'Shell returns immediately, so wait for the temp file to appear
Do While Dir(strTempFile) = ""
    DoEvents
Loop

'import temp file
DoCmd.TransferText blah blah

'delete temp file
Kill strTempFile

It's been suggested in another thread that you should use MaxScanRows=0
in schema.ini to force the Jet import routine to examine the entire file
rather than just the first few rows. This is worth trying, though as
schema.ini stores the schemas by file name you will probably need to
copy or rename each input file to a fixed temporary name and location to
match the schema.ini entry before importing.
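
By way of illustration only (the delimiter and file name are assumptions on
my part), if each file were copied to the fixed name MyData$$$.txt used in
the code above, the schema.ini in that same folder might contain an entry
like:

[MyData$$$.txt]
Format=Delimited(|)
ColNameHeader=False
MaxScanRows=0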

But as I think I said in another thread, I don't understand your data;
you seem to have several record types in each text file, and that
normally requires a custom import procedure that reads the file line by
line, parses each line, and decides what to do with it.
 
Hi John,

Here's a cut/paste of a file...each record begins with a letter (H, R, P, W)

H|2ZZZ2201-0010||XXX ZZZZZ|PWR, XXX XXX, R|XXX|Wire Harness|XXXXXX|||||||||
R|2450P4/PL4|REFDES|Wire Harness|||||1|ea||M/W 2450J4/PL4||377.66|76.18|9.94|
R|2450P8/A2|REFDES|Wire Harness|||||1|ea||M/W 2450J8/A2||368.7|98.48|45.85|
R|9152P401|REFDES|Wire Harness|||||1|ea||42201 9152P401||245.679|79.755|15.597|
P|MS3459W14-3S|1|ea|2450P4/PL4|1|ea|*|
P|MS3459W16-12S|1|ea|2450P8/A2|1|ea|*|
P|MS3459W28-10P|1|ea|9152P401|1|ea|*|
P|XXXC03-28C|1|ea|9152P401|1|ea||
P|XXXC05-16C|1|ea|2450P8/A2|1|ea||
P|XXXC05-14C|1|ea|2450P4/PL4|1|ea||
P|XXX21-2|570|mm|||||
P|AA52084A3|1|ea|||||
P|YYY-MFZ21FR1|1|ea|||||
P|QQ591631I0010|1|ea|||||
P|YYY-ZC10175N45|1|ea|||||
P|KK23053/5-106-0|155|mm|||||
P|KK23053/5-107-9|50|mm|||||
P|YYYY-MF069CL1TY00|1|ea|||||
P|YYYYYY-MMKK991TY100|1|ea|||||
P|XXXX-4-9|185|in|||||
P|XXXXX-8-9|160|in|||||
W|42201-0003-|WIRE|Wire Harness|||1|ea||3|||||||9152P401|M39029/29-215|||||C|||||||||1|2450P8/A2|M39029/30-221|||||A|||||||||2|XXXY05-4-9|185|in|M23053/5-107-9|7|ea||||
W|42201-0004-|WIRE|Wire Harness|||1|ea||3|||||||9152P401|M39029/29-214|||||B|||||||||1|2450P4/PL4|M39029/30-220|||||A|||||||||2|XXXY05-8-9|160|in|||||||
X|2CWH42201-R001|1|ea|
 
Hi Jay,

The sample appears to contain five different kinds of record:
H 17 fields
R 17 fields
P 9 fields
W 58 fields
X 5 fields

Two things follow. First, "importing" it into a database table is
conceptually and practically a very dubious undertaking. Second, it's
not the kind of file that Access's text import wizard was designed to
handle.

One of the defining characteristics of a table in a relational database
system such as Access is that every record represents the same kind of
"entity" in the domain that the database is modelling (storing
information about) and therefore has the same structure. In the sample,
not only do some records have more fields than others; the same field
contains different information in different records (e.g. field 4 is a
part description in some records and a unit of quantity in others).

If you want to use Access to analyse the data, you need to get it into a
properly normalised structure of related tables (probably five tables for
the five kinds of records, plus possibly more if the other file in the
"set" you've previously referred to contains yet more kinds of records,
plus probably another table to keep track of the "sets").

It's perfectly possible to write Access VBA code that will read and
parse files like this and append the data to tables in a normalised
structure.
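
As a sketch only (the table name, field positions and SQL are illustrative
assumptions, not a finished routine), such a procedure might look like this:

'reads one text file line by line, splits on "|" and dispatches by record type
Sub ImportHarnessFile(strFileSpec As String)
    Dim intFile As Integer
    Dim strLine As String
    Dim varFields As Variant

    intFile = FreeFile
    Open strFileSpec For Input As #intFile
    Do While Not EOF(intFile)
        Line Input #intFile, strLine
        If Len(strLine) > 0 Then
            varFields = Split(strLine, "|")
            Select Case varFields(0)
                Case "H"
                    'append to the (hypothetical) header table; field picks are illustrative
                    CurrentDb.Execute "INSERT INTO tblHeader ( PartNo, Descr ) VALUES ('" & _
                        varFields(1) & "','" & varFields(4) & "');", dbFailOnError
                Case "R", "P", "W", "X"
                    'each of the other record types goes to its own table in the same way
            End Select
        End If
    Loop
    Close #intFile
End Sub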

If you don't want to use Access to analyse the data, I can't see why you
should ever want to put it into hundreds or thousands of tiny MDB
files.


 