Hi all,
We're in the process of revamping our import process from a valuation system
(Fortran) to a reporting system (Access 2000) and trying to figure out some
new approaches. Currently, we have a system that writes out a single CSV
file with the first field of each row serving as an identifier for which table
that row's results should be slugged into. We then import this CSV file into a
temporary Access table and cycle through a DAO recordset, assigning each
record and its values to the appropriate table(s). One of the problems
we're running into is that the calculation engine and the reporting tool
have to maintain the same structure (i.e., ordinal position of the fields) or
the whole process falls apart. Some sample data:
.....
"A01",123456789,11111
"A02",987654321,22222
"A03","ABC"
.....
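For illustration, the routing logic amounts to roughly the following (a Python sketch rather than our actual DAO code; the identifier-to-table mapping here is made up to match the sample):

```python
import csv
import io

# Hypothetical mapping from row identifier to target table -- the real
# identifiers and table names come from the valuation system's layout.
ROUTE = {"A01": "tblData1", "A02": "tblData2", "A03": "tblData3"}

def dispatch(csv_text):
    """Group rows by the table their leading identifier routes them to."""
    out = {}
    for row in csv.reader(io.StringIO(csv_text)):
        table = ROUTE[row[0]]
        # The remaining fields are matched to columns purely by ordinal
        # position -- the fragility described above.
        out.setdefault(table, []).append(row[1:])
    return out

sample = '"A01",123456789,11111\n"A02",987654321,22222\n"A03","ABC"\n'
print(dispatch(sample))
```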
I know it is possible to open an XML file as an ADO recordset as long as the
fields are consistent throughout the file. As there are 12 tables the data
is fed to, individual XML files are not desirable - we would prefer to keep
it down to a single output file. One possible format (ignoring the
schema) would be:
.....
<z:row TblName="tblData1" FldName="Fld1" DataVal="123456789" />
<z:row TblName="tblData1" FldName="Fld2" DataVal="11111" />
<z:row TblName="tblData2" FldName="Fld1" DataVal="987654321" />
<z:row TblName="tblData2" FldName="Fld2" DataVal="22222" />
<z:row TblName="tblData3" FldName="Fld1" DataVal="ABC" />
.....
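To illustrate how rows in that shape could be consumed generically (each one names its own table and field, so column order no longer matters), here is a rough Python sketch; the namespace URI and root element are placeholder assumptions, since the real file would declare its own schema:

```python
import xml.etree.ElementTree as ET

NS = "urn:schemas-microsoft-com:rowset"  # placeholder binding for the z: prefix

doc = """<root xmlns:z="urn:schemas-microsoft-com:rowset">
  <z:row TblName="tblData1" FldName="Fld1" DataVal="123456789" />
  <z:row TblName="tblData1" FldName="Fld2" DataVal="11111" />
  <z:row TblName="tblData3" FldName="Fld1" DataVal="ABC" />
</root>"""

updates = {}
for row in ET.fromstring(doc).iter("{%s}row" % NS):
    # Each row is self-describing: table name, field name, and value
    # travel together, with no dependence on ordinal position.
    key = (row.get("TblName"), row.get("FldName"))
    updates[key] = row.get("DataVal")

print(updates)
```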
While this approach would seem to work, I'm not sure we have gained anything
in the long run other than removing our dependence on the import file's
underlying structure and bypassing the use of a temporary table. Is there
another approach (either XML or some other transfer method from a flat file
to Access tables) that anyone can suggest or has used in the past? I would
appreciate any comments/suggestions.
TIA,
Chris