Rich
Hello,
I have to read and write around one million records from
an external data source to SQL Server 2000 every night.
That's a lot of I/O, and my current VB6 app takes hours. I
connect to the external data source through APIs from its
object library (as opposed to ODBC, in which case I would
just use DTS to pull this data) and just loop over the
records. I was thinking that with VB.Net I could read this
data into a DataSet in memory (the box currently has 4 gigs
of mem - may upgrade to 16 gigs if they let me have Win2K
Advanced Server - currently 2 processors, 1 gig each). My
idea is that reading the data into memory (a DataSet) would
be much faster than writing each record out one at a time
(180 fields per record). Then, once the data is all in
local memory, I could just do a DataAdapter.Update or an
INSERT INTO to SQL Server. Any thoughts?
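Roughly what I have in mind is sketched below - the connection string, table name, and two columns are just placeholders, not my real 180-field schema, and the fill loop stands in for the object-library reads:

```vb
Imports System.Data
Imports System.Data.SqlClient

Module BulkLoadSketch
    Sub Main()
        ' 1) Fill a DataTable entirely in memory. This loop stands in
        '    for the external object-library API I actually read from.
        Dim table As New DataTable("NightlyStaging")
        table.Columns.Add("Id", GetType(Integer))
        table.Columns.Add("Payload", GetType(String))
        Dim i As Integer
        For i = 1 To 1000000
            table.Rows.Add(New Object() {i, "record " & i.ToString()})
        Next

        ' 2) Once everything is local, hand the whole table to a
        '    DataAdapter. Note: Update still issues one INSERT per row
        '    under the covers, so this saves the per-record read
        '    latency, not the per-record write round trips.
        Dim conn As New SqlConnection( _
            "Server=myServer;Database=myDb;Integrated Security=SSPI")
        Dim adapter As New SqlDataAdapter()
        adapter.InsertCommand = New SqlCommand( _
            "INSERT INTO NightlyStaging (Id, Payload) " & _
            "VALUES (@Id, @Payload)", conn)
        adapter.InsertCommand.Parameters.Add("@Id", SqlDbType.Int, 0, "Id")
        adapter.InsertCommand.Parameters.Add("@Payload", _
            SqlDbType.VarChar, 200, "Payload")
        conn.Open()
        Try
            adapter.Update(table)
        Finally
            conn.Close()
        End Try
    End Sub
End Module
```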
While I'm at it: the records come from 4 different sources,
and I was thinking about using multi-threading to pull the
data simultaneously. I am aware that SQL Server 2000 only
supports 4 gigs of mem. But if I have more than 4 gigs of
data, can one VB.Net app manage DataSets in more than 4
gigs of memory? Once I fill my DataSets, I would just do
one INSERT INTO at a time. How is VB.Net for
multi-threading? Again, in VB6 I invoke 4 separate apps
that simultaneously pull the data each night and write to
4 separate tables in SQL Server 2000. I would really like
to have all of this in one app and read/write directly to
memory. Is VB.Net a way to do this?
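For the multi-threading part, this is the kind of single-app structure I'm picturing in VB.Net - the source names and the body of Pull are placeholders for my real object-library calls:

```vb
Imports System.Threading

' One worker per external source; each pulls into its own DataTable
' so the four threads never share state while reading.
Class SourcePuller
    Private _sourceName As String

    Public Sub New(ByVal sourceName As String)
        _sourceName = sourceName
    End Sub

    Public Sub Pull()
        ' ... call the external object library for _sourceName here,
        '     filling a DataTable owned by this instance ...
        Console.WriteLine("finished pulling " & _sourceName)
    End Sub
End Class

Module ParallelPullSketch
    Sub Main()
        Dim names() As String = {"SourceA", "SourceB", "SourceC", "SourceD"}
        Dim threads(3) As Thread
        Dim i As Integer
        For i = 0 To 3
            Dim worker As New SourcePuller(names(i))
            threads(i) = New Thread(AddressOf worker.Pull)
            threads(i).Start()
        Next
        ' Block until all four pulls finish before starting the inserts.
        For i = 0 To 3
            threads(i).Join()
        Next
    End Sub
End Module
```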
Thanks,
Rich