Bernie Hunt
Has anyone seen any documentation or white papers with suggestions on how
to handle large datasets? My task is to read in 10K records and then step
through each one, processing its data. The processing involves fetching 3
other records that match the current one, manipulating the data, and then
writing out 5 records to a different database.
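To make the flow concrete, here is a rough C# sketch of what I have in mind. This is just an illustration, not my actual code; the table names, column names, and connection strings are all placeholders, and I'm assuming SQL Server via ADO.NET.

```csharp
using System.Data;
using System.Data.SqlClient;

class Processor
{
    static void Run(string sourceConn, string destConn)
    {
        // Pull the ~10K source records into a DataSet in memory.
        var source = new DataSet();
        using (var da = new SqlDataAdapter(
            "SELECT Id, Payload FROM SourceRecords", sourceConn))
        {
            da.Fill(source, "SourceRecords");
        }

        foreach (DataRow row in source.Tables["SourceRecords"].Rows)
        {
            // Fetch the 3 matching records for the current row.
            var related = new DataSet();
            using (var da = new SqlDataAdapter(
                "SELECT * FROM Related WHERE ParentId = @id", sourceConn))
            {
                da.SelectCommand.Parameters.AddWithValue("@id", row["Id"]);
                da.Fill(related, "Related");
            }

            // ... manipulate row + related data here ...

            // Write the 5 output records to the other database.
            using (var conn = new SqlConnection(destConn))
            {
                conn.Open();
                for (int i = 0; i < 5; i++)
                {
                    using (var cmd = new SqlCommand(
                        "INSERT INTO Output (ParentId, Seq) VALUES (@id, @seq)",
                        conn))
                    {
                        cmd.Parameters.AddWithValue("@id", row["Id"]);
                        cmd.Parameters.AddWithValue("@seq", i);
                        cmd.ExecuteNonQuery();
                    }
                }
            }
        }
    }
}
```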
A DataReader is out of the question, because this whole processing run could
take 10 or 20 minutes, so it looks like a DataSet. My concern is what
happens if a DataSet exceeds memory. Is it cached on disk in the swap file,
or does .NET handle its temp storage? I'd love to find some kind of
guideline on how large a DataSet can get without causing collateral damage
in the environment.
Thanks for any guidance.
Bernie