DataSet and Xml performance problem

  • Thread starter: Andreas Schulz

Andreas Schulz

Hi,

in my Pocket PC application I have a Pocket Access database storing a lot of
data (about 10,000 – 60,000 rows per table). I use a web service to
synchronise the data, wrapped in XML via SOAP. My problem is that binding
the data from the database to a DataSet and then using the GetXml() method,
which returns the XML as a string, takes a lot of time (up to 15 or 20
minutes!). I also often get an OutOfMemoryException, because reading about
10,000 entries from the database cannot be handled.

So does anyone know how to handle or solve that problem? Would be great.

Thanks
 
Hi,

I think you are better off using a SQL CE database instead of an XML file, as
in your case the volume of data is pretty large.
Pocket Access is also not optimized for handling large amounts of data.

Girish
 
I don't think that is his problem.

I suffer a similar fate, but I do use SQL CE.

My data comes as a DataSet that is 5 MB in size, and I have to upload this
into a SQL CE database. Just loading the DataSet takes 35 minutes, and then
inserting it into the SQL CE database takes another 20 minutes. (Luckily we
don't have to do this often.)

I am actually considering parsing the DataSet by hand, i.e. using a
FileStream (or derived) class to chop my DataSet into smaller bits and then
load them as chunks.
Messy, but not as resource hungry.

You may want to do this with your web service and go for chatty calls rather
than chunky ones.
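The chatty, chunked pattern described above might look roughly like this. This is only a sketch: `SyncService`, `GetRowsPage` and `InsertRow` are hypothetical names invented for illustration, not real framework APIs; only `DataSet`, `DataTable` and `IDbConnection` come from the framework.

```csharp
// Sketch of the "chatty" sync loop: fetch and insert small pages of rows
// instead of one huge DataSet. SyncService, GetRowsPage and InsertRow
// are hypothetical names, not real framework APIs.
using System.Data;

class ChunkedSync
{
    const int PageSize = 20;   // small pages keep memory pressure low on the device

    public static void SyncTable(SyncService svc, IDbConnection conn)
    {
        int offset = 0;
        while (true)
        {
            // Each web service call returns at most PageSize rows.
            DataSet page = svc.GetRowsPage("Orders", offset, PageSize);
            DataTable rows = page.Tables[0];
            if (rows.Rows.Count == 0)
                break;                      // no more data to fetch

            foreach (DataRow row in rows.Rows)
                InsertRow(conn, row);       // one INSERT per row into SQL CE

            offset += rows.Rows.Count;
            page.Dispose();                 // free the page before the next call
        }
    }

    static void InsertRow(IDbConnection conn, DataRow row)
    {
        // build and execute an INSERT for this row (details omitted)
    }
}
```

The point of the loop is that only one small page lives in memory at a time, which is what avoids the OutOfMemoryException the original poster sees with a single giant DataSet.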
 

I don't know whether this will help you, but regardless...
I was using XML files as my data store; they were 3–4 KB in size and taking
7–8 seconds to load into the DataSet. So I stripped the inline schema from
the XML file, created the schema in code (using DataSet.Tables.Add etc.),
and then loaded the XML with the IgnoreSchema option, which brought my load
time down from 7 seconds to 3 seconds.

And IMO you should use this in conjunction with splitting the large chunk
into smaller chunks, rather than seeing it as a replacement.

-Vinay.
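Vinay's schema-in-code approach can be sketched like this. The file name, table and columns are made up for illustration; `DataSet.ReadXml` with `XmlReadMode.IgnoreSchema` is the real framework overload being described.

```csharp
// Build the DataSet schema in code, then load a schema-less XML file with
// XmlReadMode.IgnoreSchema so the loader skips schema parsing/inference.
// Table, columns and file name are made up for illustration.
using System;
using System.Data;

class LoadWithoutInlineSchema
{
    public static DataSet Load()
    {
        DataSet ds = new DataSet("Store");
        DataTable items = ds.Tables.Add("Items");
        items.Columns.Add("Id", typeof(int));
        items.Columns.Add("Name", typeof(string));
        items.Columns.Add("Price", typeof(decimal));

        // items.xml must contain only the data rows; the inline
        // <xs:schema> block has been stripped out beforehand.
        ds.ReadXml("items.xml", XmlReadMode.IgnoreSchema);
        return ds;
    }
}
```

With IgnoreSchema the reader maps rows onto the columns already defined in code, so stripping the inline schema from the file only works if the code-defined schema matches the data.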
 
As an update:

I did the split into small chunks (20 records) today and got my 5 MB (5,000
records) DataSet to load and insert into the database in half the time.

Reading the schema isn't a real problem for me, as I would only save a few
seconds, and the DataSets I am loading used to take 50 minutes (now 25).

I am a happy bunny today :)

Shaun
 