Is there a limit to how many tables you can have in a dataset?

moondaddy

Is there a limit to how many tables you can have in a dataset? I have a
situation where it would be really advantageous to use one main dataset to
marshal various tables via a web service. The problem is that we have a
group of about 50 tables that are all very closely related, and we usually
need to send or receive about 1 to 3 tables at a time, but it can be any
combination of the 50. If we break them down into smaller datasets
(logical groupings of tables), then we will have to retrieve multiple
datasets, which would require several round trips, or serialize them all
into a blob and parse them out on the other side (messy). So, is there any
good reason why having 50 or so tables in a dataset schema would be a bad
idea?
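
To make the scenario concrete, here's a rough sketch of what I have in mind
(the class, method, and table names are made up, and FillTable just stands
in for whatever data access you use) - one web method that fills only the
requested tables into a single dataset:

using System.Data;

public class TableService
{
    // Returns one "master" dataset containing only the requested tables.
    // On an ASMX service this method would also carry a [WebMethod] attribute.
    public DataSet GetTables(string[] tableNames)
    {
        DataSet ds = new DataSet("Master");

        foreach (string name in tableNames)
        {
            DataTable table = new DataTable(name);
            FillTable(table);      // placeholder for your data-access code
            ds.Tables.Add(table);
        }

        return ds; // the web service serializes the DataSet to XML for us
    }

    private void FillTable(DataTable table)
    {
        // Hypothetical: e.g. new SqlDataAdapter(query, conn).Fill(table);
    }
}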

Thanks
 
The limit is 2 billionish - whatever the highest value of an Int32 is - but
you will probably run out of memory way before that.

Try not to send TOO MUCH data through a web service, and try not to hold TOO
MUCH data in a dataset.

Logical question - what is TOO MUCH? That's subjective and depends on a lot
of factors - just remember, a dataset is not a database :)

One good reason NOT to specify 50 tables in a dataset schema - assuming that
when you say schema you mean the XSD that generates the strongly typed
dataset - is the insanely complex code and super insanely complex
constructor, along with all the event hooking it has to do, every time the
dataset is instantiated.

You might want to look into a custom business object for this purpose IMO.
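
For example, something along these lines (the names here are hypothetical;
shape the classes to your own data) - plain serializable classes that
XmlSerializer, which ASMX web services use, can return directly, with none
of the typed dataset's construction and event-hookup overhead:

using System;

[Serializable]
public class OrderSummary   // hypothetical shape - substitute your fields
{
    public int OrderId;
    public DateTime OrderDate;
    public decimal Total;
}

[Serializable]
public class OrderBatch
{
    // XmlSerializer handles public fields and arrays out of the box,
    // so a web method can return an OrderBatch directly.
    public OrderSummary[] Orders;
}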

- Sahil Malik
http://codebetter.com/blogs/sahil.malik/
 
We have several DataSets that contain more tables than that (I think one had
about 75 tables), we pass them as XML via Web Services, and we have not
noticed anything bogging down because of it. So, no, I don't think it's a
problem.
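
If you're curious what actually goes over the wire, here's a quick sketch of
the XML round trip (the "Items" table is just a placeholder):

using System;
using System.Data;
using System.IO;

class XmlRoundTrip
{
    static void Main()
    {
        DataSet ds = new DataSet("Master");
        DataTable t = ds.Tables.Add("Items");   // placeholder table
        t.Columns.Add("Id", typeof(int));
        t.Rows.Add(1);

        // Write the data plus its schema, the way a web service would.
        StringWriter writer = new StringWriter();
        ds.WriteXml(writer, XmlWriteMode.WriteSchema);

        // Rebuild the dataset on the receiving side.
        DataSet received = new DataSet();
        received.ReadXml(new StringReader(writer.ToString()));
        Console.WriteLine(received.Tables["Items"].Rows.Count); // prints 1
    }
}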

~~Bonnie
 
Thanks for the feedback. Yes, we're using a little less than 2 billion
tables. I recognize that it does take longer to instantiate a dataset in
the designer, but I haven't noticed any difference at run time. Also, there
aren't tons of users hitting the server at any given time, so lots of these
datasets won't be created at once. So I guess it's a matter of whether it's
really a performance issue vs. a convenience. Strongly typed datasets sure
are a nice way to package multiple blocks of data for marshaling. I
appreciate your comments.

--
(e-mail address removed)
 