Limit in Dataset?

  • Thread starter: Guest

Guest

Hi,
Could somebody tell me whether there is a limit on the number of records that a
DataSet/DataTable can handle? I read in a book that a DataSet is not a
good option if you have more than 200 records.

Thanks
LJ
 
Internally, AFAIK, it uses an ArrayList, so there is no set hard limit; I think
the ArrayList resizes its storage dynamically as needed. The real consideration is the
available memory on your system: 200 tiny rows might not be a big deal, but
20 ultra-huge rows could be. It all depends.
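
A quick way to see this for yourself is to fill a DataTable and compare managed memory before and after. This is only a rough sketch; the row count, the column layout, and the use of GC.GetTotalMemory as a crude gauge are my own illustration, not anything the DataSet API prescribes:

using System;
using System.Data;

class MemoryDemo
{
    static void Main()
    {
        long before = GC.GetTotalMemory(true);

        // Hypothetical table: one int column and one string column.
        var table = new DataTable("Items");
        table.Columns.Add("Id", typeof(int));
        table.Columns.Add("Name", typeof(string));

        for (int i = 0; i < 100000; i++)
        {
            table.Rows.Add(i, "Item " + i);
        }

        long after = GC.GetTotalMemory(true);
        Console.WriteLine(table.Rows.Count + " rows occupy roughly "
            + ((after - before) / (1024 * 1024)) + " MB of managed memory");
    }
}

How many rows is "too many" then comes down to what that figure looks like against the memory you actually have, not a fixed cutoff.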

- Sahil Malik
http://dotnetjunkies.com/weblog/sahilmalik
Please reply to the newsgroups instead of email so everyone can benefit from
your reply.
 
User "LJ" said:
Hi,
Could somebody tell me whether there is a limit on the number of records that a
DataSet/DataTable can handle? I read in a book that a DataSet is not a
good option if you have more than 200 records.

Maybe 200 thousand?
I suppose that the answer to that question depends on many factors, for example:

1. hardware: CPU, RAM, network bandwidth,
2. what kind of data you have in your DataTable and how large one record is (number of columns),
3. type of application (web or WinForms),
4. acceptable delay when filling the dataset,
5. number of concurrent users of your database,
etc.

I hope it helps.
Grzegorz
 
Hi,

I believe a DataTable could hold up to 2,147,483,647 rows. I have a dataset
with 80,000 rows in it and it works fine. That does not mean it is a good idea
to load a dataset with a whole table from the database, but in some cases
(like mine) you need to load data from an XML file rather than from the database,
and then you can end up with a big dataset. Performance depends on the design of
the application. If you retrieve data from the database, try to select
only the rows you really need. Once a big DataSet is loaded, it
works pretty fast even with thousands of rows in it. From my personal
experience, I was able to write code which joined DataTables in a DataSet
and produced around 3,500,000 joined rows in approximately 40 seconds.
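
For what it's worth, here is a minimal sketch of the kind of thing Val describes: loading a DataSet from an XML file and relating two tables on the client. The file name, table names, and column names (orders.xml, Customers, Orders, CustomerId, Name) are made up for illustration, not taken from Val's actual code:

using System;
using System.Data;

class XmlLoadDemo
{
    static void Main()
    {
        var ds = new DataSet();

        // Hypothetical file containing "Customers" and "Orders" tables.
        ds.ReadXml("orders.xml");

        // Relate the two tables in memory - effectively a client-side join.
        DataRelation rel = ds.Relations.Add(
            "CustomerOrders",
            ds.Tables["Customers"].Columns["CustomerId"],
            ds.Tables["Orders"].Columns["CustomerId"]);

        foreach (DataRow customer in ds.Tables["Customers"].Rows)
        {
            DataRow[] orders = customer.GetChildRows(rel);
            Console.WriteLine(customer["Name"] + ": " + orders.Length + " orders");
        }
    }
}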
 
Hi LJ,

Val's right.

Get your customers to have 60-80 GB hard drives and at least a gig of RAM.

I load 500,000 rows of a 25-column table frequently - the original fill can
take some time (a minute or so), but there is no real problem beyond a
modest performance hit. Of course, it goes without saying that you should
filter the data - both vertically and horizontally - to the extent possible.

I'd say, safely, that a dataset/datatable with anything less than 100,000 average-size
rows (say 15 columns or so) is no problem at all.
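
As a concrete illustration of filtering both ways, here is a minimal sketch of filling a DataSet with only the columns and rows that are actually needed. The connection string, table, and column names are hypothetical:

using System;
using System.Data;
using System.Data.SqlClient;

class FilteredFillDemo
{
    static void Main()
    {
        // Hypothetical connection string and schema.
        const string connectionString =
            "Data Source=.;Initial Catalog=Sales;Integrated Security=True";

        // Vertical filter: only three columns. Horizontal filter: only recent rows.
        const string query =
            "SELECT OrderId, CustomerId, Total " +
            "FROM Orders " +
            "WHERE OrderDate >= @since";

        var ds = new DataSet();
        using (var connection = new SqlConnection(connectionString))
        using (var adapter = new SqlDataAdapter(query, connection))
        {
            adapter.SelectCommand.Parameters.AddWithValue("@since", new DateTime(2004, 1, 1));
            adapter.Fill(ds, "Orders"); // Fill opens and closes the connection itself.
        }

        Console.WriteLine(ds.Tables["Orders"].Rows.Count + " rows loaded");
    }
}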

HTH,

Bernie Yaeger
 
LJ,

Did you not read that in a lot of situations it is not a good option to make
datasets that hold more than 200 records?

It is best to keep the dataset as small as possible. Reasons include
concurrency, or when you have to transport it over the network.

Nevertheless, there are situations where that is not true.

Just my thought,

Cor
 
User "Cor Ligthert" said:
LJ,

Did you not read that in a lot of situations it is not a good option to
make datasets that hold more than 200 records?

It is best to keep the dataset as small as possible. Reasons include
concurrency, or when you have to transport it over the network.

Nevertheless, there are situations where that is not true.

OK, but I think that in situations where the data is read-only (for example, a data
mart filled overnight), the network is fast, and complex computations are performed
on the client side, using fat datasets may be a proper solution.

Regards,
Grzegorz
 
Since each DataTable contains a collection of DataRows, and indexes into a
collection are limited to Int32, that works out to at most 2,147,483,647 rows for
each table.
 