Marcin Wiszowaty
Hello,
I am working with databases, DataSets, SQL Server 2000, VB.NET, and VS.NET 2003.
But this is more of a general question.
I like to do my data access by reading the whole DB table into a
DataSet (ds), doing searches inside that ds, changing the found data,
and then at program completion updating the ds back to the table in the DB. I like
this way because I can use IntelliSense by saying row.ColumnName and
don't have to worry about the SQL query language. I drag a data adapter onto the
designer, specify a SELECT query, and it generates the INSERT and UPDATE statements for
me.
Is this the way it should be done?
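For concreteness, my current pattern looks roughly like this (the connection string, table, and column names here are made up for illustration):

```vbnet
' Sketch of my current whole-table pattern (hypothetical names).
Imports System.Data
Imports System.Data.SqlClient

Module WholeTableSketch
    Sub Main()
        Dim conn As New SqlConnection("Server=.;Database=MyDb;Integrated Security=SSPI")

        ' Reads the ENTIRE table into memory.
        Dim da As New SqlDataAdapter("SELECT * FROM Orders", conn)
        Dim cb As New SqlCommandBuilder(da) ' generates INSERT/UPDATE/DELETE for me
        Dim ds As New DataSet()
        da.Fill(ds, "Orders")

        ' Search and change rows in memory (no SQL needed here).
        Dim row As DataRow
        For Each row In ds.Tables("Orders").Select("CustomerName = 'Smith'")
            row("Status") = "Processed"
        Next

        ' Push all changes back to the DB at program completion.
        da.Update(ds, "Orders")
    End Sub
End Module
```

The SqlCommandBuilder does the same job as the designer-generated commands: it builds the INSERT/UPDATE/DELETE from the SELECT, which only works because the table has a primary key.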
I am working on a project now that will potentially have millions of records
in that table. (After a while, hundreds of millions, but I am planning to move
the oldest million records into a history table; the history will probably never
be used.) I will then read this table into a DataSet and have to search
through it. The size will slow this down.
Is the better solution to read in smaller chunks and query the DB, say, 100
times per program run to keep the DataSet smaller, or to move older
information into a history table?
Should I have broken this big table into some 5 smaller tables with exactly the
same column names (about 30 columns in each table)?
As you can see, I have very little experience with this.
Another question: I am now using a regular int column as the primary
key. None of the other data is unique. I could have made a composite primary key
out of 4 other varchar columns. Should I have done that? The idea was that
the 4 columns would have slowed data access, as per an article I found online.
But now I am limited to 2^31 - 1 primary key values, and if I move the records
into the history table I will have to decrement this column for each record in
this table (which will always stay at about 1 million records). Does anyone know
of an easy way to do this? Tables without a primary key would make updates
much harder, and the queries would not be generated for me automatically by
Visual Studio.
Please give me your experienced input on this matter. Hopefully the length
of this post will not scare people off from reading it.
I will try to post all the helpful input on this topic to other sites,
because I believe this is important and I couldn't find the answers
elsewhere.
Thank you.