Serialization/Compression

Bala Nagarajan

Hello,

We are developing a Windows Forms application that allows users to take a
snapshot of some database tables, save it into another set of tables (called
model tables), and work with it. Since the data that goes into the model
tables is large (on the order of 30,000 records per table), we anticipate
running out of database space as more users start hitting our application
database. To solve this problem I suggested splitting the records that go
into the model tables into chunks of, say, 5,000 records, binary-serializing
each chunk, compressing it, and storing the compressed data in a BLOB field
in the database. The application will of course have to do the reverse:
decompress, deserialize, and then render the data in the GUI. This process
incurs overhead because of the intermediate operations, but I thought it was
worth implementing since it can save us at least 60-70% of the space, which I
think is pretty significant. Also, retrieving 6 rows (one per chunk, instead
of 30,000 rows) from a table that stores the data in serialized form should
be much more efficient and faster.
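
For what it's worth, here is a rough sketch of the pack/unpack step I have in
mind, assuming a hypothetical serializable Record class and using
BinaryFormatter with GZipStream; this is only an illustration of the
chunk-serialize-compress idea, not anything we have built yet.

using System;
using System.Collections.Generic;
using System.IO;
using System.IO.Compression;
using System.Runtime.Serialization.Formatters.Binary;

[Serializable]
public class Record              // hypothetical model row; replace with the real type
{
    public int Id;
    public string Name;
}

public static class ChunkStore
{
    // Serialize a chunk of records with BinaryFormatter and gzip the result
    // so it can be stored in a BLOB column.
    public static byte[] Pack(List<Record> chunk)
    {
        using (MemoryStream buffer = new MemoryStream())
        {
            using (GZipStream gzip = new GZipStream(buffer, CompressionMode.Compress, true))
            {
                new BinaryFormatter().Serialize(gzip, chunk);
            }   // closing the GZipStream flushes the compressed bytes into buffer
            return buffer.ToArray();
        }
    }

    // The reverse: decompress the BLOB and deserialize it back into records.
    public static List<Record> Unpack(byte[] blob)
    {
        using (MemoryStream buffer = new MemoryStream(blob))
        using (GZipStream gzip = new GZipStream(buffer, CompressionMode.Decompress))
        {
            return (List<Record>)new BinaryFormatter().Deserialize(gzip);
        }
    }
}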

I just want to know if this approach is a good solution. Please let me know
if there is a better way of resolving this issue. The downside of this
approach shows up when the user modifies the data. The problems are as follows.

1. If the user has edited the data, I will have to find out which chunk he
has modified and serialize and compress only that portion of the data. I
don't want to re-serialize every chunk when the user modifies only one of
them. Though I can use some kind of identifier to track each chunk (see the
sketch after this list), the process may be cumbersome.

2. Even if the user modifies just one record, I will have to serialize and
compress 5,000 records no matter what, which is kind of bad.
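
To make problem 1 concrete, here is the kind of dirty-chunk tracking I was
thinking of; ModelEditor, ChunkSize, and the reuse of the hypothetical
ChunkStore helper from the sketch above are all assumptions on my part.

using System.Collections.Generic;

// Tracks which 5,000-record chunks have been touched so only those get
// re-serialized and re-compressed; untouched BLOBs stay as they are.
public class ModelEditor
{
    private const int ChunkSize = 5000;
    // chunkId -> records, assumed to have been loaded via ChunkStore.Unpack
    private Dictionary<int, List<Record>> chunks = new Dictionary<int, List<Record>>();
    private Dictionary<int, bool> dirtyChunks = new Dictionary<int, bool>();

    // Map a record's overall position to its chunk and mark that chunk dirty.
    public void UpdateRecord(int recordIndex, Record newValue)
    {
        int chunkId = recordIndex / ChunkSize;
        chunks[chunkId][recordIndex % ChunkSize] = newValue;
        dirtyChunks[chunkId] = true;
    }

    // Re-pack only the chunks that changed and return them keyed by chunk id,
    // ready to be written back to their BLOB rows.
    public Dictionary<int, byte[]> PackDirtyChunks()
    {
        Dictionary<int, byte[]> packed = new Dictionary<int, byte[]>();
        foreach (int chunkId in dirtyChunks.Keys)
        {
            packed[chunkId] = ChunkStore.Pack(chunks[chunkId]);
        }
        dirtyChunks.Clear();
        return packed;
    }
}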

I am not sure how to tackle these problems and would greatly appreciate
your help.

Thanks a lot for the help.

Bala
 
Here are some questions.

1) Will the snapshots exist forever or only while they are being worked
on? Will you be able to delete the snapshots at some point in time?
2) How many users are expected to make snapshots at around the same time?
3) Are the snapshots only for viewing or will the user be making changes as
well?
4) If the user can make changes to a snapshot, where will those changes
persist? Will they be pushed back to the main table that the snapshot came
from?
 
I wrote a smart-client multiuser application that handles much less
data, but it does serialize individual data on the client as XML.

I find that to be a more balanced approach, architecturally.
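
Roughly speaking, it does something like the following; the DataSet usage and
file path here are placeholders for illustration, not the actual code from my
application.

using System.Data;

// Sketch: persist the working data locally as XML on the client instead of
// pushing large serialized blobs back to the database.
public static class LocalCache
{
    public static void Save(DataSet workingSet, string path)
    {
        // WriteSchema keeps the column types so the file round-trips cleanly.
        workingSet.WriteXml(path, XmlWriteMode.WriteSchema);
    }

    public static DataSet Load(string path)
    {
        DataSet workingSet = new DataSet();
        workingSet.ReadXml(path);
        return workingSet;
    }
}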
 
Peter,
Thanks for responding. Let me know if you have more questions. I really
appreciate your time.

1) Will the snapshots exist forever or only while they are being worked
on? Will you be able to delete the snapshots at some point in time?
The snapshots will be refreshed every month by a batch process.

2) How many users are expected to make snapshots at around the same time?
Around 50.

3) Are the snapshots only for viewing or will the user be making changes as
well?

The snapshot is for viewing only, but the users can create a copy of the
snapshot (called a model in our world). The users can insert/update/delete
data at the model level.

Thanks
Bala
 
Hi

We generally do not recommend compressing whole tables and storing them in
the database, because that makes the data hard to maintain. If even a minor
error occurs in a compressed package, the whole package commonly becomes
unusable. That is why we store the data in the database and let the database
store and maintain it; the database has special mechanisms for maintaining
the data, including backup.

I am also curious why you need to take a snapshot of the tables. If the
snapshot is for viewing only, you can query the database directly.
If the users will make changes to the model, will the model be updated back
into the database?
If not, why not use a DataSet directly, as sketched below?
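
For example, something along these lines could fill a DataSet straight from
the source tables; the connection string and "SourceTable" name are
placeholders only.

using System.Data;
using System.Data.SqlClient;

// Sketch: load the source data into an in-memory DataSet and let the client
// work on that copy, without the serialize/compress step.
public static class SnapshotLoader
{
    public static DataSet LoadSnapshot(string connectionString)
    {
        DataSet model = new DataSet("Model");
        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            // "SourceTable" stands in for one of the snapshot source tables.
            SqlDataAdapter adapter = new SqlDataAdapter(
                "SELECT * FROM SourceTable", connection);
            adapter.Fill(model, "SourceTable");
        }
        return model;
    }
}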

Best regards,

Peter Huang
Microsoft Online Partner Support

Get Secure! - www.microsoft.com/security
This posting is provided "AS IS" with no warranties, and confers no rights.
 