Chuck
I am working on an application for use on laptop computers (field use).
When the laptop is docked, the primary data source will be a remote dataset
hosted on the server which is connected to the database. When the laptop is
disconnected the application will use an XML cache of a subset of the data
to fill and save a local copy of the dataset. The application may be
started and stopped several times, and the laptop will be turned off and on
as well, which rules out any kind of in-memory caching.
Now, the problem is syncing the local dataset with the remote one. I don't
want to access the database directly, as it is in a state of flux, and I want
the remote service to act as a facade.
I will have added and modified records in the local dataset. Deletes are
generally not allowed. The remote dataset may have records that are
different from, or new to, the local copy.
The only thing I have come up with is to merge the local records that are
added or modified into the remote 'master' dataset then update the master
into the database. Afterwards, I can then merge the master back into the
local copy.
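For what it's worth, the merge flow I described could be sketched like this, with plain Python dicts keyed by record ID standing in for the real datasets (all names here are made up, and the actual database update through the service is left out):

```python
# Hypothetical sketch of the three-step sync: merge local adds/edits into
# the master, (update the database -- omitted here), then merge the master
# back into the local copy. Dicts stand in for dataset objects.

def merge_local_into_master(master, local_changes):
    """Step 1: apply locally added or modified records to the master."""
    merged = dict(master)
    merged.update(local_changes)  # local adds/edits win for those keys
    return merged

def sync(master, local_changes):
    merged = merge_local_into_master(master, local_changes)
    # Step 2: push 'merged' to the database via the remote service (omitted).
    # Step 3: merge the master back so the local copy also picks up records
    # that are new or different on the server side.
    new_local = dict(merged)
    return merged, new_local

master = {1: "server-edit", 2: "unchanged", 4: "server-new"}
local_changes = {2: "local-edit", 3: "local-new"}
merged, new_local = sync(master, local_changes)
# merged holds both sides' records; new_local mirrors it
```

Note that with this naive "local wins" merge, a record edited on both sides (ID 2 above) silently takes the local value; any real implementation would need a conflict policy here.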
This feels clumsy to me. Can anyone come up with a better way to do this
sync?
If the application were permanently connected to the server, I would not have
a local cache copy, so the problem would not arise.
Chuck