I have a db with the data on the server and several stations doing data
entry. We had mysterious instances of missing data. I finally
discovered one station that had off-line files enabled. It was
apparently syncing its local copy with that on the server and,
depending on which was newer, sometimes the server copy would get
overwritten, sometimes the local copy would.
1. What would turn on off-line files? This user couldn't have found
that setting to save their life, much less have just stumbled on it.
And they're a very reasonable, reliable person who wouldn't just play
with settings or fail to mention if, for instance, some pop-up
suggested they turn on some feature unknown to them. Are there
applications that enable the feature on install? Does Windows enable
it itself if it thinks it's needed?
2. This is on a desktop that doesn't seem to have any connection
problems, and the network and server are always on (except for
maintenance reboot or such). Even if off-line files is enabled, why
would the local station ever revert to using local files when the live
files are available? The way I discovered this was that a balloon popped
up from the task bar telling me off-line files were in use. I immediately
checked if server files were available, and they were. (I've since
turned off the feature.)
I could see maybe the off-line feature fetching copies on shutdown or
on going to sleep, and checking which is most current on start up. But
if it's doing it right it should see that the server copy is at least
as current and so always work on that copy. There would never be a
reason for it to prefer to use the local copy when the server one is
available. So even if enabled, why would it use it?
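To make the failure mode above concrete, here is a toy sketch (not Windows' actual sync algorithm, just a hypothetical model) of a naive "newest copy wins" sync. It shows how a station that worked against a stale local copy, but saved it later, can silently overwrite the server copy and lose other stations' entries:

```python
# Toy model of a "newest copy wins" sync -- a hypothetical illustration,
# not the real Offline Files implementation. Each copy is a tuple of
# (modification_time, data). On reconnect, the sync keeps whichever copy
# has the later timestamp and overwrites the other.

def sync(server, local):
    """Return the (server, local) pair after a newest-wins sync."""
    winner = server if server[0] >= local[0] else local
    return winner, winner

# Normal case: the server copy is newest, so nothing is lost.
server = (100, "rows 1-50")
local = (90, "rows 1-40")
server, local = sync(server, local)
assert server == (100, "rows 1-50")

# Problem case: the station edited its stale local copy and saved it
# later. The local file is "newer" by timestamp even though it lacks
# rows entered by other stations, so those rows disappear on sync.
server = (100, "rows 1-50")          # includes other stations' entries
local = (110, "rows 1-40 + edit")    # stale data, but later timestamp
server, local = sync(server, local)
assert server == (110, "rows 1-40 + edit")  # rows 41-50 are gone
```

The point is that a timestamp comparison alone can't tell "newer and more complete" apart from "newer but stale", which is exactly how a conflict on a shared data file becomes silent data loss.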
Kind of scary to realize one little, obscure, unnoticed checkbox can
mess up so much data.
Thanks,
Tom