Maintaining DB Concurrency in ADO.NET

  • Thread starter: Babz

Babz

Hi,
How can I prevent concurrent updates of a record in .NET?
I am using a DataSet, which is disconnected from the data source. In this
situation, if two users open the same record, modify the data, and update
one after the other, my data won't be in a correct state. How can I prevent
this? Are there any locking techniques?


Need Advice

Babz
 
4 choices:

1- Do updates by primary key only and take a last-in-wins approach
2- Do updates with all the columns in your where clause
3- Do updates with only those columns modified in your where clause
4- Do updates with the primary key and a timestamp column

I suggest #4

Now, is there a way to 'prevent' this built into ADO.NET? Not really; you
will need to manage this yourself. If you use approach 4, you will have to
make a business decision on what to do if the timestamp has changed since
you last selected the record.
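
As a rough illustration of approach 4 (in C#), the sketch below includes the
rowversion value you read earlier in the WHERE clause and checks how many rows
the UPDATE touched. The 'Orders' table, its 'RowVersion' column, and the
parameter names are assumptions for the example, not anything from your schema.

using System.Data;
using System.Data.SqlClient;

class TimestampUpdateSketch
{
    // Returns true if the update succeeded, false if someone else changed the
    // row after it was read (the stored rowversion no longer matches).
    static bool UpdateOrder(SqlConnection conn, int orderId, string newStatus,
                            byte[] originalRowVersion)
    {
        const string sql =
            "UPDATE Orders SET Status = @Status " +
            "WHERE OrderId = @OrderId AND RowVersion = @OriginalRowVersion";

        using (SqlCommand cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@Status", newStatus);
            cmd.Parameters.AddWithValue("@OrderId", orderId);
            cmd.Parameters.Add("@OriginalRowVersion", SqlDbType.Timestamp).Value =
                originalRowVersion;

            // Zero rows affected means the primary key still exists but the
            // timestamp changed: someone else saved first.
            return cmd.ExecuteNonQuery() == 1;
        }
    }
}

If ExecuteNonQuery reports zero rows, that is exactly the point where the
business decision mentioned above has to be made.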
 

All of these are flawed, because they ALL result in loss of work and time
spent: either the time/work of the person who saved the data currently in
the table(s), OR the time/work of the person who tries to save the data
he/she produced to tables which have changed.

See: http://weblogs.asp.net/fbouma/posts/7499.aspx for an article
about this.

The only way to really prevent loss of work is to lock
functionality in the GUI/BL so there is no way a user can START working on
something when another person is already working on that same data. THEN
concurrency control is not an issue and you will not lose any work/time.
All other mechanisms are bad excuses for concurrency control. I mean: what
will you display on the screen of the person who just spent some time
supplying some data, when he/she hits 'save' and it turns out some other
person has already locked/saved the data, so the person can't save what
he/she just typed in? 'Bad luck'? 'Sorry'? That's software design of the
1970s.
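
For illustration only, a simple way to sketch that idea (in C#) is an
application-level check-out flag the GUI/BL claims before it lets the user
start editing. The 'Orders' table and its 'CheckedOutBy' column are
hypothetical, and a real implementation would also need an expiry for locks
abandoned by crashed clients.

using System.Data.SqlClient;

class CheckOutSketch
{
    // Tries to claim the record for this user BEFORE the edit form opens.
    // Returns false if another user already holds it, so the UI can refuse to
    // start the edit instead of rejecting the save afterwards.
    static bool TryCheckOut(SqlConnection conn, int orderId, string userName)
    {
        const string sql =
            "UPDATE Orders SET CheckedOutBy = @User " +
            "WHERE OrderId = @OrderId AND CheckedOutBy IS NULL";

        using (SqlCommand cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@User", userName);
            cmd.Parameters.AddWithValue("@OrderId", orderId);
            return cmd.ExecuteNonQuery() == 1;   // 0 rows: someone else has it
        }
    }

    // Must be called when the user saves or abandons the edit.
    static void Release(SqlConnection conn, int orderId, string userName)
    {
        const string sql =
            "UPDATE Orders SET CheckedOutBy = NULL " +
            "WHERE OrderId = @OrderId AND CheckedOutBy = @User";

        using (SqlCommand cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@OrderId", orderId);
            cmd.Parameters.AddWithValue("@User", userName);
            cmd.ExecuteNonQuery();
        }
    }
}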

FB
 
I've been programming since the '70s and we've been addressing problems like
this for decades. I often suggest another approach: don't design apps that
permit two (or more) clients to work on the same row at the same time. For
example, in many situations you can prevent many collisions by making sure
that client X only has access to a subset of the rows. That is, make sure
that specific clients are assigned a specific geographic region or type of
customers. In some cases clients have access to the same row, but different
columns. This takes even more finesse than ADO.NET is capable of handling in
its present state.

I think it's more important to design intersections to prevent collisions
than worry about where to place the ambulances to pick up the pieces after a
crash.

If you start your design with this point of view, the course of action you
take when there is a collision is far clearer. No, this approach does not
work universally. But take a closer look at the real situations where
collisions occur. Why is the same row being updated by more than one client?
Ok, suppose that there are situations where there are opportunities for
multiple operations on the same row. In these cases, it makes sense to
monitor the server-side data more closely. This is where server-side cursors
come into play and these can be implemented using ANSI cursors with ADO.NET.
It means that you'll have to requery periodically to get the current state,
which is not easy with an ASP program but is simple with a WinForms
application. And if the server-side state changes, you still need to resolve
the conflict. The techniques already discussed are well established and
understood.
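
As a rough sketch of that periodic requery in a WinForms client (the 'Orders'
table, its 'RowVersion' column, the connection handling, and the 30-second
interval are all assumptions for illustration):

using System;
using System.Data.SqlClient;
using System.Linq;
using System.Windows.Forms;

class StalenessWatcher
{
    private readonly Timer _timer = new Timer();  // WinForms timer, ticks on the UI thread
    private readonly string _connectionString;
    private readonly int _orderId;
    private readonly byte[] _rowVersionAtLoad;

    public StalenessWatcher(string connectionString, int orderId, byte[] rowVersionAtLoad)
    {
        _connectionString = connectionString;
        _orderId = orderId;
        _rowVersionAtLoad = rowVersionAtLoad;

        _timer.Interval = 30000;                  // re-check every 30 seconds
        _timer.Tick += (s, e) => CheckForChanges();
        _timer.Start();
    }

    private void CheckForChanges()
    {
        using (var conn = new SqlConnection(_connectionString))
        using (var cmd = new SqlCommand(
            "SELECT RowVersion FROM Orders WHERE OrderId = @OrderId", conn))
        {
            cmd.Parameters.AddWithValue("@OrderId", _orderId);
            conn.Open();
            var current = cmd.ExecuteScalar() as byte[];

            // Null means the row was deleted; a different value means it was updated.
            if (current == null || !current.SequenceEqual(_rowVersionAtLoad))
            {
                _timer.Stop();
                MessageBox.Show("Another user has changed this record since you opened it.");
            }
        }
    }
}

This only detects the conflict earlier; as noted, the conflict itself still
has to be resolved with the techniques already discussed.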

Yes, you can use pessimistic locking, but this takes quite a bit of
discipline, as it can lock more than a single row. You also have to make sure
that the lock is released in a timely fashion to permit other legitimate
access. ADO.NET supports this approach as well, with transactions that use
Repeatable Read isolation. However, I've found that my designs often become
far more complex when they have to support client-managed locking or
transaction management.
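
A minimal sketch of that pessimistic option with a SqlTransaction at
RepeatableRead isolation follows; the table and column names are assumptions,
and the point to take away is keeping the transaction as short as possible so
the locks it holds don't block other users for long.

using System.Data;
using System.Data.SqlClient;

class PessimisticSketch
{
    static void UpdateUnderRepeatableRead(string connectionString, int orderId, string newStatus)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();
            using (SqlTransaction tx = conn.BeginTransaction(IsolationLevel.RepeatableRead))
            {
                try
                {
                    // The shared lock taken by this read is held until commit,
                    // so nobody else can change the row while we work with it.
                    var read = new SqlCommand(
                        "SELECT Status FROM Orders WHERE OrderId = @OrderId", conn, tx);
                    read.Parameters.AddWithValue("@OrderId", orderId);
                    string currentStatus = (string)read.ExecuteScalar();
                    // ... business logic based on currentStatus would go here ...

                    var update = new SqlCommand(
                        "UPDATE Orders SET Status = @Status WHERE OrderId = @OrderId", conn, tx);
                    update.Parameters.AddWithValue("@Status", newStatus);
                    update.Parameters.AddWithValue("@OrderId", orderId);
                    update.ExecuteNonQuery();

                    tx.Commit();   // release the locks as soon as possible
                }
                catch
                {
                    tx.Rollback();
                    throw;
                }
            }
        }
    }
}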


hth

--
____________________________________
Bill Vaughn
MVP, hRD
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
__________________________________
 

Excellent article, Bill. Couldn't have said it better.

FB
 
Hi,
Thanks to everyone.
The article by Frans Bouma is really helpful. If the debate continues we
can get more valuable suggestions...


Babz
 
I would still nuance this assessment a bit. IMO the locking scheme should
also be chosen with your app in mind. For example, an app in which inputs
are taken from paper forms may work well with optimistic locking, because
"locking" is done by the real-world work organization, and an application
locking scheme would probably be a bit over-engineered in this case. This
is not only a technical thing (and yes, IMO there is no "best" locking
mechanism).

Patrice
