Patterns anyone?


Jim Hubbard

I am looking for patterns for a distributed .Net application for a small
retail chain.

The owner wants the stores to have access to all data (no matter which store
it comes from) in real time. In the case of an internet outage, the store
owner would like for the individual stores to still have access to (at
least) all data from all stores as of the previous day.

This looks like a possible pain in the *........real-time data in N
locations and separate data stores in each location. The data concurrency
alone is looking scary......

Any suggestions or pointers to design patterns for such applications would
be greatly appreciated.

Thanks in advance!
 
Hello, Jim!

JH> The owner wants the stores to have access to all data (no matter which
JH> store it comes from) in real time. In the case of an internet outage,
JH> the store owner would like for the individual stores to still have
JH> access to (at least) all data from all stores as of the previous day.

JH> This looks like a possible pain in the *........real-time data in N
JH> locations and separate data stores in each location. The data
JH> concurrency alone is looking scary......

Data can be stored in a single store (a remote DB server). There is no need to put data at every location, as that will result in synchronization issues. The DB server can be made highly available using mirroring.

--
Regards, Vadym Stetsyak
www: http://vadmyst.blogspot.com
 
Mirroring will not make the data available when there is an Internet outage.
Some caching may indeed be necessary. This can be done by using DataSets or
a local database that can be synchronized with the server(s). And it will be
problematic, requiring a certain amount of transactional processing, and
handling for concurrency issues.
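
As a rough illustration of that DataSet-caching idea, something like the sketch below could work; the connection string, table name, and cache file path are placeholders, not a recommendation of any particular layout.

// Sketch only: fill a DataSet from the central server and persist it locally,
// so the store can keep reading yesterday's snapshot if the connection drops.
using System.Data;
using System.Data.SqlClient;

public class InventoryCache
{
    const string CacheFile = @"C:\StoreCache\inventory.xml";   // placeholder path

    public DataSet Load(string centralConnectionString)
    {
        DataSet ds = new DataSet("StoreData");
        try
        {
            using (SqlConnection cn = new SqlConnection(centralConnectionString))
            using (SqlDataAdapter da = new SqlDataAdapter("SELECT * FROM Inventory", cn))
            {
                da.Fill(ds, "Inventory");                         // live data from the central server
                ds.WriteXml(CacheFile, XmlWriteMode.WriteSchema); // refresh the local snapshot
            }
        }
        catch (SqlException)
        {
            // Outage: fall back to the most recent local snapshot.
            ds.ReadXml(CacheFile, XmlReadMode.ReadSchema);
        }
        return ds;
    }
}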

--
HTH,

Kevin Spencer
Microsoft MVP
Professional Numbskull

Show me your certification without works,
and I'll show my certification
*by* my works.
 
Hi,

No need for a pattern here, just the correct placement of the DB store. I will assume that you are using SQL Server.

What you can do is use a central server for all transactions; it should be fast enough for a small retail outlet if you are using business DSL.

You will also install a local server that will sync with the remote one from time to time.
If the remote server fails, you can switch to the local one and work with the latest data the local copy has.

If the DB operations in your app are located in the same place (like a static class), it's easy to switch: just change the connection string and you are ready; the rest of the system will work as usual. You may need to create a timer or some other mechanism to check for the restoration of the main server.
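
To make the switch concrete, here is a rough sketch of that idea: a static data-access class that hands out a connection to either the central or the local server, plus a timer that re-tests the central server. The server names, catalog name, and 30-second interval are assumptions for illustration.

using System.Data.SqlClient;
using System.Threading;

public static class Db
{
    // Placeholder connection strings.
    const string Remote = "Data Source=CENTRAL;Initial Catalog=Retail;Integrated Security=SSPI";
    const string Local  = @"Data Source=.\SQLEXPRESS;Initial Catalog=Retail;Integrated Security=SSPI";

    static volatile bool remoteUp = true;
    static Timer probe = new Timer(CheckRemote, null, 0, 30000);   // re-test every 30 s

    // Every data operation asks here for its connection, so failover is one line.
    public static SqlConnection GetConnection()
    {
        return new SqlConnection(remoteUp ? Remote : Local);
    }

    static void CheckRemote(object state)
    {
        try
        {
            using (SqlConnection cn = new SqlConnection(Remote))
            {
                cn.Open();       // succeeds once the central server is reachable again
                remoteUp = true;
            }
        }
        catch (SqlException)
        {
            remoteUp = false;    // keep working against the local server
        }
    }
}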

Then the other part is how to send the locally stored data to the remote server. For this you could add the remote as a linked server on the local one and update the remote tables from the local ones.
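
As a sketch of that push, with CENTRAL as the linked server name and the Sales table and SyncedAt column invented for the example (in practice you would wrap the two statements in a transaction):

using System.Data.SqlClient;

public static class SyncUp
{
    public static void PushSales(string localConnectionString)
    {
        // Copy rows captured while offline to the central server, then mark them as synced.
        const string sql = @"
            INSERT INTO CENTRAL.Retail.dbo.Sales (SaleId, StoreId, ItemId, Qty, SoldAt)
                SELECT SaleId, StoreId, ItemId, Qty, SoldAt
                FROM   dbo.Sales
                WHERE  SyncedAt IS NULL;
            UPDATE dbo.Sales SET SyncedAt = GETDATE() WHERE SyncedAt IS NULL;";

        using (SqlConnection cn = new SqlConnection(localConnectionString))
        using (SqlCommand cmd = new SqlCommand(sql, cn))
        {
            cn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}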
 
"Kevin Spencer"
Mirroring will not make the data available when there is an Internet
outage. Some caching may indeed be necessary. This can be done by using
DataSets or a local database that can be synchronized with the server(s).
And it will be problematic, requiring a certain amount of transactional
processing, and handling for concurrency issues.

--
HTH,

Kevin Spencer
Microsoft MVP
Professional Numbskull


The caching mechanism could be implemented with a remote proxy. If there's a connection, the proxy retrieves the data from the remote server; otherwise it retrieves data from the local cache.

How the cache is updated and synchronized is another story.
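
A minimal sketch of that proxy, with the interface and class names invented for the example:

using System.Data;

public interface IProductData
{
    DataTable GetStock(string sku);
}

public class ProductDataProxy : IProductData
{
    readonly IProductData remote;   // talks to the central server
    readonly IProductData cache;    // reads the local copy

    public ProductDataProxy(IProductData remote, IProductData cache)
    {
        this.remote = remote;
        this.cache = cache;
    }

    public DataTable GetStock(string sku)
    {
        try
        {
            return remote.GetStock(sku);   // connected: use live data
        }
        catch (System.Data.SqlClient.SqlException)
        {
            return cache.GetStock(sku);    // outage: serve the cached copy
        }
    }
}

Callers only ever see IProductData, so the fallback logic lives in one place.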

Padu
 
There are two ways to approach this.

The first has already been outlined by several here (Ignacio gave the
most detailed explanation): find a database that is intended to be used
in a distributed way and comes with replication and synching software.
Many database vendors will sell you replication along with their
databases.

Then write each store as a stand-alone system that simply replicates
data back to the central server.

The advantage here is that you don't have to write any code. So far as
your application is concerned, the database is always available.
Synchronization is the database's problem, not yours. Of course, you
have to think carefully about the consequences of working with stale
data and having the database blindly synchronize with head office when
a connection is restored, but that's the tradeoff.

The second way is to look at your store's system as a "Smart Client".
Microsoft's Patterns and Practices team has written quite a bit on how
to build Smart Client applications that run in a sometimes-connected
environment. Usually that sort of thing is intended for laptops and
PDAs, but there's no reason that an entire store system couldn't be
written as a Smart Client.

The advantage of this approach is that you retain total control over
what is synchronized back to head office and when. The disadvantage is
that you have to write a lot more code.

Check out Microsoft's Patterns and Practices "Smart Client" site:

http://msdn.microsoft.com/practices/apptype/smartclient/default.aspx

In particular, information on the Offline Application Block:

http://msdn.microsoft.com/practices...px?pull=/library/en-us/dnpag/html/offline.asp
 
Ignacio Machin ( .NET/ C# MVP ) said:
Hi,

No need for a pattern here, just the correct placement of the DB store. I will assume that you are using SQL Server.

What you can do is use a central server for all transactions; it should be fast enough for a small retail outlet if you are using business DSL.

You will also install a local server that will sync with the remote one from time to time.
If the remote server fails, you can switch to the local one and work with the latest data the local copy has.

If the DB operations in your app are located in the same place (like a static class), it's easy to switch: just change the connection string and you are ready; the rest of the system will work as usual. You may need to create a timer or some other mechanism to check for the restoration of the main server.

Then the other part is how to send the locally stored data to the remote server. For this you could add the remote as a linked server on the local one and update the remote tables from the local ones.


Thanks to everyone that has posted.

My personal choice would be to get business DSL service for each location
with a service level agreement that guarantees DSL service at a set % of
uptime and guarantees a set time limit for fixes when the connection does go
down - then just use a central server. But, that leaves the store down in
case of an internet outage (which the store owners admit is rare).

But, I am thinking of making this a simple app with a local server/db for
each store and add a synchronization feature that runs at a set interval.
The data won't exactly be real time, but doing real-time updates in
disconnected locations will require some type of message queue in case the
internet connection goes down between locations and you also have a fight on
your hands when dealing with concurrency.
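
For illustration, one way that local queue could look with MSMQ (System.Messaging); the queue path and the ChangeRecord type are only placeholders, and a service at the central site would drain the queue and apply the changes once connectivity returns.

using System;
using System.Messaging;

public class ChangeRecord
{
    public string Table;
    public string Key;
    public DateTime ChangedAt;
}

public static class ChangeQueue
{
    const string Path = @".\Private$\StoreChanges";   // local private queue (placeholder name)

    public static void Enqueue(ChangeRecord change)
    {
        if (!MessageQueue.Exists(Path))
            MessageQueue.Create(Path);

        using (MessageQueue q = new MessageQueue(Path))
        {
            q.Formatter = new XmlMessageFormatter(new Type[] { typeof(ChangeRecord) });
            q.Send(change);   // stored durably even while the internet connection is down
        }
    }
}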

I also cannot forget that the owner is planning on adding 6 stores this year. So, each new store needs a 'simple' way of catching up with all of the other stores before beginning processing on its own.

The questions I am trying to answer are .....

If a store is out of the loop for 1...n days, how do you apply its changes without overwriting newer DB changes already in the system? The only thing I can think of is a timestamp on each change. Does SQL Server do this internally and automatically?

What about SQL Server Master-Master relationships? Could I connect SQL
instances over the web and have them auto-update each other with only the
most recent changes to a record?

Synchronizing the DBs looks like it's going to be a pain... but there must be an answer. How do banks, credit agencies, and national companies keep their data real-time and always available?

Keep those great ideas coming......I can use all of the help I can get with
this one.

Thanks again for your help. I will be glad to outline the solution here for
all as soon as I decide on one.


 
I'm wondering if there is a big factor being missed here.

I see a lot of talk about how to deal with the situation when the internet is 'down', but I don't see any risk analysis of the probability of the internet being 'down'.

If the internet is 'down' for '1...n days' then I would suggest that the
cause of such an outage would most likely be an event of such magnitude that
other services will also be affected and the store won't be operating anyway
(e.g. major earthquake, major flood, major storm etc.).

If the internet is 'down' for as long as a few hours, then it is most likely the result of planned maintenance by an ISP or carrier telco. Any such providers should be scheduling that maintenance for times when there is going to be the least possible impact on their consumers (e.g. when most sane people are asleep).

Minor outages of service (a few seconds at a time) happen all the time; any half-decent networking stack and/or networking hardware is designed to handle and recover from these 'micro-outages'.

Now we come to what the 'customer' is actually looking for and what their
definition of 'real-time' is.

For example, if a given store does not have a particular item in stock, do they really need to know that another store or stores has one or more in stock RIGHT NOW, or do they just need to know whether another store or stores had one or more in stock as of close of business yesterday?

It really comes down to risk analysis and management.
 
I'm with you. Spending hours, days, or weeks programming and testing code for low-probability or low-impact events is hardly ever profitable.

The client has settled on a DB scheme that puts a timestamp on every record, recording that record's entry or last-change time. During a sync, every record in every table with a timestamp greater than the time the last sync was initiated is copied between databases.
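
For illustration, one pass of that timestamp scheme might look like the sketch below; the table, column, and connection details are placeholders. It still leaves open which change wins when two stores touch the same record between syncs, so some conflict rule (e.g. newest timestamp wins) is needed on top.

using System;
using System.Data;
using System.Data.SqlClient;

public static class StoreSync
{
    // Pull every row changed since the last sync; the caller applies these rows
    // to the other database(s) and records the new sync time.
    public static DataTable GetChangesSince(string connectionString, DateTime lastSync)
    {
        const string sql = "SELECT * FROM dbo.Inventory WHERE LastChanged > @lastSync";

        using (SqlConnection cn = new SqlConnection(connectionString))
        using (SqlDataAdapter da = new SqlDataAdapter(sql, cn))
        {
            da.SelectCommand.Parameters.AddWithValue("@lastSync", lastSync);
            DataTable changes = new DataTable("Inventory");
            da.Fill(changes);
            return changes;
        }
    }
}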

This is preferred by the customer over SQL Server's built-in merge replication because they don't want to pay for a SQL Server license at every location and would prefer to use the free Express edition of SQL Server 2005 (the successor to MSDE).

It isn't perfect, but it makes them feel good and I can do it.

Thanks to all that posted here!
 