Dave
I have been developing an MS Access ADP database with SQL Server 2000.
Originally, my customers needed a usable database to store roughly
60 megs of data as quickly as possible; otherwise I would have used
something else, like VB or C#. What sucks is that over time, they have
requested additional functionality, and before I realized it, I suddenly
had a 300 MB database with over 30 tables, and tons and tons of views,
stored procedures, functions, triggers, etc. It could theoretically
store up to 1 GB of data within the next few years. All the
business logic has been implemented in T-SQL, and the Access client is
as dumb as possible.
I'm planning to replace the Access client with a C#
System.Windows.Forms client. There will only be a handful of users for
each instance of the application at best. They will always be
connected on a 100 Mb LAN. Most of the users will require very little
bandwidth, except for a select few who will require the ability to run
large reports. The design of the database can be changed, as I will be
responsible for all aspects of the design and implementation of this
project in the long term. Since I am the only programmer on this
project and it is complex, I'll need to make the most of my time in the
development process. This could easily become my entire life, and I'd
rather that not happen.
The way I look at it, I've got a few choices for the basic design and
implementation of my System.Windows.Forms client. I could:
Use ADO.NET connections to SQL Server, storing a connection instance
and related data adapters/sets within each form they are used in.
Good side: it's a drop-in replacement requiring no changes to the
already existing SQL database. Bad side: there would be thousands of
lines of T-SQL stored procedures and triggers to manage, and I don't
think I should migrate the business logic to the client side, given
the scenario where users are delayed in updating their clients to a
newer version that fixes bugs in the business logic, for example.
--OR--
Use .NET Remoting within a custom Windows service or IIS to provide an
interface to query the database and return typed DataSets
that will be merged with the form's bound DataSet. Good side: if I use
IIS, I don't have to write authentication or permissions into the
service. Also, since I'm working on this long term, I can slowly
migrate the business logic into the service (and C#), eventually
eliminating all of T-SQL's awkward error handling...
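And option 2 would look something like this (again just a sketch; the service class, URL, and procedure names are invented, and a real service would need channel registration or a .rem endpoint configured in IIS):

```csharp
// Sketch of option 2: a MarshalByRefObject hosted in IIS or a Windows
// service, returning DataSets over .NET Remoting. Names are placeholders.
using System;
using System.Data;
using System.Data.SqlClient;

public class DataService : MarshalByRefObject
{
    public DataSet GetCustomers()
    {
        using (SqlConnection connection = new SqlConnection(
            "Data Source=MYSERVER;Initial Catalog=MyDb;Integrated Security=SSPI"))
        {
            SqlDataAdapter adapter = new SqlDataAdapter("dbo.GetCustomers", connection);
            adapter.SelectCommand.CommandType = CommandType.StoredProcedure;
            DataSet ds = new DataSet();
            adapter.Fill(ds, "Customers");
            // DataSets are serializable, so they cross the Remoting boundary.
            return ds;
        }
    }
}

// On the client, a form would obtain a proxy and merge the result into
// its bound DataSet, something like:
//   DataService service = (DataService)Activator.GetObject(
//       typeof(DataService), "http://myserver/DataService.rem");
//   boundDataSet.Merge(service.GetCustomers());
```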
Would any of you choose either of these design strategies, or something
else entirely? It is still months away from my even being expected to
work on it, but I'm trying to get an early start and am in the
planning stages.