Application Performance

John

Hi

I have a weird problem with my VB.NET app: its performance is variable and I
can't find out why. The VB.NET app, which has an Access 97 backend, has a
sister app written in Access 97, also with an Access 97 backend. Both apps
link some of each other's tables. The problem is that the Access app works
fine even with 15 users on the same network/server, but the VB.NET app
starts to get stuck even when only 2-3 users have the app open and only one
is actually doing anything to test it. Even that is just moving to the next
record, which brings over only one record at a time from the server using a
data adapter/dataset. As no significant data activity is going on, I suspect
it is not a data access issue. The machines are all new or newish Dell
Windows XP Pro machines (2.4 GHz etc.).

- As the backend is Access 97, could it be that VB.NET tries to access it as
Access 2000 and possibly uses the wrong data access DLLs? Should I upgrade
the back-end databases to Access 2000? It does not happen all the time,
though. I have a feeling that if I open the Access 97 app first, the VB.NET
app generally behaves better.

- Could optimistic/pessimistic locking on the server have anything to do
with this?

- Anything to do with the MDAC version or the .NET Framework service pack?
Anything along similar lines?

- Any other possibilities that I can try?

Thanks

Regards
 
PS: The app works fine on a standalone PC. It may be an intermittent locking
issue that arises between two or more users.
 
John:

There could be a few things, and you certainly have nothing to lose by
updating your MDAC components, although I don't think they are the problem.
I'm wondering about the design, though... Are you really using a DataAdapter
to bring over one record at a time? The comparison to Access is quite
different because Access is using DAO (or something similar), not ADO.NET.
However, using a DataAdapter/DataTable to bring over one record each time
the user flips through something is ghastly... You definitely don't want to
do this. There are many reasons why this is bad, and even if you were using
SQL Server or Oracle, using these objects to do such light lifting is
antithetical to the whole way ADO.NET is designed to perform. Instead, you'd
want to grab your data once, use the locally cached data (navigating it with
a BindingContext, for instance) for whatever the user is doing, and send
back the updates/inserts/deletes at some interval dictated by business
rules. You should keep trips to the DB to an absolute minimum. So using a
heavy object like a DataAdapter coupled with a DataSet/DataTable just to
grab one record could be causing performance issues. I'd suspect Access was
part of the problem, but if it works fine within Access, then that's not it.
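
For what it's worth, here's a minimal sketch of the fill-once,
navigate-locally pattern. The connection string, table and column names are
invented for illustration; they aren't from your app:

Imports System.Data
Imports System.Data.OleDb
Imports System.Windows.Forms

Public Class CustomerForm
    Inherits Form

    Private customers As New DataTable()
    Private txtName As New TextBox()
    Private btnNext As New Button()

    Public Sub New()
        btnNext.Text = "Next"
        btnNext.Top = 40
        Controls.Add(txtName)
        Controls.Add(btnNext)
        AddHandler btnNext.Click, AddressOf NextRecord

        ' One trip to the database: pull the working set into a local cache.
        Dim adapter As New OleDbDataAdapter( _
            "SELECT CustomerID, CustomerName, City FROM Customers", _
            "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Data\Backend.mdb")
        adapter.Fill(customers)

        ' Bind once; navigation now happens against the cached table.
        txtName.DataBindings.Add("Text", customers, "CustomerName")
    End Sub

    Private Sub NextRecord(ByVal sender As Object, ByVal e As EventArgs)
        ' "Next record" touches only the local DataTable, not the database.
        Me.BindingContext(customers).Position += 1
    End Sub
End Class

Only the initial Fill (and an eventual DataAdapter.Update with the batched
changes) ever hits the wire.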

If you're accessing records one at a time (if you absolutely must hit things
one at a time, use a DataReader, but I'd still recommend against that), you
are going to be creating a ton of objects that you throw away quickly.
Let's say that you move through 10 records. If you hit the db each time,
that's a minimum of 10 connection objects, 10 command objects and 10
DataTables, unless you are reusing them. Even if you are, you are still
clearing that DataTable repeatedly, and you are still firing 10 commands
against the db when 1 would work. You can use Select, Compute etc. on the
local DataTable, and this should dramatically boost your performance. In
addition to the data access issues, this approach wastes network resources,
strains the db and conceivably costs Garbage Collections (also very
expensive) that may otherwise not need to happen. How bad this is depends
on how you are running those queries, but grabbing the data a piece at a
time isn't the way to go, particularly with Access. Can you post some of
the code too? Also, you say that it doesn't always happen, right? What
tends to be the common denominator when it does? I'm thinking that the
problem is both client and server (if I understand the scenario correctly,
then there are almost certainly concerns on both ends) and the solution is
going to be taking some pressure off of the db. This will lead to less
locking, less stress on the db and a much more efficient, streamlined app.
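
For example, reusing the hypothetical customers table from the sketch above,
a filter runs entirely against the local cache - no connection, no command,
no round trip:

Dim londonRows As DataRow() = _
    customers.Select("City = 'London'", "CustomerName ASC")
For Each row As DataRow In londonRows
    Console.WriteLine(row("CustomerName"))
Next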

Check out my articles on this and let me know if you're hitting the db one
record at a time. If so, then it's good news because fixing this will be
quite easy.

http://www.knowdotnet.com/articles/dataviews1.html
http://www.knowdotnet.com/articles/expressions.html
http://www.knowdotnet.com/articles/adopartiii.html

A friend of mine recently sold an Access app that did one-at-a-time queries
and it definitely suffered from some performance problems, so there's
definitely a good bit we can do to boost the performance.

Let me know.

Bill
 
Hi Bill

Thanks for the info. Point taken. I will go through these and come back. In
the meantime I have hit on a little shortcut. The trick is to open the
connection explicitly, instead of just relying on the data adapters to
open/close the connection. After that, all data adapter actions seem to be
much faster on my standalone machine. I still need to test it on the client
LAN, but I hope the performance increase will be reflected over there too.

Thanks

Regards
 
Hi John:

If you have a bunch of successive queries, then leaving the connection open
while you fire them (only if they are all back to back) probably makes sense
and can speed things up. I'd just caution you to make sure that you wrap the
connection open in a Try/Catch block and use a Finally block which includes
a call to Close. If you open connections yourself then you have to close
them, and if your query bombs midstream, you definitely want to close the
connection(s). I'm a little neurotic about opening connections anyway, since
many things outside of my code can cause opening them to fail, so I like
treating opening and closing them as a separate issue from the actual
query - this lets me write the data to an XML file, for instance, if I'm
trying to issue an Update/Insert/Delete command and I blow up on the
connection... but that's another issue.
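
Something along these lines (a sketch only; the connection string and
queries are made up):

Imports System.Data
Imports System.Data.OleDb

Module ConnectionDemo
    Sub Main()
        Dim conn As New OleDbConnection( _
            "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Data\Backend.mdb")
        Dim customersAdapter As New OleDbDataAdapter( _
            "SELECT * FROM Customers", conn)
        Dim ordersAdapter As New OleDbDataAdapter( _
            "SELECT * FROM Orders", conn)
        Dim ds As New DataSet()
        Try
            ' Open once; each Fill sees an open connection and leaves it open.
            conn.Open()
            customersAdapter.Fill(ds, "Customers")
            ordersAdapter.Fill(ds, "Orders")
        Catch ex As OleDbException
            ' Log or handle the failure here, then decide whether to rethrow.
            Throw
        Finally
            ' We opened it, so we close it - even if a query bombs midstream.
            If conn.State = ConnectionState.Open Then
                conn.Close()
            End If
        End Try
    End Sub
End Module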

I'm glad you have it sped up, and for the record, you can really do some
pretty complex stuff using the built-in methods of the various objects. For
instance, if you use Expression columns for computed fields instead of
computing them in the SQL statement, you'll see a bunch of benefits. Assume
you have Price and Quantity fields; your SQL statement might originally look
like SELECT Price, Quantity, (Price * Quantity) AS Total FROM someTable. In
this example, if you changed the quantity of an item from 1 to 5, the Total
would stay the same, so you'd have to refresh the query or recalculate it.
To do that, you'd need to trap an event in every place where the user could
change Price or Quantity. On the other hand, if you add an Expression
column, you save having to include the Total in the query, plus as soon as
the user changes Price or Quantity ANYWHERE, the Total will automatically
update itself.
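
A quick sketch of an Expression column (table and column names invented for
the example):

Imports System.Data

Module ExpressionDemo
    Sub Main()
        Dim orders As New DataTable("OrderDetails")
        orders.Columns.Add("Price", GetType(Decimal))
        orders.Columns.Add("Quantity", GetType(Integer))

        ' Total is computed locally and recalculates the moment Price or
        ' Quantity changes - no refresh query, no events to trap.
        Dim total As DataColumn = orders.Columns.Add("Total", GetType(Decimal))
        total.Expression = "Price * Quantity"

        Dim row As DataRow = orders.Rows.Add(10D, 1)
        Console.WriteLine(row("Total"))   ' 10
        row("Quantity") = 5
        Console.WriteLine(row("Total"))   ' 50 - updated automatically
    End Sub
End Module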

Similarly, the DataTable.Compute function can calculate a ton of different
values for you without making you go back to the DB. If you wanted to know
the total of all sales, myDataTable.Compute("SUM(Total)", "Total > 0") would
sum all values of Total where Total was > 0. You could use whatever filter
you wanted [maybe > 0 doesn't make any sense] but you get a lot of
functionality here without having to hit the db. Over a session, this could
tremendously reduce the number of connections, round trips and objects
created. Done correctly, this could also reduce or eliminate (it totally
depends on the code though, no guarantees here) Garbage Collections, which
are very costly.
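
Continuing the ExpressionDemo sketch above, the aggregate runs entirely
against the cached rows:

' Sums Total across all rows matching the filter - no round trip.
Dim grandTotal As Object = orders.Compute("SUM(Total)", "Total > 0")
Console.WriteLine(grandTotal)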

Using the BindingManagerBase/BindingContext can make this really easy. Also,
if you use the DataRelation object, it can automatically resolve foreign key
values in controls like a DataGrid, which really makes life a lot easier.
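
For instance (a sketch with invented tables; the expression column walks the
relation to the parent row, so a bound grid can show the customer name
instead of the raw key):

Imports System.Data

Module RelationDemo
    Sub Main()
        Dim ds As New DataSet()
        Dim customers As DataTable = ds.Tables.Add("Customers")
        customers.Columns.Add("CustomerID", GetType(Integer))
        customers.Columns.Add("CustomerName", GetType(String))

        Dim orders As DataTable = ds.Tables.Add("Orders")
        orders.Columns.Add("OrderID", GetType(Integer))
        orders.Columns.Add("CustomerID", GetType(Integer))

        ' Tie each order to its customer inside the DataSet.
        ds.Relations.Add("CustomerOrders", _
            customers.Columns("CustomerID"), orders.Columns("CustomerID"))

        ' The expression pulls the parent value through the relation.
        orders.Columns.Add("CustomerName", GetType(String), _
            "Parent(CustomerOrders).CustomerName")

        customers.Rows.Add(1, "Acme")
        orders.Rows.Add(100, 1)
        Console.WriteLine(orders.Rows(0)("CustomerName"))   ' Acme
    End Sub
End Module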

If you need any help or have any questions, let me know and I'll be glad to
do what I can [I've definitely made my share of mistakes on this subject, so
I probably have some decent insight].

HTH,

Bill
 