dotnet windows forms vs. Access

  • Thread starter: PMK
Rob, great - winter suits me fine if nobody else has time before then.


Regards,
Thomas
 
Thomas said:
I have a proposal for trying to solve this dilemma: if you have one or
two days over the next month or two, we can come up with a simple
fictitious business application (it could also be a real problem whose
solution we could then sell as a product, too :-)).

Or how about doing it for a non-profit group that has a genuine need for such a solution?

Tony

--
Tony Toews, Microsoft Access MVP
Please respond only in the newsgroups so that others can
read the entire thread of messages.
Microsoft Access Links, Hints, Tips & Accounting Systems at
http://www.granite.ab.ca/accsmstr.htm
Tony's Microsoft Access Blog - http://msmvps.com/blogs/access/
 
A simple question has been posed - does it make sense to move from MS Access
to .NET?

Back in 1999, I rewrote an application for the third time. The first
version was in DBase II and the second version was in Progress. The plan for
version 3 was VB on the frontend and Sybase SQL Anywhere on the backend.
With 10% of the project completed, I was through 25% of the money - $10,000
at this point. (Math quiz: what was the budget and what was the projected
cost overrun?)

Two questions had to be answered. 1) How did I screw up the budget so
terribly and 2) how could I fix it?

The answer to (1) is that I budgeted the project based on what it would
have cost to do in MS Access. The reason that I didn't use Access is that
for multi-user
applications using bound forms and linked tables across the network, Access
stinks. It is slow and unreliable.

So what to do, what to do? The obvious answer was to fix the data
marshalling in MS Access so that it is fast and reliable.
Did that. My Access applications now run 125 times faster on the network
with bound forms to linked tables. Starting over with Access on the
frontend and Sybase on the backend, the total project cost (including the
initial $10,000) came in at $44,000. Speed over the network is
instantaneous and it has never broken.

So why the shift to .NET? 1) My Access middleware DLL is written in VB and
utilizes RDO both of which are nearing the end of life. 2) My middleware
only works with Sybase, though with some work, it could be made to work with
SQL Server. 3) Access looks old. 4) Microsoft is pushing .NET.

The project I'm now finishing up uses .NET, C# and SQL Server. The budget
was set at 1.3 times a comparable Access project. It appears that the final
cost will come in at budget.

The pluses: The database is in New Jersey with the frontend up here in
Maine. Performance is unbelievably fast - you wouldn't know the database
was off site over a wide area network. It doesn't break. It looks great.
Plus, the project is coming in on budget.

The minuses:
- Long learning curve.
- Out-of-the-box doesn't cut it - I use the Developer Express controls
(fantastic).
- The data designers are pretty much useless for the SQL logic - I wrote
my own, which work.
- There are some nasty gotchas in ADO.NET requiring workarounds.
- Designing forms in VS 2005 is just plain painful - dropping, renaming and
binding controls is way too slow.
- Microsoft is tardy in getting service packs out to fix framework bugs.

So back to the initial question, "does it make sense to move from MS Access
to .NET?" Even with some of the crap in Visual Studio, the outcome is
positive and well worth the investment.

My clients love the results which means business will be good for some time.
New product development to replace aging Access is proving to be a smart
move.
 
This only tells us that you started a project in VB without knowing VB.
I have never seen someone start a project with a technology that they
don't know without running into massive cost overruns.

As to the data marshalling that you have done, I would be very surprised
if you couldn't do the same thing with either VB or .NET. In fact, I would
be even more surprised if you couldn't do even better with .NET than you
did with Access.

.NET has been designed to give you the fastest response possible with
every kind of backend (JET, SQL-Server, Oracle, heterogeneous, etc.) over
any kind of network (local, LAN, WAN). If you run a test that gives a
slower response with .NET than with anything else, it's not because .NET
is slower; it's because you have made one or more errors somewhere.

Don't forget that with practically any kind of new technology, you will
usually need at least one year of experience to have some mastery of it.
 
Sylvain Lafontaine said:
This only tells us that you started a project in VB without knowing VB.
I have never seen someone start a project with a technology that they
don't know without running into massive cost overruns.

The VB6 databinding killed me. .NET databinding is so much better, which
explains why the project costs for Access and .NET are in the same ballpark.
My experience parallels others' - VB6 takes about 3 to 5 times longer to
write compared to Access. .NET is coming in at 1.3 times, which, when you
look at the other benefits, is very acceptable.
As to the data marshalling that you have done, I would be very surprised
if you couldn't do the same thing with either VB or .NET. In fact, I would
be even more surprised if you couldn't do even better with .NET than you
did with Access.

ADO.NET's disconnected datasets, with their Fill and Update methods,
mirror my 1999 Access/VB RDO architecture. That's one of the reasons I'm
so comfortable working with ADO.NET.
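
For the curious, here is a minimal sketch of that Fill/Update pattern in
ADO.NET. It is an illustration only - the connection string, table and
column names are invented, not taken from any real project:

// A minimal sketch of the ADO.NET disconnected pattern: Fill pulls the
// rows into a local DataSet in one round trip, you edit the local copy,
// and Update pushes only the changed rows back. Names are illustrative.
using System.Data;
using System.Data.SqlClient;

class DisconnectedDemo
{
    static void Main()
    {
        const string connStr =
            "Server=.;Database=Tiger;Integrated Security=true";
        const string selectSql =
            "SELECT BudgetID, ChartAcntID, BudgetYear, January " +
            "FROM Budget WHERE BudgetYear = @Year";

        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlDataAdapter adapter = new SqlDataAdapter(selectSql, conn))
        {
            adapter.SelectCommand.Parameters.AddWithValue("@Year", 2007);

            // Supplies the INSERT/UPDATE/DELETE commands from the SELECT.
            SqlCommandBuilder builder = new SqlCommandBuilder(adapter);

            DataSet data = new DataSet();
            adapter.Fill(data, "Budget");       // one trip to the server

            // Work entirely against the local, disconnected copy.
            foreach (DataRow row in data.Tables["Budget"].Rows)
                row["January"] = 0m;

            adapter.Update(data, "Budget");     // write back only the changes
        }
    }
}
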
.NET has been designed to give you the fastest response possible with
every kind of backend (JET, SQL-Server, Oracle, heterogeneous, etc.) over
any kind of network (local, LAN, WAN). If you run a test that gives a
slower response with .NET than with anything else, it's not because .NET
is slower; it's because you have made one or more errors somewhere.

Couldn't agree with you more. With multi-threading, you can give the
illusion of even faster speed.
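
A rough illustration of that trick (the form, query and connection string
are all placeholders): fetch the data on a worker thread so the form stays
responsive, then bind the result back on the UI thread when it completes.

// Sketch of the "illusion of speed": load data with a BackgroundWorker
// so the form paints immediately; bind the grid once the fetch is done.
using System;
using System.ComponentModel;
using System.Data;
using System.Data.SqlClient;
using System.Windows.Forms;

public class InvoiceForm : Form
{
    private readonly DataGridView grid = new DataGridView();
    private readonly BackgroundWorker worker = new BackgroundWorker();

    public InvoiceForm()
    {
        grid.Dock = DockStyle.Fill;
        Controls.Add(grid);

        worker.DoWork += delegate(object sender, DoWorkEventArgs e)
        {
            // Runs on a thread-pool thread; never touch controls here.
            DataTable table = new DataTable("Invoice");
            using (SqlDataAdapter adapter = new SqlDataAdapter(
                "SELECT InvoiceID, CustomerID, InvoiceDate FROM Invoice",
                "Server=.;Database=Tiger;Integrated Security=true"))
            {
                adapter.Fill(table);
            }
            e.Result = table;
        };

        worker.RunWorkerCompleted += delegate(object sender, RunWorkerCompletedEventArgs e)
        {
            // Raised back on the UI thread, so binding the grid is safe.
            if (e.Error == null)
                grid.DataSource = e.Result;
        };

        Load += delegate { worker.RunWorkerAsync(); };
    }

    [STAThread]
    static void Main()
    {
        Application.Run(new InvoiceForm());
    }
}
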
Don't forget that with practically any kind of new technology, you will
usually need at least one year of experience to have some mastery of it.

Plan on at least a year. But as I say, replacing aging Access with .NET
has turned out to be a smart business move.
 
Per Jim Rand:
So what to do, what to do? The obvious answer was to fix the data
marshalling in MS Access so that it is fast and reliable.
Did that. My Access applications now run 125 times faster on the network
with bound forms to linked tables.

Can you list some of the things you did to make that improvement
in marshalling?
 
So you budgeted to do a project in Access and then you tried to do it in VB
for the same budget (duh!), and because you couldn't, this is somehow
Access' fault?

Access using bound forms and linked tables for multi-user apps is so slow
and unreliable that I am always getting panic-stricken support calls from my
many customers. Some of them I hear from every year! Mind you, I do use
SQL Server, not Sybase.

I think your post tells us a lot more about you than it does about any
technologies.

Jim Rand said:
A simple question has been posed - does it make sense to move from MS Access
to .NET?

Back in 1999, I rewrote an application for the third time. The first
version was in DBase II and the second version was in Progress. The plan for
version 3 was VB on the frontend and Sybase SQL Anywhere on the backend.
With 10% of the project completed, I was through 25% of the money - $10,000
at this point. (Math quiz: what was the budget and what was the projected
cost overrun?)

Two questions had to be answered. 1) How did I screw up the budget so
terribly and 2) how could I fix it?

The answer to (1) is that I budgeted the project based on what it would
have cost to do in MS Access. The reason that I didn't use Access is that
for multi-user
applications using bound forms and linked tables across the network, Access
stinks. It is slow and unreliable.
<snip>
 
Where do I start? How about the bloated, festering dotnet framework,
redolent as it is with the promise of versioning nightmares in years ahead?
How about not being able to write unmanaged code in anything but C++? How
about it being so damned slow? How about the IDE being so damned awful?
How about MS's sheer, arrogant stupidity in trying to kill the world's most
popular development tool (I refer of course to VB6) without even providing
backward compatibility?

That's a few gripes about the big stuff; I could think of a lot more if I
tried. There's LOADS more stuff at the small level; my favourite in
2002/2003 was the godawful combo box which couldn't even do what a VB6 combo
could do, let alone an Access combo. Sure, you can build your own controls,
which is exactly what I did to get a half-decent combo box (or you can buy
'em if you didn't already think you'd wasted enough dosh on this garbage),
but that's what you have to do all the time: so much of it doesn't quite
work as you would expect or want that you spend all your damn time
finding/creating workarounds and alternatives.

Yeeeuk! I don't blame you for puffing this stuff - you've got a living to
make - but frankly it makes me ill.
 
These are the relative times for getting data from a backend database
(1.0 = fastest; source: Microsoft):

1.0        ODBC API
1.1        Query pass-through (read-only way to get the data)
1.1        RDO (Remote Data Objects) - Microsoft's only comment was "it is
           very fast"
125 - 150  Access dynasets via linked tables

So the answer is to get the data from the backend database via query
pass-throughs and to write changes back with RDO.

Implementation:

I have a program mdb on the client which creates a temporary data mdb on
the client. The middleware is written as a VB DLL for speed (compiled) and
for use of RDO. Prior to opening a form, the middleware:
- Creates tables in the data mdb using model tables saved in the program
mdb as templates
- Links to the data tables in temp.mdb (local link - not over the network)
- Creates a query pass-through for each table
- Creates append queries using the query pass-throughs as their source to
load the local linked tables
- Fires the append queries (we've got the data now - Access is totally
faked out, both at design time and at runtime)

The sql for each query pass-through is stored in the backend database. Here
is a sample:

SELECT B.BudgetID, B.ChartAcntID, B.BudgetYear, B.January, B.February,
       B.March, B.April, B.May, B.June, B.July, B.August, B.September,
       B.October, B.November, B.December,
       CAST(B.TS AS CHAR) AS ExactTS
FROM Tiger.Budget AS B
WHERE B.BudgetYear = ::01
ORDER BY B.ChartAcntID

BudgetID is the auto increment primary key issued by the database. B.TS is
a timestamp cast as char for concurrency. (Note: in .NET with SQL Server,
you would cast B.TS as an int since SQL Server's timestamp is really a
binary row version). ::01 is a placeholder for parameter tokens.
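
Since that note mentions the SQL Server flavour, here is a rough sketch of
what the equivalent concurrency check looks like in ADO.NET. This version
passes the original rowversion bytes back as a parameter rather than
casting to int; names and connection handling are simplified and this is
not the actual middleware:

// A sketch of the concurrency check that the TS column makes possible:
// the UPDATE succeeds only if nobody else has touched the row since we
// read it. conn is assumed to be open already.
using System.Data;
using System.Data.SqlClient;

static class BudgetWriter
{
    public static void UpdateJanuary(SqlConnection conn, int budgetId,
                                     decimal january, byte[] originalTs)
    {
        const string sql =
            "UPDATE Tiger.Budget SET January = @January " +
            "WHERE BudgetID = @BudgetID AND TS = @OriginalTS";

        using (SqlCommand cmd = new SqlCommand(sql, conn))
        {
            cmd.Parameters.AddWithValue("@January", january);
            cmd.Parameters.AddWithValue("@BudgetID", budgetId);
            cmd.Parameters.Add("@OriginalTS", SqlDbType.Timestamp).Value = originalTs;

            // Zero rows affected means another user changed (or deleted)
            // the row after we read it - a concurrency failure.
            if (cmd.ExecuteNonQuery() == 0)
                throw new DBConcurrencyException(
                    "Budget row " + budgetId + " was changed by another user.");
        }
    }
}
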

For children of parent tables, you would refire the append queries on the
Form_Current event - i.e., you only have the invoice detail locally for one
invoice.

*** In the parent form ***
' Retrieve line items this order from Sybase and requery subform
Set f = Me.zInvoiceLineitems.Form
f.RequeryData lInvoiceID:=Nz(Me.InvoiceID, 0)
Set f = Nothing

*** In child form ***
ReDim vNewTokenValues(1)
vNewTokenValues(1) = lInvoiceID
goTblsSybase.Item("InvoiceLineItem").RefreshData _
    vNewTokenValues:=vNewTokenValues
Me.Requery


Writing the data back is a little more tricky. To do this, I have a
CFormDataHandler class which each form instantiates. Public methods include
AddTimerItem, Form_AfterDelConfirm, Form_AfterInsert, Form_AfterUpdate,
Form_Current, Form_Delete, Form_Timer, Form_Unload, Initialize and
RefreshData. Each form and each subform delegates activity back to the
oFormDataHandler object. The middleware figures out what the insert,
delete, and update statements are from the initial select statement - sort
of like .NET's command builder but smarter.
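
For comparison, here is roughly what .NET's command builder gives you out
of the box when handed only the SELECT - and why hand-rolled middleware can
be smarter: by default the builder guards its WHERE clause with every
original column value rather than a single timestamp. The connection string
is a placeholder.

// Given only the SELECT, SqlCommandBuilder derives INSERT/UPDATE/DELETE
// from the result's schema; printing them shows exactly what it generates.
using System;
using System.Data.SqlClient;

class CommandBuilderDemo
{
    static void Main()
    {
        using (SqlConnection conn = new SqlConnection(
            "Server=.;Database=Tiger;Integrated Security=true"))
        using (SqlDataAdapter adapter = new SqlDataAdapter(
            "SELECT BudgetID, ChartAcntID, BudgetYear, January FROM Budget", conn))
        using (SqlCommandBuilder builder = new SqlCommandBuilder(adapter))
        {
            conn.Open();    // the builder reads the schema over this connection

            // BudgetID, the primary key in the SELECT, is what the builder keys on.
            Console.WriteLine(builder.GetUpdateCommand().CommandText);
            Console.WriteLine(builder.GetInsertCommand().CommandText);
            Console.WriteLine(builder.GetDeleteCommand().CommandText);
        }
    }
}
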

The framework is complex, but once it is written, it's really simple to use.
Access is totally faked out. It becomes a really fast single user database
working only with local data. Since you are not relying on linked tables
across the network, reliability goes to 100% - it never breaks. Plus all
access to the backend database is at the ODBC API speed.

Since the initial project back in 1999, I've reused this framework on
numerous projects. For one, we had to connect via a really slow fractional
T-1. After making an entry in a cell in a grid, the client pressed the
down-arrow key which caused the writeback to the database via the framework.
The time to get to the next row was instantaneous. The client said "wow,
it's like the server was right there".

This is the magic of disconnected data. Think about it. That is what
ADO.NET is all about with its disconnected datasets.

They say "real programmers don't use bound forms". From my experience,
bound forms offer hugh productivty gains. Access has it as does .NET.
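
On the .NET side, the equivalent is a BindingSource sitting between the
data and the controls. A minimal, self-contained sketch (the in-memory
table and the names in it are just for illustration):

// Minimal .NET bound form: the grid and the text box both track the
// current row through the BindingSource, with no hand-written plumbing.
using System;
using System.Data;
using System.Windows.Forms;

public class BoundForm : Form
{
    public BoundForm()
    {
        DataTable customers = new DataTable("Customer");
        customers.Columns.Add("CustomerID", typeof(int));
        customers.Columns.Add("Name", typeof(string));
        customers.Rows.Add(1, "Acme Lobster Co.");
        customers.Rows.Add(2, "Granite Accounting");

        BindingSource binding = new BindingSource();
        binding.DataSource = customers;

        DataGridView grid = new DataGridView();
        grid.Dock = DockStyle.Fill;
        grid.DataSource = binding;

        TextBox nameBox = new TextBox();
        nameBox.Dock = DockStyle.Top;
        // Two-way binding: editing the box updates the current row and back.
        nameBox.DataBindings.Add("Text", binding, "Name");

        Controls.Add(grid);   // added first so it fills the space left over
        Controls.Add(nameBox);
    }

    [STAThread]
    static void Main()
    {
        Application.EnableVisualStyles();
        Application.Run(new BoundForm());
    }
}
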

Jim
 
yes, you are wasting your time

Microsoft is going to make a new .NET version of ADP with the next release--
so .NET is not a total waste of time

but it is obvious that ADP is a much much much better platform than .NET
 
Where should you start? Well, you should start with VB6, VC6, COM/DCOM/MTS,
ActiveX and the registry itself!

In case you forgot, this whole system was already in the process of
imploding under its own weight 10 years ago. Writing a program today is no
longer a matter of aligning a few text boxes, combo boxes and list boxes on
a simple form, tabbed form or continuous form on an isolated machine,
without internet and maybe even without a LAN.

Take a look at all the criticisms that were made about VB6, unmanaged C++,
DLL hell, the total lack of security and the constant corruption of the
registry - to name only those - and now think about what the situation
would be if, instead of dumping all of that into the garbage bin, MS had
increased the size of these dinosaurs by a factor of ten to one hundred.
Increasing the size of the registry by a factor of at least one hundred is
not only a reasonable assumption; it's quite likely a gross underestimation
of the space that would have been required to cover the actual capabilities
of the .NET Framework using these old technologies.

I agree with you that even on a Core 2 Duo, running the current Framework
2.0 is slow, but if you had tried to do the same thing with
COM/DCOM/ActiveX - as it was with VB6 - not only would your machine be
running slowly, it would probably have imploded into a black hole.

Buying a dual core is now standard, and it will soon be replaced by the
quad core as the basic developer machine. (The price of quad cores will
probably be cut in half this autumn, if not before.) Next year, you will
start to see machines with 8 cores and 16 to 32 gigs of memory running
Vista 64-bit as the basic machine bought by most developers. With that kind
of power available in 2008, do you really think people would want to keep
VB6/VC6 - with a few more gadgets here and there - as their main
development tools?

VB6 was the most popular tool in past years? So were dBase III, Lotus
1-2-3 and WordPerfect. Now all these tools are gone, because Ashton-Tate,
Lotus and WordPerfect Corporation believed that once you have reached a
market share of 90% or more, you don't have to evolve anymore and your base
of loyal users will stay with you for eternity.
 
I'm surprised you're so emphatic about this, Sylvain. That's not like you,
from what I've seen.

I have to go with Baz on this one. Many, MANY people are moving away from
the .NET framework (Delphi seems to be a popular choice) because VS
2002/2003 were very slow, and while VS 2005 offers some improvements, I
gather, it's still often sluggish when using managed code, and many VB6
developers have no interest in learning C++ as the only unmanaged
alternative. And, of course, there's the nightmare of having to
redistribute the megalith that is the Framework itself.

As for how fast it will eventually be on upcoming OS's/hardware, designing a
system for stuff that's not out yet isn't usually the best idea, as people
have to use this stuff today on their current OS and hardware, not a year or
two (or more) from now. I'm also a firm believer that the hardware
shouldn't have to compensate for the underlying speed of the application.

I completely disagree that VB6 with COM/DCOM/ActiveX would be a "black
hole". Most tests have shown that in fact, it's significantly faster than
.NET in most cases. There are certainly some tests where .NET outperforms
VB6, and it's a lot more capable in some areas like multithreading, but most
of the articles I've read have put VB6 as 1.5 - 2x faster than the
equivalent .NET code.

But regardless of any of that, even if VS 2005 works faster in any given
scenario, Access is still VBA-based, and as such, you'd still be using COM,
etc. *If* you're going to use Access, which many people still prefer to
.NET solutions, it makes sense that you'd supplement it with VB6/COM if
necessary, rather than using .NET and having to go through Interop.

I will agree that it looks like VB6 is likely to die eventually, and
unfortunately, Microsoft seems to have no understanding of why people are
complaining about that fact. An unmanaged version of VB with greater
backwards compatibility has been suggested, and requested by thousands
(http://classicvb.org/Petition/), but again, it seems unlikely.
Nevertheless, VB6 continues to be supported on Vista, at least in some
fashion, so I don't really see much of a problem continuing to use that for
the time being.

All that said, if .NET is fast enough for you (and as I say, I gather 2005
made improvements in that area), then certainly there's a lot to be said for
it in terms of enterprise development, Internet development, etc. So, as
they say, "if it works, don't knock it." For myself, however, the
sluggishness was intolerable, and at least so far, I'm sticking with VB6
DLLs to back up my Access projects where necessary.


Rob
 
yes, you are wasting your time

Microsoft is going to make a new .NET version of ADP with the next release--
so .NET is not a total waste of time

but it is obvious that ADP is a much much much better platform than .NET

.NET version of an Access Data Project? I don't get it. Tell me more.

Peter
 
PMK said:
.NET version of an Access Data Project? I don't get it. Tell me more.

Unfortunately Aaron Kempf's answer to every problem is ADPs.

Tony
--
Tony Toews, Microsoft Access MVP
Please respond only in the newsgroups so that others can
read the entire thread of messages.
Microsoft Access Links, Hints, Tips & Accounting Systems at
http://www.granite.ab.ca/accsmstr.htm
Tony's Microsoft Access Blog - http://msmvps.com/blogs/access/
 
Yeah, even I can get emphatic about something sometimes. It's only that in
this case (ADP), I got tired of hearing people come here to say that even
though they know nothing about SQL-Server, ADP or .NET - by their own
admission, and they are proud of it, too - they will now tell us everything
we need to know about them.

As to VB6, I agree with you that on many occasions it will be faster than
.NET. The problem here is not the performance when you are doing the same
old stuff as 10 years ago; the problem is when you want to do new stuff.
When was the last time you used some kind of datagrid control with VB6 and
didn't feel like eating your own keyboard after a few days of work?

Something as simple as having rows in different colors can bring VB6 or
Access to its knees, and I won't even mention stuff like displaying images,
adding an unbound control to a continuous form, or something as basic as
keeping a multi-row selection active when the user clicks on a button.

When you need to add some extensibility of any kind, VB6/Access are simply
not the way to go.
 
Yeah, okay, for the type of issues you're talking about, I can see .NET
being more useful. Last time I tried it, though, it was unacceptably slow,
and until someone can prove to me that the speed is now more
acceptable--which I've heard for 2005--and that there's a decent upgrade
path from VB6 (or better yet Access)--which I've never heard from
anybody--it won't ever be on my list of things to upgrade to.



Rob
 
There is no doubt that, compared to VB6/VBA, you need a lot of firepower
inside your machine to tame .NET, but that is precisely what we are getting
this year as the basic configuration for a new machine.

As for a decent upgrade path from Access, that is probably something we
should see next year. SSMA for Access is already written in .NET, so I
won't be surprised if in a year or two there is a new version that upgrades
not only the backend to SQL-Server but also the frontend to .NET (and of
course with the option of upgrading only the FE without touching the BE).
 
Ummm...have you ever used the VB6 to VB.NET "Upgrade" Wizard? IIRC, it
doesn't upgrade forms at all, and it does a less-than-stellar job at
upgrading VB6 code. I can't say I have confidence in MS to write something
that will upgrade my FE in a way that would be better than simply starting
from scratch. (For me personally, my BE is already SQL Server, so that's
not an issue, though obviously that's strictly for me and not everybody.)


Rob
 
I must admit that there is a difference between an upgrade path and a
decent upgrade path. If I were the happy owner of MS, I would have put a
little more money into this upgrade wizard, but as I'm not, ...
 
Sylvain said:
I must admit that there is a difference between an upgrade path and a
decent upgrade path. If I were the happy owner of MS, I would have put a
little more money into this upgrade wizard, but as I'm not, ...
There is a decent upgrade path from C to C#. MS put
a lot of time and effort into making sure that a VS C
project could be ported to VS.NET without tears.


I think it's like Vatican-watching, or China-watching as it used to be.
You see the effects of decisions that are made by some hidden process, and
then you work back from there and speculate about causes, and speculate
about possible future actions.
 