Am I wrong

  • Thread starter: T. Bernstein

T. Bernstein

Hello,

I have used ADO for years, and I loved it. When I take a look at ADO.NET, I
somehow get the feeling I am worse off. I hope you can convince me of the
opposite.

Yes, I am more in control, but:
-- I write a lot more code
-- A simple update of a record has become a lot more difficult (used to be:
recordset.update, period !)
-- There are a lot more components involved (connection, adapter, dataset,
table).

Three questions:

1.
Do you have some arguments that could take away my idea that ADO.NET
is not really an improvement?

2.
Is updating a batch of records not very dangerous, because all kinds of
things could have happened to the records in the time between FILL and
UPDATE? How do you control such a mess, and how do you tell the user that
update #3 of 40 did not work out? Do I have to extend their memories?

3.
Do you use bulk-updates, or one-at-a-time updates? What is your advice?

Thanks in advance
Jan van Toor
 
Hi Jan,

Yes, you're more in control, but, let's face it, updating can be a pain when
you have to develop your own updating code (although the CommandBuilder
frees you from this in very simple, direct updates).
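
A minimal sketch of what that CommandBuilder shortcut looks like. The connection string, the Customers table, and its columns are illustrative assumptions, not something from the thread:

```csharp
using System.Data;
using System.Data.SqlClient;

// The CommandBuilder derives INSERT/UPDATE/DELETE commands from the SELECT,
// so no hand-written update code is needed for simple, single-table cases.
SqlConnection conn = new SqlConnection("...connection string...");
SqlDataAdapter da = new SqlDataAdapter(
    "SELECT CustomerID, Name FROM Customers", conn);
SqlCommandBuilder cb = new SqlCommandBuilder(da);  // hooks itself to the adapter

DataTable table = new DataTable();
da.Fill(table);                      // disconnected copy
table.Rows[0]["Name"] = "New name";  // edit locally
da.Update(table);                    // generated commands push the change back
```

As the post says, this only works for direct, single-table selects with a primary key; anything involving joins or stored procedures needs hand-written commands.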

Nevertheless, the benefits far, far outweigh the disadvantages. The
disconnected object model is tremendously valuable, not only because it
virtually eliminates data clashes but also because it speeds things up
(except when you have to load a large - very large - table into memory).

Finally, yes, memory is the key to success here - you can't have too much.
My clients get PCs with XP Pro and a minimum of 1 GB of RAM.

HTH,

Bernie Yaeger
 
T. Bernstein said:
Hello,

I have used ADO for years, and I loved it. When I take a look at ADO.NET, I
somehow get the feeling I am worse off. I hope you can convince me of the
opposite.

Yes, I am more in control, but:
-- I write a lot more code

True, no getting around it. (Although, for simple db connectivity, VS.NET
lets you build connections, DataAdapters, commands and datasets by click
and drag.)
-- A simple update of a record has become a lot more difficult (used to be:
recordset.update, period !)

But we are working with disconnected copies of the data now so concurrency
issues are a bigger concern, hence the update is more involved.
-- There are a lot more components involved (connection, adapter, dataset,
table).

And that is a good thing. It gives you more control over the amount of memory
allocated to hold the data and over what you can do with the data.
Three questions:

1.
Do you have some arguments that could take away my idea that ADO.NET
is not really an improvement?

The ONLY downside I see with ADO.NET is that you do write more code.
Everything else you mention is correct and I see those as arguments to the
positive. Coming from ADO for many years myself, it is a bit of a "mindset
switch", but well worth it once the light bulb goes on.
2.
Is updating a batch of records not very dangerous, because all kinds of
things could have happened to the records in the time between FILL and
UPDATE? How do you control such a mess, and how do you tell the user that
update #3 of 40 did not work out? Do I have to extend their memories?

This is where and why the update is more involved than in ADO. But a
properly written update command (one that uses primary-key identifiers and
checks the originally fetched values) gets around this.
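
One hedged sketch of such an update command (the table, column names, and connection string are illustrative assumptions). The WHERE clause matches the primary key AND the value as originally fetched, so a row changed by someone else after Fill() simply fails to update instead of being silently overwritten:

```csharp
using System.Data;
using System.Data.SqlClient;

SqlConnection conn = new SqlConnection("...connection string...");
SqlDataAdapter da = new SqlDataAdapter(
    "SELECT CustomerID, Name FROM Customers", conn);

// Hand-written UpdateCommand with optimistic concurrency.
SqlCommand upd = new SqlCommand(
    "UPDATE Customers SET Name = @Name " +
    "WHERE CustomerID = @ID AND Name = @OrigName", conn);
upd.Parameters.Add("@Name", SqlDbType.NVarChar, 50, "Name");
upd.Parameters.Add("@ID", SqlDbType.Int, 0, "CustomerID");
SqlParameter orig = upd.Parameters.Add("@OrigName", SqlDbType.NVarChar, 50, "Name");
orig.SourceVersion = DataRowVersion.Original;  // compare against the value as fetched

da.UpdateCommand = upd;
```

When the WHERE clause matches nothing, the adapter reports zero rows affected for that row, which is how a concurrency clash surfaces to your code.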
3.
Do you use bulk-updates, or one-at-a-time updates? What is your advice?

Since you have a copy of the data, you can change it as much as you want and
then send all the changes back to the source at once with the DataAdapter's
Update method. No reason to update after each change; wait until you are done.
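
A minimal sketch of that edit-everything-then-update pattern (illustrative names throughout; the call that pushes changes back to the source is the adapter's Update method):

```csharp
using System.Data;
using System.Data.SqlClient;

SqlConnection conn = new SqlConnection("...connection string...");
SqlDataAdapter da = new SqlDataAdapter(
    "SELECT CustomerID, Status FROM Orders", conn);
SqlCommandBuilder cb = new SqlCommandBuilder(da);

DataTable t = new DataTable();
da.Fill(t);                         // one trip to the database

foreach (DataRow row in t.Rows)     // any number of local edits,
    row["Status"] = "Processed";    // no connection held open

da.Update(t);                       // one round of updates for every changed row
```

Only rows whose RowState is Added, Modified, or Deleted generate a command; unchanged rows are skipped.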
 
<<I write a lot more code>> That totally depends on what you are doing, and
just as one might make the case that you write more code up front, I've found
I write less code overall because of the overall flexibility that ADO.NET
gives me.

<<A simple update of a record has become a lot more difficult (used to be:
recordset.update, period !)>> In its simplest form, an update can be
accomplished with either DataAdapter.Update (one line) or
Command.ExecuteNonQuery (which can be a few lines, including opening and
closing connections). Either way, not too big of an issue, and with the
advent of ADO.NET 2.0, where batch updates are supported by simply setting
one property, it's hard to see how this is a big deal.
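
A sketch of the one property being referred to here, SqlDataAdapter.UpdateBatchSize, which arrived in ADO.NET 2.0 (table and connection names are illustrative assumptions):

```csharp
using System.Data;
using System.Data.SqlClient;

SqlConnection conn = new SqlConnection("...connection string...");
SqlDataAdapter da = new SqlDataAdapter(
    "SELECT CustomerID, Name FROM Customers", conn);
SqlCommandBuilder cb = new SqlCommandBuilder(da);

DataTable table = new DataTable();
da.Fill(table);
// ...edit many rows locally...

da.UpdateBatchSize = 50;  // group changed rows, 50 per round trip to the server
da.Update(table);         // same per-row commands, far fewer network trips
```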

<<There are a lot more components involved (connection, adapter, dataset,
table).>> So, you don't have to use them at all to get a lot of things
done, although that's probably not a good idea. How is having more objects
worse than having fewer? You can refine what you need to do, tailored to
your needs. You can do a heck of a lot just using connected objects
(Connection, Commands and DataReaders)
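
A sketch of that connected-objects-only path, with no adapter or dataset involved (names are illustrative assumptions):

```csharp
using System;
using System.Data.SqlClient;

// Connection + Command + DataReader: forward-only, read-only,
// and very light on memory -- nothing is cached client-side.
using (SqlConnection conn = new SqlConnection("...connection string..."))
{
    conn.Open();
    SqlCommand cmd = new SqlCommand(
        "SELECT CustomerID, Name FROM Customers", conn);
    using (SqlDataReader rdr = cmd.ExecuteReader())
    {
        while (rdr.Read())
            Console.WriteLine("{0}: {1}", rdr["CustomerID"], rdr["Name"]);
    }
}   // connection closed here; nothing remains in memory
```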

<<Do you have some arguments, which could take away my idea, that this
ADO.NET
is not really an improvement?>>

Yes, there are tons of them. The main one is that you can model your data
much better with the DataSet/DataTable model. You can relate your tables
using DataRelations so you don't need to pull over redundant data. You can
take data from any data source and move it to myriad other sources with the
exact same logic. That means I can move my data from SQL Server to my PDA
and then to an Oracle database by simply using a different data adapter.
Also, XML support under ADO is a pain in the butt; it's a dream with ADO.NET.
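
A small sketch of the DataRelation point: two in-memory tables related by key, so the child table never needs to carry the parent's columns (all names here are made up for illustration; no database is involved):

```csharp
using System;
using System.Data;

DataSet ds = new DataSet("Shop");

DataTable customers = ds.Tables.Add("Customers");
customers.Columns.Add("CustomerID", typeof(int));
customers.Columns.Add("Name", typeof(string));

DataTable orders = ds.Tables.Add("Orders");
orders.Columns.Add("OrderID", typeof(int));
orders.Columns.Add("CustomerID", typeof(int));  // key only -- no redundant Name

ds.Relations.Add("CustOrders",
    customers.Columns["CustomerID"], orders.Columns["CustomerID"]);

customers.Rows.Add(1, "Jan");
orders.Rows.Add(100, 1);

// Navigate from a child row to its parent through the relation.
DataRow parent = orders.Rows[0].GetParentRow("CustOrders");
Console.WriteLine(parent["Name"]);

// And the whole related structure round-trips as XML in one call:
// ds.WriteXml("shop.xml");
```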

Perhaps in your specific case you don't have the need for the new power or
flexibility, but your needs may change, and there are certainly many people
who do need it... and it was a nightmare with ADO.

ADO was the ultimate black box and if you needed new functionality, tough.
Compare that with the ability to subclass a DataReader for instance, or use
Data Objects that don't even need to ever talk to a database....

And best of all, you can use ADO with .NET if you have a compelling reason,
but I can't use ADO.NET with VB6 or pre-.NET VC++. So you can still do
whatever you used to, plus more; it doesn't work the other way around.

2) It can be, but it isn't necessarily so (and there aren't true batch
updates until ADO.NET v2.0). You can use RowUpdated and exception handling
to handle such errors any of a hundred different ways. That's not a
limitation of ADO.NET in any regard.
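
A hedged sketch of the RowUpdated approach, which answers the earlier "how do you tell the user that update #3 of 40 failed" question (table and column names are illustrative assumptions):

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

SqlConnection conn = new SqlConnection("...connection string...");
SqlDataAdapter da = new SqlDataAdapter(
    "SELECT CustomerID, Name FROM Customers", conn);
SqlCommandBuilder cb = new SqlCommandBuilder(da);

DataTable table = new DataTable();
da.Fill(table);
// ...local edits happen here...

da.ContinueUpdateOnError = true;   // a failed row no longer aborts the rest
da.RowUpdated += (sender, e) =>
{
    if (e.Status == UpdateStatus.ErrorsOccurred)
        Console.WriteLine("Row with ID {0} failed: {1}",
            e.Row["CustomerID"], e.Errors.Message);
};

da.Update(table);

// Failed rows stay flagged in the DataTable, so the UI can list them.
foreach (DataRow bad in table.GetErrors())
    Console.WriteLine("{0}: {1}", bad["CustomerID"], bad.RowError);
```

With ContinueUpdateOnError set, each failed row gets its RowError set and the adapter moves on, so the user can be shown exactly which of the 40 rows did not go through.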

3) At present, the DataAdapter only supports one-at-a-time updates, so you
can't do bulk updates in the sense that most people think of bulk updates.
Since the DataAdapter updates everything one at a time, there really isn't
a question to answer.

HTH,

Bill
 
Hi Bernie,

I think that next time you should say how huge those datasets are; as it
stands, it looks as if 1 GB is needed for a normal dataset.
Finally, yes, memory is the key to success here - you can't have too much.
My clients get pc's with xp pro and a minimum of 1 gig ram.

I think that with a recordset as big as the datasets you are now making, the
difference would not be much, and it is not unthinkable that the recordset
would consume more memory.

:-)

Cor.

 
If you go over to the ADO classic newsgroups you'll see countless questions
and complaints about deployment, and problems with programs working on one
platform but not another. You'll also find programs that work when first
deployed but stop working later for unexplainable reasons. This is the
nature of COM. We lobbied Microsoft for years to fix it. They did. ADO.NET
is built without the need for COM. While it requires MDAC for its netlibs,
ADO.NET is not affected by MDAC "upgrades".

The ADO classic (what I dub ADOc) Update method is a black box that makes a
number of arbitrary decisions about concurrency, and while it gives you more
flexibility today (you can change the Update behavior by using the Update
Criteria property), I expect the Whidbey version of ADO.NET to be more
flexible in this respect. However, consider that serious applications have
trouble using ADOc Update--even when carefully tuned. That's because the
Update method makes a fundamentally flawed assumption--that you're directly
updating a database table. In most scalable professional systems, we've seen
that data tables are neither queried nor updated directly. In these systems
stored procedures form the first line of interface to the data--it's a
common and widely used practice. While it's harder to implement this
approach, it's safer, faster and far more scalable. None of these developers
complain about the memory footprint of a Dataset (or a Recordset) because
it's so small. That's because they fetch very few rows from the stored
procedures and rarely all the rows of a table. Sure, lookup tables are
queried as a whole, but in this case the number of rows is typically a few
hundred or fewer rows and rarely updated from a typical client application.

ADO.NET is part of the .NET Framework and it's very different from ADOc. It
approaches data access problems with entirely new technology. For the first
time the data provider can use the low-level "native" protocol without
having to go through a one-size-fits-all OLE DB provider. This means
ADO.NET can be faster, lighter, and expose more of the target DBMS features
than ADOc.

Sure, these differences make ADO.NET harder to learn. I've found that if you
don't try to compare the two, you're better off. There are so many
differences that it's just easier to think of ADO.NET as a whole new game.
I discuss the transition between ADOc and ADO.NET in my books.

hth

--
____________________________________
William (Bill) Vaughn
Author, Mentor, Consultant
Microsoft MVP
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
__________________________________
 
Hi Cor,

I see your point, but I think he will understand that substantial amounts of
RAM will always be of value.

Bernie
 