ADO.NET 2.0 Batch Update

joeycalisay

I've been seeing that SqlDataAdapter can now handle batch updates in version
2.0. Can anyone provide some insight into how it does this internally?
Thanks!
 
Basically it allows a DataAdapter to group updates to the server
instead of sending them one row at a time. The new UpdateBatchSize
property specifies the size, or number of rows, that you want sent in
each batch. Setting this property to 0 will cause the batch size to be
as large as the server can handle, and setting it to 1 disables it
(sending one row at a time, the default behavior). An extremely large
batch could negatively impact performance, so you need to test for the
optimum batch size for your app. There will be more information on
batch updating in the Beta 2 documentation.
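
Setting it up looks roughly like this. A minimal sketch (the connection
string, table, and column names are placeholders; note that batched commands
need UpdatedRowSource set to None, since a batch cannot return per-row
results):

using System.Data;
using System.Data.SqlClient;

class UpdateBatchSizeSketch
{
    static void Main()
    {
        // Placeholder connection string and query.
        using (SqlConnection conn = new SqlConnection(
            "Data Source=.;Initial Catalog=Northwind;Integrated Security=SSPI"))
        using (SqlDataAdapter adapter = new SqlDataAdapter(
            "SELECT CustomerID, CompanyName FROM Customers", conn))
        {
            SqlCommandBuilder builder = new SqlCommandBuilder(adapter);
            adapter.UpdateCommand = builder.GetUpdateCommand();
            // Batched commands may not fetch results back into the row.
            adapter.UpdateCommand.UpdatedRowSource = UpdateRowSource.None;

            adapter.UpdateBatchSize = 100; // 1 = off (default), 0 = as large as the server allows

            DataTable table = new DataTable();
            adapter.Fill(table);
            foreach (DataRow row in table.Rows)
                row["CompanyName"] = row["CompanyName"].ToString().Trim();

            adapter.Update(table); // changed rows go to the server in groups of up to 100
        }
    }
}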

--Mary
 
In the 1.x implementation, the DataAdapter loops through the data and inserts
one row at a time. In 2.0, it will batch if you set a batch size (0 lets the
server take as many as it can at one time).

I have not checked the internals, but I would assume it has something to do
with the fact that the DataSet is rendered as XML under the hood. It would be
fairly easy to run a "decompile" of the 2.0 class(es) in ILDASM to see the
behavior.


---

Gregory A. Beamer
MVP; MCP: +I, SE, SD, DBA

***************************
Think Outside the Box!
***************************
 
Ah, Greg, the DataSet is not stored as XML under the hood. The DataTable is
a block of arrays, and the DataSet is simply a pointer to the DataTable
objects. ADO.NET only converts the data into XML on demand.

The idea behind batch updates (as was supported in ADO classic) is to reduce
(or eliminate) needless round trips. It uses an expanded paradigm to deal
with exceptions that occur and the events that have to fire before and after
the operations take place.
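
To make those before/after events concrete, here is a sketch (assuming a
SqlDataAdapter already configured for batching, as in the earlier example):
the row-level event still fires per row, while the post-execution event
reports a whole batch when batching is on.

using System.Data;
using System.Data.SqlClient;

static class BatchEventSketch
{
    internal static void WireEvents(SqlDataAdapter adapter)
    {
        adapter.RowUpdating += delegate(object sender, SqlRowUpdatingEventArgs e)
        {
            // Fires once per row, before the row joins the pending batch.
        };
        adapter.RowUpdated += delegate(object sender, SqlRowUpdatedEventArgs e)
        {
            // With UpdateBatchSize != 1 this fires once per executed batch;
            // e.RowCount reports how many rows that batch carried.
            if (e.Errors != null)
            {
                e.Status = UpdateStatus.Continue; // e.g. keep going past a failed batch
            }
        };
    }
}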

--
____________________________________
William (Bill) Vaughn
Author, Mentor, Consultant
Microsoft MVP
www.betav.com
Please reply only to the newsgroup so that others can benefit.
This posting is provided "AS IS" with no warranties, and confers no rights.
__________________________________
 
After downloading the beta framework and disassembling the code, I was able
to see that the batch handling was done by creating a semicolon-delimited
query string executed using sp_executesql. Initially I thought that, under
the hood, it was using XML through updategrams/diffgrams or other techniques.

Thanks a lot, guys.


private void BuildCommandTextCall(string cmdTxt, DbParameter cmdTextParam,
    DbParameter cmdParamDef, DbParameter[] parameters)
{
    // _commandText is a StringBuilder accumulating the batched command text.
    // Append the sp_executesql call with the parameter holding the SQL text.
    this._commandText.Append("exec sp_executesql ").Append(cmdTextParam.ParameterName);

    if ((parameters == null) || (0 >= parameters.Length))
    {
        return;
    }

    string text1 = ", ";

    // Append the parameter-definitions argument, then each statement parameter.
    this._commandText.Append(text1).Append(cmdParamDef.ParameterName);
    for (int num1 = 0; num1 < parameters.Length; num1++)
    {
        this._commandText.Append(text1).Append(parameters[num1].ParameterName);
    }

    // Terminate this statement so the next batched call can follow.
    this._commandText.Append(";");
}
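
So for a batch of rows, the adapter was effectively concatenating one
sp_executesql call per row. A tiny standalone sketch of that same pattern
(the parameter names here are invented for illustration):

using System;
using System.Text;

class BatchTextSketch
{
    static void Main()
    {
        StringBuilder commandText = new StringBuilder();
        string[][] rows =
        {
            new string[] { "@cmd1", "@defs1", "@p1", "@p2" },
            new string[] { "@cmd2", "@defs2", "@p3", "@p4" },
        };
        foreach (string[] row in rows)
        {
            // One "exec sp_executesql ..." per row, terminated by a semicolon,
            // mirroring BuildCommandTextCall above.
            commandText.Append("exec sp_executesql ")
                       .Append(string.Join(", ", row))
                       .Append(";");
        }
        // Prints:
        // exec sp_executesql @cmd1, @defs1, @p1, @p2;exec sp_executesql @cmd2, @defs2, @p3, @p4;
        Console.WriteLine(commandText.ToString());
    }
}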
 
We are no longer using this design (as of the November CTP drop), since using
sp_executesql runs into the 2100-parameter limit very quickly. Take a look
at Julia's blog for more information:
http://www.thedatafarm.com/blog/PermaLink.aspx?guid=989b060e-3c01-4300-b968-c1119c135aa2
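
(For a rough sense of scale, as illustrative arithmetic only: if each row's
update statement carries, say, ten parameters, a single manually batched
sp_executesql command reaches the 2100-parameter ceiling at around 200 rows.)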

--
Angel Saenz-Badillos [MS] Managed Providers
This posting is provided "AS IS", with no warranties, and confers no
rights. Please do not send email directly to this alias.
This alias is for newsgroup purposes only.
I am now blogging about ADO.NET: http://weblogs.asp.net/angelsb/
 
Angel Saenz-Badillos said:
AdapterBatchUpdate has been completely reworked. We are no longer
manually batching commands in SqlClient, since this runs into a 2100
parameter limit with the sp_executesql stored procedure.

Can you send me a link which fully describes the above problem for
sp_executesql? Thanks...

Angel Saenz-Badillos said:
You will now be able to batch as many rows as you want without restriction.

Then how are you doing batch update this time? Sorry for the series of
questions, but I am interested in what the new version will be doing
regarding batch updates.

 
Joey,
I used to have the link to the Books Online page that shows the different
SQL Server limits; I am sorry to say I can't find it now. The 2100-parameter
limit is not specific to sp_executesql: it is the maximum number of parameters
that a SQL Server 2000 or 2005 stored procedure can handle. For SQL Server 7
the limit was only 1024. In both cases stored procedures are limited to a
128 MB maximum size.

If you look at SQL Profiler when executing a batch, you will see exactly the
behavior you are used to: each row will have an individual query. But instead
of building one query and sending it to the server, the client now builds the
"adapter batch update" rows and sends the entire group off in one round trip.
There is a very good article on this feature coming up on MSDN from Julia
Lerman that covers the new design behavior; here is a link to her blog post:
http://www.awprofessional.com/articles/article.asp?p=25288&seqNum=4
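
One way to observe that grouping from client code is the 2.0 connection
statistics. A sketch (assuming a connection and a batching adapter set up as
in the earlier examples):

using System;
using System.Collections;
using System.Data;
using System.Data.SqlClient;

static class RoundTripSketch
{
    // Assumes conn is open and adapter/table are configured for batching.
    internal static void UpdateAndReport(SqlConnection conn, SqlDataAdapter adapter, DataTable table)
    {
        conn.StatisticsEnabled = true;
        conn.ResetStatistics();

        adapter.Update(table);

        IDictionary stats = conn.RetrieveStatistics();
        // Profiler still shows one statement per row, but the round-trip
        // count reflects whole batches: with UpdateBatchSize = 1 expect
        // about one round trip per changed row; with batching, far fewer.
        Console.WriteLine("Rows sent: {0}, server round trips: {1}",
            table.Rows.Count, stats["ServerRoundtrips"]);
    }
}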

--
Angel Saenz-Badillos [MS] Managed Providers
This posting is provided "AS IS", with no warranties, and confers no
rights. Please do not send email directly to this alias.
This alias is for newsgroup purposes only.
I am now blogging about ADO.NET: http://weblogs.asp.net/angelsb/
 
thanks a lot!

 
It would help if I actually posted the link I intended. Julia's blog post is
at:

http://www.thedatafarm.com/blog/PermaLink.aspx?guid=989b060e-3c01-4300-b968-c1119c135aa2

The link in my previous post is for a website that has SQL 7 parameter limit
information; I could not find the official Books Online link.

--
Angel Saenz-Badillos [MS] Managed Providers
This posting is provided "AS IS", with no warranties, and confers no
rights. Please do not send email directly to this alias.
This alias is for newsgroup purposes only.
I am now blogging about ADO.NET: http://weblogs.asp.net/angelsb/
 
Joey-

I posted a screenshot and some further explanation on my blog for you:

http://www.thedatafarm.com/blog/PermaLink.aspx?guid=4099cc98-ccba-4a30-b7bf-4b1c64200e36

Julie


 