Abdul Malik Said
Does anyone know how best to do diagnostics on the data operations performed
by an ADO.NET dataset? I am interested in measuring the speed to compare
with the speed of doing the same operations to the same data in a SQL
database directly.
Any thoughts on the architecture of DB solutions that use ADO.NET
DataSets? I think it is slower to use an ADO.NET dataset for operations
such as searching for certain keys or joining. I think SQL must be optimised
to do that kind of operation better. But I am not an expert. I am interested
in the fastest way to make complex calculations on data and then update the
database. Is it better to put database operations in stored procs and call
them from the .NET code? Or is it better to perform the operations in a
dataset and then update the SQL tables afterward?
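For the stored-proc side of that question, the usual pattern is to keep the set-based work (joins, aggregates, key lookups) in T-SQL and call the procedure from .NET with `CommandType.StoredProcedure`. A minimal sketch, in which the procedure name, parameter, and connection string are all placeholders, not anything from an actual schema:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class StoredProcExample
{
    static void Main()
    {
        // Hypothetical connection string and proc name -- substitute your own.
        using (SqlConnection conn = new SqlConnection(
            "Server=.;Database=MyDb;Integrated Security=SSPI"))
        {
            SqlCommand cmd = new SqlCommand("usp_RecalculateTotals", conn);
            cmd.CommandType = CommandType.StoredProcedure;

            // Pass the key the calculation should run over.
            cmd.Parameters.Add("@CustomerId", SqlDbType.Int).Value = 42;

            conn.Open();
            // The heavy lifting (joins, updates) happens inside SQL Server;
            // the client only round-trips once.
            int rowsAffected = cmd.ExecuteNonQuery();
            Console.WriteLine("Rows affected: " + rowsAffected);
        }
    }
}
```

The trade-off is roughly: one round trip and the query optimiser's index-aware joins on the server, versus pulling the rows into a DataSet and doing the same work row by row in client memory.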
Is there a way to do a bulk insert from an ADO.NET dataset? By this, I mean
a true bulk insert like in SQL Server, instead of looping through the
dataset to insert updated records.
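On the bulk-insert question: if .NET 2.0 is an option, `SqlBulkCopy` can write an entire `DataTable` to the server in one bulk operation via `WriteToServer`, rather than looping and issuing an INSERT per row. A sketch, with the destination table and connection string as placeholders:

```csharp
using System.Data;
using System.Data.SqlClient;

class BulkCopyExample
{
    static void Main()
    {
        DataTable table = BuildTable(); // the rows to insert

        using (SqlConnection conn = new SqlConnection(
            "Server=.;Database=MyDb;Integrated Security=SSPI"))
        using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
        {
            bulk.DestinationTableName = "dbo.Orders"; // placeholder table name
            conn.Open();
            // One bulk operation for the whole table -- no row-by-row loop.
            bulk.WriteToServer(table);
        }
    }

    static DataTable BuildTable()
    {
        // Stand-in for the dataset's table; columns must line up with
        // the destination table (or be mapped via bulk.ColumnMappings).
        DataTable t = new DataTable("Orders");
        t.Columns.Add("OrderId", typeof(int));
        t.Columns.Add("Amount", typeof(decimal));
        t.Rows.Add(1, 10.50m);
        return t;
    }
}
```

For bulk *updates* (as opposed to inserts) there is no direct equivalent; a common workaround is to bulk-copy the changed rows into a staging table and then run one set-based UPDATE joining staging to the real table.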
Any discussion would be greatly appreciated.
Abdul Malik Said