How to read OleDbSchemaGuid tables 1 record at a time?

  • Thread starter: tommaso.gastaldi

Hi,

If we have an OleDbConnection, it is possible to retrieve a DataTable
containing information about the database fields using the statement
(watch out for the line breaking):

OleDbConnection.GetOleDbSchemaTable(System.Data.OleDb.OleDbSchemaGuid.Columns,
New Object() {})

This, however, returns a whole DataTable, which in some cases can be very
large. For instance, if you do it with SAP you receive over half a million
rows.

My question:

I would like to do the same operation programmatically, without using
the GetOleDbSchemaTable statement, possibly taking 1 record at a time,
similarly to the DataReader mechanism.

Can anyone suggest how to achieve that?

-Tom
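For reference, here is a minimal sketch of the call above in context. The connection string is a placeholder (assumed SQLOLEDB against a local SQL Server); adapt it to your provider:

```vbnet
Imports System.Data
Imports System.Data.OleDb

Module SchemaDump
    Sub Main()
        ' Placeholder connection string; replace with your own provider/source.
        Using cn As New OleDbConnection("Provider=SQLOLEDB;Data Source=.;Integrated Security=SSPI")
            cn.Open()
            ' No restrictions: returns schema rows for ALL columns in the catalog at once.
            Dim schema As DataTable = cn.GetOleDbSchemaTable(OleDbSchemaGuid.Columns, New Object() {})
            For Each row As DataRow In schema.Rows
                Console.WriteLine("{0}.{1}", row("TABLE_NAME"), row("COLUMN_NAME"))
            Next
        End Using
    End Sub
End Module
```

This is exactly the whole-table behavior being complained about: the entire schema rowset is buffered in the DataTable before the loop starts.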
 
Of course Miha, but I was talking about processing rows (not fields).
It's clear that one can select columns by passing an argument to
GetOleDbSchemaTable.

My question was different: I want a programmatic substitute for
GetOleDbSchemaTable that lets me consider 1 record at a time.

Thank you for your reply.

-Tom

Miha Markic [MVP C#] wrote:
You can pass an array of filter values as a last parameter.

--
Miha Markic [MVP C#, INETA Country Leader for Slovenia]
RightHand .NET consulting & development www.rthand.com
Blog: http://cs.rthand.com/blogs/blog_with_righthand/

...And just loop through the rows using the DataTable.Rows collection (it
would be interesting to check whether restrictions are applied on the result
or processed server side for your DB).

Otherwise, if you really want a replacement, you'll have to use specific
statements for this (for example, for SQL Server you could use the
information_schema.columns view or the appropriate stored procedures).

--
Patrice
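The second option can be sketched like this for SQL Server. INFORMATION_SCHEMA.COLUMNS is a standard SQL Server view, but the connection string is a placeholder, and this approach is, as noted, DBMS-specific:

```vbnet
Imports System.Data.OleDb

Module ColumnReader
    Sub Main()
        ' Placeholder connection string; SQL Server only.
        Using cn As New OleDbConnection("Provider=SQLOLEDB;Data Source=.;Integrated Security=SSPI")
            cn.Open()
            Dim cmd As New OleDbCommand( _
                "SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE FROM INFORMATION_SCHEMA.COLUMNS", cn)
            ' ExecuteReader streams one record at a time instead of buffering a whole DataTable.
            Using rdr As OleDbDataReader = cmd.ExecuteReader()
                While rdr.Read()
                    Console.WriteLine("{0}.{1} : {2}", _
                        rdr.GetString(0), rdr.GetString(1), rdr.GetString(2))
                End While
            End Using
        End Using
    End Sub
End Module
```

This gives the DataReader-style, one-record-at-a-time behavior the question asks for, at the cost of being tied to one DBMS's metadata views.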

Hi Patrice. Thanks... Hmmm,
it seems I really have a problem with my English :) Sorry!

It's clear that, once one has a DataTable, one can read a
row at a time.

What I was trying to explain is that I do not want to get the
whole DataTable, but just get one record at a time.

This would be the same difference as between:

DataAdapter + Fill() [gets the whole table at once]
DataReader + Read() [gets 1 record at a time]

I would like to know how to implement something
similar to the DataReader + Read() scheme
for OleDbSchemaGuid tables. And I want it to work
for any OleDb connection (independently of the DBMS).

I hope my goal is clearer now.

Thank you!
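The contrast being described can be sketched as follows. The table name and connection string are placeholders, not anything from the thread:

```vbnet
Imports System.Data
Imports System.Data.OleDb

Module FillVsRead
    Sub Main()
        ' Placeholder connection string.
        Dim cs As String = "Provider=SQLOLEDB;Data Source=.;Integrated Security=SSPI"

        ' DataAdapter + Fill(): the whole result set is materialized in memory.
        Using cn As New OleDbConnection(cs)
            Dim da As New OleDbDataAdapter("SELECT * FROM SomeTable", cn)
            Dim dt As New DataTable()
            da.Fill(dt)                      ' all rows buffered at once
            Console.WriteLine(dt.Rows.Count)
        End Using

        ' DataReader + Read(): rows are streamed one at a time.
        Using cn As New OleDbConnection(cs)
            cn.Open()
            Dim cmd As New OleDbCommand("SELECT * FROM SomeTable", cn)
            Using rdr As OleDbDataReader = cmd.ExecuteReader()
                While rdr.Read()            ' one row held at a time
                    Console.WriteLine(rdr(0))
                End While
            End Using
        End Using
    End Sub
End Module
```

The point of the question is that GetOleDbSchemaTable only offers the first shape; there is no built-in streaming equivalent for schema rowsets.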



But why would you want to?
There is no significant gain....

--
Miha Markic [MVP C#, INETA Country Leader for Slovenia]
RightHand .NET consulting & development www.rthand.com
Blog: http://cs.rthand.com/blogs/blog_with_righthand/

English is not my native language either. I should just have been more
explicit.

This is the second part, then: use a custom replacement for your DBMS with a
DataReader (note that a DataReader still uses a buffer).

What I wanted to convey with the first part is that the problem, as it looks
to me, is that you get all the data at once. With restrictions (as suggested
by Miha) you can get only those you are interested in, using a limited
amount of memory. Though you may want to check that restrictions are
processed server side, it looks like quite a good solution.

--
Patrice

Dear Miha, Patrice, ... and all,

the advantage of having a mechanism similar to the DataReader is obvious.

Assume I am getting only the very first field of the COLUMNS schema
table.

If I connect to SAP and run the GetOleDbSchemaTable command, I get a
DataTable with 1 field and over half a million rows.

I would like to get 1 value at a time and process it, instead of holding
over half a million values in memory at the same time, which I do not need.
Furthermore, I usually need to take and process many more columns at the
same time, and the size of the DataTable becomes enormous.

Of course with small systems this is not a big problem. But try to work
with real-world databases...

-tom

You basically have two options:

1) Even with GetOleDbSchemaTable you *don't* have to read all columns. With
the "restrictions" parameter, you are able to retrieve the columns for just
THE table you want; the table name is in the third position.
(Or this could be unsupported by SAP. Also, it's strange to me that you have
only 1 field in the resulting DataTable; I would expect at least both the
column name and the data type.)

2) Otherwise you can issue a custom statement to query for schema
information; then you'll be able to use a DataReader (but you'll lose the
OleDb independence you asked for).

To make it clear: even with #1 there is nothing that forces you to read half
a million rows. You just read the columns for the table you are currently
interested in before fetching data for the next table.

Try:

OleDbConnection.GetOleDbSchemaTable(System.Data.OleDb.OleDbSchemaGuid.Columns,
New Object() {Nothing, Nothing, "<TableName>"})

I gave this a try with SQL Server and it worked fine (and yes, restrictions
are processed server side; the stored procedure used just asks for the data
for this table).

Hope it's clearer now. (Do you access SAP data through some kind of
specific driver, or do you just use the driver for the underlying database?
Could it be a driver limitation?)

--
Patrice
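The restricted call, shown in context. VB.NET uses Nothing where C# would use null; "Customers" is a hypothetical table name and the connection string is a placeholder:

```vbnet
Imports System.Data
Imports System.Data.OleDb

Module RestrictedColumns
    Sub Main()
        ' Placeholder connection string.
        Using cn As New OleDbConnection("Provider=SQLOLEDB;Data Source=.;Integrated Security=SSPI")
            cn.Open()
            ' Restrictions for the COLUMNS rowset are, in order:
            ' {TABLE_CATALOG, TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME}.
            ' Passing the table name in third position limits the result to that table only.
            Dim cols As DataTable = cn.GetOleDbSchemaTable( _
                OleDbSchemaGuid.Columns, New Object() {Nothing, Nothing, "Customers"})
            For Each row As DataRow In cols.Rows
                Console.WriteLine("{0} ({1})", row("COLUMN_NAME"), row("DATA_TYPE"))
            Next
        End Using
    End Sub
End Module
```

Whether the restriction is evaluated server side depends on the OLE DB provider; as noted above, it is for SQL Server.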

Wow, that's big.
Well, I still suggest you do the filtering and process the small amount of
rows.
As another solution, as Patrice already mentioned, you can execute proper
SQL statements to query the database metadata, and you will have full
control. Of course, it is database specific then.

--
Miha Markic [MVP C#, INETA Country Leader for Slovenia]
RightHand .NET consulting & development www.rthand.com
Blog: http://cs.rthand.com/blogs/blog_with_righthand/

Thank you very much all, I will reply to each one:

Patrice, Miha,

I know the filtering mechanism well, but it does not seem
to help, because, for my purposes, I DO have to read ALL
the records.

What I wished for is not having to store all of them in memory.
Just process and dismiss.

Your suggestion would turn out to be "the" solution
if you were able to provide a filtering
mechanism which allows taking N rows at a time:

N, then the next N, then the next N ... and so on.

Are you able to do that for any DBMS, and
without knowing a priori what is stored in the records?
(This condition is crucial, because I am working against
several kinds of DBMS. For instance, SAP can be on SQL Server,
Oracle, ...)

Cor,

Yes, that one is fine, but it's another piece of information.
I am scanning all the schema GUIDs to get the complete
DB structure.

-tom



Basically I would do something like:

For Each table (using GetOleDbSchemaTable to get all tables)
    read the columns for this table (using GetOleDbSchemaTable with
    restrictions to get columns only for this table)
    process these columns
Next

You never have all of them in memory. You just process each table in
turn...

--
Patrice
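The pseudocode above could look like this in VB.NET. Only one table's worth of column rows is alive at a time; the connection string is a placeholder:

```vbnet
Imports System.Data
Imports System.Data.OleDb

Module PerTableScan
    Sub Main()
        ' Placeholder connection string.
        Using cn As New OleDbConnection("Provider=SQLOLEDB;Data Source=.;Integrated Security=SSPI")
            cn.Open()
            ' First get the list of tables ("TABLE" in fourth position filters out views etc.).
            Dim tables As DataTable = cn.GetOleDbSchemaTable( _
                OleDbSchemaGuid.Tables, New Object() {Nothing, Nothing, Nothing, "TABLE"})
            For Each t As DataRow In tables.Rows
                Dim name As String = CStr(t("TABLE_NAME"))
                ' ...then fetch columns one table at a time via the restrictions array.
                Dim cols As DataTable = cn.GetOleDbSchemaTable( _
                    OleDbSchemaGuid.Columns, New Object() {Nothing, Nothing, name})
                For Each c As DataRow In cols.Rows
                    Console.WriteLine("{0}.{1}", name, c("COLUMN_NAME"))
                Next
                cols.Dispose()               ' discard before moving to the next table
            Next
        End Using
    End Sub
End Module
```

This keeps memory bounded by the widest single table rather than the whole catalog, while staying provider-independent, which appears to be the closest available compromise to a true schema DataReader.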

<[email protected]> a écrit dans le message de (e-mail address removed)...
Thank you very much all, I will reply to each one:

Patrice, Miha

I know well the filtering mechanism but it does not seem
to help. Because, for my purposes, I DO have to read ALL
the records.

What I wished is not to have to store all of them in memory.
Just process and dismiss.

Your suggestion would turn to be "the" solution
if you could be able to provide a filtering
mechanism which allows to take N rows at a time

n and N and N and N .... and so on

Are you able to do that, for any DBMS and
without knowing a priori what is stored in the records -
(this condition is crucial, because I am working against
several kinds of DBMS. For instance SAP can be on SQL server,
Oracle, ...)

Cor,

Yes that one is fine, but it's another piece of information.
I am scanning all the schemaguids to get the complete
DB structure.

-tom



Miha Markic [MVP C#] ha scritto:
Wow, that's big.
Well, I still suggest you to do filtering and process the small amount of
rows.
As another solution, Patrice already mentioned, do execute proper sql
statments to query the database metadata and you will have full control.
Of course, it is database specific then.

--
Miha Markic [MVP C#, INETA Country Leader for Slovenia]
RightHand .NET consulting & development www.rthand.com
Blog: http://cs.rthand.com/blogs/blog_with_righthand/

Dear Miha , Patrice, ... and all

the advantage of having a mechanism similar to DataReader is obvious.

Assume I am getting only the very first field of the COLUMN schema
table.

If I connect to SAP and run the GetOleDbSchemaTable command I get a
datatable with 1 field
and over half million rows.

I would like to get 1 value at a time to process it instead of holding
over half million
values in memory at the same time, which I do not need.
Further I usually need to take and process many more columns at the
same time and the size of the datatable become enormous.

Of course with small systems this is not a big problem. But try to work
with real world databases...

-tom

Patrice ha scritto:
English is not my native language either. I should just have been more
explicit.

This is the second part then. Use a custom replacement for your DBMS with a
DataReader (note that a DataReader still uses a buffer).

What I wanted to convey with the first part is that the problem looks to me
like you get all the data at once. With restrictions (as suggested by
Miha) you can get only those you are interested in, using a limited
amount of memory. Though you may want to check that restrictions are
processed server side, it looks like quite a good solution.

--
Patrice

<[email protected]> a écrit dans le message de (e-mail address removed)...
Hi Patrice. Thanks... Hmmm,
it seems I really have a problem with my English :) Sorry!

It's clear that - once one has a DataTable - one can read a
row at a time.

What I was trying to explain is that I do not want to get the
whole DataTable, but just get one record at a time.

This would be the same difference between:

DataAdapter + Fill() [gets the whole table at a time]
DataReader + Read() [gets 1 record at a time]

I would like to know how to implement something
similar to the DataReader + Read() schema

for OleDbSchemaGuid tables. And I want it to work
for any OleDb connection (independently of the DBMS).

I hope my goal is clearer now.

Thank you!
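To make the distinction concrete, here is a minimal sketch of the two patterns (assuming an open OleDbConnection named cn and a placeholder table name SomeTable):

'.....................
' 1) DataAdapter + Fill: the whole result set is materialized in memory.
Dim da As New OleDbDataAdapter("SELECT * FROM SomeTable", cn)
Dim dt As New DataTable()
da.Fill(dt)

' 2) DataReader + Read: one buffered record at a time.
Dim cmd As New OleDbCommand("SELECT * FROM SomeTable", cn)
Dim rdr As OleDbDataReader = cmd.ExecuteReader()
While rdr.Read()
    Console.WriteLine(rdr(0)) ' process only the current record
End While
rdr.Close()
'.....................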



Patrice ha scritto:
...And just loop through the rows using the DataTable.Rows collection
(it would be interesting to check whether restrictions are applied on the
result or are processed server side for your DB).

Else, if you really want a replacement, you'll have to use specific
statements for this (for example, for SQL Server you could use the
information_schema.columns view or the appropriate stored procedures).

--
Patrice
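For SQL Server specifically, such a replacement might look like the sketch below. This is an illustration only: the connection string is a placeholder, and INFORMATION_SCHEMA is SQL Server specific, so other DBMS would need their own catalog views.

'.....................
' Stream column metadata one row at a time through a DataReader.
' The connection string is a placeholder; adjust for your server.
Dim cn As New OleDbConnection( _
    "Provider=SQLOLEDB;Data Source=(local);Integrated Security=SSPI")
cn.Open()
Dim cmd As New OleDbCommand( _
    "SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE " & _
    "FROM INFORMATION_SCHEMA.COLUMNS", cn)
Dim rdr As OleDbDataReader = cmd.ExecuteReader()
While rdr.Read()
    ' Process one record, then let it go; only the current row's
    ' buffer is held in memory.
    Console.WriteLine(rdr.GetString(0) & "." & rdr.GetString(1))
End While
rdr.Close()
cn.Close()
'.....................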

<[email protected]> a écrit dans le message de (e-mail address removed)...
Of course Miha, but I was talking about processing rows (not
fields).
It's clear that one
can select columns passing an argument to GetOleDbSchemaTable.

My question was another: I want a programmatic substitute of
GetOleDbSchemaTable
that allows me to consider 1 record at a time.

Thank you for your reply.

-Tom

Miha Markic [MVP C#] ha scritto:

You can pass an array of filter values as a last parameter.

--
Miha Markic [MVP C#, INETA Country Leader for Slovenia]
RightHand .NET consulting & development www.rthand.com
Blog: http://cs.rthand.com/blogs/blog_with_righthand/

Hi,

if we have an OleDbConnection it is possible to retrieve a
datatable
containing information about the database fields using the
statement
(watch out line breaking):

OleDbConnection.GetOleDbSchemaTable(System.Data.OleDb.OleDbSchemaGuid.Columns,
New Object() {})

This however returns a whole DataTable, which in some cases can be very
large. For instance, if you do it with SAP you receive over half a million
rows.

My question:

I would like to do the same operation programmatically and not
using
the statement GetOleDbSchemaTable. Possibly taking 1 record at a
time,
similarly to the DataReader mechanism.

Can anyone suggest how to achieve that?

-Tom
 
Yes Patrice,

I never said I am holding all the tables in memory. That would
be a suicide :) For now, I am scanning one table at a time.

Just wanted to figure out if I could get a row at a time and dismiss it.
But probably it's not something immediate to do, otherwise the answer
would probably already have popped up.

Perhaps, by using some low-level function also used by
GetOleDbSchemaTable,
it would be possible to change its behaviour from the Fill() way to a
kind of Read() way.

But I do not have information about that ... We need some OleDb guru
...

-tom

Patrice ha scritto:
Basically I would do something like :

for each table (using GetOleDbSchemaTable to get all tables)
read columns for this table (using GetOleDbSchemaTable with restrictions
to get columns only for this table)
process these columns
next

You never have all of them in memory. You just process each table in
turn....

--
Patrice
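Patrice's loop might be sketched like this (assuming an open OleDbConnection named cn; for the Columns rowset the restriction array positions are catalog, schema, table, column):

'.....................
' Per-table scan: only one table's column metadata is in memory at a time.
Dim tables As DataTable = cn.GetOleDbSchemaTable( _
    OleDbSchemaGuid.Tables, _
    New Object() {Nothing, Nothing, Nothing, "TABLE"})
For Each tRow As DataRow In tables.Rows
    Dim tableName As String = CStr(tRow("TABLE_NAME"))
    ' Columns restrictions: TABLE_CATALOG, TABLE_SCHEMA, TABLE_NAME, COLUMN_NAME
    Dim cols As DataTable = cn.GetOleDbSchemaTable( _
        OleDbSchemaGuid.Columns, _
        New Object() {Nothing, Nothing, tableName, Nothing})
    For Each cRow As DataRow In cols.Rows
        ' ... process this column, then dismiss ...
    Next
    cols.Dispose() ' release this table's metadata before the next one
Next
'.....................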

<[email protected]> a écrit dans le message de (e-mail address removed)...
Thank you very much all, I will reply to each one:

Patrice, Miha

I know well the filtering mechanism but it does not seem
to help. Because, for my purposes, I DO have to read ALL
the records.

What I wished is not to have to store all of them in memory.
Just process and dismiss.

Your suggestion would turn out to be "the" solution
if you could provide a filtering
mechanism that allows taking N rows at a time:

N and N and N and N .... and so on

Are you able to do that, for any DBMS and
without knowing a priori what is stored in the records -
(this condition is crucial, because I am working against
several kinds of DBMS. For instance SAP can be on SQL server,
Oracle, ...)

Cor,

Yes that one is fine, but it's another piece of information.
I am scanning all the schemaguids to get the complete
DB structure.

-tom



Miha Markic [MVP C#] ha scritto:
Wow, that's big.
Well, I still suggest you do the filtering and process a small number of
rows at a time.
As another solution, as Patrice already mentioned, execute the proper SQL
statements to query the database metadata and you will have full control.
Of course, it is database specific then.

--
Miha Markic [MVP C#, INETA Country Leader for Slovenia]
RightHand .NET consulting & development www.rthand.com
Blog: http://cs.rthand.com/blogs/blog_with_righthand/

Dear Miha , Patrice, ... and all

the advantage of having a mechanism similar to DataReader is obvious.

Assume I am getting only the very first field of the COLUMN schema
table.

If I connect to SAP and run the GetOleDbSchemaTable command I get a
DataTable with one field and over half a million rows.

I would like to get one value at a time to process it, instead of holding
over half a million values in memory at the same time, which I do not need.
Further, I usually need to take and process many more columns at the
same time, and the size of the DataTable becomes enormous.

Of course with small systems this is not a big problem. But try to work
with real world databases...

-tom

Patrice ha scritto:
English is not my native language either. I should just have been more
explicit.

This is the second part then. Use a custom replacement for your DBMS with a
DataReader (note that a DataReader still uses a buffer).

What I wanted to convey with the first part is that the problem looks to me
like you get all the data at once. With restrictions (as suggested by
Miha) you can get only those you are interested in, using a limited
amount of memory. Though you may want to check that restrictions are
processed server side, it looks like quite a good solution.

--
Patrice

<[email protected]> a écrit dans le message de (e-mail address removed)...
Hi Patrice. Thanks... Hmmm,
it seems I really have a problem with my English :) Sorry!

It's clear that - once one has a DataTable - one can read a
row at a time.

What I was trying to explain is that I do not want to get the
whole DataTable, but just get one record at a time.

This would be the same difference between:

DataAdapter + Fill() [gets the whole table at a time]
DataReader + Read() [gets 1 record at a time]

I would like to know how to implement something
similar to the DataReader + Read() schema

for OleDbSchemaGuid tables. And I want it to work
for any OleDb connection (independently of the DBMS).

I hope my goal is clearer now.

Thank you!



Patrice ha scritto:

...And just loop through the rows using the DataTable.Rows collection
(it would be interesting to check whether restrictions are applied on the
result or are processed server side for your DB).

Else, if you really want a replacement, you'll have to use specific
statements for this (for example, for SQL Server you could use the
information_schema.columns view or the appropriate stored procedures).

--
Patrice

<[email protected]> a écrit dans le message de (e-mail address removed)...
Of course Miha, but I was talking about processing rows (not
fields).
It's clear that one
can select columns passing an argument to GetOleDbSchemaTable.

My question was another: I want a programmatic substitute of
GetOleDbSchemaTable
that allows me to consider 1 record at a time.

Thank you for your reply.

-Tom

Miha Markic [MVP C#] ha scritto:

You can pass an array of filter values as a last parameter.

--
Miha Markic [MVP C#, INETA Country Leader for Slovenia]
RightHand .NET consulting & development www.rthand.com
Blog: http://cs.rthand.com/blogs/blog_with_righthand/

Hi,

if we have an OleDbConnection it is possible to retrieve a
datatable
containing information about the database fields using the
statement
(watch out line breaking):

OleDbConnection.GetOleDbSchemaTable(System.Data.OleDb.OleDbSchemaGuid.Columns,
New Object() {})

This however returns a whole DataTable, which in some cases can be very
large. For instance, if you do it with SAP you receive over half a million
rows.

My question:

I would like to do the same operation programmatically and not
using
the statement GetOleDbSchemaTable. Possibly taking 1 record at a
time,
similarly to the DataReader mechanism.

Can anyone suggest how to achieve that?

-Tom
 
What would be the benefit of having "a row" (even the DataReader uses a
buffer) as opposed to having all rows *for a single table* at any one time
in memory?

Else use the second approach (using db-specific statements).

A third approach would be to use a reflector to see what calls
.NET is doing under the hood and try to reproduce them using a DataReader.

IMO it would be best to first make sure you have some kind of real issue
here before wasting too much time on "optimizing" this (have you actually
*seen* a problem with scanning one table at a time?)

Good luck...

--
Patrice

<[email protected]> a écrit dans le message de (e-mail address removed)...
Yes Patrice,

I never said I am holding all the tables in memory. That would
be a suicide :) For now, I am scanning one table at a time.

Just wanted to figure out if I could get a row at a time and dismiss it.
But probably it's not something immediate to do, otherwise the answer
would probably already have popped up.

Perhaps, by using some low-level function also used by
GetOleDbSchemaTable,
it would be possible to change its behaviour from the Fill() way to a
kind of Read() way.

But I do not have information about that ... We need some OleDb guru
...

-tom

Patrice ha scritto:
Basically I would do something like :

for each table (using GetOleDbSchemaTable to get all tables)
read columns for this table (using GetOleDbSchemaTable with
restrictions
to get columns only for this table)
process these columns
next

You never have all of them in memory. You just process each table in
turn....

--
Patrice

<[email protected]> a écrit dans le message de (e-mail address removed)...
Thank you very much all, I will reply to each one:

Patrice, Miha

I know well the filtering mechanism but it does not seem
to help. Because, for my purposes, I DO have to read ALL
the records.

What I wished is not to have to store all of them in memory.
Just process and dismiss.

Your suggestion would turn out to be "the" solution
if you could provide a filtering
mechanism that allows taking N rows at a time:

N and N and N and N .... and so on

Are you able to do that, for any DBMS and
without knowing a priori what is stored in the records -
(this condition is crucial, because I am working against
several kinds of DBMS. For instance SAP can be on SQL server,
Oracle, ...)

Cor,

Yes that one is fine, but it's another piece of information.
I am scanning all the schemaguids to get the complete
DB structure.

-tom



Miha Markic [MVP C#] ha scritto:
Wow, that's big.
Well, I still suggest you do the filtering and process a small number of
rows at a time.
As another solution, as Patrice already mentioned, execute the proper SQL
statements to query the database metadata and you will have full control.
Of course, it is database specific then.

--
Miha Markic [MVP C#, INETA Country Leader for Slovenia]
RightHand .NET consulting & development www.rthand.com
Blog: http://cs.rthand.com/blogs/blog_with_righthand/

Dear Miha , Patrice, ... and all

the advantage of having a mechanism similar to DataReader is obvious.

Assume I am getting only the very first field of the COLUMN schema
table.

If I connect to SAP and run the GetOleDbSchemaTable command I get a
DataTable with one field and over half a million rows.

I would like to get one value at a time to process it, instead of holding
over half a million values in memory at the same time, which I do not need.
Further, I usually need to take and process many more columns at the
same time, and the size of the DataTable becomes enormous.

Of course with small systems this is not a big problem. But try to work
with real world databases...

-tom

Patrice ha scritto:
English is not my native language either. I should just have been more
explicit.

This is the second part then. Use a custom replacement for your DBMS with a
DataReader (note that a DataReader still uses a buffer).

What I wanted to convey with the first part is that the problem looks to me
like you get all the data at once. With restrictions (as suggested by
Miha) you can get only those you are interested in, using a limited
amount of memory. Though you may want to check that restrictions are
processed server side, it looks like quite a good solution.

--
Patrice

<[email protected]> a écrit dans le message de (e-mail address removed)...
Hi Patrice. Thanks... Hmmm,
it seems I really have a problem with my English :) Sorry!

It's clear that - once one has a DataTable - one can read a
row at a time.

What I was trying to explain is that I do not want to get the
whole DataTable, but just get one record at a time.

This would be the same difference between:

DataAdapter + Fill() [gets the whole table at a time]
DataReader + Read() [gets 1 record at a time]

I would like to know how to implement something
similar to the DataReader + Read() schema

for OleDbSchemaGuid tables. And I want it to work
for any OleDb connection (independently of the DBMS).

I hope my goal is clearer now.

Thank you!



Patrice ha scritto:

...And just loop through the rows using the DataTable.Rows collection
(it would be interesting to check whether restrictions are applied on the
result or are processed server side for your DB).

Else, if you really want a replacement, you'll have to use specific
statements for this (for example, for SQL Server you could use the
information_schema.columns view or the appropriate stored procedures).

--
Patrice

<[email protected]> a écrit dans le message de (e-mail address removed)...
Of course Miha, but I was talking about processing rows (not
fields).
It's clear that one
can select columns passing an argument to GetOleDbSchemaTable.

My question was another: I want a programmatic substitute of
GetOleDbSchemaTable
that allows me to consider 1 record at a time.

Thank you for your reply.

-Tom

Miha Markic [MVP C#] ha scritto:

You can pass an array of filter values as a last parameter.

--
Miha Markic [MVP C#, INETA Country Leader for Slovenia]
RightHand .NET consulting & development www.rthand.com
Blog: http://cs.rthand.com/blogs/blog_with_righthand/

Hi,

if we have an OleDbConnection it is possible to retrieve a
datatable
containing information about the database fields using the
statement
(watch out line breaking):

OleDbConnection.GetOleDbSchemaTable(System.Data.OleDb.OleDbSchemaGuid.Columns,
New Object() {})

This however returns a whole DataTable, which in some cases can be very
large. For instance, if you do it with SAP you receive over half a million
rows.

My question:

I would like to do the same operation programmatically and not
using
the statement GetOleDbSchemaTable. Possibly taking 1 record at
a
time,
similarly to the DataReader mechanism.

Can anyone suggest how to achieve that?

-Tom
 
Thank you Patrice,

hmmm...

I find the part where you talk about a "reflector" quite interesting.
Actually, I have often seen people on this group who were able to provide
details of the inner workings of some procedures, and I have been wondering
where they get that information. It sounds like a kind of tool to see the
inner workings of the language commands.

Do you have any pointer about that? Is there a tool that can be
downloaded?

Thanks,

tom

Patrice ha scritto:
What would be the benefit of having "a row" (even the DataReader uses a
buffer) as opposed to having all rows *for a single table* at any one time
in memory?

Else use the second approach (using db-specific statements).

A third approach would be to use a reflector to see what calls
.NET is doing under the hood and try to reproduce them using a DataReader.

IMO it would be best to first make sure you have some kind of real issue
here before wasting too much time on "optimizing" this (have you actually
*seen* a problem with scanning one table at a time?)

Good luck...

--
Patrice

<[email protected]> a écrit dans le message de (e-mail address removed)...
Yes Patrice,

I never said I am holding all the tables in memory. That would
be a suicide :) For now, I am scanning one table at a time.

Just wanted to figure out if I could get a row at a time and dismiss it.
But probably it's not something immediate to do, otherwise the answer
would probably already have popped up.

Perhaps, by using some low-level function also used by
GetOleDbSchemaTable,
it would be possible to change its behaviour from the Fill() way to a
kind of Read() way.

But I do not have information about that ... We need some OleDb guru
...

-tom

Patrice ha scritto:
Basically I would do something like :

for each table (using GetOleDbSchemaTable to get all tables)
read columns for this table (using GetOleDbSchemaTable with
restrictions
to get columns only for this table)
process these columns
next

You never have all of them in memory. You just process each table in
turn....

--
Patrice

<[email protected]> a écrit dans le message de (e-mail address removed)...
Thank you very much all, I will reply to each one:

Patrice, Miha

I know well the filtering mechanism but it does not seem
to help. Because, for my purposes, I DO have to read ALL
the records.

What I wished is not to have to store all of them in memory.
Just process and dismiss.

Your suggestion would turn out to be "the" solution
if you could provide a filtering
mechanism that allows taking N rows at a time:

N and N and N and N .... and so on

Are you able to do that, for any DBMS and
without knowing a priori what is stored in the records -
(this condition is crucial, because I am working against
several kinds of DBMS. For instance SAP can be on SQL server,
Oracle, ...)

Cor,

Yes that one is fine, but it's another piece of information.
I am scanning all the schemaguids to get the complete
DB structure.

-tom



Miha Markic [MVP C#] ha scritto:
Wow, that's big.
Well, I still suggest you do the filtering and process a small number of
rows at a time.
As another solution, as Patrice already mentioned, execute the proper SQL
statements to query the database metadata and you will have full control.
Of course, it is database specific then.

--
Miha Markic [MVP C#, INETA Country Leader for Slovenia]
RightHand .NET consulting & development www.rthand.com
Blog: http://cs.rthand.com/blogs/blog_with_righthand/

Dear Miha , Patrice, ... and all

the advantage of having a mechanism similar to DataReader is obvious.

Assume I am getting only the very first field of the COLUMN schema
table.

If I connect to SAP and run the GetOleDbSchemaTable command I get a
DataTable with one field and over half a million rows.

I would like to get one value at a time to process it, instead of holding
over half a million values in memory at the same time, which I do not need.
Further, I usually need to take and process many more columns at the
same time, and the size of the DataTable becomes enormous.

Of course with small systems this is not a big problem. But try to work
with real world databases...

-tom

Patrice ha scritto:

English is not my native language either. I should just have been more
explicit.

This is the second part then. Use a custom replacement for your DBMS with a
DataReader (note that a DataReader still uses a buffer).

What I wanted to convey with the first part is that the problem looks to me
like you get all the data at once. With restrictions (as suggested by
Miha) you can get only those you are interested in, using a limited
amount of memory. Though you may want to check that restrictions are
processed server side, it looks like quite a good solution.

--
Patrice

<[email protected]> a écrit dans le message de (e-mail address removed)...
Hi Patrice. Thanks... Hmmm,
it seems I really have a problem with my English :) Sorry!

It's clear that - once one has a DataTable - one can read a
row at a time.

What I was trying to explain is that I do not want to get the
whole DataTable, but just get one record at a time.

This would be the same difference between:

DataAdapter + Fill() [gets the whole table at a time]
DataReader + Read() [gets 1 record at a time]

I would like to know how to implement something
similar to the DataReader + Read() schema

for OleDbSchemaGuid tables. And I want it to work
for any OleDb connection (independently of the DBMS).

I hope my goal is clearer now.

Thank you!



Patrice ha scritto:

...And just loop through the rows using the DataTable.Rows collection
(it would be interesting to check whether restrictions are applied on the
result or are processed server side for your DB).

Else, if you really want a replacement, you'll have to use specific
statements for this (for example, for SQL Server you could use the
information_schema.columns view or the appropriate stored procedures).

--
Patrice

<[email protected]> a écrit dans le message de (e-mail address removed)...
Of course Miha, but I was talking about processing rows (not
fields).
It's clear that one
can select columns passing an argument to GetOleDbSchemaTable.

My question was another: I want a programmatic substitute of
GetOleDbSchemaTable
that allows me to consider 1 record at a time.

Thank you for your reply.

-Tom

Miha Markic [MVP C#] ha scritto:

You can pass an array of filter values as a last parameter.

--
Miha Markic [MVP C#, INETA Country Leader for Slovenia]
RightHand .NET consulting & development www.rthand.com
Blog: http://cs.rthand.com/blogs/blog_with_righthand/

Hi,

if we have an OleDbConnection it is possible to retrieve a
datatable
containing information about the database fields using the
statement
(watch out line breaking):

OleDbConnection.GetOleDbSchemaTable(System.Data.OleDb.OleDbSchemaGuid.Columns,
New Object() {})

This however returns a whole DataTable, which in some cases can be very
large. For instance, if you do it with SAP you receive over half a million
rows.

My question:

I would like to do the same operation programmatically and not
using
the statement GetOleDbSchemaTable. Possibly taking 1 record at
a
time,
similarly to the DataReader mechanism.

Can anyone suggest how to achieve that?

-Tom
 
Reflector

http://www.aisto.com/roeder/dotnet/

It has nothing to do with reflection

Cor


<[email protected]> schreef in bericht
Thank you Patrice,

hmmm...

I find the part where you talk about a "reflector" quite interesting.
Actually, I have often seen people on this group who were able to provide
details of the inner workings of some procedures, and I have been wondering
where they get that information. It sounds like a kind of tool to see the
inner workings of the language commands.

Do you have any pointer about that? Is there a tool that can be
downloaded?

Thanks,

tom

Patrice ha scritto:
What would be the benefit of having "a row" (even the DataReader uses a
buffer) as opposed to having all rows *for a single table* at any one time
in memory?

Else use the second approach (using db-specific statements).

A third approach would be to use a reflector to see what calls
.NET is doing under the hood and try to reproduce them using a DataReader.

IMO it would be best to first make sure you have some kind of real issue
here before wasting too much time on "optimizing" this (have you actually
*seen* a problem with scanning one table at a time?)

Good luck...

--
Patrice

<[email protected]> a écrit dans le message de (e-mail address removed)...
Yes Patrice,

I never said I am holding all the tables in memory. That would
be a suicide :) For now, I am scanning one table at a time.

Just wanted to figure out if I could get a row at a time and dismiss it.
But probably it's not something immediate to do, otherwise the answer
would probably already have popped up.

Perhaps, by using some low-level function also used by
GetOleDbSchemaTable,
it would be possible to change its behaviour from the Fill() way to a
kind of Read() way.

But I do not have information about that ... We need some OleDb guru
...

-tom

Patrice ha scritto:
Basically I would do something like :

for each table (using GetOleDbSchemaTable to get all tables)
read columns for this table (using GetOleDbSchemaTable with
restrictions
to get columns only for this table)
process these columns
next

You never have all of them in memory. You just process each table in
turn....

--
Patrice

<[email protected]> a écrit dans le message de (e-mail address removed)...
Thank you very much all, I will reply to each one:

Patrice, Miha

I know well the filtering mechanism but it does not seem
to help. Because, for my purposes, I DO have to read ALL
the records.

What I wished is not to have to store all of them in memory.
Just process and dismiss.

Your suggestion would turn out to be "the" solution
if you could provide a filtering
mechanism that allows taking N rows at a time:

N and N and N and N .... and so on

Are you able to do that, for any DBMS and
without knowing a priori what is stored in the records -
(this condition is crucial, because I am working against
several kinds of DBMS. For instance SAP can be on SQL server,
Oracle, ...)

Cor,

Yes that one is fine, but it's another piece of information.
I am scanning all the schemaguids to get the complete
DB structure.

-tom



Miha Markic [MVP C#] ha scritto:
Wow, that's big.
Well, I still suggest you do the filtering and process a small number of
rows at a time.
As another solution, as Patrice already mentioned, execute the proper SQL
statements to query the database metadata and you will have full control.
Of course, it is database specific then.

--
Miha Markic [MVP C#, INETA Country Leader for Slovenia]
RightHand .NET consulting & development www.rthand.com
Blog: http://cs.rthand.com/blogs/blog_with_righthand/

Dear Miha , Patrice, ... and all

the advantage of having a mechanism similar to DataReader is obvious.

Assume I am getting only the very first field of the COLUMN schema
table.

If I connect to SAP and run the GetOleDbSchemaTable command I get a
DataTable with one field and over half a million rows.

I would like to get one value at a time to process it, instead of holding
over half a million values in memory at the same time, which I do not need.
Further, I usually need to take and process many more columns at the
same time, and the size of the DataTable becomes enormous.

Of course with small systems this is not a big problem. But try to
work
with real world databases...

-tom

Patrice ha scritto:

English is not my native language either. I should just have been more
explicit.

This is the second part then. Use a custom replacement for your DBMS with a
DataReader (note that a DataReader still uses a buffer).

What I wanted to convey with the first part is that the problem looks to me
like you get all the data at once. With restrictions (as suggested by
Miha) you can get only those you are interested in, using a limited
amount of memory. Though you may want to check that restrictions are
processed server side, it looks like quite a good solution.

--
Patrice

<[email protected]> a écrit dans le message de (e-mail address removed)...
Hi Patrice. Thanks... Hmmm,
it seems I really have a problem with my English :) Sorry!

It's clear that - once one has a DataTable - one can read a
row at a time.

What I was trying to explain is that I do not want to get the
whole DataTable, but just get one record at a time.

This would be the same difference between:

DataAdapter + Fill() [gets the whole table at a time]
DataReader + Read() [gets 1 record at a time]

I would like to know how to implement something
similar to the DataReader + Read() schema

for OleDbSchemaGuid tables. And I want it to work
for any OleDb connection (independently of the DBMS).

I hope my goal is clearer now.

Thank you!



Patrice ha scritto:

...And just loop through the rows using the DataTable.Rows collection
(it would be interesting to check whether restrictions are applied on the
result or are processed server side for your DB).

Else, if you really want a replacement, you'll have to use specific
statements for this (for example, for SQL Server you could use the
information_schema.columns view or the appropriate stored procedures).

--
Patrice

<[email protected]> a écrit dans le message de (e-mail address removed)...
Of course Miha, but I was talking about processing rows (not
fields).
It's clear that one
can select columns passing an argument to GetOleDbSchemaTable.

My question was another: I want a programmatic substitute of
GetOleDbSchemaTable
that allows me to consider 1 record at a time.

Thank you for your reply.

-Tom

Miha Markic [MVP C#] ha scritto:

You can pass an array of filter values as a last parameter.

--
Miha Markic [MVP C#, INETA Country Leader for Slovenia]
RightHand .NET consulting & development www.rthand.com
Blog: http://cs.rthand.com/blogs/blog_with_righthand/

Hi,

if we have an OleDbConnection it is possible to retrieve a
datatable
containing information about the database fields using the
statement
(watch out line breaking):

OleDbConnection.GetOleDbSchemaTable(System.Data.OleDb.OleDbSchemaGuid.Columns,
New Object() {})

This however returns a whole DataTable, which in some cases can be very
large. For instance, if you do it with SAP you receive over half a million
rows.

My question:

I would like to do the same operation programmatically and
not
using
the statement GetOleDbSchemaTable. Possibly taking 1 record
at
a
time,
similarly to the DataReader mechanism.

Can anyone suggest how to achieve that?

-Tom
 
Thank you Cor!! Just got it.

This opens up a whole new world!

'.....................
Public Function GetOleDbSchemaTable(ByVal schema As Guid, _
        ByVal restrictions As Object()) As DataTable
    Dim table1 As DataTable
    Dim ptr1 As IntPtr
    OleDbConnection.ExecutePermission.Demand()
    Bid.ScopeEnter(ptr1, "<oledb.OleDbConnection.GetOleDbSchemaTable|API> %d#, schema=%p{GUID}, restrictions" & ChrW(10), Me.ObjectID, schema)
    Try
        Me.CheckStateOpen("GetOleDbSchemaTable")
        Dim internal1 As OleDbConnectionInternal = Me.GetOpenConnection
        If (OleDbSchemaGuid.DbInfoLiterals = schema) Then
            If ((Not restrictions Is Nothing) AndAlso (restrictions.Length <> 0)) Then
                Throw ODB.InvalidRestrictionsDbInfoLiteral("restrictions")
            End If
            Return internal1.BuildInfoLiterals
        End If
        If (OleDbSchemaGuid.SchemaGuids = schema) Then
            If ((Not restrictions Is Nothing) AndAlso (restrictions.Length <> 0)) Then
                Throw ODB.InvalidRestrictionsSchemaGuids("restrictions")
            End If
            Return internal1.BuildSchemaGuids
        End If
        If (OleDbSchemaGuid.DbInfoKeywords = schema) Then
            If ((Not restrictions Is Nothing) AndAlso (restrictions.Length <> 0)) Then
                Throw ODB.InvalidRestrictionsDbInfoKeywords("restrictions")
            End If
            Return internal1.BuildInfoKeywords
        End If
        If Not internal1.SupportSchemaRowset(schema) Then
            Dim wrapper1 As IDBSchemaRowsetWrapper = internal1.IDBSchemaRowset
            Try
                If (wrapper1.Value Is Nothing) Then
                    Throw ODB.SchemaRowsetsNotSupported(Me.Provider)
                End If
            Finally
                wrapper1.Dispose()
            End Try
            Throw ODB.NotSupportedSchemaTable(schema, Me)
        End If
        Return internal1.GetSchemaRowset(schema, restrictions)
    Finally
        Bid.ScopeLeave(ptr1)
    End Try
    Return table1
End Function
'.....................

I am just wondering. Is this all legal??

On one thing I do not agree with you: I think this has all
to do with reflection! :)

Thanks a lot,

-tom

Cor Ligthert [MVP] wrote:
Reflector

http://www.aisto.com/roeder/dotnet/

It has nothing to do with reflection

Cor


<[email protected]> wrote in message
Thank you Patrice,

hmmm...

I find quite interesting the part where you talk about a "reflector". Actually I have often seen people on this group who were able to provide details of the inner workings of some procedures, and I have been wondering where they get that information. It sounds like a kind of tool to see the inner workings of the language commands.

Do you have any pointer about that? Is there a tool that can be
downloaded?

Thanks,

tom

Patrice wrote:
What would be the benefit of having "a row" (even the DataReader uses a buffer) as opposed to having all rows *for a single table* in memory at any one time?

Else use the second approach (using a db specific statements).

A third approach would be to use a reflector to see what calls .NET is doing under the hood and try to reproduce them using a datareader.

IMO it would be best to make sure first that you have some kind of real issue here before wasting too much time on "optimizing" this (have you actually *seen* a problem with scanning one table at a time?)

Good luck...

--
Patrice

<[email protected]> wrote in message (e-mail address removed)...
Yes Patrice,

I never said I am holding all the tables in memory. That would be suicide :) For now, I am scanning one table at a time.

Just wanted to figure out if I could get a row at a time and dismiss it. But probably it's not something immediate to do, otherwise the answer would probably already have popped up.

Probably by using some low level function, also used by GetOleDbSchemaTable, it would be possible to change its behaviour from the FILL() way to a kind of READ() way.

But I do not have information about that ... We need some OleDb guru
...

-tom

Patrice wrote:
Basically I would do something like :

for each table (using GetOleDbSchemaTable to get all tables)
    read columns for this table (using GetOleDbSchemaTable with restrictions to get columns only for this table)
    process these columns
next

You never have all of them in memory. You just process each table in
turn....
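Patrice's per-table loop can be sketched in VB.NET roughly as follows. This is a sketch only: connStr is a placeholder for your OLE DB connection string, and the restriction array layout assumed here is {catalog, schema, table, type} for OleDbSchemaGuid.Tables and {catalog, schema, table, column} for OleDbSchemaGuid.Columns.

```vbnet
Imports System.Data
Imports System.Data.OleDb

Module SchemaScan
    ' Walk the columns of every table, one table's worth at a time.
    Sub ScanAllColumns(ByVal connStr As String)
        Using cn As New OleDbConnection(connStr)
            cn.Open()
            ' 1) Get just the table list (a small result set).
            Dim tables As DataTable = cn.GetOleDbSchemaTable( _
                OleDbSchemaGuid.Tables, _
                New Object() {Nothing, Nothing, Nothing, "TABLE"})
            For Each tRow As DataRow In tables.Rows
                Dim tableName As String = CStr(tRow("TABLE_NAME"))
                ' 2) Restrict the Columns rowset to this one table, so
                '    only this table's rows are in memory at any time.
                Dim cols As DataTable = cn.GetOleDbSchemaTable( _
                    OleDbSchemaGuid.Columns, _
                    New Object() {Nothing, Nothing, tableName, Nothing})
                For Each cRow As DataRow In cols.Rows
                    ' ... process cRow("COLUMN_NAME"), cRow("DATA_TYPE"), ...
                Next
                cols.Dispose()
            Next
        End Using
    End Sub
End Module
```

Whether the restriction is evaluated server side depends on the provider, which is exactly the caveat Patrice raises below.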

--
Patrice

<[email protected]> wrote in message (e-mail address removed)...
Thank you very much all, I will reply to each one:

Patrice, Miha

I know the filtering mechanism well, but it does not seem to help, because, for my purposes, I DO have to read ALL the records.

What I wished is not to have to store all of them in memory.
Just process and dismiss.

Your suggestion would turn out to be "the" solution if you could provide a filtering mechanism which allows taking N rows at a time:

the first N, then the next N, and so on.

Are you able to do that, for any DBMS and without knowing a priori what is stored in the records? (This condition is crucial, because I am working against several kinds of DBMS. For instance SAP can be on SQL Server, Oracle, ...)

Cor,

Yes that one is fine, but it's another piece of information.
I am scanning all the schemaguids to get the complete
DB structure.

-tom



Miha Markic [MVP C#] wrote:

Wow, that's big.
Well, I still suggest you do filtering and process the small amount of rows.
As another solution, as Patrice already mentioned, execute the proper sql statements to query the database metadata and you will have full control. Of course, it is database specific then.

--
Miha Markic [MVP C#, INETA Country Leader for Slovenia]
RightHand .NET consulting & development www.rthand.com
Blog: http://cs.rthand.com/blogs/blog_with_righthand/

Dear Miha, Patrice, ... and all

the advantage of having a mechanism similar to DataReader is obvious.

Assume I am getting only the very first field of the COLUMNS schema table.

If I connect to SAP and run the GetOleDbSchemaTable command I get a datatable with 1 field and over half a million rows.

I would like to get 1 value at a time to process it, instead of holding over half a million values in memory at the same time, which I do not need. Further, I usually need to take and process many more columns at the same time, and the size of the datatable becomes enormous.

Of course with small systems this is not a big problem. But try to work with real world databases...

-tom

Patrice wrote:

English is not my native language either. I should just have been more explicit.

This is the second part then. Use a custom replacement for your DBMS with a DataReader (note that a DataReader still uses a buffer).

What I wanted to convey with the first part is that the problem, as it looks to me, is that you get all data at once. With restrictions (as suggested by Miha) you can get only the rows you are interested in, using a limited amount of memory. Though you may want to check that restrictions are processed server side, it looks like quite a good solution.

--
Patrice

<[email protected]> wrote in message (e-mail address removed)...
Hi Patrice. Thanks... Hmmm,
it seems I really have a problem with my English :) Sorry!

It's clear that - once one has a DataTable - one can read a row at a time.

What I was trying to explain is that I do not want to get the whole DataTable, but just get one record at a time.

This would be the same difference as between:

DataAdapter + Fill() [gets the whole table at once]
DataReader + Read() [gets 1 record at a time]

I would like to know how to implement something similar to the DataReader + Read() scheme for OleDbSchemaGuid tables. And I want it to work for any OleDb connection (independently of the DBMS).

I hope my goal is clearer now.
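The two patterns tom contrasts can be sketched against an ordinary query like this (connStr and the table name SomeTable are placeholders, not from the thread):

```vbnet
Imports System.Data
Imports System.Data.OleDb

Module FillVsRead
    Sub Demo(ByVal connStr As String)
        Using cn As New OleDbConnection(connStr)
            cn.Open()

            ' DataAdapter + Fill(): the whole result set lands in memory at once.
            Dim da As New OleDbDataAdapter("SELECT * FROM SomeTable", cn)
            Dim dt As New DataTable()
            da.Fill(dt)

            ' DataReader + Read(): one record buffered at a time.
            Dim cmd As New OleDbCommand("SELECT * FROM SomeTable", cn)
            Using rdr As OleDbDataReader = cmd.ExecuteReader()
                While rdr.Read()
                    ' ... process rdr(0), rdr(1), ... then move on
                End While
            End Using
        End Using
    End Sub
End Module
```

The catch is that GetOleDbSchemaTable only offers the first pattern; there is no reader-style equivalent in the public OleDb API, which is what the thread is circling around.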

Thank you!



Patrice wrote:

...And just loop through the rows using the DataTable.Rows collection (it would be interesting to check whether restrictions are applied on the result or are processed server side for your DB).

Else if you really want a replacement you'll have to use specific statements for this (for example for SQL Server you could use the information_schema.columns view or the appropriate stored procedures).
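For SQL Server specifically, Patrice's suggestion maps onto a plain DataReader over the INFORMATION_SCHEMA.COLUMNS view, which gives exactly the one-record-at-a-time behaviour tom is after. A minimal sketch (connStr is a placeholder; other DBMSs need their own catalog views or procedures):

```vbnet
Imports System.Data.OleDb

Module ColumnReader
    Sub ReadColumns(ByVal connStr As String)
        Using cn As New OleDbConnection(connStr)
            cn.Open()
            Dim cmd As New OleDbCommand( _
                "SELECT TABLE_NAME, COLUMN_NAME, DATA_TYPE " & _
                "FROM INFORMATION_SCHEMA.COLUMNS ORDER BY TABLE_NAME", cn)
            Using rdr As OleDbDataReader = cmd.ExecuteReader()
                ' One column definition in memory at a time.
                While rdr.Read()
                    Console.WriteLine("{0}.{1} : {2}", _
                        rdr.GetString(0), rdr.GetString(1), rdr.GetString(2))
                End While
            End Using
        End Using
    End Sub
End Module
```

The tradeoff, as noted upthread, is that this is database specific, while GetOleDbSchemaTable works for any OLE DB provider.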

--
Patrice

<[email protected]> wrote in message (e-mail address removed)...
Of course Miha, but I was talking about processing rows (not fields). It's clear that one can select columns passing an argument to GetOleDbSchemaTable.

My question was another: I want a programmatic substitute for GetOleDbSchemaTable that allows me to consider 1 record at a time.

Thank you for your reply.

-Tom

Miha Markic [MVP C#] wrote:

You can pass an array of filter values as a last parameter.

--
Miha Markic [MVP C#, INETA Country Leader for Slovenia]
RightHand .NET consulting & development www.rthand.com
Blog: http://cs.rthand.com/blogs/blog_with_righthand/

Hi,

if we have an OleDbConnection it is possible to retrieve a datatable containing information about the database fields using the statement (watch out line breaking):

OleDbConnection.GetOleDbSchemaTable(System.Data.OleDb.OleDbSchemaGuid.Columns,
New Object() {})

This however returns a whole DataTable, which in some cases can be very large. For instance if you do it with SAP you receive over half a million rows.

My question:

I would like to do the same operation programmatically and not using the statement GetOleDbSchemaTable, possibly taking 1 record at a time, similarly to the DataReader mechanism.

Can anyone suggest how to achieve that?

-Tom
 