Looping thru array structure is VERY slow

Young

I have a table with about 30 fields in it and approx. 110 records.

I created a DataTable to read all the records and defined a corresponding
structure (array) so I can store the data in the structure. E.g.

Structure MyStruct
F1
F2
....
F30
End Structure

Dim MyArrayStruct(x) as MyStruct

I then create the DataTable, loop through all the records, and assign F1 to
F1 on the struct. I do this for each row and assign it to the array "index".

This is extremely slow.

I am converting this from a VB6 app. The VB6 app is VERY quick.

Can someone please help?

TIA
Young
 
Young said:
I have a table with about 30 fields in it and approx. 110 records.

I created a DataTable to read all the records and defined a corresponding
structure (array) so I can store the data in the structure. E.g.

Structure MyStruct
F1
F2
...
F30
End Structure
Too big. Use a class instead. Structures have the nice property of being
immutable, but there's overhead associated with copying them around. Storing
them in arrays can result in some very big arrays.
I then create the DataTable, loop through all the records, and assign F1 to
F1 on the struct. I do this for each row and assign it to the array "index".
Why do you need to do this at all? The data table practically *is* an array
of items already. Rewriting the code that accepts the array to accept either
a DataTable directly (fast but not good for abstraction) or an
IEnumerable(Of MyClass) (harder to write but maximally flexible) would
eliminate the copying overhead completely. If you can use LINQ this is simple:

MyDataTable.AsEnumerable().Select(Function(Row) MakeMyClass(Row))

with

Function MakeMyClass(Row As DataRow) As MyClass
    Return New MyClass With {
        .F1 = Row("F1"),
        .F2 = Row("F2"), ...
    }
End Function

And if you need an array after that you can call .ToArray() on the result.
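To make the idea above concrete, here is a minimal, self-contained sketch. The class name MyRecord, the two columns, and their types are placeholders for illustration (and note that AsEnumerable() and Field(Of T) come from System.Data.DataSetExtensions, which needs to be referenced):

```vb
Imports System.Data
Imports System.Linq

Public Class MyRecord
    Public Property F1 As String
    Public Property F2 As Integer
End Class

Module Demo
    ' One small factory function per row, as described above.
    Function MakeMyRecord(Row As DataRow) As MyRecord
        Return New MyRecord With {
            .F1 = Row.Field(Of String)("F1"),
            .F2 = Row.Field(Of Integer)("F2")
        }
    End Function

    Sub Main()
        ' Stand-in for the real DataTable loaded from the database.
        Dim table As New DataTable()
        table.Columns.Add("F1", GetType(String))
        table.Columns.Add("F2", GetType(Integer))
        table.Rows.Add("abc", 1)

        ' Single pass over the rows; call ToArray only if an
        ' array is genuinely required downstream.
        Dim records = table.AsEnumerable().
            Select(Function(r) MakeMyRecord(r)).
            ToArray()
    End Sub
End Module
```

Field(Of T) also gives you typed access (and sane DBNull handling) instead of late-bound Object conversions, which matters if Option Strict is on.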
This is extremely slow.
Define "extremely slow". Also, you don't show your code. It's always
possible that it contains a monumental blunder. Almost no operation on ~100
items should be slow without something fishy going on. You could, for
example, be doing this conversion many times in a row when it only needs to
be done once.
 
Why don't you create specific readers for what you want:
sqlReaderToArray
sqlReaderToDataTable
etc.
This will bypass all the other classes involved.
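A rough sketch of the reader-to-array idea, skipping the DataTable entirely. The connection string, table name, columns, and the MyRecord class are all made up for illustration:

```vb
Imports System.Collections.Generic
Imports System.Data.SqlClient

Public Class MyRecord
    Public Property F1 As String
    Public Property F2 As Integer
End Class

Module ReaderDemo
    ' Reads rows straight from a SqlDataReader into an array,
    ' with no intermediate DataTable.
    Function SqlReaderToArray(connectionString As String) As MyRecord()
        Dim results As New List(Of MyRecord)
        Using conn As New SqlConnection(connectionString)
            conn.Open()
            Using cmd As New SqlCommand("SELECT F1, F2 FROM MyTable", conn)
                Using reader = cmd.ExecuteReader()
                    While reader.Read()
                        ' Ordinal access avoids a name lookup on every row.
                        results.Add(New MyRecord With {
                            .F1 = reader.GetString(0),
                            .F2 = reader.GetInt32(1)
                        })
                    End While
                End Using
            End Using
        End Using
        Return results.ToArray()
    End Function
End Module
```

For ~110 rows any of these approaches should be effectively instant, so this is more about cutting layers than raw speed.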

DaveP

Jeroen Mostert said:
Why do you need to do this at all? The data table practically *is* an array
of items already.
 