Hiya guys,
Earlier today I had a discussion (to avoid calling it an argument
^.^) with my friend about whether there's really one good way to
efficiently retrieve data from a database.
To sum it up, his opinion was that the only good way to retrieve data
is to set up your command object, open the connection, retrieve the
data, and close the connection right away.
So in code, it would look more or less like this...
********
class A :
combo1.Datasource = classB.GetCategories
combo2.Datasource = classB.GetTypes
combo3.Datasource = classB.GetLanguages
class B :
public function GetCategories as dataset
....create command with parameters
....open db connection
....execute
....close connection
end function
public function GetTypes as dataset
....create command with parameters
....open db connection
....execute
....close connection
end function
public function GetLanguages as dataset
....create command with parameters
....open db connection
....execute
....close connection
end function
*****
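Just so we're all picturing the same thing, here's roughly what one of
his methods would look like as real ADO.NET code. This is only a
minimal sketch: the connection string and the table/column names are
made up, and I'm assuming the SqlClient provider.
********
Imports System.Data
Imports System.Data.SqlClient

Public Class ClassB
    ' NOTE: the connection string, table and column names are made up for the example.
    Private Const ConnString As String = "Data Source=.;Initial Catalog=MyDb;Integrated Security=True"

    Public Function GetCategories() As DataSet
        Dim ds As New DataSet()
        ' Open, fill and close inside the method; disposing the connection
        ' hands it back to the pool rather than tearing it down.
        Using conn As New SqlConnection(ConnString)
            Using cmd As New SqlCommand("SELECT CategoryId, Name FROM Categories", conn)
                conn.Open()
                Dim da As New SqlDataAdapter(cmd)
                da.Fill(ds)
            End Using
        End Using
        Return ds
    End Function

    ' GetTypes and GetLanguages would follow the same open/fill/close pattern.
End Class
*****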
On the other hand, my opinion is that there isn't just one good way of
doing things. For instance, it might be better to leave the connection
open while you run all the database operations in sequence, and only
then close it.
So it would look more or less like this:
class A :
classB.OpenConn()
combo1.Datasource = classB.GetCategories
combo2.Datasource = classB.GetTypes
combo3.Datasource = classB.GetLanguages
classB.CloseConn()
class B :
public sub OpenConn
....open connection (global)
end sub
public sub CloseConn
....close connection (global)
end sub
public function GetCategories as dataset
....create command with parameters
....execute
end function
public function GetTypes as dataset
....create command with parameters
....execute
end function
public function GetLanguages as dataset
....create command with parameters
....execute
end function
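And my version would keep one connection around as a class member and
reuse it across the calls. Same disclaimer as above: just a sketch,
with a made-up connection string and table/column names.
********
Imports System.Data
Imports System.Data.SqlClient

Public Class ClassB
    ' NOTE: the connection string, table and column names are made up for the example.
    Private Const ConnString As String = "Data Source=.;Initial Catalog=MyDb;Integrated Security=True"
    Private _conn As SqlConnection

    Public Sub OpenConn()
        _conn = New SqlConnection(ConnString)
        _conn.Open()
    End Sub

    Public Sub CloseConn()
        If _conn IsNot Nothing Then
            _conn.Close()   ' with pooling on, this hands the connection back to the pool
            _conn = Nothing
        End If
    End Sub

    Public Function GetCategories() As DataSet
        Dim ds As New DataSet()
        ' Reuse the already-open member connection; no Open/Close here.
        Using cmd As New SqlCommand("SELECT CategoryId, Name FROM Categories", _conn)
            Dim da As New SqlDataAdapter(cmd)
            da.Fill(ds)
        End Using
        Return ds
    End Function

    ' GetTypes and GetLanguages would reuse _conn the same way.
End Class
*****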
His argument is that his way is what's taught on MSDN and in most
"expert" articles. He says this is how scalable systems are built,
because it lets connection pooling manage the available connections
efficiently.
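For reference, this is the pooling he's talking about: with the
SqlClient provider it's on by default, Close/Dispose just returns the
physical connection to the pool instead of destroying it, and the pool
can be tuned right in the connection string. Something like this (the
values are only examples):
********
' Pooling-related keywords in a SqlClient connection string (example values only)
Dim connString As String = _
    "Data Source=.;Initial Catalog=MyDb;Integrated Security=True;" & _
    "Pooling=true;Min Pool Size=1;Max Pool Size=100"
*****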
That's a very good point, I admit. But I find that this technique is
most appropriate when there are non-database operations in between the
various calls. It's also a one-size-fits-all type of architecture,
tried and true but unoptimized, like a generic CD-ROM driver versus
one written specifically for your drive.
What I was saying is that with that technique, even with pooling,
repeatedly opening and closing a connection object is overhead when
you know all your database operations run in sequence. Whatever little
time you spend creating and destroying the object is lost microseconds
if you know that right afterwards you'll be re-instantiating that very
same object; in that respect, it was pointless to destroy it.
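If we really wanted to settle the "lost microseconds" part, a
quick-and-dirty timing loop like this would show what repeated
Open/Close actually costs once the pool is warm. Just a sketch,
nothing scientific, and the connection string is made up:
********
Imports System.Data.SqlClient
Imports System.Diagnostics

Module TimingSketch
    Sub Main()
        ' Connection string is made up; point it at a real database to try this.
        Const connString As String = "Data Source=.;Initial Catalog=MyDb;Integrated Security=True"
        Dim sw As Stopwatch = Stopwatch.StartNew()
        For i As Integer = 1 To 1000
            ' Open/close in a loop; with pooling enabled this should mostly hit the pool.
            Using conn As New SqlConnection(connString)
                conn.Open()
            End Using
        Next
        sw.Stop()
        Console.WriteLine("1000 open/close cycles: " & sw.ElapsedMilliseconds & " ms")
    End Sub
End Module
*****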
What do you guys think: is there really only one way to do things? My
opinion is that we're both a little right and a little wrong at the
same time.
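For what it's worth, one way to get a bit of both (just an idea, not
claiming it's *the* answer) is to let each method accept an optional
already-open connection, so the caller decides which style to use.
Again a sketch with made-up names:
********
Imports System.Data
Imports System.Data.SqlClient

Public Class ClassB
    ' NOTE: the connection string, table and column names are made up for the example.
    Private Const ConnString As String = "Data Source=.;Initial Catalog=MyDb;Integrated Security=True"

    ' If the caller passes an open connection it gets reused;
    ' otherwise the method opens and closes its own.
    Public Function GetCategories(Optional ByVal conn As SqlConnection = Nothing) As DataSet
        Dim ds As New DataSet()
        Dim ownsConn As Boolean = (conn Is Nothing)
        If ownsConn Then
            conn = New SqlConnection(ConnString)
            conn.Open()
        End If
        Try
            Using cmd As New SqlCommand("SELECT CategoryId, Name FROM Categories", conn)
                Dim da As New SqlDataAdapter(cmd)
                da.Fill(ds)
            End Using
        Finally
            If ownsConn Then conn.Dispose()
        End Try
        Return ds
    End Function
End Class
*****
Then class A could call classB.GetCategories() on its own for a
one-off, or open a connection once and pass it into all three calls
when it's doing them back to back.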