I guess, like anything else, it depends. Is it faster to cursor through
the data in C++ or T-SQL?
I'm guessing C++. That's why I want to check it out with the whole lot
running in C++.
How expensive is each initialization of the XP?
Don't know. Probably about average... but the COM object has to be called
~78m times, going by the most recent count of the output rows. It currently
takes about 30 hours to do all of that with a VB program, though that includes
all the time it takes to send the data to SQL Server. So my guess is that
keeping one connection open for the whole run, in the same code that's calling
the COM object, would be best.
I just can't accept that initializing a stored procedure, or initializing and
opening a connection, or initializing pretty much *anything* 78 million times
can be a good thing. That's why I want to do it all within the C++ code of an
extended proc.
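To make it concrete, this is roughly the shape I have in mind. It's only a
sketch -- the interface, method and GUIDs are made-up stand-ins for the
vendor's object, which is a black box to us -- but it shows the point: COM is
initialized once, the object is created once, the connection is opened once,
and the only thing inside the 78-million-iteration loop is the call itself.

// Sketch only -- ICalcEngine, Calculate() and the GUIDs below are
// placeholders for the vendor's real (unknown) interface.  The point is
// that COM init, object creation and the DB connection all happen ONCE.
#include <windows.h>

struct ICalcEngine : public IUnknown
{
    virtual HRESULT STDMETHODCALLTYPE Calculate(double in, double *out) = 0;
};

// Placeholder GUIDs; the real ones would come from the vendor's type library.
static const CLSID CLSID_CalcEngine = { 0 };
static const IID   IID_ICalcEngine  = { 0 };

int main()
{
    CoInitialize(NULL);                            // once per thread

    ICalcEngine *pEngine = NULL;
    HRESULT hr = CoCreateInstance(CLSID_CalcEngine, NULL,
                                  CLSCTX_INPROC_SERVER,
                                  IID_ICalcEngine, (void**)&pEngine);
    if (FAILED(hr))
        return 1;

    // ... open ONE database connection here, and keep it open ...

    for (long i = 0; i < 78000000; ++i)            // ~78m output rows
    {
        double in  = 0.0;                          // next input value
        double out = 0.0;
        pEngine->Calculate(in, &out);              // the only work in the
                                                   // hot loop is this call
        // ... write 'out' back over the already-open connection ...
    }

    pEngine->Release();
    CoUninitialize();
    return 0;
}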
Will it be more expensive for the XP to have to make new database
connections to read/write, rather than just using the connection that
spawned the XP?
That's what I'm thinking, yes.
There is, IMO, no reason to believe that an XP would be that much
faster. Did you profile your COM objects to figure out where the
performance drain was?
The COM object isn't under our control. But it sits at the very, very centre
of all the loops; it is the absolute bottleneck. And before you ask: no, we
can't write it in SQL, because the company that wrote it has protected it with
a dongle, so I'd think they're unlikely to do anything that might be
tantamount to giving the algorithm away.
I apologize if I implied that you can pass a row as-is -- you can't.
You can only pass scalar datatypes. Reading data using an XP is the same as
reading it from any other code -- set up a connection to the DB and query
the data.
I'm afraid I insist. Can you please tell me how to read a table from within
an XP, even if only to satisfy my own curiosity about how fast it would
actually be?
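For what it's worth, this is roughly what I'd expect the inside of the XP to
look like if the answer really is "just connect and query": plain ODBC from
the C++ code, on a loopback connection to the local server. The driver string,
database and table names below are only guesses, and in a real XP this would
sit inside the exported xp_... entry point rather than a free function.

// Sketch only: ordinary ODBC from C++ code.  In an extended proc this would
// be called from the xp_... entry point; the connection string and the
// table/column names are placeholders, and a trusted connection is assumed.
#include <windows.h>
#include <sql.h>
#include <sqlext.h>

void ReadTableViaOdbc()
{
    SQLHENV  hEnv  = SQL_NULL_HENV;
    SQLHDBC  hDbc  = SQL_NULL_HDBC;
    SQLHSTMT hStmt = SQL_NULL_HSTMT;

    SQLAllocHandle(SQL_HANDLE_ENV, SQL_NULL_HANDLE, &hEnv);
    SQLSetEnvAttr(hEnv, SQL_ATTR_ODBC_VERSION, (SQLPOINTER)SQL_OV_ODBC3, 0);
    SQLAllocHandle(SQL_HANDLE_DBC, hEnv, &hDbc);

    // Loopback connection to the local server.
    SQLCHAR *connStr = (SQLCHAR*)
        "DRIVER={SQL Server};SERVER=(local);DATABASE=MyDb;Trusted_Connection=yes;";
    SQLRETURN rc = SQLDriverConnect(hDbc, NULL, connStr, SQL_NTS,
                                    NULL, 0, NULL, SQL_DRIVER_NOPROMPT);
    if (!SQL_SUCCEEDED(rc))
        return;

    SQLAllocHandle(SQL_HANDLE_STMT, hDbc, &hStmt);
    SQLExecDirect(hStmt, (SQLCHAR*)"SELECT InputValue FROM dbo.MyInputs",
                  SQL_NTS);

    double value = 0.0;
    SQLLEN  ind  = 0;
    SQLBindCol(hStmt, 1, SQL_C_DOUBLE, &value, 0, &ind);

    while (SQL_SUCCEEDED(SQLFetch(hStmt)))
    {
        // ... feed 'value' to the COM object here ...
    }

    SQLFreeHandle(SQL_HANDLE_STMT, hStmt);
    SQLDisconnect(hDbc);
    SQLFreeHandle(SQL_HANDLE_DBC, hDbc);
    SQLFreeHandle(SQL_HANDLE_ENV, hEnv);
}

If that's really all there is to it, it should be easy enough to time a fetch
loop like this against the equivalent T-SQL cursor and settle the question.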