John B.
I am processing a great deal of information and need to make in excess
of 9,000 inserts per work item. The table is used to create an audit
trail of past work operations. Now to my question: I am trying to
maximize performance, so would it be better to call adapter.Update with
all 9,000 rows in the dataset, or would it be better to update every
1,000 or so? If it is better to break the updates into smaller
'chunks', how do you determine an ideal chunk size? (A sketch of what I
mean by chunking is below.)
Not sure if it matters, but the database is Oracle 9i and the table
structure is:
VARCHAR2(8)
VARCHAR2(20)
VARCHAR2(8)
VARCHAR2(32)
DATE
NUMBER(1)
VARCHAR2(80)
NUMBER(3)
VARCHAR2(10)
VARCHAR2(10)
DATE
CHAR(1)
VARCHAR2(32)
NUMBER(4)
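
Here is roughly what I mean by chunking, as a sketch only: the table
and column setup are placeholders, not my real audit table, and I am
letting OracleCommandBuilder generate the INSERT.

using System;
using System.Data;
using System.Data.OracleClient;

class ChunkedInsert
{
    // Push pending inserts in fixed-size chunks instead of one
    // adapter.Update call over all 9,000 rows.
    static void UpdateInChunks(DataTable table, OracleConnection conn, int chunkSize)
    {
        // "audit_trail" is a placeholder name; WHERE 1 = 0 just pulls schema.
        OracleDataAdapter adapter = new OracleDataAdapter(
            "SELECT * FROM audit_trail WHERE 1 = 0", conn);
        OracleCommandBuilder builder = new OracleCommandBuilder(adapter);

        // Grab only the newly added rows, then send them chunkSize at a time.
        DataRow[] pending = table.Select("", "", DataViewRowState.Added);
        for (int i = 0; i < pending.Length; i += chunkSize)
        {
            int count = Math.Min(chunkSize, pending.Length - i);
            DataRow[] chunk = new DataRow[count];
            Array.Copy(pending, i, chunk, 0, count);
            adapter.Update(chunk); // inserts this chunk; rows are accepted on success
        }
    }
}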