slow program in a loop

  • Thread starter: chris

chris

I have a VBA program which opens a new workbook, extracts data from an
Access database into this workbook, performs some calculations, writes the
results to the workbook, then saves it and closes it.

This works fine, but I wanted to process several cases, so I put in a for loop
which goes through the program several times. The first time through the loop
takes, say, 3 mins, the next time 6 mins, the next time about 15 mins, and
so on. If I run each case singly, they take about 3 mins each.

Can anyone suggest what might be going on? Is there something that needs to
be closed or shut down at the end of each pass through the loop?

I am running Windows XP and Excel XP with 384 MB of RAM.

Thanks
Chris
 
3 minutes sounds pretty long as it is, unless you have a huge amount of data
coming in. It sounds like something is hanging around. Are you using a
disconnected recordset? Are you analysing a recordset, or dumping the data
into a workbook? If so, are you setting the recordset to Nothing before you
run your analysis, or at least making sure it is set to Nothing when the
analysis finishes, before you rerun? If you have it as a public variable in a
module it could hang around.
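
Something like this is the sort of clean-up I mean (just a sketch; the path,
table and sheet here are placeholders, not your actual ones):

' Sketch only - dump the records, then release everything before the next pass
Sub DumpAndRelease()
    Dim cn As Object, rs As Object
    Set cn = CreateObject("ADODB.Connection")
    cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\stocks.mdb"
    Set rs = cn.Execute("SELECT * FROM stock_data")

    ' dump the records onto the sheet in one go
    ActiveSheet.Range("A2").CopyFromRecordset rs

    ' close and release before the next iteration, so nothing hangs around
    rs.Close
    Set rs = Nothing
    cn.Close
    Set cn = Nothing
End Sub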

Robin Hammond
www.enhanceddatasystems.com
Check out our XspandXL add-in
 
I am getting the records from Access and then dumping them into a worksheet
using:

With ActiveSheet.QueryTables.Add(Connection:=Array(Array( _
    "ODBC;DSN=MS Access Database;DBQ=" & pathDbase & dbname & ";" _
    & "DefaultDir=" & pathDbase & ";DriverId=2"), _
    Array("5;FIL=MS Access;MaxBufferSize=2048;PageTimeout=5;")), _
    Destination:=Range("A1"))
    .CommandText = Array( _
        "SELECT stock_data.Stock_name, stock_data.[Date], stock_data.[Open], stock_data.[High], " _
        & "stock_data.Low, stock_data.[Close], stock_data.Volume" & Chr(13) & "" & Chr(10) & _
        "FROM `" & pathDbase & dbname & "`.stock_data stock_data" & Chr(13) & "" & Chr(10) & _
        "WHERE (stock_data.Stock_name='" & stockname & "') AND (`stock_data`.Date>= #" & startdate & " #)" & Chr(13) & "" & Chr(10) & _
        "ORDER BY stock_data.[Date]")
    .Name = "Query from MS Access Database_1"
    .FieldNames = True
    .RowNumbers = False
    .FillAdjacentFormulas = False
    .PreserveFormatting = True
    .RefreshOnFileOpen = False
    .BackgroundQuery = True
    .RefreshStyle = xlInsertDeleteCells
    .SavePassword = True
    .SaveData = True
    .AdjustColumnWidth = True
    .RefreshPeriod = 0
    .PreserveColumnInfo = True
    .Refresh BackgroundQuery:=False
End With

I have timed this and it works quite fast. Similarly, the code which processes
the extracted records (usually only about 80 variables in 5 columns) takes
only a few seconds.

As far as I can see, it is the creation of a new worksheet for each batch of
records, and then the saving of the workbook, that is taking all the time.
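
In case it helps pin it down further, this is roughly how I have been timing
the steps (nothing more sophisticated than Timer calls around each block; the
comment marks where the real step goes):

Sub TimeOneStep()
    Dim t As Double
    t = Timer
    ' ... run the query / create the sheet / save the workbook here ...
    Debug.Print "step took " & Format(Timer - t, "0.0") & " sec"
End Sub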

The code is too long to post, but essentially this is what happens:

for i = 1 to no_of_runs
    check if workbook exists for run i - if it doesn't exist, create one

    for j = 1 to no_of_cases
        create worksheet for case j
        get data from Access and put it in the case j worksheet
        process data and put results in the results worksheet
    next j

    save and close workbook
next i

It seems that as the number of cases increases, it takes longer and longer
to create a new worksheet - I guess because all the previous sheets are still
there.

I would like to keep all the data extracted from Access, but I am wondering
whether deleting each new sheet after I had processed the data (and keeping
only the results) would eliminate the problem.
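
Something along these lines is what I had in mind (just a sketch; the
worksheet argument is a placeholder for whatever the case sheet ends up being
called):

' Sketch only: delete a case sheet once its results have been copied off
Sub DeleteCaseSheet(wsCase As Worksheet)
    Application.DisplayAlerts = False   ' suppress the "permanently delete" prompt
    wsCase.Delete
    Application.DisplayAlerts = True
End Sub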

Chris
 
If it is doing what you say it is doing I don't see why it should take so
long. How many cases are you looping into one workbook?

It might be that Excel is spitting the dummy on too many QueryTables.
Perhaps you could copy the data range and paste it as values, although this
would mean losing your linkages if you wanted to refresh the data at a later
date. Presumably that is acceptable, given that you have the code to recreate
the whole thing anyway.
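
As a rough sketch of what I mean (the worksheet reference is a placeholder,
and it assumes one QueryTable per case sheet):

' Sketch only: convert the query output to plain values and remove the
' QueryTable so the queries don't accumulate from one case to the next
Sub FreezeQueryToValues(wsCase As Worksheet)
    Dim qt As QueryTable
    Set qt = wsCase.QueryTables(1)                 ' assumes one query per sheet
    qt.ResultRange.Value = qt.ResultRange.Value    ' keep the data, drop the link
    qt.Delete                                      ' removes the query, leaves the cells
End Sub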

The usual advice (Application.ScreenUpdating = False, Application.Calculation
= xlCalculationManual, cleaning up your temp folder, and so on) also applies.
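
In loop terms that just means bracketing the whole run, something like this
(sketch only):

' Sketch only: switch the overhead off once, run everything, then restore
Sub RunAllCases()
    Application.ScreenUpdating = False
    Application.Calculation = xlCalculationManual
    Application.DisplayAlerts = False

    ' ... your run/case loops go here ...

    Application.DisplayAlerts = True
    Application.Calculation = xlCalculationAutomatic
    Application.ScreenUpdating = True
End Sub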

Sorry I can't see much else wrong with it.

Robin Hammond
www.enhanceddatasystems.com
Check out our XspandXL add-in

