Guest
Hi,
I have a procedure that opens many text files for Input as file handles.
Basically, at the moment, it's opening 150 files one after the other (60
lines of data per file) and putting the data from each line into a new line
in an Access table.
I've created time bars, initially to show progress through files 1 to 150, and now
to show progress through lines 1 to 60 in each file, to give me an idea
of where the code is getting slow. Basically, it's pretty quick until it gets
to around 50 files and 3,000 lines in the Access table I'm creating with it.
That's nothing! Tiny!
But having stopped the code while it was running, at about 60 files, it seems
apparent to me that the problem is that the database's size has BALLOONED from
2 MB originally to 38 MB. If I compact it with the 3,000-line table in it, it's
only 2.24 MB, so I'm clearly generating massive amounts of extra data
somewhere, and that's what's slowing things down.
Does anyone know if there's a way to stop my code doing this? I thought that my
Close Filehandle
and
Close FreeFile()
statements would free up memory, but that's not happening.
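Stripped right down, the open/close pattern I'm using for each file is just this (everything else cut out):

    Filehandle = FreeFile()
    Open bFile For Input As #Filehandle
    '... Line Input #Filehandle and the table writes happen here ...
    Close Filehandle
    Close FreeFile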
A cut-down version of the code is below. (RoundUp is not a built-in Access
function; I have a separate function in the module that does that for me.)
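It's nothing fancy - something along these lines (a minimal sketch of the idea rather than the exact code I've got):

    Public Function RoundUp(ByVal dblValue As Double) As Long
        'round a positive value up to the next whole number
        '(Int rounds towards minus infinity, so negating twice rounds up)
        RoundUp = -Int(-dblValue)
    End Function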
Any help is greatly appreciated because I'm stuck and need this to work on
thousands of files, not just 50!
Cheers
Doug
'--------------------------------------------------------------------------
Do Until n = targetfl.RecordCount
    Forms!frmbenchmarking!timebar.Visible = True
    Forms!frmbenchmarking.timebar.Max = targetfl.RecordCount
    '
    'set the new file
    '
    bFile = strParent & strCust & "\RVZ\" & targetfl![filename] & ".RVZ"
    '
    'create the new filehandle
    '
    Filehandle = FreeFile()
    Open bFile For Input As #Filehandle
    Forms!frmbenchmarking!timebar2.Value = 0
    Forms!frmbenchmarking.Requery
    Do
        Line Input #Filehandle, strData
        'm is the number of lines down the file you are
        m = m + 1
        'add a new line to tblrpvs
        With tbl
            .AddNew
            .Update
        End With
        'go to that new line
        With tbl
            .MoveLast
            .Edit
            'populate the table fields with data from the rpv/byd
            Do
                ![cust] = strCust
                ![Site] = Left(targetfl![filename], varTrim)
                .Update
                .Edit
                ![rpv_line] = m
                ![rpv year] = RoundUp(m / 12)
                If Mid(strData, 26, 10) = " " Then
                    ![eact£] = 0
                Else
                    ![eact£] = Mid(strData, 26, 10)
                End If
                'update the line with the data we've imported
                .Update
                Forms!frmbenchmarking!timebar2.Value = m
                Forms!frmbenchmarking.Requery
            Loop Until ![cust] > 0
            'stop using the table
        End With
    Loop Until EOF(Filehandle)
    m = 0 'resets the line count once an entire file is imported
    'this bit moves it to the next record in the table with the file names in it.
    With targetfl
        .MoveNext
    End With
    On Error Resume Next
    bFile = strParent & "\" & strCust & "\RVZ\" & targetfl![filename] & ".RVZ"
    'as far as I know, these two statements should free up memory...
    Close Filehandle
    Close FreeFile
    n = n + 1
    Forms!frmbenchmarking!timebar.Value = n
    Forms!frmbenchmarking.Requery
Loop