instr isn't finding all the matches

teatree
i've a worksheet, and in column one, i've got a range that contains one
of the following:

TOTAL Employees
firstname, lastname
NAME
or a blank row

i'm using Instr to find the commas, and if it fails, i want it to
delete the entire row, so that i have a list of firstname, lastname
only.

when i run my code once

it deletes all occurrences of TOTAL Employees

i run it again, it removes the blank rows

again, and it cuts out all the NAME cells

can anyone help me get them all with one pass?

'begin code

Sub Clipped()
    fc = "A3"
    'column C is the only column with continuous data all
    'the way down...
    lr = Range("C3").End(xlDown).Row
    lc = "A" & lr
    Range(fc, lc).Select

    For Each cll In Selection
        iscomma = InStr(cll.Value, ",")

        If iscomma = 0 Then
            Range(cll.Address).EntireRow.Delete
        End If

    Next cll
End Sub

'end code

what i've tried:

a Case statement in lieu of If...Then, Case-ing on "0", Null and Case
Else

subbing "," with Chr(44)

Having instr look for " ,"

always the same result

i'm using Excel 97 on Win 98.

strangely, i modified this only slightly to remove middle initials
which were stuck on the end of the lastname:

Public, John Q.

run this same loop on "."
and it grabs them all, every time.

but then i'm keeping/modifying the text, not deleting the row
 
When deleting, work from the bottom up.

Try this alternative

Sub Clipped()
    Dim i As Long
    Dim IsComma As Long

    For i = Cells(Rows.Count, "C").End(xlUp).Row To 3 Step -1
        IsComma = InStr(Cells(i, "A").Value, ",")
        If IsComma = 0 Then
            Cells(i, "A").EntireRow.Delete
        End If
    Next i
End Sub


--

HTH

Bob Phillips
... looking out across Poole Harbour to the Purbecks
(remove nothere from the email address if mailing direct)
 
Sub Clipped()
    Dim lr As Long, i As Long
    lr = Range("C3").End(xlDown).Row
    For i = lr To 3 Step -1
        If InStr(Cells(i, 1).Value, ",") = 0 Then
            Cells(i, 1).EntireRow.Delete
        End If
    Next i
End Sub
 
As Tom's post does, a loop will be functional if you loop backwards or reset the range.
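As a sketch of that reset-the-range variant (looping forward, but rebuilding the range from scratch after every delete so the row shift can't skip entries; the sub name is mine, not from any post above):

```vba
Sub ClippedResetRange()
    Dim cll As Range
    Dim deleted As Boolean
    Do
        deleted = False
        ' Rebuild the range on every pass, so it reflects the sheet
        ' as it stands after the previous deletion
        For Each cll In Range("A3", "A" & Range("C3").End(xlDown).Row)
            If InStr(cll.Value, ",") = 0 Then
                cll.EntireRow.Delete
                deleted = True
                Exit For    ' the range is now stale; restart the scan
            End If
        Next cll
    Loop While deleted
End Sub
```

Much slower than the bottom-up loop, since each delete restarts the scan, but it shows why a single forward pass over a fixed Selection misses rows.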

But, how about an AutoFilter?

Sub Clear_Em()
Application.ScreenUpdating = False
With Sheets(2)
    Range(.[c3], .[c65536].End(3)).AutoFilter _
        Field:=1, Criteria1:="*,*"
    On Error Resume Next
    .[c4:c65536].SpecialCells(xlVisible) _
        .EntireRow.Delete
    .AutoFilterMode = False
    If CBool(InStrB(1, .[c3].Value, ",")) Then _
        .[c3].EntireRow.Delete
End With
Application.ScreenUpdating = False
End Sub

Takes care of it with 2 deletes. Change Sheets(2) to your target sheet.

Regards
Nate Oliver
 
Sorry,

I just noticed I had the check backwards; the following won't do the opposite of your goal set:

Sub Clear_Em2()
Application.ScreenUpdating = False
With Sheets(2)
    Range(.[c3], .[c65536].End(3)).AutoFilter _
        Field:=1, Criteria1:="<>*,*"
    On Error Resume Next
    .[c4:c65536].SpecialCells(xlVisible) _
        .EntireRow.Delete
    .AutoFilterMode = False
    If Not CBool(InStrB(1, .[c3].Value, ",")) Then _
        .[c3].EntireRow.Delete
End With
Application.ScreenUpdating = False
End Sub

Regards
Nate Oliver
 
i generally use autofilter for larger ranges, and a bubble sort for
smaller ones, but i hadn't thought of using specialcells(xlvisible) to
delete the filtered results. that's pretty slick.

i don't tend to select all cells to row ~65000, because my version of
excel (97) won't reset the Used Range very easily, and on some of my
more extensive workbooks it causes crashes.
 
And fix the obvious typo:

Application.ScreenUpdating = False
End Sub

to

Application.ScreenUpdating = True
End Sub

Apologies, I'm off to get some coffee.

Regards, Nate
 
Hello again, a few tack-on thoughts

Agreed, the Special Cells method is a good one

Sheets(2).UsedRange

Should reset the UsedRange, and since you're running VBA at this point anyway, it seems like an easy additional step.

You're actually not selecting anything here. And, if you preferred, you could implement the End method again versus reviewing the entire column; you'd want to bake in a conditional test where Not [c65536].End(xlUp).Row < 4.

Regards
Nate Oliver
 
Note that using notation like [C3] is 14 times slower than Range("C3").

Don't get in the habit.
 
I came up with ~30% time gain with the Range Object over the Evaluate Method (.000116 seconds) [versus ~1,400%]. Point noted however and note, it should vary with the language; I can't necessarily advocate the Method's use here, even though it probably isn't going to break the camel's back in a routine. Also noteworthy at this point, I was able to get a similar pick up with the Cells Property over the Range Object @ ~23%. All variance times noted on 30-run averages per sub.

Public Declare Function QueryPerformanceFrequency _
Lib "kernel32.dll" ( _
lpFrequency As Currency) As Long

Public Declare Function QueryPerformanceCounter _
Lib "kernel32.dll" ( _
lpPerformanceCount As Currency) As Long

Sub EvalMethod()
Dim n As Currency, str As Currency, fin As Currency
Dim y As Currency, f As Variant
QueryPerformanceFrequency y
QueryPerformanceCounter str

f = [a1].Value

QueryPerformanceCounter fin
n = (fin - str)
Debug.Print Format(n, "##########.############") / y
End Sub

Sub RangeObj()
Dim n As Currency, str As Currency, fin As Currency
Dim y As Currency, f As Variant
QueryPerformanceFrequency y
QueryPerformanceCounter str

f = Range("a1").Value

QueryPerformanceCounter fin
n = (fin - str)
Debug.Print Format(n, "##########.############") / y
End Sub

Sub CellsProp()
Dim n As Currency, str As Currency, fin As Currency
Dim y As Currency, f As Variant
QueryPerformanceFrequency y
QueryPerformanceCounter str

f = Cells(1, 1).Value

QueryPerformanceCounter fin
n = (fin - str)
Debug.Print Format(n, "##########.############") / y
End Sub

Note, this does not account for the double-digit microsecond overhead on the 2nd QueryPerformance call.

Regards
Nate Oliver
 
do a double transpose in Excel (not VBA) to get a single dimension
horizontal array, when you could just use the original 2D array or do a
straight assignment

Sheets(1).Range("A2:IT2").Value = Sheets(2).Range("C6:IV6").Value

varr = Sheets(2).Range("C6:IV6").Value
Sheets(2).Range("A2").Resize(,Ubound(varr)).Value = varr

Why?

Anyway, I used Microsoft's code for timing and got the following

( 101959058.8107 - 101959076.2179 - 0.0006 ) / 119.318
10000 assignments took 0.145884108013879 seconds ' Evaluate
( 101960815.814 - 101960822.4361 - 0.0006 ) / 119.318
10000 assignments took 5.54945607536164E-02 seconds ' Range

'Ratio
? 0.145884108013879/5.54945607536164E-02
2.62880012081855


Private Sub Time_Addition()
    Dim Ctr1 As Currency, Ctr2 As Currency, Freq As Currency
    Dim Overhead As Currency, A As Long, I As Long
    QueryPerformanceFrequency Freq
    QueryPerformanceCounter Ctr1
    QueryPerformanceCounter Ctr2
    Overhead = Ctr2 - Ctr1 ' determine API overhead
    QueryPerformanceCounter Ctr1 ' time loop
    For I = 1 To 10000
        A = Range("A1").Value
        'A = [A1].Value
        'A = Cells(1,1).Value
    Next I
    QueryPerformanceCounter Ctr2
    Debug.Print "("; Ctr1; "-"; Ctr2; "-"; Overhead; ") /"; Freq
    Debug.Print "10000 assignments took";
    Debug.Print (Ctr2 - Ctr1 - Overhead) / Freq; "seconds"
End Sub

http://support.microsoft.com/default.aspx?scid=kb;en-us;172338
HOWTO: Use QueryPerformanceCounter to Time Code


for cells I got:
( 101992290.0282 - 101992297.9884 - 0.0005 ) / 119.318
10000 assignments took 6.67099683199517E-02 seconds

slightly slower.

--
Regards,
Tom Ogilvy




Nate Oliver said:
I came up with ~30% time gain with the Range Object over the Evaluate
Method (.000116 seconds) [versus ~1,400%]. Point noted however and note, it
should vary with the language, I can't necessarily advocate the Method's use
here, even though it probably isn't going to break the camel's back in a
routine. Also noteworthy at this point, I was able to get a similar pick up
with the Cells Property over the Range Object @ ~23%. All variance times
noted on 30-run averages per sub.
Public Declare Function QueryPerformanceFrequency _
Lib "kernel32.dll" ( _
lpFrequency As Currency) As Long

Public Declare Function QueryPerformanceCounter _
Lib "kernel32.dll" ( _
lpPerformanceCount As Currency) As Long

Sub EvalMethod()
Dim n As Currency, str As Currency, fin As Currency
Dim y As Currency, f As Variant
QueryPerformanceFrequency y
QueryPerformanceCounter str

f = [a1].Value

QueryPerformanceCounter fin
n = (fin - str)
Debug.Print Format(n, "##########.############") / y
End Sub

Sub RangeObj()
Dim n As Currency, str As Currency, fin As Currency
Dim y As Currency, f As Variant
QueryPerformanceFrequency y
QueryPerformanceCounter str

f = Range("a1").Value

QueryPerformanceCounter fin
n = (fin - str)
Debug.Print Format(n, "##########.############") / y
End Sub

Sub CellsProp()
Dim n As Currency, str As Currency, fin As Currency
Dim y As Currency, f As Variant
QueryPerformanceFrequency y
QueryPerformanceCounter str

f = Cells(1, 1).Value

QueryPerformanceCounter fin
n = (fin - str)
Debug.Print Format(n, "##########.############") / y
End Sub

Note, this does not account for the double-digit microsecond overhead on the 2nd QueryPerformance call.

Regards,
Nate Oliver
 
Tom,

Can you share where you got this timing info. I was particularly interested
to read your comments on the [C3] type syntax. I personally find it ugly and
inelegant in code, but I recall another poster frequently posts an
alternative based upon that style. To hear it is slower is very interesting.

Bob

Tom Ogilvy said:
do a double transpose in Excel (not VBA) to get a single dimension
horizontal array, when you could just use the original 2D array or do a
straight assignment

Sheets(1).Range("A2:IT2").Value = Sheets(2).Range("C6:IV6").Value

varr = Sheets(2).Range("C6:IV6").Value
Sheets(2).Range("A2").Resize(,Ubound(varr)).Value = varr

Why?

Anyway, I used Microsoft's code for timing and got the following

( 101959058.8107 - 101959076.2179 - 0.0006 ) / 119.318
10000 assignments took 0.145884108013879 seconds ' Evaluate
( 101960815.814 - 101960822.4361 - 0.0006 ) / 119.318
10000 assignments took 5.54945607536164E-02 seconds ' Range

'Ratio
? 0.145884108013879/5.54945607536164E-02
2.62880012081855


Private Sub Time_Addition()
Dim Ctr1 As Currency, Ctr2 As Currency, Freq As Currency
Dim Overhead As Currency, A As Long, I As Long
QueryPerformanceFrequency Freq
QueryPerformanceCounter Ctr1
QueryPerformanceCounter Ctr2
Overhead = Ctr2 - Ctr1 ' determine API overhead
QueryPerformanceCounter Ctr1 ' time loop
For I = 1 To 10000
A = Range("A1").Value
'A = [A1].Value
'A = Cells(1,1).Value
Next I
QueryPerformanceCounter Ctr2
Debug.Print "("; Ctr1; "-"; Ctr2; "-"; Overhead; ") /"; Freq
Debug.Print "10000 assignments took";
Debug.Print (Ctr2 - Ctr1 - Overhead) / Freq; "seconds"
End Sub

http://support.microsoft.com/default.aspx?scid=kb;en-us;172338
HOWTO: Use QueryPerformanceCounter to Time Code


for cells I got:
( 101992290.0282 - 101992297.9884 - 0.0005 ) / 119.318
10000 assignments took 6.67099683199517E-02 seconds

slightly slower.

--
Regards,
Tom Ogilvy




Nate Oliver said:
I came up with ~30% time gain with the Range Object over the Evaluate
Method (.000116 seconds) [versus ~1,400%]. Point noted however and note, it
should vary with the language, I can't necessarily advocate the Method's use
here, even though it probably isn't going to break the camel's back in a
routine. Also noteworthy at this point, I was able to get a similar pick up
with the Cells Property over the Range Object @ ~23%. All variance times
noted on 30-run averages per sub.
Public Declare Function QueryPerformanceFrequency _
Lib "kernel32.dll" ( _
lpFrequency As Currency) As Long

Public Declare Function QueryPerformanceCounter _
Lib "kernel32.dll" ( _
lpPerformanceCount As Currency) As Long

Sub EvalMethod()
Dim n As Currency, str As Currency, fin As Currency
Dim y As Currency, f As Variant
QueryPerformanceFrequency y
QueryPerformanceCounter str

f = [a1].Value

QueryPerformanceCounter fin
n = (fin - str)
Debug.Print Format(n, "##########.############") / y
End Sub

Sub RangeObj()
Dim n As Currency, str As Currency, fin As Currency
Dim y As Currency, f As Variant
QueryPerformanceFrequency y
QueryPerformanceCounter str

f = Range("a1").Value

QueryPerformanceCounter fin
n = (fin - str)
Debug.Print Format(n, "##########.############") / y
End Sub

Sub CellsProp()
Dim n As Currency, str As Currency, fin As Currency
Dim y As Currency, f As Variant
QueryPerformanceFrequency y
QueryPerformanceCounter str

f = Cells(1, 1).Value

QueryPerformanceCounter fin
n = (fin - str)
Debug.Print Format(n, "##########.############") / y
End Sub

Note, this does not account for the double-digit microsecond overhead on the 2nd QueryPerformance call.

Regards,
Nate Oliver
 
I put my information in the post you are responding to - it showed 2.6 times
slower for [C3]. I got the original figure a while back using a less robust
timer - but I misstated - it wasn't 14 times, but 2 to 3 times, consistent
with the above. I couldn't find my original post where I did that - sorry.
For this timing, the code and the source of the code are in the msg you
responded to. It just makes sense that having to interface into Excel to do
an evaluate and resolve the string to a range reference would be slower than
furnishing a range reference in VBA. Not to knock evaluate or the shorter
square brackets - there are plenty of uses for them, but seems using them
for a range reference is abusive <g>.

Tushar did some comparison of Dir, Evaluate, InstrRev and a looping approach
to get a file name stripped from a fully qualified path. Not directly
related, but interesting
http://tinyurl.com/32adx


--
Regards,
Tom Ogilvy



Bob Phillips said:
Tom,

Can you share where you got this timing info. I was particularly interested
to read your comments on the [C3] type syntax. I personally find it ugly and
inelegant in code, but I recall another poster frequently posts an
alternative based upon that style. To hear it is slower is very interesting.

Bob

Tom Ogilvy said:
do a double transpose in Excel (not VBA) to get a single dimension
horizontal array, when you could just use the original 2D array or do a
straight assignment

Sheets(1).Range("A2:IT2").Value = Sheets(2).Range("C6:IV6").Value

varr = Sheets(2).Range("C6:IV6").Value
Sheets(2).Range("A2").Resize(,Ubound(varr)).Value = varr

Why?

Anyway, I used Microsoft's code for timing and got the following

( 101959058.8107 - 101959076.2179 - 0.0006 ) / 119.318
10000 assignments took 0.145884108013879 seconds ' Evaluate
( 101960815.814 - 101960822.4361 - 0.0006 ) / 119.318
10000 assignments took 5.54945607536164E-02 seconds ' Range

'Ratio
? 0.145884108013879/5.54945607536164E-02
2.62880012081855


Private Sub Time_Addition()
Dim Ctr1 As Currency, Ctr2 As Currency, Freq As Currency
Dim Overhead As Currency, A As Long, I As Long
QueryPerformanceFrequency Freq
QueryPerformanceCounter Ctr1
QueryPerformanceCounter Ctr2
Overhead = Ctr2 - Ctr1 ' determine API overhead
QueryPerformanceCounter Ctr1 ' time loop
For I = 1 To 10000
A = Range("A1").Value
'A = [A1].Value
'A = Cells(1,1).Value
Next I
QueryPerformanceCounter Ctr2
Debug.Print "("; Ctr1; "-"; Ctr2; "-"; Overhead; ") /"; Freq
Debug.Print "10000 assignments took";
Debug.Print (Ctr2 - Ctr1 - Overhead) / Freq; "seconds"
End Sub

http://support.microsoft.com/default.aspx?scid=kb;en-us;172338
HOWTO: Use QueryPerformanceCounter to Time Code


for cells I got:
( 101992290.0282 - 101992297.9884 - 0.0005 ) / 119.318
10000 assignments took 6.67099683199517E-02 seconds

slightly slower.

--
Regards,
Tom Ogilvy




Nate Oliver said:
I came up with ~30% time gain with the Range Object over the Evaluate
Method (.000116 seconds) [versus ~1,400%]. Point noted however and note, it
should vary with the language, I can't necessarily advocate the Method's use
here, even though it probably isn't going to break the camel's back in a
routine. Also noteworthy at this point, I was able to get a similar pick up
with the Cells Property over the Range Object @ ~23%. All variance times
noted on 30-run averages per sub.
Public Declare Function QueryPerformanceFrequency _
Lib "kernel32.dll" ( _
lpFrequency As Currency) As Long

Public Declare Function QueryPerformanceCounter _
Lib "kernel32.dll" ( _
lpPerformanceCount As Currency) As Long

Sub EvalMethod()
Dim n As Currency, str As Currency, fin As Currency
Dim y As Currency, f As Variant
QueryPerformanceFrequency y
QueryPerformanceCounter str

f = [a1].Value

QueryPerformanceCounter fin
n = (fin - str)
Debug.Print Format(n, "##########.############") / y
End Sub

Sub RangeObj()
Dim n As Currency, str As Currency, fin As Currency
Dim y As Currency, f As Variant
QueryPerformanceFrequency y
QueryPerformanceCounter str

f = Range("a1").Value

QueryPerformanceCounter fin
n = (fin - str)
Debug.Print Format(n, "##########.############") / y
End Sub

Sub CellsProp()
Dim n As Currency, str As Currency, fin As Currency
Dim y As Currency, f As Variant
QueryPerformanceFrequency y
QueryPerformanceCounter str

f = Cells(1, 1).Value

QueryPerformanceCounter fin
n = (fin - str)
Debug.Print Format(n, "##########.############") / y
End Sub

Note, this does not account for the double-digit microsecond overhead on the 2nd QueryPerformance call.

Regards,
Nate Oliver
 
Thanks Tom.

I saw the bit in the msg, but when you referred to '... I used Microsoft's
code for timing ...' it was that I was asking about.

2-3 times is not quite so impressive as 14 times, but still a better
justification than my 'I don't like it'.

Thanks again

Bob
 
Good Day,
do a double transpose in Excel (not VBA) to get a single dimension
horizontal array, when you could just use the original 2D array or do a
straight assignment
Sheets(1).Range("A2:IT2").Value = Sheets(2).Range("C6:IV6").Value

varr = Sheets(2).Range("C6:IV6").Value
Sheets(2).Range("A2").Resize(,Ubound(varr)).Value = varr

Probably should discuss this in the thread in question eh,
but, why not? The madness to the method was for the sake of
supplying functional vba. Obviously I'm using Excel
Object's native Transpose Function, as I'm trying to stay
away from a loop. Here's the good and bad
Sheets(1).Range("A2:IT2").Value =
Sheets(2).Range("C6:IV6").Value

Is functional on my PC, and yes, much more efficient than
flipping an array twice. Each flip could be costing you
100% lead-time... But it's not dynamic as such, which may
or may not be fine, so be it. Re:

Dim myArr2 As Variant
myArr2 = Sheets(2).Range("C6:IV6").Value
Sheets(1).Range("A2").Resize(, UBound(myArr2)).Value = myArr2

I would love to pursue this, but it doesn't work with my
software (Excel '00 SP2). I simply get the first element
returned to A2 in my destination sheet. Flip it a few
times, and you're in like Flynn. A performance hit, but it
actually does what it's intended to do.
Anyway, I used Microsoft's code for timing and got
the following

Note, the only material difference in MS' code and the code
I provided is the accounting for overhead, ~19
microseconds. What is a bit of an issue here is that we're
talking apples to oranges in a certain respect. Using MS'
timing procedure, we'll add a couple of api calls to the
kernel in a module:

Declare Function QueryPerformanceCounter _
Lib "kernel32" ( _
X As Currency) As Boolean

Declare Function QueryPerformanceFrequency _
Lib "kernel32" ( _
X As Currency) As Boolean

Now we'll look at single assignments with:

Private Sub Time_Addition1()
    Dim Ctr1 As Currency, Ctr2 As Currency, Freq As Currency
    Dim Overhead As Currency, A As Long
    QueryPerformanceFrequency Freq
    QueryPerformanceCounter Ctr1
    QueryPerformanceCounter Ctr2
    Overhead = Ctr2 - Ctr1 ' determine API overhead
    QueryPerformanceCounter Ctr1 ' time loop

    A = Range("A1").Value
    Rem A = [A1].Value
    Rem A = Cells(1, 1).Value

    QueryPerformanceCounter Ctr2
    Debug.Print (Ctr2 - Ctr1 - Overhead) / Freq
End Sub

I run this 53 times for each, and here are the averages I'm
looking at:

Range Object: 9.28E-05 Seconds
Evaluate Method: 1.31E-04 Seconds
Cells Property: 7.45E-05 Seconds

So, with the single assignment, I get a 41.5%
performance-hit variance with the Evaluate Method relative
to the Range Object (a difference of .000039 seconds), and
I get a 24.6% performance-hit variance with the Range
Object relative to the Cells Property.

Now we loop 20,000 times:

Private Sub Time_Addition2()
    Dim Ctr1 As Currency, Ctr2 As Currency, Freq As Currency
    Dim Overhead As Currency, A As Long, I As Long
    QueryPerformanceFrequency Freq
    QueryPerformanceCounter Ctr1
    QueryPerformanceCounter Ctr2
    Overhead = Ctr2 - Ctr1 ' determine API overhead
    QueryPerformanceCounter Ctr1 ' time loop

    For I = 1 To 20000
        Rem A = Range("A1").Value
        Rem A = [A1].Value
        Rem A = Cells(1, 1).Value
    Next

    QueryPerformanceCounter Ctr2
    Debug.Print (Ctr2 - Ctr1 - Overhead) / Freq
End Sub

I run this 35 times for each, and here are the averages I'm
looking at:

Range Object: ~.289 Seconds
Evaluate Method: ~.539 Seconds
Cells Property: ~.276 Seconds

I see a 186.8% performance-hit variance with the Evaluate
Method relative to the Range Object (a difference of .25
seconds), and I get a 4.4% performance-hit variance with
the Range Object relative to the Cells Property.

So, it appears the performance of various approaches aren't
moving linearly together, and the Evaluate Method's
performance decays to a greater extent with application
relative to the Range Object. As 41.5% and 186.8% don't
strike me as small variances, I would have to say, the
short hand isn't worth it...

As for the Evaluate Method being ugly with respect to Range
References, I'm under the opposite impression; I think it
actually looks better than longer reference approaches. I
tend to like shorter code if the performance holds up. And
yes, based on these numbers, it is abusive with VBA, while
being less abusive on my keyboard. It seems one's loss is
another's gain. <g>

Regards,
Nate Oliver
 
There was a typo on the Array approach.

These were fine in Excel 97 and later:

Sub Tester1()
    Dim myArr2 As Variant
    myArr2 = Sheets(2).Range("C6:IV6").Value
    Sheets(1).Range("A2").Resize(, UBound(myArr2, 2)).Value = myArr2

    cols = 254
    Sheets(1).Range("A4").Resize(1, cols).Value = _
        Sheets(2).Range("C6").Resize(1, cols).Value

End Sub

re: >But it's not dynamic as such

as written no it wasn't, but it can be as dynamic as any other method.
Limited only by the imagination.
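One way to make it dynamic (my own sketch, not from Tom's post) is to measure the source row at run time and size the destination to match:

```vba
Sub DynamicStraightAssign()
    Dim src As Range
    ' Find the last used column on row 6 of the source sheet
    With Sheets(2)
        Set src = .Range("C6", .Cells(6, .Columns.Count).End(xlToLeft))
    End With
    ' Resize the destination to the source and assign values directly,
    ' no intermediate array and no Transpose
    Sheets(1).Range("A2").Resize(src.Rows.Count, src.Columns.Count).Value = src.Value
End Sub
```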

--
Regards,
Tom Ogilvy



Nate Oliver said:
Good Day,
do a double transpose in Excel (not VBA) to get a single dimension
horizontal array, when you could just use the original 2D array or do a
straight assignment
Sheets(1).Range("A2:IT2").Value = Sheets(2).Range("C6:IV6").Value

varr = Sheets(2).Range("C6:IV6").Value
Sheets(2).Range("A2").Resize(,Ubound(varr)).Value = varr

Probably should discuss this in the thread in question eh,
but, why not? The madness to the method was for the sake of
supplying functional vba. Obviously I'm using Excel
Object's native Transpose Function, as I'm trying to stay
away from a loop. Here's the good and bad
Sheets(1).Range("A2:IT2").Value =
Sheets(2).Range("C6:IV6").Value

Is functional on my PC, and yes, much more efficient than
flipping an array twice. Each flip could be costing you
100% lead-time... But it's not dynamic as such, which may
or may not be fine, so be it. Re:

Dim myArr2 As Variant
myArr2 = Sheets(2).Range("C6:IV6").Value
Sheets(1).Range("A2").Resize(, UBound(myArr2)).Value = myArr2

I would love to pursue this, but it doesn't work with my
software (Excel '00 SP2). I simply get the first element
returned to A2 in my destination sheet. Flip it a few
times, and you're in like Flynn. A performance hit, but it
actually does what it's intended to do.
Anyway, I used Microsoft's code for timing and got
the following

Note, the only material difference in MS' code and the code
I provided is the accounting for overhead, ~19
microseconds. What is a bit of an issue here is that we're
talking apples to oranges in a certain respect. Using MS'
timing procedure, we'll add a couple of api calls to the
kernel in a module:

Declare Function QueryPerformanceCounter _
Lib "kernel32" ( _
X As Currency) As Boolean

Declare Function QueryPerformanceFrequency _
Lib "kernel32" ( _
X As Currency) As Boolean

Now we'll look at single assignments with:

Private Sub Time_Addition1()
Dim Ctr1 As Currency, Ctr2 As Currency, Freq As Currency
Dim Overhead As Currency, A As Long
QueryPerformanceFrequency Freq
QueryPerformanceCounter Ctr1
QueryPerformanceCounter Ctr2
Overhead = Ctr2 - Ctr1 ' determine API overhead
QueryPerformanceCounter Ctr1 ' time loop

A = Range("A1").Value
Rem A = [A1].Value
Rem A = Cells(1, 1).Value

QueryPerformanceCounter Ctr2
Debug.Print (Ctr2 - Ctr1 - Overhead) / Freq
End Sub

I run this 53 times for each, and here are the averages I'm
looking at:

Range Object: 9.28E-05 Seconds
Evaluate Method: 1.31E-04 Seconds
Cells Property: 7.45E-05 Seconds

So, with the single assignment, I get a 41.5%
performance-hit variance with the Evaluate Method relative
to the Range Object (a difference of .000039 seconds), and
I get a 24.6% performance-hit variance with the Range
Object relative to the Cells Property.

Now we loop 20,000 times:

Private Sub Time_Addition2()
Dim Ctr1 As Currency, Ctr2 As Currency, Freq As Currency
Dim Overhead As Currency, A As Long, I As Long
QueryPerformanceFrequency Freq
QueryPerformanceCounter Ctr1
QueryPerformanceCounter Ctr2
Overhead = Ctr2 - Ctr1 ' determine API overhead
QueryPerformanceCounter Ctr1 ' time loop

For I = 1 To 20000
Rem A = Range("A1").Value
Rem A = [A1].Value
Rem A = Cells(1, 1).Value
Next

QueryPerformanceCounter Ctr2
Debug.Print (Ctr2 - Ctr1 - Overhead) / Freq
End Sub

I run this 35 times for each, and here are the averages I'm
looking at:

Range Object: ~.289 Seconds
Evaluate Method: ~.539 Seconds
Cells Property: ~.276 Seconds

I see a 186.8% performance-hit variance with the Evaluate
Method relative to the Range Object (a difference of .25
seconds), and I get a 4.4% performance-hit variance with
the Range Object relative to the Cells Property.

So, it appears the performance of various approaches aren't
moving linearly together, and the Evaluate Method's
performance decays to a greater extent with application
relative to the Range Object. As 41.5% and 186.8% don't
strike me as small variances, I would have to say, the
short hand isn't worth it...

As for the Evaluate Method being ugly with respect to Range
References, I'm under the opposite impression; I think it
actually looks better than longer reference approaches. I
tend to like shorter code if the performance holds up. And
yes, based on these numbers, it is abusive with VBA, while
being less abusive on my keyboard. It seems one's loss is
another's gain. <g>

Regards,
Nate Oliver
 