Double vs Variant


JVBeaupre

While experimenting with speeding things up by explicitly specifying
the data type, I noticed that small differences were occurring between
executions with and without "As Double" in Dim & Private statements
for arrays.
Does anyone have similar experiences or understand how this can happen?
 
JVBeaupre said:
While experimenting with speeding things up by
explicitly specifying the data type, I noticed
that small differences were occurring between
executions with and without "As Double" in Dim
& Private statements for arrays.

Does anyone have similar experiences or understand
how this can happen?

What surprises you about that? Which way do the "small differences" go?

It should be no surprise that typing variables as Double instead of Variant
(implicitly or explicitly) would improve performance. With a Variant
variable, VB must interpret the subtype before performing any calculation.
With a Double variable, VB always knows how to load and store the value.
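For example, a minimal sketch along these lines (the Sub name and loop count
are my own choices, and absolute timings will vary by machine) shows the
difference directly:

    Sub CompareDoubleVsVariant()
        Const n As Long = 2000000
        Dim dblArr() As Double, varArr() As Variant
        Dim i As Long, t As Single

        ReDim dblArr(1 To n)
        ReDim varArr(1 To n)

        t = Timer                             ' Timer returns seconds since midnight
        For i = 1 To n
            dblArr(i) = dblArr(i) + 1.5       ' type known at compile time
        Next i
        Debug.Print "Double:  "; Timer - t; " sec"

        t = Timer
        For i = 1 To n
            varArr(i) = varArr(i) + 1.5       ' subtype checked on every operation
        Next i
        Debug.Print "Variant: "; Timer - t; " sec"
    End Sub

In my experience, the Double loop is noticeably faster, precisely because the
Variant loop must inspect the subtype on each pass.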

If you are saying that typing variables as Variant (implicitly or
explicitly) instead of Double improved performance, that would be
surprising. Except....

Caveat about "small differences": some variation, even without code changes,
is likely. It can be due to many factors, including but not limited to
interrupt handling and CPU utilization by other processes. I have also seen
"large differences" between the first and subsequent times that I execute a
macro or UDF after making changes. I always ignore the first execution or
the first iteration.

And if you are measuring the performance of macros per se, there are "well
known" steps that you might need to take to minimize the effect that Excel
itself might have. (Sigh, the URL escapes me at the moment.)
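In broad strokes (a sketch, not the article I had in mind), the usual steps
are to turn off screen updating, automatic calculation, and event handling
around the timed region, restoring them afterward:

    Sub RunQuietly()
        Dim oldCalc As XlCalculation
        oldCalc = Application.Calculation

        Application.ScreenUpdating = False    ' suppress repainting
        Application.EnableEvents = False      ' suppress event handlers
        Application.Calculation = xlCalculationManual

        ' ... run the code being measured here ...

        Application.Calculation = oldCalc     ' restore the original settings
        Application.EnableEvents = True
        Application.ScreenUpdating = True
    End Sub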

Finally, the terms "small" and "large" are subjective. It would help if you
were more specific, and if you explained exactly how you are measuring the
differences. Performance measurement is an art. It is not as straightforward
as some articles would (mis)lead you to believe; in particular, the
interpretation of the results is not straightforward. Certainly, some
techniques are more accurate than others. But that accuracy can also expose
"small differences" that are to be expected. You need to average your
observations, and you may need to use statistical techniques to compare
"small differences" in the averages.
 
JVBeaupre said:
While experimenting with speeding things up by explicitly specifying
the data type, I noticed that small differences were occurring between
executions with and without "As Double" in Dim & Private statements
for arrays.
Does anyone have similar experiences or understand how this can happen?

Hi JB,
For speed optimization, have a look at http://www.decisionmodels.com/
and http://www.decisionmodels.com/optspeed.htm

HTH,
have a nice day
Cheers Michael
 