Difference between ???.Close and Set ??? = Nothing

  • Thread starter: Guest

Guest
Hi,

What's the difference between Close and Nothing for cleanup?
i.e.,
rst.Close
db.Close
Set rst = Nothing
Set db = Nothing

Is the purpose of cleanup to free up the RAM?

Also, I have been using Excel to retrieve some data from Access using ADO,
and I ran into some bugs before the Sub got to rst.Close. Does that mean
some memory is still locked up?

Thanks
 
The golden rule is:
"If you opened it, you should close it."

BTW - you did not open db so you should not close it.

Your code should have error handlers.
When you get an error, you should jump to your clean up code to ensure you
do not leave stray references.
The point of Nothing is to remove the reference to your object so that the
RAM can be freed up.

Your code should look something like:

MyErrorHandler:
    'Ignore any errors while cleaning up.
    On Error Resume Next
    rst.Close
    Set rst = Nothing
    Set db = Nothing
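A fuller sketch of that pattern, assuming a DAO recordset opened against the
current database (ProcessData and "MyTable" are placeholder names for
illustration). Note there is deliberately no Exit Sub before the handler, so
the cleanup runs on the normal path as well as after an error:

```vba
Public Sub ProcessData()
    Dim db As DAO.Database
    Dim rst As DAO.Recordset

    On Error GoTo MyErrorHandler

    'CurrentDb was not opened by us, so we do not Close it later.
    Set db = CurrentDb
    Set rst = db.OpenRecordset("MyTable")

    '... work with rst here ...

MyErrorHandler:
    'Ignore any errors while cleaning up.
    On Error Resume Next
    rst.Close
    Set rst = Nothing
    Set db = Nothing
End Sub
```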
 
...
The point of Nothing is to remove the reference to your object so that the
RAM can be freed up.

If the object variable is merely a pointer (reference) you'll only be
releasing 4 bytes, and I've yet to see code that explicitly
empties all Long variables <g>. More importantly, though, you'll be
decreasing the reference count so the actual object may be released
(if this doesn't happen you'll get a memory leak). However, it is rare
that an object variable needs to be explicitly set to nothing; simply
let it go out of scope and let the garbage collector do its work. If
you always set object variables to Nothing you risk obscuring the code
that *needs* to set object variables to Nothing e.g. explicit teardown
code to resolve circular references.
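To make that last case concrete, here is a minimal sketch of a circular
reference (two hypothetical class modules, CParent and CChild, each holding
a reference to the other). With such a cycle the reference counts never
reach zero on their own, so explicit teardown is genuinely required:

```vba
'In class module CParent:
'    Public Child As CChild
'In class module CChild:
'    Public Parent As CParent

Public Sub DemoCircular()
    Dim p As CParent
    Dim c As CChild

    Set p = New CParent
    Set c = New CChild
    Set p.Child = c    'p holds c...
    Set c.Parent = p   '...and c holds p: a cycle.

    'Letting p and c go out of scope is NOT enough here;
    'each object still references the other, so neither is destroyed.
    'Explicit teardown breaks the cycle first:
    Set c.Parent = Nothing
    Set p = Nothing
    Set c = Nothing
End Sub
```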

Jamie.

--
 
In the VBA IDE I always use Toggle Breakpoint to test code, and I
frequently use it to stop execution whenever I find an error.
As a result, the code never runs to completion.
Will the garbage collector still collect the objects and variables that
were referenced?

Is there a way I could find RAM that is locked up as a result of not
freeing resources (variables) properly?

Thanks
 
Jamie Collins said:
...


If the object variable is merely a pointer (reference) you'll only be
releasing 4 bytes, and I've yet to see code that explicitly
empties all Long variables <g>. More importantly, though, you'll be
decreasing the reference count so the actual object may be released
(if this doesn't happen you'll get a memory leak). However, it is rare
that an object variable needs to be explicitly set to nothing; simply
let it go out of scope and let the garbage collector do its work. If
you always set object variables to Nothing you risk obscuring the code
that *needs* to set object variables to Nothing e.g. explicit teardown
code to resolve circular references.

There was a time, though, when the garbage collector could not be relied
on to do its work, and failing to explicitly set some object references
to nothing would cause Access to refuse to close. So a lot of people
have formed the habit of explicitly destroying all object variables
before they go out of scope. To me this seems a tidy habit, and I
follow it myself.

Against your concern about possibly obscuring those circumstances where
you really *must* clear the reference, I'd set the argument that
habitually doing so for all object variables makes it less likely that
one will overlook it when it's really necessary.
 
Dirk Goldgar said:
Against your concern about possibly obscuring those circumstances where
you really *must* clear the reference, I'd set the argument that
habitually doing so for all object variables makes it less likely that
one will overlook it when it's really necessary.

But unless you go back and remove the extraneous ones, it makes the
code harder to maintain e.g. for the developer who inherits your code.
I've recently inherited a code base where implicit ByRef has been used
for all arguments passed to subs/functions e.g.

Public Sub MakeWorkTable(TableName As String, TemplateName As String)

The original author probably knew which arguments *need* to be ByRef
but because their fallback 'lowest common denominator' position was to
always use ByRef, I must check each one. Then there was that other
developer who used implicit Variant data type for all variables...
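For comparison, this is what that implicit-Variant style looks like
(variable names are made up). Without an As clause every variable is a
Variant, which hides the author's intent in the same way implicit ByRef
does:

```vba
'Implicit: all three are Variant, whether that was intended or not.
Dim Total, Count, Name

'Explicit: the intent is visible at the declaration.
'(Beware the classic trap: Dim a, b, c As Long types only c as Long.)
Dim Total As Currency
Dim Count As Long
Dim Name As String
```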

Jamie.

--
 
Jamie Collins said:
But unless you go back and remove the extraneous ones, it makes the
code harder to maintain e.g. for the developer who inherits your code.

I don't know, Jamie; I don't think it does. If the rule is always to
explicitly set object variables to Nothing when you're done with them,
then if this rule is followed, it seems to me the "information content"
of the code is the same.
I've recently inherited a code base where implicit ByRef has been used
for all arguments passed to subs/functions e.g.

Public Sub MakeWorkTable(TableName As String, TemplateName As String)

The original author probably knew which arguments *need* to be ByRef
but because their fallback 'lowest common denominator' position was to
always use ByRef, I must check each one.

Now see, I think I follow the opposite protocol from you. I override
the default only and always when it matters. So if an argument *must*
be ByRef, because I'm going to change its value for the calling
procedure, I use the ByRef keyword. If an argument *must* be ByVal,
either because I plan to change its value inside my procedure and don't
want the change to be reflected upward, or because I hope to get a
performance boost from it, then I'll use the ByVal keyword.

To me, this policy conveys *more* meaning to the developer who might
read my code than just always explicitly stating "ByRef" because that's
the VB default.

More important than either policy, though, is to have a policy and be
consistent about it. That way everybody knows what to think when they
pick up somebody else's code.
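A sketch of that "override the default only when it matters" policy
(procedure and argument names are hypothetical):

```vba
'ByRef is stated only because the caller's variable is meant to change.
Public Sub IncrementCounter(ByRef Counter As Long)
    Counter = Counter + 1
End Sub

'ByVal is stated because the procedure mutates its local copy only.
Public Sub NormalizeName(ByVal RawName As String)
    RawName = Trim$(RawName)
    Debug.Print RawName   'The caller's string is untouched.
End Sub

'No keyword: the default (ByRef) is fine and nothing depends on it.
Public Function FullName(FirstName As String, LastName As String) As String
    FullName = FirstName & " " & LastName
End Function
```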
Then there was that other
developer who used implicit Variant data type for all variables...

Yuck!
 