When "myObject" is a local variable of your subroutine, it is actually
a bad practice to set it to Nothing at the end of the body. That is
equivalent to calling GC.KeepAlive(). It prevents the object from
getting collected as early as it can be if the collection happens to
start just as the subroutine is busy executing but the remainder of
the body has no more references to the variable.
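For illustration, the pattern in question looks something like this
(shown here as the C# equivalent; DoWork and ExpensiveResource are
made-up names):

using System;

class ExpensiveResource
{
    public void Process()
    {
        Console.WriteLine("processing");
    }
}

class Demo
{
    static void DoWork()
    {
        ExpensiveResource myObject = new ExpensiveResource();
        myObject.Process();
        // ...more work that never reads myObject again...
        myObject = null; // the claim above: this keeps the object reachable
                         // until here, much as GC.KeepAlive(myObject) would,
                         // instead of letting it be collected after Process()
    }

    static void Main()
    {
        DoWork();
    }
}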
Unless VB.NET is doing something different to C#, it makes no
difference in release mode. The GC/JIT is clever enough to realise
that there will be no more *reads* of the variable, so the object can
be collected before the assignment is even made.
Here's a sample app which shows this:
using System;

class Test
{
    ~Test()
    {
        Console.WriteLine("Finalizer called");
    }

    static void Main()
    {
        Console.WriteLine("Before creation");
        Test t = new Test();

        // t is never read after this point (only written to below), so an
        // optimising JIT reports it as dead here and the collection can
        // reclaim the object straight away.
        GC.Collect();
        GC.WaitForPendingFinalizers();

        Console.WriteLine("Before assignment");
        t = null;
    }
}
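In a release build run without a debugger, "Finalizer called" should
appear before "Before assignment": the object is collected even though
t is still in scope, because it is never read again. In a debug build,
or under the debugger, the JIT extends the variable's lifetime to the
end of the method, so the finalizer message only shows up later, if at
all.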
Forgetting to call Dispose() when a class implements it is not
something you'd typically notice. Your app just runs a bit "heavy",
holding on to unmanaged operating system resources longer than
necessary. Once a garbage collection occurs, the finalizer ensures the
cleanup code runs and the resources are released. Note, however, that
you do risk an OutOfMemoryException, particularly when you use an
object that consumes lots of unmanaged memory but very little GC heap
space; Bitmap is a good example. The one and only example I know of
where not calling Dispose() causes a genuine leak: using a
BindingSource on a Windows Forms dialog and not calling Dispose() on
the form after ShowDialog() returns.
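As a rough sketch of the Bitmap case (this assumes a System.Drawing
reference; the sizes and loop count are made up):

using System.Drawing;

class BitmapPressure
{
    static void Main()
    {
        for (int i = 0; i < 1000; i++)
        {
            // Each Bitmap is a tiny managed object wrapping a large block of
            // unmanaged GDI+ memory, so the GC feels little pressure and may
            // not finalize old bitmaps before unmanaged memory runs out:
            // Bitmap leaky = new Bitmap(5000, 5000);   // no Dispose() - risky

            // Disposing deterministically releases the GDI+ memory each pass:
            using (Bitmap bmp = new Bitmap(5000, 5000))
            {
                // ... draw on bmp ...
            }
        }
    }
}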
Memory is just one resource, however. More importantly, failure to
dispose of streams and database connections can leave files locked and
keep pooled connections tied up well beyond the desired time.
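For example, a sketch of the deterministic cleanup (the file name and
connection string are placeholders):

using System.Data.SqlClient;
using System.IO;

class DisposeDemo
{
    static void Main()
    {
        // Without the using block the file handle stays locked until the
        // finalizer eventually runs; with it, the lock is released at once.
        using (FileStream stream = new FileStream("log.txt", FileMode.Append))
        {
            // ... write to the stream ...
        }

        // Dispose() (called by "using") returns the connection to the pool
        // immediately instead of holding it until garbage collection.
        using (SqlConnection conn = new SqlConnection("...connection string..."))
        {
            conn.Open();
            // ... run commands ...
        }
    }
}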
Basically, it's important to call Dispose, and it shouldn't be thought
of as an optional extra.
Jon