Much larger memory use, normal?

  • Thread starter: Afanasiy

Afanasiy

Should I be concerned with the amount of memory my C# applications use?

I have 2 gigs of ram, and I use most of that without running any C#
applications. So, when I see C# applications in development, which are
using much more memory than their Delphi and Python equivalents, I am a
bit worried.

Is there more than meets the eye or is it simply deemed acceptable for
the platform?

-AB
 
C# applications do have a larger footprint because the .NET framework has to
load into memory as well.
 
Should I be concerned with the amount of memory my C#
applications use?

I have 2 gigs of ram, and I use most of that without running any
C# applications. So, when I see C# applications in development,
which are using much more memory than their Delphi and Python
equivalents, I am a bit worried.

Is there more than meets the eye or is it simply deemed
acceptable for the platform?

Afanasiy,

Are you using Task Manager (TM) to calculate memory usage? If so, I
think you can ignore those numbers. AFAIK, Microsoft has never made
the algorithm TM uses to calculate memory usage public, so there is
no way to really know what TM is reporting.

Try this to see what I mean:

- open TM.
- start your .Net application.
- note your .Net app's memory usage as reported by TM.
- minimize your .Net app.
- again note your .Net app's memory usage as reported by TM.
- restore your .Net app.
- again note your .Net app's memory usage as reported by TM.

The three memory readings will vary wildly depending upon the state
of your application's main form.

Instead of using TM, use PerfMon:

http://msdn.microsoft.com/library/default.asp?url=/library/en-us/cpgenref/html/cpconruntimeprofiling.asp

(or http://tinyurl.com/ypt4t)

or get a .Net-specific memory profiler:

http://www.scitech.se/memprofiler/.
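
If you would rather read those counters from code than from the PerfMon UI,
something along these lines works (just a sketch; the category and counter
names are the standard ones PerfMon lists under ".NET CLR Memory", and the
instance name is assumed to match the process name):

using System;
using System.Diagnostics;

class ClrMemoryCounters
{
    static void Main()
    {
        // PerfMon instance names for the CLR counters normally match the
        // process name (without .exe); adjust if several copies are running.
        string instance = Process.GetCurrentProcess().ProcessName;

        PerformanceCounter heapBytes = new PerformanceCounter(
            ".NET CLR Memory", "# Bytes in all Heaps", instance);
        PerformanceCounter gen0Collections = new PerformanceCounter(
            ".NET CLR Memory", "# Gen 0 Collections", instance);

        Console.WriteLine("GC heap bytes:     {0}", heapBytes.NextValue());
        Console.WriteLine("Gen 0 collections: {0}", gen0Collections.NextValue());
    }
}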

Hope this helps.

Chris.
 
C# applications do have a larger footprint because the .NET framework has to
load into memory as well.

I suppose you mean every module I use adds to the memory footprint. This
happens in Delphi and Python as well: use a module and you incorporate
overhead.

If this is the reason C# applications use much more memory, then I
am surprised the resources for these modules are in no way shared.

I don't think this is exactly it though.
 
Afanasiy,

Are you using Task Manager (TM) to calculate memory usage? If so, I
think you can ignore those numbers. AFAIK, Microsoft has never made
the algorithm TM uses to calculate memory usage public, so there is
no way to really know what TM is reporting.

Actually, GetProcessMemoryInfo can report the same numbers Task Manager
does. I have never run an API monitor to verify that Task Manager calls it,
but the fact that the WorkingSetSize and PeakWorkingSetSize values returned
by the Win32 API GetProcessMemoryInfo have always matched the numbers shown
in Task Manager is enough for me.
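
For anyone who wants to try it, this is roughly how GetProcessMemoryInfo
can be called from C# via P/Invoke (a sketch only; the struct layout
follows the Win32 PROCESS_MEMORY_COUNTERS definition):

using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

class WorkingSetCheck
{
    [StructLayout(LayoutKind.Sequential)]
    struct PROCESS_MEMORY_COUNTERS
    {
        public uint cb;
        public uint PageFaultCount;
        public UIntPtr PeakWorkingSetSize;
        public UIntPtr WorkingSetSize;
        public UIntPtr QuotaPeakPagedPoolUsage;
        public UIntPtr QuotaPagedPoolUsage;
        public UIntPtr QuotaPeakNonPagedPoolUsage;
        public UIntPtr QuotaNonPagedPoolUsage;
        public UIntPtr PagefileUsage;
        public UIntPtr PeakPagefileUsage;
    }

    [DllImport("psapi.dll", SetLastError = true)]
    static extern bool GetProcessMemoryInfo(
        IntPtr hProcess, out PROCESS_MEMORY_COUNTERS counters, uint cb);

    static void Main()
    {
        PROCESS_MEMORY_COUNTERS pmc;
        uint size = (uint)Marshal.SizeOf(typeof(PROCESS_MEMORY_COUNTERS));

        // Query the current process; the same call works for any process
        // handle you have query-information access to.
        if (GetProcessMemoryInfo(Process.GetCurrentProcess().Handle, out pmc, size))
        {
            Console.WriteLine("Page faults:      {0}", pmc.PageFaultCount);
            Console.WriteLine("Working set:      {0} KB", pmc.WorkingSetSize.ToUInt64() / 1024);
            Console.WriteLine("Peak working set: {0} KB", pmc.PeakWorkingSetSize.ToUInt64() / 1024);
        }
    }
}
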
Try this to see what I mean:

- open TM.
- start your .Net application.
- note your .Net app's memory usage as reported by TM.
- minimize your .Net app.
- again note your .Net app's memory usage as reported by TM.
- restore your .Net app.
- again note your .Net app's memory usage as reported by TM.

The three memory readings will vary wildly depending upon the state
of your application's main form.

The memory use changes you see when minimizing applications are just
Windows calling SetProcessWorkingSetSize on the minimized process. This is
expected behaviour.

I have attempted some clever implementations of SetProcessWorkingSetSize
in my C# applications and they can bring down memory use, but that is
deceptive and destructive to performance. It is primarily a hack.
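
For reference, the hack boils down to something like this (P/Invoke
declarations follow the Win32 signatures; passing -1 for both limits asks
Windows to trim the working set as far as it can):

using System;
using System.Runtime.InteropServices;

class WorkingSetTrim
{
    [DllImport("kernel32.dll")]
    static extern IntPtr GetCurrentProcess();

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool SetProcessWorkingSetSize(
        IntPtr hProcess, IntPtr dwMinimumWorkingSetSize, IntPtr dwMaximumWorkingSetSize);

    // Ask Windows to page the working set out as far as possible.
    // Task Manager's number drops, but the pages fault straight back in
    // the next time they are touched, so nothing is really saved.
    public static void Trim()
    {
        SetProcessWorkingSetSize(GetCurrentProcess(), (IntPtr)(-1), (IntPtr)(-1));
    }
}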

-AB
 
Well, does anyone of you realize that

GetProcessWorkingSetSize

does NOT report memory used? It reports the size of the working set, which
does not mean that this set is actually allocated memory (virtual or
physical). The numbers reported back are larger for .NET applications because
the .NET runtime is a little aggressive in setting its working set size, even
though that does not allocate any resources (i.e. RAM in this context).

The Task Manager numbers are meaningless for managed applications.
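
If you want to see the difference from inside the process, put the GC's own
number next to the working set; a rough sketch:

using System;

class WorkingSetVsHeap
{
    static void Main()
    {
        // Bytes the garbage collector currently considers allocated
        // on the managed heap (without forcing a collection first).
        long managedBytes = GC.GetTotalMemory(false);

        // Physical memory currently mapped into this process,
        // i.e. the working set number Task Manager shows.
        long workingSet = Environment.WorkingSet;

        Console.WriteLine("Managed heap: {0} KB", managedBytes / 1024);
        Console.WriteLine("Working set:  {0} KB", workingSet / 1024);
    }
}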

--
Regards

Thomas Tomiczek
THONA Software & Consulting Ltd.
(Microsoft MVP C#/.NET)
(CTO PowerNodes Ltd.)
 
Well, does anyone of you realize that

GetProcessWorkingSetSize

does NOT report memory used? It reports the size of the working set, which
does not mean that this set is actually allocated memory (virtual or
physical). The numbers reported back are larger for .NET applications because
the .NET runtime is a little aggressive in setting its working set size, even
though that does not allocate any resources (i.e. RAM in this context).

If the memory use I am seeing were that benign, then I would have no way
to explain the much larger number of page faults in my C# applications.

As such, I don't think that's all there is to it. I think it is in fact
using much of the memory in the working set. It increases the working set
often and page faults often. That does not look like a working set that is
merely set large for aggressive performance.
 
A page fault also happens the first time newly allocated memory is touched;
it does not have to mean a trip to disk.

The working set in itself means nothing :-)
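
If you want to separate the two kinds of faults, the stock performance
counters help: the per-process "Page Faults/sec" counter includes soft
faults (for example the first touch of newly committed pages), while the
system-wide "Memory\Pages/sec" counter reflects pages actually read from or
written to disk. A small sketch, assuming a single instance of the process:

using System;
using System.Diagnostics;
using System.Threading;

class PageFaultWatch
{
    static void Main()
    {
        // Instance names in the "Process" category normally match the
        // process name (without .exe); adjust if several copies run.
        string instance = Process.GetCurrentProcess().ProcessName;

        // Per-process counter: soft and hard faults together.
        PerformanceCounter processFaults = new PerformanceCounter(
            "Process", "Page Faults/sec", instance);

        // System-wide counter: pages actually read from / written to disk.
        PerformanceCounter systemPaging = new PerformanceCounter(
            "Memory", "Pages/sec");

        for (int i = 0; i < 5; i++)
        {
            Thread.Sleep(1000);   // rate counters need two samples to report a rate
            Console.WriteLine("Process faults/sec: {0,8:F0}   System pages/sec: {1,8:F0}",
                processFaults.NextValue(), systemPaging.NextValue());
        }
    }
}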

--
Regards

Thomas Tomiczek
THONA Software & Consulting Ltd.
(Microsoft MVP C#/.NET)
(CTO PowerNodes Ltd.)
 
A page fault also happens the first time newly allocated memory is touched;
it does not have to mean a trip to disk.

The working set in itself means nothing :-)

So basically, you are saying it is not using 3-10 times as much memory,
and that the 3-10 times as many page faults have nothing to do with going
to secondary storage for program instructions and data, which would
indicate that a larger working set really is required (and I do watch the
working set increase).

That I don't believe, and I have to imagine that approaching and surpassing
my virtual and physical memory limits with a C# application would be
something to worry about.

As such, I can so far only conclude that C# is in fact using the memory in
its working set, which it increases as it uses more, and that the page
faults are not completely benign.

Given the replies, I must say it is simply something people have accepted.

Since manually setting the working set size causes even more dramatic
performance degradation, I cannot consider that a possible solution.

So, is there one? A way to show that C# applications are in fact not using
the memory in their working set, that they only increase it for harmless
reasons, and that the page faults are benign?

-AB
 