CM said:
I need to run many processes at the same time but I keep running into
this exception:
System.ComponentModel.Win32Exception: Not enough storage is available to process this command
   at System.Diagnostics.Process.StartWithShellExecuteEx(ProcessStartInfo startInfo)
   at System.Diagnostics.Process.Start(ProcessStartInfo startInfo)
Are there any limits to the number of processes started at the same
time? For instance, about 20 processes are being started from multiple
threads at roughly the same time.
There is a limit, but 20 processes – even started simultaneously – should
be well under it. I don't know the exact figure off the top of my head,
but even on 32-bit Windows I'd expect the limit to be on the order of
1000 or so; even hundreds of processes should not be a problem, at least
in terms of them running (see below for my observations with respect to
starting them simultaneously).
So, you've got a problem. But the vague description of the problem
doesn't suggest any particular solution and there's no way for anyone to
advise regarding the problem without a concise-but-complete code example
that reliably demonstrates the problem.
For what it's worth, I've included an actual concise-but-complete code
example (see below), demonstrating the near-simultaneous initiation of
200 (or any number you want) processes, and at 200 it runs without any
trouble at all on my 32-bit Windows 7 installation, in a virtual machine
no less (configured for 768MB of RAM).
Interestingly, beyond 200 processes, I start to see failures. But the
exact nature, and in fact exact number, is variable. If I increase the
number to 300, then I only get the exception you're seeing, and actually
in some cases wind up starting _fewer_ than 200 processes successfully
(the exact outcome is variable). It seems that there is some temporary
overhead involved during the initialization of a process such that
simultaneous initialization of an excessive number of processes can
reduce the total number possible.
At 500, I still only see about 100 failures. There is apparently some
kind of bottleneck: as throughput suffers and the number of threads that
can really be fighting to start a process at once maxes out, the
launches get stretched out over a longer period, so more of them
succeed.
At much higher numbers – 750 to 1000 – I also start seeing plain old
"out of memory" exceptions. I have a harder time understanding these:
the exception you're seeing, I assume, relates to some unmanaged
resource that is in short supply (non-paged pool, for example), but it's
hard to imagine what kind of managed data structure could be both
necessary for this operation and large enough that only 1000 of them
causes a problem.
Interestingly, even at 1000 I don't actually have a problem with the
number of _threads_ I'm creating. It's still the processes themselves
(though, keeping in mind that some of the same kinds of resources
required for a process may be required for a thread, so increasing the
number of threads can actually make a failure to start a process more
likely; that may explain the occasions where past 200 processes, the
program actually fails to create even the 200 that it could before).
I ran the same program on a different PC, with 64-bit Windows 7
installed, and oddly enough got almost the same behavior. So whatever
limit you're running into, it's a resource limitation that is the same
between 32-bit and 64-bit Windows. I'd think that would rule out
non-paged pool memory: even though the rules for sizing that pool are
the same on 32-bit and 64-bit Windows, the two installations I tried
have very different physical RAM configurations (768MB vs 12GB), so the
64-bit machine should have a limit roughly 16 times higher than the
32-bit one.
The only difference I noticed was that on 64-bit Windows, I didn't get
any "out of memory" exceptions, even at 1000 processes (I didn't try
more, but I expect that to be true even at much higher numbers). That
makes sense to me: whatever ordinary managed memory allocation was
failing on 32-bit, the 64-bit system certainly should be able to handle
vastly more. On the 64-bit PC I saw only the "Not enough storage…"
exception, but I got those in about the same numbers, and at about the
same thresholds, as on 32-bit Windows.
One thing I did notice was that in fact, the non-paged pool does get
ridiculously high for the process while the process is trying to start
up the other processes. I saw it go as high as 500K, which for
non-paged pool is pretty large. Whether that's actually related to the
error, I'm not sure. As I note above, the limit for that should be much
higher on 64-bit Windows 7 and I didn't see a difference in behavior for
the two installations.
Anyway, the bottom line is that I only start to see failures when I get
to a point where I am starting a ridiculously large number of processes
all at once. And even there, the number of failures is mitigated by the
inability of that many threads to actually execute at once, so the more
I try to start, the more I am actually able to start.
As long as you really are only just trying to start on the order of
dozens at once, that should be no problem. If you find yourself trying
to start hundreds at once, then yes…it's entirely possible you'll start
seeing errors. But then, if you're trying to start that many processes
at once, there's something wrong with your design. That really is a
ridiculous number of processes to have running simultaneously, never
mind _started_ simultaneously, never mind have that many threads running
simultaneously (even that many threads total is usually excessive, but
having that many _runnable_ at the same time is definitely excessive).
If you can provide more specific information about the exact scenario,
perhaps more specific advice can be offered. In the meantime, if it
turns out you really are trying to start hundreds of processes
simultaneously, I'll suggest you fix the design so that doesn't happen.
Ideally, you'd simply bring whatever processing you're trying to do into
the main process, or at the very least implement the processing in some
kind of service that your process can interact with. But if you're
dealing with some external program that you have no way of modifying,
then at the very least serialize the execution of that program so that
you only have a small number of instances running at once (preferably no
more than the total number of CPU cores on the computer, since anything
much more than that you're only going to hurt total throughput anyway).
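To illustrate that last suggestion, here's a minimal sketch of gating
launches with a Semaphore sized to the CPU count. All the names here
(ThrottledLauncher, Run, InterlockedMax) are made up for the example,
and the Thread.Sleep is just a stand-in for an actual Process.Start
followed by WaitForExit:

```csharp
using System;
using System.Threading;

class ThrottledLauncher
{
    // Gate limiting how many launches may be in flight at once;
    // sized to the CPU count, per the advice above.
    static readonly int s_maxConcurrent = Environment.ProcessorCount;
    static readonly Semaphore s_gate =
        new Semaphore(s_maxConcurrent, s_maxConcurrent);

    static int s_inFlight;
    static int s_maxObserved;

    // Launches totalJobs "processes", never more than s_maxConcurrent
    // at a time; returns the highest concurrency actually observed.
    public static int Run(int totalJobs)
    {
        Thread[] threads = new Thread[totalJobs];

        for (int i = 0; i < totalJobs; i++)
        {
            threads[i] = new Thread(delegate()
            {
                s_gate.WaitOne();   // block until a slot is free

                try
                {
                    int now = Interlocked.Increment(ref s_inFlight);
                    InterlockedMax(ref s_maxObserved, now);

                    // Placeholder for Process.Start(psi) followed by
                    // process.WaitForExit()
                    Thread.Sleep(25);

                    Interlocked.Decrement(ref s_inFlight);
                }
                finally
                {
                    s_gate.Release();   // hand the slot to the next job
                }
            });
            threads[i].Start();
        }

        foreach (Thread t in threads)
        {
            t.Join();
        }

        return s_maxObserved;
    }

    // Lock-free "store value into target if it is larger"
    static void InterlockedMax(ref int target, int value)
    {
        int current = target;

        while (value > current)
        {
            int previous =
                Interlocked.CompareExchange(ref target, value, current);

            if (previous == current)
            {
                break;
            }

            current = previous;
        }
    }

    static void Main()
    {
        Console.WriteLine("peak concurrency: {0} (limit {1})",
            Run(100), s_maxConcurrent);
    }
}
```

The one-thread-per-job structure mirrors the test program below; in a
real design you'd more likely use just a handful of worker threads
pulling jobs from a queue, but the Semaphore makes the "no more than N
at once" constraint explicit either way.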
Pete
using System;
using System.Threading;
using System.Diagnostics;
using System.Collections.Generic;

namespace TestMultiProcess
{
    class Program
    {
        const int kcprocessMax = 200;

        static void Main(string[] args)
        {
            int cprocessMax = kcprocessMax;

            if (args.Length > 0)
            {
                int cprocess;

                if (!int.TryParse(args[0], out cprocess))
                {
                    Console.WriteLine("dummy process");
                    Thread.Sleep(2000);
                    return;
                }

                cprocessMax = cprocess;
            }

            Console.WriteLine("attempting to start {0} processes", cprocessMax);

            object objLock = new object();
            ManualResetEvent mre = new ManualResetEvent(false);
            int cthreadStarted = 0;

            // Create a bunch of threads, all of which will start a process
            for (int cprocess = 0; cprocess < cprocessMax; cprocess++)
            {
                new Thread(delegate()
                {
                    lock (objLock)
                    {
                        if (++cthreadStarted == cprocessMax)
                        {
                            Monitor.Pulse(objLock);
                        }
                    }

                    // Don't proceed until all threads are waiting
                    mre.WaitOne();

                    // At this point, all threads should try to start
                    // a new process, more or less simultaneously
                    StartProcess();

                    lock (objLock)
                    {
                        if (--cthreadStarted == 0)
                        {
                            Monitor.Pulse(objLock);
                        }
                    }
                }).Start();
            }

            lock (objLock)
            {
                while (cthreadStarted < cprocessMax)
                {
                    Monitor.Wait(objLock);
                }
            }

            // Once all the threads are ready to go, wake them all up
            mre.Set();
            Console.WriteLine("threads released");

            lock (objLock)
            {
                while (cthreadStarted > 0)
                {
                    Monitor.Wait(objLock);
                }
            }

            Console.WriteLine("all threads done");
            Console.WriteLine("  success: " + cprocessStarted);

            foreach (KeyValuePair<ExceptionDescription, int> kvp in dictExceptions)
            {
                Console.WriteLine("  failure, {0}: {1}", kvp.Key.Type, kvp.Value);
                Console.WriteLine("    (\"{0}\")", kvp.Key.Message);
            }

            Console.ReadLine();
        }

        static int cprocessStarted;
        static Dictionary<ExceptionDescription, int> dictExceptions =
            new Dictionary<ExceptionDescription, int>();
        static readonly object objDict = new object();

        struct ExceptionDescription
        {
            public readonly Type Type;
            public readonly string Message;

            public ExceptionDescription(Type type, string strMessage)
            {
                Type = type;
                Message = strMessage;
            }

            public override bool Equals(object obj)
            {
                if (obj.GetType() != typeof(ExceptionDescription))
                {
                    return false;
                }

                return ((ExceptionDescription)obj).Type == Type;
            }

            public override int GetHashCode()
            {
                return Type.GetHashCode();
            }
        }

        static void StartProcess()
        {
            try
            {
                ProcessStartInfo psi =
                    new ProcessStartInfo("TestMultiProcess.exe", "foo");

                Process.Start(psi);
                Interlocked.Increment(ref cprocessStarted);
            }
            catch (Exception exc)
            {
                AddException(exc);
            }
        }

        static void AddException(Exception exc)
        {
            ExceptionDescription exd =
                new ExceptionDescription(exc.GetType(), exc.Message);

            lock (objDict)
            {
                int cexd;

                if (!dictExceptions.TryGetValue(exd, out cexd))
                {
                    cexd = 0;
                }

                cexd++;
                dictExceptions[exd] = cexd;
            }
        }
    }
}