Steve McLellan
Hi,
Wondering if anyone can shed some light on something that's troubled us for
some time. We write computationally expensive image-processing apps for
Windows 98, 2000, and XP, as well as Mac OS X. We tile all our calculations,
both for responsiveness and for memory reasons, though originally we only
did this after hitting memory allocation problems under Windows. My question
is: is there any way to predict how much contiguous memory the OS will let
you allocate at once? The figure seems enormously lower than the actual
system memory, which I imagine is a result of things getting loaded into an
application's address space wherever the OS pleases, fragmenting it. Is
there any lower bound on this (i.e. can I ALWAYS be sure of getting, say,
128MB allocated in a single block), or any way of asking (nicely, of course)
for things to be shuffled about to make more room? I'm mainly concerned with
doing this under XP, as we're not planning to support earlier OSs for the
current project.
Thanks,
Steve