My Canon software opens files that are around 5 MB in size. If I open 10-20
of them (50-100 MB of files) and do some work in all of them (change white
balance, crop, resize, convert from Canon raw format to JPEG, etc.), I can fill
my 1.5 GB of RAM very quickly. My point is, don't be fooled by file size - it
has almost nothing to do with how much RAM the software will use.
Once your 50 MB of files are opened and manipulated, you will definitely use
more than 50 MB of RAM - I would expect you could easily be using 500 MB
before you've done much of anything! It all depends on the software and what
you are doing.
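To see why a small file balloons in memory, consider that a raw file is compressed on disk but decoded to a full uncompressed bitmap for editing. The numbers below are illustrative assumptions (an 8-megapixel sensor decoded to 16-bit RGB), not measurements of any particular program:

```python
# Rough sketch of why a ~5 MB raw file takes far more than 5 MB once decoded.
# Sensor dimensions and bit depth here are assumptions for illustration.

def decoded_size_mb(width, height, channels=3, bytes_per_channel=1):
    """In-memory size of an uncompressed bitmap, in MB."""
    return width * height * channels * bytes_per_channel / 1e6

# A ~5 MB raw file from an 8-megapixel sensor (3504 x 2336),
# decoded to 16-bit RGB for editing:
working_copy = decoded_size_mb(3504, 2336, channels=3, bytes_per_channel=2)
print(round(working_copy))  # → 49 (MB per image, before any undo history)

# Open 15 such images and keep just one undo snapshot of each:
print(round(15 * working_copy * 2))  # → 1473 (MB - most of a 1.5 GB machine)
```

Add undo buffers, previews, and conversion scratch space on top of that, and 1.5 GB disappears quickly.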
An experiment, then. Open programs: software firewall, file manager, email,
browser, newsreader, bitmap editor. Used memory: 202 MB.
Opened a 54 MB BMP file; used memory: 240 MB. Made five copies of it and
kept all six files open; used memory: 502 MB. Anything above that starts
using the pagefile. Six copies of the same file open at the same time is
definitely overkill. On top of that, I opened 10 Firefox windows, each filled
with different Yahoo pages; used memory is now 562 MB (installed physical: 512)
and there is *still* no appreciable slowdown in any of the above programs.
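The deltas in those figures can be tallied directly. A quick back-of-the-envelope check (all values in MB, taken from the experiment above):

```python
# Memory readings reported in the experiment, in MB.
baseline = 202        # firewall, file manager, email, browser, newsreader, editor
one_bmp_open = 240    # after opening the 54 MB BMP
six_bmps_open = 502   # after making five more copies and keeping all six open

per_extra_copy = (six_bmps_open - one_bmp_open) / 5
print(per_extra_copy)  # → 52.4

# ~52 MB per copy, close to the 54 MB file size - which is what you would
# expect for BMP, since its pixel data is stored uncompressed, so the
# in-memory size is roughly the on-disk size.
```

Note this near-1:1 ratio is specific to uncompressed formats like BMP; a compressed raw or JPEG file expands by a much larger factor when decoded.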
On top of that, I can even play a 600 MB DivX file (used memory becomes
608 MB); it loads with a delay but plays fine. In other words, I can
do *anything* with that 50 MB file *and* keep several intermediate
copies of it open for comparison *and* keep all the other programs running
fine. IMHO, this kind of load is very atypical for the vast majority of
users, wouldn't you agree? There is absolutely no question that there are
tasks that benefit from very large memory sizes, but those are rare,
particularly in the consumer market.
DK