That's worst-case. And if the files are 2M, you'd have to use 100 HDs
to need the extra one, which really is a shrug.
And if the MFT is caught halfway through this process?
Perhaps I don't see the bad mileage in FATxx that you do, because I
don't use one big C: for everything. All of my FATxx-based systems
have C: with 4k clusters, limited to just under 8G, and that's where
most of the write traffic happens. The bulk of the capacity is in E:,
which is FAT32 with big clusters, but there's not much traffic there.
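For a rough idea of the overhead that split implies, here's a quick
back-of-the-envelope sketch, assuming the usual 4 bytes per FAT32 entry
and two FAT copies; the E: size below is just an example figure, not my
actual layout:

  /* Rough FAT32 bookkeeping overhead: 4 bytes per FAT entry, two copies.
     Volume sizes are illustrative, roughly matching the split above. */
  #include <stdio.h>

  static void fat_overhead(const char *name,
                           unsigned long long vol_bytes,
                           unsigned long cluster_bytes)
  {
      unsigned long long clusters = vol_bytes / cluster_bytes;
      unsigned long long fat_kib  = clusters * 4 / 1024;   /* one copy */

      printf("%-22s %9llu clusters, FAT %llu KiB x2 = %llu KiB\n",
             name, clusters, fat_kib, 2 * fat_kib);
  }

  int main(void)
  {
      fat_overhead("C: 8G, 4k clusters",     8ULL << 30,   4 * 1024);
      fat_overhead("E: 100G, 32k clusters", 100ULL << 30, 32 * 1024);
      return 0;
  }

That works out to about 16M of FAT for C:, and about 25M for a 100G E:
with 32k clusters; loose change either way.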
I'm certainly not recommending 120G HDs be set up as one big FAT32
volume; even though that still has maintainability advantages over
NTFS, the reliability gap may be as wide as you claim.
There is one factor that could lead to software crashes, and that is
uncertainty about free space. An app may query the system for free
space, be told there's enough, and then dump on the HD without
checking for success. Normally it would be concurrent traffic or disk
compression that causes this "oops", but FAT32 (not FAT16) does add an
extra factor: the free-space count cached in the FSInfo sector next to
the volume's boot record, which so often gets bent after a bad exit.
Then again, AFAIK startup always checks and recalculates this value;
it's one of the extra overheads of FAT32 at boot time.
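The boring fix on the app side is to treat the free-space query as a
hint and check every write for success anyway. A minimal sketch in
standard C (the filename and payload are just placeholders):

  /* Don't trust an earlier free-space query; check that the write, the
     flush and the close all actually succeeded.  Path is a placeholder. */
  #include <stdio.h>

  int save_blob(const char *path, const void *buf, size_t len)
  {
      FILE *f = fopen(path, "wb");
      if (!f)
          return -1;

      /* fwrite can "succeed" into the C library's buffer while the disk
         is already full, so the flush and close must be checked too. */
      if (fwrite(buf, 1, len, f) != len || fflush(f) != 0) {
          fclose(f);
          remove(path);            /* don't leave a truncated file behind */
          return -1;
      }
      if (fclose(f) != 0) {
          remove(path);
          return -1;
      }
      return 0;
  }

  int main(void)
  {
      static const char data[] = "example payload";
      return save_blob("report.txt", data, sizeof data - 1) ? 1 : 0;
  }

That doesn't stop a stale free-space figure from lying to the app, but
it does keep a full disk from silently turning into a mangled file.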
Well I should hope so, as that's the mileage I'd expect in FATxx as
well. It shouldn't blink even if free space fell to 5M. C: might
look like it blew up with 25M to go, but that's probably because, a few
seconds earlier, it may have been down to 0k free due to a temp or swap
splurge... then again, I've seen C: at 0k free and haven't seen data
carnage. The data carnage I do see is usually where RAM has been flaky
for some time, or there's been a malware strike, or the HD is failing,
or the PC was overclocked, etc.
FATxx doesn't just fall over for no good reason, from what I've seen,
though persistent issues from bad exits might compound into
cross-links later. I'm not sure that does happen, but it's possible,
though I'd expect to see a lot more cross-links if it did.
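For what it's worth, a cross-link is just two FAT chains claiming the
same cluster, which is easy enough to sketch. This is only an
illustration of the idea, not how SCANDISK actually does it, and the
fat[] and starts[] inputs here are toy values rather than anything read
off a real volume:

  /* Illustration only: flag any cluster claimed by more than one chain. */
  #include <stdint.h>
  #include <stdio.h>
  #include <stdlib.h>

  #define FAT32_CHAIN_END 0x0FFFFFF7u  /* >= this: bad cluster / end of chain */

  int find_crosslinks(const uint32_t *fat, uint32_t n_clusters,
                      const uint32_t *starts, int n_files)
  {
      int *owner = malloc(n_clusters * sizeof *owner);   /* -1 = unclaimed */
      if (!owner)
          return -1;
      for (uint32_t i = 0; i < n_clusters; i++)
          owner[i] = -1;

      int found = 0;
      for (int f = 0; f < n_files; f++) {
          for (uint32_t c = starts[f]; c >= 2 && c < n_clusters; ) {
              if (owner[c] == f)        /* chain loops back on itself */
                  break;
              if (owner[c] != -1) {     /* another chain already owns it */
                  printf("cluster %u cross-linked: files %d and %d\n",
                         (unsigned)c, owner[c], f);
                  found++;
                  break;                /* a real checker would keep mapping */
              }
              owner[c] = f;
              uint32_t next = fat[c] & 0x0FFFFFFFu;   /* top 4 bits reserved */
              if (next >= FAT32_CHAIN_END)
                  break;
              c = next;
          }
      }
      free(owner);
      return found;
  }

  int main(void)
  {
      /* Toy 16-entry FAT: file 0 runs 2->3->4, file 1 starts at 5 and
         wrongly points into cluster 4 as well, i.e. a cross-link. */
      uint32_t fat[16] = {0};
      fat[2] = 3; fat[3] = 4; fat[4] = 0x0FFFFFFFu;
      fat[5] = 4;
      uint32_t starts[2] = {2, 5};

      return find_crosslinks(fat, 16, starts, 2) == 1 ? 0 : 1;
  }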
Depends what has gone wrong. Corrupt data can be better than none,
especially where text is concerned; OTOH, half a .DLL is no bread