File Defragmentation for large file systems


Guest

What's the best way to defrag a large file system with TBs of data? It seems impossible to defrag a huge file system that is in constant use: an attempt often takes days and ends up being self-defeating. The only solution I've found is to limit the size of virtual disks to something manageable, often below 500 GB.

Thanks.
 
You might try this little utility
(http://www.whitneyfamily.org/Hacks/?item=Defrag). It lets you specify
folders, files, wildcards, etc. to defrag, and it supports partial-file
defragging (the defrag utility that comes with the OS defrags a whole
file or nothing).
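If you want to script that kind of targeted pass, the selection half is just wildcard expansion. A minimal sketch in Python (the patterns are made-up examples, and I don't know that utility's exact command line, so this only prints the target list for you to feed to whatever tool you use):

import glob

# Expand folder/wildcard patterns into a concrete list of defrag targets.
# These patterns are examples; point them at whatever actually fragments
# on your volumes (databases, VHDs, log files, etc.).
patterns = [r"D:\VirtualMachines\*.vhd", r"E:\Logs\**\*.log"]

targets = []
for pattern in patterns:
    targets.extend(glob.glob(pattern, recursive=True))

# Hand each path to whichever defrag utility you use; printed here because
# the exact command line varies by tool.
for path in targets:
    print(path)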

Generally, the time to defrag has relatively little bearing on the amount
of data (i.e. GBs), but a lot to do with the file/fragment count and
spindle speed. A file is, at minimum, one 'fragment' of the volume, and each
fragment requires a drive seek. So for each file (or fragment) there
is an induced seek overhead of a few milliseconds (most likely 2-4 ms).
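As a rough back-of-envelope (the fragment counts and the 3 ms figure below are assumptions, not measurements), the seek overhead alone already explains multi-hour runs:

# Rough estimate of defrag time dominated by seek overhead.
# Ignores transfer time, retries, and contention from the volume being in use.

def estimate_defrag_hours(fragment_count: int, seek_ms: float = 3.0) -> float:
    """Hours spent purely on seeks, at one seek per fragment."""
    total_seconds = fragment_count * (seek_ms / 1000.0)
    return total_seconds / 3600.0

if __name__ == "__main__":
    for fragments in (500_000, 2_000_000, 10_000_000):
        print(f"{fragments:>10,} fragments -> ~{estimate_defrag_hours(fragments):.1f} h of seek time alone")

On those assumed numbers, ten million fragments is already around 8 hours of pure seek time, which is why fragment count matters far more than raw GBs.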


Pat
 