Frank B Denman
Hi Folks,
I'm experimenting with using NTFS compression when backing up across the network
to a USB 2.0 drive on a workstation. The backup software is the native ntbackup,
the uncompressed .bkf files are around 38GB, and the USB drive is 250GB. Ntbackup
is running on a Win2k SP4 server, and the USB drive is on a WinXP Pro SP2 machine.
Just to verify that compression was working, I copied an existing 16.8GB .bkf
file to the empty compressed USB drive, where its size on disk came out to
14.1GB. Disk Defragmenter sees the copied .bkf file as having 2,587 fragments.
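For reference, here's how I'm reading those two numbers: a minimal Win32 sketch
in C that compares a file's logical size with its size on disk via
GetCompressedFileSize. The E:\Backup.bkf path is just a placeholder:

#include <windows.h>
#include <stdio.h>

int main(void)
{
    const char *path = "E:\\Backup.bkf";  /* placeholder path */
    LARGE_INTEGER logical;
    ULARGE_INTEGER onDisk;
    DWORD hi, lo;

    /* Logical (uncompressed) file size */
    HANDLE h = CreateFileA(path, GENERIC_READ, FILE_SHARE_READ, NULL,
                           OPEN_EXISTING, 0, NULL);
    if (h == INVALID_HANDLE_VALUE) {
        fprintf(stderr, "CreateFile failed: %lu\n", GetLastError());
        return 1;
    }
    if (!GetFileSizeEx(h, &logical)) {
        fprintf(stderr, "GetFileSizeEx failed: %lu\n", GetLastError());
        CloseHandle(h);
        return 1;
    }
    CloseHandle(h);

    /* Bytes actually allocated on disk, after NTFS compression */
    lo = GetCompressedFileSizeA(path, &hi);
    if (lo == INVALID_FILE_SIZE && GetLastError() != NO_ERROR) {
        fprintf(stderr, "GetCompressedFileSize failed: %lu\n", GetLastError());
        return 1;
    }
    onDisk.LowPart = lo;
    onDisk.HighPart = hi;

    printf("Logical size: %lld bytes\n", (long long)logical.QuadPart);
    printf("Size on disk: %llu bytes\n", (unsigned long long)onDisk.QuadPart);
    return 0;
}

GetCompressedFileSize reports the bytes actually allocated, which should match
the "Size on disk" figure in the file's Properties dialog.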
Next, I ran a backup across the network to the same compressed USB drive. The
result is a 37.9GB .bkf file whose size on disk is 29.2GB. Disk Defragmenter
sees this file as having 151,238 fragments.
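For what it's worth, the fragment count can also be sanity-checked without Disk
Defragmenter. Here's a sketch along the same lines (again Win32 in C, with a
placeholder path) that counts the file's allocation extents via
FSCTL_GET_RETRIEVAL_POINTERS; the extent count corresponds roughly to the
fragment count a defragmenter reports:

#include <windows.h>
#include <winioctl.h>
#include <stdio.h>

int main(void)
{
    const char *path = "E:\\Backup.bkf";       /* placeholder path */
    DWORDLONG rawbuf[8 * 1024];                /* 64KB, 8-byte aligned */
    RETRIEVAL_POINTERS_BUFFER *rp = (RETRIEVAL_POINTERS_BUFFER *)rawbuf;
    STARTING_VCN_INPUT_BUFFER in;
    unsigned long long extents = 0;
    DWORD bytes;

    HANDLE h = CreateFileA(path, FILE_READ_ATTRIBUTES,
                           FILE_SHARE_READ | FILE_SHARE_WRITE, NULL,
                           OPEN_EXISTING, 0, NULL);
    if (h == INVALID_HANDLE_VALUE) {
        fprintf(stderr, "CreateFile failed: %lu\n", GetLastError());
        return 1;
    }

    in.StartingVcn.QuadPart = 0;
    for (;;) {
        /* Ask NTFS for the file's extent map, one buffer-full at a time */
        BOOL ok = DeviceIoControl(h, FSCTL_GET_RETRIEVAL_POINTERS,
                                  &in, sizeof(in), rp, sizeof(rawbuf),
                                  &bytes, NULL);
        if (!ok && GetLastError() != ERROR_MORE_DATA) {
            fprintf(stderr, "FSCTL_GET_RETRIEVAL_POINTERS failed: %lu\n",
                    GetLastError());
            break;
        }
        extents += rp->ExtentCount;
        if (ok)
            break;                             /* final batch received */
        /* More extents remain; resume after the last one returned */
        in.StartingVcn = rp->Extents[rp->ExtentCount - 1].NextVcn;
    }
    CloseHandle(h);

    printf("%s: %llu extents\n", path, extents);
    return 0;
}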
Merely deleting a file like this takes 10-20 minutes of maxed-out CPU.
So I'm wondering: is this expected behavior with large files and NTFS
compression?
Thanks
Frank
Frank Denman
Denman Systems