Doc said:
I've been using Macrium Reflect for a while and it's worked fine up to
now, but just encountered an odd glitch. I set it on max compression
but the backup file is actually larger than the h/d contents. This is
the first time I've encountered this.
Any suggestions?
Thanks.
The software designer has some choices.
For example, if the designer notices that the compressor does
a poor job on a section of the archive, he has the option of
storing the uncompressed version of that section instead. That
would cap the size of the output archive at no bigger than the
original files, plus a little overhead: the output archive might
be, say, 1% larger than the original files.
It sounds like, in this case, the output of the compressor
is being accepted no matter what it puts out (big or small).
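A minimal sketch of that fallback in Python, using zlib. The 64 KB
chunk size and the one-byte "compressed or raw" flag are my own
inventions for illustration, not Macrium's actual format:

```python
import zlib

CHUNK = 64 * 1024  # assumed block size; a real tool picks its own


def pack(data: bytes) -> bytes:
    """Compress data chunk by chunk, but store a chunk raw whenever
    compression would actually make it bigger."""
    out = bytearray()
    for i in range(0, len(data), CHUNK):
        chunk = data[i:i + CHUNK]
        comp = zlib.compress(chunk, 9)
        if len(comp) < len(chunk):
            # 'C' flag + 4-byte length + compressed payload
            out += b"C" + len(comp).to_bytes(4, "big") + comp
        else:
            # 'R' flag + 4-byte length + the original bytes
            out += b"R" + len(chunk).to_bytes(4, "big") + chunk
    return bytes(out)
```

With that rule, the archive can never exceed the original data by
more than a few header bytes per chunk, no matter what the
compressor emits.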
*******
You could:
1) Create a backup with compression disabled.
2) Run a copy of 7ZIP over the output file yourself.
That would take forever, but it would give good compression.
It would also complicate bare metal recovery later, when you
want to restore from backup. (You'd have to decompress the 7ZIP
archive first, then run Macrium to do the restore. Probably
requiring an extra hard drive, and a Linux LiveCD.)
I've used such methods, but they're not exactly convenient.
You're probably better off just dealing with the bloated
backup instead.
*******
Macrium has a forum, and you can ask about the behavior over
there.
The links below cover some of the key technologies involved. With
the second one, when the backup runs, the output could be emitted
in cluster order, rather than in file order. So the compressor may
not be looking at an entire, contiguous file when it does a
compression. It could be looking at clusters from different files,
sitting next to one another (if the original disk is fragmented).
[And no, this is not a suggestion to defragment the disk first...
It's a partial explanation of the tough job the developer has to
do, to make good compressions.] As far as I know, Macrium is not
a "file by file" backup tool, the way NTBACKUP might be. So
perhaps it would be a trifle complicated, to have the compressor
pick and choose what to compress.
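Here's a toy Python experiment showing why cluster order can hurt
(my own illustration, with zlib standing in for whatever LZ variant
the backup tool uses). Two "files" that each repeat their own
content compress well stored contiguously, but poorly with their
clusters interleaved, because the repeats fall outside the
compressor's 32 KB match window:

```python
import random
import zlib

CLUSTER = 20000  # assumed cluster run length, chosen to sit inside
                 # zlib's 32 KB match window once, but not twice

# Two "files", each its own 20000-byte pattern repeated once.
# Deterministic pseudo-random bytes stand in for file content.
rnd = random.Random(42)
unit_a = bytes(rnd.randrange(256) for _ in range(CLUSTER))
unit_b = bytes(rnd.randrange(256) for _ in range(CLUSTER))

contiguous  = (unit_a + unit_a) + (unit_b + unit_b)  # file order
interleaved = unit_a + unit_b + unit_a + unit_b      # cluster order

# In file order, each repeat sits 20000 bytes back, inside the
# window, so it compresses away. Interleaved, each repeat is 40000
# bytes back, outside the window, and the data looks incompressible.
print(len(zlib.compress(contiguous, 9)),
      len(zlib.compress(interleaved, 9)))
```

Same bytes, same compressor; only the ordering changed, and the
interleaved layout comes out roughly twice the size.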
http://en.wikipedia.org/wiki/Macrium_Reflect
"Abraham Lempel LZ based compression..."
http://en.wikipedia.org/wiki/Volume_Snapshot_Service
A bloated archive would result from the developer doing
something like this: simple pipelining of processes, with
no decision making along the way.
disk ---> VSS ---> ALZ ---> write_to_disk
HTH,
Paul