-Originally posted by Bob N.-
The computers we have are different. I'm sure we'll need
a master image of each one.
Just for the sake of my own curiosity, in what way are
they different? Different OS and software configurations
and/or different types of hardware?
I need to know what you mean by "all the files that
tell your computer what kind of operating system it's
using and the like".
Most simply put: what your C drive looks like right after
you've installed all the software you want that machine to
use, without any actual user files (Word docs, JPEGs, etc.)
on it. By keeping a copy of this, you ensure that nothing
from potentially non-secure sources has been installed. No
sense in restoring an image that already has the source of
the corruption on it.
How should the various pieces be distributed for backup
purposes?
As I said, it works best if all of your user data files
are located in one space (a file server), that way you can
perform a single backup on one location nightly.
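As a sketch of what that nightly job could look like (the folder names here are hypothetical, and this assumes all the shared user files live in one directory on the server):

```python
import shutil
from datetime import date
from pathlib import Path

def nightly_backup(user_files_dir, backup_dir):
    """Zip the shared user-files folder into a dated archive.

    One location, one backup: everything under user_files_dir
    goes into a single compressed file named by today's date.
    """
    Path(backup_dir).mkdir(parents=True, exist_ok=True)
    archive = Path(backup_dir) / f"userfiles-{date.today():%Y%m%d}"
    # make_archive appends ".zip" and returns the archive's full path
    return shutil.make_archive(str(archive), "zip", user_files_dir)
```

Scheduled to run each night (via the Task Scheduler, cron, or similar), this is the whole backup routine, which is exactly the point of keeping user data in one place.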
Going back to one of my earlier questions, if a separate
HD was added to the fastest machine and used for backup
only, is that a good idea?
IMO, no. You'd be better off buying a CDRW drive
(rewritable CD drive) and fifty blank discs, assuming that
your master image of each machine takes less than 700 MB
once compressed with Ghost. I'd recommend checking
Norton/Symantec's website for their explanation of how
that works. However, a second hard drive would be a
wonderful spot to use as your file server.
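Since a standard CD only holds about 700 MB, it's worth verifying the compressed image actually fits before you count on burning it. A trivial check (the path is made up for illustration):

```python
import os

# Standard CD-R data capacity, roughly 700 MB
CD_CAPACITY = 700 * 1024 * 1024

def fits_on_cd(image_path):
    """Return True if the compressed image file fits on one 700 MB CD."""
    return os.path.getsize(image_path) <= CD_CAPACITY
```

If an image comes out larger than that, you'd need to span it across multiple discs or trim the software load before imaging.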
My issue with using a hard drive as opposed to a CDRW is
that the hard drive, if attached to the network for any
purpose, is exposed to any file corruption you might
encounter across your network due to worms or viruses. A
CDRW, once you pull it out of the machine, is only
susceptible to physical dangers: breaking it, scratching
it, whatever.
Essentially, what I would envision is this: after backing
up all of your user files to one source (a CDRW, for
example), you would set up each machine "clean". You would
then create a "Ghost Image" of each machine and burn it
to a CD of its own. Then you would store those CDs in a
safe place. The only time you would ever touch them is
when the software environment for a machine changes or
when you need to restore a system. Afterwards, you would
set up one of the machines as a file server, dump all the
user files you'd backed up into this repository, and set
up network mappings on all of the stations to this file
server. At the end of each night, a backup script would
copy all of the user files. You could overwrite the old
backups every other week or so, and you'd be good to go.
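Pulling the nightly copy and the every-other-week overwrite together, one way to sketch it (folder names are hypothetical; this alternates between two backup slots so each slot gets rewritten roughly every other week):

```python
import shutil
from datetime import date
from pathlib import Path

def rotating_backup(user_files_dir, backup_root):
    """Copy the user files into one of two slots, alternating by week.

    Even-numbered ISO weeks overwrite slot "A", odd-numbered weeks
    overwrite slot "B", so you always have last week's copy intact
    while this week's copy is being rewritten.
    """
    week = date.today().isocalendar()[1]
    slot = Path(backup_root) / ("A" if week % 2 == 0 else "B")
    if slot.exists():
        shutil.rmtree(slot)  # overwrite the stale copy in this slot
    shutil.copytree(user_files_dir, slot)
    return slot
```

The two-slot rotation is the cheap version of the "copy over this every other week" idea: even if a corrupted file gets backed up one night, the other slot still holds an older, clean copy.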
Of course, if you have money to toss at the problem, you
could always just hire me.
