Ted said:
Thanks for your help; the info that comes with the monitor doesn't even talk
about any of that as far as I can see. How can I be sure that the images I see
on my computer look the same on other people's computers and, more importantly,
print out the same?
That's a subject that deserves an entire book to address,
and in fact several have been written to try and tackle it
(although none that I'm aware of get the concepts across
at a truly accessible level for most users). And it's
certainly not something that can even remotely be addressed
in its entirety in a posting here. Which, of course, will not
stop me from trying (and likely failing spectacularly!).

You can't ever be sure that the images you see are going to be the
same on other people's computers, let alone that they are
"right" (i.e., that they reflect the "real" colors of the original scene,
as if you were seeing the real thing, or at least the colors that
the creator of the original intended), without having at least three
or four key pieces of information regarding that original image:

1. What "white" is supposed to be - in other words, what color
the display should produce when all the inputs (all the primaries)
are set to their maximum allowable level.

2. What the colors of those primaries were assumed to be in
creating the original information - that is, what "color space" was
assumed by the creator, or used in the original image sensor.

3. What response or "gamma" was assumed in encoding the image.

4. To get it really right (or at least close), the intended
"brightness" (luminance) of the image, and the ambient lighting
conditions under which it was intended to be viewed.
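
To make that list concrete, here's a rough sketch in Python of the
kind of metadata I'm talking about. The names are my own invention,
purely for illustration - no real file format stores the information
in exactly this form:

from dataclasses import dataclass

@dataclass
class ImageColorInfo:
    # Chromaticity (CIE x, y) of "white" - what all primaries at
    # their maximum should produce.
    white_point: tuple[float, float]
    # Chromaticities of the R, G, and B primaries - the "color
    # space" assumed by the creator or the original sensor.
    primaries: dict[str, tuple[float, float]]
    # The response ("gamma") assumed in encoding the image.
    gamma: float
    # Intended peak brightness, in cd/m^2.
    luminance: float

(The sRGB numbers given further below would fill in every one of
these fields.)
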
There are a couple of ways that this is done in computer imagery.
The best would be to include all of the above information along
with the image data, and then also know the relevant characteristics
about the display device to be used. That's what the "ICC profile"
approach attempts to do - add information to the image data file
regarding its encoding, and then also always have "profile" information
available describing the display device such that you can translate
as best as possible between the two. Unfortunately, this approach
is rarely used to the extent that it might be, except for some high-end
or professional applications where this information is maintained
and properly used all along the process.
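
As a very rough illustration of the ICC approach, here's how you
might apply such a transform in Python with the Pillow library's
ImageCms module. The file names are placeholders - you'd substitute
your own image and your monitor's actual profile:

import io
from PIL import Image, ImageCms

img = Image.open("photo.jpg")  # placeholder file name

# An embedded profile, if present, describes the image's encoding;
# fall back to assuming sRGB when there isn't one.
icc_bytes = img.info.get("icc_profile")
if icc_bytes:
    src_profile = ImageCms.ImageCmsProfile(io.BytesIO(icc_bytes))
else:
    src_profile = ImageCms.createProfile("sRGB")

# The display profile describes the monitor; the transform then
# translates as best it can between the two.
dst_profile = ImageCms.getOpenProfile("my_monitor.icc")  # placeholder path
converted = ImageCms.profileToProfile(img, src_profile, dst_profile)
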
Another, simpler approach is just to assume a standard set of
conditions to be used throughout the chain, and try to get everything
to match that standard as closely as possible. This is what's done
in television and in such things as the "sRGB" standard in the PC
world. You standardize a set of primaries, a white point, the
"gamma" curve, and a few other things, and images will look
reasonably consistent as long as those settings are maintained.
This is currently the most widely used method in most "mainstream"
PC apps, with the "sRGB" model being among the most popular
if not the most popular. If your display is set to the sRGB specs,
then images created with that standard in mind should look "correct"
(at least as much as possible), and should be reasonably consistent
on any sRGB-compliant display. With an LCD monitor, about the
only thing the user can generally adjust to match this spec is the
white point (which should be 6500K), and possibly the gamma.
The primaries of pretty much any monitor are non-adjustable
(they're set by the phosphors or the color filters), and the only
thing left you're likely to be able to control would be the brightness
and the black level (although as already noted, the latter is
generally not controllable by the user in an LCD monitor).
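
For reference, here are the values the sRGB standard actually pins
down, expressed as CIE (x, y) chromaticities (these numbers come
straight from the standard, IEC 61966-2-1):

SRGB_PRIMARIES = {
    "R": (0.640, 0.330),
    "G": (0.300, 0.600),
    "B": (0.150, 0.060),
}
SRGB_WHITE_POINT = (0.3127, 0.3290)  # D65 - the "6500K" white
SRGB_LUMINANCE = 80.0                # cd/m^2 - the (low) reference brightness
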
The sRGB standard is very CRT-oriented, though - for one thing,
if you were to really apply it strictly, you'd be using a brightness
(luminance) setting that is unreasonably low for most LCD products
- only 80 cd/m^2 - and as already noted, most LCD products
currently available will not do a great job of matching a CRT-like
gamma (under sRGB, it's supposed to be 2.4 with a slight
positive offset).
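
In code, that curve looks like this (these are the standard sRGB
formulas, not anything of my own devising):

def srgb_to_linear(v: float) -> float:
    """Decode an sRGB-encoded value (0..1) to linear light."""
    # A short linear segment near black, then a 2.4-exponent power
    # law with that "slight positive offset" - the combination
    # behaves roughly like a plain gamma of 2.2 overall.
    if v <= 0.04045:
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4

def linear_to_srgb(v: float) -> float:
    """Encode linear light (0..1) back to the sRGB curve."""
    if v <= 0.0031308:
        return v * 12.92
    return 1.055 * v ** (1.0 / 2.4) - 0.055
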
Making the images you see on your monitor match what comes
out of the printer is a separate but similar problem, further complicated
by the fact that the two use completely different color systems
(the additive RGB vs. the subtractive CMYK primary sets), with
color spaces that don't really match up all that well.
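
The naive arithmetic relationship between the two primary sets looks
like this - and I stress "naive," since real print workflows use ICC
profiles measured for the specific ink and paper precisely because
this simple formula isn't good enough:

def rgb_to_cmyk(r: float, g: float, b: float) -> tuple[float, float, float, float]:
    """Textbook RGB -> CMYK conversion; all values in the range 0..1."""
    k = 1.0 - max(r, g, b)   # black: how far the brightest channel falls short
    if k >= 1.0:             # pure black - avoid dividing by zero
        return (0.0, 0.0, 0.0, 1.0)
    c = (1.0 - r - k) / (1.0 - k)
    m = (1.0 - g - k) / (1.0 - k)
    y = (1.0 - b - k) / (1.0 - k)
    return (c, m, y, k)
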
I'm not sure this has been all that helpful, but as noted this is an
extremely complicated subject and a problem which really has
only been partially addressed in the mainstream PC market
to date.
Bob M.