We're advised that on a Windows machine, we should set gamma to 1.8,
but on a Mac, 2.2 is the natural value.
Why?
What are the operating systems doing differently for these values to be
considered normal?
You have it backwards: it should be 2.2 for Windows and 1.8 for Mac.
Monitors need a signal encoded at a gamma of about 2.2 regardless of
platform, and a Mac and a Windows machine might drive the same monitor.
Mac video hardware adds an extra midpoint boost to bring the 1.8-encoded
image into the 2.2 range the monitor needs. This means an image prepared
for a Mac looks a bit dark on Windows, and a Windows image looks a bit
bright on a Mac.
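
Numerically, that boost is just a power-law re-encode: a pixel stored
as linear^(1/1.8) gets raised to the power 1.8/2.2, so the monitor's
2.2 decode lands back on the intended value. A minimal Python sketch
(values normalized to 0..1; the function names are mine, purely
illustrative):

    def encode(linear, gamma):
        # Gamma-encode a linear-light value in 0..1
        return linear ** (1.0 / gamma)

    def decode(encoded, gamma):
        # What the monitor does: raise the signal to its gamma
        return encoded ** gamma

    linear = 0.5                     # a midtone, in linear light
    mac_pixel = encode(linear, 1.8)  # value as stored in a gamma-1.8 file

    # Shown unchanged on a gamma-2.2 monitor: darker than intended
    print(decode(mac_pixel, 2.2))    # ~0.43 instead of 0.5

    # The midpoint boost: re-encode the 1.8 data for a 2.2 display
    boosted = mac_pixel ** (1.8 / 2.2)
    print(decode(boosted, 2.2))      # ~0.50, as intended

The midtone restored by the boost is exactly what goes missing when a
Mac-prepared image is shown without it, which is why it looks dark on
Windows.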
The reason is largely historical; it has just always been that way. The
Mac's 1.8 value was chosen to match early Mac laser printers. It made a
little more sense back then, when most early work was printed documents
and we viewed few images on screen (not until we had at least 256-color
video, and more so once we got 16-bit color). Other systems, generally
later, instead matched the image gamma to the video rather than to the
printer, which meant the printer driver had to do the adjusting. Mac
hasn't changed its system.