Digital Imaging and Color Management for Technically Curious Beginners:

Analogies with Temperature

Draft 0.1 - (22/02/07)

Color mode/color space:

- Saying that "the temperature outside is 0" (freezing) is clear and precise if you live in Europe - but probably not very precise if you live in North America. Saying "the temperature is 32 degrees" in North America is very precise, but will definitely NOT mean the same thing for a European (very hot). Unless one says the scale used (Celsius or Fahrenheit), the terms "0 degrees" or "32 degrees " are not properly defined. If you just read the number off a thermometer, without checking the scale used, you may have surprises. Fortunately, most people don't have any problems, as everyone in Europe uses Celsius and everyone in USA uses Fahrenheit. What about Japan? or India?

- It's the same situation for digital images. Files contain "24 bit" or "48 bit" color data, "8 bit/channel" or "16 million colors" - but all this is meaningless if you don't clarify what scale is used. sRGB, Adobe RGB (aRGB), etc. are scales, just like Celsius and Fahrenheit. Digital imaging people call them "color spaces" while software people often call them "color modes". The value "red=200" doesn't mean the same red in the sRGB scale as in the aRGB scale, just like "32" doesn't mean the same temperature in Celsius as in Fahrenheit. In the world of computers and the internet, it is not normally a problem because, as in Europe or the USA, everyone assumes the default scale used by everyone else, i.e. sRGB (for computers and the internet/web). Unfortunately, if you travel to Japan or India, it would be a good idea to check what scale they are using...
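Here is a minimal sketch of that idea in Python, using the published D65 conversion matrices (an illustration, not a color engine): the same stored pixel, decoded under two different scales, turns out to be two physically different reds. CIE XYZ is simply a device-independent reference in which we can compare them.

```python
# Decode the same 8-bit pixel under two different "scales" and compare
# the physical result in CIE XYZ (a device-independent reference).

def srgb_decode(v):                      # sRGB transfer curve
    c = v / 255
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def adobe_decode(v):                     # Adobe RGB (1998) gamma, ~2.2
    return (v / 255) ** (563 / 256)

SRGB_TO_XYZ = [(0.4124, 0.3576, 0.1805),
               (0.2126, 0.7152, 0.0722),
               (0.0193, 0.1192, 0.9505)]

ADOBE_TO_XYZ = [(0.5767, 0.1856, 0.1882),
                (0.2974, 0.6273, 0.0753),
                (0.0270, 0.0707, 0.9911)]

def to_xyz(rgb_linear, matrix):
    return tuple(round(sum(m * c for m, c in zip(row, rgb_linear)), 4)
                 for row in matrix)

pixel = (200, 0, 0)                      # "red = 200" as stored in the file
print(to_xyz([srgb_decode(v) for v in pixel], SRGB_TO_XYZ))    # one red...
print(to_xyz([adobe_decode(v) for v in pixel], ADOBE_TO_XYZ))  # ...another
```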

Color models (models, not modes):

Unfortunately, colors are not as simple as temperature. One number alone is not enough to fully define a color. You need several numbers at the same time, and there is no single absolute way to define a color. For example (a made-up one), a cheap printer with 3 color inks probably defines colors with 3 numbers. A high-end printer with 7 inks probably needs 7 numbers to define a color. "Color models", e.g. RGB, CMYK, etc., are scientific terms describing how many numbers you need to define a color and what each of these numbers means (which number goes to which color).
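As a sketch of what changing the model means, here is the naive textbook RGB-to-CMYK formula in Python (real printers rely on measured ICC profiles instead, but it shows three numbers going in and four coming out):

```python
def rgb_to_cmyk(r, g, b):
    # r, g, b in 0..1; returns c, m, y, k in 0..1.
    k = 1 - max(r, g, b)          # how much plain black ink to use
    if k == 1:                    # pure black: no colored ink needed
        return 0.0, 0.0, 0.0, 1.0
    c = (1 - r - k) / (1 - k)
    m = (1 - g - k) / (1 - k)
    y = (1 - b - k) / (1 - k)
    return c, m, y, k

print(rgb_to_cmyk(0.8, 0.2, 0.2))  # a red: (0.0, 0.75, 0.75, 0.2)
```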

Note that this is different from "scale". Suppliers A and B might both make ink for the same printer, but the inks might not have exactly the same intensity. Hence, a red-100 would not print as the same red with both inks, even though both are used on the same 7-color printer...

Other color modes:

- Why bother with more than one scale? Fahrenheit and Celsius are historical relics - why can't we live in a modern, standardized world for digital imaging? Well, try telling Americans to use Celsius, or Europeans to use Fahrenheit ;-) The real complication comes from the fact that different industries and sciences need different scales to work efficiently.

- People working with superconductors near absolute zero find it a lot easier to use kelvin for their calculations (that is the official standard BTW - and the unit name is lowercase "kelvin", with no such thing as a "kelvin degree", even though its symbol is a capital K). Plasma physicists prefer electronvolts (eV) to describe the temperatures inside stars...

- For similar reasons, manufacturers of color inks, color monitors, paper, film, etc. all prefer to use different scales, each better suited to accurately describing the physical properties of the materials they deal with. That's why professionals need scales such as ProPhoto RGB, aRGB, etc.

Gamut:

- "gamut" defines how "far" a color will go on a given scale (how red a red can be at the maximum red of that scale). This is a bit like kelvin versus Celsius. In kelvin, the lowest temperature is 0 degree K. It is physically impossible to go lower. In Celsius however, it is common to go below 0. In the same manner, it is not possible for colors to be darker than "pure black". This is tricky because pure black is very hard to make: your monitor turned off is very dark gray, not pure black. So "zero" is not a straight forward notion for colors.

- From a physical theory point of view, there is no upper limit to temperature, in any scale. In color **models**, there is no theoretical upper limit either (it is conceivable, in theory, to make ever redder reds). In practice, however, there is a very hard limit to the colors we can see or manufacture. It is a bit pointless to argue about colors we can't see or can't make.

- That is where color scales come in: color scales are usually devised to describe the full range of colors covered by the physical properties of a particular setup: the range of your eyes, the range of the inks, the range of the light source, etc. The sRGB scale was devised to describe the visible colors on an average computer screen (the maximum colors one can make with a decent monitor, ignoring the black issue described earlier). Adobe RGB was created to expand this color range (gamut) to include the printable colors of certain classes of printers, which are a bit different from the range of typical monitors. Some expensive high-end monitors can display Adobe RGB colors which cannot be viewed on a regular sRGB monitor.
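Here is a sketch of what "out of gamut" means in numbers, again using the standard published D65 matrices: take the most saturated red Adobe RGB can express and ask sRGB to represent it. The math asks for more red than the sRGB scale can hold, so the value must be clipped (or remapped more cleverly) before display.

```python
# Convert the maximum Adobe RGB red (linear) to linear sRGB via CIE XYZ.

ADOBE_TO_XYZ = [(0.5767, 0.1856, 0.1882),
                (0.2974, 0.6273, 0.0753),
                (0.0270, 0.0707, 0.9911)]

XYZ_TO_SRGB = [( 3.2406, -1.5372, -0.4986),
               (-0.9689,  1.8758,  0.0415),
               ( 0.0557, -0.2040,  1.0570)]

def mat_vec(matrix, vec):
    return [sum(m * v for m, v in zip(row, vec)) for row in matrix]

adobe_red = [1.0, 0.0, 0.0]  # the reddest red the Adobe RGB scale can express
srgb_red = mat_vec(XYZ_TO_SRGB, mat_vec(ADOBE_TO_XYZ, adobe_red))

print(srgb_red)  # ~[1.40, 0.00, -0.00]: sRGB cannot go above 1.0
print([min(max(v, 0.0), 1.0) for v in srgb_red])  # clipped to fit the gamut
```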

ICC profiles:

- Imagine two ovens, one made in the USA and one made in France, with dials marked in Fahrenheit and Celsius respectively. Both are also probably not very accurate: the 450 degree Fahrenheit setting might actually produce 454 degrees, while the 220 degree Celsius setting might turn out to be 235 if cheap components were used.

In order to cook the same recipe in both ovens, it would be useful to have a little chart next to each oven, telling the cook what temperature each setting on the knob really produces inside the oven.

- The same problem arises with monitors, printers, etc. Any input/output device will behave in its own peculiar way. An ICC profile is like a little chart that tells color management applications how to **really** display any given color on a specific monitor/printer, instead of just trusting the "number on the knob" (e.g. to display a red-100, the software might have to send red-104 to be correct).
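Here is a toy Python version of the "little chart" idea. The measurements below are completely made up, and a real ICC profile is far more elaborate, but the principle is the same: work the chart backwards to find what to *send* in order to get the color you *want*.

```python
# Requested red value -> red value the monitor was measured to display
# (hypothetical numbers for illustration).
measured = {0: 0, 64: 60, 128: 122, 192: 186, 255: 250}

def value_to_send(wanted):
    # Interpolate the chart backwards: what must we send so that the
    # monitor really displays `wanted`?
    points = sorted(measured.items())
    for (s0, d0), (s1, d1) in zip(points, points[1:]):
        if d0 <= wanted <= d1:
            t = (wanted - d0) / (d1 - d0)
            return round(s0 + t * (s1 - s0))
    return 255  # wanted more than this monitor can produce

print(value_to_send(100))  # ~105: send red-105 to really display red-100
```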

- While this is not a big problem in many "amateur situations", it can be very frustrating if you care about the colors of your pictures. If you and your "pro" friend bought the same monitors at a discount shop, and you send him your pictures for "improvement" in Photoshop, you are going to have surprises when the pictures come back. Even though the pictures are viewed on the same brand of monitor, they will probably look quite different on each one if you do not both use custom-made ICC color profiles.

Calibration:

If you tune (calibrate) your monitor carefully, but your mum has hers set too bright because she can't see very well, your beautiful pics will never come out right on her screen... Both calibration and profiling are necessary to manage color properly. You can be in one of three situations:

a) you have no calibration or profile. You are in 95% of the population ;-)

b) you manually/visually calibrate and profile your monitor with software utilities (e.g. Adobe Gamma). You are around 4% of the population.

c) you use a profiling device (e.g. a Pantone Eye-One Display 2) to both calibrate and profile your monitor. Congratulations, at least you stand a reasonable chance of viewing your 10 MP shots decently. Enjoy all those colors!

Color management:

- Well, it should be pretty easy now. Color management is like being a world-wide traveler, never knowing what thermometer and what temperature scale the locals use at your next destination. You need to find out the temperature there and the scale used, little charts that tell you how to compensate for the quirks of each thermometer, and, to make life easy, a standard conversion table between Celsius and Fahrenheit so that you can make sense of any number given to you.

- Similarly, with different brands of cameras, different software applications and different printing methods, you cannot know in advance what a red-100 will look like. Every device in your processing chain needs an ICC profile describing how to display arbitrary colors properly on that device. Also, the RAW data coming out of your new DSLR gives you raw numbers: red-100 as measured on the electronic sensor. So if you are going to use this image on anything other than that camera alone, it is important to decide and write down what scale you are using (called an embedded ICC profile in Photoshop, but it has nothing to do with ICC device profiles - in this case it is more like a tag than a full profile). This becomes even more important if you are going to edit or "improve" your picture in different software. Your digital image will not look the same from one application to the next if you do not indicate at the same time what scale you used (imagine exchanging cooking recipes without indicating Celsius or Fahrenheit).

- If you ever decide to use your image on drastically different media (e.g. from high-quality printouts to web pages), it is sometimes advisable to convert the scale of your picture from one to another. Photoshop does this very easily (e.g. converting Adobe RGB to sRGB). Be careful not to confuse this with *assigning* a color profile: many pictures do not have any scale embedded at all (they most likely assume a default of sRGB, but this is not always true). Photoshop can help you clarify and embed the real scale used, properly.
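The same distinction exists outside Photoshop. Here is a sketch using the Python Pillow library (the file names are placeholders): *converting* recalculates the pixel values for a new scale, while *assigning* merely attaches a label stating which scale the existing numbers are already in.

```python
import io
from PIL import Image, ImageCms

img = Image.open("photo.jpg")                # placeholder file name
icc_bytes = img.info.get("icc_profile")      # embedded scale, if any

srgb = ImageCms.createProfile("sRGB")

if icc_bytes:
    # CONVERT: recalculate the numbers so they mean the same colors in sRGB.
    src = ImageCms.ImageCmsProfile(io.BytesIO(icc_bytes))
    img = ImageCms.profileToProfile(img, src, srgb, outputMode="RGB")
# else: ASSIGN only - the numbers stay untouched; we merely declare below
# what scale they are in (guessing sRGB, the usual default).

srgb_bytes = ImageCms.ImageCmsProfile(srgb).tobytes()
img.save("photo_srgb.jpg", icc_profile=srgb_bytes)  # tag the scale explicitly
```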

I don't use all this stuff and I don’t have any problems with colors. Why?

- Well, you are probably using a DSLR shooting by default in sRGB, viewing your pictures on the same Windows PC all the time, and publishing your best shots on a web page, which you view again on the same PC all the time. Without knowing it, you are color managing - poorly - in sRGB. The industry is converging towards this default value, but it is also taking shortcuts. Try viewing your pictures on a friend's older PC for a change. Or your mum's Macintosh. Try putting an expensive LCD screen next to a decent CRT: the most expensive is not necessarily the best...

- Or perhaps you are the lucky user of a Mac, which apparently does all this for you automatically. I don’t have one so I cannot say categorically, but it certainly appears to be that way in the Mac world.

- I bought a new monitor after shooting 10 MP raw files for a year around the world (as a hobby). When I opened up my old "best" shots on the new screen, I felt like suing the camera manufacturers, monitor manufacturers and software vendors all at the same time. The issue does affect you - you just don't know it yet because you haven't replaced your PC hardware since you bought your last DSLR...

 

I would love to hear your comments/suggestions/corrections. Many advanced users feel this explanation is confusing at best, and useless at worst. Some beginners have said this was useful to them. Let me know what you think.

Email me at

virgule at virgulesystems dot com and put the words [magic] (in brackets) at the beginning of the subject of your email (spam filtering…)

 

CAUTION: this is a draft, out for comment. I make no claim to be a digital photography expert, I am not selling anything, and I will assume no responsibility if you screw up your pics while following my advice. I don't give out much advice, so I doubt you will screw up anything by reading this. I do feel, however, that it is easier than most people believe to screw up your own images yourself, on your own computer - that's why I wrote this explanation. If you want to experiment with color management but are not sure of what you are doing, there is an easy way: back up your original images, make working copies in a clearly identified folder, and play to death with those copies. Think really hard before you start doing it on your most cherished original files; it is very hard to undo a color screw-up once it's saved, and it often comes at the expense of image quality.
