Wilt wrote in post #12085366
What is hard to grasp about using a reflected light sensitometer to measure black as 0% reflectivity, measure white as 100% reflectivity, and the middle tone as having 18% reflectivity? It is no different than a measuring cup with 0 fl.oz. or full at 8 fl.oz, and half full at 4 fl.oz., is it? And such sensitometry predated digital imaging, being used for film for decades (which had their own gamma values!)
The difference is that while you might measure the 18% card at 18%, it will not measure 50%, no matter what you call it. Yet you want to call it "middle gray" and assume it somehow must be at 50%. It is not; it is 18%. The word "middle" has many meanings.
Film gamma is an entirely different concept: it is the slope of the linear portion of the film response curve, a measure of contrast. Kodachrome had more contrast than Tri-X. This is not remotely related to encoding digital data as different numbers. (Many different curves are called gamma; it is just a Greek letter, like X in algebra.)
See http://en.wikipedia.org/wiki/Gamma_correction for more about gamma.
It is really not a hard subject, yet it always seems to come across as difficult. The big boys seem to complicate it.
Trying to be helpful - a very short simple version:
Earliest work knew that CRT tubes were not linear. Only the strong (brightest) signals showed up on the face of the CRT, and dim values simply disappeared. This was not so bad for oscilloscopes, which typically showed only one signal level, but grayscale involved many different tones over a wide range, so the CRT was unacceptable for grayscale use. Of course, there was no grayscale until early television.
Earliest television solved this by encoding the transmitted signal oppositely: it boosted the weakest values more and the strong values less. This boost curve was called gamma. Each data value was raised to an exponent of roughly 1/2.2. That exponent is near 1/2, so it is close to taking the square root of each data value. Technically, strong signals were reduced more than weak ones, but in effect weak values were boosted more than strong ones.
-1 stop = (0.5 ^ 1/2.2) = 0.73, x255 = 186, 73% of 255 full scale
-2 stops = (0.25 ^ 1/2.2) = 0.53, x255 = 136, 53% of full scale
18% Gray card = (0.18 ^ 1/2.2) = 0.46, x255 = 117, 46% of full scale
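As a quick check of that arithmetic, here is a minimal Python sketch assuming a pure 2.2 power curve (real standards such as sRGB use a slightly different formula, and the function name encode_gamma is just illustrative, not anything standardized):

# Minimal sketch: encode a linear light fraction (0.0 to 1.0) with a simple
# 1/2.2 gamma curve and express it on an 8-bit 0-255 scale.
# "encode_gamma" is an illustrative name, not a standard function.

def encode_gamma(linear, gamma=2.2):
    """Apply the 1/gamma encoding exponent to a linear fraction."""
    return linear ** (1.0 / gamma)

for label, linear in [("-1 stop", 0.50), ("-2 stops", 0.25), ("18% gray card", 0.18)]:
    encoded = encode_gamma(linear)
    print(f"{label}: {linear:.2f} linear -> {encoded:.2f} encoded "
          f"({round(encoded * 255)} of 255, {encoded * 100:.0f}% of full scale)")

Run as-is, it prints values matching the three example lines above.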
That 1/2.2 gamma curve boost corresponded to roughly how much low-level signal the CRT tube lost, so the picture came out looking just about right on the screen of the CRT. Doing this encoding once in the transmitter was much better than adding circuits in EVERY television receiver to do it. Boosting the low-level signals also helped with noise on the transmitted analog signal. The data was gamma encoded. Then all a CRT had to do was display it; the losses were precomputed, and it came out right.
Computer CRT monitors had the same losses, and when computer images became popular (around 1990), this gamma became the world standard for all images. All analog TV signals and all computer digital image data were gamma encoded with an exponent of roughly 1/2.2. This compensates for the CRT losses, which roughly follow a power curve with an exponent of 2.2.
This means EVERY RGB image has been gamma encoded. The encoded data values are NOT the same as the linear data values. The histogram shows different numbers now.
Today's LCDs are essentially linear... no such losses, so they do not need gamma themselves. However, all of the RGB data in the world is gamma encoded now, digital TV signals too, so the LCD has to deal with it anyway. An LCD monitor must specifically decode gamma first, where a CRT did the same thing merely by showing it.
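As a rough illustration of that decode step, here is another minimal Python sketch (again assuming a pure 2.2 power curve as a simplification; the decode_gamma name is just illustrative) showing how a gamma-encoded 8-bit value is turned back into a linear intensity before a linear panel displays it:

# Minimal sketch: decode an 8-bit gamma-encoded value back to a linear
# intensity, the way an LCD must (and a CRT did implicitly through its own
# power-law response). The pure 2.2 exponent is a simplifying assumption.

def decode_gamma(encoded_8bit, gamma=2.2):
    """Undo the 1/gamma encode: raise the 0-1 encoded fraction to gamma."""
    return (encoded_8bit / 255.0) ** gamma

for value in (117, 136, 186):   # the encoded values from the examples above
    linear = decode_gamma(value)
    print(f"encoded {value} -> linear {linear:.2f} ({linear * 100:.0f}% of full intensity)")

This recovers roughly 18%, 25%, and 50% linear intensity from the encoded values 117, 136, and 186, closing the loop with the encoding examples earlier.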
Perhaps the day might come when we don't use gamma, but I doubt it. LCD is pretty recent, and it has to deal with the data that already exists. Printers need some degree of it too, maybe not as much, but most of it. They are programmed to accept gamma 2.2 data and adjust it to their needs. It is easier to continue than to change everything, and I mean everything.
Long story short... analog data (except television) has no concept of this gamma. The camera sensor is linear, and it has no concept of this gamma either (in the RAW data at the sensor). The word linear has two meanings in video: one is the "graph as a straight line" idea from math, but in video it mostly implies "not yet gamma encoded".
But all subsequent uses in the world do, and all RGB data is gamma encoded. All cameras and all scanners output gamma-encoded RGB images. All RGB images are gamma encoded. All histograms show gamma-encoded data. The histogram numbers are simply different from the linear numbers we might imagine. Just how life is.
It will be a good thing to believe it too. 