tonylong wrote in post #12146061
OK Wayne, thanks for engaging here!
So in the above quote, I'm not getting something. You say that our LCD monitors "undo" the gamma correction, but then the Raw processors "redo" it? Is that correct?
The "redo" is out of place here. The RAW processors "do" it the first time.
Look, it is much simpler than that. Try to imagine this is the easiest possible stuff, and look at it in the easiest possible way. It is not magic, it is simply like the words say.
I tried to write this up, at http://www.scantips.com/lights/gamma.html
Hopefully it is clear and well considered, at least in my own mind, but sometimes assumptions about what is clear do not hold for all readers. I would appreciate hearing which specific words in that posted article are not clearly understood (via PM here if you don't care to be public). It will be my fault, and maybe I can fix it, if I know which words.
But I will try hastily to summarize again here:
The light from the original scene is linear, digital camera sensors are linear, and their RAW data is linear. LCD monitors are linear too. If that were all there were to it, we would not need gamma. Hard to imagine it would ever have been invented if we were only starting out today.
But CRT monitors are not linear, and they used to be universal. CRTs require gamma-encoded data to correct their nonlinear response, to cause their output to look correct, meaning to look linear to the human eye. Without gamma correction, their images appear very dark, unacceptable and unusable, like the "RAW" image you saw. Gamma is done to all images shown on CRT displays, to correct those CRT displays.
So only for that CRT reason (which used to be a BIG deal), the world standard (like sRGB) is that all RGB data is encoded with gamma 2.2. We must use RGB data because that is how monitors work, our PC printers expect it too, and so do our eyes, which have RGB sensors of their own. I use RGB in that sense; I only intend to exclude RAW.
The act of the CRT showing it decodes it. Gamma correction is done to correct the output of the CRT screen, so at the point of showing it on the CRT, the correction is used up and gone, meaning the eye sees linear RGB again. Gamma correction caused the data emitted from the CRT screen to look normal and linear to the human eye, which expects to see the same scene the camera lens saw. Gamma makes that true for CRT monitors; not true otherwise. It is a correction for the CRT screen, so its output will appear true and linear. That is all there is to it.
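To make that round trip concrete, here is a minimal sketch in Python (my own illustration, not anything a camera or monitor actually runs), using plain 2.2 power curves rather than the exact piecewise sRGB formula:

```python
# Minimal sketch of the gamma round trip, using plain 2.2 power curves.
# Illustrative only; the real sRGB standard uses a slightly different
# piecewise formula that averages out to roughly gamma 2.2.

GAMMA = 2.2

def gamma_encode(linear):
    """Encode a linear value (0..1) the way RGB image data is stored."""
    return linear ** (1.0 / GAMMA)

def crt_response(encoded):
    """The CRT's own nonlinear response: it darkens whatever it is given."""
    return encoded ** GAMMA

scene = 0.5                    # a midtone from the original linear scene
stored = gamma_encode(scene)   # about 0.73, the brighter value stored in the RGB file
shown = crt_response(stored)   # about 0.5 again: the CRT's act of showing it decodes it

print(stored, shown)
```

Running it shows the stored value is brighter (about 0.73) while the displayed value comes back to 0.5, which is the whole point: the correction is used up by the act of display.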
So back to the camera... the sensor RAW data is linear.
But for the above CRT reason (and for sRGB too, which exists because of CRT), the camera outputs gamma-encoded data in its JPG images.
Or it can output RAW and defer the gamma encoding until the RAW processor produces output. But before that image is usable, it is gamma encoded.
By its output... I mean ANY output.
When you save it to a disk file.
Or when this program simply shows it to you on the screen. All screens expect gamma encoded data.
Any output.
Unless you tell it not to.
The CRT's act of showing it decodes it. Or LCD monitors simply "undo" it, reset it back to linear (because they don't need it). But when that light leaves either screen (towards the eye), it is linear again, one way or the other. Which is a good thing for the eye; we expect to see real-world linear data.
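As a hedged sketch of what that output step amounts to (my own toy example, not Canon's or Adobe's actual code, and assuming a plain 2.2 power curve), the converter's last step before writing or displaying an RGB value is roughly this:

```python
# Toy sketch of a RAW converter's final output step (my illustration only,
# assuming a plain 2.2 power curve; real converters use the exact sRGB
# formula and do much more processing before this point).

def raw_to_output_8bit(linear_value):
    """Take a linear sensor value in 0..1 and return the 8-bit RGB value
    that would be written to a JPG or sent to the screen."""
    encoded = linear_value ** (1.0 / 2.2)   # the gamma encoding step
    return round(encoded * 255)

print(raw_to_output_8bit(0.218))   # about 128: a dark linear 21.8% becomes a middle 8-bit value
print(raw_to_output_8bit(0.5))     # about 186
```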
It's too bad you don't have the use of DPP, because the "Linear" checkbox has me on this track -- if you click it you get the dramatically dark lower tones and just the bright tones show up as something "close to normal", although the picture as a whole tends to look like crap!
So, with the default setting (Linear "off"), this appears to be what you are calling "RGB gamma encoding", correct? But it is not the "old" gamma correction curve but something different? But it still has the effect of applying a curve to the linear data like you are describing, a "levels" curve?
I am sorry if I confused it.
There is just the one gamma. Done on all RGB data, for the CRT monitor correction.
A very crude analogy (don't pin me down): the correction is like putting a color filter on our camera, which makes wrong things right again, and our eyes like to see it that way.
OK, I follow that, but I'm trying to nail down what goes on "behind the scenes" in, say, ACR: am I correct that the Raw processor does apply a curve to the linear data? It seems that you are confirming that it does. But you are saying that it isn't the "gamma correction curve" designed for CRTs, I got that, but is it correct to describe it as a "gamma correction curve" designed specifically for digital Raw data? This is what I'm trying to nail down as an intelligent way to present this to other "laypeople".
There is just the one gamma, now that we agreed on 2.2.
The sRGB standard requires it. It assumes you are using a CRT monitor. You used to use one, you might still have one (probably someone somewhere still does), and the standards require it regardless.
But much more important, ALL of the RGB image data in the world is already gamma encoded, so it is very convenient to be able to view it. The world standards were made to require it. It's a good thing to comply.
I only say RGB to exclude RAW. RAW data is NOT YET gamma encoded.
Regardless, any image data you will ever see has been gamma encoded, for the CRT monitor correction - right up until the time that data leaves the monitor towards your eye. The CRT decodes it just by showing it. It corrects the CRT just by showing the corrected data. The LCD monitor decodes it because it knows it is there, and that your eyes have no use for it. We only see decoded data. The eye expects linear RGB data. The CRT cannot otherwise provide it, without gamma correction.
So you have this DPP program, and it has a control to show you linear data. It cannot show you RAW; your monitor has no clue how to show RAW. It can leave out the gamma encoding (so that the image looks way too dark), but it converts it to RGB to show it to your RGB monitor, and your RGB eyes.
Specifically, to beat this to death, DPP leaving out gamma to show "RAW" simply shows you the original scene's data (as RGB of course), which was not too dark in the first place. It was linear too.
But any act of showing it (RAW or not) is expected to "undo" gamma (one way for CRT, another way for LCD), so this decoding makes it darker. So the only reason you think it is darker is that it has not previously been made brighter. But your monitor does not know that, and it will decode it anyway: either a CRT by simply showing it, or an LCD because it knows it needs to.
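Here is a small sketch of why that un-encoded view looks so dark (again my own illustration, assuming the display's decode is a simple 2.2 power curve):

```python
# Why un-encoded (linear) data looks dark on a display that expects gamma
# data. Illustration only, assuming a simple 2.2 power-curve decode.

def display_decode(value):
    """What the display effectively does to whatever data it is handed."""
    return value ** 2.2

linear_midtone = 0.5
print(display_decode(linear_midtone))     # about 0.218: the midtone shows far too dark

encoded_midtone = 0.5 ** (1.0 / 2.2)
print(display_decode(encoded_midtone))    # about 0.5: properly encoded data shows correctly
```

So the linear preview is not wrong data; it simply skipped the brightening that the display's decode assumes has already happened.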
I am not sure that is helpful, but it is precisely correct.
Well, this goes back to the question of the DPP Linear control, and then to the Lightroom/ACR controls and the bit of back-and-forth I had with Wilt. In DPP I've assumed that the Linear control actually does present the "preview" of the linear data, no curve applied, and that turning it off applies the correction curve, whatever we end up calling it (I'd love a decisive name so I can pass it on to others). In ACR/Lightroom, then, I assumed that the Brightness slider set by default to 50 was applying a "simulated-gamma-correction-curve", since the Brightness control retains the White Point while boosting the midtones and shadows, although I believe it keeps the Black Point in place (so retains black "clipping").
But Wilt shot that down with the idea that the curve is being applied independent of an adjustment control. Oh well.
You are reading too many words too literally.
Best to understand the concept, and then you can say it in your own words.
Gamma is always done, yes, independent of the adjustment controls we might otherwise apply. And in the sense that our RAW processor is going to show gamma-encoded data on the screen before any file output, yes, we can say it is done first.
But if you ask DPP not to add gamma, then it won't add gamma.
But your monitor is still going to decode it, so your view will be dark then, because your monitor expects gamma encoded data.
Hope that helps.
EDIT: ADDED...
Searching Google for the key words gamma correction finds the Wikipedia article of the same name. It has faults, mostly in organization and the lack of a clear overview (and film gamma should not be mentioned in that article). It includes this good graph:
http://en.wikipedia.org/wiki/File:Gamma06_600.png
Note the bottom curve is the response curve of the CRT monitor. Specifically, the example is marked to show that an input of 50% (along the bottom scale) comes out as only 21.8% (along the left-side scale). Darker data is shown even darker, and only brighter data is shown brighter. This is the problem which CRT has, and which gamma corrects, to allow CRT monitors to be usable for images.
To make the CRT show the data properly, the RGB data is gamma encoded first (the top curve). Low-end signals are boosted more than high-end signals, and the example is marked showing that an input of 21.8% actually comes out as 50% (on the screen, again the same as linear, for the eye).
We don't need to understand the math, but it really is fairly simple...
The curve representing the CRT response is: output = input ^ gamma (where ^ means "to the power of", and gamma is 2.2).
The curve representing the correction applied to the data is: output = input ^ (1/gamma) (the opposite correction; applied together, the net response is the straight line in the middle of the graph, i.e., linear).
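A quick numeric check of those two formulas, matching the 50% and 21.8% markings on that graph (just the arithmetic, nothing more):

```python
# Checking the two curves against the 50% and 21.8% values marked on that graph.

gamma = 2.2

# CRT response curve: output = input ^ gamma
print(0.5 ** gamma)                     # about 0.218 (50% in comes out 21.8%)

# Correction curve applied to the data: output = input ^ (1/gamma)
print(0.218 ** (1 / gamma))             # about 0.5 (21.8% in comes out 50%)

# Applied one after the other they cancel, so the net response is linear.
print((0.5 ** (1 / gamma)) ** gamma)    # 0.5 again
```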
CRT monitors are no longer as significant as they once were; however, all of the world's RGB data is already gamma encoded, so we simply continue.