It has been my assumption, and I underline assumption, that the dynamic range of a RAW file is about one stop greater than that of a JPEG. As such, in a situation in which the scene's dynamic range exceeds the camera's capability, a JPEG file correctly exposed to record highlight detail will still lose roughly one stop of shadow detail compared to a similarly exposed RAW file. Am I wrong about this? Is there some algorithmic linear factor that makes RAW's one-stop advantage apply only to highlight information?
I ask this because whenever I read, "if you get the exposure right in camera, you do not need RAW," I feel the need to point out that, at one end or the other of the tonal range, you are going to lose about a stop's worth of information compared with RAW, provided, once again, that the scene's dynamic range extends beyond that of the camera's sensor. However, this may be an erroneous assumption.
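To make my mental model concrete, here is a toy Python sketch of the way I picture it: a simplified 12-bit linear "RAW" quantizer next to an 8-bit "JPEG" path that applies a black point and a gamma curve before quantizing. The bit depths, the 2.2 gamma, and the black point at 1/64 of clipping are arbitrary illustrative values I picked, not measurements from any camera, so the sketch only shows where shadow separation disappears, not how big the real gap is.

```python
# Toy sketch only: the 12-bit RAW depth, the 2.2 gamma, and the JPEG
# black point at 1/64 of clipping are illustrative assumptions,
# not measurements from any real camera or firmware.

def raw_code(x, bits=12):
    """Simplified RAW: straight linear quantization of the sensor signal."""
    return round(min(max(x, 0.0), 1.0) * (2 ** bits - 1))

def jpeg_code(x, black=1 / 64, gamma=2.2, bits=8):
    """Toy in-camera JPEG path: black point, gamma curve, 8-bit output."""
    x = (min(max(x, 0.0), 1.0) - black) / (1 - black)  # below `black` crushes to 0
    x = min(max(x, 0.0), 1.0) ** (1 / gamma)
    return round(x * (2 ** bits - 1))

# Scene tones at half-stop steps below the clipping point.
for half_stop in range(20):
    stops_down = half_stop / 2
    linear = 2.0 ** -stops_down
    print(f"{stops_down:4.1f} stops below clip   "
          f"RAW code {raw_code(linear):5d}   JPEG code {jpeg_code(linear):3d}")
```

The point of the output is just the shape of the loss: once the tone curve's black point is reached, the JPEG column sits at 0 while the RAW column keeps separating deeper tones. That is the shadow-end loss I am assuming exists; whether it really amounts to about one stop is exactly what I am asking.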
Any clarification would be great.
Thanks


