FORUMS Cameras, Lenses & Accessories Canon Digital Cameras 
Thread started 15 Sep 2015 (Tuesday) 21:35

Dynamic Range-Can't they or Won't they?

 
davesrose
Title Fairy still hasn't visited me!
4,568 posts
Likes: 879
Joined Apr 2007
Location: Atlanta, GA
     
Oct 13, 2015 04:41 |  #316

sploo wrote in post #17743421 (external link)
You mean will have completely clipped highlights and shadows, surely? Unless the camera just squashed the whole 14bit range into 8bits - which would probably look a bit odd if done in a linear fashion (very low contrast output)

I think this is where the confusion and differing terms come in between you and Wilt. You both still seem to refuse to acknowledge there's such a thing as tonal range. Read my and Andy's posts above a bit more carefully. A digital system that converts linearly (whether 32bpc, 16bpc, or 14bpc) automatically "squishes" the white and black points into an 8-bit system; it doesn't just throw out "shadows" or "highlights". When you first preview a RAW in a converter, the 8bpc image you're viewing is purely linearly converted. If you slide an "exposure" setting, you're applying a gamma curve that raises both your highlight and shadow tones. The "shadow" and "black" sliders raise gamma curves in the darker regions of the image; the "highlight" and "white" sliders raise them in the lighter regions. In the digital world, any image that's over 8bpc is considered an HDR image. Even manually adjusting the tonal range of a single RAW in ACR is considered "tone mapping". This is the best explanation I can think of (numerous times I've linked articles defining HDR as anything above 8bpc)... hopefully I'm clear enough now to lay "tonal range" to rest? Can we shake hands, stop throwing stones, have a drink, and move on now? :-)
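A minimal sketch of the linear "squish" versus a gamma adjustment, in Python. The 14-bit range and the 1/2.2 gamma here are illustrative assumptions, not the behaviour of any specific converter:

```python
# Illustrative only: a 14-bit linear value "squished" to 8 bits, first
# purely linearly, then through a gamma curve (as a converter's sliders
# effectively apply). The 1/2.2 gamma is an assumption, not a real profile.
GAMMA = 1 / 2.2

def squish_linear(v14):
    """Map a 14-bit value (0..16383) linearly onto 0..255."""
    return round(v14 / 16383 * 255)

def squish_gamma(v14, gamma=GAMMA):
    """Apply a gamma curve before quantising; gamma < 1 lifts darker tones."""
    return round((v14 / 16383) ** gamma * 255)

raw = [0, 64, 1024, 8192, 16383]
print([squish_linear(v) for v in raw])  # deep shadows land near 0
print([squish_gamma(v) for v in raw])   # the same shadows are lifted
```

Both mappings keep the black point at 0 and the white point at 255; only the distribution of tones in between changes.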

If this thread needs to get back to the DR of sensors, then the main question is how long it will be before sensors mature enough to need 16bpc ADCs. For folks concerned with HDR, that will be the next major step for DR. From the purely unbiased specs of the sensor tech, the advantage of Sony sensors is a clean noise floor, while Canons can have a higher saturation point. Canon is still catching up on lower noise floors, and Sony sensors apparently still need higher saturation points in order to go 16bpc. An ideal sensor would pair a high saturation point with a low noise floor so it could fill 16bpc.


Canon 5D mk IV
EF 135mm 2.0L, EF 70-200mm 2.8L IS II, EF 24-70 2.8L II, EF 50mm 1.4, EF 100mm 2.8L Macro, EF 16-35mm 4L IS, Sigma 150-600mm C, 580EX, 600EX-RT, MeFoto Globetrotter tripod, grips, Black Rapid RS-7, CAMS plate and strap system, Lowepro Flipside 500 AW, and a few other things...
smugmug (external link)

  
sploo
premature adulation
2,668 posts
Gallery: 5 photos
Likes: 645
Joined Nov 2011
Location: West Yorkshire, UK
     
Oct 13, 2015 07:13 |  #317

davesrose wrote in post #17743451 (external link)
I think this is where there's confusion/different terms/etc with you and Wilt. You both seem to still refuse to acknowledge there's such a thing as tonal range.

I'm quite happy with the existence of tonal range, thanks.

davesrose wrote in post #17743451 (external link)
Read my and Andy's post above a bit more carefully. In a digital system that linearly converts (whether 32bpc, 16bpc, or 14bpc)....it automatically "squishes" the white and black point to an 8-bit system. It doesn't just throw out "shadows" or "highlights". When you first preview a RAW in a converter, the 8bpc image you're viewing is purely linearly converted.

Do you mean that 0 in a 16 bit image effectively becomes 0 in an 8 bit output, and 65535 becomes 255 (with all other values scaled linearly in between)? That's quite likely for a simple conversion from a high bit image to a low bit image on a computer, but absolutely isn't the mapping that'll be happening when a raw is converted to a JPEG in-camera (it would result in a very washed out low contrast image).

When you first preview a raw in a converter I'd bet my house that the 8bpc image you're viewing is not purely linearly converted; it'll be converted using a default profile (either the raw converter's own, or one picked up from the picture profile that was selected in-camera, via the raw file's metadata).


davesrose wrote in post #17743451 (external link)
If you slide an "exposure" setting, you're applying a gamma curve that raises both your highlight and shadow tones. "Shadow" and "black" sliders just raise gamma curves in the darker regions of the image; "highlight" and "white" sliders raise them in the lighter regions.

Gamma is probably not the best term for messing with shadow and highlights sliders (one's about the "middle", the others are about the "ends"), but yes, any manipulation is changing the mapping between the input range and output range. Whether or not there's also a change in the bit depth between the input and output is orthogonal in that instance.

davesrose wrote in post #17743451 (external link)
If this thread needs to get back to DR of sensor, then the main question is how long will it be before sensors mature enough to need 16bpc ADCs. For folks concerned with HDR, that will be the next major step for DR. From purely unbiased specs of the sensor tech, the advantage of Sony sensors is that they have a clean noise floor while Canons can have a higher saturation point. Canon is still catching up with lower noise floors, and Sony sensors apparently still need to raise saturation points in order to go 16bpc. With an ideal sensor, you'd strive for a high saturation point and lower noise floor to be able to get to filling 16bpc.

I couldn't find data for the sensor in the a7R II, but numbers I've seen for the a7R indicate around 50,000e full well capacity, which would equate to around 15.6 stops if there were absolutely zero noise in the system; so, ironically, Sony could increase DR by reducing read noise too ;-)
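For what it's worth, a quick back-of-envelope check of that 15.6-stop figure, assuming the idealised definition DR = log2(full well / noise floor):

```python
import math

# Idealised definition: DR in stops = log2(full well / noise floor),
# both in electrons. With the floor pinned at a single electron, the
# ~50,000e figure gives the 15.6 stops mentioned above.
def dr_stops(full_well_e, noise_floor_e):
    return math.log2(full_well_e / noise_floor_e)

print(round(dr_stops(50_000, 1), 1))  # 15.6
# Halving the read noise buys one stop at the bottom end:
print(round(dr_stops(50_000, 4), 1))  # 13.6
print(round(dr_stops(50_000, 2), 1))  # 14.6
```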


Camera, some lenses, too little time, too little talent

  
davesrose
Title Fairy still hasn't visited me!
4,568 posts
Likes: 879
Joined Apr 2007
Location: Atlanta, GA
     
Oct 13, 2015 07:59 |  #318

sploo wrote in post #17743570 (external link)
When you first preview a raw in a converter I'd bet my house that the 8bpc image you're viewing is not purely linearly converted; it'll be converted using the default profile (either of the raw converter) or picked up from the picture profile that was selected in-camera (from the raw file's metadata).

If you're setting your color profiles to "natural", then the 8bpc image is pretty much "just" being converted linearly (it is just being squeezed).

sploo wrote in post #17743570 (external link)
Gamma is probably not the best term for messing with shadow and highlights sliders (one's about the "middle", the others are about the "ends"), but yes, any manipulation is changing the mapping between the input range and output range. Whether or not there's also a change in the bit depth between the input and output is orthogonal in that instance.

As with all the links I've posted, gamma is the term always used with HDR graphics, because tonal range is stored logarithmically. In an HDR system, the only linear element is the sensor. Maybe that's another confusion here, as it appears you and Wilt have only talked about "levels" and not gamma curves. You get much more flexibility adjusting contrast by manually adjusting gamma curves instead of just sliding white or black point sliders.

sploo wrote in post #17743570 (external link)
I couldn't find data for the sensor in the a7R II, but numbers I've seen for the a7R indicate around 50,000e full well capacity, which would equate to around 15.6 stops if there were absolutely zero noise in the system; so, ironically, Sony could increase DR by reducing read noise too ;-)

Yes, so even with an ideal exposure and no read noise, the best of the best sensors still don't reach a full 16 stops of luminance capture (or a 16bpc tonal range) ;-) DR with sensor capture gets more complicated with ISO as well: as you increase the sensor's sensitivity, you decrease its DR capture ability. The latest reviews I've seen of the A7RII vs the 5DS are interesting to me: the 5DS takes a big high-ISO DR hit with its extra MP, while Sony's backlit sensor appears not to increase base-ISO DR but does improve DR at high ISO.



  
John ­ Sheehy
Goldmember
4,542 posts
Likes: 1215
Joined Jan 2010
     
Oct 13, 2015 09:15 |  #319

sploo wrote in post #17742803 (external link)
In a nutshell: the more DR we have at capture, the better chance we have of holding detail in the highlights (by taking a darker exposure) whilst still getting acceptable quality in shadow areas.

That's all we mean by capture DR: how far below clipping we still have acceptable noise. For most of the cameras under discussion, all this means is less noise in the darkest areas. For other cameras, such as ones that have different pixels with the same color filters but different sensitivities, or ones that read different pixels with different gains, such techniques may add noise in the upper tones and/or lose resolution while increasing DR.

Some people seem to be forgetting that tomorrow's displays may be high-DR LEDs in blackened rooms or blackened head-mounted goggles which can display much more DR than we are used to now in media, and it is going to be the low DR captures that may look funny or artificial in them, once we get used to seeing high-DR images on them. Our brains develop certain expectations about specific types of display devices.




  
Wilt
Reader's Digest Condensed version of War and Peace [POTN Vol 1]
Avatar
46,469 posts
Gallery: 1 photo
Likes: 4570
Joined Aug 2005
Location: Belmont, CA
     
Oct 13, 2015 09:23 |  #320

Since we seem to have some misunderstanding of the concepts, I made this illustration...

IMAGE: http://i69.photobucket.com/albums/i63/wiltonw/Principles/dynamic%20range%20noise_zps3zfpumgv.jpg


This illustration assumes an identical light-to-signal response from both sensors, i.e. neither one is better than the other at recording a 'brighter' subject, and both have the same slope (rate of decline in response to lowering levels of light). But one (the Exmor chip) has a lower noise floor, so the non-Exmor chip's visible signal range is reduced (less dynamic range) because low-level signals are 'lost in the noise'.
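To put rough numbers on that idea (the electron counts here are purely illustrative assumptions, not measured specs of any chip):

```python
import math

# Purely illustrative electron counts, not measured specs of any chip:
# both sensors saturate at the same point; only the noise floor differs.
SATURATION_E = 60_000
NOISE_FLOORS_E = {"Exmor-like": 3, "higher-floor": 12}

for name, floor in NOISE_FLOORS_E.items():
    # Usable dynamic range, in stops, from noise floor up to saturation.
    print(f"{name}: {math.log2(SATURATION_E / floor):.1f} stops")

# An 8-electron shadow signal clears the low floor but is "lost in the
# noise" of the high one.
signal = 8
print({name: signal > floor for name, floor in NOISE_FLOORS_E.items()})
```

With the same saturation point, a 4x lower noise floor is worth exactly two extra stops at the bottom.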

You need to give me OK to edit your image and repost! Keep POTN alive and well with member support https://photography-on-the.net/forum/donate.php
Canon dSLR system, Olympus OM 35mm system, Bronica ETRSi 645 system, Horseman LS 4x5 system, Metz flashes, Dynalite studio lighting, and too many accessories to mention

  
John ­ Sheehy
Goldmember
4,542 posts
Likes: 1215
Joined Jan 2010
     
Oct 13, 2015 09:28 |  #321

davesrose wrote in post #17742838 (external link)
But the point is that there are still situations where an exposure will clip in your highlights, your shadows, or both. It was true with film, and it's true for any photographer shooting RAW: the photographer finds a full exposure range that gets as much tonal range as possible. Let's say I shoot a scene that I notice is clipped in the highlights... I then "underexpose" further to get more detail in the highlights. If my scene has a higher luminance range than my sensor is able to record, and I expose only for the highlights, I will get blocked-up shadows (and won't be able to "recover" what's not there).

RAW shadows do not "block up". RAW shadows are dynamic, jagged terrain with a mixture of signal from the subject and noise. They are farther from smooth than a sharp capture of a detailed scene in the highlights. Your signal is always there. There is nothing that is "not there" except an acceptable SNR (unless exposure is so weak that no photons hit the sensor, which is unlikely if there are any exposed areas in the frame).

With enough frames added together from a still subject, you can add what "isn't there" (and would be "blocked" in a JPEG) up into a clean summed capture, because the signal adds up while the noise cancels itself out with both positive and negative offsets from successive frames.
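A small simulation of that stacking effect, with made-up signal and noise numbers (Gaussian read noise only, still subject):

```python
import random
import statistics

random.seed(0)
TRUE_SIGNAL = 4.0    # mean electrons per frame in a deep-shadow patch
READ_NOISE = 10.0    # per-frame read noise (Gaussian sigma, electrons)

def stacked_mean(n_frames):
    """Average n noisy frames of the same still subject."""
    return statistics.mean(
        random.gauss(TRUE_SIGNAL, READ_NOISE) for _ in range(n_frames)
    )

def residual_noise(n_frames, trials=2000):
    """Empirical noise left after stacking: std-dev of many stacked means."""
    return statistics.pstdev(stacked_mean(n_frames) for _ in range(trials))

# Averaging N frames shrinks the noise by roughly sqrt(N): the signal adds
# coherently while the random offsets cancel.
print(residual_noise(1))    # ~10
print(residual_noise(16))   # ~2.5
```

A 4-electron signal that is invisible in one frame (SNR 0.4) becomes clearly measurable after 16 frames (SNR ~1.6) and clean after a few hundred.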




  
davesrose
Title Fairy still hasn't visited me!
4,568 posts
Likes: 879
Joined Apr 2007
Location: Atlanta, GA
     
Oct 13, 2015 09:47 |  #322

OK, people are still getting terms mixed up it seems (and why "measurements" in sensor DR gets more complicated)....This is the Cambridge definition of "dynamic range" as it relates to "digital photography"

"Dynamic range in photography describes the ratio between the maximum and minimum measurable light intensities (white and black, respectively)." link (external link)

A digital image is still considered the same as any other HDR medium. HDR image formats were created to have "tonal ranges" that adequately record and simulate light intensity. They are stored as 16bpc or 32bpc; the only reason RAWs aren't stored that way is that it would take up unnecessary disc space. The linear "DR" of the captured medium is the "range"/"contrast"/"difference" between your saturation point and "acceptable" noise floor (its "white" and "black" points). But the image you see (whether print or monitor) has a different range of contrast. Whether it's your camera or your RAW conversion program adjusting contrast, the software is still "squeezing" 14bpc to 8bpc. Digitally captured RAW images work exactly the same as any other graphics medium:

High Dynamic Range Imaging (external link)

It seems the current *best of the best* sensor can't capture a 16bpc tonal space yet. For computer imaging, graphics standards settled on 32bpc for precision as well as adequate tonal range for any light simulation. No matter how you cut it, any digital sensor is currently limited to a recorded DR of 14 "stops" of tone.



  
TeamSpeed
01010100 01010011
Avatar
40,862 posts
Gallery: 116 photos
Best ofs: 2
Likes: 8923
Joined May 2002
Location: Midwest
     
Oct 13, 2015 10:02 |  #323

davesrose wrote in post #17743752 (external link)
OK, people are still getting terms mixed up it seems (and why "measurements" in sensor DR gets more complicated)....This is the Cambridge definition of "dynamic range" as it relates to "digital photography"

"Dynamic range in photography describes the ratio between the maximum and minimum measurable light intensities (white and black, respectively)." link (external link)

A digital image is still considered the same as any other HDR medium. HDR image formats were created to have "tonal ranges" that adequately record and simulate light intensity. They are stored as 16bpc or 32bpc; the only reason RAWs aren't stored that way is that it would take up unnecessary disc space. The linear "DR" of the captured medium is the "range"/"contrast"/"difference" between your saturation point and "acceptable" noise floor (its "white" and "black" points). But the image you see (whether print or monitor) has a different range of contrast. Whether it's your camera or your RAW conversion program adjusting contrast, the software is still "squeezing" 14bpc to 8bpc. Digitally captured RAW images work exactly the same as any other graphics medium:

High Dynamic Range Imaging (external link)

It seems the current *best of the best* sensor can't capture 16bpc tonal space yet. For computer imaging, graphics standards settled on 32bpc for precision as well as adequate tonal range for any light simulation. No matter how you cut it, any digital sensor is currently limited to the recorded DR of 14 "stops" of tone.

LET'S NOT GET INTO THIS AGAIN... If you continue to do so, I will ask the CDS or another mod to step in. We are NOT going down this spiral again. Everyone besides you is on the same page regarding terminology for the topic at hand in this thread, and this round-and-round discussion has been had about four times now. No need to continue it.


Past Equipment | My Personal Gallery (external link) My Business Gallery (external link)
"Man only has 5 senses, and sometimes not even that, so if they define the world, the universe, the dimensions of existence, and spirituality with just these limited senses, their view of what-is and what-can-be is very myopic indeed and they are doomed, now and forever."

  
davesrose
Title Fairy still hasn't visited me!
4,568 posts
Likes: 879
Joined Apr 2007
Location: Atlanta, GA
     
Oct 13, 2015 10:14 |  #324

TeamSpeed wrote in post #17743770 (external link)
LETS NOT GET INTO THIS AGAIN.... If you continue to do so, I will request the CDS or another mod to step in. We are NOT going down this spiral again. We are all on the same page regarding terminology, besides you, for the topic at hand here for this thread, and this round and round discussion has been had about 4 times now. No need to continue it.

I don't think everyone is on the same page about terminology for HDR...the thread came back to HDR imaging with this:

Wilt wrote in post #17742395 (external link)
Dexter has been taking some recent heat for his statements, but I think the one he makes above is on target! After all, pros shooting for any print media (advertising, company brochures, 10K reports, product literature) have to reduce the DR of a scene into a range which can be offset printed -- which is barely even 6EV of DR -- regardless of B+W vs. color transparency, this has always been a limitation! So then that raises a response from folks, "But if I can capture a wider DR, I can nevertheless compress it to fit my output". That raises the reaction that you hear from others, when HDR techniques have been applied to a shot, "It (HDR) looks artificial!".

That leads me to offer this challenging question, to hear the responses:

So if DR compression (e.g. HDR) results in artificial appearance, and our media (offset printed page, our monitors, even photographic prints) are inherently 'limited DR' media, just why is it so necessary to get any more than 12EV of DR than can be accomplished today (via Sony sensor)?!

And I'll make a request of Tareg, who is uniquely outfitted to do this kind of comparison...
We hear that medium format digital has a very wide DR due to the difference in the sensor technology (CCD vs. CMOS). So...
Can you shoot a wide DR shot on both medium format and with existing Canon gear, and post any comparative sections from both which shows how the wider DR medium format shot results in any presentable (on monitor) detail which is NOT VISIBLE in the Canon shot?


But yes, I guess I will just be labeled as the "non-experienced" photographer graphics guy, and we get into asinine debates about the validity of 12-14 "stops" of "recorded" or "processed" light. I'll try to leave it be Teamspeed....



  
Wilt
Reader's Digest Condensed version of War and Peace [POTN Vol 1]
Avatar
46,469 posts
Gallery: 1 photo
Likes: 4570
Joined Aug 2005
Location: Belmont, CA
     
Oct 13, 2015 10:20 |  #325

davesrose wrote in post #17743789 (external link)
I don't think everyone is on the same page about terminology for HDR...the thread came back to HDR imaging with this:

But yes, I guess I will just be labeled as the "non-experienced" photographer graphics guy, and we get into asinine debates about the validity of 12-14 "stops" of "recorded" or "processed" light. I'll try to leave it be Teamspeed....

What was that about?! The way I see it, my chart and even YOUR definition are consistent with each other.

BTW, no one has ever mentioned '14 stops'; a 'stop' is not the same as a number of BITS (the encoding of a single color site in RAW, which was only 12 bits a decade ago).
And we all agree that, regardless of how many EV of response we can extract from a sensor, it all eventually gets compressed to fit within the 8-bit white-to-black range that JPG forces us to use. And even if we choose to use 16-bit TIFF, it still gets compressed into 8-bit space when we print or view on our monitors.



  
davesrose
Title Fairy still hasn't visited me!
4,568 posts
Likes: 879
Joined Apr 2007
Location: Atlanta, GA
     
Oct 13, 2015 10:25 |  #326

Wilt wrote in post #17743794 (external link)
What was that about?! The way I see it, my chart and even YOUR definition are consistent with each other.

BTW no one has ever mentioned '14 stops', a 'stop' is not the same as number of BITS (which is the encoding of a single-color-site in RAW, which was only 12-bits a decade ago).

It's apparent you haven't read my links on HDR :-(

So what is the topic of this thread?:-) If it's latest sensor DR technology, it seems Sony backlit sensors are improving sensitivity with high ISO for large MP sensors.



  
sploo
premature adulation
2,668 posts
Gallery: 5 photos
Likes: 645
Joined Nov 2011
Location: West Yorkshire, UK
     
Oct 13, 2015 10:55 |  #327

davesrose wrote in post #17743623 (external link)
If you're setting your color profiles to "natural", then the 8bpc image is pretty much "just" being converted linearly (it is just being squeezed).

I'll reserve judgement, but I'd be surprised. You'd end up with a very flat looking image if you just squeeze the whole range linearly.

davesrose wrote in post #17743623 (external link)
As with all the links I've posted, gamma is the term always used with HDR graphics, because tonal range is stored logarithmically. In an HDR system, the only linear element is the sensor. Maybe that's another confusion here...as it appears you and Wilt have only talked about "levels" and not gamma curves....You get much more flexibility adjusting contrast by being able to manually adjust gamma curves (instead of just sliding white or black point sliders).

I'm afraid a lot of those words together don't really make sense. Gamma is a term always used with HDR graphics? Linear element is the sensor? You and Wilt have only talked about "levels" and not gamma curves? You get much more flexibility adjusting contrast by being able to manually adjust gamma curves (instead of just sliding white or black point sliders)?

I'm not really convinced you understand the guts of what's going on with those corrections. I hate it when someone pulls the "I'm a pro photographer so I know better than you" argument, but... I'm a software engineer by trade, and I've written software to do contrast, curves, gamma, and colour correction. I do understand the guts of what's going on. At the very least, some of the terminology you're using is unfamiliar, so it's hard to know whether you're referring to the same things or not - but we have been around that particular road many times, haven't we?



  
davesrose
Title Fairy still hasn't visited me!
4,568 posts
Likes: 879
Joined Apr 2007
Location: Atlanta, GA
     
Oct 13, 2015 11:04 as a reply to  @ sploo's post |  #328

Tone Mapping (external link)

"Tone mapping is a technique used in image processing and computer graphics to map one set of colors to another to approximate the appearance of high dynamic range images in a medium that has a more limited dynamic range. Print-outs, CRT or LCD monitors, and projectors all have a limited dynamic range that is inadequate to reproduce the full range of light intensities present in natural scenes. Tone mapping addresses the problem of strong contrast reduction from the scene radiance to the displayable range while preserving the image details and color appearance important to appreciate the original scene content."

"Solutions to the tone reproduction issue have been attempted since the days of early painters. These painters only had access to the limited contrast range of available pigments. Leonardo Da Vinci resorted to using midrange colors for all objects in order to attempt to achieve the desired contrast in the image, despite this distorting the actual brightness levels [1]. The introduction of film-based photography created further issues since capturing the enormous dynamic range of lighting from the real world on a chemically limited negative was very difficult. Early film developers attempted to remedy this issue by designing the film stocks and the print development systems that gave a desired S-shaped tone curve with slightly enhanced contrast (about 15%) in the middle range and gradually compressed highlights and shadows [2]. Photographers have also used Dodging and burning to overcome the limitations of the print process [3]."

"Computer graphic techniques capable of rendering high-contrast scenes shifted the focus from color to luminance as the main limiting factor of display devices. Several tone mapping operators were developed to map high dynamic range (HDR) images to standard displays. More recently, this work has branched away from utilizing luminance to extend image contrast and towards other methods such as user-assisted image reproduction. Currently, image reproduction has shifted towards display-driven solutions since displays now possess advanced image processing algorithms that help adapt rendering of the image to viewing conditions, save power, up-scale color gamut and dynamic range."

"A simple example of a global tone mapping filter is V_out = V_in / (V_in + 1), where V_in is the luminance of the original pixel and V_out is the luminance of the filtered pixel.[2] This function maps the luminance V_in in the domain [0, ∞) to a displayable output range of [0, 1). While this filter provides decent contrast for parts of the image with low luminance (particularly where V_in < 1), parts of the image with higher luminance get increasingly lower contrast as the luminance of the filtered image approaches 1.

A perhaps more useful global tone mapping method is gamma compression, which has the filter V_out = A · V_in^γ, where A > 0 and 0 < γ < 1. This function maps the luminance V_in in the domain [0, 1/A^(1/γ)] to the output range [0, 1]. γ regulates the contrast of the image; a lower value means lower contrast. While a lower constant γ gives lower contrast and perhaps a duller image, it increases the exposure of underexposed parts of the image while at the same time, if A < 1, it can decrease the exposure of overexposed parts enough to prevent them from being overexposed."
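The two global operators quoted there are simple enough to sketch directly (a toy implementation of the quoted formulas, not code from any real converter):

```python
def reinhard(v_in):
    """Global operator V_out = V_in / (V_in + 1): maps [0, inf) into [0, 1)."""
    return v_in / (v_in + 1.0)

def gamma_compress(v_in, a=1.0, gamma=0.5):
    """Gamma compression V_out = A * V_in**gamma, with A > 0 and 0 < gamma < 1."""
    return a * v_in ** gamma

# Scene luminances spanning four orders of magnitude all land in [0, 1):
for lum in (0.01, 0.1, 1.0, 10.0, 100.0):
    print(f"{lum:7.2f} -> {reinhard(lum):.3f}")
```

Note how the first operator compresses ever harder toward the top: going from a luminance of 10 to 100 moves the output less than going from 0.1 to 1 does.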

OpenEXR HDR image format (external link)

"It is notable for supporting 16-bit-per-channel floating point values (half precision), with a sign bit, 5 bits of exponent, and a 10-bit significand. This allows a dynamic range of over 30 stops of exposure."
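That "over 30 stops" figure can be checked from the half-float limits (counting normal values only; subnormals extend the range further):

```python
import math

# IEEE 754 half precision: 1 sign bit, 5 exponent bits, 10 significand bits.
HALF_MAX = (2 - 2 ** -10) * 2 ** 15   # largest finite value: 65504.0
HALF_MIN_NORMAL = 2.0 ** -14          # smallest positive normal value

stops = math.log2(HALF_MAX / HALF_MIN_NORMAL)
print(round(stops, 1))  # 30.0 stops across the normal range
```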



  
Tom ­ Reichner
"That's what I do."
Avatar
17,636 posts
Gallery: 213 photos
Best ofs: 2
Likes: 8389
Joined Dec 2008
Location: from Pennsylvania, USA, now in Washington state, USA, road trip back and forth a lot
     
Oct 13, 2015 11:09 |  #329

sploo wrote in post #17743837 (external link)
I hate it when someone pulls the "I'm a pro photographer so I know better than you" argument....

I hate that too! I am a pro photographer, and will freely admit that 99% of what has been written in this thread is way over my head and I don't understand it at all. What's more, I think that if I did fully understand all of it, that knowledge wouldn't help me create photos that are any better than those I now create.

As soon as anyone tries to "prove" something mathematically or scientifically, I am completely lost. Photography is much more art than science; it is primarily about how things look, and that is really quite a bit more subjective than many on this board care to acknowledge. Many tend to embrace the hard, cold "facts" when they are not really what photography is about.


"Your" and "you're" are different words with completely different meanings - please use the correct one.
"They're", "their", and "there" are different words with completely different meanings - please use the correct one.
"Fare" and "fair" are different words with completely different meanings - please use the correct one. The proper expression is "moot point", NOT "mute point".

  
AJSJones
Goldmember
Avatar
2,647 posts
Gallery: 6 photos
Likes: 92
Joined Dec 2001
Location: California
     
Oct 13, 2015 11:35 as a reply to  @ davesrose's post |  #330

Dave, in the computer world you can assign any pixel any value, with precision dependent only on the bit space. The DR definition you quote has the key word "measurable", and in the real world we have to deal with noise/uncertainty in the value for a pixel.

I'll try another analogy to make the sensor DR measurement issue clearer. It requires an understanding of noise, which you always seem to ignore, but noise is central to the discussion. This will only involve sensor DR.

We all know pixels are like buckets collecting photons and turning them into electrons, and we use the rain analogy to illustrate the concept of saturation/overflow. When we want to record the rainfall, we need to weigh the bucket to see how much rain there is.
For dynamic range, the top is the weight of the full bucket; the bottom is the weight of the empty bucket.

Our weighing device has a dial with divisions that go down to 1 g.
Our bucket weighs ~1 kg and holds 2 litres of water.
The total weight of the full bucket is 3 kg.
So far it's D'oh, right? But bear with it:D

Now comes the tricky part.
The empty bucket.
The scale has divisions that represent single grams, but it is not precise: sometimes when you put the empty bucket on it reads 1.011 kg, sometimes 1.009, sometimes 1.003, sometimes 1.006, sometimes 1.010, etc. So we are simply unable to accurately measure 1 gram of water. If we added 1 gram of water to our bucket and reweighed it, it could read more or less than before we put the water in. All I can read is the variation from the "noise" in the scale. If I put in 10 grams of water, I'd be better able to convince myself and anyone else that there was a "measurable" amount of water in the bucket. If I weigh the empty bucket enough times, I will get an idea of how much variation there is. Let's say the average variation is ±5 grams in this case (a bell-shaped curve centred on the empty weight). So if I put 1 gram in, I can't get a reliable measurement. If I put in 20 grams, I will know some water was added, but there is some possible error, so I would say there are 20 ±5 grams in there. So I have a signal (20 grams), some noise (±5 grams), and a signal-to-noise ratio of 4.
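The repeated-weighing idea can be sketched in a few lines of Python (a toy simulation of my own, not anything from a real sensor; the noise level and sample count are just the numbers from the analogy):

```python
import random
import statistics

# Toy model of the noisy scale: the empty bucket truly weighs 1000 g,
# but each reading carries random error of roughly +/-5 g.
TRUE_EMPTY_G = 1000.0
NOISE_SIGMA_G = 5.0

def weigh(true_weight_g):
    """One noisy reading from the scale."""
    return true_weight_g + random.gauss(0.0, NOISE_SIGMA_G)

random.seed(42)

# Weigh the empty bucket many times to estimate how noisy the scale is.
readings = [weigh(TRUE_EMPTY_G) for _ in range(10_000)]
noise_estimate = statistics.stdev(readings)

# Now add 20 g of water: signal = 20 g, noise ~ 5 g, so SNR ~ 4.
signal_g = 20.0
snr = signal_g / noise_estimate
print(f"estimated noise: {noise_estimate:.1f} g, SNR for 20 g: {snr:.1f}")
```

Adding 1 g instead of 20 g gives an SNR of about 0.2, which is exactly the "can't get a reliable measurement" case above.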

As you have probably guessed the scale is the sensor.

Some people say "there is definitely some water in there" only when the value is twice the noise, and that's how they define the bottom of the scale's usefulness. So our scale can weigh from 2000 grams down to 10 grams before we lose confidence in the measured value. That is a range from 10 to 2000, and a "dynamic range" (the ratio) of 200.

So what do we do? We find a better scale (the weights of the bucket and water have not changed): the guy across the street has one that also reads down to 1-gram divisions. It still has variability/noise, but in his case only 2 times the value of the finest division. So when we weigh the empty bucket, we get 1.005, 1.006, 1.004, 1.007, etc., with an average variation of ±2 grams. With this scale we can confidently "measure" down to 4 grams, our range is 4 to 2000, and our dynamic range is 500.
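Both dynamic-range figures fall straight out of the same ratio. A quick sketch (numbers from the analogy above; the conversion to stops is just log base 2, which is how camera DR is usually quoted):

```python
import math

FULL_BUCKET_G = 2000.0  # top of the range: 2 litres of water

def dynamic_range(noise_floor_g, top_g=FULL_BUCKET_G, threshold=2.0):
    """Smallest trustworthy signal is `threshold` times the noise;
    DR is the ratio of the top of the range to that floor."""
    floor = threshold * noise_floor_g
    return top_g / floor

dr_old = dynamic_range(5.0)  # first scale, noise +/-5 g:  2000/10 = 200
dr_new = dynamic_range(2.0)  # better scale, noise +/-2 g: 2000/4  = 500

print(dr_old, dr_new)                    # 200.0 500.0
print(f"{math.log2(dr_old):.1f} stops")  # 7.6 stops
print(f"{math.log2(dr_new):.1f} stops")  # 9.0 stops
```

Note that the top of the range never moved; the entire improvement came from lowering the noise floor, which is the whole point of the analogy.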

None of the above discussion involves digitization, so bit space is not yet a relevant topic (it only becomes an issue when the digitized values need to be recorded: the bigger the DR, the more bits you need to report all of the information).
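To put a number on "the bigger the DR, the more bits": the minimum whole-bit depth that can report every distinguishable level is the ceiling of log2 of the DR ratio. A small illustration of my own, using the two scales above plus a 14-stop sensor for comparison:

```python
import math

def bits_needed(dynamic_range_ratio):
    """Minimum whole number of bits to encode that many distinct levels."""
    return math.ceil(math.log2(dynamic_range_ratio))

print(bits_needed(200))    # 8  -> the first scale fits in 8 bits
print(bits_needed(500))    # 9  -> the better scale needs 9 bits
print(bits_needed(16384))  # 14 -> a 14-stop sensor (2^14 levels) needs 14 bits
```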

In the discussion above, one scale (sensor) has a better DR than the other. The actual values for the capacity of the bucket may vary from sensor to sensor, but that doesn't affect the discussion. We set the weight of the full bucket as the top of the DR, and it is simply the variability (noise) in the scale readout that determines the bottom of the DR.

This is sensor DR, and all your photons are belong to us:D


My picture galleries (external link)

  

113,467 views & 127 likes for this thread, 39 members have posted to it and it is followed by 20 members.





POWERED BY AMASS forum software version 2.58, code and design by Pekka Saarinen © for photography-on-the.net
