The title may sound absurd, but bear with me for a second. I was at a concert today, wondering about the light, knowing that it decays with distance and thinking about the scene in general, and then... a doubt jumped at me:
- The light coming from a light bulb, a flash, the sun, etc. decays with distance. We all (or I assume most) know that. But...
- After it hits an object and reflects (which is the light our eyes, or the sensor, finally see), does it still decay? The probably obvious answer is that it does still decay once it is reflected from a surface. Therefore, and this is my question:
If I am at a concert in a row 200 feet away, would I need to expose my frame 2 stops brighter than a fellow who is in a row 100 feet away? And the person 50 feet away could use a shutter speed 4 stops faster than mine. Is this correct? It doesn't seem that way! It would seem logical that everyone needs the same exposure, yet the light is still traveling... my sensor can only 'see' what enters the lens, and that light is losing strength with distance.
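Just to show where those numbers come from: if reflected light really did fall off with the inverse-square law between the stage and each seat (the very premise I'm questioning), the stop difference would work out like this. A quick sketch, with a hypothetical helper name:

```python
import math

def stops_difference(d_far, d_near):
    """Exposure difference in stops between two viewing distances,
    ASSUMING intensity falls off as 1/d^2 (the premise in question).
    Intensity ratio = (d_far/d_near)^2, and each stop is a factor of 2,
    so stops = log2((d_far/d_near)^2) = 2 * log2(d_far/d_near)."""
    return 2 * math.log2(d_far / d_near)

print(stops_difference(200, 100))  # 2.0 stops: double the distance
print(stops_difference(200, 50))   # 4.0 stops: four times the distance
```

So the 2-stop and 4-stop figures above are exactly what inverse-square falloff would predict, which is what makes the "everyone uses the same exposure" reality so puzzling.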
So... what am I missing? What am I overlooking here? Does any of this make sense, does anybody even care about such things, or am I, as I have long suspected, a nerd?
Thanks for any input,