I love proposing this technical dilemma, and it's fascinating to see how many people swear they know the answer. In everyone's defence, no pro photographer I have ever asked has had the solution.
Q: Why does the inverse square law of lighting intensity apply only to source-to-subject distance, and not to subject-to-observer distance?
To explain: we all (hopefully) know that if you double the distance from a light source to its subject, the light falling on that subject drops to 1/4 of the intensity, not half. Quadruple the distance and you're down to 1/16th.
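For anyone who wants to see the arithmetic spelled out, here's a tiny Python sketch of that falloff. The function name and the sample distances are just made up for illustration:

```python
# Quick check of the inverse square falloff described above.
def relative_intensity(distance_multiple):
    """Light on the subject, relative to the original source distance."""
    return 1.0 / distance_multiple ** 2

for d in (1, 2, 4):
    print(f"Source at {d}x the distance -> {relative_intensity(d):.4f} of the light")
# 1x -> 1.0000, 2x -> 0.2500 (a quarter), 4x -> 0.0625 (a sixteenth)
```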
Let's assume you're 5 feet from the subject in your studio as the observer/photographer and the proper exposure is f/8. The light source (flash) is on a stand and will not move. Now you, the photographer, move to 20 feet away... the exposure does not change; it's still f/8. Move to 50 feet... still f/8. Don't believe me? Try it yourself.
Go across the street with a telephoto capable of shooting at f/8 and the exposure is still the same. Go 2 miles away with a long enough lens and the proper exposure is still f/8. Shoot a rock concert under stage lighting and you'll find the exposure is the same whether you're touching the stage or in the very back of the stadium.
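To make that observation concrete (this is only a rough back-of-envelope sketch, not a rigorous derivation), here's a Python toy model of the camera-end bookkeeping. Everything in it is a made-up assumption for illustration: a fixed 100 mm lens at f/8, a unit-sized subject, mixed units, and a hypothetical exposure_on_sensor function. It just tallies the light the lens collects from the subject and the area the subject's image covers on the sensor; run it and the number comes out the same at 5, 20, and 50 feet, matching the "it's still f/8" experiment above.

```python
import math

def exposure_on_sensor(camera_distance, focal_length=100.0, f_number=8.0,
                       subject_area=1.0, subject_brightness=1.0):
    """Relative light energy per unit area of the subject's image on the sensor.

    All values are arbitrary illustrative numbers; only the dependence on
    camera_distance matters here, so the mixed units are harmless.
    """
    # Light from the subject that actually enters the aperture falls off as 1/d^2...
    aperture_area = math.pi * (focal_length / f_number / 2.0) ** 2
    light_collected = subject_brightness * subject_area * aperture_area / camera_distance ** 2

    # ...but the subject's image on the sensor shrinks by the same 1/d^2 factor.
    image_area = subject_area * (focal_length / camera_distance) ** 2

    # The two 1/d^2 factors divide out, so the result has no distance in it.
    return light_collected / image_area

for d in (5, 20, 50):
    print(f"Camera at {d} ft -> relative exposure {exposure_on_sensor(d):.4f}")
# Prints the same value at every distance.
```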
The upshot of my quandary: as far as light intensity is concerned, subject-to-observer distance is meaningless.
So why doesn't the light intensity diminish as you back away from the subject? Light is light, isn't it? The laws of physics that cause light to diminish between the source and the subject (i.e., non-collimated light spreading out from the source) should apply to subject-to-observer intensity as well, correct?


