Finally we get to your key misunderstanding: the difference between the real world of sensors (which record photons and NOISE) and computer files, which are artificial. In the computer you assign values - a pixel that is "white" is given the maximum (16,000ish, say) and one that is "black" is assigned a value of 0. So you think that down in the black there is no noise. That is true in a computer synthesis of an image, but it is completely wrong when thinking about how sensors work. The bold text in your quote above is completely false - Wilt pointed that out before (#445).
The importance of understanding the concept of noise in the real-world data captured by a sensor is still something you have not grasped, and that is confusing you no end. Wilt's diagrams in #320 and my rainfall bucket analogy (#330) seem to have been ignored by you, or you do not understand them (or, as you have put it, "you show ignorance on noise"). This also leads you to call the discussion (in which I and others have tried to help you understand and learn) "asinine".
In #440 you said
I still do not understand what you mean by a "full exposure" and what the sentences mean. In a 14-bit computer file, any pixel can have an assigned luminance value anywhere from 0 to 16,383. No problem. The real world, not so much.
When we expose a pixel to incoming light, some of the photons get converted to electrons and stored until read-out. Let's say we have 200 photons landing on each pixel on average and half are converted to electrons, so we have 100 electrons on average in such a pixel. The statistics of photon counting (unavoidable in the real world) mean that about 2/3 of those pixels will have 90-110 electrons and the rest will have <90 or >110 electrons. So there is one source of noise.

When the electrons are "read out", more noise is added by the circuitry and by interference from the "electronic environment", and then they are digitized. This read noise is in addition to the statistical noise. So those pixels that all got, on average, 200 photons will be read out as roughly 80-120 (taking a spread of about two standard deviations) ± read noise. Some of the 80-electron pixels will be reported as 60 and some as 100, if the read noise is, say, 20, and some of the 120-electron pixels will be read out as 100 and some as 140. So now our incoming photon count will be reported as somewhere between 60 and 140, ready to be encoded into the digital word.
In the computer you assign all of these pixels (in a simulation) a value of 100, but those same pixels coming from a real sensor will report values anywhere from below 60 to above 140, simply because of the noise in the capture/read/digitize process. So "black" is nowhere near "no noise". In fact it is the opposite: it is where the number of photons is smaller than the noise, so we can no longer distinguish between 10 and 20 photons because the noise is too high. Thus we cannot see image detail from those photons because of the "snow" from the noise. In the terms we (but, so far, not you) have been using above, the "photon signal" gets lost in the "sensor system noise". Now it should be apparent that two sensors with different "read noise" will have different abilities to resolve differences between, e.g., 40 and 60 photons, and to distinguish such a signal from the noise.
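If it helps to see those numbers play out, here is a small Python sketch (my own illustration, not anything from any camera maker) that draws Poisson-distributed electron counts and then adds Gaussian read noise. The particular numbers (200 photons, 50% conversion, a read noise of 20 electrons) are just the ones from the example above.

```python
import numpy as np

rng = np.random.default_rng(42)
n_pixels = 100_000
mean_photons = 200
quantum_efficiency = 0.5      # fraction of photons converted to electrons (assumed, as above)
read_noise_rms = 20           # electrons, RMS read-out noise (assumed for illustration)

# Shot noise: each pixel's electron count is Poisson-distributed around the mean.
electrons = rng.poisson(mean_photons * quantum_efficiency, n_pixels)

# Read noise: Gaussian spread added by the read-out electronics before digitization.
read_out = electrons + rng.normal(0, read_noise_rms, n_pixels)

print("mean electrons:       ", electrons.mean())     # ~100
print("shot-noise sigma:     ", electrons.std())      # ~10 (square root of 100)
print("sigma after read-out: ", read_out.std())       # ~22 (10 and 20 added in quadrature)
print("fraction within 90-110 e- before read-out:",
      ((electrons >= 90) & (electrons <= 110)).mean())  # ~0.7, roughly the 2/3 quoted above
```

Every pixel in that little simulation was "assigned" the same 100-electron exposure, yet the reported values spread from well below 60 to well above 140 - which is exactly the difference between computer-assigned values and a real sensor.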
Perhaps a slightly different example will help. Back to the rain bucket:
The updated system now directly reads the amount of rain, and there is a display that goes up to 16,384.
We shine a light on the pixel and find out how bright it needs to be to read 16,384ish (the top of the detector's DR).
Now we keep lowering the brightness and the number goes down. Someone at the back of the room asks, "What happens with no light? Let's put the lens cap on."
We now look at the display and, horrors, it doesn't read "0". It is flickering around what looks like a value of 32 (going from perhaps 20 to 44 at any instant, but averaging 32). Now we take the lens cap off and slowly turn the light up from its off position. As the light gets brighter and brighter, from 0.1 to say 5, we can't tell, just by looking at the display, whether the light is on or off. Perhaps by 10 we'll be able to see a difference between turning the light on and off. By 32, it will be definite: everyone can tell when the light is on and when it is off. We have now found the bottom of the DR (from ~16,000 down to 32). The Canon display flickers around ~32 while the Sony display flickers around ~8 (relatively speaking, in this analogy). So the Sony can accurately record light levels 2 stops lower, for the same amount of noise.
That was my last attempt to explain the importance and concept of noise - how it occurs in the real world but not in simulations.
(Added: Not totally coincidentally, some real data: read noise at ISO 100 for the 5D2 is 30.6 electrons and saturation is 61,072 electrons, from http://www.sensorgen.info/CanonEOS-5D-Mark-II.html)
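Plugging those published figures into the same sort of bookkeeping (a quick check of my own; sensorgen may compute its DR figure slightly differently):

```python
import math

saturation_e = 61072      # 5D Mark II saturation at ISO 100, per sensorgen.info
read_noise_e = 30.6       # 5D Mark II read noise at ISO 100, per sensorgen.info

# "Engineering" dynamic range: saturation divided by read noise, expressed in stops.
print(f"DR ~ {math.log2(saturation_e / read_noise_e):.1f} stops")   # roughly 11 stops
```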






