With all the megapixel/AF/sensor cleaning features of the newest cameras, the recent introduction of 14 bit depth has kind of come in under the radar. I've heard from some sources that it makes no difference to image quality, while others suggest that it's easily the most important improvement of this camera generation.
If someone could point me to a source of disinterested information I'd be grateful.
In any case these are the questions I'd like to get answered:
I assume that the 14 bit advantages lie only in RAW, and only if the RAW file is decoded with a 16 bit decoder. Is this true? And if so, am I leaving quality on the table when I routinely decode my 12 bit RAW files to 8 bit?
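To put some numbers on the question, here's a quick back-of-the-envelope sketch in Python (my own illustration, not tied to any particular camera or decoder). It just counts the tonal levels each bit depth can record, and shows that decoding a 12 bit value straight to 8 bit throws away the bottom 4 bits:

```python
# Discrete tonal levels available at each bit depth.
for bits in (8, 12, 14, 16):
    print(f"{bits} bits -> {2 ** bits} levels")
# 8 bits -> 256, 12 bits -> 4096, 14 bits -> 16384, 16 bits -> 65536

# Decoding 12-bit data to 8 bits drops the low 4 bits:
raw_12bit = 2774          # a hypothetical 12-bit sensor value
as_8bit = raw_12bit >> 4  # 16 neighbouring raw levels collapse into one
print(as_8bit)            # 173
```

So a 12 bit file distinguishes 16 raw levels for every single 8 bit level, and a 14 bit file 64 — whether that's visible is exactly what the questions below are about.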
If there are advantages to 14 bit RAW files (or even my current 12 bit RAW files) when they are decoded to 16 bit, do those advantages (presumably baked in at that point) survive if my finished product is an 8 bit JPEG (the only kind there is)?
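This question has a testable core, so here's a small sketch (again my own illustration, with made-up numbers, not any real decoder's pipeline). It simulates brightening a deep shadow by three stops and then writing 8 bit output, once starting from 12 bit data and once from data that was already quantized to 8 bit before the edit:

```python
# Simulated darkest values from a linear 12-bit file.
shadow_12bit = range(128)

# Path A: brighten 3 stops (x8) in high precision, then quantize to 8-bit output.
from_12 = {min(255, (v * 8) * 255 // 4095) for v in shadow_12bit}

# Path B: quantize to 8-bit first, then brighten the 8-bit values.
shadow_8bit = {v * 255 // 4095 for v in shadow_12bit}
from_8 = {min(255, v * 8) for v in shadow_8bit}

print(len(from_12), len(from_8))  # 64 8
```

Both paths end in the same 8 bit JPEG format, but the high-bit-depth path keeps 64 distinct output levels in that shadow where the 8 bit path keeps only 8 — the advantage survives as smoother gradation, even though the container is 8 bit either way.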
In an 8x10 commercially made print (like from a Fuji Frontier), is there a visible difference between well exposed 12 bit and 14 bit images, each decoded at its optimal quality?
Would an inkjet printer that can use a higher dpi than a Frontier make better use of the 14 bits as well?
Does the payoff for 14 bit images show up when the exposure is bad, especially in the quality of underexposed shadow areas?
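One piece of arithmetic behind this question (my own sketch, assuming linear sensor data): each stop of underexposure halves the number of raw levels the image actually occupies, so the extra bits matter most exactly where exposure went wrong.

```python
# Levels remaining when a linear raw file is underexposed by N stops:
# the data only reaches up to 2**(bits - stops), so that's the level count.
for bits in (12, 14):
    for stops_under in (0, 3, 5):
        levels = 2 ** (bits - stops_under)
        print(f"{bits}-bit, {stops_under} stops under: {levels} levels")
```

Five stops under, a 12 bit file has 128 levels left to describe the whole scene, while a 14 bit file still has 512 — which is where any visible difference in pushed shadows would have to come from.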
Well, there they are: the 14 bit questions. And if you'll forgive me, I'm not particularly interested in opinions. I have lots of opinions and guesses of my own about what's going on; what I'd like now are some facts.