I was reading reviews of the T2i and I came across this...
Now, I come from the film days, and this argument seems completely bogus to me. Back in the day, it was always a challenge for the lens manufacturers to produce lenses with the resolving power to get the most out of fine-grain film, whether Kodachrome 25, Panatomic-X B&W or even Tri-X. No one criticized the film because getting the most out of it might require upgrading your lens. No one said that grainier film was better because it did not expose the flaws in the less expensive lenses people might own. They placed the criticism where it belonged... on the lens.
What has changed? Has the fact that we now depend on the camera/lens manufacturer to also produce the "film" (sensor) resulted in the manufacturers producing sub-standard lenses, since the sensor can't see the difference anyway?
It is fair to criticize the technology trade-offs; for example, the same review stated this:
I don't know about the camera shake deal... this reviewer seemed unreasonably biased against Canon throughout, and this seems like a real reach for something to criticize. But the points about dynamic range and high ISO performance may be correct, I don't know. I do know the same issues faced film manufacturers. Fine-grained film typically did not perform well when pushed to high ISO (I still want to say ASA... so you know I'm old) ratings, usually yielding contrasty, grainy images.
But the initial point... that a 12 MP sensor is better (and, hence, that the Nikon D90's sensor is better) because it lets you get away with a cheaper lens than an 18 MP sensor does... seems like misplaced criticism. That is the fault of the lens, not the sensor. In fact, what it really means is that the lower-resolution sensor caps the benefit you could get in the future should you choose to upgrade your lens.
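To put rough numbers on the resolving-power point: a sensor's pixel pitch sets a Nyquist limit on the finest detail it can record, in line pairs per millimeter. Here's a quick back-of-the-envelope sketch (the sensor widths and pixel counts are approximate published specs for these two cameras, so treat the exact figures as assumptions):

```python
# Rough sketch: estimate a sensor's Nyquist limit (lp/mm) from its pixel pitch.
# Resolving one line pair takes at least two pixels, hence the factor of 2.

def nyquist_lp_per_mm(sensor_width_mm: float, pixels_across: int) -> float:
    pitch_mm = sensor_width_mm / pixels_across  # width of one pixel in mm
    return 1.0 / (2.0 * pitch_mm)

# Canon T2i / 550D: ~22.3 mm wide, 5184 px across (18 MP) -- approximate specs
t2i = nyquist_lp_per_mm(22.3, 5184)
# Nikon D90: ~23.6 mm wide, 4288 px across (12 MP) -- approximate specs
d90 = nyquist_lp_per_mm(23.6, 4288)

print(f"T2i  ~ {t2i:.0f} lp/mm")
print(f"D90  ~ {d90:.0f} lp/mm")
```

The 18 MP sensor's limit comes out noticeably higher (~116 vs. ~91 lp/mm), which is exactly why it can reveal lens flaws the 12 MP sensor hides... and why it rewards a better lens later.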