Canon Digital Photography Forums  

Old 29th of February 2012 (Wed)   #1
gatorlink
Member
 
gatorlink's Avatar
 
Join Date: Mar 2011
Location: Los Angeles, CA
Posts: 875
Default Pixel density on APS-C sensors and diffraction

Consider the following hypothetical situation: I am shooting a fly's head with the MP-E 65mm at 4:1 magnification. The fly is dead, so it isn't going anywhere. I am in the studio with constant light, camera on a tripod, etc. I take one shot at f/11 and ISO 100 with the 450D. Then I take another shot from the exact same position at f/11 and ISO 100 with the 550D. I upload both shots onto my computer and make sure that every setting related to raw file rendering (especially sharpening!) is exactly the same. I then downsize both files to 1024 pixels on the long side using DPP. When I view the exported, 1024-pixel-wide photos at 100%, do I see a difference in sharpness?
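In case it helps anyone who tries this, here is a rough sketch of how I imagine scoring the two exports numerically instead of just eyeballing them. Everything in it is my own assumption: the filenames are made up, and variance of a Laplacian is just one crude proxy for pixel-level sharpness, not an accepted standard.

[code]
# Rough sketch only: compare the two 1024-px DPP exports with a crude
# variance-of-Laplacian sharpness score. Filenames are hypothetical.
import numpy as np
from PIL import Image

def laplacian_variance(path):
    """Higher value = more fine detail/edges at the pixel level."""
    img = np.asarray(Image.open(path).convert("L"), dtype=float)
    # 4-neighbour discrete Laplacian computed with plain array slicing
    lap = (img[1:-1, :-2] + img[1:-1, 2:] +
           img[:-2, 1:-1] + img[2:, 1:-1] - 4.0 * img[1:-1, 1:-1])
    return lap.var()

for name in ("450d_1024.png", "550d_1024.png"):
    print(name, round(laplacian_variance(name), 1))
[/code]

A blind side-by-side viewing would still be the real test, of course; this would just put a number next to it.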

The sensors in these two cameras have different pixel densities, but in theory the engineering quality of the sensors should be very close. The issue at hand is the diffraction-limited aperture (see the first table in this link for details about these and other current and recent DSLRs: http://www.the-digital-picture.com/R...ra-Review.aspx). In other words, particularly when shooting very high-magnification macro (i.e., greater than 1:1), is it actually better to have a camera with fewer pixels covering the same sensor size when it comes to photos that are printed small or re-sized? Obviously the answer will change if you start changing sensor size (i.e., let's leave the 5D out of this!), and it will also change if you view the two photos at 100% WITHOUT re-sizing them first. Let's set those issues aside. The key question is: is the extreme macro photographer who only wants to show web-resolution photos actually doing him/herself a disservice by getting the highest-density APS-C cameras out there, because diffraction starts to soften pixel-level detail at a wider aperture on the denser sensor, even after re-sizing to 1024 pixels on the long side?
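For concreteness, here is my own back-of-the-envelope arithmetic for the numbers I'm worried about. This is not the-digital-picture.com's exact DLA formula: I'm assuming green light at roughly 550 nm, approximate sensor dimensions, and the common rule of thumb that diffraction starts to matter at the pixel level once the Airy disk diameter reaches about two pixel pitches.

[code]
# Back-of-the-envelope sketch, not the-digital-picture.com's exact DLA formula.
# Assumptions: ~550 nm (green) light, approximate sensor widths/pixel counts,
# and "diffraction starts to matter" once the Airy disk spans ~2 pixel pitches.
WAVELENGTH_UM = 0.55   # 550 nm expressed in micrometres

cameras = {
    # name: (sensor width in mm, horizontal pixel count) -- approximate
    "450D (12.2 MP)": (22.2, 4272),
    "550D (18.0 MP)": (22.3, 5184),
}

for name, (width_mm, px) in cameras.items():
    pitch_um = width_mm * 1000.0 / px
    # Airy disk diameter d ~ 2.44 * wavelength * N; solve for N where d = 2 * pitch
    onset_f = 2.0 * pitch_um / (2.44 * WAVELENGTH_UM)
    print(f"{name}: pixel pitch ~{pitch_um:.1f} um, diffraction onset near f/{onset_f:.1f}")
[/code]

Either way, at 4:1 the effective aperture is far narrower than the marked f/11 anyway (roughly f/11 × (1 + 4) ≈ f/55 by the usual bellows-factor rule), so both bodies would be well past their pixel-level onset in my example; the question is just whether the extra softness survives the downsize.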

The answer to this question may come down to something I don't really understand: the algorithm most software uses to re-size photos. As a purely hypothetical example, imagine one camera literally had twice as many pixels covering the exact same sensor size. The DLA of the high-density camera would be noticeably lower than that of the low-density camera, but images from both would have to be re-sized significantly to fit into a 1024-pixel width. When downsizing, do two slightly blurry pixels on a super-dense sensor produce the same re-sized quality as one sharp pixel on a less-dense sensor? I would guess that they do, but anecdotal examples I've seen suggest otherwise.
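To make my "two blurry pixels vs. one sharp pixel" question concrete, here is a toy 1-D simulation of the kind of test I have in mind. It just uses a plain box-average downsample (real raw converters use fancier kernels like bicubic or Lanczos), and the blur width and pixel counts are made-up round numbers.

[code]
# Toy 1-D model: the same sharp edge recorded by a "dense" sensor with a touch
# of diffraction blur vs. a "coarse" sensor that is sharp at the pixel level,
# then both box-averaged down to the same output width. Numbers are invented.
import numpy as np

edge = np.repeat([0.0, 1.0], 512)            # ideal black/white edge, 1024 scene samples

dense = np.convolve(edge, np.ones(3) / 3, mode="same")   # dense sensor + mild blur (~3 photosites)
coarse = edge.reshape(512, 2).mean(axis=1)               # coarse sensor, half the photosites, no blur

out = 128                                                # common "web" output width
dense_out = dense.reshape(out, -1).mean(axis=1)          # 8:1 downsample
coarse_out = coarse.reshape(out, -1).mean(axis=1)        # 4:1 downsample

print("dense  ->", np.round(dense_out[62:67], 3))        # pixel values around the edge
print("coarse ->", np.round(coarse_out[62:67], 3))
[/code]

In this toy case the downsized edges come out almost identical (the dense-but-blurred version keeps only a few percent of residual softness in the pixels either side of the edge), which matches my intuition; whether that holds with real demosaicing and sharpening is exactly what I'd like to see tested.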

If this has already been answered, please feel free to point me in the right direction. A quick search did not turn up much. If you don't know the answer but still want to be helpful, and you own multiple APS-C cameras released within a couple years of each other, you could actually run this test and post some photos. Then you would be a hero, and who doesn't like that??
__________________
Ryan

Flickr
Facebook
Gear List
Old 29th of February 2012 (Wed)   #2
gmillerf
Member
 
Join Date: Jan 2012
Posts: 117
Default Re: Pixel density on APS-C sensors and diffraction

If you're resizing from 5000+ pixels down to 1024, the density probably isn't going to matter. You may also be biasing your results by knowing what you're looking for; a more scientific test would be to have people who don't know which image is which say which one looks better.

You also need to be aware that even for the same quality of sensor, larger pixels are more sensitive to light, and will generally have less noise.

It's also not likely to be helpful to compare images that haven't been sharpened in any way. And applying sharpening can be very subjective, so it's really going to be difficult to compare results in a truly unbiased manner.
Old 1st of March 2012 (Thu)   #3
gatorlink
Member
 
gatorlink's Avatar
 
Join Date: Mar 2011
Location: Los Angeles, CA
Posts: 875
Default Re: Pixel density on APS-C sensors and diffraction

Thanks for the input, gmillerf. I agree that any visual comparison from real cameras is likely to be flawed in at least a few ways. Ideally, what I would love is for someone to come along who has a good understanding of the physics of optics and exactly how DLA works, because I suspect he/she can simply give an academic example that negates the need for visual inspection of actual photos.
__________________
Ryan

Flickr
Facebook
Gear List
Old 2nd of March 2012 (Fri)   #4
LordV
Macro Photo-Lord of the Year 2006
 
LordV's Avatar
 
Join Date: Oct 2005
Location: Worthing UK
Posts: 51,173
Default Re: Pixel density on APS-C sensors and diffraction

Not sure if this thread on another forum will help
http://photomacrography.net/forum/vi...mited+aperture
__________________
http://www.flickr.com/photos/lordv/
http://www.lordv.smugmug.com/
Macro Hints and tips
Canon 600D, Canon 40D, Canon 5D mk2, Tamron 90mm macro, Canon MPE-65,18-55 kit lens X2, canon 200mm F2.8 L, Tamron 28-70mm xrdi, Other assorted bits
Old 2nd of March 2012 (Fri)   #5
gmillerf
Member
 
Join Date: Jan 2012
Posts: 117
Default Re: Pixel density on APS-C sensors and diffraction

Quote:
Originally Posted by gatorlink View Post
Ideally, what I would love is for someone to come along who has a good understanding of the physics of optics and exactly how DLA works, because I suspect he/she can simply give an academic example that negates the need for visual inspection of actual photos.
I doubt that. I know exactly enough about optics to shoot myself in the foot, but I know this problem is too complex to be solved in a general sense without limiting the discussion to an exact set of hardware.

There are actually different methods for calculating the diffraction limit; the simplest is based on the Airy disk. That method only describes the ability of a telescope to determine whether two stars of the same brightness are in fact two stars, as opposed to appearing to be one. It does not mean that the images of the two stars are perfect and non-overlapping. So any attempt to use this method to compute a diffraction limit is useless for evaluating image quality. And unless the calculator you're using asks for the distances between optical elements, it's using the Airy method (or something similar).
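For reference, the textbook version of that simple method is the Rayleigh criterion and the Airy disk diameter it's based on. These are standard formulas and assume nothing about any particular camera or lens.

[code]
% Rayleigh criterion: the smallest angular separation at which two equally
% bright point sources are just resolved by an aperture of diameter D:
\theta_{\min} \approx 1.22\,\frac{\lambda}{D}

% Projected onto the sensor, the first dark ring of the Airy pattern has a
% diameter that depends only on the wavelength and the working f-number N:
d_{\mathrm{Airy}} \approx 2.44\,\lambda N
\quad\text{e.g. } 2.44 \times 0.55\,\mu\mathrm{m} \times 11 \approx 15\,\mu\mathrm{m} \text{ at } f/11
[/code]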

Other methods for computing diffraction can be better, but they are useless without very specific information about the entire optical system. One of the big details you need to know is the angle of a given pixel from the center of the field stop: diffraction doesn't affect all pixels equally, and which pixels are affected, and by how much, is determined by the size of the field stop opening and its distance from the sensor. Complicating matters is the fact that the stops in lenses aren't circles; they are usually made of six or so metal blades with straight edges, so you'd need an algorithm specifically designed for that case.

And even if you knew that information, it only accounts for one source of diffraction. Each element of a lens will also cause diffraction, and the diffraction patterns of the different elements will interact with each other. So, at a minimum, you'd need to know the exact size and spacing of every element in the optical system just to get started. Then you can move on to what the glass elements themselves are made of and how that affects diffraction (the manufacturer will never give you this information).

Now that you've got all that, you need to account for the fact that diffraction affects different wavelengths of light by different amounts, and for the fact that the sensor in your camera is really just a black & white sensor with a Bayer array laid over top of it. Longer (red) wavelengths are spread more by diffraction than shorter (blue) ones, but most Bayer arrays have green photosites packed more closely together than red or blue ones, so you're likely to see diffraction on green detail before you see it on blue.
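A quick illustration of the wavelength part, using the same rule-of-thumb Airy diameter as above (2.44 × wavelength × f-number); the wavelengths are just nominal values for each channel, not real channel responses.

[code]
# Rule-of-thumb Airy disk diameter per colour channel at a given f-number.
# Nominal wavelengths only; real channel responses are broad, not single lines.
N = 11  # working f-number
for colour, nm in (("blue", 450), ("green", 550), ("red", 650)):
    d_um = 2.44 * (nm / 1000.0) * N    # convert nm to um, then Airy diameter in um
    print(f"{colour:5s} ({nm} nm): Airy disk ~{d_um:.1f} um at f/{N}")
[/code]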

I actually had the rare opportunity to see the diffraction model of a 24" Celestron Ritchey-Chrétien telescope when I met a Celestron engineer installing one at the University of Louisville's Moore Observatory. Even though all of the optical elements are round, I was surprised to see that the diffraction pattern was actually more like an elongated figure 8. And it took his computer a good 5 or so seconds just to render it, even though there are only about three optical elements.

So, needless to say, computing an actual diffraction limit is probably far beyond what can be done at an amateur level. You can put upper limits on it, but it's impossible to know which parts you can ignore without actually measuring and computing them. I think the short answer is: if you have to resort to such elaborate calculations to determine whether one image is better than another, then it probably doesn't matter. If you can't actually *see* a difference, who cares?

Just a final note. You can actually measure the diffraction of an optical system using a technique called flat fielding. The general idea is to take a photo of something that's perfectly evenly illuminated and then measure the differences in pixel values across the sensor. Even though the concept is simple, actually doing it is pretty tough. Just getting something evenly illuminated is hard enough (a lot of people use the sky, and it's good enough). And then there are a lot of other factors that cause variation in pixel values, like noise and dust, so it will be hard to tell whether you're actually seeing the diffraction pattern or something else.
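If anyone wants to try it, the processing side of flat fielding is only a few lines; the hard part is the evenly lit target. The sketch below assumes you've already loaded a stack of frames of that target into a 3-D array (frame × height × width), which is left as an exercise.

[code]
# Minimal flat-fielding sketch. Assumes `flat_frames` is a 3-D numpy array
# (n_frames x height x width) of exposures of an evenly illuminated target.
import numpy as np

def flat_field_map(flat_frames):
    master = flat_frames.mean(axis=0)       # average the stack to suppress noise
    return master / np.median(master)       # normalise so 1.0 = typical pixel response

# Correcting a normal exposure is then a per-pixel division:
#   corrected = light_frame / flat_field_map(flat_frames)
# Whatever is left in the map -- vignetting, dust, or the effects being
# discussed here -- shows up as departures from 1.0.
[/code]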
Old 2nd of March 2012 (Fri)   #6
gatorlink
Member
 
gatorlink's Avatar
 
Join Date: Mar 2011
Location: Los Angeles, CA
Posts: 875
Default Re: Pixel density on APS-C sensors and diffraction

Thanks, Brian, for that helpful link, and thanks, gmillerf, for the detailed response. Both of your posts helped me clear up my rather muddy thinking on this issue and made me realize the ways in which I was missing the forest for the trees. It occurred to me that there is a simpler way to think about this than I had seen before. If you take a lens off a camera and set it to f/11, the image coming through the lens is the same no matter what camera you attach it to. That is a constant. Of course, an APS-C camera will record less of that image than a 35mm frame, but we're only talking APS-C right now, so that doesn't matter. The DLA of a high-density APS-C sensor is lower than the DLA of a low-density sensor simply because the high-density sensor can pick up the finer loss of resolution that occurs as the lens is stopped down.

These facts may seem perfectly obvious to you guys, but they weren't obvious to me until now. Thinking about the issue this way makes things simpler. Basically, given that an identical image is hitting both a high-density and a low-density sensor, the question is whether the two recorded images will look noticeably different after being downsized to 1024 pixels on the long side. gmillerf, as you mentioned, actually calculating some sort of "absolute" difference is very complex and really not worth doing. I also agree that what really matters is whether a difference is obvious to the eye. I suspect that in my example the difference would not be obvious, if it were detectable at all. I think people generally agree that an image from a low-density sensor will have less noise than one from a high-density sensor even after downsizing, and this can make a noticeable difference at high ISOs, but I'm not concerned about that. I'm generally shooting at ISO 100 or 200 when working at greater than 1:1, so better performance at ISO 3200 is not of much interest to me.

Thanks again for your help, folks
__________________
Ryan

Flickr
Facebook
Gear List