It's funny how much you notice sharpening once you actually start paying attention to it. I remember when I had no idea what it did or how to use it, let alone what a term like "capture sharpening" meant. Heck, with my Kodak point-and-shoot 12 years ago, the only "adjustment" I ever made to an image was zooming it in and out on the screen!
So here I am today, a 33-year-old chap with ADHD, OCD, and enough Photoshop sliders and gizmos to keep my constantly revving mind cranking for eons, not to mention enough literature out there to keep me studying until I'm two feet in the grave with my elbows on banana peels.
So, my latest little fixation is this whole concept of matching your sharpening radius to the finest detail the human eye can actually detect. I'm not sure whether it was Bruce Fraser who first introduced the concept, but he certainly made it well known.
The theory makes complete sense to me: pixels get crammed together at different "tightnesses" depending on how much cramming your output device does. So the radius in pixels is just the output resolution multiplied by the detail size in inches: if you want a 1/100 in. radius, that works out to 3 px at 300 ppi, 1 px at 100 ppi, 2.7 px at 270 ppi, and so on. And now, my curiosities:
1) 1/100th of an inch, the commonly recommended figure, seems awfully large. For a print on a high-end Epson at 360 ppi output, that would mean a radius close to 4 pixels (3.6, according to the formula)! I'm in no way a printing guru, but isn't that an incredibly large radius?
2) Consider this: say you're viewing an image on a monitor that, for the sake of simplicity, displays 100 ppi (a reasonable average these days). If you sharpen for the web/that monitor, you might use the radius that I and many others use: 0.3 px. I find this gives quite a nice, punchy, definitely noticeable effect. But given everything so far, that would mean your eyes and mine are making out a tremendous amount of detail at 1/333 of an inch (0.3 px ÷ 100 ppi = 0.003 in.), more than three times finer than the aforementioned 1/100th!
Now, I know monitors, printers, and other output devices differ, that every image is different from the next, and that ink spreads on paper. But I'm talking quantifiable numbers and mathematics here: a pixel is a pixel, an inch is an inch, and 360 multiplied by 1/100 is 3.6. These aren't really subjective measurements, so I'd reckon there should be some truth to at least parts of this. What do we think?
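
Just to double-check my own arithmetic, here's a quick Python sketch of the rule as I understand it. The function names and the 1/100 in. default are purely my own labels for the idea, not anything official from Fraser or anyone else; the numbers are the same examples from above.

```python
# A rough arithmetic check of the "radius in pixels = output ppi x detail size
# in inches" idea from above. Function names and the 1/100 in. default are
# just my own labels for the concept.

def radius_for_output(ppi, detail_inches=1/100):
    """Sharpening radius (px) at a given output resolution, assuming the
    finest detail worth targeting is detail_inches across."""
    return ppi * detail_inches

def detail_for_radius(radius_px, ppi):
    """Work backwards: the detail size (inches) that a given radius
    corresponds to at a given output resolution."""
    return radius_px / ppi

# Curiosity 1: the 1/100 in. rule at 360 ppi gives a surprisingly big radius.
print(radius_for_output(360))        # 3.6 px

# Curiosity 2: a 0.3 px radius on a ~100 ppi monitor implies detail around...
print(detail_for_radius(0.3, 100))   # 0.003 in., i.e. roughly 1/333 in.
```

Same numbers as in my two questions above, so at least the math part seems to hang together; it's the 1/100 in. assumption itself I'm wondering about.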


