In my experience, the whole diffraction issue is overplayed. Yes, if you pixel-peep a comparison of two identical shots, one at f/11 and one at f/22, you will indeed see a difference, but you wouldn't look at the f/22 picture on its own and declare it "soft". I've taken many shots at f/22 or even f/32 and they've been fine. At the same time, if I want ultimate sharpness, then yes, I might be drawn towards f/8 or f/11. The best thing is to try it yourself and see what you're happy with in practice.
That is the truth of it. There is way too much angst over diffraction these days, primarily because digital makes pixel-peeping so easy, even though the difference almost never matters at the size the picture will actually be displayed.
Frank is right about the mechanics, and sensor size doesn't change them: f/32 on a lens in front of an 8x10 sheet of film produces a diffraction blur exactly as large, in absolute terms, as f/32 in front of a 24x36mm sensor. (What differs between formats is how much that blur gets enlarged on the way to the final print.)
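To put a rough number on that, here's a quick back-of-the-envelope sketch of mine (not from Frank's post) using the standard Airy-disk formula d = 2.44 * lambda * N. Note that the f-number is the only photographic variable in it; sensor or film size never enters. The 550 nm wavelength is just a typical mid-visible green.

```python
def airy_disk_diameter_um(f_number, wavelength_nm=550):
    """Diameter of the Airy disk (first diffraction minimum) at the
    image plane, in microns: d = 2.44 * lambda * N.
    550 nm is an assumed mid-visible wavelength, illustrative only."""
    return 2.44 * (wavelength_nm / 1000.0) * f_number

# Same result whether the lens sits in front of 8x10 film or a 24x36mm sensor.
for n in (4, 8, 11, 22, 32):
    print(f"f/{n}: {airy_disk_diameter_um(n):.1f} um")
```

At f/8 the disk works out to roughly 11 microns, and at f/32 roughly 43 microns; whether that matters depends entirely on pixel pitch, enlargement, and viewing distance, which is the point of the posts above.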
This is a matter of degree, and like every other choice in photography, a matter of trading one factor against another. Once you accept that diffraction is present to some extent at every aperture, and that several factors can mitigate how visible it is, you can settle down and start thinking about which factors matter for your image and how they balance.
So if I carefully compare the finest detail at the focus plane of an image shot at f/4 with the same image and sensor shot at f/22, I may see that the f/22 image is very slightly degraded. But if I'm shooting a group of people that needs the depth of field of f/22 to get everyone in focus, it would be stupid to shoot at f/4--yes, that one person would be slightly sharper, but everyone else would be totally blurry.
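A rough depth-of-field calculation makes that trade-off concrete. This is my own sketch, not anything from the post above: it uses the common approximation DoF = 2*N*c*s^2/f^2 (valid when the subject is well short of the hyperfocal distance), with an assumed 50 mm lens, a group about 3 m away, and the conventional 0.03 mm full-frame circle of confusion.

```python
def depth_of_field_m(f_number, focal_mm, distance_m, coc_mm=0.03):
    """Approximate total depth of field in metres, via
    DoF ~= 2*N*c*s^2 / f^2 (subject well inside the hyperfocal distance).
    coc_mm = 0.03 is a conventional full-frame circle of confusion."""
    s_mm = distance_m * 1000.0
    return 2.0 * f_number * coc_mm * s_mm ** 2 / focal_mm ** 2 / 1000.0

# Hypothetical group shot: 50 mm lens, subjects spread around 3 m away.
for n in (4, 22):
    print(f"f/{n}: about {depth_of_field_m(n, 50, 3):.1f} m of depth of field")
```

With these assumed numbers you get roughly 0.9 m of depth at f/4 versus roughly 4.8 m at f/22, so for a deep group the slight diffraction penalty at f/22 is obviously the better end of the bargain.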
And depending on the actual final display, there might not be any discernible diffraction difference at all, even in a direct comparison. Uncropped images shown on a digital monitor, or on prints even up to 30x or 40x, probably won't show a difference even head-to-head--the difference simply isn't resolved in the display medium.