If you're interested in a high frame rate for high-speed shooting, the buffer fills more quickly at Superfine/Large quality. If I remember correctly (from testing with various qualities), the frame rate never decreases on Fine/Large (okay, I only shot about 60 seconds' worth, so maybe 'never' is an exaggeration, but I suspect the buffer would have filled by then if it ever was going to), whereas at Sf/L it slows down after about 4 seconds (again, from memory). Also, after a burst of pictures it takes longer to flush the buffer to the card, and some operations are limited during that time.
If you don't use that feature much, and you've got a big card, I'd say Sf/L is a good default.
I'd like to compare the three compression levels on the G6 to see if I can determine what percent compression each one uses--I just haven't taken the time to do so. It should be easy: take four identical shots, one RAW and one at each of the three compression levels (all Large). Then convert the RAW to JPEG at varying compression settings and compare both file size and (harder to judge) image quality against the in-camera versions. Sort of academic, I suppose, but I'd just like to know whether Superfine is 70%, Fine is 60%, and Normal is 40% (or whatever they really are).
Don't suppose anyone else has already figured that out?
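In case anyone wants to try it, here's a minimal sketch of the file-size comparison in Python using Pillow. Assumptions: the RAW frame has already been converted to a lossless TIFF (the filename is hypothetical), the byte counts in camera_sizes are placeholders to be filled in from your own test shots, and Pillow's 1-100 quality knob is only a stand-in for whatever scale Canon actually uses internally, so any match is ballpark at best.

import os
from PIL import Image

# Hypothetical filename: the RAW frame, converted to a lossless TIFF first.
reference = Image.open("g6_raw_converted.tif")

# In-camera JPEG sizes in bytes -- placeholders, fill in from your own shots.
camera_sizes = {"Superfine": 3_400_000, "Fine": 2_000_000, "Normal": 1_100_000}

# Re-encode the reference at a sweep of quality settings and record sizes.
for q in range(30, 100, 5):
    out = f"requant_q{q}.jpg"
    reference.save(out, "JPEG", quality=q)
    size = os.path.getsize(out)
    # Flag any in-camera setting whose size is within 10% of this encode.
    matches = [name for name, s in camera_sizes.items()
               if abs(size - s) / s < 0.10]
    print(f"quality={q:3d}  size={size:>9,d}  ~matches: {', '.join(matches) or '-'}")

File size only gets you the compression ratio, of course; the image-quality half of the comparison still has to be done by eye.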