Right, it's weird. It's not really good or bad; they've basically just fudged the standards, which is generally a bad thing, but with film out of the equation you're still getting good photos, so does it really matter?
Maybe the way they're looking at it is by the end result of the standards instead of the standards themselves... if their (indicated) 50 gives you the noise you'd expect from (standardized) 100, then why not call it "50"? Many amateurs think of ISO as "how grainy the film is" rather than "how sensitive the film is."
From what I understand, the ISO (the International Organization for Standardization) designed its standards (if you're as old as I am, which is only 36, you still call it "ASA" all the time) so that all film and camera manufacturers' products would be comparable. Now that film is out of the equation, ISO is sort of moot, and Canon decided to optimize their settings for the best photos possible, then assign "ISO" numbers to them as a reference for people who understand photography.
I bet in the next few years, sensor sensitivity becomes far more variable and the ISO standards get thrown out entirely. Why have four settings when you could have 12? With ever-improving noise reduction and metering, why not throw ISO into the mix with aperture and shutter speed to come up with the perfect photo? For that matter, why are cameras still using preset apertures and speeds? Couldn't a computer find points within those "notches" that would result in better photos? If 1/60 will overexpose just a hair, why not use 1/65?
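Just to show the arithmetic behind that last idea (a hypothetical sketch in Python, not anything any camera maker actually does): the standard exposure-value relation is already continuous, so in principle a camera could solve for an exact shutter time instead of snapping to the nearest preset notch. The function name and the EV 12 example scene are just made up for illustration.

    import math

    def exact_shutter_time(ev100, f_number, iso):
        # Standard exposure-value relation: EV = log2(N^2 / t),
        # where EV is referenced to ISO 100 and shifts by
        # log2(S / 100) for other sensitivities.
        ev = ev100 + math.log2(iso / 100)
        # Solve EV = log2(N^2 / t) for the exact shutter time t:
        return f_number ** 2 / (2 ** ev)

    # A metered scene of EV 12 at ISO 100, shooting at f/8:
    t = exact_shutter_time(ev100=12.0, f_number=8.0, iso=100)
    print(f"exact shutter: 1/{1 / t:.0f} s")  # ~1/64 s, between the 1/60 and 1/125 notches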
But I really don't know much about this stuff; I'm just guessing at what Canon was thinking, and writing science fiction. I'm sure Robert Lay will kick in a couple hundred words that will be far better researched than these.