pknight wrote in post #17068057
It is difficult to judge how popular eye-control actually was based on forum posts, but I have to assume that there was some serious problem with it for Canon to drop what seems to be an obviously useful feature. Eye-tracking technology (in non-photography applications) has improved a lot, so perhaps someone will bring it back. However, it may also be that this level of accuracy is cost- or technology-prohibitive for including in a camera.
I suspect that with 40+ focus points on many cameras, there may be problems with eye tracking making reliable distinctions between adjacent points. The old EOS eye-control system operated with a small number of points, which probably made it more reliable. If there is any advantage to having dozens of focus points, useful eye control would have to detect fine differences in where the eye is looking reliably enough to realize that advantage. Given the 100% reliable (albeit more complex) joystick control of focus points we have now, any inability of eye control to accurately utilize all of the points will probably limit its acceptance.
There's an entire thread about ECF, and a few of us have done some extensive research into it. To start, the EOS 3 had 45 AF points and worked perfectly well. It was much faster and more accurate (for me) than the joystick and wheels are on my 7D. With those, you either have to go slowly enough to count clicks or individual pushes of the joystick for each point of movement, or you risk overshooting or stopping short of your intended point. I can do it fairly quickly on my 7D with just 19 points, and I could probably manage on a 1D or 5D III as well, but compared to ECF on the EOS 3 it's laborious. The true value of ECF increases as the number of AF points increases.
From what we could deduce from a variety of sources, the problem was one of sampling frequency. Everyone's eyes constantly make tiny, rapid, involuntary movements, so the point of gaze jitters around even during a steady fixation. The ECF system sampled the gaze position at a fixed frequency and averaged the samples; calibration determined where that average sat for a given user. But if the user's eye-movement frequency varied, or if it came too close to the sampling frequency itself (classic aliasing), the average could be thrown off. Glasses could sometimes be a culprit as well, but that had more to do with the prescription and the distance of the eye from the eyecup.
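To make the aliasing point concrete, here's a minimal sketch. This is my own toy model, not Canon's actual algorithm: a one-dimensional gaze signal that "vibrates" sinusoidally around a fixation point is averaged from samples taken at two different rates. The function names and the 72 Hz tremor figure are illustrative assumptions (ocular microtremor is typically in that neighborhood, but the ECF hardware's real numbers aren't public).

```python
import math

def gaze_x(t, tremor_hz=72.0, amp=0.5):
    """Toy 1-D gaze position: true fixation at 0.0 plus a
    sinusoidal 'tremor' (illustrative numbers, not Canon's)."""
    return amp * math.sin(2 * math.pi * tremor_hz * t)

def averaged_fixation(sample_hz, n=100):
    """Sample the gaze n times at a fixed rate and average,
    standing in for the ECF calibration average."""
    return sum(gaze_x(i / sample_hz) for i in range(n)) / n

fast = averaged_fixation(1000.0)  # well above the tremor frequency
slow = averaged_fixation(73.0)    # dangerously close to it
# The fast estimate lands near the true fixation (0.0); the slow
# one is visibly biased, because sampling near the tremor frequency
# aliases the tremor down to a slow drift that doesn't average out.
assert abs(fast) < abs(slow)
```

The same effect is why a user whose eye "vibration" happened to sit near the sampler's rate would see calibration wander while everyone else's worked fine.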
But those cases weren't that common. The various sources we've been able to find stated problem rates anywhere from 3-7% of users. That's no worse than the rate at which people struggle with servo focus mode and other such features.
The product itself wasn't pulled. Cameras with ECF were manufactured until about 2007, when Canon got out of the prosumer 35mm camera business. So just seven years ago you could buy a new EOS 3 with 45 AF points and ECF. In all, Canon made ECF cameras for about 12 years, which is hardly the mark of a failed product.

The problem appears to have had more to do with processing. ECF was processor-intensive for the chips of the time, but a film camera didn't have much else to process. Once the move was made to digital, there was both a space crunch for the electronics and a processor crunch, with the processors barely able to keep up with the tasks they already had. Throwing ECF on top of that would have been nearly impossible. And if you were going to continue the product, you would want an improved version to reduce the number of users with problems.

Cost was another factor. Digital sensors meant camera lifecycles were shorter while expenses were greater, and digital bodies were already much more expensive than film cameras before adding the cost of ECF, which was an option you paid extra for. It was also a battery hog. In the end, it didn't make much sense to add it to digital cameras.
Fast forward to now. The megapixel race has slowed. Mobile processors have become incredibly powerful and cheap, the electronics have become smaller, and battery tech has come a long way. ECF would cost less to implement now and would use a much smaller fraction of the camera's power and resources. Sampling frequencies could be increased tenfold without needing more powerful processors, and algorithms could analyze eye movement in real time to correct for things that just weren't possible to handle before. You could use techniques such as variable sampling rates, which would virtually eliminate the sampling problem. And with a joystick, you would have instant override of a bad AF point selection with a simple nudge of the stick.
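On the variable-sampling-rate idea, here's a sketch of why it helps. Again this is my own toy model with an assumed 72 Hz tremor, nothing from Canon: each sample time is randomly jittered within its sample period, which decorrelates the samples from the tremor, so the average stays near the true fixation even at a nominal rate that aliases badly with evenly spaced samples.

```python
import math
import random

def gaze_x(t, tremor_hz=72.0, amp=0.5):
    """Toy 1-D gaze position: true fixation at 0.0 plus tremor."""
    return amp * math.sin(2 * math.pi * tremor_hz * t)

def fixed_rate_average(sample_hz, n=100):
    """Average of evenly spaced samples (the old fixed-rate scheme)."""
    return sum(gaze_x(i / sample_hz) for i in range(n)) / n

def jittered_average(sample_hz, n=100, seed=1):
    """Same nominal rate, but each sample time is randomly offset
    within +/- half a sample period -- a crude 'variable rate'."""
    rng = random.Random(seed)
    period = 1.0 / sample_hz
    total = sum(gaze_x(i * period + rng.uniform(-0.5, 0.5) * period)
                for i in range(n))
    return total / n

# At a nominal rate right next to the tremor frequency, fixed
# spacing aliases badly; jittered spacing averages the tremor out.
assert abs(jittered_average(72.5)) < abs(fixed_rate_average(72.5))
```

A real implementation would presumably do something smarter (adaptive rates, real-time spectral analysis of the eye signal), but even this crude jitter shows how cheap extra processing makes the old failure mode avoidable.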
When it comes down to it, if Canon finds themselves in need of a differentiator, this could very well do the trick. The only reason not to do it now would be to hold the technology back until such a differentiator is needed, and now, with a 7D II on the way and the 7D line known as the engineer's plaything anyway, seems like the right time and the right camera for it. If it isn't in this one, ECF will probably lie dormant until a Mark III or other replacement comes around.
I am serious....and don't call me Shirley.
Canon 7D and a bunch of other stuff