When technologists talk about next-generation interfaces, they often focus on voice-, touch-, and gesture-based technologies. Inevitably, their comments contain words like “natural” and “intuitive.” But all too often, they leave out the human body’s most fundamental interface — the eyes.
Quantum Interface (QI) hopes to remedy this with EyeZ, a technology that combines eye-tracking sensors, motion controllers, and predictive motion algorithms to make it possible to navigate even complex interfaces with a glance. What makes EyeZ stand out among the latest crop of “natural” interfaces is its emphasis on predictive technology. By anticipating the focus of attention, QI’s platform eliminates the need (and the time) to specify the exact locus of interest.
This technology promises to make it possible for developers to design significantly more intuitive interfaces by creating a natural bond between the user and the device. “Eye tracking is something we believe will be an integral part of the user interface going forward,” says Jonathan Josephson, founder and chief technology officer of Quantum Interface. “We hope to see eye tracking begin to show up in mainstream products in the next couple years. The benefits are so large that they need to become part of the user interface experience sooner rather than later.”
Hardware Sets the Stage
EyeZ relies on two hardware components to bring predictive eye tracking alive: an eye-tracking sensor, such as those provided by EyeTech Digital Systems, and a touch or touchless sensing device, like those from Leap Motion and SoftKinetic.
The system marries two sensing technologies to implement the interface. It uses the eyes to locate an object or function, and the body — in this case the touch or touchless sensing device — to execute control.
The Power of a Glance
EyeZ builds on the core concept that the eyes have the fastest-reacting muscles in the body. By recognizing the direction of eye motion and predicting the ultimate objective of a person’s gaze, EyeZ can process information and execute control faster than any other form of human interaction. Before the user’s gaze actually arrives at an object, the system has already initiated an action (e.g., selecting functions like open or play). The user does not have to stare or hold a gaze. A simple glance gets the job done.
To do this, QI’s Precognition algorithms use change of motion — which includes direction, angle, speed, and acceleration — and vector-based extrapolation to predict what the user will do. By analyzing dynamic eye motion, EyeZ can provide real-time control.
As the eyes begin to move, EyeZ draws a vector, using the speed and direction of the motion to predict the object of the user’s gaze and the intent of the action. The software continuously updates this probability as the movement progresses: the faster the user’s gaze moves toward an object, the higher the probability that it is the target.
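QI has not published its Precognition algorithms, but the mechanism described above can be illustrated with a minimal sketch: compute a velocity vector from recent gaze samples, then score each candidate on-screen target by how well the vector points at it and how fast the gaze is closing the distance. The function name, scoring formula, and data shapes here are illustrative assumptions, not QI’s implementation.

```python
import math

def predict_target(samples, targets):
    """Score candidate targets by how well the gaze vector points at them.

    samples: list of (x, y) gaze positions, oldest first (illustrative format).
    targets: dict mapping target name to (x, y) screen position.
    Returns (most likely target, confidence score in [0, 1]).
    """
    # Velocity vector from the two most recent gaze samples.
    (x0, y0), (x1, y1) = samples[-2], samples[-1]
    vx, vy = x1 - x0, y1 - y0
    speed = math.hypot(vx, vy)
    if speed == 0:
        return None, 0.0  # eyes are not moving; nothing to predict

    best, best_score = None, 0.0
    for name, (tx, ty) in targets.items():
        dx, dy = tx - x1, ty - y1
        dist = math.hypot(dx, dy)
        if dist == 0:
            return name, 1.0  # gaze is already on the target
        # Cosine of the angle between the gaze velocity and the direction
        # to the target: 1.0 means the eyes are moving straight at it.
        alignment = (vx * dx + vy * dy) / (speed * dist)
        # Faster, well-aligned movement yields higher confidence,
        # mirroring the article's "faster gaze, higher probability" rule.
        score = max(0.0, alignment) * min(1.0, speed / (speed + dist))
        if score > best_score:
            best, best_score = name, score
    return best, best_score
```

Calling this on each new gaze sample, as the article describes, continuously revises the prediction as the movement progresses.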
When the eyes locate an icon or function, such as a folder on the desktop, the user initiates an action via the touch or touchless sensing device. Thus, the interface and the body become one.
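The division of labor described here, where the eyes propose a target and the body commits the action, can be sketched as a simple event handler. The names (`GazeState`, `on_gesture`) and the confidence threshold are hypothetical; they stand in for whatever the touch or touchless device reports.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class GazeState:
    """Current prediction from the eye tracker (illustrative structure)."""
    target: Optional[str] = None   # predicted object of the gaze, if any
    confidence: float = 0.0        # probability assigned by the predictor

def on_gesture(gaze: GazeState,
               actions: dict,
               threshold: float = 0.5) -> bool:
    """Fire the action for the gaze-predicted target when a gesture arrives.

    The eyes locate; the gesture (touch or touchless) executes.
    Returns True if an action was triggered.
    """
    if gaze.target is not None and gaze.confidence >= threshold:
        actions[gaze.target]()  # e.g., open a folder the user glanced at
        return True
    return False
```

A gesture with no confident gaze target is ignored, which keeps accidental hand movements from triggering actions.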
Compensating for Latency
As with all systems that bridge the physical and digital worlds, EyeZ must contend with latency. In this case, the hardware’s inherent latency averages about 40 ms. To counter this, EyeZ samples its vectors at 60 frames per second (fps); latency in systems operating at 60 cycles per second is virtually imperceptible. For applications where the processor provides higher resolution, the system can increase the update rate to 400 fps. Given this level of performance, the system promises to react faster than the eye can move.
“No user interface has ever existed that can come close to the speed of eye movement until now,” says Josephson. “So we are just beginning to learn what happens with a system this fast. As we get used to it, there may be a demand to increase the frame rate to exceed eye speed to help decrease eye fatigue, but this is not difficult if required.”
QI has been demonstrating its technology to OEMs in a broad range of markets. Even so, commercialization remains a work in progress. To a large extent, however, the software side of the system is complete.
“Touch has had a long head start, and it is the most stable sensor platform available today,” says Josephson. “Touchless sensor platforms, such as infrared, acoustic, and optical, are improving as well, and they are a bit ahead of eye tracking. We hope to see touchless — including eye tracking — go mainstream in the near future.”