Scientists are developing mobile software that can accurately identify where a person is looking in real time. This could eventually let smartphones and other gadgets be controlled by eye movements.
Researchers are working to make eye tracking cheap, compact, and accurate enough to be built into smartphones. They are crowdsourcing gaze data and using it to teach software how to read where a person is looking.
Researchers at the Max Planck Institute for Informatics in Germany, the University of Georgia in the U.S., and the Massachusetts Institute of Technology (MIT) have so far trained the software to recognize where a person is looking with an accuracy of about a centimeter on a mobile phone and 1.7 centimeters on a tablet.
Existing eye-tracking technology is expensive and requires extra hardware, which makes it tricky to build the feature into smartphones and tablets. A software-based approach would be useful for playing games and for navigating and selecting items on screen without tapping or swiping.
The project began by gathering data on how users gaze at their phones and from which angles. The researchers developed an app called GazeCapture for this purpose and recorded eye movements in varied environments.
The researchers used the data from GazeCapture to train software called iTracker. The handset's front camera captures the user's face, and the software analyzes the position of the head and eyes to figure out where the gaze is focused on the screen.
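To make the idea concrete, here is a minimal toy sketch of such a gaze-prediction pipeline. It is not the authors' actual iTracker model (which is a convolutional neural network trained on GazeCapture data); the function names, input shapes, and the simple pooled-feature "towers" are all illustrative assumptions, standing in for learned layers:

```python
import numpy as np

def extract_features(crop: np.ndarray) -> np.ndarray:
    """Stand-in for a CNN tower: reduce an image crop to a tiny feature vector.

    The real system would use learned convolutions; here we just pool
    pixel intensities to keep the sketch self-contained.
    """
    return np.array([crop.mean(), crop.std()])

def predict_gaze(face, left_eye, right_eye, face_grid, weights):
    """Combine per-input features into an (x, y) screen coordinate.

    Inputs mirror the general idea described in the article: crops of the
    face and both eyes, plus a binary "face grid" encoding where the face
    sits in the camera frame (a proxy for head position).
    """
    features = np.concatenate([
        extract_features(face),
        extract_features(left_eye),
        extract_features(right_eye),
        face_grid.ravel(),            # head position/scale relative to camera
    ])
    return features @ weights         # final linear regression -> (x, y)

# Usage with random stand-in data (shapes are arbitrary for illustration):
rng = np.random.default_rng(0)
face = rng.random((64, 64))
left_eye = rng.random((32, 32))
right_eye = rng.random((32, 32))
face_grid = (rng.random((5, 5)) > 0.5).astype(float)
weights = rng.random((2 + 2 + 2 + 25, 2))   # feature dim -> 2 screen coords

x, y = predict_gaze(face, left_eye, right_eye, face_grid, weights)
```

In a trained system, `weights` (and the feature extractors) would be learned from thousands of GazeCapture recordings rather than set randomly.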
Aditya Khosla, a graduate student at MIT, says, “About 1,500 people have used the GazeCapture app so far,” adding that if the researchers can gather data from 10,000 people, they could further reduce the software's error.