
MIT researchers develop eye-tracking program for your smartphone

According to a new paper, researchers at the University of Georgia and the Massachusetts Institute of Technology’s Computer Science and Artificial Intelligence Laboratory have developed a piece of software capable of turning almost any smartphone into an eye-tracking device.

Inspired by what MIT graduate student Aditya Khosla calls a “chicken-and-egg loop” surrounding eye-tracking applications for smartphones, the researchers teamed up to gather the data needed to develop such a program. Speaking to MIT News, Khosla said:


“Since few people have the external devices, there’s no big incentive to develop applications for them. Since there are no applications, there’s no incentive for people to buy the devices. We thought we should break this circle and try to make an eye tracker that works on a single mobile device, using just your front-facing camera.”

To create such a program, the team of engineers and researchers first needed data, the key ingredient for training the machine learning algorithm at the heart of the system.

To do this, the researchers sought out strength in numbers with a little crowdsourcing from Amazon’s Mechanical Turk program. Previous research in this area was limited to only 50 or so subjects, who were asked to come into a lab to complete the tests.

A collection of the subjects who participated in the data collection stage of development

Khosla and his team far exceeded previous research by sampling the eye movements of 1,500 mobile users, all of whom participated with the help of GazeCapture, a custom iOS application that combined on-screen animations with use of the front-facing camera.
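For readers curious what that collection step might look like under the hood, here is a minimal, hypothetical sketch in Python (GazeCapture itself is an iOS app, and its actual code is not shown here): the app displays a dot at a known position on the screen, grabs a front-camera frame, and stores the pair as a labeled training example. All names and numbers below are illustrative assumptions, not details from the paper.

```python
# Hypothetical sketch of GazeCapture-style data collection: show a dot at a known
# on-screen position, capture a front-camera frame, and keep the pair so a model
# can later learn to map face images to gaze locations. Names are illustrative,
# not the actual GazeCapture code.
import random
from dataclasses import dataclass

@dataclass
class GazeSample:
    frame: bytes       # raw front-camera frame (stand-in for real image data)
    dot_x_cm: float    # known x position of the on-screen dot, in centimeters
    dot_y_cm: float    # known y position of the on-screen dot, in centimeters

def collect_samples(capture_frame, screen_w_cm=6.2, screen_h_cm=11.0, n=20):
    """Show a dot at random known positions and pair each camera frame with that position."""
    samples = []
    for _ in range(n):
        x = random.uniform(0, screen_w_cm)   # screen size here is an illustrative guess
        y = random.uniform(0, screen_h_cm)
        # In the real app the dot is animated on screen here; we only record the pair.
        samples.append(GazeSample(frame=capture_frame(), dot_x_cm=x, dot_y_cm=y))
    return samples

if __name__ == "__main__":
    fake_camera = lambda: b"\x00" * 100  # placeholder for an actual camera frame
    data = collect_samples(fake_camera)
    print(f"collected {len(data)} labeled samples")
```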

Diagram showing how the data is used to predict eye movement in subjects

The data captured from 800 of the 1,500 subjects was used to create iTracker, the first iteration of the eye-tracking system, which had a margin of error of only 1.5 centimeters. Since then, the researchers have analyzed the remaining 700 subjects and further improved the margin of error, shrinking it down to just 1 centimeter.
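To make those error figures concrete, here is a small illustrative snippet, assuming the reported margin of error is the average on-screen distance, in centimeters, between where the system predicts a user is looking and where the target actually was. The sample numbers are invented purely to show the arithmetic.

```python
# A minimal sketch of what a "margin of error" like 1.5 cm could mean in practice:
# the average Euclidean distance, measured on the screen in centimeters, between
# the predicted gaze point and the true target position.
import math

def mean_gaze_error_cm(predictions, targets):
    """Mean Euclidean distance (cm) between predicted and true on-screen gaze points."""
    distances = [
        math.hypot(px - tx, py - ty)
        for (px, py), (tx, ty) in zip(predictions, targets)
    ]
    return sum(distances) / len(distances)

predicted = [(1.0, 2.0), (3.5, 4.0), (2.2, 6.1)]   # made-up model outputs, in cm
actual    = [(1.4, 2.3), (3.0, 5.0), (2.0, 6.0)]   # made-up true dot positions, in cm
print(f"mean error: {mean_gaze_error_cm(predicted, actual):.2f} cm")
```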

The team hopes further testing will reduce that even more, down to a half-centimeter margin of error, the threshold Khosla believes will make the program commercially viable.

As for the practical applications of such technology, it can be used for almost anything more conventional eye-tracking methods are used for, from helping doctors detect and keep tabs on various illnesses to market research and analytics. The key difference is that this new method of tracking is far less invasive and far less expensive than alternative options, making it easier than ever to capture large quantities of data.


It’ll be interesting to see what the future holds. The team will present the paper on June 28 at the Computer Vision and Pattern Recognition conference in Las Vegas.

Gannon Burgett
Former Digital Trends Contributor