
Google wants your smartphone to be able to save your eyes

Google wants smartphone cameras to help detect heart issues and diabetes-related complications that can lead to vision loss. Google says its Automated Retinal Disease Assessment (ARDA) tool has shown promising results in detecting diabetic retinopathy, a condition that can cause blindness if not treated in time.

The company claims tests conducted in Thailand have proved ARDA’s accuracy and that it is safe to deploy, especially in areas where eye screenings are hard to access due to a lack of infrastructure or financial constraints. Google plans to conduct more in-depth tests to see whether pictures of a person’s eyes taken with a smartphone camera can be used to detect diabetes, as well as non-diabetes-related health conditions.


A phone’s eye to save your eyes

Google claims early tests have shown that its A.I. can already flag cardiovascular risk factors, such as abnormal blood sugar and cholesterol levels, by analyzing pictures of the exterior of a person’s eyes. In addition to the phone’s camera, Google also wants to employ the onboard microphone for health benefits.

A diagram showing a person's eye outlines Google's plan to test using a phone camera to detect health issues.

Essentially, health experts at Google aim to use a phone’s microphone as a stethoscope to record heart sounds. The goal is to help detect issues such as aortic stenosis, a condition that restricts the flow of blood from the heart to the rest of the body. A stethoscope or an ultrasound machine is usually required to detect such conditions, but Google is testing whether a smartphone’s built-in mics can do the same.

Some progress has already been made

Google has already built a feature into its Google Fit app that allows users to measure their respiratory rate, as well as their heart rate, using a phone’s camera. To measure the respiratory rate, all a person needs to do is sit in front of the phone, fire up the selfie camera in the Google Fit app, and let the A.I. do its magic by analyzing the movement of the torso as they breathe in and out. To check the heart rate, just place a finger on the rear camera lens.
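Google hasn't published the algorithm behind the finger-on-lens trick, but camera-based pulse measurement is generally built on photoplethysmography: each heartbeat changes the blood volume in the fingertip, which subtly modulates how much light reaches the sensor. Here's a minimal, illustrative Python sketch of that idea, assuming a frame-averaged brightness signal has already been extracted from the video; the `estimate_heart_rate` function and the simulated signal are our own, not Google's code.

```python
import numpy as np

def estimate_heart_rate(brightness, fps):
    """Estimate pulse rate (BPM) from a per-frame mean-brightness signal.

    Heartbeats modulate blood volume in the fingertip, so the average
    frame brightness oscillates at the pulse frequency; we find the
    dominant frequency in a plausible human pulse band via an FFT.
    """
    signal = brightness - np.mean(brightness)          # remove DC offset
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)  # bin centers in Hz
    band = (freqs >= 40 / 60) & (freqs <= 200 / 60)    # 40-200 BPM band
    peak_freq = freqs[band][np.argmax(spectrum[band])]
    return peak_freq * 60                              # Hz -> BPM

# Simulate 10 seconds of 30 fps footage with a 72 BPM pulse plus sensor noise.
fps, seconds, bpm = 30, 10, 72
t = np.arange(fps * seconds) / fps
brightness = (128
              + 2 * np.sin(2 * np.pi * (bpm / 60) * t)
              + np.random.default_rng(0).normal(0, 0.5, t.size))
print(round(estimate_heart_rate(brightness, fps)))  # → 72
```

A real implementation would also need to band-pass filter motion artifacts and reject windows where the finger slips off the lens, but the frequency-domain core looks roughly like this.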

The company also has a dermatology tool in the pipeline that again banks on A.I. smarts to identify skin, hair, and nail conditions from just a few photos taken with a phone’s camera. Trained to recognize over 280 skin conditions, the A.I. then gives a list of possible ailments. However, it is not intended for self-diagnosis, and users are still advised to consult a certified health professional.

Nadeem Sarwar
Google’s Pixel Weather app just got two new features. Here’s how they work
The Pixel Weather app on a Google Pixel 9.

The Pixel Weather app has been the focus of a lot of attention lately as Google revamps the user experience and adds more features. Now, there's more good news: two of those promised functions — the Pollen count card and immersive vibrations — are newly available, at least for some users.

Thanks to "immersive weather vibrations," the Pixel Weather app vibrates to match the animated backgrounds it displays, with intensity levels that mirror the amount of precipitation (not just rainfall), according to 9to5Google. Of course, if you don't like the feature, you can disable it in the account menu.

Google Lens and Google Pay are about to get more helpful for holiday shopping
The new Google Wallet app running on an Android phone.

The holiday season is upon us, and that probably means you’ll be doing a lot of shopping in the coming weeks. Google is doing its part to help make that shopping experience a bit easier, especially if you want to do some in-person shopping rather than online, with some new features hitting Google Lens and Google Pay ahead of the holidays.
Shop better through Google Lens

According to Google, Google Lens performs about 20 billion visual searches each month, and about 20% of those are shopping-related. Today's update helps make Lens more useful by giving you insights tailored to the store you are currently in so you can make informed decisions.

Samsung is eyeing smart glasses that could shake up the market
A person wearing the Ray-Ban Meta smart glasses.

Samsung is ready to take a stab at another wearable segment, less than a year after introducing its first smart ring. Wellsen XR, a Shenzhen-based research company, shared in an investor note Samsung’s plans to launch smart glasses that could arrive late next year or early in 2026.

“Samsung Electronics' plan to release AI smart glasses was confirmed earlier this month, and its first production volume is 500,000 units in the third quarter of 2025,” says the note, which was reported by Maeil Business Newspaper.
