
Google demos its smartglasses and makes us hanker for the future

A screenshot from Google’s TED Talk on its smartglasses. Credit: TED

At a recent TED Talk, Google’s exciting XR smartglasses were demonstrated in public for the very first time. While we’ve seen the smartglasses before, it has always been in highly polished videos showcasing Project Astra, which never give a true feel for the features and functionality in the real world. All that has now changed, and our first glimpse of the future is very exciting. However, future is very much the operative word.

The demonstration of what the smartglasses can do takes up the majority of the 16-minute presentation, which is introduced by Shahram Izadi, Google’s vice president of augmented and extended reality. He starts out with some background on the project, at the center of which is Android XR, the operating system Google is building with Samsung. It brings Google Gemini to XR hardware such as headsets, smartglasses, and “form factors we haven’t even dreamed of yet.”


A pair of smartglasses is used for the demonstration. The design is bold, in that the frames are polished black and “heavy,” much like the Ray-Ban Meta smartglasses. They feature a camera, a speaker, and a microphone so the AI can see and hear what’s going on around you, and through a link with your phone you’ll be able to make and receive calls. Where they differ from Ray-Ban Meta is the addition of a tiny color in-lens display.

Headset and glasses

What makes the Android XR smartglasses initially stand out in the demo is Gemini’s ability to remember what it has “seen.” It correctly recalls the title of a book the wearer glanced at, and even notes where a hotel keycard had been left. This short-term memory has a wide range of uses, not just as a memory jogger, but as a way to confirm details and better organize your time too.

Gemini’s vision is also used to explain a diagram in a book and to translate text into different languages. It directly translates spoken language in real time too. The screen comes into action when Gemini is asked to navigate to a local beauty spot, with directions shown on the lens. Gemini reacts quickly to instructions, and everything appears to work seamlessly during the live demonstration.

Following the smartglasses, Android XR is then shown working on a full headset. The visual experience recalls that of Apple’s Vision Pro headset, with multiple windows shown in front of the wearer and pinch gestures used to control what’s happening. However, Gemini is the key to using the Android XR headset, with the demonstration showing the AI’s ability to describe and explain what’s being seen or shown in a highly conversational manner.

When can we buy it?

Izadi closed the presentation saying, “We’re entering an exciting new phase of the computing revolution. Headsets and glasses are just the beginning. All this points to a single vision of the future, a world where helpful AI will converge with lightweight XR. XR devices will become increasingly more wearable, giving us instant access to information. While AI is going to become more contextually aware, more conversational, more personalized, working with us on our terms and in our language. We’re no longer augmenting our reality, but rather augmenting our intelligence.”

It’s tantalizing stuff, and for anyone who saw the potential in Google Glass and has already been enjoying Ray-Ban Meta, the smartglasses in particular certainly appear to be the desirable next step in the evolution of everyday smart eyewear. However, the emphasis should be on the future: while the glasses appeared to be almost ready for public release, that may not be the case at all, as Google continues the seemingly endless tease of its smart eyewear.

Izadi didn’t mention a release date for either XR device during the TED Talk, which isn’t a good sign, so when are they likely to be real products we can buy? The smartglasses demonstrated are said to be a further collaboration between Google and Samsung (the headset is also made by Samsung), and are not expected to launch until 2026, according to a report from The Korean Economic Daily, pushing the possible launch date beyond the previously rumored end of 2025. While this may seem a long way off, it’s actually sooner than the consumer version of Meta’s Orion smartglasses, which aren’t expected to hit stores until late 2027.

Will it arrive too late? 

Considering the smartglasses shown during the TED Talk seem to bring together aspects of Glass, Ray-Ban Meta, and smartglasses such as those from Halliday, plus the Google Gemini assistant we already use on our phones and computers, the continued lengthy wait is surprising and frustrating.

Worse, the glut of AI-powered hardware, plus the many Ray-Ban Meta copies and alternatives expected between now and the end of 2026, means Google and Samsung’s effort risks becoming old news, or eventually launching to an incredibly jaded public. The Android XR headset, known as Project Moohan, is likely to launch in 2025.

Perhaps we’re just being impatient, but when we see a demo featuring a product that looks so final and so tantalizing, it’s hard not to want it in our hands (or on our faces) sooner than some time next year.

Andy Boxall