
With voice and gestures, Google’s Pixel 4 takes us closer to a hands-free future

Pixel 4 gestures. Google

I have a vision of a time when I won’t have to slip a phone in and out of my pocket 200 times a day. I’ll issue voice commands, from “play this music or TV show” and “show me the weather tomorrow” to “feed the cat an extra snack,” and it will happen automatically wherever I am. I won’t have to specify a device; if my request requires audio feedback or some visual representation, the nearest speaker or screen will come to life. If I want to pause it, I’ll just hold my hand up in the air.

This idea of disembodied A.I., sold to us by sci-fi movies and shows like Star Trek, is creeping closer whether you realize it or not. Truth be told, I can already do my example voice commands in my house and they mostly work, but there are issues with the ragtag band of devices required and they frequently act like enemies pushed into a grudging temporary truce. All too often something doesn’t work as expected and you’re back to digging out that glass rectangle and tapping and swiping your way to a solution.

Google is planning to change that. The company just released a video showing a woman interacting with the forthcoming Pixel 4 without touching it. First, the phone unlocks after it recognizes her face — so far, so Apple — but then she waves a hand to switch the song that’s playing. We know this is the fruit of Project Soli, which we first encountered more than four years ago. In a nutshell, it’s miniaturized radar that can fit in a phone and recognize precise finger and hand gestures. It may also allow you to pause, slide volume up and down, and even accurately navigate through menus.

Project Soli.

You may be thinking, didn’t Samsung do this with the Galaxy S4 and, more recently, didn’t LG do this with the G8 ThinQ? Kind of — but Samsung’s implementation was awful and LG is using a time-of-flight sensor instead. More importantly, Samsung and LG don’t have Google’s software skills or its wider vision. This isn’t the only move Google is making to free us from the touchscreen.

Tyranny of the touchscreen

As ports, buttons, and bezels melt away, our smartphones are being stripped back to the essentials, but we just can’t get away from that screen. According to one study, we touch our phones an average of 2,617 times a day. We’re conditioned to touch our phones, to prioritize responding to them over the actual person in front of us; sometimes we even feel phantom vibrations and react to an imagined notification that never was.


Unlocking your phone with your face is a frictionless experience, and it’s one of those things that you get used to very quickly and miss if it’s taken away. We’ve seen gestures before, but they’ve always been flaky in implementation and limited in utility. I’m optimistic that Google will elevate this idea to the next level. Both are important steps that reduce our need to touch our phones, but it’s only by combining them with voice that we can really change the way we interact.

Which brings us to another major reason that Google is perfectly placed to deliver a hands-free future: Google Assistant. It’s by far the most capable, feature-rich voice assistant and it’s improving at the fastest rate.


Talking out loud

The next version of Google Assistant, which will also make its debut on the Pixel 4, allows you to ask strings of questions without saying, “Hey Google,” in between. Google has shrunk the underlying speech and language models down so that they fit on your phone, allowing Google Assistant to respond without having to consult a distant server. It understands context and, most importantly, it can complete your command far faster than you could ever tap, type, and swipe it.
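If you’re curious what “on-device” looks like from a developer’s perspective, here is a minimal sketch using Android’s public speech API rather than Google’s own Assistant code; the function name and request code are hypothetical, and the EXTRA_PREFER_OFFLINE flag simply asks the recognizer to use a locally installed model instead of a remote server when it can.

    // Minimal sketch, not Google's Assistant implementation: Android's public
    // speech API with a hint to prefer on-device recognition, so the request
    // never has to travel to a distant server when a local model is installed.
    import android.app.Activity
    import android.content.Intent
    import android.speech.RecognizerIntent

    fun startOfflineDictation(activity: Activity, requestCode: Int) {
        val intent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH).apply {
            putExtra(
                RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM
            )
            // Available since Android 6.0: ask for recognition on the device itself.
            putExtra(RecognizerIntent.EXTRA_PREFER_OFFLINE, true)
        }
        // requestCode is a placeholder the calling activity uses to match the result.
        activity.startActivityForResult(intent, requestCode)
    }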

Whether you want to dictate an email, find photos of your Disneyland trip, or combine tasks that jump in and out of multiple apps, using your voice with Google Assistant is set to become the easiest and fastest way to do it. People have been slow to adopt voice commands, partly because voice assistants are limited in what they can do and don’t understand context, but often simply because it’s faster to do it by hand. When that’s no longer true, the whole thing becomes a much more attractive proposition, and after using voice commands for a while, it’s just like face unlock: switching back to a fingerprint sensor feels like a step backward.

Google Assistant. Julian Chokkattu/Digital Trends

While face unlock, gestures, and voice commands are exciting, it would be premature to predict the death of the smartphone. There are times when there is no speaker or screen in your vicinity, so our physical smartphones still serve an important purpose. There are also times when swiping and tapping a screen will be preferable to talking or gesturing.

I use Google Assistant throughout my day now. It reads me the news, gives me weather updates, reminds me of appointments, and tells me how to get where I’m going. It also answers my endless buffet of random questions. But as comfortable as I am using it and as excited as I am about it improving further, I still tend to confine my use to times when I’m alone or with family. I’m not going to do it on a busy train or standing in line at the coffee shop and neither should you.

There’s another frustrating stumbling block to a hands-free future, and it’s one that Google is well placed to overcome.

The missing piece of the puzzle

Competition between companies and stubborn resistance to integration or common standards is robbing us of the sci-fi future we deserve. While Samsung, LG, and, for that matter, Apple are all focused on selling us hardware, Google’s number one ambition is to be the disembodied assistant. I sometimes wonder if the move from the reference-device Nexus line to the more consumer-focused Pixel was really just impatience and growing frustration that big manufacturers like Samsung are trying to develop their own assistants instead of embracing Google’s.


As it stands, you can build a hands-free setup, but if you want anything close to seamless integration between disparate devices, you have to go all-in with a specific manufacturer and become a disciple of Apple or a Samsung devotee. The trouble is that no single manufacturer offers the best of everything, so you have to compromise somewhere.

Amazon managed to sneak in the back door by allowing anyone and everyone to integrate Alexa, but the odds are against it winning out as the unifying force in tech because it completely missed the boat on smartphones. Google is in pole position, with smartphones sewn up and few obstacles to its expansion into smart home tech. Google may debut these new hands-free features on the Pixel 4, but unlike its competitors, its longer-term plan will see them roll out on any phone capable of taking advantage of them. The kicker is that Google comes along with them.

Simon Hill
Former Digital Trends Contributor