I have a vision of a time when I won’t have to slip a phone in and out of my pocket 200 times a day. I’ll issue voice commands, from “play this music or TV show” and “show me the weather tomorrow” to “feed the cat an extra snack,” and it will happen automatically wherever I am. I won’t have to specify a device; if my request requires audio feedback or some visual representation, then the nearest speaker or screen will come to life. If I want to pause it, I’ll just hold my hand up in the air.
This idea of disembodied A.I., sold to us by sci-fi movies and shows like Star Trek, is creeping closer whether you realize it or not. Truth be told, I can already do my example voice commands in my house and they mostly work, but there are issues with the ragtag band of devices required and they frequently act like enemies pushed into a grudging temporary truce. All too often something doesn’t work as expected and you’re back to digging out that glass rectangle and tapping and swiping your way to a solution.
Google is planning to change that. The company just released a video showing a woman interacting with the forthcoming Pixel 4 without touching it. First, the phone unlocks after it recognizes her face — so far, so Apple — but then she waves a hand to switch the song that’s playing. We know this is the fruit of Project Soli, which we first encountered more than four years ago. In a nutshell, it’s miniaturized radar that can fit in a phone and recognize precise finger and hand gestures. It may also allow you to pause, slide volume up and down, and even accurately navigate through menus.
You may be thinking, didn’t Samsung do this with the Galaxy S4 and, more recently, didn’t LG do this with the G8 ThinQ? Kind of — but Samsung’s implementation was awful and LG is using a time-of-flight sensor instead. More importantly, Samsung and LG don’t have Google’s software skills or its wider vision. This isn’t the only move Google is making to free us from the touchscreen.
Tyranny of the touchscreen
As ports, buttons, and bezels melt away, our smartphones are being stripped back to the essentials, but we just can’t get away from that screen. According to one study, we touch our phones an average of 2,617 times a day. We’re conditioned to touch our phones, to prioritize responding to them over the actual person in front of us; sometimes we even feel phantom vibrations and react to an imagined notification that never was.
Unlocking your phone with your face is a frictionless experience, and it’s one of those things that you get used to very quickly and miss if it’s taken away. We’ve seen gestures before, but they’ve always been flaky in implementation and limited in utility. I’m optimistic that Google will elevate this idea to the next level. Both are important steps that reduce our need to touch our phones, but it’s only by combining them with voice that we can really change the way we interact.
Which brings us to another major reason that Google is perfectly placed to deliver a hands-free future: Google Assistant. It’s by far the most capable, feature-rich voice assistant and it’s improving at the fastest rate.
Talking out loud
The next version of Google Assistant, which will also make its debut on the Pixel 4, allows you to ask strings of questions without saying, “Hey Google,” in between. The data has been shrunk down so that it can fit on your phone, allowing Google Assistant to respond without having to consult a distant server. It understands context and, most importantly, it can complete your command much, much faster than you could possibly ever tap, type, and swipe it.
Whether you want to dictate an email, find photos of your Disneyland trip, or combine tasks that jump in and out of multiple apps, using your voice and Google Assistant is set to become the easiest and fastest way to do it. People have been slow to adopt voice commands, partly because voice assistants’ abilities are limited and they don’t understand context, but often simply because it’s faster to do it by hand. When that’s no longer true, the whole thing becomes a much more attractive proposition, and after using voice commands for a while, it’s just like face unlock: switching back feels like a step backward, the way returning to a fingerprint sensor does.
While face unlock, gestures, and voice commands are exciting, it would be premature to predict the death of the smartphone. There are times when there is no speaker or screen in your vicinity, so our physical smartphones still serve an important purpose. There are also times when swiping and tapping a screen will be preferable to talking or gesturing.
I use Google Assistant throughout my day now. It reads me the news, gives me weather updates, reminds me of appointments, and tells me how to get where I’m going. It also answers my endless buffet of random questions. But as comfortable as I am using it and as excited as I am about it improving further, I still tend to confine my use to times when I’m alone or with family. I’m not going to do it on a busy train or standing in line at the coffee shop and neither should you.
There’s another frustrating stumbling block on the road to a hands-free future, and it’s one that Google is well placed to overcome.
The missing piece of the puzzle
Competition between companies and stubborn resistance to integration or common standards is robbing us of the sci-fi future we deserve. While Samsung and LG, and Apple too for that matter, are all focused on selling us hardware, Google’s number one ambition is to be the disembodied assistant. I sometimes wonder if the move from the reference-device Nexus line to the more consumer-focused Pixel was really just impatience and growing frustration that big manufacturers like Samsung are trying to develop their own assistants instead of embracing Google’s.
As it stands, you can build a hands-free setup, but if you want anything close to seamless integration between disparate devices, you have to go all-in with a specific manufacturer and become a disciple of Apple or a Samsung devotee. The trouble is that no single manufacturer offers the best of everything, so you have to compromise somewhere.
Amazon managed to sneak in through the back door by allowing anyone and everyone to integrate Alexa, but the odds are against it winning out as the unifying force in tech because it completely missed the boat on smartphones. Google is in pole position, with smartphones sewn up and few obstacles to its expansion into smart home tech. Google may debut these new hands-free features on the Pixel 4, but unlike its competitors, the longer-term plan will see them roll out on any phone that’s capable of taking advantage of them; the kicker is that Google comes along with them.