
Meet Google Duplex, AI that sounds human when it calls to book appointments


For some people, the deciding factor in choosing one hair salon over another is simply whether it lets them book an appointment online. At the Google I/O developer conference, Google CEO Sundar Pichai explained how the company's Google Duplex technology can help the phone-shy make an appointment without having to actually speak to someone.

Around 60 percent of U.S. businesses don’t have online booking systems, according to Google. The company has been working on a way for users to give Google Assistant a time and date, after which the Assistant makes the call and sets up the appointment. “It brings together all our investments over the years in natural language understanding, deep learning, text-to-speech,” said Pichai.

Duplex scheduling a hair salon appointment

It can be an incredibly complex interaction. Using an actual call between Google Assistant and a hair salon, Pichai showed how it would work. The user asked the Assistant to make an appointment on Tuesday morning “anytime between 10 and 12.” During the call, Google Assistant sounded convincingly human, tossing in “ums” and “uhs” in a woman’s voice. “Give me one second,” the salon employee said. “Mm-hmm,” the Assistant responded. After some back and forth, the employee made an appointment for Lisa at 10 a.m. and sent a confirmation notification to the phone’s screen. There was no appointment available at noon, the time the A.I. initially requested.

Duplex calling a restaurant

“A long-standing goal of human-computer interaction has been to enable people to have a natural conversation with computers, as they would with each other,” wrote Google Principal Engineer Yaniv Leviathan and Vice President of Engineering Yossi Matias, in a blog post announcing the technology.

Google appears to be succeeding.

For the next call, Google Assistant had to overcome a bit of a language barrier. The restaurant employee thought the caller wanted a reservation for seven people, not a table for the seventh of the month. When she informed the A.I. caller that the restaurant didn’t reserve tables for parties of fewer than seven, the A.I. asked how long the wait time would be. When the employee assured the Assistant it wouldn’t be a long wait on a Wednesday, it replied, “Oh, I gotcha, thanks.”

This technology isn’t quite ready yet, but Google is rolling out what Pichai called an “experiment.” During holidays, restaurants and businesses often keep different hours. Google plans to have the Assistant place a single call to each business, then update its holiday hours in web search results. That way, a restaurant can avoid fielding dozens of calls asking whether it’s open on Memorial Day, for example. Pichai said the experiment would start in a few weeks, so if it’s not working in time for that holiday, check again around the Fourth of July.

As exciting as it may be to have a remarkably human-sounding bot handling your chores on your behalf, the Duplex announcement stirred up quite a bit of controversy over how the system would identify itself on the phone. After all, wouldn’t you want to know if you were speaking to a bot rather than a live person? Now, Google has weighed in, noting that it will build disclosures into the feature.

“We understand and value the discussion around Google Duplex — as we’ve said from the beginning, transparency in the technology is important,” a Google spokeswoman said in a statement. “We are designing this feature with disclosure built-in, and we’ll make sure the system is appropriately identified. What we showed at I/O was an early technology demo, and we look forward to incorporating feedback as we develop this into a product.”

Google further clarified that Duplex will “identify itself as the Google assistant” when making phone calls on behalf of its human users. As Bloomberg reports, the tech giant informed its employees at a weekly staff meeting that the AI bot would “inform people on the phone that the line is being recorded in certain jurisdictions.” It’s not entirely clear what those jurisdictions are, and Google has not commented further on the matter.

It’s still unclear whether the businesses in Google’s demos were told they were speaking to a bot, or that they were being recorded. In certain states, recording a conversation without the consent and knowledge of both parties is against the law.

Updated on May 20: Google says Duplex will alert people when their conversations are being recorded, and that they’re speaking to a bot. 
