
Will computerized voices ever sound human?

Sounding robotic has never been a compliment, but with the right amount of tinkering, computer scientists and engineers are hoping that may soon change. Computerized voices haven’t quite hit the mark yet on sounding … human, and fixing that is the subject of one of the tech industry’s latest major efforts. As an increasing number of devices, from Apple’s Siri to Amazon’s Alexa to our GPS systems, begin speaking to us, it’s becoming increasingly important for machines to have voices we actually want to listen to.

As the New York Times reports, the relatively new focus area of “conversational agents,” part of the little-understood field of human-computer interaction design, seeks to build programs that both understand language and are able to respond to commands. For now, a computer’s voice cannot be rendered indistinguishable from a human’s, at least not for anything more complex than offering short bits of information: whether it’ll rain, for example, or when to turn left.


Part of the issue lies in “prosody,” which is the capacity to correctly enunciate or stress certain syllables — saying words the way an actual human would. And of course, there’s also the uniquely human ability to add emotion into pronunciation. After all, we don’t always say “good” or even “left” in the same way. Machines, on the other hand, have yet to master that nuance.

“The problem is we don’t have good controls over how we say to these synthesizers, ‘Say this with feeling,’” Scottish computer scientist and Carnegie Mellon professor Alan Black told the Times. And it may still be some time before we’re actually able to do this at all.
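The controls that do exist today are coarse markup hints rather than genuine feeling. As a rough illustration, the W3C’s real SSML standard lets a developer wrap text in `<prosody>` and `<emphasis>` tags to nudge rate, pitch, and stress; the `with_feeling` helper and the idea of feeding its output to any particular engine are hypothetical, a minimal sketch of the kind of control Black is describing:

```python
# Sketch: building an SSML snippet that asks a synthesizer for a warmer
# delivery. The <prosody> and <emphasis> tags are defined in the W3C
# SSML 1.1 standard; this helper function is a made-up example, not a
# real engine's API.

def with_feeling(text: str, rate: str = "medium", pitch: str = "+10%") -> str:
    """Wrap text in SSML markup hinting at a slightly higher, emphasized delivery."""
    return (
        "<speak>"
        f'<prosody rate="{rate}" pitch="{pitch}">'
        f'<emphasis level="moderate">{text}</emphasis>'
        "</prosody>"
        "</speak>"
    )

print(with_feeling("Turn left"))
```

Even with markup like this, the engine only gets crude dials (faster, higher, louder), which is a long way from being told, and understanding, “say this with feeling.”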

But that might not be a bad thing, some say. “Jarring is the way I would put it,” Brian Langner, senior speech scientist at digital speech company ToyTalk, said about having machines sound too much like humans. “When the machine gets some of those things correct, people tend to expect that it will get everything correct.”

So no, you probably won’t be able to get Siri to sound like your mom anytime soon. But you may want to enjoy that inability while you can.

Lulu Chang
World’s most advanced robotic hand is approaching human-level dexterity

Remember when the idea of a robotic hand was a clunky mitt that could do little more than crush things in its iron grip? Well, such clichés should be banished for good based on some impressive work coming out of the WMG department at the U.K.’s University of Warwick.

This basic human skill is the next major milestone for A.I.

Remember the amazing, revelatory feeling when you first discovered the existence of cause and effect? That’s a trick question. Kids start learning the principle of causality from as early as eight months old, helping them to make rudimentary inferences about the world around them. But most of us don’t remember much before the age of around three or four, so the important lesson of “why” is something we simply take for granted.

It’s not only a crucial lesson for humans to learn, but also one that today’s artificial intelligence systems are pretty darn bad at. While modern A.I. is capable of beating human players at Go and driving cars on busy streets, this is not necessarily comparable with the kind of intelligence humans might use to master these abilities. That’s because humans, even small infants, possess the ability to generalize by applying knowledge from one domain to another. For A.I. to live up to its potential, this is something it also needs to be able to do.

Zoox recalls robotaxis after Las Vegas crash, citing software fix

Amazon’s self-driving vehicle unit, Zoox, has issued a voluntary safety recall after one of its autonomous vehicles was involved in a minor collision in Las Vegas. The incident, which occurred in April 2025, led the company to investigate and identify a software issue affecting how the robotaxi anticipates another vehicle’s path.

The recall, affecting 270 Zoox-built vehicles, was formally filed with the National Highway Traffic Safety Administration (NHTSA). Zoox said the issue has already been addressed through a software update that was remotely deployed to its fleet.

Zoox’s robotaxis, which operate without driving controls such as a steering wheel or pedals, are part of Amazon’s entry into the autonomous driving space. According to Zoox’s safety recall report, the vehicle failed to yield to oncoming traffic while making an unprotected left turn, leading to a low-speed collision with a regular passenger car. While the damage was minor, the event raised flags about the system’s behavior in complex urban scenarios.

Establishing safety and reliability remains a key factor in the deployment of the relatively new autonomous ride-hailing technology. Alphabet-owned Waymo continues to lead the sector in both safety and operational scale, with services active in multiple cities, including Phoenix and San Francisco. GM’s Cruise and the Ford- and VW-backed Argo AI, by contrast, were forced to abandon operations over the past few years.

Tesla is also expected to enter the robotaxi race with the launch of its own service in June 2025, leveraging its Full Self-Driving (FSD) software. While FSD faced heavy regulatory scrutiny over the past year, safety regulations are expected to loosen under the Trump administration.

Zoox, which Amazon acquired in 2020, says it issued the recall voluntarily as part of its commitment to safety. “It’s essential that we remain transparent about our processes and the collective decisions we make,” the company said in a statement.
