
Which AI assistant is smarter than the others? One company found the answer

Loup Ventures tests smart speakers and digital assistants each year to get a sense of their practical uses and to see who gets the highest score. This year’s tests are complete and the results are in. Once again, Google Assistant had the highest score, followed by, in order from highest to lowest, Siri, Alexa, and Cortana.

The test consists of 800 questions in five categories: Local knowledge, commerce, navigation, information, and commands. Each smart speaker has two scores: The number of questions correctly understood and the number of correct responses.

The table below summarizes this year’s test results for smart speakers. Loup Ventures differentiates between testing digital assistants on smart speakers and on smartphones. Google Assistant on Android phones and Siri on iPhones might have different relative scores than the same assistants on smart speakers.

Smart Speaker            Voice Assistant     Answered Correctly   Understood the Question
Google Home              Google Assistant    87.9%                100%
Apple HomePod            Siri                74.6%                99.6%
Amazon Echo (2nd Gen)    Amazon Alexa        72.5%                99%
Harman Kardon Invoke     Microsoft Cortana   63.4%                99.4%

Even though Google Home understood all 800 questions in the test, the other smart speakers weren’t far behind. The HomePod only misunderstood three questions, and the Invoke and Echo misunderstood five and eight questions, respectively. Loup Ventures noted that most of the very few misunderstood questions contained proper names, such as towns or restaurants. From the perspective of understanding questions, therefore, you can safely assume these smart speakers will understand pretty much everything you ask or say to them.
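Those counts follow directly from the published percentages and the 800-question total. As a quick sanity check (the device names and figures come from the article; the script itself is just illustrative arithmetic), the misunderstood-question counts can be derived like this:

```python
# Derive the number of misunderstood questions from each speaker's
# "understood the question" percentage, out of 800 total questions.
TOTAL_QUESTIONS = 800

understood_pct = {
    "Google Home": 100.0,
    "Apple HomePod": 99.6,
    "Harman Kardon Invoke": 99.4,
    "Amazon Echo (2nd Gen)": 99.0,
}

# Round to the nearest whole question, since a fraction of a
# question cannot be misunderstood.
missed = {
    speaker: round(TOTAL_QUESTIONS * (1 - pct / 100))
    for speaker, pct in understood_pct.items()
}
print(missed)
```

Running this reproduces the counts above: zero for Google Home, three for the HomePod, five for the Invoke, and eight for the Echo.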

The smart speakers’ scores on giving correct responses varied significantly. One area that drew the testers’ particular interest was how well the smart speakers answered questions related to commerce.

Loup Ventures noted that while “conventional wisdom suggests Alexa would dominate, Google Assistant correctly answers more questions about product information and where to buy certain items than Amazon’s Alexa, which tended to offer items to purchase.”

To make the point, the testers gave the following example from the test.

  • Question: “How much would a manicure cost?”
  • Alexa: “The top search result for manicure is Beurer Electric Manicure & Pedicure Kit. It’s $59 on Amazon. Want to buy it?”
  • Google Assistant: “On average, a basic manicure will cost you about $20. However, special types of manicures like acrylic, gel, shellac, and no-chip range from about $20 to $50 in price, depending on the salon.”

Loup Ventures also noted that because Apple HomePod and Google Home each have proprietary map data, their performance on questions about local topics and navigation stood “head and shoulders above the others.”
