Adobe adds voice analysis as mobile searches move from text to speech

Chatting with Alexa, Siri, or Google Assistant may be a more natural way to search, but, for businesses, voice assistants are tougher to analyze than typed text. That's quickly changing: on June 29, Adobe announced new capabilities that let brands pull metrics from voice assistants using Adobe Analytics Cloud. The new voice analysis features allow businesses to create seamless user experiences between mobile apps and voice assistant apps.

According to Colin Morris, director of product management, Adobe Analytics Cloud, the company found that voice accounted for 20 percent of mobile queries.

“Customer journey is changing,” Morris told Digital Trends. “All these touch points are heading into new arenas, and if they aren’t collecting [data], then what they can serve in personalization is diminished. We are trying to help them drive revenue, but also offer a cohesive experience.”

The software is powered by Adobe Sensei, the company's artificial intelligence framework, and allows businesses to see how customers interact with voice assistants in order to improve the overall experience. For example, Wynn Las Vegas will soon add Alexa to every hotel room. With Adobe Analytics Cloud's new features, the Wynn can use the data to better anticipate guests' needs, like checking out or requesting the concierge, create personalized offers, and increase use of the Alexa integration overall.

The tool allows businesses to recognize users — for example, a food app can use the platform to create a seamless experience between its Alexa skill and its smartphone app. As with Wynn Las Vegas, the program also works when the company provides the voice assistant itself, from giving guests a pass phrase that unlocks a special offer to sharing details about a loyalty program.

“One of the most important trends in modern technology is how quickly consumers adopt new ways of interacting with content, as we’ve seen with mobile and video,” Bill Ingram, Adobe’s vice president for Experience Cloud, said in a statement. “We expect a similar trajectory with voice enabled devices. In the same way Adobe has shaped web, mobile and customer analytics, Adobe Analytics Cloud will enable brands of all sizes to extend voice data insights across the entire customer journey.”

Adobe Analytics Cloud is now compatible with Amazon Alexa, Apple Siri, Google Assistant, Microsoft Cortana, and Samsung Bixby. Sales of these devices have grown 39 percent year-over-year, Adobe says.

Analyzing voice interactions is more complex than picking up typed communication, Adobe says. By training the AI system to recognize both the intent ("play me a song") and the parameters ("from the Beatles"), Adobe was able to give Experience Cloud the ability to analyze customer interactions with voice tools. Combined with its text-based analytics, Adobe is providing the kind of deep correlation results (with 95 percent accuracy) that once took data scientists months to produce, Morris said.
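Adobe has not published the internals of its voice analysis, but the intent-plus-parameters split it describes is a common pattern in voice analytics. As a purely hypothetical sketch, an utterance like "play me a song from the Beatles" might be decomposed into an intent label and a dictionary of parameters (often called "slots"); the pattern names and function below are illustrative assumptions, not Adobe's API:

```python
import re

# Illustrative intent patterns; real systems use trained models, not regexes.
INTENT_PATTERNS = {
    "play_music": re.compile(
        r"^play (?:me )?(?:a )?song(?: from (?P<artist>.+))?$", re.IGNORECASE
    ),
    "check_out": re.compile(
        r"^check (?:me )?out(?: of (?P<room>.+))?$", re.IGNORECASE
    ),
}


def parse_utterance(utterance: str):
    """Return (intent, parameters) for a recognized utterance, else (None, {})."""
    text = utterance.strip().rstrip(".?!")
    for intent, pattern in INTENT_PATTERNS.items():
        match = pattern.match(text)
        if match:
            # Keep only the parameters the utterance actually supplied.
            params = {k: v for k, v in match.groupdict().items() if v}
            return intent, params
    return None, {}


print(parse_utterance("Play me a song from the Beatles"))
# → ('play_music', {'artist': 'the Beatles'})
```

Once each utterance is reduced to structured (intent, parameters) pairs, they can be aggregated and correlated like any other analytics event, which is what makes the voice data comparable to text-based metrics.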

“It took time and didn’t scale very well,” Morris said. Now, with the inclusion of voice analysis, Adobe frees data scientists to do more important work, while laypeople can run their own search queries and get results that used to require painstaking hours of labor.

“It’s the democratization of data,” Morris added.

For consumers, the thought of voice assistants capturing and analyzing our data may seem like an invasion of privacy, but Morris said privacy is important to Adobe, and that users can choose to opt out and be forgotten.

Voice analysis now joins Adobe Analytics Cloud's other data analysis tools. The software, part of Adobe Experience Cloud, gives businesses marketing and customer analytics in a single location.
