Amazon wants to minimize the impediments its customers face in using its services. A new tool for Alexa skills, described on Amazon's developer blog, enables the smart voice assistant to answer customers' natural-language questions. Soon users won't need to phrase queries in the prescribed "Alexa-speak."
The company’s goal is to make Alexa friction-free. Similar to one-click ordering, Amazon Prime, and Amazon Go, removing barriers to customer interaction with Alexa will encourage more engagement.
Once a sufficient proportion of developers implement the new CanFulfillIntentRequest interface in their skills, you'll be less likely to hear Alexa say, "Hmmm, I don't know that one."
The CanFulfillIntentRequest interface reports a skill's ability to fulfill specific categories of requests. Alexa combines that information from skills with a machine-learning model to choose the best skill for each customer request.
When Alexa can discover, enable, and launch new skills as required, customers will be able to ask questions in natural language with sufficient context but without calling for a specific skill.
Amazon used the following question to explain the new function:
For example, if a customer asks, “Alexa, where is the best surfing today near Santa Barbara?” Alexa can use CanFulfillIntentRequest to ask surfing skills whether they can understand and fulfill the request. A surfing skill with a database of California beaches might be able to both understand and fulfill the request, while one with a database of Hawaiian beaches might only be able to understand it. Based on these responses, Alexa would invoke the skill with the database of California beaches for the customer.
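To make the exchange concrete, here is a hedged sketch in Python of how a surfing skill might answer a CanFulfillIntentRequest like the one in Amazon's example. The beach list, request shape, and helper names are illustrative assumptions, not Amazon's SDK; the response envelope follows the general canFulfillIntent JSON structure described for the interface.

```python
# Illustrative sketch only: a California surfing skill deciding whether it
# can understand and fulfill a request. The beach data and function names
# are hypothetical; the nested response dict approximates the
# canFulfillIntent envelope a skill returns to Alexa.

CALIFORNIA_BEACHES = {"santa barbara", "malibu", "huntington beach"}

def can_fulfill_response(can_fulfill, slots):
    """Wrap the verdict in the JSON-style envelope Alexa expects."""
    return {
        "version": "1.0",
        "response": {
            "canFulfillIntent": {
                "canFulfill": can_fulfill,
                "slots": slots,
            }
        },
    }

def handle_can_fulfill(request):
    """Report whether this skill can understand and fulfill the request."""
    slots = request.get("intent", {}).get("slots", {})
    location = slots.get("Location", {}).get("value", "").lower()
    if not location:
        # The skill understands surfing questions in general but cannot
        # commit to an answer without a place, so it hedges with MAYBE.
        return can_fulfill_response("MAYBE", {})
    if location in CALIFORNIA_BEACHES:
        slot_verdict = {"Location": {"canUnderstand": "YES", "canFulfill": "YES"}}
        return can_fulfill_response("YES", slot_verdict)
    # A Hawaii beach, say: the slot is understood but not answerable here,
    # mirroring the second skill in Amazon's example.
    slot_verdict = {"Location": {"canUnderstand": "YES", "canFulfill": "NO"}}
    return can_fulfill_response("NO", slot_verdict)
```

Given a request whose `Location` slot is "Santa Barbara", the sketch returns a `canFulfill` of "YES", while "Waikiki" yields "NO"; Alexa would then route the customer to the skill that answered "YES".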
Alexa-fluent users who already command Alexa to employ specific skills, routines, or custom commands to find answers or perform tasks won’t need to unlearn the structures. In time, however, most people will likely speak to Alexa in more natural language.
In an earlier post on the developer blog, Ruhi Sarikaya, director of applied science in Amazon's Alexa Machine Learning division, wrote about other capabilities the company is working on to make it easier for customers to use and engage with Alexa.
Context carryover and a new memory feature are on the Alexa roadmap, according to Sarikaya. Context carryover will enable Alexa to understand customer intent across subject domains. For example, "Alexa, how's the weather in Portland?" followed by "How long does it take to get there?" moves from the weather domain to the traffic domain. Today you would have to specify Portland in the second request. With context carryover, you won't.
When Alexa has memory, which will be added first in the U.S., Alexa will "remember" any arbitrary information on command. For example, if you say, "Alexa, remember that Sean's birthday is June 20th," Alexa will reply, "Okay, I'll remember that Sean's birthday is June 20th."
These additional functions and features may strike you as big deals or as small steps. Referring to the new Alexa capabilities, Sarikaya wrote, “We’re on a multi-year journey to fundamentally change human-computer interaction, and as we like to say at Amazon, it’s still Day 1.”