At an Amazon conference last week, the company introduced Alexa Conversations, a module of the Alexa Skills Kit that ties Alexa voice apps together into experiences that help users complete complex tasks.
Alexa Conversations is perhaps Amazon's most intriguing and consequential announcement in recent years. Conversations lets developers create skills with fewer lines of code. It also eliminates the need to anticipate the many different ways a person can phrase a request, since a recurrent neural network automatically generates the conversation flow.
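To make the "conversation flow" idea concrete, here is a minimal sketch of the kind of hand-authored dialog flow developers have traditionally had to write out branch by branch. The skill, slots, and prompts are hypothetical, and this is not Amazon's API; Alexa Conversations aims to generate this branching automatically from example dialogs instead.

```python
# Toy, hand-authored dialog flow for a hypothetical ticket-booking skill.
# Each state names the prompt to speak, the slot to collect, and the next
# state; this explicit wiring is what Conversations aims to replace.
FLOW = {
    "start": {"ask": "Which movie?",      "slot": "movie", "next": "when"},
    "when":  {"ask": "What time?",        "slot": "time",  "next": "seats"},
    "seats": {"ask": "How many tickets?", "slot": "seats", "next": "done"},
}

def run_dialog(answers):
    """Walk the flow from 'start', filling each slot from the user's answers."""
    state, collected = "start", {}
    while state != "done":
        step = FLOW[state]
        collected[step["slot"]] = answers[step["slot"]]
        state = step["next"]
    return collected

booking = run_dialog({"movie": "Dune", "time": "7pm", "seats": 2})
print(booking)
```

Every new question, re-phrasing, or out-of-order answer multiplies the states a developer must wire up by hand, which is why generating the flow from examples cuts so many lines of code.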
For users, Alexa Conversations should make it easier to perform tasks that span multiple skills and reduce the number of interactions needed to complete tasks such as booking a movie ticket or ordering food.
Amazon devices chief David Limp views Conversations as a big step forward. "This is kind of the holy grail of voice science: how to have a chained conversation without it being thought through in a programmed way from beginning to end. (…) I think a year or two ago I would have said we didn't see our way out of that tunnel, but now I think the science is showing us that, even if it takes us years, conversations will get longer and longer. (…) This breakthrough is very important for us, the tip of the iceberg," Limp said.
A night out begins with Conversations
Alexa Conversations' journey begins with a night-out scenario. During a live demonstration at re:MARS last week, a woman bought a movie ticket, ordered dinner, and booked a ride in about a minute. (Atom Tickets, Uber, and OpenTable are Alexa Conversations' first partners.)
The night-out scenario is the first of what Amazon believes will be a collection of experiences grouped around achieving a goal.
One day, Conversations could power more complex tasks, such as the weekend-trip-planning scenario Limp presented last fall. Limp's holy grail remark refers to a transformation every major tech company with an AI assistant is attempting: evolving voice interfaces from assistants that perform basic tasks into assistants that can carry out complex, multi-step ones.
Two years ago, at an event with current and former leaders of Alexa, Google Assistant, Siri, and Cortana, Adam Cheyer, cofounder of Viv Labs and one of the creators of Siri, a man who has thought about the future of voice assistants since the 1990s, wondered aloud about that future with a scenario in which an assistant helps you plan your sister's wedding. (Viv was acquired by Samsung to improve its Bixby AI assistant.)
At the event, Cheyer explained how important it is to connect the top AI assistants' services with an ecosystem of third-party voice apps. "I don't want to have to remember what the car assistant can do, what the television system can do, Alexa vs. Cortana … that's too much. I hope the assistant on every device will be able to access every service, with no distinction between what is first party and what is third party," Cheyer said.
Amazon has been working in this direction, starting by reducing the number of interactions needed to engage with Alexa. Since last fall, users have been able to carry out several interactions without saying the word "Alexa" more than once. With Conversations, the number of interactions required to complete the night-out scenario drops from about 40 to roughly a dozen back-and-forth exchanges.
To reinforce the notion that Alexa is capable of natural communication, the AI assistant has learned to whisper when a person whispers, and it can now respond to name-free skill invocations. That means you can say "get me a ride" instead of first launching the skill with "Alexa, open Uber."
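Name-free invocation means the assistant must pick a likely skill from the utterance alone. The sketch below shows the idea with a toy keyword-overlap router; the hint table is made up for illustration, and Amazon's actual routing uses learned models, not keyword matching.

```python
# Illustrative sketch of name-free skill invocation: map an utterance to a
# likely skill without the user naming it. Hint words are hypothetical.
SKILL_HINTS = {
    "Uber": {"ride", "car", "drive"},
    "OpenTable": {"table", "dinner", "reservation"},
    "Atom Tickets": {"movie", "ticket", "cinema"},
}

def route(utterance):
    """Return the skill whose hint words best overlap the utterance, or None."""
    words = set(utterance.lower().split())
    best = max(SKILL_HINTS, key=lambda s: len(SKILL_HINTS[s] & words))
    return best if SKILL_HINTS[best] & words else None

print(route("get me a ride"))          # resolves to the ride-hailing skill
print(route("book a table for dinner"))
```

A real system would also weigh context and user history, which is exactly the ambiguity that makes name-free invocation harder than explicit "open Uber" launching.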
Creating a perception of intelligence
Amazon is not the only company trying to give its assistant the kind of back-and-forth conversation you would expect from another person. Alexa Conversations also gives Amazon's AI assistant the ability to complete transactions, much like Google's Duplex, whose web version debuted last month. Microsoft is also bringing similar capabilities to workplace assistants through a startup it acquired in 2018.
This underscores the fact that more complex tasks require more than a single exchange, said Alexa AI product lead Sanju Pancholi. "When you start solving more complex problems, there is more information exchange, more decisions being made each time, and therefore many actions can occur in the context of the same conversation with different people," he said.
Pancholi held a re:MARS session to introduce Alexa Conversations to companies and developers, and spoke of an assistant able to "solve their needs for products and services as soon as they understand what those needs are."
To be considered smart, Amazon believes, an assistant must understand natural language, remember context, and make proactive predictions, characteristics that can prove an assistant is capable of handling more complex tasks. Eliminating the need for users to repeat themselves is also important.
"If you force (customers) to repeat information again and again, you teach them to believe they are talking to a stupid entity, and if that is the relationship you build with them from the very beginning, they will most likely never delegate higher-order tasks to you, because they will never think you are capable of solving higher-order problems for them," he said.
According to Pancholi, the Alexa Skills store now has more than 90,000 skills, and 325,000 developers have used the Alexa Skills Kit. Alexa is now available on 100 million devices.
Pancholi told developers that potential next steps for Alexa Conversations scenarios could include skill groupings that help people watch content at home, get food delivered, or buy a gift.
In an interview with VentureBeat, Alexa chief scientist Rohit Prasad declined to share details about the use cases under consideration, but said they may include ways of planning weekends. Prasad, who leads Alexa AI initiatives such as language understanding and emotional intelligence, said Conversations is designed to knit the voice ecosystem together and increase engagement with Alexa skills.
"The value proposition for developers is that you start to get more traffic and more discovery as we stitch skills together: for example, the fact that a night out now involves ordering a ride. So Uber and Lyft will see more traffic and more engagement from customers. And on top of that, skill discovery will happen naturally as part of this. So this is an important element of our value proposition here," he said.
Prasad said that voice app templates for personal and custom skills on Echo devices may also soon include Conversations. Custom skill sets for the home could, for example, help kids with multi-step exercises, assign chores, or count down to important dates.
Alexa's first proactive features, Hunches, which offers reminders about smart home events and actions, and Alexa Guard, which listens for the sound of breaking glass or a smoke alarm, launched last fall.
Conversations may also someday become an integral part of Amazon's voice assistant itself, rather than a module layered on top.
Brands and independent developers
In January 2018, Amazon negotiated with brands such as Procter & Gamble and Clorox on deals to promote their products to Alexa users.
Steve Rabuchin, vice president of Amazon Alexa, insists that companies and developers cannot pay for priority in the Alexa voice app recommendation system. Still, the Alexa voice app ecosystem may face another problem: because voice apps often operate without a display, only a handful of skills can be surfaced at once, meaning some skills will inevitably be left out or buried.
This is especially important for voice apps. Unlike searching for apps on a smartphone, Alexa's voice recommendation tool serves up only three skills at a time.
"Our vision is not to end up in a place where only the biggest or most popular brands win," Rabuchin said in an interview with VentureBeat. "Our most popular skills are mostly from independent developers, individual developers."
Amazon's skill recommendation engine responds when you say things like "Alexa, get me a ride," recommending voice apps based on metrics such as engagement levels, which Amazon began paying developers for in 2017.
Conversations will take into account skill quality indicators such as user ratings and engagement levels; factors such as regional relevance, whether a skill works on a smart display, and personal preferences may also determine which skills surface during an Alexa conversation.
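Combining those signals amounts to some form of weighted ranking. The sketch below is a deliberately simple, hypothetical version: the weights, the skills, and the linear formula are all made up for illustration, since Amazon has not published how its recommender actually scores skills.

```python
# Hedged sketch of skill ranking over the signals named above (ratings,
# engagement, regional relevance, display support). Weights and skills
# are invented for illustration only.
WEIGHTS = {"rating": 0.4, "engagement": 0.4, "regional": 0.1, "display": 0.1}

def score(skill):
    """Weighted sum of a skill's normalized (0-1) quality signals."""
    return sum(WEIGHTS[k] * skill[k] for k in WEIGHTS)

skills = [
    {"name": "RideNow",  "rating": 0.9, "engagement": 0.8, "regional": 1.0, "display": 0.0},
    {"name": "CabQuick", "rating": 0.7, "engagement": 0.9, "regional": 0.0, "display": 1.0},
    {"name": "GoCar",    "rating": 0.5, "engagement": 0.4, "regional": 1.0, "display": 1.0},
]

# Alexa's voice recommender surfaces only three skills at a time, so rank
# the candidates and keep the top three.
top_three = sorted(skills, key=score, reverse=True)[:3]
print([s["name"] for s in top_three])
```

The three-slot cap is what makes the ranking consequential: with no screen to scroll, anything outside the top three is effectively invisible.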
"I think we have a good book to read from. I don't think it's a great book yet, but it's a great one to start with," Prasad said.