Amazon is making a number of changes to the Alexa user interface, all with the same goal in mind: making the virtual assistant easier to use. Most notable is a change to how Alexa handles Routines, which developers can now create and recommend to users rather than requiring you to build your own automations manually. Alexa is also starting to coexist with third-party assistants, and Amazon is working to make the most important commands, like "Stop!", work no matter which wake word you use.
Amazon made these announcements during its Alexa Live developer event, where the company introduced a host of other new Alexa features, mostly aimed at developers: they can add purchases to their skills, support Matter and other smart home systems more easily, plug into an easier setup process, and build skills that better understand their surroundings.
But Amazon knows that none of Alexa's shiny new features matter much if you can't find them or figure out how to use them. And rather than creating new user interfaces or clever voice menus, the Alexa team is increasingly leaning toward letting the system do the work for you. "We want to make automation and proactivity accessible to everyone who interacts with Alexa and Alexa-enabled devices because it's so exciting," says Aaron Rubenson, VP of the Alexa team.
The change to Routines is the most obvious example among the new announcements. Users can still set up their own routines: "when I say I'm leaving, make sure the stove is off and turn off the lights" and the like. But developers can now embed routines into their skills and offer them to users based on their activity. "As an example," Rubenson says, "Jaguar Land Rover uses the Alexa Routines Kit to perform a procedure they call 'Good Night,' which will make sure the car is locked, remind customers of the charge level or fuel level, and then also turn on guard mode." It's the kind of routine a lot of people might like but few would bother to build for themselves; now they just have to turn it on.
Rubenson says the people who use Routines are some of Alexa's most die-hard and consistent users, and he wants those people to keep the tools they need to create their weirdest and wildest automations. "But we also understand that not everyone will take this step," he says. As Alexa continues to struggle to keep users engaged, adding some proactivity to Routines could make them useful to more people.
Voice assistants have always presented a difficult user-interface problem: they don't offer a grid of buttons or icons, just a blank slate you can talk (or shout) at. Over time, the Alexa team has chipped away at these frictions, essentially trying to make it impossible to say the wrong thing. That's part of the thinking behind its multi-assistant support, which lets developers put their own virtual assistant alongside Alexa on a device. (Amazon's newest partner is Skullcandy, so you can talk to your headphones by saying either "Alexa" or "Hey Skullcandy.")
In the same vein, Amazon is also working on a feature called Universal Commands, which allows an Alexa-powered device to perform certain important actions no matter which wake word you used. For example: if you say "Hey Skullcandy, set a timer for 10 minutes" and the Skullcandy assistant can't do it but Alexa can, Alexa can handle it automatically. Rubenson cited timers and call rejection as the kind of essential actions any Alexa-enabled device should be able to handle, even if you weren't addressing Alexa directly. The feature, according to Rubenson, will arrive next year.
Of course, developers will have to implement and use these features for them to catch on. Amazon is going out of its way to encourage them: it is changing its revenue-sharing agreement so developers keep 80 percent of their revenue instead of 70 percent, and it is launching the Skill Developer Accelerator Program, which Rubenson says "will reward developers for taking the actions which we know lead to the creation of high-quality and engaging skills based on our entire history." In other words: Amazon is paying developers to improve their skills.
If Amazon can make all of this work, it will take a step toward solving one of the big problems with voice assistants: it's hard to know what they can do, which is why most users default to music, lights, and timers, which means developers have little reason to invest in the platform, which means users have nothing new to do. By simultaneously making the platform more powerful and having it do more of the work on users' behalf, Amazon could spin this flywheel in the other direction — without users even having to ask.