Google Assistant

Google Assistant App Actions brings App Control to Android

Voice control comes to Android apps

Google is making it possible to use the voice command “Hey Google” to not just open but also perform specific tasks within Android apps. The feature will be rolled out to all Google Assistant-enabled Android phones, allowing users to launch apps with their voice as well as search inside apps or perform specific tasks.

Google “App Actions” are similar to Alexa Skills. As Google describes it, developers can create “natural and engaging experiences for users by mapping their users’ intents to specific functionality within their apps.” You’ll be able to launch and search within apps, trigger app-specific commands, and create shortcuts for commonly used phrases.

So instead of specifically asking Google Assistant to retrieve the news on Twitter, you can change it to respond to, “Hey Google, what’s going on today?” Google says developers should be able to get their apps working with Google Assistant in less than a day and expects more developers to get on board in “the coming weeks.”

Assistant will also begin suggesting relevant App Actions even when the user doesn’t ask for the app by name. So if you ask Assistant about Taylor Swift, for example, your phone will “highlight a suggestion chip that will guide the user to open up the search result in Twitter.”


People do a lot more with their apps beyond simply opening and searching within them, and Google wants to extend voice commands to those frequent tasks, too. Now you can do things like play music, start a run, post on social media, order food, pay back a friend, or hail a taxi (the list goes on and on), all with just your voice.

Google says it has grown its catalogue to include more than 60 intents across 10 verticals, including Finance, Ridesharing, Food Ordering, Fitness and, now, Social, Games, Travel & Local, Productivity, Shopping and Communications, too.

Starting today, you can try doing more with your voice in more than 30 of the top apps on Google Play, available in English globally, with more apps coming.

Developers

Through the power of Google Assistant’s intent mapping and Natural Language Understanding (NLU), all you need is a few days to add a layer of voice commands, and users can jump to the activities in your app where engagement matters most.

The ability to perform tasks inside an app is implemented on the developer’s side by mapping users’ intents to specific functionality inside their apps. This feature allows users to open their favourite apps with a voice command — and, with the added functionality, lets users say “Hey Google” to search within the app or open specific app pages.

Starting today, you can use the GET_THING intent to search within apps and the OPEN_APP_FEATURE intent to open specific pages in apps, offering more ways to easily connect users to your app through Assistant.

It’s easy to implement all these common intents in your Android apps. You can simply declare support for these capabilities in your actions.xml file to get started. For searching, you can provide a deep link that will allow Assistant to pass a search term into your app. For opening pages, you can provide a deep link with the corresponding name for Assistant to match users’ requests.
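Under the App Actions schema described here, those declarations take roughly the following shape in actions.xml. This is a hedged sketch: the deep-link URLs and parameter names are placeholders, and the exact schema should be checked against Google’s App Actions documentation.

```xml
<?xml version="1.0" encoding="utf-8"?>
<!-- res/xml/actions.xml: a sketch; deep-link URLs are placeholders -->
<actions>
    <!-- In-app search: Assistant passes the spoken search term
         into the app via the {query} URL parameter -->
    <action intentName="actions.intent.GET_THING">
        <fulfillment urlTemplate="https://example.com/search{?query}">
            <parameter-mapping
                intentParameter="thing.name"
                urlParameter="query" />
        </fulfillment>
    </action>

    <!-- Open a named page or feature inside the app -->
    <action intentName="actions.intent.OPEN_APP_FEATURE">
        <fulfillment urlTemplate="https://example.com/feature{?feature}">
            <parameter-mapping
                intentParameter="feature"
                urlParameter="feature" />
        </fulfillment>
    </action>
</actions>
```

The file is then referenced from the app’s AndroidManifest.xml via a `<meta-data>` entry pointing at the XML resource, so Assistant can discover which intents the app supports.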

For a deeper integration, Google offers vertical-specific built-in intents (BIIs) that let Google take care of all the Natural Language Understanding (NLU) so you don’t have to.
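As an illustration of a vertical BII, a fitness app might declare support for a “start exercise” built-in intent along these lines. The intent and parameter names follow the App Actions BII catalogue as best understood here, and the deep link is a placeholder:

```xml
<!-- Sketch: handling a Fitness-vertical BII. Assistant's NLU extracts
     the exercise name from the user's query ("Hey Google, start a run
     on ExampleFit"), so no NLU code is needed in the app itself. -->
<action intentName="actions.intent.START_EXERCISE">
    <fulfillment urlTemplate="examplefit://exercise{?type}">
        <parameter-mapping
            intentParameter="exercise.name"
            urlParameter="type" />
    </fulfillment>
</action>
```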

Every app is unique with its own features and capabilities, which may not match the list of available App Actions built-in intents. For cases where there isn’t a built-in intent for your app functionality, you can instead create a custom intent. Like BIIs, custom intents follow the actions.xml schema and act as connection points between Assistant and your defined fulfilments.
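For example, a sports app with a “replay highlights” feature that no BII covers might define a custom intent roughly as below. The intent name and the string-array resource of example query phrasings are hypothetical names for illustration; per the actions.xml schema, custom intents supply their own query patterns since there is no built-in NLU model for them:

```xml
<!-- Sketch of a custom intent: REWATCH_HIGHLIGHTS and the referenced
     query-pattern resource are hypothetical names for illustration -->
<action
    intentName="custom.actions.intent.REWATCH_HIGHLIGHTS"
    queryPatterns="@array/RewatchHighlightsQueries">
    <fulfillment urlTemplate="examplesports://highlights{?team}">
        <parameter-mapping
            intentParameter="team"
            urlParameter="team" />
    </fulfillment>
</action>
```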

With more common, built-in, and custom intents available, every Android developer can now enable their app to fulfil Assistant queries tailored to exactly what their app offers. Developers can also use familiar tools such as Android Studio, and with just a few days of work they can easily integrate their Android apps with the Google Assistant.