Yesterday, Google released an update to the Google Assistant on Android that lets it read the contents of a web page out loud to you.
All you need to do is say, “Hey Google, read it” or “Hey Google, read this page” to your Google Assistant, and it will immediately read aloud the content of the web page. Your browser will automatically scroll the page and highlight words as they’re read aloud. You can also alter the reading speed and choose from multiple voices (Lime, Jungle, Royal, and Sapphire).
Website owners don't have to make any changes to their existing web page structure, while those who do not want their web pages read aloud like this can use the nopagereadaloud meta tag.
Additionally, developers can add the ability for the Google Assistant to read aloud content in their Android mobile apps using ‘Read It’ Actions.
Read aloud content in your Android mobile app
Android 6.0 Marshmallow introduced a new way for users to engage with apps through the Google Assistant. The Assistant is a top-level window that users can view to obtain contextually relevant actions for the current activity.
‘Read It’ is a Google Assistant feature available on Android devices that offers another way for users to consume long-form web content like news articles and blog posts by having it read aloud.
Android apps with web-based content can support Read It by providing information to the Assistant using the onProvideAssistContent() method. Android provides this contextual content to the Assistant based on what’s displayed on-screen, so users get access to more ways of engaging with your content without shifting contexts.
This process helps maintain the structure of your data as it’s shared with the Assistant. Users who receive shared app content can then be deep linked into your app or receive the content directly, instead of as plain text or a screenshot.
We recommend you implement onProvideAssistContent() for any web-based content and any sharable entity in your app.
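As a minimal sketch (the activity name below is hypothetical, not from the source), supporting Read It comes down to overriding this standard Activity callback in the screen that displays your web-based content:

```kotlin
import android.app.assist.AssistContent
import androidx.appcompat.app.AppCompatActivity

// Hypothetical activity that displays web-based content, e.g. in a WebView or Custom Tab.
class ArticleActivity : AppCompatActivity() {

    // Called by the system when the user invokes the Assistant on this screen.
    override fun onProvideAssistContent(outContent: AssistContent) {
        super.onProvideAssistContent(outContent)
        // Populate outContent with a web URI and structured data here;
        // a fuller example follows in the next section.
    }
}
```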
Providing content to the Google Assistant
For Read It to access your content, your app must provide the Assistant with information about the content, like its web URI and some basic context. The Assistant can then retrieve your content to be read out loud to the user. For Android apps that already implement web-based content using WebViews or Chrome Custom Tabs, we recommend using the same web URIs for Read It as a starting point.
When combining Read It functionality with built-in intents, you only need to implement onProvideAssistContent() for the final app activity in the user’s task flow after invoking the App Action.
Provide a web URI for your content in the uri field of AssistContent. Provide contextual information as a JSON-LD object using schema.org vocabulary in the structuredData field.
The following code snippet shows an example of providing content to the Assistant:
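Here is a sketch of what that might look like, assuming a hypothetical recipe app (the class name, URL, and schema.org type are illustrative, not taken from the source):

```kotlin
import android.app.assist.AssistContent
import android.net.Uri
import androidx.appcompat.app.AppCompatActivity
import org.json.JSONObject

// Hypothetical activity showing a single recipe page.
class RecipeActivity : AppCompatActivity() {

    override fun onProvideAssistContent(outContent: AssistContent) {
        super.onProvideAssistContent(outContent)

        // Web URI of the on-screen content, provided in the uri field of AssistContent.
        outContent.webUri = Uri.parse("https://example.com/recipes/apple-pie")

        // Contextual information as a JSON-LD object using schema.org vocabulary,
        // provided in the structuredData field.
        outContent.structuredData = JSONObject()
            .put("@type", "Recipe")
            .put("name", "Apple Pie")
            .put("url", "https://example.com/recipes/apple-pie")
            .toString()
    }
}
```

The structured data in this sketch includes the required fields described below, so the Assistant can identify and retrieve the content.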
When implementing onProvideAssistContent(), it is highly recommended that you provide as much data as possible about each entity. However, the following fields are required: @type, .name, and .url (the latter only if the content is URL-addressable).
To learn more about using onProvideAssistContent(), see the Optimizing Contextual Content for the Assistant guide in the Android developer documentation.