By Link Walls, ChannelAdvisor

Digital Assistants and the Rise of Voice Search: Shopping via Voice (Near Term)

In our last post, we reviewed the various devices and apps in the digital assistant market. Anyone who has tried to complete a fairly routine purchase with one of these devices has likely found the experience lacking compared to a typical e-commerce purchase. This post discusses what we see as the near-term shopping experiences that digital assistants are enabling. Much like the early days of mobile, many of the necessary pieces are still missing, yet the foundations of a more robust future shopping experience are being laid.

A fairly typical exchange with Alexa today sounds something like this:

Shopper: “Alexa, what headphones are on sale?”

Alexa: “OK, I found ________, do you want to buy?”

The response of “[state offer], do you want to buy?” seems equivalent to “Nice to meet you — want to get married?” Clearly, buying a new pair of headphones is hardly the commitment of picking a life partner, but it feels…too soon. The natural steps of shopping, whether in store (picking up an item, feeling the material, potentially trying it on) or online (reading reviews, looking at different images, reading the description), are all absent.

In Part 4 of this series, we’ll dive into our recommendations for how retailers can most effectively tackle voice search.   

Shopping List: The Next Battlefront

“I could not find a good match for ‘milk’. Should I add it to your shopping list?”

One of the earliest descriptions of search engines like Google was that they represented the “database of intentions.” The idea was that because Google (or Bing, or any other search engine) captures queries phrased in the user’s own words, those queries carry an enormous amount of information about a user’s intent. This is true across all topics, and Google knows a great deal about many thoughts we probably aren’t willing to share publicly (“should I adopt?,” “new job in Perth,” “early warning signs of skin cancer”), as well as much more mundane commerce queries (“new Beats headphones in red,” “Tom Brady jersey,” “mens jeans”).

In today’s search world, these commerce-related queries have no centralised repository for a user. The closest thing would be an Amazon Wish List, but even that typically doesn’t capture everything a user needs or wants to purchase. That brings us to the shopping list tied to digital assistants. Today, the shopping list is fairly modest and not all that different from any checklist you could create in a dozen other mobile apps:

[Image: Shopping list as it appears on mobile]

However, this shopping list is a gold mine of user intent, particularly in grocery, that the digital assistants can access. We believe these lists will evolve significantly over the coming years and represent a key advertising opportunity for brands and retailers, and we expect their UX to change dramatically to accommodate brand advertising. For instance, if I add “provolone” to my shopping list, it is only natural that Kraft would want to buy advertising there, knowing both the identity and the intent of the user. Advertising continues to get smarter as advertisers gain a better idea of who they are targeting and of which consumers are in-market and ready to purchase.
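To make that concrete, here is a minimal sketch of how list-item ad placement might work, assuming a hypothetical inventory of brand bids keyed by generic product terms (all names and bids below are illustrative, not real data):

```python
# Illustrative only: a toy matcher pairing generic shopping-list terms
# with brands that might bid to advertise against them.

BRAND_BIDS = {
    "provolone": ["Kraft", "Sargento"],   # hypothetical bids
    "paper towels": ["Bounty"],
}

def ads_for_list_item(item: str) -> list:
    """Return brands bidding on a generic shopping-list term."""
    return BRAND_BIDS.get(item.lower().strip(), [])

print(ads_for_list_item("Provolone"))  # ['Kraft', 'Sargento']
```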

Each item on a shopping list has several associated attributes. Thinking through these gives clues to where promotional options exist (see the sketch after this list), such as:

  • Date added
  • # of days since added to the shopping list
  • Brand qualifier (“Kraft provolone cheese” or just “provolone”)
  • Descriptive characteristics (size, colour, etc.)
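As one way to picture these attributes together, here is a hedged sketch of how such an item record might be modelled. The field names are our own invention for illustration, not any assistant’s actual schema:

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class ShoppingListItem:
    """Hypothetical record for one shopping-list entry."""
    name: str                       # e.g. "provolone"
    date_added: date                # when the user added it
    brand: Optional[str] = None     # "Kraft" if the user qualified it
    attributes: dict = field(default_factory=dict)  # size, colour, etc.

    def days_on_list(self, today: Optional[date] = None) -> int:
        """Days since the item was added: a derived attribute."""
        return ((today or date.today()) - self.date_added).days
```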

Opportunities exist to mine this data both at an individual line-item level (e.g., “you have provolone cheese on your list, and there is a promotion that ends today at Site X”) and at an aggregate level. For example, grocery items that stay on the list for several days (especially essentials) suggest a busy week in which the consumer hasn’t had time to get to the store. It is not hard to imagine a digital assistant initiating this conversation:

“I noticed you have five items of food that have been on your list since Monday. Would you like me to combine them into an AmazonFresh order to be delivered this evening?”
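The rule behind a prompt like that could be as simple as a threshold on days-on-list. A minimal sketch, reusing the hypothetical ShoppingListItem from the earlier example (the threshold is ours, not a real product setting):

```python
STALE_AFTER_DAYS = 4  # illustrative threshold

def suggest_consolidated_order(items):
    """Propose bundling long-lingering grocery items into one delivery.

    `items` are ShoppingListItem records from the sketch above.
    """
    stale = [i for i in items if i.days_on_list() >= STALE_AFTER_DAYS]
    if len(stale) < 2:
        return None  # not enough signal to interrupt the user
    return (f"I noticed you have {len(stale)} items that have been on your "
            "list for a while. Would you like me to combine them into a "
            "grocery delivery order for this evening?")
```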

This also applies to what we call predictive commerce, the easiest pattern being replenishable products (paper towels, milk, makeup, etc.). As more data and purchase history funnels through the assistant, it can make more intelligent decisions. For example, it can reorder paper towels for you on Wednesday because it knows you will run out by Thursday (even if you don’t know that yet).
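A naive version of that prediction simply averages the gaps between past purchases. A sketch under that assumption (a real system would weight seasonality, household size, and so on):

```python
from datetime import date, timedelta

def predicted_runout(purchase_dates):
    """Estimate when a replenishable product runs out, assuming the
    average gap between past purchases approximates consumption rate."""
    if len(purchase_dates) < 2:
        return None  # not enough history to estimate
    dates = sorted(purchase_dates)
    gaps = [(b - a).days for a, b in zip(dates, dates[1:])]
    avg_gap = sum(gaps) / len(gaps)
    return dates[-1] + timedelta(days=round(avg_gap))

# e.g. paper towels bought every ~14 days -> reorder a day or two early
history = [date(2017, 6, 1), date(2017, 6, 15), date(2017, 6, 29)]
print(predicted_runout(history))  # 2017-07-13
```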

We expect that the shopping list will very quickly evolve beyond its very basic layout to something more like this:

[Image: shopping list v2]

The shopping list is very well suited to shopping that is all about completing a task. For discovery-based purchases that are more about inspiration, however, it really doesn’t work well.

Why Screens?

All too often, new technology is evaluated through the lens of one technology replacing another. While that is true of software versions (obviously, none of you are still rocking Windows 95), it is less true of hardware and, specifically, of entry points to the internet.

The computer was the first entry point to the internet for nearly everyone, but it certainly hasn’t disappeared with the rise of smartphones. Instead, consumers have continued to add new connection points and optimise their use across them. Just watch someone use a laptop and then glance at their phone: most likely whatever was on the phone was also accessible from the laptop, yet that person chose the device best suited to the specific task.

When we think about voice search and digital assistants, it is too narrow to treat them as the ONLY devices that get used, particularly for shopping. That framing assumes usage models are much further developed than they really are. If we instead think about how the experience can change using ALL the devices consumers have at their disposal, many new usage models emerge. This is where understanding the various platforms, and the assets each offers the consumer, becomes important.

For example, imagine you can ask your digital assistant for a commerce-related query like this one:

“OK, Google, show me what kids clothes are on sale today at Target.”

Google Home: “OK, I found 14 items on sale at Target – would you like me to read them to you or send them to your phone?”

From here, it isn’t much of a stretch to imagine the phone app (in this case, the Google Home app) displaying the 14 products in something like a PLA carousel view. Alternatively, it could offer to display the results on a connected TV. The result leverages the best of both worlds: voice to answer my question and a screen to help me understand the results visually.
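One way to picture the handoff is as a structured response the assistant can either read aloud or push to a companion screen. A sketch with an invented payload shape (no real Google Home API is implied, and the field names are ours):

```python
# Illustrative only: routing one result set to voice or to a companion screen.

def route_results(results, target):
    """Package search results for voice readout or a screen 'carousel'."""
    if target == "voice":
        summary = "; ".join(r["title"] for r in results[:3])
        return {"mode": "speak", "text": f"The top items are: {summary}"}
    # "phone" or "tv": send the full set for a PLA-style carousel view
    return {"mode": "display", "device": target, "items": results}

sale_items = [{"title": "Kids' t-shirt", "price": 6.00},
              {"title": "Toddler jeans", "price": 12.00}]
print(route_results(sale_items, "phone"))
```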

Enter the Echo Show. (Amazon has not yet provided a release date for the Echo Show in Australia, but we will likely receive some indication once Amazon launches here.) The Echo Show launched in late June in the United States and is the first digital assistant with a built-in screen. The primary use case promoted by Amazon is a FaceTime-like ability to drop in and video chat easily with others in the Echo ecosystem, or to watch videos and video feeds from devices like a baby monitor or video doorbell. One capability that already exists outside of the Show, browsing Amazon search results, gains visual verification on the device, which takes a lot of the clunkiness out of the voice search experience.