Eye on AI - November 29th, 2021
Welcome to Aigora's "Eye on AI" series, where we round up exciting news at the intersection of consumer science and artificial intelligence!
This week, our focus turns to the food personalization race as we break down Alexa’s new “What to Eat” feature and Snapchat’s Food Scan, which it hopes will become the ‘Shazam’ for food.
Amazon Alexa Expands Personalized Food Recommendation Capabilities
We begin with Amazon Alexa news. Last week Amazon announced the launch of a new Alexa meal recommendation feature called “What to Eat”, described in a recent article in The Spoon, which personalizes recommendations for restaurants, recipes, prepared items, and more.
On the surface, the move seems insignificant; the What to Eat feature is, after all, only an extension of Alexa’s “What’s for Dinner” meal personalization feature released earlier this year. But where What’s for Dinner recommends recipes based on past purchase behavior, What to Eat includes recommendations based on dietary preferences and restrictions.
“This evolution of Alexa’s meal personalization capabilities gives Amazon monetization opportunities through a user filling up their e-commerce basket with ingredients via a shoppable recipe, selling prepared foods from Amazon Fresh or Whole Foods, or by gathering a spiff for a restaurant recommendation,” writes The Spoon contributor Michael Wolf. “While not all of these opportunities are created equal – Amazon obviously gets a bigger share of the spend when customers add a recipe to their Amazon Fresh basket as compared to when a user eats out at a local restaurant – What to Eat entrenches Amazon deeper into the decision-making process of the consumer.”
Personalized recommendations based on past conversions have become ubiquitous in digital marketing. Netflix recommendations are built on this idea, as are Spotify’s, Facebook’s, and countless retail recommendations (who hasn’t seen some version of the “you may also like” suggestion?). What to Eat goes one step further: it requires a higher degree of engagement to be used effectively (e.g., sharing dietary restrictions). And because Alexa can be attached to virtually every device, from refrigerators to ovens and even cars, and connects to data across most of Amazon’s food-based subsidiaries, users are much more likely to engage with it.
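To make the idea concrete, here is a minimal sketch of preference-aware recommendation: dietary restrictions act as a hard filter, and purchase history drives the ranking. This is a toy illustration only, not Amazon's actual system; all names and data are hypothetical.

```python
def recommend(candidates, purchase_history, dietary_restrictions):
    """Drop recipes that violate the user's dietary restrictions,
    then rank the rest by overlap with past purchases."""
    allowed = [
        r for r in candidates
        if not dietary_restrictions & set(r["ingredients"])
    ]
    # More previously purchased ingredients -> higher rank.
    return sorted(
        allowed,
        key=lambda r: len(set(r["ingredients"]) & purchase_history),
        reverse=True,
    )

# Hypothetical example data.
candidates = [
    {"name": "Shrimp pasta", "ingredients": ["shrimp", "pasta", "garlic"]},
    {"name": "Veggie stir-fry", "ingredients": ["tofu", "broccoli", "rice"]},
]
history = {"rice", "broccoli", "garlic"}
restrictions = {"shrimp"}  # e.g., a shellfish allergy

print([r["name"] for r in recommend(candidates, history, restrictions)])
# ['Veggie stir-fry']
```

The key design point mirrors the What’s for Dinner → What to Eat evolution: purchase history alone ranks everything, while declared restrictions let the system exclude options outright.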
“After helping to create the category in 2015, Amazon continues to be the runaway leader in the US smart speaker market share, logging 69% of all installed speakers as of mid-2021,” continues Wolf. “A good chunk of those smart assistants reside in the kitchen where users often will ask for recommendations, add things to a shopping list, and more. All that activity enables Amazon to profile us and, now, make money at every step in the meal journey.”
Personalization + availability = higher user engagement. Higher user engagement means more first-party data, better personalization, and more conversions. Amazon’s data collection ability through Alexa was already light-years ahead of the competition. The addition of What to Eat to Alexa’s growing recommendation arsenal means Amazon is one step closer to cornering the food recommendation market.
Snapchat Creates Shazam-Style Food Identifier
To finish, let’s look at another food personalization announcement: Snapchat’s Food Scan, which uses image recognition to identify foods, much as the Shazam app identifies songs, then takes the idea one step further by supplementing identification with recipe recommendations.
“Snapchat’s AI will process the image and suggest a recipe from partner Allrecipes, as well as serve up other information, such as a Wikipedia page, about the item,” writes Wolf. “According to Snapchat, the feature has access to over 4 thousand recipes and can process up to 1500 ingredients.”
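The general pattern behind such features is simple: classify the image to a food label, then look that label up in a recipe (and reference) catalog. The sketch below shows only this pattern, not Snapchat’s actual Food Scan pipeline; the classifier is a stand-in and all names and data are hypothetical.

```python
# Hypothetical label -> recipe catalog (toy data).
RECIPES = {
    "tomato": ["Tomato soup", "Bruschetta"],
    "avocado": ["Guacamole"],
}

def classify_food(image_bytes):
    """Stand-in for an image classifier. A real system would run a
    trained vision model on the pixels and return a food label."""
    return "tomato"  # hardcoded for illustration

def food_scan(image_bytes):
    """Identify the food in an image, then attach recipe suggestions."""
    label = classify_food(image_bytes)
    return {"label": label, "recipes": RECIPES.get(label, [])}

print(food_scan(b"...")["recipes"])
# ['Tomato soup', 'Bruschetta']
```

In production the hard part is the classifier (distinguishing the ~1500 ingredients Wolf mentions); the recipe lookup and monetization layer on top is comparatively straightforward.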
Snapchat isn’t the only company hoping to develop the Shazam for food. Whirlpool, Microsoft, Google, Pinterest, and others have been attempting similar photo-based recipe recommendations for years, and no one has quite perfected it. Thus far, neither has Snapchat: Wolf notes a few Food Scan errors during his evaluation. But Food Scan clearly has its sights set on building revenue through shoppable recipes and recommendations. In time, it could become a broader food-related augmented reality recommendation tool that suggests restaurants, meal kits, and other monetizable options, much like Alexa’s What to Eat, and it just might have the user base (i.e., data) to compete with Amazon.
That's it for now. If you'd like to receive email updates from Aigora, including weekly video recaps of our blog activity, click on the button below to join our email list. Thanks for stopping by!