John Ennis
Eye on AI - June 21, 2019
Welcome to Aigora's "Eye on AI" series, where we round up exciting news at the intersection of consumer science and artificial intelligence!

This week we shift gears and focus on news that sensory and consumer scientists can act on immediately, most notably developments in report preparation and data collection.
First, on the report-preparation front, Microsoft announced new automated features for PowerPoint. As a fan of PowerPoint's "Design Ideas" (Tool #5 in this post), I'm excited to use these new tools. As described in Business Insider, the four tools announced were:
Guidance in the creation and use of branded templates.
Theme recommendations based on styles and colors commonly employed by the user.
Suggestions for making numerical quantities more meaningful to the audience.
A new "Presenter Coach" (coming later this summer) that helps users practice giving their presentations.
Watch more from Microsoft below and enjoy this write-up in The Verge.
Next, on the sensory side of things, more progress has been made linking computer vision and touch - this time by MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) - with possible consequences for the tactile side of sensory science. According to Wevolver:
“By looking at the scene, our model can imagine the feeling of touching a flat surface or a sharp edge”, says Yunzhu Li, CSAIL PhD student and lead author on a new paper about the system. “By blindly touching around, our model can predict the interaction with the environment purely from tactile feelings. Bringing these two senses together could empower the robot and reduce the data we might need for tasks involving manipulating and grasping objects.”
As this technology matures, it will be interesting to watch how well predictive models of tactile experiences perform when based only on visual information. Such models, if viable, could speed up product testing and could also assist with packaging development by helping to create a more consistent user experience.
Pivoting to consumer research, QuenchTec announced the launch of "Chatbot Survey Builder," a step towards bridging the gap between quantitative and qualitative research. As we've discussed previously, this step is one of several involving natural language processing (NLP) that will allow for the collection of focus-group or even interview quality data, but at a much larger scale. If you're a consumer researcher, this space is worth keeping an eye on.
On the flip side of rich data collection, the New York Times published a fascinating exposé on location tracking through Bluetooth beacons. According to Michael Kwet:
"Most people aren’t aware they are being watched with beacons, but the “beacosystem” tracks millions of people every day. Beacons are placed at airports, malls, subways, buses, taxis, sporting arenas, gyms, hotels, hospitals, music festivals, cinemas and museums, and even on billboards."
Finally, for a comprehensive and at times amusing look at the current state of voice shopping, we wrap up with this article from Wired. I think the future of smart speakers holds much more than voice shopping (such as conducting automated, in-home consumer surveys), but it is remarkable that voice shopping is not further along given how prevalent smart speakers have become.
Other items of interest:
What happens when you try to steal food from a delivery robot.
Agri-researchers combining speed breeding with gene editing to turbocharge plant evolution.
And finally, two reviews of the use of blockchain to improve supply chain efficiency and accountability, one by the Institute of Food Technologists (IFT) and one by the Harvard Business Review.
That's it for now. If you'd like to receive email updates from Aigora, including weekly video recaps of our blog activity, click on the button below to join our email list. Thanks for stopping by!