John Ennis
Eye on AI - January 8th, 2021
Welcome to Aigora's "Eye on AI" series, where we round up exciting news at the intersection of consumer science and artificial intelligence!
This week is all about AI sensory enhancement, focusing on two unique studies that reveal how sound and environment shape our sensory perceptions.
Enjoy!
Augmented Reality in Food and Sense Perception

Let’s begin with a look at an article out of MSensory this week, titled “New Augmented Reality System Helps Manipulate Taste Perception of Food | Azorobotics,” which discusses a food study using augmented reality headsets that shows how luminance, the perceived brightness of light coming off a surface, affects food perception.
The study set out to determine how minor changes in the light falling on various foods changed how participants perceived them. For instance, if light glinted off lettuce, would participants believe it to be more nutritious, watery, or fresh, and to what degree, even though the food itself remained unchanged?
“When the researchers interviewed the study participants, they observed that manipulation of the standard deviation of the distribution of luminance, while maintaining the color and the overall luminance constant, not only changed the participants’ anticipation of taste with respect to deliciousness, wateriness and moistness but also changed the true texture and taste properties when they sampled the food itself,” writes the MSensory team.
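To make that quote concrete: the manipulation scales the spread (standard deviation) of the food’s luminance values while holding mean luminance and color constant. The article doesn’t describe the researchers’ actual pipeline, so the following is only a minimal sketch of that kind of adjustment, assuming an 8-bit RGB photo loaded with Pillow and a simple luma approximation standing in for true luminance (the function name and all details here are illustrative):

```python
import numpy as np
from PIL import Image

def adjust_luminance_spread(img: Image.Image, scale: float) -> Image.Image:
    """Scale the standard deviation of an image's luminance by `scale`,
    keeping mean luminance and color roughly constant.

    A simplified illustration, not the study's actual method."""
    rgb = np.asarray(img.convert("RGB"), dtype=np.float64) / 255.0

    # Rec. 601 luma as a stand-in for luminance.
    luma = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]

    # Stretch (scale > 1) or compress (scale < 1) luma around its mean,
    # so the new standard deviation is scale * the old one.
    mean = luma.mean()
    new_luma = np.clip(mean + scale * (luma - mean), 0.0, 1.0)

    # Rescale each channel by the luma ratio, which changes brightness
    # while approximately preserving hue and saturation.
    ratio = (new_luma / np.maximum(luma, 1e-6))[..., None]
    out = np.clip(rgb * ratio, 0.0, 1.0)
    return Image.fromarray((out * 255).astype(np.uint8))

# Example: exaggerate the luminance contrast of a lettuce photo by 50%,
# the kind of tweak the study links to perceived freshness.
# glossy = adjust_luminance_spread(Image.open("lettuce.jpg"), scale=1.5)
```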
If you remember our post on VR from last year, food marketers and producers were already using VR to study taste and location as early as 2019. This study takes that idea one step further. And while it may seem simple, even self-evident, that setting and food appearance alter taste perception, studies like this help us understand, with clear evidence, how and why it happens.
In the future, it’s not difficult to imagine restaurants adjusting light displays to enhance daily specials, or perhaps offering suggested ‘taste’ lighting with takeout. It’s also not a stretch to see people beginning to look at eating habits and spaces in new ways based on these and other food studies, perhaps tuning interior design, packaging, and grocery store displays and lighting to improve food satisfaction.
As VR headsets become more advanced, new food studies will emerge. Touch will soon be incorporated. And smell. It’s even thought that VR will eventually seem as real as our normal reality. We’re not there yet, but you get the idea. As far as I’m concerned, it’s a foregone conclusion that immersive technologies like AR and VR will change how we understand our food in relation to our senses. It’s what we do with that information that’s the real question.
AI May Soon Unlock Healing Powers of Music in Prescribable Ways

We conclude with a positive bit of music news. According to the ScienceDaily article titled “Music-induced emotions can be predicted from brain scans,” researchers at the University of Turku recently identified the neural mechanisms underlying emotional responses to music, which could lead to a better understanding of how our bodies respond to specific types of musical stimulation.
The study used functional MRI to record the brain activity of participants as they listened to instrumental music; machine learning then allowed the researchers to determine which brain regions distinguished different music-induced emotions from one another.
“Based on the activation of the auditory and motor cortex, we were able to accurately predict whether the research subject was listening to happy or sad music,” says Postdoctoral Researcher Vesa Putkinen. “The auditory cortex processes the acoustic elements of music, such as rhythm and melody. Activation of the motor cortex, then again, may be related to the fact that music inspires feelings of movement in the listeners even when they are listening to music while holding still in an MRI machine.”
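The article doesn’t spell out the decoding method, but what it describes sounds like standard pattern classification: train a model on activation patterns from regions such as the auditory and motor cortex, then predict the emotion label of held-out scans. Here is a toy sketch with scikit-learn, using synthetic numbers in place of real fMRI features; everything in it is illustrative, not the Turku group’s actual analysis:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for fMRI data: one row per scan, one column per
# voxel/region (e.g., auditory and motor cortex activations).
n_scans, n_features = 200, 50
X = rng.normal(size=(n_scans, n_features))
y = rng.integers(0, 2, size=n_scans)  # 0 = sad music, 1 = happy music

# Inject a weak "neural signature" into the first ten features so the
# labels are decodable, mimicking emotion-dependent activation.
X[y == 1, :10] += 0.8

# A linear decoder with cross-validation, a common setup for this
# kind of brain-decoding analysis.
decoder = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(decoder, X, y, cv=5)
print(f"Decoding accuracy: {scores.mean():.2f} (chance = 0.50)")
```

If the classifier predicts happy versus sad well above chance on scans it never trained on, the activation patterns genuinely carry information about the induced emotion, which is the logic behind the Turku result.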
Understanding how the brain reacts to music could lead to new breakthroughs in music, sensory understanding, and even health. That’s right: health. The Sync Project has been looking for ways AI can create musical combinations that address things like stress, depression, and anxiety. As we better understand how music affects our brains and bodies, researchers may discover new ways music can help treat mental and physical ailments, perhaps even leading to music therapy sessions that complement care for conditions like high blood pressure or cancer. A prescribed playlist sure beats prescription pills. And though we’re not there quite yet, that sort of breakthrough could be coming soon. Guess you’ll just have to keep listening for it ;-)
Other News
‘Electronic nose’ to be deployed in new project to improve agricultural chemical use
Burger King launches integration with search engine to make digital ordering easier
Bernard Marr’s take on the impact of COVID-19 on the 4th Industrial Revolution
That's it for now. If you'd like to receive email updates from Aigora, including weekly video recaps of our blog activity, click on the button below to join our email list. Thanks for stopping by!