Danielle van Hout
Eye on AI - September 9th, 2022
Welcome to Aigora's "Eye on AI" series, where we round up exciting news at the intersection of consumer science and artificial intelligence!
Our focus this week turns to Meta’s new AI speech tool, which is helping researchers communicate with people who are immobilized, then switches gears to look at a 24/7 robot smoothie kiosk that may be coming to a truck stop near you.
AI Speech Tool Could Help People in Vegetative States Communicate
We begin with promising and unexpected news in speech AI. According to the article "An AI can decode speech from brain activity with surprising accuracy," Meta's research team has developed an AI that can guess what people have heard from their monitored brain activity, which could eventually allow for direct communication with people in incapacitated states.
“Using only a few seconds of brain activity data, the AI guesses what a person has heard,” writes Science News contributor Jonathan Moens. “It lists the correct answer in its top 10 possibilities up to 73 percent of the time, researchers found in a preliminary study... the AI could eventually be used to help thousands of people around the world unable to communicate through speech, typing or gestures, researchers report August 25 at arXiv.org.”
Meta’s team trained their language model on thousands of hours of speech recordings across multiple languages, teaching it to recognize features of language at both the fine-grained level (letters and syllables) and the broader level (words and sentences). The team then applied this model to databases of brain activity recorded from volunteers as they listened to a series of stories, identifying patterns in the magnetic or electrical components of the brain signals to determine what was being read to each participant from only seconds of data.
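At its core, this kind of decoding is a matching problem: embed the brain recording and a pool of candidate speech segments in a shared space, then rank the candidates by similarity and check whether the true segment lands in the top 10. The sketch below illustrates that ranking step only; the function name, dimensions, and toy data are our own illustration, not Meta's actual model or code.

```python
import numpy as np

rng = np.random.default_rng(0)

def top_k_decode(brain_embedding, candidate_embeddings, k=10):
    """Rank candidate speech-segment embeddings by cosine similarity
    to a brain-signal embedding; return the indices of the top k."""
    b = brain_embedding / np.linalg.norm(brain_embedding)
    c = candidate_embeddings / np.linalg.norm(
        candidate_embeddings, axis=1, keepdims=True)
    scores = c @ b                       # cosine similarity per candidate
    return np.argsort(scores)[::-1][:k]  # best matches first

# Toy data: 100 candidate segments in a shared 64-dim embedding space.
candidates = rng.standard_normal((100, 64))
true_index = 42
# Simulate the brain embedding as a noisy view of the true segment.
brain = candidates[true_index] + 0.5 * rng.standard_normal(64)

top10 = top_k_decode(brain, candidates, k=10)
print(true_index in top10)
```

The reported "top 10 possibilities up to 73 percent of the time" figure corresponds to running this kind of check over many trials and counting how often the true segment appears among the ten best matches.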
The results, though positive – up to 73% with magnetoencephalography (MEG) and around 30% with electroencephalography (EEG), which is very good at this stage of the research – leave much room for improvement. The MEG system is bulky and impractical for clinical use with patients, and the AI itself only detects patterns in speech perception, not speech production, which is the ultimate goal. Though we’re still a long way from brain-based communication, researchers point to this system as an early breakthrough.
First Robot Kiosk Offering 24/7 Jamba Juice Opens in California
Switching gears, it was recently announced that Love's Travel Stops has opened its first Jamba by Blendid autonomous robotic kiosk at its Williams, California, store, introducing a healthy 24/7 foodservice option to its traveling clientele.
“The Jamba by Blendid kiosk offers visitors smoothie options inspired by Jamba and powered by robotic foodservice solutions company Blendid,” writes NACS’ editorial team. “The self-operating kiosk allows customers to customize their smoothie orders by adjusting ingredient quantities or adding boosts directly through the Blendid app.”
At the Williams, California location, the robot operates in an uplifting and colorful setting, adorned with a mural by Bongang, a street artist who has frequently worked with the Jamba brand. The robot has its own unique personality and can dance and move as it creates smoothies on demand or by pre-order. Love’s Travel Stops traditionally serves motor travelers, such as truck drivers, at its 590 locations across the country, so this healthy, colorful, and personable food option should make for a welcome addition.
Other stories that caught our eye this week:
Machine learning teaches AI humanoids how to play soccer from scratch
AI-enhanced system detects bruised strawberries before they arrive at the grocery store
How researchers are using AI to control digital manufacturing
That's it for now. If you'd like to receive email updates from Aigora, click on the button below to join our email list. Thanks for stopping by!