John Ennis
Eye on AI - February 19th, 2021
Welcome to Aigora's "Eye on AI" series, where we round up exciting news at the intersection of consumer science and artificial intelligence!
We’re keeping things simple this week to focus on new AI-powered devices and systems with market-disrupting potential, including Amazon’s Alexa-infused glasses frames and a dog-inspired, phone-based sniffing system that identifies diseases more accurately than doctor-administered tests.
Enjoy!
Amazon’s Glasses with Built-In Alexa Poised to Dominate Wearables Market

We begin with the release of Amazon’s new Echo Frames, which is sending ripples throughout the wearables market. Fort Wayne’s WANE news offers a wonderful first take on these voice-assist specs in the article “Amazon glasses with Alexa built-in feel like they’re from the future”, including a video that shows off their futuristic look and feel.
“The Bluetooth frames have tiny speakers and microphones built-in, along with a touch-sensitive area and buttons for controls,” writes WANE contributor Rich DeMuro. “You can use them for hands-free access to Alexa, music, phone calls and more.”
While news of the Echo Frames’ release is nothing new, this is our first chance to really see the glasses in action. And they don’t disappoint. Unlike the failed and often ridiculed Google Glass, Echo Frames are far less visually intrusive. They’re futuristic-looking, indeed, but hardly distinguishable from typical eyewear. The speakers and built-in microphones are mere slits in the frames, and no earbuds are required, since sound comes directly out of the frames. And while the frames do have simple touch sensors on the sides, there are few visual indications that they include a voice assistant, so long as you don’t mind appearing as if you’re talking to yourself. Furthermore, prescription lenses can be installed directly into the frames.
Here’s what DeMuro had to say about trying on his first pair:
“I used them to listen to soft music while I worked, podcasts during a walk and phone calls on the couch. You almost forget the glasses are on and the audio seems to magically appear around you. While battery life and audio quality could be improved, the Echo Frames are a fantastic start. They feel like something right out of the future.”
At $250 a pop, Echo Frames aren’t cheap, though they’re not exactly expensive either, especially for power Alexa users. And while copycat products may soon hit the market to challenge Amazon, Alexa’s AI is so far ahead of the curve that it’s hard to see any newcomer threatening Amazon’s command of the voice-assist eyewear market any time soon.
Dog-Inspired Disease-Sniffing Device Could ID Cancer from Your Phone

Speaking of market disruption, did you know that over the past fifteen years trained dogs have been the leading disease detectors on the market, more accurate than even hospital-administered tests? It’s true. But it may not be for long. According to the MIT News article “Toward a disease-sniffing device that rivals a dog’s nose”, a team of researchers has come up with a system that uses machine learning to detect the chemical and microbial content of an air sample with even greater sensitivity than a dog’s nose.
“The miniaturized detection system… is actually 200 times more sensitive than a dog’s nose in terms of being able to detect and identify tiny traces of different molecules, as confirmed through controlled tests mandated by DARPA,” writes MIT News contributor David Chandler. “But in terms of interpreting those molecules, ‘it’s 100 percent dumber.’ That’s where the machine learning comes in, to try to find the elusive patterns that dogs can infer from the scent, but humans haven’t been able to grasp from a chemical analysis.”
Knowing that human-administered tests were routinely being outdone by the noses of trained dogs, researchers at MIT and other institutions created a system that incorporates mammalian olfactory receptors. The team stabilized the receptors to act as sensors, then used machine learning to train the system to recognize the chemistry behind the scents passed through it. The data streams from the receptors can be processed in real time by a typical smartphone.
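For readers curious what the machine-learning half of such a system might look like in code, here is a minimal, purely illustrative sketch. It assumes each sample has already been reduced to a feature vector of receptor responses and trains an off-the-shelf classifier to separate cancer samples from controls. The data shape, feature count, and model choice are all assumptions for illustration, not the MIT team’s actual pipeline.

```python
# Minimal sketch: classify samples from olfactory-receptor sensor readings.
# Assumptions: each sample is a 64-value feature vector of receptor responses;
# labels mark cancer (1) vs. control (0). Illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical data: 50 samples x 64 sensor-response features
X = rng.normal(size=(50, 64))      # stand-in for receptor readings
y = rng.integers(0, 2, size=50)    # 1 = prostate cancer, 0 = control

# Train a simple classifier and estimate accuracy with 5-fold cross-validation
clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print(f"Mean cross-validated accuracy: {scores.mean():.2f}")
```

In practice, the heavy lifting lies in the sensor chemistry and in extracting stable features from noisy receptor signals; the classification step itself is light enough to run on a phone.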
The system’s initial study used fifty urine samples, some from patients with confirmed prostate cancer and some from a cancer-free control group, and showed that the researchers’ artificial system matched the dogs’ accuracy. This achievement, the researchers say, provides a solid framework for further work: developing the technology to a level suitable for clinical use and training the system to identify other diseases.
“We knew that the sensors are already better than what the dogs can do in terms of the limit of detection, but what we haven’t shown before is that we can train an artificial intelligence to mimic the dogs,” says Research Scientist Andreas Mershin of MIT. “And now we’ve shown that we can do this. We’ve shown that what the dog does can be replicated to a certain extent.”
While further research is expensive (each sample costs roughly $1,000 to test), the potential seems well worth the expense. Researchers hope that one day every phone might have a built-in scent detector, much as cameras have become ubiquitous in phones, capable of detecting not only diseases but also other scents such as smoke, leaked gas, or CO2. It may take some time to get there, but the potential is already piquing my interest.
Other News
Chartwells brings ghost kitchens to college and university campuses
ML-assisted study shows how coffee consumption reduces heart failure
That's it for now. If you'd like to receive email updates from Aigora, including weekly video recaps of our blog activity, click on the button below to join our email list. Thanks for stopping by!