
Nimesha Ranasinghe - An Augmented Future


Welcome to "AigoraCast", conversations with industry experts on how new technologies are transforming sensory and consumer science!


AigoraCast is available on Apple Podcasts, Stitcher, Google Podcasts, Spotify, Podcast Republic, Pandora, and Amazon Music. Remember to subscribe, and please leave a positive review if you like what you hear!


 

Nimesha Ranasinghe is an Assistant Professor at the School of Computing and Information Science and directs the Multisensory Interactive Media Lab at the University of Maine. Nimesha completed his Ph.D. at the Department of Electrical and Computer Engineering, National University of Singapore, in 2013. His research interests include multisensory interfaces, human-computer interaction, and augmented and virtual reality. He is well known for his Digital Taste and Virtual Cocktail inventions and has been featured in numerous media outlets worldwide, including New Scientist, The New York Times, Time Magazine, BBC Radio, the Discovery Channel, and Reuters. Furthermore, he has published his work in several prestigious academic conferences and journals, including the ACM Conference on Human Factors in Computing Systems, the ACM Multimedia conference, and the Journal of Human-Computer Studies, and he has received numerous awards for his research. In particular, in 2014, his work on the Digital Lollipop was selected as one of the world's ten best innovations by the Netexplo Forum at UNESCO headquarters in Paris.




Transcript (Semi-automated, forgive typos!)


John: Nimesha, welcome to the show.


Nimesha: Thank you, John, and thanks for the invitation. Welcome, everyone.


John: Great. So, Nimesha, I really want to get into your background, because I think it's interesting how the twists and turns took you into sensory. But first, I wanted to comment that here we are having this call virtually, and it was about a year ago, maybe just over a year ago, that I first saw you give a presentation virtually at Pangborn. So you were really ahead of your time, anticipating all of this.


Nimesha: There was an unexpected event and I couldn't attend the conference, so I discussed it with the organizers and we came up with a plan to make a video of my talk, and then I connected live for the Q&A. It felt like the pandemic situation, even though it was a few years ago.


John: So I think that's the theme of your research, and we're going to get into it. I think you've always been three years ahead of everyone, including me. When I first saw your virtual cocktail work, I thought, okay, this is interesting, but I didn't really get it. And I have to say, now I really get it. But let's talk a little bit about your background. You have, I think, maybe an interesting background for someone in sensory. So maybe you could talk a little bit about the path you took into sensory.


Nimesha: Yeah. So, as you mentioned, I'm not really from the sensory science world, but I'm very interested in human physiology and human psychology as well as the technology. I come originally from a Computer Science undergraduate degree back in Sri Lanka, and then I joined the Electrical and Computer Engineering Department in Singapore. So I have this interest in technology, especially electrical and electronics and then the software aspects, particularly gadgets and tangible things, not just pure software engineering. I'm also really interested in virtual reality, augmented reality, and human-computer interaction: how can we bring technology closer to the human? That's always a question I'm asking when we come up with new ideas and new technologies. And that's why it was very interesting to think about human sensory perception and cognition and all of these aspects when we are thinking about new technologies for the future.


John: And so you were telling me a little bit in the pre-call that it was really during your PhD that you first started to look at some of these kinds of food interactions?


Nimesha: Exactly. So when I joined my PhD, I had something in mind, like I wanted to do something on virtual reality or digital interactions, but I didn't have any particular focus. So I was talking to various people, from medicine as well as neuroscientists with diverse backgrounds, and we talked about, hey, how can we bring the smell and taste senses into the technological world? Because when you think about the smartphone or any technology, they're all built around the audio-visual senses. Sometimes people use haptics, barely, like vibrations and so on, and people are doing a lot of research on haptics right now. But very few people are working on smell and taste, and taste especially. So that's where my interest lay, and then I thought, how can I develop a simulation technology, conceptual designs and interactions for taste and smell, to get controllability of those two senses using digital media? And that was the beginning of the whole series of work we have right now.


John: Yeah, so fascinating, and this is what I didn't really get until fairly recently. In the tech space there's UX, right? And UX in some sense is almost like sensory science, but applied only to visual things, to an extent. You have apps where it's very important how they look. You go to the Airbnb website and it's beautiful; so much work has gone into the user experience. But it's a very limited experience, right? Because it's mainly vision and a little bit of sound, but not really any taste or smell. Of course, people have played around with things like Smell-O-Vision, and there's always been this effort, and I'm not sure you actually want to smell everything that's on TV. But what I think is really interesting is what you're looking at. Maybe you can talk about your inventions over the years. You mentioned the Digital Lollipop a little; could you take us through the series of projects you've worked on in this space of trying to provide simulated sensory experiences?


Nimesha: Yeah, exactly. Before that, I want to make a comment about your point that we don't need to smell everything. I think that's exactly the purpose of the research we are doing: not just releasing smells everywhere like crazy people, but understanding human cognition so we can use these additional modalities to deliver information in a more effective manner, right? When we think about our real-world, natural interactions, we use smells and tastes and all of these things to indirectly communicate or pass messages between us. For example, if you walk into a place that smells good, you're attracted to it and become more engaged in that space, whereas if you go somewhere with a bad smell, you immediately want to disconnect yourself from it. So there are a lot of hidden cognitive aspects of human physiology that we need to study, in terms of psychology and perception, and how perception, memory, and all of these things work together. It's really interesting to study multisensory perception, memory, and cognition, all of these hidden aspects of the human brain or the human mind.

So to answer your original question, let me walk you through the technologies we developed. As I mentioned, this all started as part of my PhD research. I wanted to come up with a methodology to simulate or stimulate taste sensations, at least the primary sensations as we all know them, salty, sour, sweet, and bitter, using a digital means. I played with a lot of mediums: electrical stimulation, thermal stimulation, even infrared and ultrasound stimulation. I tried all of these technological approaches to stimulate human tissue, especially the tip of the tongue, and to understand experimentally how people perceive these sensations, what kind of immediate effect they have, whether they are painful, and what threshold levels people have. That's how I started, and eventually I settled on electrical stimulation and thermal stimulation. We found that by applying a certain magnitude of current, and by controlling the frequency as well as the area of the tongue being stimulated, we can simulate primary sensations like saltiness, sourness, and bitterness. Later on we found that with thermal stimulation, especially rapidly heating up and cooling down, people also feel a mild, menthol-like sweet sensation. Those were the initial findings, and I concluded my thesis by saying, hey, by combining these parameters we can simulate these primary tastes up to a certain percentage of accuracy, based on my user experiments. That was the end of the beginning, but the beginning of the next chapter.

With a lot of PhD work, students finish it and then leave that particular lab or university and the work behind. But I thought, I want to bring this technology out, develop it further, and show the world whether it is viable as a consumer product or consumer device. So I embedded this technology, especially the electrical stimulation aspect, into everyday familiar utensils. For example, we developed a spoon; a soup bowl, like a Japanese soup bowl, where you consume the soup from the bowl itself; chopsticks; and a beverage bottle. All of these devices, this digitally enhanced cutlery, had the control system embedded as a separate module: you take the module and slot it into a 3D-printed structure that also holds the electronic components, and each device has two electrodes. When you consume something from the utensil, your tongue touches these two electrodes and a minor electrical stimulation is applied, thereby enhancing or augmenting the flavors of the original food or beverage.

That was playing with only electrical stimulation. Then we thought about the other aspects of the flavor experience, right? We often talk about smell and taste, and then the color, the freshness, the bubbly sensations, the temperature, and also previous experiences, memory, and all of these cognitive aspects of sensory. So I thought, what if we simulate other aspects of a food or beverage just to see how people's perception changes? This has already been studied by many sensory scientists, and by user experience designers as well: changing the color of a beverage or a food and seeing how people's perception changes. I brought that idea in and improved my prototypes with the addition of color. So, for example, now you can not only change the setting to more salty, but you can see the color change; the entire utensil changes appearance. I was just using an LED light inside the utensil to give a glimpse of this. And I realized that when you couple the color with the electric taste sensation, people are more attracted to it; they like it better than the electrical stimulation alone. The reason is that when we think about these basic taste attributes, say someone prepares pure, clean solutions of salty, sour, and bitter, some people have a hard time differentiating or discriminating between the different solutions. So we need these additional modalities, like colors and smells, to help discriminate between the primary sensations; otherwise it can feel almost like a mixed sensation. That's a challenge with electric taste stimulation as well. Once we used the color modality to differentiate the different settings, people started to say, oh, this is clearer now, this one is salty and this one is a bitter sensation.

We were encouraged by that, and then I thought, how about adding the smell aspect as well? That's where the Virtual Cocktail idea came from, because when you think about cocktails or mocktails, they come in various colors and various looks and feels, and you can do a lot of experimentation with those beverages. So I thought, what if we have a virtual cocktail where you can control these different aspects using digital tools, for example an app? We developed a 3D-printed structure again with all the control modules; you take a traditional martini glass and slot it into the structure, and now you can project different colors onto the beverage. On the rim of the martini glass we put electrodes, so when you take a sip, your tongue touches them and gets a boost of taste sensation, and we had three small chambers underneath from which you can pump different smells across the surface of the beverage. That makes it a more complete system, with three modalities to control through digital means. We developed an app, and we did a lot of studies as well as exhibitions with the Virtual Cocktail prototype.

That more or less concludes the journey so far. Right now we are looking at other aspects: for example, when we change these modalities, how do human emotions change, and how does their liking of the beverage itself change? We are running a lot of experiments right now rather than building new tools, though we are also trying to simulate texture and other aspects in the background. So that's the story, or the timeline, so far.
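An aside for the technically minded: the description above boils down to a small set of controllable parameters per "taste setting" (current magnitude, frequency, tongue region, plus a color and a scent chamber). Purely as an illustration, and not as Nimesha's actual firmware, published parameter values, or app API, here is a minimal Python sketch of how such a preset might be represented and sent from an app to a device like the Virtual Cocktail. All names and numbers below are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical illustration only: field names and values are invented,
# not taken from the published Digital Taste / Virtual Cocktail work.

@dataclass
class ElectricalStimulation:
    current_uA: int      # magnitude of current applied via the rim electrodes
    frequency_hz: int    # pulse frequency; varying it shifts the perceived taste
    region: str          # area of the tongue targeted, e.g. "tip"

@dataclass
class FlavorPreset:
    name: str
    stimulation: ElectricalStimulation
    led_rgb: tuple[int, int, int]   # color projected into the beverage
    scent_chamber: int              # which of the three scent chambers to pump

# A made-up "salty" preset: mild current at the tongue tip, blue-tinted light,
# and scent chamber 1. Real values would come from user calibration studies.
SALTY = FlavorPreset(
    name="salty",
    stimulation=ElectricalStimulation(current_uA=80, frequency_hz=60, region="tip"),
    led_rgb=(30, 90, 200),
    scent_chamber=1,
)

def send_preset(preset: FlavorPreset) -> dict:
    """Serialize a preset into the kind of message an app might send
    to the glass's control module (message format is hypothetical)."""
    return {
        "stim": vars(preset.stimulation),
        "led": preset.led_rgb,
        "scent": preset.scent_chamber,
    }

if __name__ == "__main__":
    print(send_preset(SALTY))
```

The point is only that each "virtual flavor" reduces to a small bundle of digitally controllable parameters, which is what makes it addressable from an ordinary app.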


John: Yeah, it's great. And have you thought about sound? Does sound come into this at all?

Nimesha: Yes. We haven't included any sounds, or visual aspects beyond the colors. People have been studying how different frequencies affect our taste or flavor perception, or using virtual reality to simulate restaurants and different environments to see how perception changes. But in our case, we wanted to focus more on smell and taste and study those, rather than focusing on the audio-visual aspects. But certainly, if you add audio and visual aspects, you'll definitely get even more augmentation.


John: Yeah, that's something we're thinking about, for sure. Steve Keller was on the show; he's the head of Sonic Strategy for Pandora. Maybe you know Steve. He was talking about sonic manipulations to affect taste. It's quite interesting, actually, and I'm sure it would resonate with you. Something I'd like to ask you about: I actually just finished the book "How Innovation Works" by Matt Ridley. You might have already read it; it's an excellent book. He talks about two views of innovation. Everybody thinks there's going to be some sort of science-led innovation, where the science is done, then people study the science and get ideas from it. But in real life, it's often more of a trial-and-error approach. So I'd like to hear, where do you get your ideas for new research? What are the things you're thinking about doing in the future? And does it really come down to trial and error and your own experience, or are you looking at the neurobiological literature? I think some discussion of your process would be quite interesting.


Nimesha: Yeah, in one word: science fiction. I grew up with it. If you're familiar, Arthur C. Clarke was a huge influence back in Sri Lanka. He got citizenship in Sri Lanka, and a lot of his books were translated into our local language, and most kids grew up reading those. That's the beginning: looking at science fiction and thinking about what new technologies we could produce in 50 or 100 years' time, and how we can help people even now by taking those future technologies and future scenarios and bringing them into the world today as experiments. You can take an idea, frame it as a hypothesis, test it a little bit, try to find a niche segment of the market, and then develop a product for them; that's the initial getting-started place. So again, it's mostly about science fiction and a curious mind, I would say. But as I mentioned, even with that, I always want to think about the human being, not just something that seems possible in 50 or 100 years' time. Can we use these tools and the interactions we are building to also learn about human beings? As human beings, we don't know a lot about how our brain works, how our mind works, even how our sensory systems work. So rather than always starting from the brain and neuroscience and working down to the senses, I ask the question: can we go with a bottom-up approach, starting from the senses, stimulating them, and seeing how our perception changes, and through those kinds of interventions study human perception and physiology? That's the kind of interest I have, and that's where all of these ideas and interactions we are producing come from.

John: Now, that is fascinating. I actually never thought until this moment that, with the virtual cocktail, to some extent that is a kind of brain-computer interface, in that you're delivering information into the brain through it. You're just leveraging the taste system, right? Rather than, you know, the Neuralink idea, where you've got electrodes going into the brain and somehow information is coming in. Why not use the hard wiring that's already there?


Nimesha: Yeah, as I mentioned, there are two ways. You can directly talk to the brain using, as you mentioned, implantable technology; people are already doing that for patients with Alzheimer's disease, where they implant devices and use electrical and magnetic stimulation. On the other hand, you can use our existing sensory systems to influence the brain, fooling the brain into thinking something is real. It's like television and radio, right? They're not natural things, but you can produce creative, innovative experiences through the visual or audio channels. In the same way, the virtual cocktail and the technologies we are working on can play that television-like role for taste and smell at the moment.

John: Yeah, it's fascinating. You're really ahead of your time on this. Maybe I was just behind, but I have to say I'm embarrassed it took me this long to really get the whole idea.


Nimesha: Also, it's like an extension, the next stage of virtual reality: not just the audio-visual aspects of a VR headset, but something that can cover taste, smell, haptics, and temperature, all of these modalities, with a VR headset.


John: Yeah. It really makes you think about how far we can go, because the temptation is to think of virtual reality as uploading your brain into the computer, right? That somehow that's going to be the endpoint of human existence, because we'll have our brains in the computer. I don't know whether that's good or bad. But actually, it's amazing how far we can get by augmenting all the senses, not just sight and sound. Yeah, that's very fascinating. Okay, obviously, have you worked on sodium reduction? Because it seems like that's an obvious application.


Nimesha: So even though my primary interest is in virtual reality and augmented reality and how we can develop wearable and ubiquitous technologies to improve AR and VR, most of the technologies we are producing, and not just me, a lot of other researchers in the world as well, have direct health implications, not only for the physical aspects of health but also the cognitive and psychological aspects. So we are working on ideas like sodium reduction: reduced-sodium meals, using these utensils to stimulate and enhance the salty sensation with electrical stimulation rather than adding salt to your diet. That's one of the primary directions, and cutting down calorie intake in general is also one of our primary focuses. Because right now, and this includes me, we eat most of the time not just because our body needs it to sustain itself, but for the pleasure and the craving for different flavors, and all of this junk food we eat. We don't necessarily need it, but we eat it to satisfy ourselves or to get some pleasure. So we're asking the question: can we influence people to eat a healthy diet, with portions that are enough to sustain the body, but get similar satisfaction by augmenting the multisensory aspects of a meal? For example, not right now but in the future, let's say someone is eating broccoli, but they see and feel like they're eating a piece of chocolate or a burger, right? So they get the same satisfaction as if they were eating the junk food they like, but underneath, their body gets the proper nutrition. That broccoli idea is just one example, but that's the sort of direction we want to be looking at as well.


John: Fascinating. Well, I have a four-year-old here at my house.


Nimesha: Exactly. That's always a challenge, right? And again, it's something we have to understand. I'm not saying all our food is going to be replaced with virtual food, but I am saying we can influence people, using the technology, to eat right, to eat whatever balanced diet we as human beings, our bodies, need, while increasing the pleasure through these other external modalities while eating and drinking.


John: Right, exactly. I mean, it's just extending augmented reality to the other senses. Okay, well, we're actually almost out of time, so I think it would be good to hear: are there any other thoughts, things you're working on right now, that you think would be interesting to the listeners before we wrap up? I'd like to wrap up with advice, but is there any last topic to discuss?


Nimesha: Yeah, I think, as I mentioned, so far we have been looking at the human senses during eating, basically in dining environments: how we can simulate these things and then better understand human perception and sensitivity. Right now we are focusing more on human cognition, experience, and facial expressions, and how we can use these modalities to address those aspects and produce a more satisfied person using these augmented technologies. On the other hand, we are also interested in remote interactions, especially after the pandemic. We are in different parts of the world and only interact through these videoconferencing and chat systems, but we are trying to think about multisensory technologies we could use in conjunction with Zoom or other audio-visual streaming services. So not only can audio-visual information be transferred, but how can you synchronize different spaces with different olfactory sensations, or use the same lighting conditions across different environments? We want to understand and study how we can build a more engaged community through these multisensory technologies, even when we are working across distant or remote spaces.


John: Yeah, that's fascinating. Okay, so final advice for our listeners? Suppose you're talking to a young sensory scientist who is just beginning their career; what advice would you have for that person?


Nimesha: Be curious. As a scientist, that's the first and most basic thing we have to follow. Be curious and think beyond your field, whether you're in sensory science or engineering or art or design. The world nowadays is very sophisticated, and it's not like 20 or 30 years ago, when if you were an electrical engineer you always worked on power systems and nothing else. The world has changed so much, and you can think about your interests, identify your passion, what are you passionate about, and then use your background to do something in that direction. Then it still feels like having fun and it won't wear you out. That's also where all of these startup ideas and new ideas come from: people working in completely different directions try to do something in some other field, and then suddenly something clicks and everyone is happy about it. So that's why I say be curious. If you are a sensory scientist, don't focus only on the sensory science; think about the other aspects too, like the human side and the technology side, and then new design ideas. Creativity and innovation are like buzzwords nowadays, especially in many places, but all of these things are very important to succeed in whatever field you choose. So that's my thinking and my advice as well.


John: Yeah, that's fascinating. Okay, great. Nimesha, this has been a real pleasure. So if someone wants to follow up with you, what would be the best way for them to get in touch?


Nimesha: Email is the best way. I think you can probably display my email address on the screen?


John: Yeah, I can put a link to your LinkedIn, if that's okay.


Nimesha: Yeah, that's fine.


John: And we'll put a link to your lab webpage as well, in case someone wants to apply as a student; I think it would be a great place to be right now. Okay, this has been a real pleasure. Thank you very much.


Nimesha: Thanks, John. Thanks for having me; it's been a pleasure to talk about all these things.


John: Okay, that's it. Hope you enjoyed this conversation. If you did, please help us grow our audience by telling your friend about AigoraCast and leaving us a positive review on iTunes. Thanks.



 

That's it for now. If you'd like to receive email updates from Aigora, including weekly video recaps of our blog activity, click on the button below to join our email list. Thanks for stopping by!


