By Tian Yu

Carlos Velasco - Looking at the Big Picture


Welcome to "AigoraCast", conversations with industry experts on how new technologies are transforming sensory and consumer science!


AigoraCast is available on Apple Podcasts, Stitcher, Google Podcasts, Spotify, Podcast Republic, Pandora, and Amazon Music. Remember to subscribe, and please leave a positive review if you like what you hear!


 

Carlos Velasco is an associate professor at the Department of Marketing, BI Norwegian Business School (Norway), where he co-founded the Centre for Multisensory Marketing. Carlos received his Ph.D. in Experimental Psychology from Oxford University, after which he worked on a number of postdoctoral and consulting projects in Europe, Asia, and North and South America. His work is situated at the intersection between Psychology, Marketing, and Human-Computer Interaction, and focuses on understanding, and capitalizing on, our multisensory experiences and their guiding principles. He wrote the book "Multisensory experiences: Where the senses meet technology" and edited the collection "Multisensory packaging: Designing new product experiences". Carlos has worked with a number of companies from around the world on topics such as multisensory experiences, food and drink, branding, and consumer research.



For more information on Carlos, visit here

Carlos Velasco on LinkedIn

Carlos Velasco on Twitter

Carlos Velasco on Instagram


Transcript (Semi-automated, forgive typos!)


Tian: Welcome to AigoraCast, Carlos.


Carlos: Thank you so much. It is a pleasure to be here with you today.


Tian: Yeah, we're very happy to have you. Your whole bio is so interesting; I'm pretty sure we'll have a great conversation here. But before we dive deeper, can you tell us a little bit about yourself and how you got into this very interesting field?

Carlos: Yeah, sure. So my story started over ten years ago. I was an undergraduate student of Psychology in Colombia, and I was very, very interested in Psychology as a broader topic of inquiry. We were being taught perception, memory, and many different concepts of Psychology. But I always thought to myself: where there are humans, there is human psychology, and we can study it; where there are animals, there is psychology. So we can study multiple contexts. Before I graduated, I wanted to start doing some research, and unfortunately, there were not many professors willing to take me. So I decided to turn my gaze toward industry as a way to fund my research. I had a colleague back then, his name is Alejandro Salgal, and we created a company that would provide services to industry: research on packaging, on different products, and things like that. We would charge them a fee, and we used that fee to fund our own research. With that, we ended up going to conferences, publishing articles, and so on. And that led me to meet my supervisor, Charles Spence, a professor at Oxford University. We started discussing some ideas for a possible Ph.D., and then I got accepted into a Ph.D. program, and those were some of the most interesting years of my life so far, doing research on multisensory perception: how do the senses interact when it comes to our perception of the world around us? That's how I got into the field of multisensory experiences. I started shaping my own theories and my own views on the topic, and around the end of my Ph.D., I met Professor Marianna Obrist, who is now at University College London. She was working in human-computer interaction, and we were very interested in the intersections between multisensory perception and what she called multisensory human-computer interaction.
That is, the development and exploration of new technologies that involve different senses, not only the senses that we typically use when interacting with technologies, which are what we see and hear, and increasingly touch. So that led me down an interesting path. I worked with her on a post-doctoral project, and then I got invited to another human-computer interaction laboratory in Malaysia. Long story short, through this path, I ended up working in three fields: psychology, which is where I'm coming from; then marketing, because I started doing this practice with the company and I kept the consulting projects; and then human-computer interaction, to dive into the technology part. It became clear to me that these fields are not necessarily always talking to each other. Sometimes they use specific jargon that the others don't understand. Yet they have so many things in common and so many ways to work together, such as understanding how the senses work so that we can implement that knowledge in applied consumer contexts through marketing and, at the same time, design new technologies that respond to the way in which our senses work. If we connect those three, then we are able to design a world closer to our own image, I guess. So that's the long story short. Now I'm a professor at the BI Norwegian Business School (Norway), as you said, where I have an active research program on multisensory experiences at the intersection of these three fields.


Tian: Well, that is super interesting. You mentioned multisensory research and human-computer interaction. That makes me think of how to involve sensory and consumer science in the metaverse. Of course, the metaverse, as you said, means different things to different people. So how do you define the metaverse, and what are your thoughts on how it relates to Web3?


Carlos: Alright, the metaverse, I would say, is more than a revolution, as many have postulated; it is more of an evolution, in my opinion. We already live in what many call mixed reality, where most of our everyday life experiences involve both online and offline elements. The key to the metaverse is that it involves immersive technologies, right? Immersive technologies, such as augmented reality and virtual reality, allow you to do more things in those digital environments. And this connects with Web3 in the sense that many of our life experiences are now in digital environments, where there are new sorts of interactions and ways of approaching things that we didn't have before. Web3 is a term coined a few years ago as an evolution of Web 2.0. Web 2.0 was basically the internet with the promise of decentralized information: anyone can access it, lots of services can happen online, and so on. But the reality is that many of those things happening on the internet were centralized by big companies like Meta, Facebook, Google, and many others. So in a way, the promise of decentralization wasn't fulfilled. Now with Web3, what we have is blockchain technology, which is one of the major digital transformations we can witness today. What blockchain and decentralization do is give back that promise. We are no longer centralizing everything that happens, and all the services that can happen, in the servers of Google or Meta; we're putting them in a decentralized network of computers that anyone can connect to, and you can create lots of new kinds of interactions.
To wrap up the answer to your question: the way I see sensory science relating to the metaverse and Web3 is that our experiences, or the journey of our experiences, have not changed much. What has changed are the touchpoints that are part of that journey. If you're going to buy a product or engage with a service, you still go through key stages: the pre-purchase, the purchase, and the post-purchase stage. That's the general picture; depending on the industry you're in, you may have more subtle steps and so on. But what is changing now are the touchpoints that make up those stages of the customer journey. Before, we had just packaging, maybe an offline ad, maybe a YouTube video. Now, aside from all those touchpoints, we have four key digital transformations, as I would call them. The first is mixed reality technologies, which involve augmented and virtual reality. The second is the Internet of Things: we now have sensors as part of our environments that can track and create new sorts of interactions. The third is artificial intelligence, powering technologies such as chatbots, robots, and so on, which we are now interacting with as well. And the final one is blockchain technology, Web3, and the metaverse. Those are the new touchpoints that are providing a fertile area for innovation in the way we design experiences. And that should be of concern to anyone in sensory science, precisely because it's part of the experience that customers have.


Tian: Right. So, I have two questions here, but let me start with this one. Blockchain is this new thing, Web3, that is replacing Web 2.0. Web 2.0 has a central place that collects all of the data; the data belongs to Facebook or Google or whatever. Now we have Web3, and we have the opportunity to store the data privately on the blockchain, right? So how do you see that fitting into our data collection? Sensory and consumer scientists collect data; are you expecting us to use blockchain? How would you expect blockchain to fit into more traditional sensory and consumer research?


Carlos: Well, I have to say that is an open question in many ways. Having dived into Web3, I have seen that there are so many opportunities yet to be explored. So I don't have a specific answer to your question, but I do have some ideas of what sort of things could happen. For example, here is a metaphor. One of the projects that got my attention in blockchain was a sort of alternative to YouTube. How does it work with YouTube? YouTube owns your data, so they sell that data to other companies, target you with specific ads, and then YouTube profits, and in theory the companies should also profit from that. Now there is this blockchain-based YouTube alternative, called D2, where you keep your data; the data is yours, and you might even get paid for providing your data and being exposed to advertisements. So imagine that all the sensory science data that we may have from our consumers remains in their hands and could be used as a token to exchange for something with the companies they provide the data to. I think it's a very good way to empower consumers, if that answers your question.


Tian: That's definitely something that we have been thinking about as well. So consumers own a lot of their own data, right? The data about their tastes and everything; they can put that data on a blockchain and share it whenever they want to. Then, as you said, AI technology can grab that data, run big machine learning models, and personalize everything. So that is something that could really happen in the future?


Carlos: Exactly


Tian: Right. So, another question: you said a lot about multisensory research and the metaverse, the virtual space. How can taste and smell, the chemical senses, fit into virtual environments?


Carlos: Well, that's a fantastic question. I have actually been working with some people in human-computer interaction on devices that connect the chemical senses to the internet, and that is quite a challenging task. You have several people in human-computer interaction working on these taste devices and olfactory devices, but they're very raw, if you ask me. The whole theory behind taste devices, for example, comes from electrogustometry, which we have known about for around 100 years: the idea that you apply some sort of electric current to the tongue to test people's taste thresholds. People report a subtle sensation. Sometimes it feels metallic, sometimes a little bit salty; sometimes it can give you a kind of phantom taste sensation. Based on this principle, some researchers have been designing taste devices that you can plug into your computer and that, through changes in electricity, stimulate your tongue and create a subtle sensation. There is a researcher called Nimesha Ranasinghe who has been developing these virtual lemonades and things like that: basically, you have a glass that projects color onto the water, but the place where you sip from the glass is an electrically conducting metal piece that, at the same time as you're drinking the water, electrocutes your tongue, to put it in those terms, very, very subtly. You don't feel it as electrocution, and it creates a phantom taste sensation. This means that, in a way, you could potentially be drinking something in the metaverse. But again, I'm a little bit skeptical of these technologies, because we know that taste is much more than just an electric impulse. There are other people working, for example, with changes in temperature.
We know from sensory science and research in neuroscience that changes in temperature can lead to, for example, a heightened sweetness sensation and things like that. Some people have been working on devices that create rapid changes in temperature, such that they produce a phantom taste sensation on the tongue. But the key question is: why are they doing this? Because what we see, what we hear, and increasingly what we touch is controlled by electricity, which is relatively easy: you can recharge it and that's it. Digitalizing the chemical senses is a whole other thing, because electricity by itself might not do the trick unless you go to human-brain interfaces, which is another topic that needs to be developed much further. You still need the chemicals. And that is a challenge, because if I start selling a computer that has a sense of smell, that delivers smells, that computer will need to be refilled with the smells, because they are chemicals. That practically discourages many people from getting into this sort of technology. My take is that we do not necessarily need to embed these technologies into the computing systems we have so that we can access the metaverse through the senses; we should just invent different experiences. Completely different experiences. What would the sense of taste look like in the metaverse if it were not tied to the offline sense of taste? We don't need to replicate reality. One of the beauties of the metaverse is that we can conceptually violate the laws of physics and go to fantastic worlds. So what if, instead of eating or drinking a glass of water, you're basically putting some electric input on people's tongues while they're flying on a unicorn?
You know, it sounds weird and surreal, but the key here is we have the opportunity in the metaverse to go beyond our physical realities and that's something that we should definitely consider.


Tian: I see. That's very interesting. I actually came from that background; I did taste neuroscience for my Ph.D. I know that even today we don't understand tongue biology in much detail; sour and salt, for example, are still debated to some extent. So how can you just apply electricity and replicate all of that, right?


Carlos: I mean, I wish it was that easy, but it's just not, right?


Tian: Right. But that creates another big opportunity, just as you said: we can create a whole new experience for involving the chemical senses, taste and smell, in the metaverse. Are you also thinking, to some extent, of associating taste and smell with some of the other senses that we are more familiar with digitally, like visual and auditory? Of course, we all have our experiences, memories, and all of that. Are you thinking in that direction, making associations from the chemical senses to the familiar senses, visual and auditory, to create a metaverse experience?


Carlos: Yeah, definitely. And I think the concept of the synesthetic metaphor works pretty well here. This is something that advertising has explored quite a lot. In traditional advertising, you cannot tell people exactly what something is going to taste like, because people cannot taste the advertisement, right? So advertisers have come up with what they call synesthetic metaphors, which can come in the form of words or images that trigger sensations in other senses. Skittles is a good example of that: "Skittles, taste the rainbow," right? You are not tasting the rainbow, but it gives you an idea of the palette of experiences that you might have. So through storytelling, through visual elements, through auditory elements, you might at least be able to remind people of the other senses, I would say. But there is also another opportunity, if you allow me to mention it here, which is bringing elements of the metaverse to mixed reality. For me, one of the biggest potentials is actually in mixed reality. It's not that you're 100% in the metaverse, but that you bring elements of immersive technologies to your physical reality. For example, I was working with a company here in Norway, and one of the questions we were asking is: coffee lovers sometimes go to a cafe because they want to taste new coffees. But, I don't know, you go to Norway in the winter; this is not a place where coffee is produced. It's completely different. You want to taste the coffee, and the coffee just feels different because of the multisensory atmosphere you are in. So we were thinking: what if we could bring the context of the coffee's production, through virtual reality, to the coffee drinker, so that they would know exactly where the coffee was harvested and how it was processed while sampling the cup of coffee?
So, in a way, we created that for this company, and we ran an experiment showing that the enjoyment of the coffee, for both coffee consumers and expert panelists, was enhanced through the use of virtual reality. In this case, what you're doing is mixing virtual reality with the offline experience in a way that ends up being more than either of the parts.


Tian: I see. That's very interesting. I think they call this type of thing augmented reality, right? You wear goggles, let's say, while you are drinking a coffee, while you're cooking or doing whatever, and that gives you visual and auditory stimulation that comes along with your physical cup of coffee and the chemical-sense experiences. That's brilliant.


Carlos: Exactly. And I also have some master's students who have been doing more cross-modal related research. For example, we know that the lighting of the space you're in can influence your perception of taste. What if you create novel experiences, again going to the fantastic, something that doesn't exist, that manipulate light to season the drink that you're having? So you're wearing this VR headset, maybe you're in a business meeting or something, and that helps you enhance the eating experience that you're providing to the members of the meeting. I think these immersive technologies, augmented reality and virtual reality, are excellent ways to season food, if you ask me.


Tian: Interesting. So you just mentioned cross-modal associations. To me, that is one of the areas I am most interested in right now. Can you give us some examples of how you approach cross-modal associations from the chemical senses to other sensory modalities?


Carlos: Our brains have evolved to try to maximize the signal out of the noise in our environment, and the senses work together in that. Sometimes if you don't have one sense, another one might give you a hint. Sometimes if you have a weak signal in one sense, another sense with a similarly weak signal might together produce a bigger signal. So the senses are constantly interacting, and this is something you can study in multiple ways, through what we call cross-modal matching tasks, classification tasks, semantic priming, and lots of different tasks. But the key here is that people make associations, and you can study them. I can give you an example: do you associate a round shape with sweetness or sourness? Many people would say with sweetness, and this has been demonstrated in multiple experiments: people actually associate sweetness with rounder shapes. We have been using this in packaging, in different contexts, to see how shapes with specific curvature characteristics can influence our perception of sweetness and our expectations of sweetness. You can study these cross-modal associations in almost any pair of senses, or in more than two senses, and try to map the way in which different senses relate to a specific impression, if you like.


Tian: Okay, so you just mentioned cross-modal associations. I was wondering, can you give us a more specific example of how you research them?


Carlos: Yes. Our brains are devices that have evolved to extract the signal from the noise in our environments, and our senses work together in that endeavor, so that we can pick up the signals from our environment but also understand those environments better. We interface with the world around us through our senses in one way or another. So the key here is that we can study how our brain makes associations such that it can better relate to the world. Some of those associations are very straightforward. For example, the bark of a dog goes with the image of a dog, because they belong to the same identity or meaning. But there are others that are a little less straightforward. For example, if I show you two shapes, one round and one angular, which one do you associate with a sweet taste? Most people would say the round one. And this is something that we have demonstrated in China, Colombia, the UK, Norway, in different countries. There seems to be a tendency to associate tastes of different qualities with variations in the curvature of visual shapes, and also of tactile shapes. You can study this in multiple ways: through what we call cross-modal matching tasks, where you literally present something in one sense and ask people to match it to things in another sense, or through congruency tasks, where you manipulate one sense and see how people perform in another, looking at the compatibility of the reactions. And it's super interesting, because some of these findings can inform industry very well. In the book about packaging that I edited, we cover many of these associations that are quite relevant. For example, if you want to communicate that a product is sweet at the expectation level, you might be able to use certain shapes, certain typefaces, and certain colors that increase the likelihood of people detecting that in the product.


Tian: Right. That's definitely something that we're trying to look into. I can't believe it's already been 25 minutes or so. I think this is a good place for you to give some advice to young sensory scientists.


Carlos: So I would say, I think there are many. The first one is to think of what you want and try it. Go for it. I know that's the most generic sort of advice you can give people, but when I was an undergrad in Colombia thinking about these topics, I kept thinking: who am I to do this research? Who am I to propose these things? Who am I? And through that process, I just kept going for it, and it really worked. The second thing I would say is to look at trends. Look at the things that are happening in the world. Multisensory research was a very current topic by the time I was doing my Ph.D., and it really was a topic that was going to develop a lot over the next years. I think there's a big opportunity now with technology and a lot of possible avenues to explore, in particular in blockchain. For example, I'm joining non-fungible token communities; I'm joining different DAOs, that is, decentralized autonomous organizations. Lots of things are happening in this blockchain space, in mixed reality, with artificial intelligence, with the Internet of Things. But these are not completely invented yet, so we have a big say there. Look at that and have a say. But most importantly, and I think this is my major piece of advice: step back and don't only look at the technology trends, because sometimes the technology comes out of an engineering laboratory, and then it comes to us and we look for an application. My philosophy has been: what if we just jump into the engineering lab and work with them from the beginning, such that the needs of consumers are addressed in the development process of the technologies? So I would say: look for multidisciplinary collaborations; look at other disciplines.
There's a lot to learn from them, and we don't want to just receive ready-to-use packages of technology; we want to dive in and develop these technologies together with them, so that we can use them better.


Tian: Right. That's brilliant. It's definitely something that we can learn from. That's such good advice. So, how can people find you?


Carlos: So I'm on most social media, I guess. I have a LinkedIn account, a Twitter account, and an Instagram account, so you'll be able to access them in the description, and if anything, just send me an email. I'm always happy to share my research and ideas, and to discuss these important topics. Including one that we didn't talk much about, which is the ethics of how we're using these technologies in our practices and research. That is, of course, a big topic, given all the implications they have for our human experience.


Tian: Of course. That's great. So we'll put all of your contacts in the show notes. We're very happy to have you here today, and thank you so much.


Carlos: Thank you so much, too.


Tian: Thank you.


Okay, that's it. I hope you enjoyed this conversation. If you did, please help us grow our audience by telling your friends about AigoraCast and leaving us a positive review on iTunes. Thanks.


 

That's it for now. If you'd like to receive email updates from Aigora, including weekly video recaps of our blog activity, click on the button below to join our email list. Thanks for stopping by!


