Welcome to "AigoraCast", conversations with industry experts on how new technologies are transforming sensory and consumer science!
AigoraCast is available on Apple Podcasts, Stitcher, Google Podcasts, Spotify, PodCast Republic, Pandora, and Amazon Music. Remember to subscribe, and please leave a positive review if you like what you hear!
Dr. Chris Simons is a professor of Food Science and Technology at the Ohio State University. He received his PhD in Sensory Science at the University of California, Davis before conducting his post-doc in Sensory Neurobiology in France. From 2004 through 2012, Chris led the Sensory Research function at Givaudan Flavors Corp., and he joined the faculty in the Department of Food Science and Technology at The Ohio State University in 2013. In 2017, Chris was awarded the Barry Jacobs Memorial Award for Research in the Psychophysics of Human Taste and Smell by the Association for Chemoreception Sciences. Chris's research takes a multidisciplinary approach to understanding the perception of foods and how they are processed to influence reward and, ultimately, behavior.
Transcript (Semi-automated, forgive typos!)
John: So, Chris, thanks a lot for being on the show.
Chris: Well, thanks so much for having me. I'm happy to be here.
John: Okay, great. Now, Chris, one of the reasons I was really excited to get you on the show is that I think we can both agree that, historically, one of the biggest problems that sensory scientists have had is matching up the research that we do with outcomes in the real world. Over time, I think we've gotten very good at doing things like optimizing liking scores, but does that really correspond to increases in repeat-purchase behavior, for example? So I know that a lot of your research is really geared around building a bridge between the kind of sensory measurements that are made in laboratories and the outcomes that happen in the real world. So I'd like to start by getting your thoughts on this topic: how do you see your research helping to bridge that gap?
Chris: Yeah, I think I totally align with you on that thought. I think when we look at sensory science sort of historically, and the way even, you know, I have presented it in some of my own materials, sensory science is the use of humans as an instrument to understand foods. And I think we've done that really well for, you know, 50-plus years where we've been applying that. And I think what we do, which is sort of different from other, you know, scientists who use instruments, is that we don't really understand our instrument. And I think that that's where a lot of my research is sort of focusing. And a lot of the research that I'm really interested in, and that others are doing sort of in this space as well, that I find really kind of exciting, is sort of the flip side of that same coin: instead of using, you know, humans to understand foods, we're really using foods to understand humans and ultimately human behavior, with the hope that if we can do that, we can build a better instrument, we can get more reliable data. We can use that data to make stronger correlations and predictions of human behavior, which I think ultimately will be beneficial to the food industry, especially as we move into these new spaces where, you know, we're looking to provide healthier foods and need them to be just as rewarding as, you know, the old foods. So I think it's really an exciting time for sensory science.
John: Yeah, I completely agree. And maybe for our listeners who aren't as familiar with your research, could you give an example of a project that you've been involved in over the last few years that you think really typifies the kind of flipping of perspective you've just described, this idea of using food to understand people?
Chris: Yeah, I think a couple of different spaces. So I think one of the areas that we're really active in right now is trying to understand the mechanisms that sort of underpin perceptions. Right now we have a big focus on texture, which I think is still the Wild West, and we're looking really to see how texture is driven, not from a top-down perspective, which I think is how a lot of research has been done in the past, where we manipulate some aspect of the food and then use that instrument and look to see how that changes the human perception. Our approach is really sort of different, where we manipulate various aspects of the stimulus and really actually drill down to the mechanisms that underpin those. So what are the mechanoreceptors? What are the various aspects that actually are responsive to these different types of stimuli? Sort of this bottom-up approach. I'd say, you know, another area where I think we're really starting to become active, which I find so fascinating, is this whole area of context. And I think, you know, companies have realized context is important, and it's naive to think that companies have not sort of been in the position where they're considering these things. But it's been difficult in the past, and I think it's been expensive; you know, home use tests and on-premise tests tend to be expensive, and you lose a lot of experimental control. Our approach has been to utilize immersive technologies and look to see how we can manipulate context, sort of twofold. So one is to see, can we develop more reliable methods that are more predictive of future choices? You know, consumer choices or liking. But also, it gives us an opportunity just to really understand the interaction of context with eating behavior and how that changes people's perceptions of the foods.
And, you know, when we stick somebody in a booth, we turn on a red light and we isolate them, that's just not how we typically eat. And so I understand why we do that. But if we can incorporate context in a meaningful way, especially in a way that we can actually control how people interact with that context, it really allows for some, I think, powerful experimental designs. And, you know, for me, I'm not naive enough to think that immersive technologies are a panacea. You know, they're not going to answer all the questions. And actually, the further we get down this rabbit hole, the more questions we have, you know? I sort of put this in air quotes: we typically see similar sorts of responses, where we see better product differentiation, we see more reliable data. So what people like today, they like again, you know, two weeks from now. More stable, exactly. But the effect sizes are different, right? Sometimes we see really, really powerful effect sizes, and sometimes we see, you know, very little, if any. And so I think, you know, trying to understand what scenarios, what products are sort of amenable to contextual influence is really important, because I don't think everything, you know, necessarily has to be tested in a contextually relevant environment. So I think there's just a lot of fascinating questions still to be addressed in this space. And for us, you know, it gives us really, I think, differentiating and interesting research directions to pursue.
John: Right. So you set the context properly, you have, generally speaking, more stable results over time, and you see differences more clearly than you would in a kind of abstract, antiseptic type of environment. But I know some of your research is actually about this lack of congruence, which I thought was very interesting: your research on manipulating the congruence between different modalities, if I remember correctly. Right. Exactly. Yeah. So the more the environment in which someone is actually experiencing the product, and/or giving you responses, is congruent with everything to do with the product, the more stable and the more differentiating the responses are. Is that correct?
Chris: Exactly right. Exactly right. Yep. So when everything aligns, all of the contextual information sort of aligns with the product experience, that's when it really sort of hits with people. And if there are some aspects that aren't aligned, then you start to see, you know, a lesser impact of that contextual information. And for me, this is so fascinating because, you know, everybody's experiences are different, right? And so right now, one of the things that we're really interested in is how we can actually personalize context. Right? So the way that we've approached it in the past is we record contextual information which we think is relevant. So a coffeehouse scenario for assessing coffees, right? And so we'll go and we'll record coffeehouse content, you know, audio, video, aromas and so forth that are consistent with that environment. But, you know, if that's not your coffeehouse, or if that's not typically how you consume coffee, if you consume coffee more at home or in your car, then, you know, that's probably not a personalized, contextually relevant scenario. And so I think now, you know, there are opportunities to actually collect really personalized contextual information for people as well: have them record their own sort of environments, bring that with them to the laboratory, and then test them in these, you know, scenarios that are very consistent with how they typically evaluate products. It's almost like bringing the home use test into the laboratory, where we still have experimental control.
John: Right. That's right. That's fascinating, Chris, because, I mean, for me, when I started to really get into these kinds of new technologies, what I saw was that the technology could help bridge quantitative and qualitative research. There's always been a kind of divide there. But, well, I mean, you already see this with, say, open ends and online research. If you get enough open ends, you can start to sort of approximate what you might get from a focus group, at least in broad strokes. And you and I have talked about, for example, putting chatbots in surveys that might be closer, you know, to a conversation. In some other research I've been involved in recently, I've started to talk to my clients about Alexa-based surveys, and that's really just on the cutting edge. But the idea is that someone could be sitting at home evaluating a product and Alexa's interviewing them. You know, Alexa's asking them questions. And there are many, many advantages to that sort of research, because now, instead of the quant-qual divide, what you kind of have is a divide between scientific rigor and scientific control on one side and ecological validity on the other side.
John: And so what you're doing, and what I think is really exciting, is using technology to bridge that gap, to try to bring the scientific rigor together with some sort of ecological validity. So that is super exciting, actually.
Chris: Yeah, I think so too. I think that's really how science advances, right? I mean, the more that you can do to improve the ecological validity of any sort of testing environment or scenario or experimental design, obviously, you know, the better off you are. But, you know, at the same time, in order to do science, you have to maintain scientific rigor. You have to ensure that people are experiencing the same treatments; otherwise, it's not science, right? And so I think that as the technology advances, we can start to do some really interesting things. And, as you mentioned as well, there's this idea of, you know, the use of artificial intelligence and some of these, you know, algorithms that allow for processing huge reams of data, right? I mean, technology is not only advancing as it relates to virtual reality, the ability to, you know, sort of put people into a contextually relevant scenario. But, you know, people are starting to wear devices that are recording biometrics, right? So we have real-time information about people's status, right? That could easily be fed into these, oh, "easily" in air quotes, right? So ideally, these could be fed into, you know, these engines, and you could really even start to think about personalizing them to what people need at any given time. Right? So personalizing nutrition, personalizing product development. And I think that's just, you know, even more exciting. And, you know, it sort of goes against the mantra of a lot of companies, right? I mean, companies don't want to have to make two million different products, right? They want to make one product and be able to sell it to, you know, two million different people. But, you know, 3D printing, all of these things that are ongoing, right? I mean, as technology develops, we're going to need all of these systems to sort of integrate and be able to talk to each other.
I think it just makes for such an exciting future for the food industry: starting to link all of these different components into the product development process and being able to potentially create products in real time for specific individuals' needs at that given point in time. It's fascinating.
John: Yeah, I totally agree, especially when you think about all the kind of point-of-sale stuff, like the Coca-Cola Freestyle vending machines. And PepsiCo, I think, has similar technology. I mean, I guess it's interesting how privacy concerns start to come into this a little bit, because ideally, you know, the more you know about the people who you're selling the product to, the more you can personalize the product. It's a nice idea. If you've got, say, a Freestyle vending machine beside a gym, maybe it's going to offer more water options, or it's going to offer, you know, options that are based on the location of the machine. What the machine is offering can be tailored somewhat. The ideal scenario, I mean, you see this in China, for example: the KFCs in China are using facial recognition in the screens people come up to order on. And I'm not sure if opting in is even necessarily an option; you just are recognized by the machine. If you're part of their loyalty program, then for sure this happens. I'm not sure if it generically happens; it's possible it just generically happens. But if you are in their loyalty program for KFC in China, the screen that you see is customized for you when you go to order. And over time, it will learn your behaviors, and it will learn, for better or for worse, what to upsell you on. So they have increased sales and actually increased loyalty. Because, you think about it, there's a bit of a deal there: if you know that a restaurant knows you well, just like if you have a regular restaurant you attend, you go to the same place for dinner once a week, you go and you say, "I'll have the regular, I'll have the usual." Wherever the bartender knows you, they can offer you a drink. Now that's starting to get provided at scale at these large restaurants.
And there's also a benefit where, if KFC knows you and they know what you like, you can go to any KFC anywhere and they're going to know what you like. Right? They're going to offer you, or maybe they'll have suggestions, they'll say, "Hey, have you thought about this?" You know, recommender systems do that, right? So I think that is very interesting. And I think that it just shows the dynamic there with privacy, which of course...
Chris: Right. Which is interesting. Yeah. Because I have colleagues and friends and family who, you know... Facebook will put up information that is consistent with things that they Google-searched or whatever, right? Or these various recommender systems that tell you, you know, "you looked at this, you might like this." I love those things, right? I love it. I love it. It just takes one thing off of, you know, the list of things that I have to decide upon, right? But not everybody does like it. And so it will be an interesting dynamic. But I think, you know, for companies to be successful at it, they need to, I think, really use that information in a way that provides meaningful choices to people, choices that really do have impact, right, that have functionality that the consumer, the person, you know, responds to in a meaningful way.
John: Right. Right. Yeah. I mean, for example, it can be good. Like Microsoft, I think, had some demo, I think it was in London at an AI conference, and they had ice cream flavors where their software would detect your mood and then recommend ice cream flavors to you based on your mood. That was the idea. Right. So, I mean, that's where this stuff is going. You go up to the machine, and it decides, you know, this person looks like their face is flushed, they could really use water right now, "would you like that?" Whatever, pushing things. I mean, it is a very exciting time. So, all right, Chris. Well, maybe we can talk now a little bit about some of the other kind of branches of your research. So what are the areas that you're working on right now? Going forward, what are the interesting topics for you when it comes to bringing in new technology?
Chris: So I very much embrace the idea that, you know, we need to identify and involve technologies that are outside of our field, right? So we do a lot of work with a variety of different biometric types of tools, psychophysiology tools, to not only try to understand implicit measures, but also sort of to try to understand kind of just that consumer response to a variety of different stimuli. We're doing a lot of stuff, actually, in two big areas as it relates to what I call the hedonic bucket, in addition to this context piece and the virtual reality. One of the things that we've also noticed is that when people come into one of these immersive environments, they're more engaged, right? So part of that response, or part of that improved data quality, if you will, may be just because we have more engaged panelists, right? Somebody who's more engaged is more likely to give higher-quality data. So we really got interested in engagement. I have a graduate student who's done some really, really good work on trying to really pin down engagement and understand what are the various factors that contribute to it. So we've now developed (we're in the process of revising the initial submission, which will hopefully be published soon) basically an engagement questionnaire that will allow us to really start to understand how people's level of engagement changes under different conditions. Right? And so our goal then is to take this and use it as an instrument to evaluate other types of manipulation. So I have another graduate student who's starting to look into gamification and utilizing gamification elements. Are there ways that we can improve the quality of testing, making it more engaging, by including some of these gamification elements? Right? And again, it's a space that's tangential to where we've traditionally been. You know, we're sensory scientists.
We're really good at making surveys and asking, you know, questions that people respond to. Gamification requires a skill set that I certainly don't have, you know, and so finding graduate students with that background and that interest has been something that I've really worked hard to do. So we're bringing in a lot of that expertise that's sort of tangential. Another big area that we're also interested in is this idea of reward. And, you know, again, in our typical hedonic test, we might ask willingness to pay, but it's typically liking, right? I mean, generally we use the nine-point hedonic scale and we assess liking. And, you know, over the last 10 to 20 years, at least in the neurobiology space, we've started to really realize that, you know, liking is just one aspect of the reward complex. If you break it down, there are different neural pathways that actually process these different components. And so there's a wanting aspect, right? That motivation, that incentive salience, to actually get that reward. Right? Then there's the liking, which is experiential, right? So you like it once you've experienced it. Right? And then there's the satisfaction, which is sort of that third component, which is sort of comparing the experience to the expectation. And ultimately, we think that's what's being consolidated into memory, which is then driving that next, you know, that next choice. And so we're utilizing some of these psychophysiological measurements, where we're designing tests to really be able to measure these different elements of the reward complex, with the belief that if we can understand and measure these different things separately, then we may be able to use those to optimize products in a different way. Personally, I really believe, you know, wanting is likely going to be a much better predictor of future product choice. Right?
So if you're really motivated to get something, to buy a product, right? That's going to be a much better predictor. But it's not as easy as just asking somebody how much they want it, right? It's actually easy to do with a rat: you put it on a treadmill and see how long it will run to get a reward, right? With humans... that's what I love about working with humans. Exactly. See how hard they'll run to get a drop of grape juice. But, you know, I think there are some really interesting things in that space. And so if we can sort of decompose that reward complex and have different ways of measuring those different components, I think we may be able to sort of tap into something that's potentially more predictive of, you know, future human behavior.
John: And have you, in your research, looked at the actual purchase behavior of your subjects? Is that something you have access to? I do think that's a little bit of the missing piece a lot of the time for us, that we don't really know that level of information about our subjects.
Chris: You're absolutely right. And it's not something that I have done in my research. Even when I was with Givaudan, those were questions that we had, but as a supplier, you know, we had information on sales of flavors and so forth, but not, you know, the ultimate product sales, what consumers were purchasing. So I agree with you a hundred percent. I think if we can tap into those sorts of databases of information, that would be extremely useful to start making those connections. And, you know, obviously companies have a lot of that, you know, the Krogers of the world and so forth, right? They have that information.
John: Right. Amazon is increasingly getting into making, you know, kind of CPG-type products, like they've launched the gem in the UK. You know. Yeah. So it is interesting: if they have panelists, they actually have all the sales data from those very panelists.
Chris: Exactly. I mean, that's the holy grail, quite honestly, right? If you can actually link behavior to some of these other metrics, I think it's really, really powerful.
John: Okay. Alright, we have about five minutes left here, so in the time we have left, maybe we can just have you talk about what you see as most likely. So you've talked about looking at alternate measures, kind of these alternate psychological dimensions other than liking, like wanting or satisfaction. In the next five years, what do you see as being the most fruitful areas of research within sensory science?
Chris: Well, that's a good question. So, can I choose my own areas?
John: You can say your own. Looking around, I mean, assuming that the robots haven't taken over.
Chris: Exactly. So I think data analytics is going to be huge. I really do. There's lots of data out there, and I think data analytics is going to be a huge component of that. I think it's going to require artificial intelligence. I think those are going to sort of be married together to start looking at behavior and predictive models.
John: Linking different sources of data.
Chris: Exactly. Exactly. Yep. I still think, you know, over the next five years, this idea of immersive technologies is going to continue to grow. I think companies have started to become interested in this idea. It's relatively inexpensive compared to, you know, what it was 20 years ago, when you had to build a CAVE and, you know, have all of the specific, you know, high-powered computers and so forth. So, you know, I think that it's relatively... I don't want to call it low-hanging fruit, because I think there are still a lot of questions, and honestly, that's one of my fears. You know, a lot of companies have started to offer services and so forth, offering, you know, immersive technologies as a way of collecting data. And I think that's great, because I think that it sort of shows this belief in this direction. But I think there are also, you know, a lot of ways to do things poorly and then say, "Oh, you know what? It doesn't work. So why are we spending our money on this?" Right? And that's one of my big fears, because I think this area is so important. Context is so important. And even in our own hands, right? We'll do studies where the effects aren't what we thought, or are small, or whatever, and we just really don't understand why. And, you know, it would be easy to say, "Oh, well, you know, these effects are the same as if we just tested in a booth. Let's just test in the booth." You know, yeah, in that particular case. But, you know, in so many other cases, if we really understand how context is influencing, you know, what's the mechanism by which context is exerting its influence, then I think that we have a much better platform upon which we can start to offer these types of services.
But I'm happy to see that, you know, companies and industry are interested in starting to embrace it, because I do think it provides some unique opportunities to do testing in, you know, more ecologically relevant and valid ways.
John: Yeah, that's good, Chris. I mean, what I really like about what I'm hearing, talking to you, is that it seems like the science is really driving things, and you want to use technology to advance the science.
John: And it reminds me of when I talked last week with Michelle Niedziela at HCD Research; it's the same thing. She's really interested in the science. And yes, we have these tools, but we don't want to do cool things just because we can do cool things. We want to make scientific progress and be aware of the tools and bring them into our science. So, yeah, I admire that about your research.
Chris: I appreciate it. Thanks.
John: Yeah. Okay, great. So where can people find you? What would be the right kind of venues on social media or on the internet? How would people get in touch with you, especially someone who wants to be a graduate student in your lab?
Chris: We're always looking for good graduate students, and not only food science graduate students, too, which is, I think, an important element, right? So they can always reach me at OSU. We have faculty pages on the food science website. We have a personal lab page as well; I think it's the Simons Lab at OSU. And then, of course, people can always email me if they have questions as well. Much to the chagrin of my wife and my students, I'm not on Instagram. I don't use social media as much as I should, so I don't have a presence.
John: Probably healthier. LinkedIn is fairly safe, I think.
Chris: Yeah. And I do use LinkedIn. That's a good point. Right.
John: As long as you really, you know, curate your feed pretty carefully.
Chris: Yeah. I have a good colleague at school who was telling me Twitter is her way of communicating with her students, because that's what they respond to. And it's easy to tweet, and it's easy to, you know, get that information out there in a meaningful way. I know. Maybe I'm old.
John: No, it's all right. You really are doing cutting-edge work. You're a model for all of us, Chris.
John: Okay. Well, this has been great. So thank you so much, Chris, for being on the show. We'll put all the links on our website so people can find you that way. And is there anything else you want to say before we wrap it up?
Chris: Yeah. I just appreciate the opportunity to talk, and congratulations on the new baby.
John: Oh, thank you. Yes. For everyone out there, we have a two-week-old baby girl right now.
Chris: The fact that you're actually hosting this podcast is impressive in and of itself.
John: Okay. Alright. Well, bye everybody and we'll see you next time. And thanks again, Chris.
Chris: Thank you.
John: Okay. That's it for this week. I hope you enjoyed this conversation. And, if you did, please remember to subscribe and to leave us a positive review. Thanks.
That's it for now. If you'd like to receive email updates from Aigora, including weekly video recaps of our blog activity, click on the button below to join our email list. Thanks for stopping by!