John Smythe - Find Your Coffee Ice Cream
Welcome to "AigoraCast", conversations with industry experts on how new technologies are transforming sensory and consumer science!
AigoraCast is available on Apple Podcasts, Stitcher, Google Podcasts, Spotify, PodCast Republic, Pandora, and Amazon Music. Remember to subscribe, and please leave a positive review if you like what you hear!
John Smythe works at Amazon as part of their Amazon Insights program, which focuses on delivering insights to Amazon vendors that improve their customer experiences. Prior to that, John worked at Tate & Lyle, where he helped co-author three patents involving taste perception, and prior to that he was with Herbalife where he helped build and globalize a full service sensory program functioning in 18 countries. John specializes in automating and visualizing complex analyses for non-practitioners and has a passion for food, data, travel, and wine.
Transcript (Semi-automated, forgive typos!)
John E: John, we have a lot in common, it seems. So, welcome to the show.
Smythe: Thank you. Glad to be here. Thanks for having me.
John E: Of course. It's a pleasure. So the first question I'd like to ask, something I'm curious about and I'm sure a lot of our listeners are curious about, is: how did you end up at Amazon? Can you take us through that journey?
Smythe: Sure. Prior to working at Amazon, I was at Tate and Lyle, which is an ingredients manufacturer that specializes in starches and sweeteners. I was the head of their sensory program over there. And one day a recruiter reached out to me and was talking about an opportunity at Amazon in their private label business. They have a private brands division for food, and they were interested in having a sensory expert join the staff and come onboard and help them sort of optimize their product testing strategy for their private label brands. It seemed like an amazing opportunity. Anytime Amazon calls, you always have to ask yourself, what if I don't take that opportunity? And the draw was just simply too large not to join. Now, when I joined, I was definitely focused on the private label business. But Amazon's got a very interesting culture. They want you to rotate your roles and kind of expand your skill set about every 18 months. And so since then, I've rotated out of the private brands group. And now I'm in a group called Amazon Insights, which is sort of a service provider to the vendors and sellers on Amazon. So now my clients are really much more the brands who sell on Amazon rather than Amazon itself.
John E: I see. Okay. Alright. So, a couple of questions, just logistically here: were you already in Seattle then? Where is Tate and Lyle?
Smythe: Tate and Lyle is in Hoffman Estates, just outside of Chicago. And so my wife and I did have to pack up our bags and relocate. Chicago is beautiful, but yeah, the winters are very, very difficult. And so we find Seattle to be a wonderful place if you've lived in Chicago for a couple of winters. So we're quite happy here.
John E: Okay. Yeah, I was in Bellevue actually for the first Symposium on Data Science and Statistics last year. I'm not sure if you were there, maybe we crossed paths. Yeah, it was beautiful though.
Smythe: Yeah. I didn't actually attend that but Bellevue is a beautiful community, sort of near the Microsoft headquarters. Only about 30 minutes from my house. My wife and I love getting lunch out there. It's a wonderful little area.
John E: Great. Alright, John. So the next question I have is: were you the first sensory scientist hired by Amazon? I mean, when was this that you joined Amazon?
Smythe: Yeah, I joined them about two years ago. Amazon's a pretty big company, and they actually have a number of people in New York who have experience in sensory. For our team, I was actually the second hire. They had hired somebody prior to me, and I was brought on board to sort of revitalize the program and sort of drive it forward. I had worked previously with a woman there named Nicole Fink, who has also rotated out of Amazon. She was phenomenal, but there was just too much work to handle. So they brought me on board and then, you know, we kind of grew the program a little bit. Then the strategy and the direction for the program, you know, evolved. And it sort of changed its focus to be more off-the-shelf kind of stuff. But, yeah, I was not the first, nor was I the last, but, you know, it was a good time to get in there.
John E: Okay, excellent! And so now with your current clients, these vendors, people who are making their own products and selling on Amazon, do they come to you for consulting services, for field services?
Smythe: In a sense, the Amazon Insights program is sort of a unique operation within Amazon that will help our clients, like our vendors, answer their own research questions. We generally focus on customer insights. We don't do a ton of product testing in our program right now. Most of what you get when you are a vendor or a seller on Amazon is access to data about your product or your brand or your traffic. But you can't really answer the question of why things happen. So you might know the traffic is up or you might know traffic is down. But there's really no good way to reach out to Amazon customers directly unless you hire a third party agency. The challenge, obviously, with a third party agency is that you have professional panelists, right? And people just work their way into a study that maybe they're not a perfect fit for. We sort of bypass all that because we have first-party metrics on the purchase behaviors and browsing activity of our customers. And we can use that to run very highly targeted inquiries for our vendors and our sellers.
John E: Okay. Very interesting. So, I mean, do you get into, and of course, if I ask any questions you can't answer, no problem, you can decline in the sense of confidentiality.
Smythe: Of course.
John E: Do you provide demographic information, that sort of thing or is it more about the actual behavior of the customer?
Smythe: Right. Yeah. So with the ADU privacy laws that were implemented in 2020, we're no longer allowed to even ask demographics in our program. So really, everything we do is based upon non-demographic activity. So we're less focused on whether you're a male who's, you know, 35 years old with, you know, four kids and a dog. We're more focused on: did you look at this ASIN or this SKU? Did you purchase it? Did you purchase it multiple times? Things that we can track through those kinds of actual behaviors.
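[Editor's note: the behavior-based targeting John describes can be sketched in a few lines. This is a hypothetical illustration, not Amazon's actual system; all field names (`viewed`, `purchases`) and the toy ASINs are invented.]

```python
# Hypothetical sketch of behavior-based targeting: select survey invitees
# by what they did (viewed, purchased) rather than who they are.

def select_invitees(events, target_asin, min_purchases=1):
    """Return customer IDs whose behavior matches the target criteria."""
    invitees = []
    for customer_id, history in events.items():
        record = history.get(target_asin, {"viewed": False, "purchases": 0})
        if record["viewed"] and record["purchases"] >= min_purchases:
            invitees.append(customer_id)
    return sorted(invitees)

# Toy event log keyed by (anonymous) customer ID, then ASIN.
events = {
    "c1": {"B000X": {"viewed": True, "purchases": 2}},
    "c2": {"B000X": {"viewed": True, "purchases": 0}},  # browsed, never bought
    "c3": {"B000Y": {"viewed": True, "purchases": 5}},  # different product
}

print(select_invitees(events, "B000X"))  # buyers of B000X only
```

Relaxing `min_purchases` to 0 would also capture the "browsed but never bought" group, which is essentially the population of the Lost Customer Survey mentioned below.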
John E: Very interesting. Yeah. I mean, it is a completely different way of looking at the world, to think about defining people by their behaviors instead of their demographics. It's very interesting. I bet they have a lot of success with machine learning, using psychographic and behavioral information, etc., as inputs to models. So I can imagine that must be very valuable. Now, I'm not going to push further on this, but I think we're starting to get into, you know, things that it's hard to be...
Smythe: I mean, I can say at a very high level that we are dipping our toes in the water of psychographic profiling, trying to understand people based on how they perceive themselves and how they see the world, without necessarily getting into demographics. These are super, super hot topics for people. But again, the real prize in our portfolio is the fact that we actually know what folks have bought, what they've bought relative to that product, what they haven't bought. And we find that helps us. Like, for instance, we have a survey we call the Lost Customer Survey, and that's a survey that focuses on people who, you know, buy in the browse node or the category that the client is selling in, but who didn't buy the client's product. And it's really a more diagnostic study. It tells you kind of how they got there, where they looked, things like that. But those kinds of studies have power when you know the actual behaviors of the people that are going into the study.
John E: Now, I can imagine that's really valuable. Okay. So here's something I know you're really passionate about, that we were talking about in our precall: simplifying the communication of complex results. I know you've even pioneered some original approaches in this area. Do you want to talk to our listeners a little bit about the things you feel are most important when it comes to communicating? I'm sure in your case you have very complex results that you're trying to communicate. So what are some tips you would provide?
Smythe: Yeah, I mean, when you think about the way that we study things, you know, I've got some publications on simplifying complex analyses, like PCA, so that an executive can review them maybe a little easier. But even more fundamentally, just summarizing data for executive review in general can be a very cumbersome process with a lot of room for error. And so what has always been sort of my claim to fame in my career is figuring out how to take a data set and go from basically a download out of Qualtrics or SurveyMonkey or whatever it is, automate all of that analysis just by pasting into formulas in Excel that re-tabulate all that information into summary data tables, use those summary data tables to populate plots, and then map those plots into a PowerPoint or a Word file. So the entire analysis is done, you know, in about 10 minutes, from literally data download to a viable report. But the key in that kind of stuff is that it requires standardization of your methods and standardization of how you communicate your data. And this, I think, more so than anything, is sort of a prerequisite for scaling and automating data analysis and making it easy for people to understand. The reality, of course, is that a lot of people, a lot of programs, don't have that luxury where they can sort of create these benchmarkable programs. And in those cases, I think what it really boils down to is trying to understand: is what I'm doing here something I could communicate, you know, to my kid who's in fifth grade and help them understand? Because a lot of times, you know, executives have no idea what we're talking about. They don't play in our space. They don't play in our field. It's all kind of Greek to them. And it's easy to sort of talk over people in a way where they just don't know what you're talking about.
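[Editor's note: John's pipeline lives in Excel and PowerPoint; an analogous stdlib-only Python sketch of the same download-to-report idea is below. The column names and report format are invented for illustration.]

```python
# Minimal "download to report" pipeline: raw survey export -> summary
# data tables -> standardized report. Mirrors the Excel workflow John
# describes, in Python.
import csv
import io
import statistics

# Stand-in for a Qualtrics/SurveyMonkey CSV export (invented data).
RAW = """respondent,overall_liking,sweetness_jar
r1,7,3
r2,8,4
r3,6,3
r4,9,3
"""

def summarize(raw_csv):
    """Tabulate each numeric question into mean / n: the 'summary table' step."""
    rows = list(csv.DictReader(io.StringIO(raw_csv)))
    questions = [k for k in rows[0] if k != "respondent"]
    return {
        q: {"n": len(rows),
            "mean": round(statistics.mean(float(r[q]) for r in rows), 2)}
        for q in questions
    }

def render_report(summary):
    """Map the summary tables into a fixed report template: the 'PowerPoint' step."""
    lines = ["Summary Report", "=============="]
    for q, stats in summary.items():
        lines.append(f"{q}: mean {stats['mean']} (n={stats['n']})")
    return "\n".join(lines)

summary = summarize(RAW)
print(render_report(summary))
```

Because the report template is fixed, any export with the same structure flows through untouched, which is exactly the standardization point John makes next.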
John E: Yeah, I've had a lot of experience doing that. I definitely relate to what you're talking about. So, yeah, I mean, I do see that as a definite theme right now, that automation has kind of upstream benefits. As you push towards automation, it forces you to standardize how you structure your data, which then of course makes it much easier to store your data, which gives you all sorts of benefits into the future. So, yeah, I think that's definitely something that I would agree with.
Smythe: I mean, just to go off of that point of storing data, you know, we work with a lot of clients external to Amazon, and there's no question that people really want to benchmark their performance over time and against that of their competitors. And the first step in allowing for that kind of benchmarking over time or against external references is standardizing your data and, to your point, warehousing your data in a mineable form.
John E: Right. Where you can easily get to the information that you want later. Right?
John E: Right. I think that is really important. Now, why do you think it is, and there's something I think is idiosyncratic about our field here, that we tend to have these data sets where, when you go back to historical data, it's almost like, for some reason, sensory scientists, and I'm one of them, have a lot of trouble running a study with the same questions twice. It's like...
Smythe: It's absolutely true. I mean, I think, you know, every product's different every time you sort of revisit a product in the product lifecycle. You're going to have different questions of interest. Even in our studies, in our portfolio, we have basically about 10 studies where we've locked in on a core structure. But our clients very often, you know, don't want the questions we're asking exactly the way we're asking them. So what we've done is we've created basically wildcard functions for a number of our questions that allow the client to adjust the response options. So the question itself is the same, but the responses would be different. So, take for instance, and this is for an insights project, not a sensory project per se, let's say that you've got somebody who's, I don't know, making vases. We don't have any clients who make vases, but they're going to have a very different set of features that they'd be interested in than a client who makes, let's say, mattresses. Right? And so we keep the structure of the survey consistent, but we use wildcards to allow them to pick different features, and then that allows us to benchmark, you know, the value of a feature, or the effectiveness of communicating a feature on our detail pages, for instance.
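[Editor's note: the wildcard idea, a locked question stem with client-supplied response options, can be sketched as below. The template text and feature lists are invented for illustration.]

```python
# The question stem is fixed for benchmarking; each client fills in its
# own "wildcard" response options.

QUESTION_TEMPLATE = ("Which of these features mattered most "
                     "when you shopped for {category}?")

def build_question(category, features):
    """Fill the fixed stem with client-specific wildcard options."""
    return {
        "text": QUESTION_TEMPLATE.format(category=category),
        "options": list(features) + ["None of these"],  # shared anchor option
    }

vase_q = build_question("vases", ["height", "material", "color"])
mattress_q = build_question("mattresses", ["firmness", "size", "cooling"])

print(vase_q["text"])
print(mattress_q["options"])
```

Because every client's question shares the same stem and anchor option, results for "value of a feature" stay comparable across categories.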
John E: Yeah, that's a good solution. I can relate to that. There's something I'd like to talk to you about, and this is another thing where if you can't talk about it, that's okay, but I'm very passionate about our Alexa-based surveys. I just finished one pilot study where we got very encouraging results. An analogy I've been using is, have you ever seen pictures of early automobiles, of what they looked like?
Smythe: Of course.
John E: Where you've got two bicycles with a board between them and a motor?
Smythe: Right. Basically.
John E: And, you know, if you saw that, you would never think, unless you had vision, that that's gonna turn into, you know, the Daytona 500 or whatever. You'd just look at that thing. But with, you know, Alexa-based surveys, I think there's huge potential down the road to have quantitative and qualitative research coming together, where you essentially have a conversation.
Smythe: Absolutely. Yeah. And I mean, if you think about, like, there's even the idea of the blueprint skills in the Alexa skills portfolio, where anybody with very, very limited ability to code can go in and actually take templates that previous people have used, modify them, and in a sense adjust skills so that you can get these surveys to work, where you can say, Alexa, give me the, oh, I almost just activated my machine. Got to be careful. But, you know, you say, hey, let's do this thing. I mean, not just for things like surveys, but for any kind of information, these skills can be super useful. For instance, in our program, we've put our terms of service in an Alexa skill. So if somebody wanted to understand our terms of service, they could just do that. The trick that we find, and I imagine you're probably running into some of this yourself, is the complexity of the survey, the response rates, and really the likelihood of people to sit through those kinds of surveys. What we're finding is polls and things like that are effective. People will do a one or two question poll, but mostly if they're gamified; you know, you want to make it fun for the respondent. If it feels like work, we find that people aren't always going to spend the time to interact with Alexa if the complexity is high.
John E: Yeah. They'd have to be engaged somehow. Like, I've started to collaborate with Peryam & Kroll. They have a standing panel, and their intention is to build out, within their larger panel, a subpanel of Alexa panelists. Where, you know, when people don't have an Alexa at the beginning, they get an Alexa. I hope I'm not triggering your device there when I'm saying, oh, you're on headphones.
Smythe: I'm on headphones, so we're good. Yeah.
John E: Yeah. But you know, their idea is, and I mean, it's just what you have to do, because if you only do your research among people who have an Alexa, there's this huge bias, right? So take their panel, recruit people, and if they don't have an Alexa, give them an Alexa as the incentive. Right? And then now they're in your Alexa panel. And then over time they grow this out. Right? So then you're paying people to be in the study just like you would otherwise. But even then, I would say, yeah, I'm definitely going to hold back on saying too much here, because we have some conference presentations coming up and I don't want to give it all away.
Smythe: Sure. Still want people to come to the talk.
John E: Yeah. But the basic idea is in-the-moment evaluations of a few key questions you care about, right? Handsfree. You're going to bite into barbecue chicken. You bite into it and your hands are covered in barbecue sauce. How good is the chicken? There's really no better way to get that information than with Alexa.
Smythe: Right. Absolutely. Or like, you know, if you're doing something where you're using a cleaning product, right? And your hands are just soaked or whatever it is. There are many, many use cases where Alexa, I think, could present a much better solution than giving somebody a paper ballot that they fill out after they've used the product, that they then go to a computer and key in. I mean, I think that's too archaic.
John E: Or you're brushing your teeth, shaving, any of this stuff.
Smythe: Brushing your teeth. How's the toothpaste? They're good.
John E: Well, you know, actually, I think I might be the first person in the history of the world to ever do an Alexa-based survey in the shower. Myself and my developer who works on this rigged up a shampoo survey demo. I did it in the shower. I set up a little shelf above, put my Alexa on a little battery pack, and put her up there. And I did my survey in my shower while I was shampooing my hair, answering questions about it.
Smythe: Absolutely. Absolutely. Yeah. And again, I think the upside potential is there. The trick is, you know, with Alexa there's a certain latency in how she answers things and how long things take, and, as you've probably encountered, there are issues with timing out. So if you've got a long survey, I think that's where Alexa probably isn't at the point today where she's ready to handle it. I think if you've got these shorter surveys, you know, four or five questions, a survey that can be done in three or four minutes, I think there's a lot of upside potential there.
John E: Yeah. And I see it as supplementing traditional approaches, not replacing them. In the moment, you get the few things you really care about. Then you might still have the online survey later.
Smythe: Yeah, absolutely.
John E: And do a deeper dive. And in fact, here's an interesting idea: you may even want to have the in-the-moment questions inform the online survey. If you can get this going, it'd be really nice if you had a link between the two. This is somewhere I think dynamic surveys are really on their way, in terms of, you know, doing deeper dives based on what people have done before.
Smythe: Yeah. Absolutely. Yeah. I mean, kind of building off that, you know, when you create these skills, if you're using a skills-based model of surveying, right? Assuming you're using Qualtrics, for instance, you can have a dynamic link that will identify which skill you offer to your panelists, and that populates within the survey flow, right? And so to your point about customizing the links and the skills that people get relative to the surveys they're doing: absolutely, you can do that with the existing technology. You just need a good understanding of using embedded data and linking embedded data through the URL to the customer ID that you're testing with.
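[Editor's note: passing embedded data through a survey URL, as John describes, usually means appending query-string parameters that the survey platform reads into its embedded data fields. A small sketch follows; the base URL and the field names (`panelist_id`, `skill_name`) are hypothetical.]

```python
# Build a per-panelist survey link carrying embedded data, then show how
# the receiving side can recover the assignment from the same URL.
from urllib.parse import urlencode, urlparse, parse_qs

def survey_link(base_url, panelist_id, skill_name):
    """Append embedded-data fields so the survey flow knows which Alexa
    skill this panelist was assigned."""
    params = urlencode({"panelist_id": panelist_id, "skill_name": skill_name})
    return f"{base_url}?{params}"

link = survey_link("https://example.qualtrics.com/jfe/form/SV_abc123",
                   "p-0042", "shampoo_demo")
print(link)

# The survey side can recover the assignment from the query string:
fields = parse_qs(urlparse(link).query)
print(fields["skill_name"][0])
```

The same mechanism works in reverse: the skill can hand the panelist a link whose embedded data identifies which in-the-moment questions they already answered.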
John E: That's fascinating. Yeah. So you could actually have someone doing a survey on the phone. They get to some point, and it says, all right, now let's do the Alexa part. Then they go back and finish on the phone. Oh, that's really good.
Smythe: Yeah, absolutely.
John E: Yeah. This is why I do these podcasts, because these kinds of insights come out of discussions like this. You know, that is really, really good. Okay. Excellent. Is there anything else you wanna say about that? I mean, I think the future's super bright for that line of research.
Smythe: Yeah, sure. I mean, another thing I think is super important, without going into too much detail about panelists and things like that, but just kind of top line, is that an interesting thing about Amazon is that it allows for subscription programs. And so if you've got panelists who are part of your community that you're reaching out to, right? There are Amazon subscriptions where you could get them to enroll with their customer ID into these programs. And in a sense, you can sort of create a community with them, and you don't need to use the Amazon community. You could use any sort of subscription program you want. But the idea is that you've got direct access to this pool of people who want to be in the program. You create the shell of an Amazon skill that goes to them, right? And that can be their gateway to your studies. So instead of doing traditional recruiting, right? You could even have a system where you say, hey, look, we've got an Alexa skill. You log in and you say something like, hey, you know, Alexa, what tests are available to me, possibly matching a profile that they filled out in advance? These kinds of things can really simplify and streamline the screening process for us, I think, if we lock into them correctly and we deploy them correctly.
John E: That's fascinating. Do you think, and this is just whatever you can say, I'm not trying to bring in inside information here, but I'm just wondering, do you think the developers who had the idea for Alexa had any idea that this kind of stuff was going to start happening? Or did they just think, here's a cool way to sell products with voice? I mean, was there any vision, do you think, that there'd be so many aftermarket applications for this technology?
Smythe: I mean, my understanding of the way the tool is built, and this is just my understanding, not an official take from the Amazon perspective, is that they were creating this tool so that they could create this sort of open source community where people could build anything, where you could interact with the device in an automated way. I don't know that they were necessarily thinking people would create surveys, for instance, but in their earliest things, they were basically asking people questions and getting responses. Right? And in effect, even if you maybe don't have the long-term vision to say, hey, how do we create surveys using a skill, you know, some of the earliest stuff would be things like: what time is it? Where do you live? What's your name? Right? I mean, these are surveys. Right? And so they were trying to create a mechanism to share information, to make a request and get something back. The AI coming back and having Alexa be smart enough to come up with, you know, dynamic questions, I don't think is there today. But I think that's always been part of the vision, you know, to have this sort of interaction with Alexa, just like all AI that allows for two-way communication. And, you know, surveys would just be an interesting use case of that.
John E: Right. So just kind of a general purpose technology that was developed?
Smythe: I mean, in my understanding, you know, I think it was there to help with ordering. It was there to help with refilling subscriptions, things like that. But the tool set is so varied and broad, and people are so creative in how they use the tool. It's just, I think it's just blowing up.
John E: Right. So it was designed in the beginning to be, what's the word, extensible. It was supposed to just be, here's some technology, do whatever you want with it, and then that leads to all these cool things happening, like the surveys, or whatever else, like the choose your own adventure stories that my son does.
Smythe: Exactly. Yeah, I mean, that's my understanding. And choose your own adventures are a perfect use case. We do a lot of conditional flow surveys, right? Where the survey that you get is going to be contingent upon how you've answered things in the past. And that is effectively a which-way book. Right? And so they have apps out there now that are which-way stories. And if you can get the source code for a which-way story, you can literally adjust it fairly easily to create these sort of individually tailored survey experiences. But the interesting thing, again, is that the more you can gamify these things, the more you can make it fun for your respondents, the longer I believe they're going to engage with the skill and the more likely they'll come back and engage again.
John E: Right. That is interesting. Did you ever play the Infocom games when you were a kid? Did you play Zork or any of that stuff?
Smythe: I didn't play those as much, but I know of them.
John E: I don't know if you're old enough. I'm 43. So I might be older than you at this point.
Smythe: I'm actually 44, so I got a year on you.
John E: Oh, okay. Well, yeah. But those Infocom kind of games, I'm still waiting for somebody to put those on Alexa, because I would just play them all day long if I had the chance. I bet you've heard of Zork; it's a text-based game. I spent a lot of my middle school years playing those games.
Smythe: I've definitely heard of it. Yeah. I used to go on the BBS servers and play more like empire-building games, like Dragons Den and stuff like that. I love that classic stuff.
John E: Yeah, it's good. Alright. So, it's amazing, John, this has been really fun, but we have five minutes left. In the time we have left, from your vantage point, I was hoping to get into some of this. Well, quickly, can we talk just a little bit, because we touched on this in our precall and I thought it was interesting: it's important, obviously, that Amazon maintain trust. But with all this data being collected, what are some of the steps Amazon is taking to help maintain that kind of trust with their customers and the users of Alexa, that, you know, their data is being managed in a responsible way and not being abused in any kind of way?
Smythe: Sure. So, I mean, you know, Amazon has some fairly rigorous policies in place that really focus on customer trust and ensuring that the customer believes that Amazon is a good steward of their information. So, for instance, we don't sell customer data. We don't share customer data directly with our vendors or our sellers. When data from customers is used, we use it in aggregate. Customers also have all these protections from the government in place, the ADU compliance things. But the reality is that Amazon takes customer data extremely seriously. I couldn't find your address in Amazon if I worked on it for a thousand hours. That data is not available to me. Even if I had your customer ID, I could not find your address, ever. It's just not there. Now, if I did know your customer ID, and you should never give anyone your customer ID, it's kind of like your Social Security number equivalent at Amazon, don't give that number out. But if I did know it, I could theoretically tailor marketing messages. And this is what we do through sort of automated systems: for people who have purchase activity, we will tailor marketing campaigns to help find them offers that others have taken advantage of, to help give them a benefit. But we don't do individual-level mining of data. And it's funny, because even purchase patterns, in a certain sense, if they could in any way be considered personally identifying information, Amazon disallows the use of them for any sort of business opportunity. We really look at stuff in aggregate, and the ability for us to find an individual and information about that individual is actually extremely, extremely difficult.
John E: Now, that's reassuring. So there's no Cambridge Analytica in Amazon's future?
Smythe: I highly doubt that's in our future.
John E: Fair enough. Alright. Well, this has been great, John. So do you have any final advice for sensory scientists? I always like to close with that question, you know, especially for you, being in an obviously tech-first environment. What do you see in the next couple of years that sensory scientists should be thinking about?
Smythe: I mean, for me, you know, I think I was raised in the era of sensory where we're looking for sort of absolute metrics of success. So it's a seven on a nine-point scale, or, you know, 60 percent of our population liked a product, or whatever it is. And I feel like, the way things are going, we really need to focus a little more on personalization, on the individual perception of a sensory panelist or a consumer and how that relates to a product offering. This is some of the work I did at Tate and Lyle, you know: we identified that there are different dose-response curves for how people respond to something like stevia Reb A, right? And, you know, everyone's like, oh, I hate stevia. And what it turns out is that, yeah, I mean, at some point, you know, many, many people do not like stevia, but there's a point below the threshold where everybody likes it. Effectively, it seems, and it's in the patent, you can look up the numbers, I won't give them out here, but basically you can replace one or two percent of sugar for almost everybody. And if you're doing a sugar reduction like that, you know, stevia is a great solution. But historically, people didn't think about inter-individual differences in perception, and they sort of wrote off something like stevia as an effective tool in the toolkit for a certain portion of the population. And what I found in all of my research is that tailoring your studies to understand the individual dynamics will help you find sub-populations that you can be extremely successful with. If you go for the sort of blanket approach, like, oh, I need a seven on a nine-point scale to launch a product, and you look at your data and half your population gave it a nine and half your population gave it a one, well, that could be an amazing product for 50 percent of your population, which could be a totally viable market opportunity.
And that's the kind of thing that I think, as we sensory scientists evolve in our field, we need to understand: those individual dynamics, and what represents a good business case for something. Maybe it's not gonna be the Coke or the Pepsi of the world, but maybe it's a really good Dr Pepper, right? That kind of stuff, I think, is where the future is: microtargeting of customers and of sort of consumer groups.
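[Editor's note: the "half nines, half ones" point can be illustrated with a toy calculation. The scores are invented, and the 7-of-9 threshold stands in for the launch criterion John mentions.]

```python
# A polarized liking distribution: the grand mean hides a sub-population
# for whom the product is a clear winner.
import statistics

scores = [9, 9, 9, 9, 9, 1, 1, 1, 1, 1]  # 9-point liking scale, polarized

overall_mean = statistics.mean(scores)      # looks like a failure against a 7 cutoff
lovers = [s for s in scores if s >= 7]      # the "coffee ice cream" segment
segment_share = len(lovers) / len(scores)   # share of the population it wins
segment_mean = statistics.mean(lovers)      # liking within the segment

print(overall_mean, segment_share, segment_mean)
```

The blanket criterion rejects the product at an overall mean of 5, even though half the population rates it a 9, which is exactly the viable-niche argument above.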
John E: Yeah, definitely. Yeah. I mean, I see that with some of the machine learning models I've been involved with, where we have models that are trained on things you know about people and things you know about the samples, and then you predict liking, or you predict whatever the target is you care about. And it is interesting to take that perspective, because we're always trying to change the product, right? But if the product is fixed, you might ask yourself, alright, what's the group of people to whom we should try to sell this thing? Right? That's another perspective; it's kind of optimizing the consumer. That's a really interesting idea. And that's exactly in line with what you're saying about personalization: you have something, and there are some people out there who appreciate it. Maybe not everybody.
Smythe: Yeah. I mean, one of my personal favorite examples is coffee ice cream. You know, people who love coffee ice cream love coffee ice cream, and probably no other flavor. And people who are into vanilla and chocolate and strawberry, they're kind of, yeah, coffee ice cream, it's okay. So, you know, find your coffee ice cream.
John E: There we go. That'll be the title of the show: Find Your Coffee Ice Cream. Alright, John, so how can people reach out to you, or find you on LinkedIn, if they have questions or follow-up?
Smythe: Yeah, I'm on LinkedIn. You can also email me at my personal email address, it's firstname.lastname@example.org, if you have any questions. I can't give out my professional email address; Amazon has a very strict policy about giving out your email. But if you find me on LinkedIn, I'm John Edward Smythe. I love talking about sensory and customer-facing issues, so I'm happy to discuss with anyone at any time.
John E: Great. And we'll put the link to your LinkedIn page on the show notes so people can just click and find you.
John E: Great. Okay, well, this has been great. Thank you so much, John for doing this.
Smythe: Yes. It's been a pleasure, John. Yeah, appreciate it.
John E: Awesome
Smythe: Have a great day.
John E: Okay. That's it. Hope you enjoyed this conversation. If you did, please help us grow our audience by telling your friend about AigoraCast and leaving us a positive review on iTunes. Thanks.
That's it for now. If you'd like to receive email updates from Aigora, including weekly video recaps of our blog activity, click on the button below to join our email list. Thanks for stopping by!