Create successful new product introductions by understanding shopper emotions

The harsh reality: 80% of new product introductions fail within the first year. However, through the use of technology that evaluates shoppers’ emotions, retailers and CPGs can now generate insights into how people engage with new items, new store layouts and new shopper marketing approaches before they are ever released in store. Hear from Andy Harris, Director of Virtual and Augmented Reality at Symphony RetailAI, as he explains how this technology works in conversation with Mike Troy, Editor-in-Chief at Retail Leader.

  • Video Transcript

    MT:      Hello, Mike Troy with Retail Leader here at the Symphony RetailAI Xcelerate Conference in Irving, Texas.  I’m joined by Andy Harris.

    AH:      Hello.

    MT:      Good to see you.  So, shopper experience insights.  You’ve got an interesting solution here, and we were talking a little bit about it, the two key use cases being brand owners, but then also retailers that are developing formats and new concepts.  Walk me through the problem you’re solving here for companies…

    AH:      Sure.

    MT:      …and how this actually works.

    AH:      Sure, well, what we’re looking at here is a solution to address the issue of new product introductions.  Obviously, 80% of new product introductions fail within the first year.  What we’re trying to do here is generate some insights to understand how people engage with these new items, these new store layouts and new shopper marketing approaches before we actually put them out into store.  So it’s about shopper experience and shopper engagement.  Shopper experience, of course, is a crucial thing in brick-and-mortar environments at the moment, because we want to engage people and bring them off the couch to interact with the stores themselves.  So this allows CPG companies and retailers to evaluate new concepts, and the emotional engagement they generate with customers, before they actually launch them.

    MT:      And how does that work, because I see that little picture of you here, and it’s capturing your emotional state?

    AH:      Yeah, so what we’re actually seeing here is that we take a number of different feeds and information elements, including conscious surveys, our 3D store information and so on.  But what we’re looking at here is our emotional approach: something which gives us the ability to track the respondent’s emotions in response to the stimulus that they’re seeing.  So this is what happens behind the scenes.  The respondents don’t see this, but we send out a survey to a quantitative panel, 1,000 people plus, and we give them some different examples of in-store stimulus.  And those could be anything from stills, to interactive 3D walk-throughs, to shopability environments.

    MT:      And I know privacy’s a big deal, so people opt in, they know that they’re being…their emotions are being captured via a webcam?

    AH:      That’s correct, yeah.

    AH:      Everything is completely GDPR compliant in Europe and there’s no PII being saved.  Even though we’re seeing the respondent here, we’re not actually capturing any of my facial information, nothing that will actually identify me to anyone else.  What we’re capturing is the emotional response that I’m having to the items that you’re seeing on the right-hand side, and to the survey itself.

    MT:      How does capturing that emotional response help me not fail as a new product?

    AH:      So it’s another stream of information to understand the efficacy of the items that you’re launching.  Obviously, you can track how people buy things and what the intended purchase is going to be.  But what we can do here is understand the engagement with the product or the launch items that we’re putting together.  In this particular instance, we’re looking at different produce environments and trying to understand which of these generate more positive emotional reactions to the supermarkets…to the stimulus you’re seeing here.

    MT:      So this section of produce here, like what would you be looking to analyse, based on this?

    AH:      So we are looking at things like price information.  We’re looking at the overall visual merchandising effect.  We’re looking at any promotions that we have on there, the quality of the product that’s on display, and we’re using all of those as stimuli to be able to understand the customers’ responses.  So alongside this stimulus, we then ask questions about what you’ve just seen, so we can get some conscious responses to what you’ve seen as well.  And we can match those conscious responses with the unconscious responses that I’ve actually generated through my emotional reaction.
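
To make the conscious/unconscious pairing Andy describes concrete, here is a minimal, hypothetical sketch of how per-respondent survey answers and webcam-derived emotion scores might be matched up. The column names, the 1-5 intent scale and the score ranges are illustrative assumptions, not Symphony RetailAI’s actual data model.

```python
# Minimal sketch (illustrative only, not the vendor's pipeline): pair each
# panellist's conscious survey answers with the unconscious emotion scores
# captured while they viewed the same stimulus.
import pandas as pd

# Conscious responses: what respondents *said* about the stimulus.
survey = pd.DataFrame({
    "respondent_id": [101, 102, 103],
    "stimulus_id":   ["produce_layout_A"] * 3,
    "purchase_intent": [4, 2, 5],   # 1-5 Likert answer, e.g. "Would you buy here?"
})

# Unconscious responses: webcam-derived emotion scores for the same viewing.
emotion = pd.DataFrame({
    "respondent_id": [101, 102, 103],
    "stimulus_id":   ["produce_layout_A"] * 3,
    "mean_valence":  [0.42, -0.10, 0.65],   # -1 (negative) .. +1 (positive)
    "mean_arousal":  [0.55, 0.20, 0.70],    #  0 (calm)     .. 1 (excited)
})

# Match conscious and unconscious responses per respondent and stimulus.
matched = survey.merge(emotion, on=["respondent_id", "stimulus_id"])

# Does stated purchase intent line up with measured emotional engagement?
print(matched[["purchase_intent", "mean_valence", "mean_arousal"]].corr())
```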

    MT:      So the ability to capture that visual emotion also works for me as a retailer, if I’ve got a…if I’m resetting a store or changing a design, you can see how I’m reacting to that as well.

    AH:      Exactly.  And we use that in combination with things like eye tracking as well, to be able to understand what people are looking at in store.  But here, we’re looking specifically at different emotional types, but also at what we term valence and arousal.  So, how engaged people are.

    MT:      I’m sorry, what was that?

    AH:      So this is…we’re looking at valence and arousal.

    MT:      What is valence?

    AH:      So that’s kind of the…valence and arousal are two different dimensions of emotional response.

    MT:      Okay.

    AH:      So valence is about whether you like something or not, frankly.

    MT:      Okay.

    AH:      And arousal is how engaged and excited you are by it.

    MT:      Yeah, you know, I’m aroused right now, talking about this solution.

    AH:      I mean, who wouldn’t be?  Who wouldn’t be?

    MT:      Yes, exactly.

    AH:      It’s kind of like what you feel when you watch horror films, for instance.

    MT:      Right.

    AH:      You know, you may be getting a very strong emotional response but you may not enjoy the experience.

    MT:      Okay.

    AH:      So if I’m getting a very excited response, high arousal, but I don’t like watching horror movies, I’ve got negative valence.  So that actually gives us the ability to look at different degrees of emotion through that particular stimulus.  Now we’re applying that type of technology to stimuli in supermarkets.  So that could be specifically for new items.  Of course, new items are when you start disrupting the shelf, because you’re adding new packaging designs, you’re adding new subcategories into the environment itself.  You might be adding a whole new niche subsection to your planogram and display.  And you may be supporting that with different emotional calls to action as well.  So shopper marketing, activation signage, pricing information, all of that type of stuff.  And what we’re looking to do with this is to track how your emotions are responding to that, and use that to look not just at shopability, but also at the engagement of the customer with that particular item, with that particular offer.  So it’s another stream that we can look at, to try and quantify the effect of the environment in store.
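
As a rough illustration of the valence and arousal framing Andy uses, the sketch below buckets a single (valence, arousal) reading into a coarse quadrant, including the "horror film" case of a strong reaction you don’t enjoy. The thresholds, score ranges and labels are arbitrary assumptions for illustration, not the scoring the product actually uses.

```python
# Hypothetical quadrant bucketing: valence = do I like it, arousal = how
# worked up am I.  Thresholds and labels are arbitrary illustrative choices.
def emotional_quadrant(valence: float, arousal: float) -> str:
    """Bucket a (valence, arousal) reading into a coarse quadrant."""
    if arousal >= 0.5:
        return "excited / engaged" if valence >= 0 else "agitated (horror-film response)"
    return "content / calm" if valence >= 0 else "bored / disengaged"

# High arousal but negative valence: a strong response you don't enjoy.
print(emotional_quadrant(valence=-0.6, arousal=0.8))  # agitated (horror-film response)
# High arousal and positive valence: the reaction a new display hopes for.
print(emotional_quadrant(valence=0.7, arousal=0.8))   # excited / engaged
```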

    MT:      And you just help me make easier, better decisions, so I don’t make the wrong decision?

    AH:      Absolutely, so our CPG partners are using this to validate the items that they’re putting together, their offers, and to create better business cases to take to their retailer partners when introducing these items, because they can say that this new item that we’re launching engages with customers when we put it in this planogram, as opposed to this one.  Or in this part of the store, as opposed to somewhere else.  So it’s not just about, simply, are people buying, but also about which emotional cues are causing them to convert their purchase.

    MT:      And then if I’m designing a store, it also would help me avoid making a bad choice about putting a fixture this way, instead of that way…

    AH:      Yeah, absolutely, so when we’re using it with retailer companies, or people who are looking to create specific new concepts, next-generation retailing environments for instance, then we can use this with our eye tracking scenarios to be able to see what people are seeing in this new store concept, and what’s emotionally engaging them within the store itself.  And it’s a very quick turnaround process to be able to generate this type of insight, even for stores that haven’t necessarily been built yet.

    MT:      Right.  Are you able to use it for analysing people’s emotions about…let’s say, I’m redesigning my website and I want to make sure I don’t…you know, if I put some product over here, versus over there?

    AH:      Yeah.

    MT:      Are you there yet on…?

    AH:      So that’s not been a key focus for us at the moment.  We’re still very much focused on the in-store experiences.  But certainly, this type of technology has lots of uses in terms of the digital experience.  So for v-commerce and e-commerce, it’s very useful for simple A/B testing of different environments, because we can understand what people are looking at.  But also, potentially, to be able to interact with the user experience.  So being able to understand the emotional responses to what people are seeing on that particular shopping approach, and maybe use that to generate offers, calls to action and incentives, once people are emotionally engaged with what they’re seeing.
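
For the simple A/B use Andy mentions, a comparison like the hedged sketch below is the basic idea: gather an emotion score per respondent for each variant and test whether the difference between the two looks real. The numbers are made up and the choice of a t-test is an illustrative assumption, not part of the product.

```python
# Hypothetical A/B comparison of the mean valence two layouts generate.
from scipy import stats

# Mean valence per respondent for each variant of a layout (illustrative data).
variant_a = [0.31, 0.45, 0.22, 0.50, 0.38, 0.41]
variant_b = [0.55, 0.62, 0.48, 0.70, 0.58, 0.66]

# Two-sample t-test: does variant B genuinely engage respondents more?
t_stat, p_value = stats.ttest_ind(variant_a, variant_b)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")  # a small p suggests a real difference
```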

    MT:      Are you getting a bigger reaction from brands or retailers?

    AH:      So far, it’s brands.  That’s not to say it’s not of interest and use for retailers, but CPG companies obviously are launching lots of new products throughout the course of the year.  I was speaking to a customer earlier on who said they were launching round about 70 or 80 products in the course of the year.  Now that is a high turnover, lots of value, lots of money actually riding on those successful implementations, so we’re seeing that CPG companies especially are able to use this to influence that type of product investment.

    MT:      You know who else is introducing a lot of new products?  Retailers’ own brands.

    AH:      Absolutely, yeah.

    MT:      Because a lot of them, a lot of major food retailers are up to 30% penetration with private brands.

    AH:      Totally.

    MT:      So there could be like a use case both ways, store design and brand products?

    AH:      Yeah, very much.  I mean, private label is almost the CPG company within the retailer.  So it’s about being able to understand the impact of those, especially when there’s going to be a direct comparison against the brand, be it the…or whether you’re comparing, say, a value label against a tertiary or secondary line.  Being able to then do those spot checks and focused analysis of how my new product extension for this private label compares against the brand leader.  Are we going to be able to encourage people to switch from the brand leader to the private label and maximise that?  So it’s kind of poacher turned game catcher…keeper, I suppose, for private label teams, as well as CPG companies.

    MT:      What was that, poacher?

    AH:      Poacher turned gamekeeper.  Yeah, it’s a UK expression.  So it’s using your skills in one area to be able to then flip to the other side and use it in that experience as well…

    MT:      Okay.

    AH:      …so there you go.  That’s one for the US market.

    MT:      Alright, I’ll remember that one.  Thanks Andy.

    AH:      Pleasure.

    MT:      Alright.