
Understanding pet shopping journeys for Edgard & Cooper – From shelves to screens and AI

Editor's note: This article is an automated speech-to-text transcription, edited lightly for clarity. To view the full session recording, click here.

AI has allowed researchers in the industry to move at the speed of business without sacrificing quality. But the technology continues to evolve, giving researchers more tools.

One example is the new technology from Conveo.ai, which is referred to as ‘multi-modal video insights.’

Niels Schillewaert, head of research and methodologies at Conveo.ai, explained the benefits of this new technology in a 2025 Quirk’s Event – Virtual Global session sponsored by Conveo.ai. Schillewaert used a case study with Edgard & Cooper, a pet food company, to illustrate the benefits of AI and multi-modal video insights.

Session transcript 

Emily Koenig Hapka 

Hello and welcome to the session, “Understanding pet shopping journeys for Edgard & Cooper – From shelves to screens and AI.” My name is Emily and I'm an editor at Quirk’s. 

Before we get started, let's quickly go over the ways you can participate in today's discussion. First, you can use the chat tab to interact with other attendees during the session. Second, you can use the Q&A tab to submit questions for our speaker during the session.  

This session is brought to you by Conveo.ai. Please enjoy the presentation!

Niels Schillewaert 

Hi everyone and thank you, Emily, for the introduction. 

My name is Niels. I am the head of research and methodologies at Conveo.ai. And as Emily just said, I'm going to talk to you about pets. This is a case study that we have conducted together with Edgard & Cooper, which is part of General Mills. 

One of the things we want to convey to you is that the promise of AI for consumer insights goes far beyond qual at scale or AI moderation. The capabilities are important, but they are just a springboard and the start of a new journey. 

If you actually think about why we would use AI for consumer insights, there's both a consumer reality that is changing and a business reality, but there's also a research challenge in terms of human constraints, which we encounter.  

The real transformation of AI lies in how we interact with real humans, understand them holistically and collaborate with them.  

If you look at some of the business challenges that Edgard & Cooper, as part of General Mills, is facing, it's all about diverse and evolving consumers, in this case pet parents. They are becoming more health-conscious feeders, focusing on natural ingredients for their pets, for example, or they are culinarily inspired. Just like with human food, they want to give their pet the best options. 

Their behaviors are also becoming more digital, and that has an impact on their shopping behavior. They interact with platforms like ChatGPT to get recommendations on where to go for convenience and where the shopping experience is best for that. 

They also have a say-do gap, or a gap between the ideal and the practical, when shopping for pet food.  

When you look at conducting research, there are obviously a lot of challenges there too. When you want to conduct a five-country study with 30-minute interviews, it can easily take two months before you can even start the analysis. 

There are obviously constrained resources in terms of time as well as cost, on both the client and the agency side. And the insights need to be delivered at the speed of business, which is accelerating more than ever before.  

Doing better, faster or cheaper is no longer enough. You need to do all three and even go beyond that and be transformative.  

That's really where the promise of AI for consumer insights kicks in.  

A platform like Conveo.ai helps you scale your team without really scaling your team. It gives you insights superpowers, whether you're on the agency side or the client side. 

We've developed a platform where AI helps you set up your study and, with a couple of clicks, recruit participants.  

It conducts video moderation in a very human-like fashion.  

Once you've collected your data, generating your analysis is a click away, and you can interact with the platform and your data. Let's say you're sitting on a pile of answers: you can ask the right prompts and the right questions and co-create insights. 

I won't be demoing the platform. You'll see some of that happening throughout the case study, but that's really what the Conveo.ai platform does end-to-end.  

When you look at Edgard & Cooper and their usage of the Conveo.ai platform, it's really all about scaling that team without having to scale the team.  

They are a team of three full-time equivalents, but they have been able to run multi-country studies across a multitude of use cases in a heartbeat and have interviewed almost 900 people in their natural and rich environment. 

And with that, they've been a leading-edge user within the General Mills family, and all of this strengthens the insights capability at Edgard & Cooper. 

So, as a case study, what have we done? 

Let's have a look at the methodology and the research design of the two studies that we've done. 

Basically, we wanted to understand how pet parents navigate throughout their purchase journey across retail channels.  

Therefore, we sent over 30 people to a regular brick-and-mortar store, where they used their mobile phones to conduct an interview in front of the aisle lasting about 25 minutes. We also had more than 40 consumers conduct a shopping mission on Amazon. All of this was conducted across three countries. 

Basically, when people went to Amazon, they shared their screens and showed how they navigated the e-commerce platform. 

All of that was done through Conveo.ai video moderation, which obviously captures attitudes as well as actual shopping behaviors. 

Let's have a look at a couple of takeaways illustrated by a number of video quotes straight from the Conveo.ai platform. 

First of all, there's a lot of overlap between the digital channels and the physical brick-and-mortar channels. In both, nearly everyone executes a quick search, then develops a shortlist anchored on price, brand, pack size and ingredients. 

There's some nuance, of course, according to the channel, especially with pack size. Online, people want to make sure that they're buying the right-sized pack, not too little, not too big, but transportation is not an issue. 

In the offline environment, pack size is important, but people are concerned if they have to buy big packs; they're concerned about transportation. Mainly, though, they do a search, develop a shortlist and look at price and brand, which are super important.  

Ingredients are super important as well. That obviously has to do with front-of-pack analysis of claims and a fast, heuristic quick screening. Then people go into a deeper check of specific ingredients. 

In the offline channel, when people look at claims, the claims, the colors and the imagery get much more attention. The ingredient-list check happens, but it is less consistent because people are under time pressure; they're shopping for other goods as well.  

In the online channel, shoppers rely on clear hero images, while secondary images of the back of pack serve as a more detailed verification step. 

The second takeaway is that there's channel nuances in what actually triggers the selection. 

Offline, shoppers rely much more on shelf visibility. They look at brand blocks and color cues for brands to enter their consideration shortlist.  

On shelf, the colors and the claims do the job as tiebreakers. When people are in front of the shelf, they first of all do a quick scan in terms of visibility. Then they look at blocks, colors and cues, and when those are still tied, they will look at claims.  

Let's look at, and listen to, this consumer who's in front of an actual shelf and how she interacts with it.  

[Niels Schillewaert plays four videos on screen of participants shopping for dog food in the store.] 

Niels Schillewaert 

As you can see, a very rich interaction across different cultures and languages with the shelf as people shop for pet food. 

When we look at the online channel, the story is a little bit different.  

Online, people rely much more on filtering and variant selection. They also consider subscriptions: if I subscribe to buying this specific dog food, I can reduce my cost, save and get discounts. 

Once people have gone through the filtering, the variant selection and the subscribe-and-save options, they will look at reviews as the tiebreaker. Remember, in the offline channel they look at claims; in the online environment, they look at reviews. 

Let's have a look at how a couple of people interact with the Amazon platform when it comes to selecting dog food. 

[Niels Schillewaert plays four videos on screen of participants shopping for dog food on Amazon.] 

Niels Schillewaert 

So, as you can see, we're able to capture both digital as well as the offline shelf experience. 

A final learning from this project, which was important for Edgard & Cooper, is that packaging is important, but its impact is much bigger and very different in-store versus online.  

Four out of 10 participants reported that packaging directly influenced their noticing, picking up and choosing a product offline, while only two in 10 cited the hero image or pack visuals as shaping their online cart contents. 

As you saw in the online environment, the packaging comes second. In the offline environment, it really comes first. 

Using the ability to talk to your data, we did an analysis asking, based on the two studies: what is the different role of packaging in each purchase environment?  

The holding power of packaging, which invites consumers to evaluate a product, is pretty consistent online versus offline. Offline there's the ability to physically pick up the pack, while online it's really about zoomable images.  

The closing power of packaging is also similar in terms of its impact online versus offline. Offline, a lot of it is based on habit or in-aisle reassurance, so people go for familiar brands or ingredients, while online they look at trust-building icons as well as reviews. 

The main difference lies in grabbing attention, or the stopping power of packaging, which logically is much higher in the offline, brick-and-mortar setting than online. Online it's much weaker: it's all about filters and ratings rather than the color contrasts or hero images that pop out on the shelf.  

So far, there are three key takeaways when it comes to the content of assessing digital versus in-store shopping behavior.  

Now the ability that Conveo.ai brings to the table goes further. 

One of the abilities Conveo.ai has, which is bleeding-edge technology, is what we call ‘multi-modal video insights.’  

What multi-modal does is look not only at what people say, but also at the content of the videos themselves. In this shopping context, digital as well as offline, the video analysis captures what people do, which product they pick up or click on, what emotions they show while interacting with, for example, the Amazon website, and which branded products they may not report or verbalize but do consider while shopping.  
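As a purely illustrative sketch, and not Conveo.ai's actual implementation, this kind of say-do gap detection can be thought of as comparing brands named in the spoken transcript against brands detected in the video frames. Every function name and data value below is hypothetical:

```python
# Hypothetical sketch: flag brands that appear on screen but are never verbalized.
# In practice the frame detections would come from a vision model; here they are
# hard-coded for illustration.

def say_do_gap(transcript, frame_detections):
    """Return brands detected in the video but absent from the spoken transcript."""
    spoken = transcript.lower()
    gap = [brand for brand in frame_detections if brand.lower() not in spoken]
    # Deduplicate while preserving the order in which brands were detected.
    return list(dict.fromkeys(gap))

transcript = "I would go for this one here, the colors really stand out to me."
detections = ["Wag", "Pooch and Mutt", "Wag"]
print(say_do_gap(transcript, detections))  # ['Wag', 'Pooch and Mutt']
```

A real pipeline would also align detections with transcript timestamps; this sketch only shows the core comparison.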

Let's have a look first at an illustration of an interview transcript and a video transcript of somebody who's in front of the shelf. 

What you see here is the Conveo.ai platform at the interview level. In the center is the video, and next to it are both the transcript of what people say and the video analysis of what it captures.  

[Niels Schillewaert plays a video on screen of a participant shopping for dog food in the store.] 

Niels Schillewaert 

So, what you see here, first of all, is the transcript of what the participant said, and it's coded. In this case, the person doesn't mention any brands; she refers to the colors that stand out.  

Next to that is the video analysis. It looks at the vocal, the tone of voice; the verbal, the type of words somebody uses; and the visual, which in this case points to bags of the Wag brand. Again, what's really important here is that there's a gap between what people say and what they actually do or show.  

So, we can basically bridge the gap between what people say and what they do. We can analyze the tone of voice, and we can see which SKUs draw people's attention. 

Let's have a look at something similar when it comes to bridging the say do gap when people interact with the Amazon platform.  

[Niels Schillewaert plays a video on screen of a participant shopping for dog food on Amazon.] 

Niels Schillewaert 

So again, a couple of things you see here. This is the transcript, and you may have noticed the lady repeatedly says, “I would go for this one here.” Again, she does not mention any brand. She also doesn't mention any emotion, but the video analysis is able to determine that when she says “this one here,” she was, at a certain moment, talking about the Pooch and Mutt brand.  

She was also expressing emotion, and the video analysis is able to listen to her vocal tone of voice as well as read her facial expression to reveal that emotion. It can base its analysis both on the video and on what people are sharing on screen.  

This capability of multi-modal analysis is really strong and as I mentioned, it's bleeding edge technology. 

What we're able to do is not only analyze what people say, but also what they do and how they say it. 

Currently, although it's bleeding edge, we're not at the end yet. We do this multi-modal analysis post-interview: we conduct the interview and then run the multi-modal analysis.  

The next step, which will completely change the way we have AI interview people, is obviously when we are able to run multi-modal analysis and video insights live. 

In the case of the last lady, for example, when she says, “I would use this one here,” the AI moderator will be able to say, “Oh, it seems like you're looking at Pooch and Mutt. Can you explain a little bit why you would do that?” 
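The live follow-up described could be triggered, in a simplified hypothetical sketch, whenever the vision analysis detects a brand that the participant's words do not name. The function and messages below are illustrative assumptions, not the platform's actual logic:

```python
# Hypothetical sketch of a live probe: when the participant's utterance refers
# to an on-screen product without naming it, generate a follow-up question.

def probe_if_unnamed(utterance, detected_brand):
    """Return a follow-up question when the detected brand goes unmentioned."""
    if detected_brand.lower() in utterance.lower():
        return None  # Brand was verbalized; no probe needed.
    return (f"It seems like you're looking at {detected_brand}. "
            "Can you explain why you would go for that one?")

print(probe_if_unnamed("I would use this one here.", "Pooch and Mutt"))
```

A production system would of course decide when to probe based on richer signals (timing, emotion, conversation state) than a single string match.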

So, multi-modal analysis is a capability that sets our platform apart from other platforms.  

I want to wrap things up and leave some room for Q&A.  

If you look at what Conveo.ai is able to do and how we leveraged the insights in terms of shopping behavior, both digitally as well as offline for Edgard & Cooper, you can really scale your insights power.  

The AI assists you in amplifying your capabilities exponentially, which enables you to deliver insights at the speed of business. It also creates what we call the gift of understanding. 

You can lead with wonder and excite your internal stakeholders, which may not always have been possible before. 

Obviously, this is all new and so it creates curiosity, but we as researchers also have to adapt our skills.  

We're also using AI with real humans, so we're not relying on synthetic data. This is really tapping into people's and consumers' reality.  

And because AI has no human constraints and takes on a lot of the tedious jobs, we can scale depth. For us, the power of AI, as I mentioned, goes beyond just moderating at scale.  

It's really a new methodology: combining the depth of qual research with the breadth of quant research. The sweet spot lies in quantifying the “why.” 

Thank you so much for your attention. And before we go to Q&A, please feel free to scan this code and join our webinar on December 3rd, where we're not going to demo our platform, but we're going to reveal the results of a global study on how AI influences people's shopping behavior.  

Now, we'll turn back to the platform, and I'm not sure if there are any questions coming in.