Transforming qualitative research: Real-world lessons in AI-powered analysis with CoLoop and STRAT7
Editor's note: This article is an automated speech-to-text transcription, edited lightly for clarity. To view the full session recording click here.
CoLoop was one of the organizations that sponsored a session in the 2025 Quirk’s Event – Virtual Global.
CoLoop Founder and CEO Jack Bowen and STRAT7 Group CTO Andrew Dare discussed two case studies in which AI led to quality insights faster than ever before. The two shared the importance of integrating AI early, balancing AI with researcher expertise and scaling collaboration with AI software.
Session transcript
Emily Koenig Hapka
Hello and welcome to the session, “Transforming qualitative research: Real-world lessons in AI-powered analysis with CoLoop and STRAT7.” My name's Emily and I'm an Editor at Quirk’s.
Before we get started, I want to go over a few ways that you can participate in today's discussion. First, you can use the chat tab to interact with other attendees during the session. Second, you can use the Q&A tab to submit questions for our speakers during the session.
This session is brought to you by CoLoop. Please enjoy the presentation.
Jack Bowen
Brilliant. Well, hello everyone and thank you so much for joining today's session. My name is Jack. I'm one of the co-founders and CEO at CoLoop, where we are building an AI co-pilot for insights and strategy teams.
If you haven't heard of us, CoLoop is an analysis tool built specifically for insights work, particularly qualitative. Today we're used by more than 400 teams globally, across agencies and brand-side organizations.
Today we are very fortunate to be joined by Andrew Dare of STRAT7.
Andrew, over to you. Would love to hear an introduction.
Andrew Dare
Yeah, thanks very much, Jack. Yeah, hello everybody.
My name is Andrew Dare. I'm the Global CTO for STRAT7. I'm responsible for all the technology that we deploy across the group.
But recently created an AI strategy lab within the organization where we're doing a combination of building our own tooling, but also then looking for partners in our AI ecosystem. CoLoop is one of those prominent partners.
We've been working with them for some time and we had some really great results from using the tool. So, obviously Jack and I will discuss that as we move forward.
Jack Bowen
Wonderful. Great. So yeah, I guess to frame the conversation: unless anyone watching has perhaps been living under a rock, the market research space is going through a huge amount of change, specifically on the qualitative and the insight side.
Teams today are under growing pressure; qualitative sample sizes are getting larger. There's more pressure to provide and analyze data in more and more formats, like video and audio, across different languages, on ever more compressed timelines.
And overall, client research teams and related functions are expecting more evidence-backed insights, delivered faster and to a higher level of quality.
At the same time, manual qualitative synthesis remains incredibly time intensive. In project briefs that we've seen, around half the research time can be spent on manual synthesis work.
AI now presents a real opportunity not to replace researchers, but to reduce the heavy lifting so that those researchers can devote more of their time and mental margin to value-add tasks like strategy, communication, deliverables and storytelling, in order to make sure that those insights actually drive significant impact for the stakeholders and clients they're being produced for.
So today, we're going to talk through two STRAT7 case studies, one in pharma and one in FinTech. We're going to use these in conversation to explore how AI, used as part of the process, has helped to reshape it and enable new levels of quality, efficiency and insight generation.
So, in the first case study, we're going to look at a global pharma client and an approach run by STRAT7.
Andrew, do you want to jump in and give everyone an overview of what this project was about and what kind of impact and concerns were involved?
Andrew Dare
Yeah, sure.
As Jack said, this was for a global pharma client. What they were trying to do was basically make sure that their partner organizations across 15 different global regions were working effectively.
And so what that meant for us, really, was going out and performing in-depth interviews in each of those regions, then bringing all that information back together and creating a set of insights we could present back to the customer to say, "look, these are the things that are working, and these are the things that aren't."
As you can imagine, quite an interesting challenge.
Jack Bowen
Absolutely.
With these sorts of large pharma projects, I mean you're dealing with multi-billion-dollar supply chains. I know it's an industry where collaborations and partnerships are absolutely fundamental to driving success and producing very complex products and offerings in this space.
What was the complexity around this, and how would you guys have approached something like this in the past?
Andrew Dare
I mean, complexity-wise, it was across multiple time zones in multiple regions, and we had 15 in-depth interviews to work through. Typically, what we would've done in the past is do the interviews and have analysts look at all of them. We would have worked through all the transcripts, noting the things that were really, really interesting.
It takes a long time to do that effectively. And if I'm being honest from a research perspective, it's pretty boring, right? Because you're wading through a lot of information to get to where the nuggets are.
So, using a tool to do that really changed the game for us. It really did because basically what happened was that we effectively pushed all those transcripts into CoLoop and said, “I want you to look for information, look for nuggets around a very specific structure.”
And in doing that, we were able to surface insight quickly and find nuances that we might not have picked up because of the sheer volume of material we were working with.
In the end, in terms of the researchers, the idea is that we're using the tool basically to accelerate our thinking. To get us to a place where we can actually apply our storytelling and our true insight, because we're not having to process the data to find those nuggets. Those nuggets were surfaced by CoLoop and it worked really well.
Jack Bowen
Definitely, and I guess with these pharma projects as well, I imagine that the types of folks who were able to engage with this information and understand the nature and the context of it, they're going to be specialists, senior folk, where that level of expertise really matters.
And so, with that in mind, on top of it being technical subject matter, these are some of your best people putting enormous amounts of effort into working through quite a lot of information.
Andrew Dare
Yeah, absolutely. At the end of the day, those people really are employed for their analytical skills. They're not paid to sit and look at transcripts and find things that look interesting. So, for us, it was fabulous.
The analysts still reviewed the things that came out, the insights that were generated, if you like. But in the end, they ended up doing a lot less of the heavy lifting in terms of finding those things.
For us, we very much see CoLoop as a time amplifier. It's not a shortcut; the work still has to be done. It just means that you get to those insights a lot quicker. And that allowed us to spend more time on the final outcome, on those insights, and genuinely delight the customer.
Jack Bowen
Yeah, I like that expression time amplifier.
I think with a lot of AI tools in service-driven industries, even in other areas like legal and accounting, there's always a tension because the work has essentially always been billed on time-based outcomes. So, efficiency can be a difficult subject to broach. But I think you really hit the nail on the head there in terms of what value actually means, and it's what we see across our other customers too: what it unlocks is how you can redeploy that time on the project to really focus on driving quality and good outcomes for the client.
Andrew Dare
For me, it's quicker time to insight. It sounds like a cliche, but actually it's important.
If we know that we are going to get good-quality data without having to trawl through all the transcripts, we can then start to think about what the structure of that looks like. We can spend more time thinking about what the final analysis looks like than we would've done, because we would've been potentially constrained by the time spent on the transcripts, or the fact that we've had to use other people to do them.
Then we're asking what that really meant, where did that come from?
The fact of the matter is we can get to that very quickly, and it allows those senior people to sit down and genuinely think about what it is we're going to present forward.
Jack Bowen
Yeah, definitely. And I think that's actually a theme across different applications and different AI tools as well, around amplifying and making sure that, as we discussed, your top strategists, the people that know the space and the client really well, are actually sitting in the driving seat. They're then able to orchestrate and amplify their time rather than spending it on tasks that could be automated.
Andrew Dare
Yeah, we're not talking about replacing researchers with AI.
What we're talking about is, and one of my colleagues coined this really well, I really liked it, "raising the base camp." You're starting the process higher up, with more to go on, rather than having to start at the very bottom and gather all that data together. Really, really interesting, I think.
Jack Bowen
Yeah. So, I suppose finally, what were some of the key learnings and things that really came out of this for the client, and how did it make a difference for you guys?
Andrew Dare
Well, in terms of key learnings, for us it was one of those early projects, so we definitely learned some stuff.
Back to the point about it being an amplifier: it doesn't replace experience. It doesn't replace skill. It just allows you to get to a place very quickly.
Again, sometimes I think with any of these AI products, context is king. So, looking at things in isolation doesn't work, and that's back to the skill of the researcher. That's why the researchers look at that data. That's why they analyze it because it's all about context.
Ultimately, using AI purposefully, working towards a purpose and knowing exactly what you're trying to achieve, massively increases the speed to insight. And it actually allows you to do that without compromising the quality of what you're trying to do. It really does.
Jack Bowen
Definitely. Cool.
We're going to move on to our next case study now.
This one was a FinTech brand launching a new tax filing feature. However, this didn't sound like it was your standard request.
Andrew, do you want to speak about the context around this one?
Andrew Dare
Yeah, so as you say, it was a client wanting to get some understanding of the new tax filing feature, how to get people to use it and everything else. The only problem was that we didn't really have much time at all.
In fact, the time was measured in days, and it was a very interesting and challenging thing to do.
In the end, we ended up doing eight qual interviews, basically talking to people about what we were going to do, and we did that over three days. So, we were doing quite a lot of in-depth interviewing in a short space of time. Obviously, when you're doing that, traditionally you'd end up spending quite a lot of time just doing the transcriptions.
So, you'd do an interview, and it may be a day before you even get a transcript back. Being able to do all the interviews and have them transcribed within the three days was a blessing at the very beginning.
But what made it even better was after that the customer said, "Well, actually once you finish the fieldwork, we really want some answers back in 24 hours, because we don't have much time here.”
So, of course, not only do you not have very long to do the interviews, you've also not got very long to do the insight.
I would say, and I don't think I'm exaggerating, that without CoLoop we wouldn't have been able to achieve it, or if we did, people would've been burning the midnight oil like in the past.
So yeah, having a tool that, as I said earlier, allowed us to surface insights quickly and get to the heart of the data without having to wade through it ourselves really, really made a difference there.
Jack Bowen
Definitely. Yeah, I think it's an interesting one as well because I suppose similar to the pharma projects, it's something that's incredibly high leverage in terms of the research value.
I can imagine this is a company with potentially millions of users, you are launching a new feature, it's incredibly stressful for all of the product teams. You want this thing to go well and you're bringing in someone like STRAT7 because you want a pro to come and help you figure out exactly how to make that conversion on your messaging as easy as possible.
Andrew Dare
Yeah, I mean for us, we're very much only as good as our last research project. So, we can't afford slip-ups. We don't want to give substandard work to our customers. It is not sustainable as a business.
I mean, at the end of the day, within 24 hours we managed to produce a 50-slide deck that had all kinds of insights and allowed them to move forward. As I said before, we couldn't have done that in 24 hours without the tool.
Again, the customer was impressed with the output and with the speed of it, but not just the speed, because it's no good doing it quickly if what you've produced is rubbish. They were impressed with the quality of it. And again, it really helped their business move forward.
Jack Bowen
Definitely, and I think it's another interesting example of how putting that kind of pressure on a team, even if they might be able to service the project in 24 hours, doesn't necessarily mean the quality would be preserved on those compressed timelines.
So, it sounds like in this example, having some process automation here gave your team some breathing room to actually make sure that they did a really good job.
Andrew Dare
Yeah, actually I think that's a really nice analogy. It gave them breathing room.
Having said that, I suspect if you spoke to the team separately, they would say, “Well, it would've been nice to have more than 24 hours to actually analyze the output and genuinely leverage the expertise that we have.”
But I think in terms of where they wanted to be, the tooling allowed them to get to that place where they could devote time to it, because otherwise they'd just be spending the time creating decks, doing transcriptions, doing the mechanics of the research, not the research itself.
Actually, I think that's a really interesting point: it allows you to get through the mechanics quickly, so that you've then got time to sit down and do the thought leadership, the things that you can't do with an AI. An AI can't draw those conclusions together.
So, it allows you to get to that place a lot quicker than it would do normally.
Jack Bowen
Yeah, I think the interesting thing about qualitative in particular as well is it is a real discipline in thinking. And when we've done our own qualitative research, I don't think you can ever overstate the importance of actually having the time to run discussions and talk to the other moderators and just go through the different permutations of what it could mean.
And with something like message testing, I'm guessing you're going to be turning around some recommendations at the end around either what to go with or how it should be adjusted or recommendations for which to split test.
So, the value in that is fundamentally creative on some level, in how you get to those strategic recommendations. And nobody functions well half-cut on a late-night deadline, trying to turn a 50-page deck around like that.
Andrew Dare
And that's the point. I think the expertise and the experience, the ability to provide that insight, whether it's five slides or 50 slides, doesn't come quickly. People learn these things over time.
We talk about human in the loop here, but for us, the important thing is that we're using the experience of our researchers tied to the tools to get them to that place quicker.
The idea is that, again, it's not just about speed, but about speed and quality, because the quality of the output allows the researchers to put the quality time in to generate the insights.
I mean, okay, a three-day interview schedule and a 24-hour turnaround is not ideal, but at that point I think the customer realizes where they are. And what you're delivering gets them to a point where they can make those decisions and have those discussions as well.
Jack Bowen
Absolutely. One of the outcomes we were discussing in preparing for this that I thought was really interesting was this point around the client reaction. One thing you said earlier was that, as an agency, you're only as good as your last project in terms of how clients perceive you. And I think one thing that really stood out here is the fact that you guys were able to leave a really great impression as well, which is so, so important in the agency space.
Andrew Dare
Yeah, absolutely. And even six, eight months ago, we wouldn't have been able to do it.
As I said, if we really were pushed, we could probably have got a report out, but who knows what quality it would've been, and it would be very difficult for us to release something subpar as well. That's the other point.
As a business, we don't want to release stuff that's not the best quality. So, it would've been a real struggle for us in terms of meeting the deadline, but then also knowing that it's maybe not as complete as it could be. Whereas with tooling to support us, we managed to get a much more complete picture out than we would've done normally.
I think for us it was almost reputation-saving, because we managed to achieve something. The customer was very happy with what we achieved, not just the speed, but also the insight that was given and the depth of that insight as well.
"How did you manage to deliver all this in 24 hours?" Well, that's how we did it.
Jack Bowen
Yeah, it's that kind of magical experience you get to give them.
And I think one other thing you mentioned there too: anyone who's run a research project on compressed timelines knows the most frustrating feeling, thinking back about it later, after the submission deadline, after the decision's been made, that there was more juice to squeeze out of the lemon if you'd just had a bit longer to play with the data. Minimizing that, at the very least, is the level of craftsmanship that you want to be able to elevate people to, so that they perhaps don't feel like that as often.
Andrew Dare
Yeah, no, I agree.
And I think it's really important that the researchers are really happy with the output from the tools, that they can depend on it and trust it. Then they know that they can apply their thinking and their expertise to do a really great job. I think that's really important. Absolutely right.
The last thing you want is that feeling of, "oh, I missed that," or, "oh, well, if I'd had a bit more time, I would've maybe chased that a little bit further." I think that's always going to happen, right? This is not going to fix every problem. But it does give people more time to chase those things.
Jack Bowen
Yeah, it's such a familiar feeling.
I wanted to wrap up with a couple of key learnings and best practices across these case studies. These are three key points that stood out when we were preparing ahead of running this talk.
The first one I wanted to talk about was around integrating AI early in the process.
One of the recommendations we've seen come up time and time again from folks who've used many different types of AI products is really around making sure that they're integrating AI early and starting to think about how they're structuring data, how they're capturing it, how they're tagging it in order to maximize those benefits.
So yeah, I mean, what are your thoughts on that, Andrew? How does that map onto your internal advice at STRAT7 and how you guys operate?
Andrew Dare
Yeah, I think something that we've learned is what the best mechanism is, what the best structure is, for things going into the tool. There is that age-old saying, "garbage in, garbage out," and it applies just as well here.
If you don't understand the data that you're putting in, if you don't mark and tag it appropriately, you're probably not going to get the best out of the tools. So yeah, I think it's really important.
I think that's a whole aspect of research that is changing, and I think there will be more focus on how we get that data, how we structure it and what we need to do to use AI effectively.
Jack Bowen
Definitely. I think that kind of ties into this next point here as well, which is around balancing AI with human oversight.
So, a strong theme that came out through these case studies is the role of the researcher as a guide, that interpretive layer around what's going on. The requirement to really validate outputs, and to make sure that interpretation aligns with the stamp of quality you want to broadcast to your clients in any project you run, is super important.
And STRAT7 is a massive household name in the research space and obviously reputation's very important for you guys. How do you balance AI with human oversight and make sure that you are getting those efficiencies while still producing the same great product people know you guys for?
Andrew Dare
Yeah, it's a journey and a journey that we're still on.
We're in such a different place from when we first started this stuff, but when I started the AI lab, one of the things that we said was non-negotiable from the beginning was having AI in the loop: everything centered around enabling a person to do the job. And that's what we've done all along.
So, that's where we've kind of got to, but people are now starting to trust the tool more. They're starting to understand how data needs to go into the tool. So, I think things are gradually improving all the time.
I don't think we were in a bad place to start with, but I think we're getting better all the time. People are now actively wanting to use CoLoop; we started by picking a specific project size, and now we've got more and more people wanting to use it.
I think that's a really great sign. I genuinely do.
Jack Bowen
Definitely.
The folks at Quirk’s are warning us that we are four minutes from the end. So, onto the final point around scale and collaboration.
You spoke there about AI in the loop and humans being the drivers, and that collaboration is something that is really important. We mentioned earlier this idea of qualitative needing breathing space, the room to have discussions with the people who are involved.
How are you guys approaching that? How do you tend to collaborate? How does that work now with AI?
Andrew Dare
Yeah, so I mean, the tool itself allows us to collaborate, because multiple people can work through a single version of a project. But I think more than that, it's also about giving you the airspace to do that collaboration, to sit down and actually just talk about what you found.
So, did you see that? Oh, what about that? Did you think about that?
Those are the things that you don't always get a chance to do. It's a combination of the tooling actually allowing you to collaborate, but also giving you that space and that bandwidth then where you can actually sit down and talk about things a lot more. I think that's something that's sometimes underestimated.
It's allowing us to talk, to chat about what we find. There's a whole load of nuanced stuff that happens there because we're getting to the insight so much quicker. I think it enables a lot of things.
Jack Bowen
Definitely, and I know from my own experience too; we use our own tool for product research.
One thing that's super powerful when it comes to collaboration: in any meeting, if someone has an idea, a thought, they remember something, you can just pull that up quickly, retrieve the information and have that discussion in real time, rather than having to say, "maybe let me get back to you on that once I've thumbed through everything again and tried to find that reference."
Incredibly powerful and just gives people the ability to move at the speed of thought like that.
Andrew Dare
Yeah, as I said before, it gives people the space to actually think, but it also gives them the space to chat.
We've said this all along: it's not a substitute for human analysis, but it enables us to get across the data and confirm, validate and sense-check our findings. Ultimately, that's what it's about.
It's about being able to get to those insights, but also then make sure that the thinking that you're putting in it ties up with what the data is showing. I think that's kind of the key here.
Jack Bowen
Absolutely.
Andrew, thank you so much for joining. I've really enjoyed preparing this and running this talk.
I know we have about one and a half minutes left for a Q&A, so we'll hand it back over to the Quirk’s team.