In the several years that we have conducted the reader survey that forms the core of our annual Q Report, our editorial staff has always been struck by the remarkable consistency of the data. Other than salaries and other employment-related measures fluctuating a bit with then-current economic conditions, the assessments expressed typically fall within expected ranges. For that reason, and to keep things interesting for you and for us, rather than ask the same questions each year we’ve been on a three-year cycle of inquiring about different parts of the marketing researcher’s job, from vendor relationships to evaluating new methods.

The Q Report work life study of corporate researchers is based on data gathered from an invite-only online survey sent to pre-qualified marketing research end-client subscribers of Quirk’s and members of ESOMAR and The Insight Management Academy. The survey was fielded from May 22 to July 1, 2019. In total we received 828 usable qualified responses. A margin of error of ±3.35 percentage points at the 95 percent confidence level was achieved. (Not all respondents answered all questions.)

As we did in 2016, this year we focused on pain points, the impact of the data deluge and some assessments of MR and insights as a business discipline. (See Emily Koenig’s article on the salary and compensation side of things.)

Popular pain points circa 2019 are just what you might expect. “Too many projects for our budget” and “too many projects for our staff” again earned the highest combined percentages of “always” or “often” a pain point. (And, as with previous Q Report surveys, the audience of researchers was not shy about expressing their, ahem, feelings about our survey. To the question about pain points, one person replied: “My biggest pain point at this moment is that you did not allow Don’t Know responses in this survey. :)”)

Our open-end seeking more thoughts on pain points drew a range of interesting responses from researchers. Some themes were familiar – difficulty reining in rogue DIYers; not enough time or money; shoddy reports from vendors; and managing internal clients’ expectations. Some were new, including these two instances of researchers having trouble with country/cultural issues within their organizations:

“My company is very supportive of research and takes seriously the need to put consumers in the forefront of what we do. However, we are a global company based in France and a major challenge is getting consideration for local or U.S.-based suppliers, as the bias is definitely toward European research suppliers, methodologies and ways of thinking about research.”

“I deal with our global headquarters (in a different country) who doesn’t have a research SME and ignores our recommendations for methodologies, questionnaire design, interview guide design and selection of research agency and the resulting insight is often not very useful.”

And, unfortunately, too many mentions like this one of trouble finding quality sample:

“Respondent quality is awful from many panels. Feels like the majority of respondents are professional or not caring and thus each year more and more efforts must be made to screen out bad data.”

Sentiments were also similar to 2016’s on the question of how readers would rate their company’s research function on a number of fronts. As might be expected, with the constant drumbeat to provide measurable proof of research’s value, “ability to demonstrate ROI” earned a similar combined very poor/poor percentage this year – 28% – compared to 2016’s 33%. “Ability to mitigate risk for the company using research” was also cited as an area where research’s performance was poor.

Only 33% of respondents rate their market research function’s ability to demonstrate ROI as good or very good.

On the positive side, the ability to uncover business problems (47%) and (separately) solve them (53%) were seen as strengths. Interestingly, the ability to demonstrate statistical validity went from a 32% “very good” in 2016 to a 20% “very good” in 2019. Granted, the combined good/very good percentages were 73% in 2016 and 65% in 2019 but one wonders if some of the pressure to deliver insights ASAP is making researchers fret over their ability to maintain rigor.

How would you rate your company’s marketing research function on the following fronts?

Key metrics

Research ROI gets a lot of play as the one metric to rule them all, but it’s also seemingly impossible to uniformly define, so we asked an open-end question to find out which key metrics are used to judge respondents’ research and insights functions.

They ranged from the specific…

“# of action plans implemented as a result of survey findings.”

“Do our business partners make changes based on insights? Are we coming to our business partners with actionable and cost-effective research solutions? Speed of research.”

“Sales return within two years of market research.”

“Output-based measures – number of projects, number of participants, number of people on the panel, diversity of panel.”

“Return on ad spend, success of project KPI (could be click-through rate, completion rate, etc.) and speed to delivery.”

“Number of completed projects (lame, I know).”

…to the vague (but interesting-sounding!)…

“It seems like it’s the qualitative opinion of the clients that we work with who judge the inside function.”

“We actually really struggle to answer this question since the number of projects and objectives/goals differ all the time. It’s really hard to measure ‘how well’ we are doing. We are not sure of the best way to approach but we measure which departments within the organization we are working with and make sure that we are working across all business lines and truly supporting the organization as a whole.”

“This is a tough question. The function is judged based on the performance of individuals to support business objectives of their specific categories. We might also be judged on our ability to work cross-functionally, to collaborate, to be a proactive and engaged strategic thinker. But I’m not sure of the ‘metric’ for that. It tends to show up in individual performance reviews as best I can tell.”

While the quest to define and demonstrate ROI is certainly admirable, the responses to our question point to the futility of agonizing over a one-size-fits-all approach. With the endless variables in play – company size, company type, department size, department budget, level of management belief in marketing research – every situation is different. Some readers indicated they aren’t subject to any kind of specific success measurement but many sketched out what sound like workable scenarios – a bit of NPS here, the number of projects completed there – that have been cobbled together to try to place a value on what they do.

Then there’s the situation of this lucky (or maybe not!) researcher:

“Metrics? None! Executives and leaders throughout the org like us & use us and we’ve not been asked to quantify our value with specific metrics (yet!!! dreading the day it comes).”

Become less and less central

As organizations are awash in data these days, observers feel that research’s role as the conduit through which the customer’s voice is heard is under threat. When surveying or talking to customers isn’t the only way to get customer data, the thinking goes, research will gradually become less and less central. We posed a question along those lines to see if respondents felt the drive to integrate the voice of the customer would elevate or put MR at risk, and a healthy 42% said it “definitely” elevates research and another 39% said it “probably” elevates it.

We asked for comments and got some very thoughtful responses.

“Every touchpoint is an opportunity to gather understanding. We should embrace these potential data streams rather than ignore or fight against them.”

“I think it should elevate the role of market research, as long as the researcher grabs the challenge and stays front and center as a leader in the area. If brand managers, on the other hand, see it as one of their primary responsibilities, they might benefit from the visibility to the detriment of the researcher since, as everyone knows, marketers are far better at grabbing the limelight! :-)”

“I think shifts like this are an opportunity for strong research/insights leaders and teams. Nimble teams who have built trust and demonstrated their impact and value will be able to elevate their role; ones who haven’t will struggle to stay relevant.”

“Sadly, this often becomes shorthand for having overworked product or sales managers collect input in ways that are not reliable for decision-making.”

“Puts us even more at risk in a company that thinks everyone can do VOC. Instead of fighting it, I’ve decided to help this happen more formally and systematically by implementing a VOC program for everyone in the business to use. If you can’t beat them, join them!”

Multiple responses cited the potential dangers of a fragmented approach to customer listening, as it could lead to inefficiencies of effort and budget (many internal groups trying to answer the same questions) and, perhaps more damaging for researchers, a sense that anyone in the organization can gather and analyze data.

But for some, it’s clearly been a good thing:

“[We are getting] more and more calls for research initiatives to provide the voice of the customer. We now have a pipeline of 10 initiatives whereas two years ago we typically had 1-2.”

“‘Anyone can send out a survey’ is the phrase we hear. VOC masks the real work of insights, which is helping to identify the right business issues to research and drawing out the correct implications and application of what’s learned. We consistently see folks conduct VOC on the wrong issues, be clouded with confirmation bias, not see the orthodox-challenging findings, not advocate for the customer in the business decisions. We (insights team) are continually called into help because we are ‘independent’ or the voice of truth.”

Along those lines, we asked if the wide availability of customer data from sources other than marketing research is potentially damaging to the research and insights function or if it had the potential to elevate its role. Twenty-two percent of respondents said they felt it puts MR at risk while 24% said they felt it “definitely” elevated MR’s role and 34% said it “probably” did.

Do you feel the wide availability of customer data from sources other than marketing research (CRM data, sales data, web analytics, etc.) elevates the role of marketing research and insights or puts it at risk?

In their comments after answering the question, many rightly acknowledged that other data sources are here to stay but that researchers can still flourish by taking on the role of sense-maker and storyteller, assembling all the data points into a cohesive view:

“It’s sometimes hard to reconcile hard, transactional data and self-reported behavioral data. It’s a blending of the two that creates the insight and understanding.”

“This all depends upon what you do with it and can you as a professional step up and use this data as another truth-telling platform.”

“As long as insights is method-agnostic and gives holistic answers, it’s a win.”

“Availability [of data] was never the issue; knowing what to do with it has been the issue. I don’t pay a carpenter $100 for the hammer and nails; I pay them because they know where to put the nails to build/fix stuff.”

“[Non-MR-generated data] tends to be backward-looking and faster to deliver, which makes it a preferred tool for senior management. However, the results often only provide limited direction that management then acts on with instinct/experience. That instinct is typically their personal perspective – not always the same as what the customer or consumer or shopper wants.”

“The various data sources are all part of the jigsaw of useful data; they complement each other rather than compete.”

“If we can get access to this information (this is an issue in my company), the data opens up a wide range of things we can do from an analysis standpoint. It can make the insights team much more well-rounded and not be viewed as ‘those people who do surveys.’”

Still, despite the rosy potential of serving as data integrators, things don’t always go so smoothly.

“Ideally, marketing research PULLS TOGETHER various sources (internal and external, primary and secondary, customer records, etc.) to create a comprehensive, cohesive, 360 view of reality. That NEVER happens, though.”

“Data analytics people (separate from insight in this organization) think they can understand/predict consumer behavior just because they have big data.”

“I lost my former job as a market research manager because of a company focus on analytics rather than primary research.”

“Top management sees big data, customer analytics, etc., as ways to get customer voice for free. They don’t see the decreased value.”

“It elevates the role but presents challenges in terms of skill sets. Finding people with the combination of technical skills to leverage this data – but also a strategic lens to use the insights – can be challenging.”

“These data streams should, theoretically, integrate well and tell a more holistic narrative. In practice, data tends to get siloed and integration rarely happens.”

“Data from other sources causes me confusion because I am rarely ever told other initiatives are happening. The UX people started an ongoing NPS project without my knowledge. Respondents to my surveys were complaining that I send too many surveys and that’s how I found out about the NPS project inside my own company. This kind of thing really grinds my gears.”

“The challenge is getting access to this data and then support to use and integrate it properly. I recently asked for a Salesforce account so I could do DIY market research and my request for an account was rejected because the gatekeepers – sales leadership – did not see why they should spend to give a market researcher access to ‘their’ system.”

“I think it can elevate the role of MR, but one has to be careful in using data from other parts of the firm. Often the MR function is not an expert in what is going on in that area, so partnering and sharing credit and insight is critical. I think the biggest danger facing MR is that other areas are becoming more astute about this data use and ownership and are getting on the ‘big data bandwagon’ – if one engages in a turf war with these areas, MR is often the loser since we are not seen as ‘mission critical.’”

Compared to two years ago, is your company doing more or less of the following?

Isn’t widely understood

We asked twin questions about how valued respondents feel the insights function is as a profession, both in the general business world and at their respective organizations. On a 10-point scale of “not valued at all” to “extremely valued,” researchers generally feel that insights is valued more highly in their companies than in the general business world.

In their comments after the question, some observed that research just isn’t well or widely understood in business spheres and that even for those who do know of it, the impression isn’t always that MR is an essential practice.

“Generally, people see it as an expense, luxury or a tool to uncover what went wrong rather than a leading indicator or risk reducer. But at larger companies or companies with leadership experience in larger companies, this is not the case.”

“While I think my company values the marketing research and insight function, I’m concerned that the general business world values it less so. It started with big data and the idea that you don’t really need insights people as long as you had someone who could handle data. I see that more and more, and have heard of companies, including two that I have worked for, eliminating their entire research departments in favor of data analysts.”

“I think that companies are getting distracted by AI and other tools a bit and some traditional MR functions are getting lost in the shuffle. MR needs to show that it can incorporate all of the new data sources and AI to be even more relevant.”

“Honestly, it is valued to a point, but decision-makers rarely let bad news stop them.”

“I don’t think the general impression of MR is as doom and gloom as some would promote. But, we all need to push ourselves to be better at understanding the business aspect, and connecting with our parts of the business, if we want to have a better reputation.”

This response, and many others like it, struck a more confident tone:

“Market research influences so many aspects of the business: it impacts product development and strategy; it provides a thorough and nuanced lens into the customer and their motivations, habits and preferences; and when done correctly, can predict and impact marketing functions to make the right decisions and gain competitive advantage.”

Compare usage

To get an idea of usage levels of various techniques – established and newer – we asked respondents to compare usage from two years ago to today on methods ranging from traditional focus groups to virtual reality. The mix of approaches with the highest percentages of “have not used in the past two years” includes traditional approaches like telephone focus groups and paper-based surveys and buzzed-about tools such as facial coding, virtual reality and biometrics.

It’s telling that big data analytics has the second-lowest percentage of “have not used in the past two years” at 17% while netting a combined 54% who say they are using it some more or a lot more. The reported usage of secondary data confirms anecdotal comments heard in recent months at industry conferences that more researchers are (re)turning to secondary sources: 31% say they are using them some more and 12% say they are using them a lot more. Online surveys (23%) and social media research (12%) also drew sizable “a lot more” percentages. In general, the tried-and-true methods of traditional and online focus groups, online surveys and panels seem to be holding their own, with some of the highest “about the same” usage percentages.

Which of the following do you do to stay up-to-date on research methodologies and techniques?

A big part of employing the new methods and techniques is finding out about them in the first place and hearing others who have used them talk about their experiences. We asked readers to tell us how they stay up to date on MR methods (select all that apply) and “attend in-person conferences and events” netted 76%, followed closely by e-newsletters (75%), “read blogs and websites” (72%) and “read print or digital magazines” (72%). Social media-based outlets were less popular, with “participate in online discussions (such as LinkedIn groups)” at 48% and “read or follow research feeds on Twitter” at 12%.

What does the future hold?

So, against a backdrop of data integration and competition from within their own organizations over who holds the right to speak for the customer, what does the future hold? We asked researchers to tell us the biggest changes their organizations would make over the next year.

There were multiple mentions of enhancing efficiency to free them up to do more analysis, either by teaching DIY techniques to internal audiences (“Greater emphasis on helping select internal stakeholders with DIY research on lower-risk projects, both to help them know their customers better and to free up our professional researchers’ time for higher-payoff efforts.”) or by offering new capabilities (“Challenging the standard ways we do current research – innovation process on research studies.”).

Also, many researchers will be upping their digital game. Several offered a variation of the phrase “digital transformation” and talked about automation and dipping their toes in the AI/VR waters.

And there was lots of talk of changing roles/responsibilities and reorganizing, either due to company-wide shifts or an insights department-driven plan: “Moving to a ‘shared service’ model with differentiated roles (e.g., analyst, programmer, project manager, vendor & contracts, consultants, etc.).”

There were multiple mentions of plans to move beyond capturing reported or stated behavior and many cited plans to look for newer, more agile suppliers with new tools and new ideas: “We plan to use new suppliers, usually tech startups who can do things faster & cheaper but don’t have a ton of insights expertise AND rely less on claimed-behavior surveys – more in-the-moment data capture.”

Qual will get some love as well, with several respondents mentioning their plans for using more in-person methods: “Investing more in qualitative, specifically in both deeper-dive observational (and possibly ethnography) and smaller mobile qual projects.”

Perhaps the most promising plan of all was this one:

“More emphasis on research as a profit center.”

That has a nice ring to it!

Methodology

The Q Report work life and salary and compensation study of corporate researchers is based on data gathered from an invite-only online survey sent to pre-qualified marketing research end-client subscribers of Quirk’s and members of ESOMAR and The Insight Management Academy. The survey was fielded from May 22 to July 1, 2019. In total we received 828 usable qualified responses. A margin of error of ±3.35 percentage points at the 95% confidence level was achieved. (Not all respondents answered all questions.)
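
For the curious, here is a minimal sketch of where a figure like that comes from, assuming a simple random sample, maximum variability (p = 0.5) and a 95% confidence z-score of 1.96; the published 3.35 may additionally reflect the observed response proportions or a finite-population adjustment rather than this textbook version.

    import math

    n = 828   # usable qualified responses
    p = 0.5   # assumed proportion, giving the widest (most conservative) interval
    z = 1.96  # z-score for 95% confidence

    # Standard simple-random-sample margin of error, in percentage points
    margin_of_error = z * math.sqrt(p * (1 - p) / n) * 100
    print(f"Margin of error: +/-{margin_of_error:.2f} points")  # roughly +/-3.41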