Sponsored content

Editor's note: Finn Raben is director general, ESOMAR.

An old business adage states, “Faster, better, cheaper – you can have any two of the three!”

Recently, speed has become an increasingly important determinant of business success, but we often forget about its impact on the other two elements.

This is especially true within the data, research and insights profession, where quality and rigor are core to service and product delivery, yet popular belief now holds them to be antithetical to the modern-day demand for speed. That belief rests on the misconception that rigor and quality can only add time and cost to the process, not value.

Let’s look at a popular, simple arithmetic question: A bat and ball cost $1.10. The bat costs a dollar more than the ball. How much does the ball cost?

Many people respond quickly and confidently, insisting the ball costs 10 cents. This answer is both obvious and wrong. (The correct answer is 5 cents for the ball and $1.05 for the bat.)
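Working it through shows why: if the ball costs x, the bat costs x + $1.00, so x + (x + $1.00) = $1.10, which gives x = $0.05. The intuitive answer of 10 cents would make the pair cost $1.20.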

While philosophers, economists and social scientists have assumed for centuries that human beings are rational agents, people such as Daniel Kahneman, Amos Tversky and Shane Frederick (who developed the bat-and-ball quiz) demonstrated that we’re not nearly as rational as we like to believe.

When people face an uncertain situation, they don’t carefully evaluate the information or look up relevant statistics (quality and rigor). Instead, their decisions depend on a long list of mental shortcuts, which lead them to make snap decisions. These shortcuts aren’t a faster way of doing the math but rather a way of avoiding the math altogether! Asked about the bat and the ball, we completely ignore our arithmetic lessons and instead default to the answer that requires the least mental effort.

Quality and rigor

The bat-and-ball quiz nicely encapsulates the clear need for quality and rigor in any assessment process. Two definitions that I like are:

  • Quality is always the result of high intention, sincere effort, intelligent direction and skillful execution.
  • Rigor is the quality of being thorough and careful.

These two fundamentals have been at the heart of research and insight industry codes, ISO standards and professional guidelines for over 70 years, providing the yardstick against which members of the research and insights profession voluntarily agree to be measured (and, in the worst case, disciplined).

As our environment – data, business, legal and ecological – evolves, so too must our paradigms. “Fit-for-purpose” must now be added as a core requirement, perhaps even the overarching requirement. In our profession this has resulted in dramatic changes to fundamental concepts such as consent, to codes such as the ICC/ESOMAR International Code on Data Research and Insights and to the professional guidelines co-authored by associations all over the world.

In my opinion, one of the reasons that the popular definitions of quality and rigor have become somewhat devalued lies in the (seductive) belief that technological advances can absorb or replace the demands of quality and rigor while simultaneously improving the speed of response and results. However, I would argue that this is fool’s gold, for four main reasons:

1. Technology, automation and machine learning will only work within pre-defined parameters. 

How do you determine the parameters for data quality when you don’t know which data set is going to be most relevant? How do you determine representativeness for the business challenge when samples are no longer representative? How do you determine the relevance of question wording as language and communication channels change over time? Is a question used five years ago still relevant today?

Knowledge and understanding of these variables is the skill of the researcher. Reaching a definition of these parameters is the informed debate that must be held between the commissioning client and the provider: defining the business need, defining the most appropriate solution (and costs), conducting the work to agreed standards and presenting it in a timely fashion. No amount of technology can provide this knowledge-based debate in an off-the-shelf solution. Google isn’t even that good! Remember that while Google took great credit for predicting Obama’s re-election in 2012, it did not accurately predict subsequent congressional elections, nor the recent U.S. election in which Trump won.

Where does this leave DIY survey tools? Tools are ineffective unless you know how to use them! If you don’t know how to ride a bicycle, it may still get you from A to B, but you will fall off repeatedly and be quite sore by the time you reach your destination.

An understanding of sampling, questionnaire wording, target audience and the ultimate users of the data will allow you to use a DIY tool effectively. Without it, you may well be bloodied and bowed before achieving the insights you require!

A cautionary tale: One company recently believed it could remove its entire research and insights function and replace it by giving the marketing team a DIY research tool. Ten months later, the company was rehiring an insights team.

2. Technology cannot force change – only humans can. 

Here, our profession must do better in recognizing and adopting the overarching principle of fit-for-purpose. For example, technology has now produced the most amazing handheld computers: mobile phones. Given that mobile phones are now practically ubiquitous (and in many regions, the sole form of communication), why isn’t the mobile phone the default contact mechanism for citizens, consumers and participants? Why do we continually insist upon using increasingly outdated forms of contact?

Quality and rigor (not to mention engagement) will be substantially improved by using current and popular forms of contact and communication, rather than forcing people to utilize more outdated – and possibly less accessible – methods.

3. Projects are no longer singular. 

Due to the multiplicity and complexity of modern-day communications, single-source or single-method surveys find it increasingly challenging to achieve comprehensive coverage of the target audience, or indeed of the topic under review. Most projects these days combine several sources of data and use multiple communication channels. Quality and rigor demands vary across these sources and channels and need to be agreed upon in advance to meet the business requirements.

Perhaps the most interesting case here is the U.K. election of 2015. While most measures indicated a very close Conservative vs. Labour contest, the result was an overwhelming victory for the Conservatives. Many observers of the research techniques deployed believed that much more brand measurement should have been included, as any work that compared brand Conservative with brand Labour showed a significant preference for the Conservatives – as the election proved.

4. We still need researcher interpretation. 

Finally, the translation of research findings or insights into actionable business opportunities requires a comprehensive understanding of the mechanics of the business and the financial drivers. A business that is more dependent on just-in-time stock levels will have a very different marketing and consumer strategy than a business that heavily loads its delivery pipeline.

Such understanding can be better applied by humans than by machines, as humans can again make discretionary or counterintuitive decisions. This point may provoke some level of disagreement from dashboard suppliers, for example, but the fact of the matter is that those dashboards are usually templated and used by insights or marketing professionals to assist in their decision-making, not to replace it.

Not replaceable by technology

So where does this leave us in the context of research? Marketing research comprises a set of skills and a body of knowledge that is not completely replaceable by technology. This is communicated, proven and upheld by adherence to quality standards, rigor demands and legal obligations. These criteria are constantly evolving to fit our changing world.

Said another way: Surgeons and veterinarians follow the same basic medical education, but if you needed an open-heart operation, would you want a surgeon or a vet to do it? Just as a vet is not the same as a surgeon, a marketer is not the same as a researcher.

Enviable track record

The marketing research and insights profession has one of the most enviable self-regulation track records, supported by constantly evolving standards that are peer-reviewed by both users and providers, and it has never been responsible for an industry collapse, as the financial sector has despite its apparently strict regulation.

Good, fit-for-purpose research makes a difference for people, businesses and governments – none of which are planning to replace themselves with technology, so why should research?

It is very attractive to believe that a technology solution can replace a profession built on complex knowledge, value and quality. But if that seems too good to be true, it probably is.