Editor’s note: Bradley Honan is CEO and president of Honan Strategy Group, a public opinion and data analytics firm in New York City.

It’s worth pausing to take stock of where the market research and polling industry stands today and to consider the challenges, as well as the opportunities, facing our profession.

In many ways the term “big data” is redundant for those of us working in the market research profession. As an example, a quantitative research study of 45-60 questions with several banners of crosstabs is enough to keep the most proficient data junkie among us up to their eyeballs in numbers for several weeks.

But the big data era, which should present an enormous opportunity for the market research industry, is actually creating strong headwinds. Ironically, as data – and the data profession – become sexier and sexier, the challenges facing our industry grow more profound. Today, significant disruption is staring us all straight in the face. But as people invested in our industry, we all have the opportunity to help enact overdue change.

The current headwinds exist in five main areas.

Cost

Our industry has not done nearly enough to contain costs, failing to integrate technology and automation quickly enough. This is most notable in data collection, presentation-making and analytics. At the same time, clients are increasingly being pitched – and are buying – automated social media sentiment analysis tools that cost a fraction of what a customized research study would cost.

While many could argue that a custom study delivers far greater strategic insight than automated sentiment analysis, clients are voting with their wallets and purchasing the far cheaper sentiment tools.

I don’t see this choice as an either/or. Rather, in an era of cost constraints, our studies need to become far more cost-effective – and technology needs to play an even larger role in our arsenal. If we can order products through Amazon’s Alexa, why can’t we collect research data the same way and, in doing so, bend the cost curve downward?

Timing

The world is moving faster and faster and corporate decision-making is keeping pace. We as an industry, however, are not. A stand-alone custom research study usually takes no less than four to six weeks to complete. Again, the pressure we face comes from high-level corporate decision-makers who increasingly need answers in near real time. In contrast, social media tools can produce data and insights literally minutes after scanning social media content and conversation.

At this very moment, there are scores of junior-level analysts building out charts and graphs for presentations and in many, if not most, cases they are manually inputting data into Excel or PowerPoint. This mundane process drives good people away from careers in our field and introduces the possibility of data entry errors. And it’s terribly slow and inefficient.
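To make the point concrete, here is a minimal sketch of the kind of automation that could replace hand-keyed tabulation. The field names (“region,” “satisfaction”) and the responses themselves are purely illustrative – in practice the rows would come straight from a data-collection platform rather than being typed in.

```python
# Minimal sketch: automating a banner crosstab that analysts often
# rebuild by hand. Field names and responses are illustrative only.
from collections import Counter

responses = [
    {"region": "Northeast", "satisfaction": "High"},
    {"region": "Northeast", "satisfaction": "Low"},
    {"region": "South", "satisfaction": "High"},
    {"region": "South", "satisfaction": "High"},
]

def crosstab(rows, banner, stub):
    """Count responses for every (banner, stub) combination."""
    return Counter((r[banner], r[stub]) for r in rows)

table = crosstab(responses, "region", "satisfaction")
for (banner_val, stub_val), count in sorted(table.items()):
    print(f"{banner_val} x {stub_val}: {count}")
```

From there, the same tallies could feed a charting or presentation library directly, eliminating the retyping step – and the errors that come with it – entirely.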

Whatever your solution is, the directive to us all is unmistakable: our industry needs to move our work along much more quickly – without sacrificing methodological rigor, data integrity or accuracy.

Representativeness and accuracy

The world around us is becoming ever more heterogeneous, while response rates continue to fall and can hardly go any lower. At the very moment businesses need our insights more than ever, we are unable to provide larger sample sizes cost-effectively. Larger data sets, collected at lower cost and much more quickly, need to be our marching orders.

A need for actionability 

Far too many in our industry regularly produce reams of polling and qualitative data that aren’t particularly actionable for business decision-making – the so-called “data dump” report. This in itself is not new but it is exacerbated by the previously mentioned pressures of cost, timing and accuracy. We all need to recommit ourselves to tying the important work we do to the specific outcomes our clients are trying to achieve and the specific answers they request to their burning questions.

Our industry must become masters at separating nice-to-know data and information from need-to-know data and insights. We cannot focus only on how to scale a question, how to word a question or whether to rotate or randomize a question series in a survey – as important as all these points are. Our mandate is, and should be, far bigger and broader.

Method-agnostic data analysts

I have interviewed and hired scores of junior- and mid-level research executives who see themselves in a surprisingly limited way: as survey research professionals who can only do surveys or focus groups.

Instead, I choose to see them in a different and more accurate light – and have always encouraged them to do the same. We are professionals with a unique set of skills for gathering and analyzing complex data sets.

But in order to thrive, we need to think of ourselves as more than survey professionals, commit to ongoing professional development and become far more method-agnostic – focused not solely on one data-gathering tool, like the survey, but on extracting key business insights from disparate data sets.

The world around us is flooded with too much data – and much of it is frankly meaningless. I know of no group of professionals better able to sort through it and determine what really matters.

Tackling obstacles 

The future can be brighter for the market research industry than any of us would have imagined just five years ago, but we need to tackle the important operational and mind-set obstacles that prevent greater industry growth and hold us back from achieving even more significant impact.

I see a future that is bright, but one that is firmly of our making. Researchers, let’s go!