Earlier this year, Canada’s Marketing Research and Intelligence Association (MRIA) released results from a study it undertook to explore how data and information – with a focus on public opinion research data – are used in Canadian governance and policy making and how such usage could be improved.

A set of 39 in-depth interviews was conducted with members of government, politicians, political strategists, media members, academics, opinion research experts and heads of think-tanks, NGOs and other national associations.

As I read the executive summary (you can find it at http://mria-arim.ca) I was struck by how closely the complaints mirrored those frequently cited by users of business-related marketing research and insights data.

Substitute “company” or “organization” for “government” in the following passages from the MRIA report and you’ll see what I mean:

  The opinion leaders routinely identified three perceived shortcomings in the data used by government.

  Longitudinal data have gaps. This inhibits the government’s ability to identify and track trends and to effectively address emerging issues and challenges (e.g., how to react to an aging population, how to react to a changing climate).

  Data are not sufficiently rich nor detailed. This limits the government’s ability to conduct the analyses needed to understand issues fully and in all their complexity (e.g., understanding changes over time, understanding differences between groups of people).

  Data are often outdated. This problem is increasingly difficult to address in a world where even the most up-to-date information can become irrelevant in a relatively short period of time.

  Shortcomings identified less frequently include:

  • data that are partial or limited;
  • gaps resulting from information that is not being collected (e.g., lack of national environmental tracking data);
  • difficulty transforming massive amounts of data into useful information;
  • difficulty identifying and measuring data because of its nature (e.g., how to measure the integration of new citizens); and
  • answering the “why” question (e.g., why outcomes are not being achieved, why people think the way they do).

Apply the same approach to this passage on the perceived shortcomings in the way information is used by government.

  Subordinating evidence to politics was the most frequently identified perceived shortcoming in the way government uses information. This was seen to take various forms, including:

  “Cherry-picking” or focusing on information that supports a certain agenda or policy and at the same time ignoring or dismissing information that does not.

  Employing a self-serving, partisan bias in the decision-making process (e.g., what will enhance electoral success rather than what constitutes sound policy).

  Basing policy on hunches, unfounded assumptions or anecdotal evidence instead of research-based evidence.

  Giving greater importance to the opinions of a certain audience even when the issue relates to a broader population.

  Three other perceived shortcomings in the way government uses data were also identified relatively frequently:

  • insufficient analysis of data;
  • focusing on shorter-term considerations instead of longer-term considerations; and
  • too many restrictions on data-linking and data-sharing, which ultimately impede the government’s ability to collect and use relevant information.

A dismissive chuckle?

So, what’s your response to the study findings? A dismissive chuckle? A pained nodding of the head? A feeling of schadenfreude? All of the above? For me, it’s comforting (for lack of a better term) to know that business insights professionals aren’t the only ones who struggle with these problems. When you’re dealing with data and a host of stakeholders with competing agendas, varying skill sets and disparate views on the value of the research process, you end up fighting the same types of battles, no matter which sector you work in.