Surveys, polling and other forms of marketing research have been practiced in a formal way for decades, and yet I see data misused on an almost daily basis. Here are three errors I’ve seen in the news recently.

Not knowing the source of data (or knowing it but not seeing the conflicts of interest)

My general rule is that whenever a sponsoring organization has a vested interest in the outcome, the data has a bias. The bias might be in the sample, the wording of the questions or simply the way the data is presented. But if an organization is publishing the data for all to see, it will generally make sure the findings cast it in a favorable light.

In my home state of Minnesota, it recently came out that projections showing revenues from electronic pull tab sales would explode and immediately start funding a new Minnesota Vikings stadium were based largely on estimates made by gaming businesses that would benefit if the state adopted the new form of gambling. Those in charge of supplying the data claim that they never hid the source. That may be, but they certainly don’t seem to have been forthcoming either.

Lesson: Know thy source. If the data sponsors stand to gain from the results of a study, the results may be biased, no matter how loudly they protest to the contrary.

Blindly believing the data that works best for you

Data errors aren’t always rooted in an intention to manipulate. Sometimes companies, organizations or people simply want to will away data. This seems to have been the case with the Mitt Romney presidential campaign. Every time a poll showed Obama ahead, Romney’s team would attack it. At first I thought this might just be political posturing, but based on comments they’ve made in the months following the election, it seems the campaign team simply refused to believe independent data that showed Obama winning, even though the numbers were coming from numerous sources.

Lesson: Ignoring data does not make it less true.

Surveying your customers only

In a third example of survey bias, a local company recently profiled in the media noted that, based on research among its customers, it had a 98 percent satisfaction rating. The news story, however, went on to state that the company was losing market share. The problem seems to be that, if it wants to continue growing, the firm is surveying the wrong people. After all, it isn’t just your customers you need to understand but your entire market. For example, I have no doubt that those who rent videos from their local video store are very happy with the service. But what about everyone else? If a video store only surveyed its own customers, it would miss a large segment of its potential audience (and in this case be unaware of a major threat at its doorstep).

Lesson: It isn’t enough to know your own customers. You need to know your competitors’ customers as well.

When reviewing studies or news articles involving survey findings, what else do you look for to determine if the research is believable? Perhaps we can compile a list like “The 10 Commandments of Reading About Research in the News.”