How many are too many?

Editor’s note: Don Bruzzone is president of Bruzzone Research Co., Alameda, Calif. Jack Bookbinder is senior market research consultant at Oakland, Calif.-based Kaiser Permanente.

Members of online panels are being inundated with invitations to take surveys. For some, they arrive almost daily, reaching a peak of eight in a single day from the same panel. This has continued for almost a year for three researchers from our two firms, who joined one panel and stuck with it to see how bad it could get and how long it would last. One saved all the invitations and collected 600 in 10 months, an average of two per day (not including reminder invitations).

On the day before this article was submitted to Quirk’s for review, one of the authors received five more invitations from this panel provider, including three in 16 minutes! On that same day, another of our researcher/panelists received four invitations from that same panel - in three minutes!

Were these just isolated cases? Or could excessive invitations be causing lots of people to drop out and resolve to never take another survey?

Beyond the problem of respondents dropping out, there is the issue of invitation bias. Suppose the panel provider is short of young males in two separate studies being conducted in the same market (health insurance and coffee) at the same time a survey on professional sports is being launched. Putting invitation limits aside, all young males in that market are sent invitations to all three surveys. Upon arriving home after work, Bill checks his e-mail and reads all three survey invitations. If he has time/patience to respond to only one survey, which one is it likely to be?

Potential respondents for the health insurance and coffee clients are being sacrificed for the benefit of the sports survey. Now, suppose Bill did answer all three surveys (consecutively). Would you want your survey to be the one he took last?

Worth looking into

It was obviously an issue worth looking into. In this case there was a firm resolve among both client and research company not to do business with panels that subjected their panelists to any kind of abuse, so our goal was to find panels that didn’t. The rest of this article describes what we found - and provides some guidelines for dealing with online research panels.

Both ESOMAR and RFL Communications have compiled lists of questions to ask panel providers about their business practices, and both include questions about limiting invitations. A number of panel providers have published answers to those questions, so we started by examining all that we could find. Only half quoted any specific limit; the rest often gave fairly wordy answers that said, in effect, “It depends.” By the time we finished this project we had learned a bit about how to read between the lines:

“We have built sophisticated sample tools that apply complex business rules to prevent panelists from receiving too many invitations.” (That means they have no firm limits.)

“We control invitations by limiting the number of surveys a panelist can take.” (That means there is no limit on the number of invitations that can be sent to panelists who have not taken a survey recently.)
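
To make that distinction concrete, here is a minimal sketch - our own illustration, not any provider’s actual business rules - of the two kinds of limits. The field names and the caps (15 invitations and four completed surveys per month) are hypothetical:

```python
# Hypothetical sketch: two ways a panel might "limit invitations."
from dataclasses import dataclass

@dataclass
class Panelist:
    invitations_this_month: int = 0
    surveys_taken_this_month: int = 0

def may_invite_direct(p: Panelist, invitation_cap: int = 15) -> bool:
    """Direct limit: stop mailing once the panelist has received the cap."""
    return p.invitations_this_month < invitation_cap

def may_invite_indirect(p: Panelist, survey_cap: int = 4) -> bool:
    """Indirect limit: counts only completed surveys, so a panelist who
    never responds is never blocked from receiving more invitations."""
    return p.surveys_taken_this_month < survey_cap

# A non-responder who has already been sent 40 invitations this month:
bill = Panelist(invitations_this_month=40, surveys_taken_this_month=0)
print(may_invite_direct(bill))    # False - the direct cap stops further mail
print(may_invite_indirect(bill))  # True - the survey-based rule never kicks in
```

Under the second rule, the panelists most likely to be overloaded are exactly the ones who have stopped responding.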

The Advertising Research Foundation (ARF) conducted a promising study. It checked the quality of online research by comparing results from the same set of surveys conducted online, among panelists from 17 of the largest research panels, with results when the surveys were conducted by phone and by mail. Two of us from Bruzzone Research Co. (BRC) helped design and launch the study last year. When the incidents of excessive invitations started, one of our first reactions was to see if the extent of the problem could be covered in the ARF study. However, at that point the study was too far along to add anything to the questionnaire about how often respondents receive invitations to take surveys.

Earlier, a similar study in the Netherlands1 had caused considerable consternation within the research community when specific differences in the performance of individual panels became known. So from the outset, the ARF study promised panels that if they participated, no specific information about their panel or how it performed would ever be released. As a result, even if the ARF study had asked how often excessive invitations were received, it would not have helped us identify the specific panels we wanted to avoid. At that point we set out to canvass the panels ourselves.

Since BRC had been involved in asking panels to participate in the industry-wide ARF study, we contacted them on our own with the following set of questions. We contacted 19 panels, 17 of which ended up participating in the ARF study.

•  Do you feel it is necessary to control the number of invitations?

•  Do you keep track of the number of invitations a panelist receives?

•  What is the limit you try to maintain?

•  How often do you exceed that limit?

•  How do you limit them?

•  Can we tell others how you answered these questions?

Only 12 of the 19 answered our first inquiry. Others openly objected, calling our questions “intrusive” and “incredibly invasive” and adding comments such as “it offended me,” “not something we discuss outside the company” and “answers taken out of context can only hurt us.”

Some asked if we could call and discuss our questions by phone. In spite of this evidence that some panels were reluctant to reply, we kept repeating our requests, explaining why we felt they were important and saying we eventually planned to write an article on the subject. Finally, after as many as three attempts, 17 of the 19 had replied in writing.

Another incident

Another incident showing panel provider reluctance occurred when BRC was conducting its annual study of Super Bowl commercials earlier this year. The study included a variety of questions to identify respondents who were not answering honestly, a set of questions that was reported on in an earlier Quirk’s article2. They included questions about the number of panels the respondent belonged to and the number of invitations the respondent had been receiving.

We had submitted the questionnaire to the panel provider for checking well before the launch date. They apparently had not read it, because we received an urgent call from the panel rep in charge of our survey just hours before it was supposed to launch. He said they were sending BRC a non-disclosure agreement that had to be signed before they would launch the survey. It said BRC could not release any information about their panel members without their approval. After complaining vigorously that the survey was being held hostage to suppress information all researchers should be collecting about panelists, BRC signed and the survey was launched. Three days later, the head of the company and the rep were in our office to talk about the situation. It added to the body of information we were collecting on why panels were so sensitive.

What motivates them

It is always helpful to put yourself in the other person’s shoes and try to understand what motivates them. We listened to many representatives from the panels, but we came away with the feeling that there is just one basic reason underlying all their concerns: panel quality is often inversely related to panel profitability.

Heavy responders pay back what it cost to recruit them very quickly and then continue to produce virtually nothing but profit. They are the “cash cows” panel managers are reluctant to cut out. While we didn’t find anyone from the panels who would argue with the premise that the whole online research industry depends on its ability to deliver good, reliable and replicable samples, we did find a very human tendency to do everything possible to avoid those things that would reduce the profitability of their business.

We also heard a number of other reasons. (We leave it to fellow researchers to ponder their relative importance.) For example, panel recruitment and retention procedures are often viewed as valuable trade secrets. People had worked hard to develop systems and procedures they felt were more efficient than their competitors’, and they didn’t want to lose those advantages.

Reluctance to limit invitations can also be quite rational. One panel noted that if a coffee study used up all the heavy coffee drinkers, who were then “rested,” it could bias a study on a tooth-whitening product that followed. With no limitations on invitations, the tooth-whitening study would have had a sample that included its full quota of heavy coffee drinkers and would have been more accurate.

Another cited studies showing that invitation frequency was not related to “bad behavior” or panel dropout rates; questionnaire length and complexity were.

Universal agreement

Seventeen of the 19 panel providers replied to our inquiries. Two never replied even after three attempts. There was universal agreement on several points.

•  All 17 said that it is important to limit invitations, that they have procedures in place to do this and that they know how many invitations panelists get. Thirteen of the 17 went on to describe how they limited invitations.

•   Some (three of 13) said they only limit the number of surveys a panelist can take and use that to limit invitations. Most (eight of 13) have a direct limit on the number of invitations they will send their panelists.

•   One said it had different limits for different types. Another asked panelists how many would be too many.

•   Twelve of the 17 went further and stated what their limits were: the average was 15 invitations per month, but limits ranged from one a day to three a month. All 12 said they stayed within their own limits at least 90 percent of the time.

•   The 12 who stated what their limits were and answered virtually all of our other questions were: Authentic Response, NPD, GMI, Research Now, Greenfield Online, Synovate, Harris Interactive, TNS, Luth, Toluna, MarketTools and PineCone/Nielsen. (The study was conducted just before Greenfield Online became part of Toluna.) Five more replied but didn’t state limits. Again, two never replied even after three attempts.

•  We found no evidence that any of the answers were falsified. The two panels where we felt we had evidence of excessive invitations were both in the group that declined to state they had any limit on the number of invitations they would send to an individual panelist.

•  None of the five who declined to state any specific limit said we could cite their results. Five of those who did state specific limits said we could tell others how they answered our questions, and they are shown below. They emphasized these were limits, not averages. They would allow up to the number of invitations shown.

Panel                    Limit on Invitations Sent
Harris Interactive       Three per month
Toluna                   One per week
Synovate                 Six per week
Greenfield Online        One per day
Research Now             One per day
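
To put the published limits on a common footing, they can be converted to an approximate number of invitations per month. This is our own back-of-the-envelope arithmetic (assuming roughly 30 days and 4.3 weeks in a month), not figures supplied by the panels:

```python
# Rough conversion of each stated limit to invitations per month.
limits = {
    "Harris Interactive": (3, "month"),
    "Toluna": (1, "week"),
    "Synovate": (6, "week"),
    "Greenfield Online": (1, "day"),
    "Research Now": (1, "day"),
}
periods_per_month = {"day": 30, "week": 4.3, "month": 1}  # approximations

for panel, (n, unit) in limits.items():
    print(f"{panel}: up to ~{n * periods_per_month[unit]:.0f} per month")
```

Even among panels that publish limits, the ceilings differ by roughly a factor of 10, from three invitations a month to about 30.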

An enigma

All this led to something of an enigma. We asked the panels if they felt the following comment was justified: “Panels that only limit the number of surveys a panelist can take have a system that could still accidentally overload a panelist with invitations, if they happened to be in a category the panel was short on. For panelists who haven’t taken a survey there would still be no limit on the invitations they receive.” No one objected - including those who said this was the way they limited invitations.

But another statement that we asked panels to react to did generate some strong objections: “This has led some to speculate they [the panel providers] don’t put any limits on invitations because it acts as an easy substitute for the more laborious job of screening out inactive panelists. If panelists are continuously getting a lot of invitations, that means they are not taking any surveys so the panel is not losing anything if they get irritated and drop out.”

Several said that if the above was actually going on, it was atrocious - the type of thing that could ruin the industry. With our documentation of excessive invitations being sent to three panelists continuously for almost a year, it is hard to conceive of any alternative explanation. But their reactions did show sensitivity to the issue of excessive invitations, and some firms expressed hope that things will change if enough in the industry insist that all panels place limits on the number of invitations they send to their panelists.

Final thoughts

Some final thoughts on what we learned from this exercise:

•   Despite the excesses of some panel providers, this article is not meant to be an indictment of all panel providers. To identify the more responsible providers you have to ask the right questions - and get answers.

•   Get their answers in writing. We found a number of cases where what was said over the phone didn’t match what they put in writing.

•   Study their answers carefully. Did they really say what you hoped/thought they said?

•   Talk to the panel firms about the problems you find. If they are never questioned there will never be any motivation to change.

•   Don’t just pick the lowest-priced panel. That is what started the problem in the first place.

•   Clients who do not deal directly with panel providers should ask their research firm to provide the information as part of the evaluation of the research methodology.

•  Finally, be sure to pick a panel that definitely limits invitations.

References

1 Willems, Pieter; van Ossenbruggen, Robert; and Vonk, Ted. “The effects of panel recruitment and management on research results - a study across 19 online panels.” ESOMAR 2006.

2 Bruzzone, Don. “Sampling the impact,” Quirk’s, April 2009. [Enter article ID 20090404 at www.quirks.com.]