New problems, new solutions

Editor’s note: Terry Brazeal is vice president, longitudinal analytics, at Chicago research firm TNS NFO.

During the last few years, much of the data collection done in the research industry has moved from traditional telephone interviewing, mail surveys and some types of central location testing to the Internet. Rapidly developing and often inexpensive online technology has made it possible for respondents to do more and more things online: “pick up” product packages and turn them over to read the fine print; view magazine ads in a magazine-like context, complete with a realistic page-turning mechanism; and visit a virtual supermarket section and choose among the competing items. Today’s technology even allows us to put an avatar on the survey page to provide audible encouragement: “We’re almost finished now - just a few more questions to go.”

In the steady movement from ink to pixels, diary panels have not been exempt. Both long-term open-ended panels covering multiple categories and brief ad hoc projects have made the transition. It seems logical, natural and even inevitable: If so much other research can be done online, why not diary panels? Specifically, why not longitudinal diary panels - panels that collect the same information from the same people, continuously, over time?

Now that several years of data exist, some answers to that question are becoming clearer. Diary panels, due to their nature and complexity, present unique problems in translation from one mode to another and require new solutions.

Longitudinal diary panels have been around since the mid-1940s in custom and syndicated environments. The diaries have served a wide variety of needs across the consumer packaged goods, textiles and soft goods, and financial services sectors, measuring usage and consumption, tracking behavior and covering many other areas.

In the CPG industry in the United States, these panels began migrating away from paper diaries in the 1980s as shoppers’ cards and in-home scanners provided by A.C. Nielsen and Information Resources made it possible to run a “no-diary” diary panel by directly collecting the UPC codes of items purchased by known groups of consumers. Unfortunately, UPC codes were not always available in usable ways from some important trade channels, and they offered no assistance in tracking consumption or other types of behavior, so the in-home paper diary continued to play an important role.

Since 2000, the research community has worked in earnest to move paper diaries online. Sometimes the transition involved a major shift in design philosophy - surrendering the “longitudinal” aspect of the sample. Since it might be difficult to keep the same people reporting online month after month, some researchers opted to convert longitudinal diaries into online tracking studies, collecting the same information every month but from a changing sample of respondents. This method offered much of the same information supplied by diary panels - share, volume and price tracking - but sacrificed some of the most powerful types of longitudinal analyses, such as brand shifting, buyer flows and behavior-based buyer groups (heavy/light, loyal/non-loyal, etc.). By their nature, these types of analysis require continuous reporting from a constant group of individuals, and when longitudinal samples disappeared, so did they.
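The dependence of buyer-flow analysis on a constant sample can be made concrete with a small sketch. This is a hypothetical illustration, not the method of any particular panel operator: the panelist IDs and brand names are invented, and real panels track volumes and weights, not just one purchase per period.

```python
from collections import Counter

# Hypothetical longitudinal records: panelist ID -> brand bought in a period.
# A buyer-flow (brand-shifting) table requires the SAME panelists in both periods.
period1 = {"p01": "BrandA", "p02": "BrandA", "p03": "BrandB", "p04": "BrandB"}
period2 = {"p01": "BrandA", "p02": "BrandB", "p03": "BrandB", "p04": "BrandA"}

# Only panelists who reported in both periods can contribute to the flow;
# with a fresh sample each month, this intersection would be empty.
constant_sample = period1.keys() & period2.keys()
flows = Counter((period1[p], period2[p]) for p in constant_sample)

for (src, dst), n in sorted(flows.items()):
    print(f"{src} -> {dst}: {n}")
```

A tracking study with a changing sample can still report each period's brand shares, but the pairing step above is impossible, which is why brand shifting and buyer flows disappear along with the longitudinal sample.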

Other researchers were not willing or able to make that sacrifice, or did not agree it was necessary. They transitioned longitudinal panels from paper to the Internet while maintaining the basic structure - a longitudinal sample consisting of the same panelists continuously reporting. As a result, the industry is beginning to understand and manage the differences in the data induced by this fundamental change in reporting methodology.

Questionnaire design

It is a well-established research principle that changes in questionnaire design can significantly change the results of a study, even when the study is delivered by the same methodology. The effect is amplified when an altered questionnaire is delivered and returned via an entirely different channel.

To address this concern, early online questionnaire designs mimicked the paper diary format as closely as possible. Researchers hoped the familiar look and feel of the diary would induce panelists to continue to report. Eventually, however, to ease respondent burden and promote panelist tenure, online diary designs started to take advantage of interactive capabilities such as drop-down menus, guess-ahead completion of entries and automatic duplication of recurring entries. These innovations did succeed in reducing panelist burden and improving longevity. But inescapably, online questionnaires differ from their paper counterparts. The potential differences in results can usually only be guessed at in the absence of parallel tests, which are often expensive, time-consuming and impractical.

Sample characteristics

The U.S. online population available to respond to an interactive diary is demographically different from the total population, most notably in age, education and race. (These variations are declining as the online population grows to approximate the total population.) While some differences can be dealt with by sample management - over- or under-recruiting in key areas, demographic weighting of the sample, either by cells or at the margins, etc. - these techniques can only deal with variables that are known or suspected to be relevant, measurable and actually measured.
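The cell-weighting technique mentioned above has a simple mechanical core: each panelist's weight is the ratio of the cell's population share to its sample share. The sketch below is a hypothetical illustration with invented age cells and targets; real panels weight on several interlocking demographics at once.

```python
from collections import Counter

# Assumed population targets for each demographic cell (hypothetical figures).
population_share = {"18-34": 0.30, "35-54": 0.40, "55+": 0.30}

# Hypothetical panel: the cell each of ten panelists falls into.
panel_cells = ["18-34", "35-54", "35-54", "35-54", "55+",
               "55+", "55+", "55+", "35-54", "35-54"]

# Sample share of each cell, then weight = population share / sample share.
sample_share = {c: n / len(panel_cells) for c, n in Counter(panel_cells).items()}
weights = {c: population_share[c] / sample_share[c] for c in population_share}

for cell, w in sorted(weights.items()):
    print(f"{cell}: weight {w:.2f}")
```

The limitation the article describes is visible here: the computation can only balance cells that were defined and measured in advance. No such table exists for "comfort with technology," so no weight can correct for it.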

For instance, dimensions such as “general comfort with technology,” or “willingness to innovate,” may characterize online diary panelists. As an example, in comparing mail to online diary panel members, it became clear that online panel members are much more likely to own and use a digital camera than members of a paper diary panel. In hindsight, this makes perfect sense, but it was not foreseen before the panel began. Another comparison of two panels revealed that online panelists are more likely to make transactions at an ATM. Even if such a possibility had been anticipated in a particular category, how might it be dealt with? What demographic variable is a proxy for “feels comfortable with an ATM?” How could the sample be weighted to balance for it?

Fortunately, this type of problem has a limited future. In 10 years or so, the probability of someone being intimidated by a keyboard or a mouse will be as low as the probability of being intimidated by a ringing telephone or a ballpoint pen. But a lot of research must be done, used and validated during the next 10 years.

Reporting consistency

An obvious difference between a paper diary and an online diary is physical presence. The paper diary is there, sitting on the kitchen counter or stuck up on the refrigerator with a magnet. A new diary comes in the mail every month, along with an envelope to mail back the old one. The presence of the diary and the arrival of the new one are reminders to complete and return the diary every month.

On the other hand, the online diary has an intangible existence - even seeing it requires initiative. The panelist must sit down at the computer, get on the Internet, go to the correct Web site and log into the diary, usually with an ID and password. Is it any wonder that reporting may be uneven, or that return rates may sag after a while? Remedies for this are available, but it is a problem that must be addressed.

What matters in getting panelists to report online?

Several key factors affect the ability to get reliable, representative reporting from online panelists. Understanding these factors will not solve every problem, but it will make it possible to move ahead and focus attention on data analysis rather than on the mechanics of data collection.

1. Incentives matter

While this might seem obvious, it is not always possible to test the impact of varying an incentive for an ongoing diary panel. We have had the opportunity to test this and the results have been eye-opening.

Figure 1 shows the results. To protect proprietary information and our client’s confidentiality, the dates and the actual incentive levels used are masked and shown as “low,” “medium” and “high.” In virtually every one of the six time periods studied, response rates varied directly with the amount of incentive paid.

Despite the difference in response rates, the reported data did not vary significantly by incentive level. Panelists in the “low” incentive group reported purchase and behavior levels similar to those in the “medium” and “high” groups. However, significantly more of the “medium” and “high” incentive panelists actually returned their diaries. Since many longitudinal diary panel analyses rely on a large static sample - a panel that has returned most or all of the diaries - this is extremely important.
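The "static sample" idea - restricting longitudinal analysis to panelists who returned most or all of their diaries - can be sketched mechanically. The IDs, periods and the one-missed-diary tolerance below are invented for illustration; actual qualification rules vary by panel and analysis.

```python
# Hypothetical return records: the reporting periods each panelist completed.
returns = {
    "p01": {1, 2, 3, 4, 5, 6},
    "p02": {1, 2, 4, 5, 6},
    "p03": {1, 2, 3, 4, 5, 6},
    "p04": {2, 3},
}
periods = {1, 2, 3, 4, 5, 6}  # the full analysis window

# Strict static sample: panelists who returned every diary in the window.
static_sample = {p for p, done in returns.items() if done == periods}

# A looser "most diaries" rule, here allowing at most one missed period.
near_static = {p for p, done in returns.items() if len(periods - done) <= 1}

print(sorted(static_sample))
print(sorted(near_static))
```

This is why the incentive finding matters so much: higher return rates do not change what any one panelist reports, but they enlarge the set of panelists who qualify for the static sample at all.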

2. Experience matters

Online reporting of even moderately complex behavior involves a learning process. Figure 2 shows purchases of a high-incidence category over time reported by a new online diary panel versus a long-existing mail panel. It is clear that reporting improved steadily as the online panelists learned and remembered their new responsibilities.

The nature of the category as well as the data from the paper diary make it very unlikely that the actual purchasing behavior of the online panel changed. The true purchasing levels of the two panels were almost certainly fairly consistent. Over time, the experience and learning of the online panelists (aided by ongoing training and reminders from the panel operators) gradually improved their reporting. Decisions based on the early reporting of the online panel might have led to unfortunate consequences.

A recent test yielded experimental confirmation of the impact of experience on online panelists. A few periods after the beginning of an online test, a large number of new recruits were introduced into an online panel. Overall reporting levels immediately dropped, relative to the mail panel. This result was analyzed by separating the experienced online panelists from the new recruits. As Figure 3 shows, the experienced online panelists tracked much closer to the mail panelists, while the new recruits had not yet climbed their learning curves.

To be clear, the paper diary had been operating for several years, and many of the panelists were members of reasonably long standing. Although there might originally have been a learning curve for the paper diary, for most panelists it was well in the past.

3. Diary design matters

The design of the questionnaire makes a big difference - pixels are not paper. As previously discussed, the online diary lacks a physical presence in the home. This presents two problems:

  • Without the reminder of the diary in the purse or on the refrigerator, panelists often forget to report all their purchases or behavior. This is especially true for purchases involving small amounts of money, or behavior that may take place away from home.
  • Without the reminder of the new diary arriving in the mail, panelists may forget to “return” the online diary. In our system, a panelist must log in and click a button to indicate the report is complete for the month.

Several steps can be taken to help panelists remember to report smaller purchases or away-from-home behavior.

  • Small, printable sections of the online diary can be created. Panelists can click an icon and give themselves a “mini-diary” to carry in their pocket or purse, or post on the refrigerator where the old one used to be.
  • Reminder tools - for example, small vinyl receipt collectors - can be given to the panelists, providing them with a place to store their purchase receipts during shopping, and making it easy for them to recover the information they need when they get home.
  • E-mail reminders sent near the end of the reporting period help stimulate return of the reports. This is a potentially dangerous technique that must not be over-used. If panelists start perceiving mail about the diary as spam, the relationship will end long before it should.
  • Hybrid mail/online panels may be useful. These panels use paper diaries for daily recording. At the end of the month (or other reporting period) panelists log on and transcribe their already-collected information online. The arrival of the new diary provides a reporting reminder. Although this method combines the speed, economy and operational efficiencies of online reporting with the benefits of having the diary present in the household, panelists are, in effect, asked to record data twice. To control respondent burden, its application should be limited to categories involving relatively few purchases and a limited amount of detail.

Learning continues

The movement of diary panels from mail to online modes will undoubtedly continue, and much remains to be learned. Indeed, there have been discussions about diaries on PDAs and even cell phones. The issues involved in maintaining reliable reporting from a longitudinal online sample are, to date, only partly overcome. But learning continues, and perhaps in the near future the current set of issues will look trivial and quaint - at least when compared to the entirely new set of problems that will inevitably have arisen by then.