A growing sense of community

Editor’s note: Gregory S. Heist is director of innovation, and Mitchell S. Sanders is research director, analytics, at Gongos Research, Auburn Hills, Mich.

Over the past several years, online communities have developed into powerful platforms for engaging customers in extended conversations. As more and more corporations embrace online communities, many market researchers are eager to pursue a more sophisticated set of research applications within them.

General Motors was among the first in the industry to take private online communities to this next level. Seeking to lend greater validity to the insights gained from online communities, it needed them to serve not only as a vehicle for interaction and observation but also as a source of findings that carry statistical weight.

GM’s experience, and the experiences of other companies pursuing quantitative results, suggest that the industry still yearns for answers to significant questions about the quality of insights generated by online communities:

  • Are business insights from online communities similar to those from online access panels?
  • What types of consumers join online communities?
  • What level of data quality do online communities provide?
  • How does the online community experience foster positive feelings toward the research process among respondents and, by extension, enhance data quality?

The answers to these questions are critical, since they point to the potential for online communities to represent a new research paradigm - one that combines the tools and statistical power of quantitative research with an interactive and engaging environment that can fuel additional types of insights.

This article explores these questions and examines how the implications of our findings will affect the future of online communities in marketing research.

Same explanatory power

For online communities to become the basis for a new paradigm in market research, it’s vital that they provide the same explanatory power and business insights as current approaches. In contemporary market research, that standard has been set by online access panels. While Internet panels are certainly not without their detractors, in a world where online access is widespread and surveys in other modes are increasingly difficult to execute, we feel online panels are the most relevant quantitative benchmark for any new research platform.

Because participants in online communities are recruited using methods similar to online access panels (or in some cases, recruited directly from them), it would be surprising to find that the two types of sample generate radically different results. To test this intuition, we examined three studies conducted with parallel samples from online communities and from online access panels.

As suspected, we found a very high degree of similarity between online community results and our benchmark - online panel results. The results from the side-by-side studies (unweighted studies of American adults) are as follows:

Greeting cards. In a study of seven holiday greeting concepts, a constant-sum allocation measure of purchase intent yielded identical rankings of the concepts in both samples. For each concept, average interest differed between samples by no more than 2.5 points on a 100-point scale.

Pet care. In an importance rating of 61 evaluative statements about shopping for pet care items, the top three statements appeared in the same order in both samples, and eight of the top 10 statements were common to both. Over the entire list, rankings differed by an average of only two spots. The ranking of satisfaction with pet care retailers was identical across samples (considering the six retailers with at least 100 responses from each sample).

Electric vehicles. A study of electric vehicle concepts yielded very similar results about attitudes, concept evaluation and manufacturer rankings (see sidebar).

Further, the negligible differences between the samples would not have changed the nature of any business insights from the three studies.
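The kind of side-by-side check described above can be reproduced with a few lines of analysis code. The following is a minimal sketch - using purely hypothetical item scores rather than data from the studies above - of one way to compare the rankings produced by two parallel samples:

  # Minimal sketch: comparing item rankings from two parallel samples
  # (hypothetical scores; not data from the studies described above).
  import pandas as pd

  scores = pd.DataFrame({
      "item": ["price", "selection", "location", "staff", "returns"],
      "community_mean": [8.4, 7.9, 7.1, 6.8, 6.2],  # hypothetical community sample
      "panel_mean": [8.5, 7.7, 7.2, 6.3, 6.5],      # hypothetical panel sample
  })

  # Rank the items within each sample (1 = highest mean score)
  scores["community_rank"] = scores["community_mean"].rank(ascending=False)
  scores["panel_rank"] = scores["panel_mean"].rank(ascending=False)

  # Average number of places by which an item's rank shifts between samples
  avg_rank_shift = (scores["community_rank"] - scores["panel_rank"]).abs().mean()

  # Spearman correlation of the two rankings as an overall agreement measure
  rank_agreement = scores["community_rank"].corr(scores["panel_rank"], method="spearman")

  print(scores[["item", "community_rank", "panel_rank"]])
  print(f"Average rank shift: {avg_rank_shift:.2f} places")
  print(f"Rank agreement (Spearman): {rank_agreement:.2f}")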

This is not to say that participants in community surveys are in all ways identical to online panelists. Not surprisingly, members of an automotive-related community tend to have greater interest in, and expertise about, automotive issues (see sidebar). Nevertheless, despite these differences, this analysis clearly shows that online communities and online panels provide equivalent business insights, and would produce the same business decisions.

Selection effects

Online communities and online panels share another characteristic - the potential for selection effects. Membership in an online panel or an online community does not happen at random - participants have chosen to join, and have chosen to stay involved. Therefore, some characteristic differences from the general population are to be expected in any sample obtained online.

In a sample from Consumer Village - a Gongos Research-managed community - we found that respondents spend more time online per week (21 hours) than the average online American (10 hours). Respondents from Consumer Village also engage in a more eclectic set of online activities - they are more likely to use online classifieds, buy in online auctions and do their banking online.

Still, these observed differences should not be concerning, for two reasons. First, members of online communities and the general population engage in the same types of online activities, albeit to varying degrees, and unless those differences in degree are correlated with responses to the questions of interest, they will not affect results. In such cases, conclusions can be considered projectable to the general population.

Second, for topics where online behaviors are relevant - such as exposure to information found online or opinions related to online privacy - knowing the nature of the differences also provides the power to mitigate them. National benchmarks for online behavior can serve as a “safety net,” with weighting or sample stratification used to balance results. This type of adjustment would be in addition to any stratification, quotas or weighting implemented to align sample demographics with known benchmarks.
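As an illustration of the “safety net” adjustment described above, the following sketch applies simple cell weighting to align a sample’s distribution of weekly online hours with a national benchmark. The hour bands, benchmark proportions and variable names are assumptions made for illustration, not figures from Consumer Village or any national study.

  # Minimal sketch of cell weighting to a national benchmark (hypothetical figures).
  import pandas as pd

  # Respondent-level data with a banded "hours online per week" variable
  sample = pd.DataFrame({
      "respondent_id": range(1, 9),
      "online_hours_band": ["0-9", "0-9", "10-19", "10-19", "10-19", "20+", "20+", "20+"],
  })

  # Assumed national benchmark proportions for the same bands
  benchmark = {"0-9": 0.45, "10-19": 0.35, "20+": 0.20}

  # Observed proportion of the sample falling into each band
  observed = sample["online_hours_band"].value_counts(normalize=True)

  # Each respondent's weight = benchmark share for their band / observed share
  sample["weight"] = sample["online_hours_band"].map(lambda band: benchmark[band] / observed[band])

  # After weighting, the band shares match the benchmark, so weighted survey
  # measures are no longer dominated by the heavier-online cells.
  weighted_shares = sample.groupby("online_hours_band")["weight"].sum() / sample["weight"].sum()
  print(weighted_shares)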

Can provide quality data

In general, evidence shows that online communities can and do provide high-quality data. In our experience with Consumer Village, for example, we have seen an average response rate of 33 percent over the past 12 months, with 90 percent of studies having response rates between 22 percent and 41 percent.

Inattentiveness varies with study type and length, but respondents from online communities are consistently more attentive than industry norms, as measured by the incidence of straightlining and by consistency checks built around data-quality traps.
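Straightlining is typically flagged when a respondent gives the same answer to every item in a rating grid, while a data-quality trap checks whether a respondent selected a specified answer. A minimal sketch of both checks is shown below; the grid items, trap question and expected answer are hypothetical and are not Gongos’ actual quality rules.

  # Minimal sketch of straightlining and trap-question checks (hypothetical data and rules).
  import pandas as pd

  # Columns q1-q5 are items from the same rating grid; q3 doubles as a trap item
  # whose instructed answer is assumed to be 4.
  grid = pd.DataFrame({
      "respondent_id": [101, 102, 103],
      "q1": [4, 5, 3],
      "q2": [4, 2, 3],
      "q3": [4, 4, 3],
      "q4": [4, 3, 3],
      "q5": [4, 5, 3],
  })
  items = ["q1", "q2", "q3", "q4", "q5"]

  # Flag respondents whose answers show no variation across the grid
  grid["straightlined"] = grid[items].nunique(axis=1) == 1

  # Flag respondents who missed the trap question's instructed answer
  grid["failed_trap"] = grid["q3"] != 4

  print(grid[["respondent_id", "straightlined", "failed_trap"]])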

On average, of those who qualify for a study, 92 percent complete it. If we include those who are terminated because they fail to qualify, the completion rate rises to 97 percent.

Consumer Village retains an average of 88 percent of its members per quarter (where attrition is defined as non-participation over a six-month period). At least 95 percent of Consumer Village respondents are willing to provide information about their household’s income.

Other communities that structure incentives to promote participation over the life of the community can be expected to have even higher response, completion and retention rates. But even without this boost, data analysis can proceed without significant concerns about data quality.

Positive experiences

Some standard elements of online communities, such as customization, visual appeal and ease of use, can facilitate data quality. But we believe that the key drivers of data quality are the positive and diverse experiences available to participants in online communities.

To understand how these opportunities impact respondent motivation, we asked a sample of Consumer Village members to rate the importance of various statements about participation in online research. Some of the usual suspects emerged:

  • Nearly 90 percent considered “earning incentives or rewards for participation” to be important.
  • More than three-fourths identified “influencing the decisions that companies make about products,” and “expressing my opinion” as important.

Yet there was also broad interest in the types of experiences provided by online communities:

  • 47 percent considered it important to “interact with others about topics that interest me.”
  • 39 percent considered it important to “belong to a community.”

A more in-depth dialogue with members of Consumer Village echoed these results and indicated the importance of community interaction in producing a positive online experience.

To measure community participation, we looked at the frequency with which respondents posted messages in community forums, either in response to a moderator-sponsored activity or on their own initiative. For many members, the community experience resembles that of an online panel: they respond to survey invitations but decline to participate in discussion forums. Significant numbers, however, take advantage of the interactions the community has to offer: 19 percent post on average at least once every two weeks, and another 10 percent post on average at least once a month.

Further, these community-oriented behaviors are strongly associated with increased participation in quantitative studies.

  • Among those who post on average once a month, participation in quantitative surveys is 29 percentage points higher than for the average member of Consumer Village.
  • For the most frequent posters, those who engage in discussions on average at least once every two weeks, the increment is 36 percentage points.

This finding makes sense on both practical and motivational grounds. There is a direct effect of participation, as more frequent posters have greater exposure to surveys posted in the community and therefore encounter more opportunities to participate.

But more importantly, there is an indirect effect of participation, based in the motivations of community members. Those who are inspired to share their opinion in one way (via discussion forums) will also be likely to express that opinion in other ways (quantitative surveys). And because communities tend to attract and retain those who are interested in expressing their opinions in an interactive community, those same individuals will respond at relatively high rates when invited to participate in quantitative studies.

High response rates would not be especially helpful if highly motivated respondents skewed a study’s results, because easier access to lower-quality data is not a winning combination. But as we’ve seen from our sample comparisons, online communities generate the same conclusions as online access panels. Online community members may express their opinions more frequently, more avidly and more vividly, but the opinions they express don’t differ significantly from those of their more reserved counterparts.

Implications for the future

Our findings point to a number of exciting implications for the future. More than just a technology-driven reinvention of the market research wheel, online communities have the potential to usher in a new type of relationship with consumers.

The interactive and collaborative community environment creates a different dynamic for marketing research. Beyond a linear transaction of data, communities offer a wide range of ways for consumers to share opinions, exchange ideas with others and learn about how others feel about various topics.

Because members can post topics of their own, they are able to connect with other consumers, giving researchers the opportunity to observe consumer behavior on a larger scale. By mining consumer-generated content for insights, researchers can identify innovation opportunities and gain fresh perspectives.

The community model paves the way for corporations to conduct quality research while simultaneously reducing costs and increasing the speed with which decisions can be made using consumer input.

Since communities generate real-time longitudinal insights that can be continuously integrated into the decision-making process, they help “bring consumers to the executive table” within an organization.

As GM’s Project Driveway shows, the informal and interactive nature of online communities provides an opportunity to integrate marketing and branding activities within the context of a true market research environment. Project Driveway is a multi-year online community for the GM fuel cell program that blends market research, viral marketing, public relations and in-field product testing, and it represents a joint effort among GM’s fuel cell engineering team, global product research and corporate marketing. Members of Project Driveway are selected based upon their geographical location (living close to existing hydrogen refueling stations), various demographic and attitudinal factors, and their interest in and passion for green vehicle technologies.

Select members of Project Driveway are given the opportunity to be among the first consumers in the world to test drive GM’s fleet of hydrogen fuel cell vehicles for two or three months. Drivers participate in press and media events and provide ongoing detailed feedback to the community about their driving experience.

By integrating all of these disciplines into one community, Project Driveway highlights ways in which online communities can become even more valuable for meeting the needs of multiple disciplines within an organization.

New realm

It is clear from our analysis that online communities represent a promising new paradigm in the field of marketing research. They have the potential to deliver statistical results that are equivalent to traditional online access panels while simultaneously creating a rich new realm for interacting with, and learning from, consumers in relevant and dynamic ways.

Let’s look at the numbers: online community vs. online panel

It’s not surprising that over 90 percent of respondents in a recent automotive study agreed that “the cost of gasoline is rising at an alarming rate.” Similarly, large majorities concurred that “we rely too much on foreign countries for our oil/petroleum needs” and that “I am concerned with the current cost of fuel for my vehicle,” while fewer than 10 percent of respondents agreed that “I do not think fuel-efficient vehicles are important.”

These were among 42 statements about fuel economy, environmental issues and vehicle styling that were recently evaluated using two different samples: an automotive online community and an online access panel. Importance rankings were highly consistent across the two samples: the same eight statements topped both lists, with each ranked within one place of its position in the other sample. Overall, half of the 42 statements were ranked identically or within one place, and on average, each statement’s ranking differed by only 2.5 places between the two samples.

When asked about purchase consideration for a plug-in hybrid vehicle on a 10-point scale, the average response differed by no more than 0.25 points between the two samples. And when asked about purchase consideration with respect to specific manufacturers, company rankings were identical in the two samples.

Community members are more likely to describe themselves as people who “like introducing new technologies to [their] friends” (42 percent vs. 27 percent) and whose friends “think of [them] as a good source of information when it comes to vehicles” (45 percent vs. 29 percent). But, as the strong similarities between samples suggest, these knowledgeable community members are just as capable of providing useful insight into the vehicle-related attitudes and perceptions of the general public.