A volunteered response

Editor’s note: Margaret R. Roller is president of Roller Marketing Research, Gloucester, Va. Linelle M. Blais is national vice president, voluntarism, talent strategy, with the American Cancer Society, Atlanta.

With over three million volunteers, the American Cancer Society (ACS) is the largest cancer-related nonprofit community-based voluntary health organization in the U.S. In 1998 ACS embarked on a research program with its volunteers that went beyond any known published research within or outside the organization in terms of scope and depth. The centerpiece of our research - which was conducted every year through 2006 - was the Volunteer Satisfaction Study (VSS). The principal objectives of this study were to provide: an in-depth, detailed measure of volunteer satisfaction at the community level; actionable results for the ACS divisions as well as the national home office; and, a mechanism by which ACS could track or monitor volunteer satisfaction over time.

When considering design modes in 1998, a mail survey using the U.S. Postal Service (USPS) was deemed preferable to either telephone or face-to-face due to the nature and length of the questions as well as cost. For this reason the basic research design for the VSS consisted of a self-administered mail survey conducted among a random selection of “active” ACS community-level volunteers.

However, as Web surveys became increasingly popular among survey researchers, research work within ACS began to shift to this electronic mode, with staff increasingly conducting their own surveys via Web-based tools such as Zoomerang and SurveyMonkey. And, not surprisingly, we witnessed a rising demand for a Web version of the VSS.

Beyond the sheer novelty of online research (it seemed easy, new and everyone was doing it), the staff perceived real advantages to an online solution, including the ease of administration (no mailings), cost savings (no printing, addressing, stuffing or postage), time efficiency (less staff effort, less survey data processing time by the analyst), and increased depth and breadth of response across the volunteer population.

While there is reason for many researchers to be enamored of the relative simplicity, low costs and speed associated with Web designs, the Web mode is not free of administrative hurdles (e.g., maintaining accurate e-mail addresses) or gnawing questions associated with response quality. For instance, a meta-analysis reveals that the Web mode typically yields an appreciably lower response rate than other survey methods (Manfreda et al., 2008), and response quality is affected by socially desirable responding as well as the inherent bias fostered by any particular survey mode.

Although the large demographic differences prevalent in the 1990s have diminished, older people, lower-income households, those living in rural areas and/or with less education are still less likely to use the Internet compared to younger, higher-income, urban, educated individuals (Pew Internet & American Life Project, 2008). These differences - particularly age, given that the average ACS volunteer is 50 years old - have the potential to negatively impact the results of a Web version of the VSS.

Analyze the viability

ACS conducted a test to understand whether findings from other researchers’ experimentation with survey modes apply to the nonprofit volunteer sector and, specifically, to analyze the viability of a Web design for the VSS. This study set out to answer the following:

1. What is the rate at which volunteers would respond to a Web survey compared to the traditional VSS paper design?

2. What are the mode preferences among subgroups of the volunteer population?

3. To what degree are behaviors and opinions measured by this study the same or different depending on mode?

4. How complete are volunteers’ responses (i.e., the extent of item non-response)?

The research model comprised three test conditions - the paper-only (“paper”) group (the control condition); the paper-Web option (“option”) group, which was handled like the paper group but given the option of completing the questionnaire on the Web; and the Web (“Web”) group, for which all contacts, including the link to the online survey, were sent via e-mail. A total of 4,000 volunteers from the central U.S. were randomly selected to participate in this study. Two thousand of these volunteers were randomly selected from the entire database of active volunteers in this region and then randomly assigned to either the paper (n=1,000) or the option (n=1,000) group. The Web group was randomly selected from the remaining volunteers in the database who had supplied their e-mail addresses (n=2,000).
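For readers who want to replicate this kind of design, the assignment logic above can be sketched in a few lines of Python. This is an illustrative reconstruction, not the study’s actual code; the volunteer records and the “email” field are hypothetical stand-ins for the ACS database.

```python
import random

def assign_test_groups(volunteers, seed=42):
    """Sketch of the three-condition assignment described in the text.

    `volunteers` is a list of dicts; the "email" key is a hypothetical
    field marking volunteers who supplied an e-mail address.
    """
    rng = random.Random(seed)  # fixed seed so the draw is reproducible

    # Draw 2,000 volunteers at random from the full database,
    # then split them evenly between the paper and option conditions.
    pool = rng.sample(volunteers, 2000)
    rng.shuffle(pool)
    paper, option = pool[:1000], pool[1000:]

    # The Web group is drawn from the *remaining* volunteers
    # who supplied an e-mail address.
    drawn = {id(v) for v in pool}
    remainder = [v for v in volunteers if id(v) not in drawn and v.get("email")]
    web = rng.sample(remainder, 2000)

    return paper, option, web
```

Note that, as in the study, the Web group is not drawn from the same frame as the other two conditions: it is restricted to volunteers with e-mail addresses on file.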

To preserve the historically reliable data of the VSS as well as take a true measure of mode effects, every attempt was made to maintain uniformity across test segments pertaining to: the prominence of ACS and its logo in all facets of the research; the look and feel of the questionnaires; the availability of a toll-free telephone help/comment line for respondents; the incentive (all respondents were offered a summary of results); and allowable skips. That is, given that respondents to the paper questionnaire can opt to skip partial or entire questions, the online version generally did not force respondents to respond in order to advance in the questionnaire. One area of variation was the follow-up telephone interviews with non-respondents, which were only conducted with the option and Web test groups.

Results

Rate of response

A total of 1,166 usable questionnaires were returned from active volunteers - 51 percent or 590 were completed on paper and returned via USPS (309 from the paper group, 260 from the option group, and 21 from the Web group) while 49 percent or 576 were completed via the Web (498 from the Web group and 78 from the option group). Figure 1 shows the rate of questionnaire return across all three study groups. Within the first three weeks in the field, significant differences emerged, with the paper group reaching a return of 31 percent compared to only 25 percent and 21 percent in the option and Web groups, respectively. At the end of three weeks, telephone interviews were conducted with non-respondents in the option and Web groups. These follow-up interviews resulted in conversions, lifting the ultimate rate of response to 36 percent within the option group and 27 percent for the Web segment. The paper group ended with a response rate only slightly higher than in the first three weeks (34 percent).
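The mode split reported above can be tallied directly from the group-level counts. The following Python fragment is simply arithmetic on the figures in the text, not part of the original analysis:

```python
# Usable returns by study group, split by completion mode (from the text).
paper_returns = {"paper": 309, "option": 260, "web": 21}   # completed on paper
web_returns = {"web": 498, "option": 78}                   # completed online

paper_total = sum(paper_returns.values())   # 590 paper questionnaires
web_total = sum(web_returns.values())       # 576 Web questionnaires
total = paper_total + web_total             # 1,166 usable questionnaires

paper_share = round(100 * paper_total / total)  # about 51 percent
web_share = round(100 * web_total / total)      # about 49 percent
```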

Based on return rate alone, it would appear that offering an online version of the VSS may not be worth the effort. The return from the Web group was significantly lower than the rate of return from either the paper or option group. The relatively low response from the Web group is in line with various industry sources that consistently find a 20 to 30 percent rate of response to an online test condition (Manfreda and Vehovar, 2005; Kaplowitz et al., 2004). In the case of this study, one partial explanation for the low Web response might be derived from the fact that central U.S. volunteers tend to reside in rural communities and research has indicated that rural residents represent the lowest concentration of Internet users in the U.S. (Pew Internet & American Life Project, 2008).

Reluctance to respond online is also made apparent by the group that had a choice between paper and online completion (i.e., the option group). This group overwhelmingly favored the paper over the online version (77 percent vs. 23 percent, respectively, Figure 2). The fact that the Web and option groups opted for the mode to which they were initially assigned is consistent with the work of Gesell, Drain, Clark and Sullivan (2007), who found that “respondents showed a preference for the survey mode that was randomly assigned [to them].”

Respondent demographics by mode of completion

Although the online mode generated the lowest overall response, it is important to go beyond response by test condition and compare total completions by mode - paper vs. the Web - in order to understand the contribution of each to the final interpretation of results. In terms of demographics, the biggest differentiator is age, with a median of 50 years among volunteers who completed the paper questionnaire compared to 40 years of age among online respondents. This result, of course, is not surprising, given that Web survey response has been shown to be characterized by younger respondents compared to respondents who opt for the paper mode (Kaplowitz et al., 2004).

Other demographic indicators - race, ethnicity, gender and type of community - were nearly identical across modes, with a slightly greater proportion of Caucasian and suburban volunteers completing the Web survey (98 percent and 33 percent, respectively), and a marginally higher percent of females and rural-community volunteers responding in the paper mode (86 percent and 52 percent, respectively, Figure 3).

Type of volunteer by mode of completion

It is noteworthy that fundraising and advocacy volunteers were significantly more likely to respond to the online VSS questionnaire while volunteers involved with patient services were more apt to respond on paper. This is consistent with the fact that fundraising volunteers are typically younger than volunteers overall. This inverse relationship between Web completion and age extends to volunteers’ tenure with ACS as well as their association with cancer. Specifically, volunteers responding via paper (i.e., older volunteers) have been ACS volunteers one year longer (on average) than volunteers who completed the Web questionnaire (five vs. four years), and are significantly more likely to be cancer survivors. Interestingly, however, leadership volunteers (vs. non-leaders) significantly favor the Web mode.

Response differences by mode of completion

Importance attributes

Because survey mode impacts the type of volunteer who responds, it is logical to assume that the results would indicate response differences across modes. While there is a high degree of consistency in the data in many areas, a few notable exceptions are apparent. For instance, significantly more volunteers responding to the paper (vs. the Web) survey “strongly agree” that “staff’s support” and a “staff that is willing to listen to me” are important to their volunteer experience at ACS. On the other hand, volunteers responding to the online survey place a greater importance on: project descriptions, having projects that are “matched to my skills,” “updates on various volunteer opportunities” and “opportunities to assume greater leadership responsibilities.” This makes sense given that Web respondents are more involved in fundraising activities and leadership positions, which inherently raise the level of complexity of their participation with the Society, making them more attentive to project-specific aspects of their work as well as more receptive to various opportunities. Furthermore, the greater importance Web respondents place on their skills is consistent with the finding that skills and skill development play a more important role among younger ACS volunteers (Roller and Blais, 2005).

Performance attributes

Possibly because of their level of involvement with ACS, volunteers responding to the Web survey generally rate ACS higher in performance compared to volunteers completing the paper questionnaire. There are several performance attributes where significant differences exist, including the areas of project materials, diversity, recognition, job descriptions, skill development and opportunities. Generally, Web responses are more positive or favorable toward ACS than those from volunteers in the paper mode. This tendency toward favorable responses among Web respondents is consistent with other work in this area (Carini, Hayek, Kuh, Kennedy and Ouimet, 2003).

Communication preferences

Another area where differences emerge between volunteers who completed the paper questionnaire and those who responded online is in preferred communication vehicles (see Table 1). Not surprisingly, volunteers in the paper mode are significantly more interested in receiving a newsletter in the U.S. mail, while Web respondents indicate a higher preference for all other channels, particularly communication via e-mail. The fact that Web respondents significantly prefer face-to-face meetings with staff most likely reflects the higher incidence of fundraising and leadership volunteers whose activities are best accomplished through personal interaction.

Item non-response by mode of completion

Item non-response is typically higher among volunteers who completed the paper questionnaire compared to those who completed the online survey. Similar results have been reported elsewhere (Grigorian and Sederstrom, 2005), with the suggestion that Web respondents are “hyper-cooperative” due to their level of comfort as well as enthusiasm with the technology.

Another explanation for the higher item non-response in the paper mode may be the ease with which paper respondents can pick and choose which questions to answer. Web respondents, by contrast, may fear that skipping a question will trigger an error message (even though very few error messages were utilized in the Web design). This may explain why all of the volunteers in the Web mode answered the overall satisfaction question while 3 percent in the paper mode did not.

An additional explanation points to the types of volunteers responding across modes and the proclivity of volunteers to skip questions in the VSS that are deemed irrelevant to their particular involvement with the Society. So, for example, a significant portion of volunteers responding via paper did not answer the question concerning their preference for e-mail messages/newsletters, possibly because they were so disinclined toward the e-mail option they simply skipped the question.

Open-ended comments by mode of completion

Volunteers responding to the Web survey not only answered more questions (i.e., demonstrated a lower item non-response) but also were significantly more likely to respond to the open-ended question asking for their suggestions to improve volunteer satisfaction. Figure 4 shows that 65 percent of the volunteers in the Web mode answered this question compared to 52 percent of the volunteers in the paper mode. The sentiment of these comments (i.e., positive vs. negative vs. neutral remarks) did not differ greatly across modes; however, the length of these comments varied dramatically by mode. The average word count of comments made by volunteers responding to the online survey was 13 times greater than the word count among volunteers responding on paper - 268 words per comment vs. 20 words per comment, respectively. This result is in concert with other researchers’ work in this area (MacElroy, Mikucki and McDowell, 2002; Grigorian and Sederstrom, 2005) and reveals a real advantage of the Web mode as a source of rich feedback from volunteers, at least in terms of the sheer quantity of response.

There is also some indication that the quality of the open-end comments in the Web mode may be superior to that in the paper mode. A cursory analysis of comments in both modes suggests that comments from Web respondents are more detailed (e.g., references to specific examples or names) and tend to be more constructive (i.e., offer suggestions for improvement) than comments from the paper questionnaire.

Interestingly, however, the readability scores, based on the Flesch-Kincaid Grade Level analysis, from the paper-survey comments indicate an 8th-grade level while comments from the online respondents read at the 7th-grade level. Whether this is a function of the younger age of Web respondents or the informal (even sloppy) writing style many e-mail users have adopted or something else is left for further research.
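For reference, the Flesch-Kincaid Grade Level cited above is a simple formula over word, sentence and syllable counts. The sketch below is a minimal Python illustration, assuming a crude vowel-group syllable counter; production readability tools (including the one presumably used in this study) count syllables far more carefully:

```python
import re

def count_syllables(word):
    """Rough syllable estimate: count groups of consecutive vowels."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_kincaid_grade(text):
    """Flesch-Kincaid Grade Level:
    0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
    """
    words = re.findall(r"[A-Za-z']+", text)
    if not words:
        return 0.0
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * (len(words) / sentences) + 11.8 * (syllables / len(words)) - 15.59
```

Short sentences of one-syllable words score near (or below) zero on this scale, while long sentences of polysyllabic words push the grade level up - which is why terse, informal e-mail-style comments would tend to score a grade lower.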

Carefully weigh

In developing and refining the best possible research techniques for exploring volunteer satisfaction, it is incumbent upon researchers to carefully weigh the advantages and disadvantages of each design mode. In this study we tested the two most viable modes of the VSS - paper via USPS and the Web. The results of this study clearly point to strengths and drawbacks of these modes and suggest that the ultimate design may be a combination of the two.

Benefits of the paper mode

There are certain advantages or benefits associated with the traditional paper VSS. First, response to the control test condition shows that a relatively high response is possible in the traditional format: the paper-only group exceeded the response from the Web group, and it would have met or exceeded the return from the option condition had the option sample not received the additional telephone follow-up.

Furthermore, the fact that the option group overwhelmingly chose paper over the Web speaks to the comfort factor volunteers have with the traditional VSS (although this result may partly be an artifact of the option group showing preference for the mode by which they initially received the VSS, i.e., paper). The paper mode also enables certain volunteer segments to respond to the VSS that might otherwise not respond if no paper option were given, especially older, longer-term volunteers, those living in rural communities, cancer survivors and patient-services volunteers.

Benefits of the Web mode

Likewise, the Web mode provides its own benefits to the VSS design. Although rate of response was significantly lower than response in the paper conditions, it did provide a greater share of relatively young, Caucasian, and suburban-dwelling volunteers as well as those involved in fundraising, advocacy and leadership activities. The higher incidence in leadership volunteers is of particular interest because engagement and satisfaction of community leaders is critical to the mobilization of volunteers and community systems accessed through their personal influence and networks.

Furthermore, responses to the Web survey are more complete in that item non-response in this study was actually found to be lower in the online mode than in the paper mode. This includes the single open-end question, which garnered a much higher and richer response compared to the paper survey.

Mixed-mode design

While accepting the fact that gaps exist between paper and Web representation - age, economic status, people with disabilities, rural residents, race and ethnicity - it is equally true that these gaps are slowly closing. According to Dillman et al. (2001), a mixed-mode design will “increase response rates substantially,” and we believe that our results support a mixed mode (without an option) as the preferred design for research with community-level volunteers.

The lower initial response from the option group in our study suggests that the mixed-mode strategy should not be one of choice but a single mode for each of two samples. It appears that offering volunteers a choice between paper and the Web actually slowed their response, leaving us to speculate that the choice, in and of itself, communicated the notion that volunteers could respond at their leisure.

Increasingly going online

The increase in e-commerce and e-giving has also witnessed its corollary in e-volunteerism. Virtual volunteerism, volunteer-focused Web sites and blogs, e-learning tools and volunteer matching services are emerging daily. Nonprofit knowledge management and relationship management are increasingly going online because technology eases the interactions and exchanges of services and value.

E-advocacy is a growing strength within many organizations, with volunteer advocates affecting policy change at the local, state and national levels. Youth and college students are organizing online communities to mobilize around special events in real space as well as recruiting volunteers and holding virtual events in gaming communities to raise money and awareness. Cancer survivors are actively seeking support from other survivors through personal blogs as well as in online communities.

All this online activity would suggest an overwhelming acceptance and readiness for a relatively non-invasive online satisfaction survey. However, the growing popularity of the Web does not translate into a majority preference in our overall current pool of community volunteers. The temptation to jump headlong into a Web-based survey of volunteers must be tempered by consideration of volunteers’ preferences.

This article calls on researchers in the nonprofit sector to resist the urge to assimilate to the “new normal” simply because everyone is doing it and to return to the basics of good survey methodology. Choose your delivery mode based on considerations of response rate, data quality and data collection costs, and on the likely receptivity of your participant volunteer group at this time. Know your volunteer audience, and design accordingly. When it comes to volunteer satisfaction, one modality does not satisfy all.


References

Carini, R., Hayek, J., Kuh, G., Kennedy, J., and Ouimet, J. (2003). “College Student Responses to Web and Paper Surveys: Does Mode Matter?” Research in Higher Education, 44, 1-19.

Dillman, D., Phelps, G., Tortora, R., Swift, K., Kohrell, J., and Berck, J. (2001). “Response Rate and Measurement Differences in Mixed-Mode Surveys Using Mail, Telephone, Interactive Voice Response, and the Internet.” Draft paper.

Gesell, S.B., Drain, M., Clark, P.A., Sullivan, M.P. (2007). “Test of a Web and Paper Employee Satisfaction Survey: Comparison of Respondents and Non-Respondents.” International Journal of Internet Science, 2, 1, pp. 45-58.

Grigorian, K., and Sederstrom, S. (2005, May). “Qualitative Comparison of Paper and Online Self-Administered Modes.” Paper presented at the American Association for Public Opinion Research, Annual Conference, Miami.

Kaplowitz, M.D., Hadlock, T.D., and Levine, R. (2004). “A Comparison of Web and Mail Survey Response Rates.” Public Opinion Quarterly, 68, 94-101.

MacElroy, B., Mikucki, J., and McDowell, P. (2002). “A Comparison of Quality in Open-End Responses and Response Rates Between Web-based and Paper and Pencil Survey Modes.” Journal of Online Research. Retrieved July 29, 2005, from www.ijor.org/eval.asp?pID=1.

Manfreda, K.L. and Vehovar, V. (2005, May). “Comparison of Response Rates in Web Surveys Compared to Other Survey Modes.” Paper presented at the American Association for Public Opinion Research, Annual Conference, Miami.

Manfreda, K. L., Bosnjak, M., Berzelak, J., Haas, I., and Vehovar, V. (2008). “Web Surveys Versus Other Survey Modes: A Meta-Analysis Comparing Response Rates.” International Journal of Market Research, 50(1), 79-104.

Pew Internet & American Life Project, August 12-31, 2008 Tracking Study. Note: The Project bears no responsibility for the interpretations presented or conclusions reached based on analysis of the data.

Roller, M.R. and Blais, L.M. (2005). [Analysis of the American Cancer Society Volunteer Satisfaction Study National Summary tabulations]. Unpublished raw data.