Editor’s note: Steve Wygant works in the assessment office of Brigham Young University in Provo, Utah, teaches research methods at BYU, and consults on Web/e-mail research methodologies for clients of Western Wats Center, a Provo-based data collection and processing company. Ron Lindorf is president of Western Wats Center.

When Yahoo! released its list of Most-Wired Colleges in May 1999, one university in Utah landed near the top of the national list. Brigham Young University (BYU) shines as an example of network infrastructure: the majority of its dorm rooms are wired, and two-thirds of its on-campus computers are available 24 hours a day to its nearly 30,000 students. Unprecedented access to this highly wired and relatively stable population of 18-24-year-olds allowed researchers to investigate some of the most hotly debated topics in the emerging arena of Internet research. What they found about response rates, turnaround times and data validity might surprise you.

There is no question that data collection via the Internet has been gaining popularity, yet no one seems completely sure what to make of data gathered using the Web. Harris Black and a handful of other industry leaders have staked their future on Web surveying, but other market researchers are more cautious in their approach to Web surveying methodology. It isn't the degree of difficulty - gathering data via the Internet is technologically easier than traditional methods. The most heated debate centers on validity - are data gathered via Web sites as valid as data gathered through more traditional means? Are self-selection biases and other sampling errors introduced by collecting data in a self-administered, electronic venue? Can the projectability of data gathered this way be trusted?

Last October, researchers in BYU's assessment office set out to test some of these issues. By conducting a split-method survey of the school's on-campus residents, they were able to test the impact of two different methods of data collection on self-selection and sampling bias. Identical questionnaires were constructed, one electronic and the other on paper, and 2,600 BYU dormitory residents were randomly split between the electronic and mail administrations. All potential respondents had both private university-provided post office boxes and e-mail accounts. Respondents in the electronic sub-sample received an e-mail invitation containing a hyperlink that linked directly to the Web site hosting the questionnaire. Respondents in the paper-pencil sub-sample received the standard printed invitation letter and questionnaire in their campus mailboxes, along with a self-addressed, postage-paid return envelope.

Web versus mail respondents

The demographic data gathered show some interesting differences between the two respondent groups (see Table 1). Both samples included a greater proportion of females, reflecting the proportions found in campus housing in general. However, a slightly higher proportion of males returned the survey in the electronic mode than in the paper mode, suggesting that males aged 18-24 are somewhat more likely to respond to a survey electronically. Interestingly, while students who responded via the Web or mail reported similar levels of comfort using computers, students who responded electronically reported using the computer about an hour and a half more per week than their paper-pencil counterparts. This might suggest that heavier computer users are more likely to answer an electronically administered survey. More likely, however, it reflects the greater proportion of male respondents (who reported more hours of computer use) in the Web sample.

Benefits of Internet-administered surveys

On analysis, several definite benefits of Web-administered surveys emerged. Of particular interest are the response rate, the turnaround time and the quantity of open-ended responses.


Table 1—Respondent Demographics

                                             Web      Mail
Percent male                                 39%      34%
Percent female                               61%      66%
Age                                          18.5     18.6
Comfort using computers (1-7 scale)          5.8      5.6
Hours using a computer per week              10.9     9.4


Table 2—Response Rates and Turnaround

                                             Electronic    Mail
N (sent out)                                 1,270         1,299
Returns                                      629           410
Final response rate                          50%           32%
Days until 80% of final returns received     2             22


Table 3—Open-Ended Responses

                                             Electronic    Mail
Responded to all four open-ended items       93%           93%
Average words per open-ended response        31.1          22.1

  • Response rates. As shown in Table 2, a significantly greater portion of respondents completed the questionnaire via the Internet. Of the 1,270 respondents solicited via e-mail, almost 50 percent (629) returned the survey. The numbers were significantly lower for the mail-out group: only 32 percent (410 of 1,299) returned the questionnaire. These findings suggest that Web-based surveys are convenient for the respondent as well as the researcher.
  • Turnaround time. While 80 percent of all returned Web surveys arrived within two days, it took 22 days to reach the same penetration point in the self-administered paper-pencil format! Furthermore, the electronic methodology returned 53 percent more completed questionnaires (629 vs. 410) than the mail method (see Table 2).
  • Beefier open-ends. As shown in Table 3, the same percentage of respondents in both groups (93 percent) answered all four open-ended items on the survey. However, open-ended responses on the Web questionnaire averaged roughly 40 percent more words than those from the paper-pencil group (31.1 vs. 22.1). For researchers who constantly battle to get more verbatim information from self-administered questionnaires, Web-based administration may provide this additional benefit.
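The response-rate and turnaround figures above follow directly from the counts published in Table 2. The short Python sketch below reproduces that arithmetic; the variable names are ours, not the study's.

```python
# Published counts from Table 2 of the study
web_sent, web_returns = 1270, 629
mail_sent, mail_returns = 1299, 410

# Response rates: returns divided by questionnaires sent
web_rate = web_returns / web_sent      # about 49.5%, reported as "almost 50 percent"
mail_rate = mail_returns / mail_sent   # about 31.6%, reported as 32 percent

# Relative lift in completed questionnaires for the electronic mode
lift = (web_returns - mail_returns) / mail_returns  # about 53 percent more completes

print(f"Web response rate:  {web_rate:.1%}")
print(f"Mail response rate: {mail_rate:.1%}")
print(f"Extra completes via Web: {lift:.0%}")
```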

Checking data validity

Of course, none of the practical benefits of electronic data collection would mean much if the data from an electronically administered survey were of questionable validity. To this end, researchers compared the mean responses from the Web-administered questionnaires with those from the mailed surveys. Comparison of responses to the closed-ended items across the two methodologies revealed nearly identical patterns, and the mode of administration showed no detectable influence on responses. Therefore, for this population, data from an electronically administered version of the survey have high projectability across the study population.
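A mean-comparison check of this kind is typically done with a two-sample test. The sketch below uses Welch's t statistic with invented item means and standard deviations for illustration - the study does not publish item-level statistics, and only the sample sizes (629 Web, 410 mail) come from the article.

```python
import math

def welch_t(mean1, sd1, n1, mean2, sd2, n2):
    """Welch's t statistic for comparing two independent sample means."""
    standard_error = math.sqrt(sd1**2 / n1 + sd2**2 / n2)
    return (mean1 - mean2) / standard_error

# Hypothetical 1-7 rating item: Web mean 5.2 (sd 1.3), mail mean 5.1 (sd 1.4)
t = welch_t(5.2, 1.3, 629, 5.1, 1.4, 410)
print(f"t = {t:.2f}")  # |t| below about 1.96 means no significant mode effect at the 5% level
```

With these invented figures, t comes out well under 1.96 - the pattern of "nearly identical" means the study reports across its closed-ended items.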

Conclusions for research practitioners

For astute research practitioners, several compelling benefits emerge from using electronic data collection methods with an 18-24-year-old, computer-conversant and educated population.

  • Immediacy: Using this e-mail/Web-based approach, researchers at BYU have been able to provide near real-time data for last-minute information needs and for major policy decisions within just a day or two of a request. Recovering 80 percent of response data within two days could become the client benchmark for data delivery in the near future.
  • Response rates: Response rates for survey research are of course dependent on many more factors than mode of data collection. Instrument length, sample characteristics and interest in topic all affect how willing people are to respond. However, the higher response rate in this matched sample suggests the potential for a powerful benefit from electronic data collection.
  • Cost reduction: Make no mistake, there is substantial initial investment in setting up shop to collect survey data electronically. Software purchase or development, server acquisition and maintenance and technical staffing are all significant expenses. However, over the long run electronic data collection seems more cost effective - in this case, total estimated project costs for the electronic administration were approximately one-sixth the cost of the mail administration.
  • Project execution: E-mail recruitment and Web-based data collection have allowed a small assessment team to cover a lot of ground in a short time. From June 1998 through May 1999, two full-time researchers conducted 27 electronically administered survey projects, from instrument design through data analysis. In that 12-month span, they delivered a combined total of over 50,000 questionnaires to a wired population of 35,000 students, faculty and staff, and completed data analysis and reporting for over 12,000 respondents - all with a staff of two and a minimal budget to match!

For market researchers, this is a comparative study of a narrow but important segment of Web users, measured against an arguably dated yet proven technology - mail surveys. But hasn't the Web in recent years drawn a highly fragmented and predisposed set of people, be they home computer owners, the highly educated, or the slightly more male? Census Bureau and Harris Black figures suggest that the population of Internet users is becoming increasingly similar to the rest of the country. As the demographics of Web users become more truly representative of the American population, smart researchers will take advantage of Web research to provide timely information inexpensively to their clients.

The results of this BYU study show that research practitioners can have increasing confidence in their cyberspace data collection, at least for those target audiences on the Web that can be identified and reached electronically. Coupled with some of the practical advantages to Web research - at least compared to Pony Express mail methods - greater confidence in the validity of data might mean researchers can have their cake and eat it too.