Report from Portland
Editor’s note: Jeff W. Totten is assistant professor of business administration at Bemidji State University, Bemidji, Minn.
In May, on the 20th anniversary of its eruption, I was treated to a bird’s-eye view of Washington’s Mt. St. Helens as we approached Portland International Airport. I was headed to my first American Association for Public Opinion Research (AAPOR) conference, held at the Doubletree Jantzen Beach hotel in Portland. In addition to presenting some research I had done, I was to serve as a roving correspondent for Quirk’s.
All in all, there were nine sessions devoted to Internet research activities over three days (May 19-21), in addition to sessions on research methodology and other research issues. I attended six of the sessions in their entirety. I’ll briefly summarize the papers presented at these sessions in the space below.
In the first Friday session, Diana Pollich and Jo Holz of Roper Starch Worldwide reported on Roper’s second cyberstudy of online and Internet usage. The study was based on a telephone sample of 1,009 people who had accessed the Internet in July 1999. The online population grew from 45 million in 1998 to 63 million in 1999. The researchers found that Internet users are becoming more representative of the U.S. population as more people with less education and/or lower incomes go online, along with greater numbers of women and older people. They also observed a tenure effect: the longer people have been online, the more likely they are to engage in online activities such as buying and gathering information.
K. Viswanath from Ohio State University discussed research on the adoption and diffusion of new technologies using data collected from monthly Buckeye State Polls (telephone surveys) over the last three years. Social class and geographic factors were found to have an effect on the adoption and diffusion of computers and the Internet.
Lars Willnat from George Washington University reported on a study of mass media and Internet usage among young Americans (ages 16 to 24). Television and newspapers still ranked high as sources of news for young Americans. Males spent an average of 86.84 minutes online versus 71.8 minutes for females. Asian-Americans and Caucasians spent more time online than did African-Americans and Hispanics.
I presented findings from my survey of U.S. marketing research firms about their use of e-mail and the Internet as data collection methods. Approximately 40 firms indicated that the use of Web-based Internet surveys had a bright future, once the industry solved the problems of sampling and security. The future for e-mail surveys was mixed. Problems encountered with both methods were also identified, along with advantages and disadvantages of each.
Tara McLaughlin of Cyber Dialogue started the second Friday morning session by discussing guidelines for ethically collecting data from Web site visitors. These included making sure the client provides privacy policies for visitors to access or link to, collecting only the data absolutely necessary to the research, and reporting only aggregate findings to the client.
Julie Schmidt of Greenfield Online addressed measures to take in order to protect a firm’s online research panel. In light of refusal rates climbing to 60 percent, the Internet offers a new opportunity to reach the public. Schmidt talked about establishing clear communications with panel members and urged the adoption of a Digital Consumer Bill of Rights, which offers guarantees about data accuracy, security and privacy.
Doug Rivers of InterSurvey, Inc., first discussed six barriers to online research: 1) the digital divide (only 55 percent of U.S. households have Internet access), 2) the last mile (slow connections), 3) the installed base (lowest-common-denominator hardware and software), 4) the daily dose (people must be online on a daily basis), 5) the sampling problem (no random-digit dialing equivalent, self-selection bias, coverage bias), and 6) the research industry itself (misrepresenting survey length, problems caused by telemarketing).
Two Friday afternoon panel sessions focused on the Internet’s impact on society. The early afternoon panel addressed such topics as the digital divide, access and inequality, and the political use of the Internet. The late afternoon panel looked at how the Internet was changing social and cultural aspects of American society. A separate session looked at Internet studies done on physicians, business managers, and recent science and engineering school graduates.
The first Saturday morning session was devoted to Internet probability surveys conducted by InterSurvey, Inc. The six presentations built upon the Friday presentation by Rivers. The company recruits panel members through a random-digit dialing phone sample. Letters are sent to recruits, who formally sign up by responding by mail. InterSurvey then provides and pays for Internet access and WebTV boxes. E-mail messages are sent to the households via the WebTV boxes, which download the survey and light an indicator on the box. The panel member then responds within a week by turning on the television set, clicking on the Web and mail icons, and then clicking the start button; the questionnaire appears on the TV screen question by question.
Karol Krotki and Mike Dennis discussed questionnaire design, sampling and weighting issues. William McCready and Robert Tortora compared CATI with InterSurvey’s interactive TV. Anna Greenberg and Michael Bocian addressed the problem of “don’t knows” with this new Web-based method. Vincent Price discussed initial findings from a series of online electronic dialogue groups that discuss a variety of topics monthly. Finally, Kathleen Frankovic of CBS News discussed the network’s use of InterSurvey’s method to conduct an instantaneous poll of people’s responses to President Clinton’s State of the Union address earlier this year; the poll achieved a 53 percent response rate.
The late Saturday morning session focused on the format and design of Internet surveys. Dennis Bowker of Washington State University reported on an experimental study of question alignment on the Web page. A snowball sample of 684 students was used, with 350 assigned to a right-aligned version and 334 to a left-aligned one. Right alignment requires less mouse movement, since the mouse pointer and the scroll bar are both on the right side of the screen. The left alignment resulted in a higher rate of item nonresponse (the speculation being that the extra mouse and scroll-bar navigation is to blame). There were no significant differences between the alignment formats with regard to respondent satisfaction or confusion.
Katja Manfreda of the University of Ljubljana (Slovenia) reported on the findings of two Slovenian studies, conducted in 1996 and 1998, regarding various design features. A scroll-based design (where the respondent must scroll down to see all the questions) required less time to complete the survey; however, it produced higher item nonresponse. The studies also looked at survey length, the use of logotypes, the assignment of topics to be answered and the use of instructions.
Neli Esipova of the University of Wisconsin discussed the strengths and weaknesses of online focus groups versus face-to-face and phone groups. Participation rates were highest for face-to-face groups (86 percent of those who agreed to participate actually did), followed by online (78 percent) and phone (74 percent) groups. Online focus groups required more discussion time (98 minutes versus 77 for phone and 87 for face-to-face). Phone group participants were the most likely to switch to either of the other methods in the future.
Scott Crawford of Market Strategies reported on research examining perceptions of burden and their impact on nonresponse. Progress indicators on a Web-based Internet survey helped reduce break-off nonresponse (where a respondent starts a survey, then stops and doesn’t finish). Using an automatic password also reduced nonresponse.
Presenters in the Saturday afternoon session discussed suggestions for improving response rates to Internet surveys. Curt Dommeyer of California State University-Northridge studied how response rates to e-mail surveys could be improved. A random sample of 300 students received a survey on binge drinking, with half getting the survey as an attachment to the e-mail message and the other half getting it embedded in the message itself. The embedded survey produced better results in general, including a 37 percent response rate versus 8 percent for the attachment. Obstacles to attached surveys include software limitations, knowledge limitations, time limitations and fear of viruses.
Richard Clark of the University of Connecticut Center for Survey Research identified several advantages of Internet surveys: they are relatively cheap, very convenient for respondents, and faster at collecting data, and visuals can be added. Disadvantages included difficulty drawing a sample frame, weak coverage, page design problems, technical difficulties, and low response rates. One recommendation from the study was that e-mail reminder messages are more effective than telephone reminder messages.
Michael Bosnjak of ZUMA’s Center for Survey Research reported on participation in a non-restricted Web survey in Germany, where participants can proceed without having to answer any given question. Ajzen’s theory of planned behavior was then used in a model to explain and predict the different types of nonresponse.
Sandra Bauman and Jennifer Airey of Wirthlin Worldwide finished the session by discussing how to gain respondents’ participation in Web surveys. They described three stages of cooperation: the invitation (hit rates), the introduction (the call to action), and the incentive (cooperation). The presenters then discussed various methods to use at each of these stages.
The final Internet-related session, held on Sunday morning, compared Internet results to other interview modes. Nojin Kwak of the University of Wisconsin tested mail and Web/e-mail surveys for response rate, speed and data quality. A random sample of 1,000 produced an overall response rate of 33 percent. The response rate was higher for the mail survey (41.9 percent vs. 27.4 percent), but mail was slower (9 days vs. 2.2 days for the Web). Women were more likely to respond to the mail survey, and item nonresponse was higher on the mail survey. Kwak recommended multi-mode surveys (e.g., printing a URL on the mail survey).
Thomas Guterbock and others from the University of Virginia surveyed computer users on campus about computer usage. Twenty-five percent of the sample was asked to answer the Web version of the 20-page booklet, while the rest received the paper version. E-mail and mail reminders were used for both groups. Response rates were 36.8 percent for the Web and 47.8 percent for mail.
Carl Ramirez of the U.S. General Accounting Office (GAO) conducted a survey of 3,200 GAO employees in the fall of 1999. Employees could select which mode to use (Web or paper). The overall response rate was 89 percent, and 87 percent of those responding did so using the Web. He looked at open-ended item response rates, the volume of open-ended narrative, item nonresponse on fixed-choice questions, and average ratings on scale questions.
Timothy Elig of the Defense Manpower Data Center sampled military members, spouses, and civilian employees. Three modes were used: mail survey only, mail survey with a Web option, and Web survey with a mail option. Due to delays in completing the study, only preliminary results were reported.
John Kennedy of Indiana University reported on a fall pilot study, involving over 29,000 undergraduate students at several schools, of what students get out of college. The response rate for the Web-only option was 38.5 percent, versus 43 percent for the paper survey with a Web option. A major study of over 200,000 students at over 250 schools will be conducted this year. Results are available at http://www.indiana.edu/~csr.
Rachel Askew and Peyton Craighill of Rutgers University compared a newspaper telephone poll in January 1999 (n = 587) with four short online polls using the same questions during January and February 1999 (n = 130 to 197 each). Online respondents had more opinions and were more skeptical and critical.
More information about these presentations is available from the author or the AAPOR Web site (www.aapor.org).