Going online

Editor’s note: Hal Spielman is CEO of MSW Group (formerly McCollum Spielman Worldwide), a Great Neck, N.Y., research firm. Art Klein is vice president/co-director of the MSW Interactive division.

For almost three years MSW Group has been actively experimenting with the use of the Internet for its communication research services. Our objective was to investigate and understand the strengths and weaknesses of the Internet so that our clients could take advantage of this exciting approach to gathering information from targeted consumers.

MSW Group’s heritage is in communication research of all media, but researching the effectiveness of television commercials is a major part of our activity. In spite of advances in technology, computers currently do not function like television sets (due in part to bandwidth constraints); thus utilizing this medium for research is extremely difficult. The main problem is that video files are huge; in order to download video files, the material has to be compressed to the size of a small window, which results in a loss of resolution. The same size and resolution problems exist when video is streamed in real time to a computer. These video issues were particularly problematic for MSW Group since within our system, commercials are tested within program context. Prior to the introduction of our AD*VANTAGE/ACT Online, low-resolution video material was streamed or downloaded to the respondent’s computer and appeared in tiny video windows. One objective was to overcome this problem and deliver full-screen, full-motion TV-quality material of whatever length of time was required to simulate true TV program viewing.

We chose to focus on adapting our AD*VANTAGE/ACT copy research procedure, not only because of its complexity and the software demands of combining video with questions, but also because of the large body of pre-existing validation studies behind its technique and measures.

The complexity of this service stems from an approach that requires multiple exposures of the advertising. Unaided awareness measures are taken after the first exposure in a program, persuasion is measured both before any exposure and after the second exposure, and then extensive diagnostics are taken after the third exposure. In addition, there is the optional opportunity to execute a scene-to-scene analysis with detailed verbatims.

Even the persuasion measures (“Consumer Commitment Persuasion”) - actually a pre- and post-exposure sequence of brand and frequency-of-use questions - require full-screen, full-color display of the competitive set of brands. This is an important refinement in the accuracy of our procedures, since consumers frequently make their purchases based on recognition of the package shape and the color of the contents (e.g., “I buy the green stuff.”).
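The pre/post logic of a persuasion measure like this can be sketched as a simple share-shift calculation. The function below is an illustrative assumption, not MSW’s actual scoring formula; the brand names are hypothetical:

```python
def persuasion_shift(pre_choices, post_choices, test_brand):
    """Percentage-point shift toward the test brand from the pre-exposure
    brand selection to the post-exposure selection (hypothetical metric)."""
    pre_share = sum(1 for c in pre_choices if c == test_brand) / len(pre_choices)
    post_share = sum(1 for c in post_choices if c == test_brand) / len(post_choices)
    return 100 * (post_share - pre_share)

# Four respondents' brand selections before and after exposure
pre = ["BrandA", "BrandB", "BrandA", "BrandC"]
post = ["BrandA", "BrandA", "BrandA", "BrandC"]
print(persuasion_shift(pre, post, "BrandA"))  # 25.0
```

The point of the full-screen package display is that each element of `pre_choices` and `post_choices` is chosen from a realistic visual shelf set rather than a text list.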

Thousands of online, experimental interviews were conducted using a variety of Internet service providers, online panel companies, survey software companies and multiple Internet “field services.” In the course of experiments we came to understand the various advantages and shortcomings of each of these alternatives.

The results of these extensive experiences led us to what we have come to refer to as the four S’s of online research - sample, software, security and service. Let’s look at each one separately.

Sample

To most professional researchers this seems an issue of obvious importance, but non-professionals often overlook it. Bodies filling out a questionnaire do not a “sample” make. We have all heard the horror stories of product managers fielding studies on their own because “they know what they want and can do it quickly.” Of course, they frequently fail to provide sample guidance to the field service, and then are stuck trying to understand and make management decisions on whatever data has been dumped on them by inappropriate respondents.

It quickly becomes apparent that getting quality target samples of consumers who have not been involved in ad research studies for at least one year requires an extraordinarily large base of online households. In general, existing online panels could not meet our volume needs or provide consistent targeted respondents necessary to match samples from test to test or from wave to wave. We found that the only way for us to perform the type of research we conduct online was for us to find a partner that had the ability to sample the entire Internet.

Do differences in the online sample exist when compared to census data? Yes, they do. In general, online respondents skew slightly higher in income and education and slightly lower in age (though these skews are changing rapidly). However, it is extraordinarily rare that a request is made for a test sample to reflect the census. It is almost always a target sample of that product’s consumers that must be delivered - and consistently for each test. Our comparative studies showed how close we actually were to today’s primary source of interviews: the mall.

Importantly, the real issue is: Can we produce a targeted sample for our clients in any research venue? The answer is yes, and we can do it online.

Software

MSW has for many years used software specifically designed and developed for our offline videotape and touch-screen research systems. However, we had to go further in the technological control of the interview process and the means of exposing the stimuli. We wanted full graphics and multimedia integration - full-screen, full-motion, TV-quality viewing that could carry up to an hour of TV programming into the respondent’s home and simulate the actual on-air viewing experience. Moreover, the viewing and questionnaire had to provide a fully interactive experience for the respondent on whatever questionnaire design we, or our clients, devised.

For example, one option our clients requested was a scene-to-scene analysis. We wanted respondents to rate each scene in a commercial and then have their strongest- and weakest-rated scenes replayed so that we could capture verbatim reasons for those ratings. (Note: This is a very simplified description.) The software had to be sufficiently sensitive to sort these ratings and pull up the appropriate scene for the respondent’s comments.
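The sorting step described above can be sketched in a few lines. This is a minimal illustration under assumed data shapes (a simple mapping of scene number to rating); the real system also handles scene timing and video playback:

```python
def scenes_to_replay(ratings):
    """Pick the strongest- and weakest-rated scenes for verbatim follow-up.

    ratings: dict mapping scene number -> the respondent's numeric rating.
    Returns (strongest_scene, weakest_scene).
    """
    strongest = max(ratings, key=ratings.get)
    weakest = min(ratings, key=ratings.get)
    return strongest, weakest

# e.g., one respondent's ratings for a five-scene commercial
print(scenes_to_replay({1: 4, 2: 7, 3: 2, 4: 5, 5: 6}))  # (2, 3)
```

The interview software would then cue up scenes 2 and 3 and prompt the respondent for open-ended comments on each.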

This same software ability was required to execute our three-step brand selection process from the package visual display in our Consumer Commitment Persuasion measure.

Through this hands-on experimental effort we were able to find and adapt software that allows for the above and permits a virtually unlimited number of simultaneous interviews, so that all studies can be executed quickly and no study need ever be turned away for lack of facilities.

Security

This issue really has more facets than a diamond. We were clearly concerned about the security of respondents. The need for anonymity has been much in the headlines these days, and many states have legislation pending on this issue. (CASRO and CMOR are actively attempting to protect the research industry from destructive legislative infringement.)

Further, to prevent ballot stuffing and to keep respondents from qualifying themselves for studies of their own choosing, we employ a rather complex multi-stage screening procedure. Each screener brings us closer to selecting and interviewing the target consumer.
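As a rough illustration, a multi-stage screener applies each qualification gate in turn and drops the respondent at the first failure. The specific criteria below are hypothetical examples, not MSW’s actual screening questions:

```python
def passes_screeners(respondent, target_ages):
    """Apply screening gates in sequence, failing fast on the first miss.
    The gate criteria are hypothetical illustrations."""
    gates = [
        # No ad-research participation within the past year
        lambda r: r.get("months_since_last_ad_study", 0) >= 12,
        # Falls within the target demographic
        lambda r: r.get("age_group") in target_ages,
        # Uses the product category under study
        lambda r: r.get("category_user", False),
    ]
    for gate in gates:
        if not gate(respondent):
            return False
    return True
```

Because each gate is evaluated in order, later (and more revealing) questions are never asked of respondents who have already failed an earlier screen, which also helps keep the study’s subject hidden from non-qualifiers.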

Still another facet of security extends to the material being studied. We addressed this with a patented encryption system applied to a home-delivered CD-ROM, which can only be brought to life through the online survey software. Once used, this CD (which cannot be copied, saved, printed, or cached on the respondent’s computer) becomes a useless coaster. This procedure makes it virtually impossible for a competitor to see test material. Respondents do not know in advance what product or category is the subject of the study, and the CD-ROM itself can only be activated by participation online. (Note: Again, this is a very great simplification of the test mechanics.)

Service

Our mantra has always been “It’s not the numbers, it’s what they mean.” Simply getting data more quickly or less expensively is useless unless it is quality data and is translated into useful and actionable information. Our executives have been going through training on the strengths (and, yes, the weaknesses) of Internet research. Important as this venue may be, it is not right for all studies or samples and certainly has its own design demands. Our account staff have learned to supply each client study with the most efficient recommendation for study design, venue and constructive analytics derived from our combination of evaluative and diagnostic measures, and years of experience.

Certainly in the general sense, it is possible to access Internet users quickly and cheaply. “Down and dirty” has always been available as an excuse for getting some kind of “research” on the table. But is management willing to risk major decisions (or even minor ones) that affect the sale of their brands on cheap, questionable-quality research? Clearly, the position taken by Vince Vaccarelli, the research director of Xerox - that the value of research should be weighed against the size of the issue it affects - is a very meaningful one. It is a challenge that all research directors must deal with. Quality information leads to quality management decisions. If the funding or time made available is inadequate to do a quality piece of research, you might as well flip a coin and save the money.