Preventing speeders and cheaters

Editor’s note: Debbie Peternana is president of ReRez, a Dallas research firm. Keith Strassberg is executive vice president of Universal Survey Center, New York.

As providers of data collection services to our clients, we view Web survey validation as an important topic. Web data collection providers must ensure the validity of the data they collect by verifying their panel members, but they must also use well-written screeners and sound processes to do so. Many data collection companies have recognized the need to ensure the quality of their data and have implemented a validation process, but many have not, leaving the responsibility solely with the client.

It is imperative to question the data collection companies you use and ask them to explain their data validation process. If you are satisfied with the process, there may be no need to add further steps. If you aren’t, the suggestions below can be incorporated into your project management and will save you time and difficulty in the end.

We do not wish to insinuate that respondents are fraudulently completing surveys, but a small percentage may alter their responses to gain access. That makes the screener questions all the more important in keeping out those who are not qualified.

Since respondents do not know (or, should we say, “should not know”) what qualifications will grant them access, it is difficult for them to fabricate responses. Unfortunately, some data collection companies give away far too much information when sending an invitation to their panel members. Other respondents are not completing studies fraudulently but may rush, misunderstand the directions or simply not pay attention. These types of errors are often a cause of invalid data as well.

Ask to see the invitation the respondents will receive and verify the following:

  • There is no indication of the topic in the e-mail subject line or the invitation. Respondents should not know the subject of the survey when they choose to respond. Knowing the topic can lead panel members to opt out simply because they do not like the topic, and in some cases it gives away information that could help them gain access. For example, if the invitation mentions “snow skiing” and one of the screener questions asks “Which of the following sports do you participate in…” with snow skiing as a choice, panel members may select it because they believe doing so will get them into the survey.
  • Make sure the invitation includes the length of the survey, the end date if there is one and the incentive. The incentive should be enough to motivate the respondent to participate but not so large that the respondent is tempted to access the survey more than once to collect it.
  • Ask about the method of accessing the survey and make sure that the survey is password-protected. This can be done a number of ways, and each company may do it slightly differently. Normally a link is included in the invitation with a unique identifying number appended to it. This unique number can be tied to the profile information the panel member provided when he or she signed up for the panel. It allows the respondent one-time access to the survey, unless the survey has been designed to let a respondent return and finish it later, an option usually offered only when a survey is exceptionally long. When the respondent returns, they begin where they left off; the program does not allow them to go back and make changes. Additional safeguards include real-time automated checks and back-end manual checks, described below. A minimal sketch of how such unique links might work follows.
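How unique links are generated and redeemed varies by vendor. As a minimal sketch, assuming a simple in-memory token store keyed by panelist ID (the function names and URL are illustrative, not any vendor’s API), the one-time-access idea might look like this in Python:

```python
import secrets

# Hypothetical in-memory stores; a real system would use a database.
tokens = {}   # token -> panelist ID
used = set()  # tokens that have already entered the survey

SURVEY_URL = "https://example.com/survey"  # placeholder URL, not a real survey host

def issue_link(panelist_id):
    """Create an unguessable token for one panelist and append it to the survey link."""
    token = secrets.token_urlsafe(16)
    tokens[token] = panelist_id
    return f"{SURVEY_URL}?id={token}"

def redeem_token(token, allow_resume=False):
    """Admit only known tokens, and only once unless the survey allows resuming."""
    if token not in tokens:
        return False                        # unknown or forged token
    if token in used and not allow_resume:
        return False                        # already used: blocks a second completion
    used.add(token)
    return True

link = issue_link("panelist-001")
token = link.split("id=")[1]
print(redeem_token(token))  # True on first entry
print(redeem_token(token))  # False on a second attempt
```

The key points are that the token is unguessable, tied to a known panelist and rejected on a second completion attempt unless the survey was designed to allow resuming.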
  • Real-time automated checks. At the beginning of each Web survey, include the following introduction:

“To ensure that this survey is not being completed by a computer or a professional survey taker, we have included easy-to-answer validation questions. The questions will be readily apparent to you. Please follow the instructions to answer these few questions. If you do not answer these questions correctly, we will not be able to include your opinions and your survey will not count.

“In addition, we expect it will take time to read and answer all of the questions properly. Please take your time to consider each question and answer it honestly. If you race through and complete the survey in less than an acceptable time, based on the length you were told it would take, we will not be able to include your opinions and your survey will not count.

“Thank you in advance for participating in our survey. Your honest opinion counts.”

Doing the above lets respondents know that their answers will be verified and that they will be answering questions that could affect their participation and whether they receive their incentive.

  • Time test. Work with your data collection provider and the final survey instrument to estimate the length of the survey and of its key sections. Web pre-testing and monitoring tools should allow you to determine these estimates.

Then, during programming, if your programming tool allows, we suggest including timers in the survey. If a respondent completes a section in less than half the estimated time, they receive a speed warning: “So far, you have spent less than half the time it has taken others to reach this point in the survey. Please ensure that you are reading all of the questions and giving us your honest answers.” All surveys (and sections) completed in less than half of the estimated time will be flagged, and individual responses will be manually reviewed for accuracy. Surveys will be discarded if they don’t pass validation.
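How the timer is implemented depends on the survey platform, but the flagging rule itself is simple. As a rough sketch (the section names and estimates below are made up for illustration), the half-time check might look like this:

```python
# Hypothetical per-section time estimates in seconds, drawn from pre-testing.
ESTIMATED_SECONDS = {"screener": 60, "section_a": 300, "section_b": 240}

def speed_flags(section_times):
    """Return the sections a respondent finished in less than half the estimated time."""
    flags = {}
    for section, estimate in ESTIMATED_SECONDS.items():
        actual = section_times.get(section)
        if actual is not None and actual < estimate / 2:
            flags[section] = actual
    return flags

# Example respondent who rushed section_a; that section would be flagged for review.
respondent = {"screener": 55, "section_a": 90, "section_b": 250}
print(speed_flags(respondent))  # {'section_a': 90}
```

The same rule can be applied to the survey as a whole by comparing total time against the total estimate.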

  • Validation questions. For every 25 questions, randomly insert one validation question, and make sure every survey includes at least three. If a respondent fails two of the three validation questions, the survey is immediately discarded; if a respondent fails one of the three, the survey is flagged and manually reviewed for validation. A sketch of how these checks might be scored automatically appears after the examples below.
— Validation question type #1 - single-punch
To ensure quality, please (enter the number X/check choice X) below.
ENTER X
(letter or number is randomized for each survey participant)

— Validation question type #2 - multi-punch
To ensure quality, please select choices X, X, and X below.
SELECT CHOICES X, X, and X
(X) (X) (X) (X) (X)
(letters or numbers are randomized for each survey participant)

— Validation question type #3 - logic checks
(Some data collection partners will work with you to determine certain logic checks.)
How many hours per week do you watch TV?
Enter number of hours.
Of the X hours you watch TV each week, how many do you watch in the morning, that is, between 6 a.m. and 10 a.m.?
Enter number of hours.
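If your data collection partner can export responses with the validation fields identified, the pass/fail rules above can be checked automatically. A minimal sketch, using hypothetical answers and field values, might score the three question types and apply the discard-or-flag thresholds like this:

```python
def check_single_punch(answer, expected):
    """Type #1: the respondent was told to select one specific choice."""
    return answer == expected

def check_multi_punch(answers, expected):
    """Type #2: the respondent was told to select an exact set of choices."""
    return set(answers) == set(expected)

def check_tv_logic(total_hours, morning_hours):
    """Type #3: morning viewing cannot exceed total weekly viewing."""
    return 0 <= morning_hours <= total_hours

def disposition(results):
    """Discard on two or more failures; flag for manual review on exactly one."""
    failures = sum(1 for passed in results if not passed)
    if failures >= 2:
        return "discard"
    if failures == 1:
        return "flag_for_review"
    return "accept"

# Example respondent; the answers and expected values are illustrative only.
results = [
    check_single_punch("C", expected="C"),
    check_multi_punch(["A", "D"], expected=["A", "B", "D"]),
    check_tv_logic(total_hours=10, morning_hours=3),
]
print(disposition(results))  # "flag_for_review" because one check failed
```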

  • Back-end manual checks. In addition to manually checking each survey that is flagged for review, review all open-ends each day to make sure they are applicable. Either train individuals in your office to review the open-ends and know what to look for, or speak with your data collection partner to verify that someone on their side is assisting in the verification. Surveys that don’t pass validation will be discarded.
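Manual review goes faster if the most obvious problems are surfaced automatically first. As a sketch, assuming thresholds and record fields of our own choosing rather than anything prescribed above, a daily pass over the exported data might flag empty or single-character open-ends and straight-lined grids for a human to inspect:

```python
def flag_open_end(text):
    """Flag open-ends that are empty, very short or a single repeated character."""
    cleaned = text.strip()
    return len(cleaned) < 3 or len(set(cleaned.lower())) == 1

def flag_straight_line(grid_answers):
    """Flag rating grids where every item received the identical answer."""
    return len(grid_answers) > 3 and len(set(grid_answers)) == 1

# Example daily pass over exported records; the record structure is illustrative.
records = [
    {"id": 1, "open_end": "Good value for the price", "grid": [4, 2, 5, 3, 4]},
    {"id": 2, "open_end": "aaaa", "grid": [3, 3, 3, 3, 3]},
]
for rec in records:
    if flag_open_end(rec["open_end"]) or flag_straight_line(rec["grid"]):
        print(f"Record {rec['id']} flagged for manual review")
```

Flags like these only nominate records for a person to look at; the decision to discard still rests with the reviewer.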

While the processes above can help reduce fraudulent responses, as well as invalid data from respondents who rush or misunderstand the questions, some inconsistencies will likely remain. It is therefore important not only to put these processes in place but also to work with your data collection partners on steps to further reduce those inconsistencies.

In summary, remember to:

1. Speak to your data collection partners and ask them to describe the process they use during a project, as well as the steps they would take if a question about the validity of the data arose after the completion of a study.

2. Review the screener questions and questionnaire to ensure the qualifications are what you originally intended and that the screener questions are strong enough to keep out respondents who do not qualify. The goal of the screener is not to trick the respondent but to determine whether they are qualified to participate in the survey.

3. Review the invitation being sent to ensure it makes no mention of the topic but does include the length, the end date (if applicable) and a unique ID that allows only one entry into the survey.

4. Include the statements listed above, which let respondents know that the time it takes them to complete the survey will be monitored and that validation questions will be included to guarantee validity.

5. Review your data daily and check the open-ends for relevance. Also look for straight-line responses, examine completion times and run any other logic tests that would verify data validity.

6. Keep in mind that a small percentage of responses may not stand up to the rigorous validation process you put in place. Work with a data collection partner that will allow you to collect extra completed interviews to make up for any that are discarded.