Do you know where your data came from?

Editor’s note: Karl Feld is research manager at D3 Systems Inc., a Vienna, Va., research firm.

Imagine you’re driving a vehicle of unknown manufacture with dials you can’t read down a muddy or dusty dirt track with no name to find a house with no number to make sure your contractor’s employee interviewed the right respondent in a language you don’t speak. You’ve been doing this for days, maybe even weeks. There’s no running water, no electricity, no telephones, no mail service and possibly no food other than what you’ve brought with you.

Welcome to collecting research data from most of the world’s people.

It’s the rare research organization that has its own global, or even single-country, infrastructure designed to support this type of work. There are few firms with enough ongoing work to justify such an infrastructure, and even fewer with the expertise to support it. In most cases, researchers use data collection subcontractors.

While this is not an unusual practice even domestically, what is of note is that on a global level those subcontractors are usually umbrella organizations using country-by-country second-degree subcontractors. In turn, in most countries those second-degree subcontractors are using independent contractors or geographically dispersed local companies for face-to-face data collection in remote locations. In the end, a researcher is often faced with receiving a deliverable that has passed through four tiers of subcontractors, each with its own set of standards and administrative processes.

Two approaches

Researchers can take two approaches to managing their projects and controlling the quality of deliverables received from these subcontractors. They can quality-control from the interviewer level up and assure that the third-degree subcontractor is doing the job right, or they can turn a blind eye to the process and trust the data they get. With these options, lurching down the muddy track in the middle of nowhere becomes a very real scenario for the discerning researcher. The author loves it. Most researchers do not.

Fortunately, this exercise can be avoided by making the right choice of data collection subcontractor from the outset and then following a few simple observational guidelines to make sure your overseas face-to-face project gets done right the first time. Interestingly, the same principles apply to managing all cross-cultural, multilingual face-to-face research contractors, whether in the developed countries of Europe or elsewhere. They just require varying degrees of application.

It all starts with selecting a rock-solid subcontractor. This is partly a product of references, partly a product of price and partly a product of the contracting process. We will assume that the originating research organization has decided against building its own in-country data collection teams to conduct the data collection, though this too is often a viable option and one the author has used.

There are various ways to identify global umbrella organizations or single- or multi-country contracting companies. There are directories that provide lists of companies providing such services. Another, less comprehensive route is simply to ask around. International researchers are a small bunch and are often willing to share war stories1. In the end though, it all comes down to references and the contracting process.

References a researcher trusts are a crucial litmus test of whether a particular subcontractor is one worth considering. There is really no other way to gauge a subcontractor’s probable performance without knowing how they have performed previously. References who do work similar to the proposed project and who spontaneously recommend a firm are the best source.

Requiring subcontractors to provide their own references who can speak to work similar to the proposed research is an important step in the screening process. Calling those references and having in-depth discussions can be extremely revealing. Since one can assume the firm has provided its best possible references, hints of dissatisfaction from these references often are the tip of larger data collection icebergs that other, less favorable clients would talk about.

Expectations are set

The further you get from the developed world, the more important contracting becomes. Contracting is the point in a relationship at which expectations are set, performance metrics and controls defined, and penalties and benefits outlined. Good contract negotiation practices can spotlight poor subcontractors before you get into any entangling commitments or collect poor data. It is important to note that it is what you can learn during the negotiation process that is important, rather than contract enforcement later. In many places enforcement is an expensive and often fruitless exercise due to bureaucratic lethargy and corruption, and the locals know it. Bottom line: Make sure your subcontractor happily and willingly answers your questions and commits to your needs and comfort level in writing before signing away the project.

So what should those questions be? The author has found that the following contracting criteria quickly spotlight subcontractors’ strengths and weaknesses:

  • Require guarantees of data completeness and accuracy be built into price estimates.
  • Require guarantees of quality control reporting, including return of original surveys along with unweighted and weighted data sets.
  • Require guaranteed delivery of respondent contact validation files and cleaning tables.
  • Require intermittent and final disposition reporting with delivery of sample file and individual case dispositions.
  • Require reporting of all postal codes (if available) of points completed against contracted sample frame requirements.
  • Require that realistic timelines be built into the proposal, accommodating dispersed survey completion, recontact, validation and correction.
  • Require full disclosure of hiring, training, retention, incentivization and quality-control procedures.
  • Require delivery of standard interviewer training materials.
  • Explicitly detail and agree to penalties for failure to deliver on spec.
  • Require full disclosure of all subcontractors.
  • Pay as little up front as possible. The U.S. dollar goes a long way to covering costs in most places.
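Several of these requirements, such as intermittent disposition reporting, lend themselves to mechanical checks on the researcher’s side. A minimal sketch in Python, where the disposition codes and the simple completion-rate formula are illustrative assumptions rather than any industry standard:

```python
from collections import Counter

def summarize_dispositions(cases):
    """Tally per-case disposition codes and compute a simple
    completion rate (completes / eligible cases). `cases` is a list
    of (case_id, disposition) pairs; the code values used here are
    illustrative, not a formal standard."""
    tally = Counter(disp for _, disp in cases)
    eligible = sum(tally[d] for d in ("complete", "refusal", "noncontact"))
    rate = tally["complete"] / eligible if eligible else 0.0
    return tally, rate

# Example: an intermittent disposition file from the subcontractor
cases = [("001", "complete"), ("002", "refusal"),
         ("003", "complete"), ("004", "noncontact"),
         ("005", "ineligible")]
tally, rate = summarize_dispositions(cases)
print(tally["complete"], round(rate, 2))  # 2 completes, rate 0.5
```

Tracking these figures during field, rather than after, is what makes intervention possible before the data are final.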

Steps to be taken

Once you’ve settled on a contractor that meets your comfort level and research needs, there are other steps to be taken to protect and enhance the success of your project. Much ink has been spilled on many of these topics, so I include only a brief overview here.

The first is questionnaire design. Questionnaire design in multicultural, multilingual research must use both the proper language and cultural context to elicit the desired responses. Context applies both to the language in the survey and the way it is administered, which is often more important than the questionnaire design itself. People in some cultures better relate to conversational interviewing styles than fixed questionnaire order. Some cultures require sensitive questions to be in a different order than others. In some places, people will only talk in particular settings. The best source of information to provide guidance on these topics will be the in-country subcontractor, who will also usually provide the translation service. Questionnaire translations should always be checked through back translation or retranslation by a different translator, ideally subcontracted independently.

In research conducted by the author in Bosnia-Herzegovina, for example, questionnaires had to be administered in a neutral location not affiliated with either group. As the timing of the research was shortly after the Bosnian conflict, and the context was discussions of reintegration of multi-ethnic communities which had been battle zones, none of the respondents for the three ethnic groups wanted to be seen or accept the risk of meeting researchers on “enemy” ground. As researchers, we could not interview on friendly ground, as the results would be perceived as biased by the other sides. As a result, interviews were always conducted on neutral ground.

Similarly, research involving women in Arab, Muslim countries generally must be conducted under the watchful eye of the responsible male family leader, as social custom requires that women not meet with outsiders without a male present. In Russia, it used to be extremely difficult to get face-to-face interviews inside people’s homes; public places were preferred. In Japan, it is only in private places like the home that face-to-face interviews will capture meaningful data.

In the author’s experience in yesterday’s Russia and Moldova, and in today’s China, respondents asked questions of substance often will refuse to provide meaningful answers without approval from another authority2. This is especially the case when interviewing professionals. Appropriate lag time or pre-approval needs to be factored into timelines and interviewing environment to allow for this phenomenon. Similar conditions applied in research in South Africa and Bosnia-Herzegovina, where security forces had to be co-opted and prepared for interviewing work in order to get local residents’ cooperation3.

Pilot testing

Proper pilot testing by the subcontractor staff is critical for this type of data collection. The questionnaire is being tested for all the usual question wording, order and miscommunication issues controlled for in domestic, English-language face-to-face work. In addition, good pilot testing can identify proposed administration techniques and translation choices which are not culturally compatible. To this end, focusing effort on assuring that subcontractors understand and have a formal and documented process for implementation and gathering feedback from field staff during pilot testing is critical. Don’t assume that they do, as the communication processes for data collection management in many parts of the world are not this formally structured. Lack of telecommunications, transportation and mail service can make this process much more difficult than it at first appears.

Subcontractors should bring the most qualified staff to the pilot testing process. It is important to assure that local ethnicities and language capabilities are matched. Some countries have dozens of local languages whose use is politically charged, often carrying assumptions about which people are more “intelligent” or “intellectually capable.” Subcontractors can map those biases into defining their “best” staff selection for pilot testing. Often each of those populations has completely different cultural contexts and communication norms. Similarly, cognitive capabilities and communication norms can vary widely by education and geography. Be sure to review issues of language, cultural context and cognitive ability with the subcontractor before the pilot test and verify which of their staff you want doing testing with which subpopulations. Then follow up to make sure they are doing it that way. It will save a lot of time, trouble and cost later on.

Many African countries have over a dozen different languages, each politically charged. India has single regions with over 20 different languages. While doing research in Bosnia-Herzegovina the author learned that it was critical that Serb-accented interviewers conduct work with Serbs and Croat-accented interviewers with Croats to elicit honest responses. In some cases where groups were mixed, it was actually helpful for the author to conduct the focus group work himself through interpretation, as respondents could see that the researcher belonged to neither group.

It is important that all parties be aware of and prepared to accept the delays and often increased cost that discovery of culturally incompatible administration techniques during pilot testing can create. Because of poor communications and transportation structure in many places, gathering information, retraining and then retesting can eat up significant amounts of time and be expensive. Planning with the subcontractor for sufficient time and expense to accommodate this contingency is therefore critical.

Problematic element

Sample frame design is perhaps the most problematic element of any mode of research with people living in the conditions described above. The missing information which makes design so difficult often plagues the field interviewer when trying to find designated sample points. “Household” selection techniques (some cultures are not organized this way) must be explicit and simple to follow. Given the difficulties of finding people in agricultural and nomadic situations, clear and simple substitution rules are required, and you must be able to verify that interviewers understand them.

This author was involved in a research study completed in South Africa. The study’s sample frame was to draw from all adults in South Africa. Given that many South African villages lack building addresses, roads and convenient grid layouts, sampling had to be designed using satellite maps to select dwelling units using an interval formula. A similar problem exists in Mexico, where streets are unidentified and houses unnumbered, compounded by walls and servants who keep strangers out4. In Saudi Arabia, there is no officially recognized census of population and there are no elections and therefore no voter registration records or maps of population centers5.
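The interval formula mentioned above amounts to systematic sampling: enumerate the dwelling units (here, from a satellite map), pick a random starting point, then take every k-th unit down the list. A minimal sketch under those assumptions:

```python
import random

def systematic_sample(dwellings, n, seed=None):
    """Select n dwellings by interval (systematic) sampling:
    a random start in [0, k), then every k-th unit thereafter,
    where k = len(dwellings) // n."""
    k = len(dwellings) // n
    rng = random.Random(seed)
    start = rng.randrange(k)
    return [dwellings[start + i * k] for i in range(n)]

# Example: 100 dwellings enumerated from a satellite map, select 10
dwellings = [f"dwelling_{i:03d}" for i in range(100)]
selected = systematic_sample(dwellings, n=10, seed=42)
print(len(selected))  # 10
```

The value of the interval approach in the field is that the rule (“every tenth dwelling from the starting point”) is simple enough for an interviewer to apply and a supervisor to verify without addresses or maps.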

Because of the difficulties surrounding sampling in this type of research, its administration by the subcontractor deserves even more focus than it is given domestically. Project timelines and budgets should be predesigned to accommodate for the difficulties inherent in reaching the sample point and then making proper substitutions. Depending upon transportation, communications, weather, language variance and cultural norms, this can become a significant effort. Discuss with the subcontractor your expectations and encourage accurate and honest feedback and agreement on increased costs and timing. Once you’ve got that feedback, carve it in stone by documenting sampling and substitution processes and reporting in the work order or contract. Then pilot test it. If possible, validate performance via address matching, GIS plotting, map coordinates or postal code.
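Postal-code validation, where codes exist, can be as simple as comparing the codes reported for completed interviews against the contracted sample frame. A sketch in Python; the field layout is a hypothetical assumption:

```python
def check_postal_codes(completed, frame_codes):
    """Flag completed interviews whose reported postal code falls
    outside the contracted sample frame. `completed` is a list of
    (case_id, postal_code) pairs; the layout is illustrative."""
    frame = set(frame_codes)
    return [case_id for case_id, code in completed if code not in frame]

# Example: contracted frame vs. codes reported by the subcontractor
frame_codes = {"71001", "71002", "71003"}
completed = [("c1", "71001"), ("c2", "99999"), ("c3", "71003")]
out_of_frame = check_postal_codes(completed, frame_codes)
print(out_of_frame)  # ['c2']
```

Cases flagged this way are candidates for recontact or revisit before fieldwork closes, while correction is still cheap.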

In the case of the South African research, interviewers needed to stay in village environments for a minimum of two days to make the necessary attempts to reach sample points. Closer to home, our own U.S. Census often has to plan for its Alaskan workers to be stranded for as long as a week in remote villages due to weather conditions. Workers reach these locations by scheduled airline, small “bush charter” craft, river boats and saltwater ferries, small four-wheeled all-terrain vehicles, snow machines, dog sleds and even skis6.

Interviewer training

Interviewer briefing and training processes also deserve some specialized attention. While all the standard issues in domestic face-to-face work apply, there are some other issues to monitor with subcontractors. Ideally, your subcontractor conducts training using native speakers and a fully translated training handbook. This avoids interpretation issues during training and assures interviewers have a written reference you created. Of course, you will have no idea what the trainer is saying or how the manual reads other than what an interpreter or translator tells you! It can be useful to bring your own interpreter to the training sessions to verify that what’s being taught is consistent with your expectations.

The author exercised this practice frequently in Bosnia-Herzegovina, Ukraine, Russia, Kyrgyzstan, Kazakhstan and Moldova. Combining a passing personal command of Romanian and Russian with interpreter support, the author would find that written training direction was often recontextualized by trainers to fit cultural norms, some of which would compromise data collection design. The author usually did not reveal his personal command of the language until the problem surfaced, so that motives could be identified. Sometimes they were driven by the trainer’s political agenda, sometimes by cultural differences.

Subcontracting is prevalent in countries where data collection is widely dispersed, so be on the lookout for it. Requiring disclosure of any subcontracting activity by your contractor, and reporting of certifications and quality standards for those subcontractors, can be important. Similarly, requiring disclosure of data collector incentive/penalization programs and hiring standards, average employee turnover and skill sets, quality control processes and employee skill improvement programs can provide great insight into the probable quality and reliability of your deliverable, and allow for mediation of these factors in advance.

Quality control of fieldwork merits focus. If you’ve implemented the above issues well, this becomes far less burdensome than trying to control for all these factors post facto; it limits your muddy road time. You do need to assure some items however.

Require that reporting and accountability processes, as well as validation activities, are reported to you as a deliverable. It is ideal to receive this information during and after field to enable intervention during field if necessary. Good reporting should identify an interviewer’s ethnicity/linguistic/cultural backgrounds and provide transparency as to how they are being matched up to appropriate sample points. Similarly, good reporting should make the processes used to verify survey completeness and accuracy, and the quality of deliverables, transparent to you at the start of collection and throughout the process. Make sure that culturally and ethnically appropriate interviewers do actually work the right sample points and identify interviewers with poor sampling and data recording habits early.

The author has seen numerous studies where quality controls and expectations were not sufficient to control final deliverable consistency. Studies in both India and Nigeria returned paper data records filled with gaps and illegible or inconsistent data. Conversely, the South African experience used extensive double-punch and revisit procedures combined with electronic data capture processes to assure data reliability.

Assure proper quality control processes are also contractually required and in place for data entry and processing. Ideally, you can bring these services back home and have them done by a domestic contractor you trust. However, this is sometimes not possible due to the need to have processors who can read the local languages. In these cases, specify in the contract your requirements for subcontractors to resolve ambiguous data and retain original documentation. Stipulate up front your return-to-field policy and data gap procedures. Assure they use double-punch entry and provide you with cleaning tables as they progress. Require that they deliver to you original as well as weighted data with an explanation of the weighting scheme, if they are doing this work for you. Finally, if you want to assure compliance and accuracy, let them know their work will be independently audited by another firm and then have it done.
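Double-punch entry means each paper record is keyed twice, independently, and the two passes are compared; any cell that disagrees goes back for adjudication against the original document. A minimal sketch of the comparison step (the record layout here is an assumption):

```python
def compare_passes(pass_one, pass_two):
    """Compare two independent keying passes of the same records.
    Each pass maps record_id -> tuple of field values. Returns a
    list of (record_id, field_index) cells that disagree and need
    adjudication against the original paper document."""
    mismatches = []
    for rec_id, fields_one in pass_one.items():
        fields_two = pass_two.get(rec_id)
        if fields_two is None:
            mismatches.append((rec_id, None))  # record missing in pass two
            continue
        for i, (a, b) in enumerate(zip(fields_one, fields_two)):
            if a != b:
                mismatches.append((rec_id, i))
    return mismatches

# Example: the two operators disagree on one field of record r2
pass_one = {"r1": ("2", "male", "45"), "r2": ("1", "female", "31")}
pass_two = {"r1": ("2", "male", "45"), "r2": ("1", "female", "37")}
print(compare_passes(pass_one, pass_two))  # [('r2', 2)]
```

Requiring cleaning tables as deliverables, as noted above, gives you a running record of exactly these adjudications.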

Minimum of road time

If you do all this right, you can assure you’re getting really good data within cost with a minimum of road time. You need to do some field time to verify interviewers are actually doing the work the way you want it done (e.g., in proper environments, following question order, etc.), but at this point it should be a pro forma exercise with a random sampling of the staff and you can relax and enjoy the countryside. 

References

1 The following sources are some possibilities for referrals and directories: Quirk’s Researcher SourceBook, the World Association for Public Opinion Research member directory, the ESOMAR Members and Research Organization directories, and of course the research association membership lists for each country in question.

2 Malhotra, Naresh K. (2004). Marketing Research, 4th ed. Upper Saddle River, N.J.: Pearson Prentice Hall, p. 668.

3 Carrasco, Moises M. (2001). “Conducting the U.S. Census in Remote Alaska.” Unpublished paper, May 19, 2001.

4 Malhotra, p. 334.

5 Malhotra, p. 334.

6 Carrasco, “Conducting the U.S. Census in Remote Alaska.”