Editor’s note: Wendy Jones is vice president of client services, Sky Alland Marketing, Columbia, Md.

Designing an effective telephone interview is a crucial part of every marketing research effort. In many ways, the success of the program depends on an interview that captures the right data, can be administered easily and positively to customers and prospects, and works effectively from the first day the program goes on-line. Of course, the real art is in designing a conversation around the consumer.

There is no hard and fast definition of a good telephone interview, but a good interview should measure all the right things and do so efficiently. The script should flow smoothly and the questions should be conversational. The interview should be brief and the questions clear, straightforward and not leading or repetitive. Above all else, the interview should be designed around the customer and his or her needs.

When our firm, Sky Alland Marketing, starts a new customer relationship management (CRM) or marketing research program with a client, designing the interview is one of the first steps. Although not all CRM programs use an "interview" per se (some programs serve to welcome customers, manage leads or fulfill product or literature requests), telephone interviews are crucial to any relationship marketing program with research objectives.

While interview design varies depending on program objectives, the client and, of course, the customer, the following are basic steps we follow in the interview scripting process.

Initial planning

Our first step is usually to arrange a planning meeting with the client to discuss the goals of the program and what they want to accomplish with the interview. These goals vary depending on the kind of program being initiated, whether customer satisfaction measurement, marketing research, customer retention, channel management or some other type of customer relationship management program. We also consider the purpose of the call, which can include measuring customer satisfaction, encouraging customers to activate a new credit card, tracking awareness of a product, or identifying drivers that affect purchases or defections. Because these goals vary widely, they shape the type of information we attempt to capture with the interview.

During the planning meeting, we determine the survey geography and eligible respondents. Our approach will vary depending on whether respondents are customers, prospects or non-customers. Sometimes the client provides us with a demographic profile of the intended respondent, although this is not always required. Demographics can include gender, age and any specific screening criteria, for example whether the respondent has traveled in the past year or is a satisfied or dissatisfied customer. The client may also provide background or collateral material, such as brochures customers may be calling about. The customer profile and program goals, along with what we want to measure, drive how we script the call.
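As a simple illustration of how such screening criteria translate into practice, the sketch below qualifies a respondent against a hypothetical profile; the field names and cut-offs are invented for the example, not drawn from an actual client profile.

```python
# Hypothetical screener based on the kind of criteria described above;
# the fields and cut-offs are illustrative only.
def is_eligible(respondent: dict) -> bool:
    return (
        respondent.get("is_customer", False)              # customers only
        and respondent.get("traveled_past_year", False)   # screening criterion from the profile
        and respondent.get("age", 0) >= 18                # adult respondents
    )

print(is_eligible({"is_customer": True, "traveled_past_year": True, "age": 42}))   # True
print(is_eligible({"is_customer": True, "traveled_past_year": False, "age": 42}))  # False
```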

We talk about what the client wants to measure and discuss the kinds of questions to ask. On some occasions, the client may ask us to review a script they have used in the past. After meeting with the client and establishing goals, we design a questionnaire and circulate it for review and revision. This process can take from three days to two weeks from the initial meeting to the finished questionnaire.

How long should the interview be?

Interview length and format depend on the type of call. For example, welcome calls tend to range from three-and-a-half to five minutes. A welcome call is more open-ended and we do more of the talking, providing information and thanking the customer for their business. For customer satisfaction calls, we try to keep the call brief and aim for a range of three to four minutes. Research calls are typically longer and can take from five to 10 minutes. Customer satisfaction calls and research calls involve more listening on our part. Keep in mind that these are average ranges; interview objectives should dictate length.

The questions

The kinds of questions we ask depend on program goals and the kind of reporting and analysis the client wants to receive. For example, if the goal is to identify key satisfaction drivers, a questionnaire of all yes-and-no questions will not model well. If the program’s goal is analysis, there should be more rating questions on specific parameters. Questionnaires are often a combination of general and specific questions. In customer satisfaction programs, we ask for an overall satisfaction rating but also break satisfaction down into more specific rating questions on individual behaviors. For example, in measuring satisfaction with customer service, the interview might include questions on how long it took to reach a representative and the courtesy and knowledge of the representative, as well as an overall satisfaction rating.
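To illustrate why rating questions model better than yes-and-no answers, here is a minimal sketch of one common form of key-driver analysis: overall satisfaction is regressed on the attribute ratings, and the resulting weights suggest which behaviors most influence the overall score. The attribute names and data are hypothetical, and regression is only one of several ways drivers can be identified.

```python
import numpy as np

# Each row is one respondent's ratings (1-10 scale) for the hypothetical
# attributes: time to reach a rep, courtesy, knowledge.
attribute_names = ["time to reach a rep", "courtesy", "knowledge"]
ratings = np.array([
    [7, 9, 8],
    [4, 6, 5],
    [9, 9, 9],
    [5, 8, 6],
    [8, 7, 9],
], dtype=float)
overall = np.array([8, 5, 9, 6, 8], dtype=float)  # overall satisfaction ratings

# Fit overall = b0 + b1*x1 + b2*x2 + b3*x3 by ordinary least squares.
X = np.column_stack([np.ones(len(ratings)), ratings])
coefs, *_ = np.linalg.lstsq(X, overall, rcond=None)

# Larger weights point to the attributes that matter most to overall
# satisfaction; yes/no answers carry too little variation to fit this way.
for name, weight in zip(attribute_names, coefs[1:]):
    print(f"{name}: weight {weight:+.2f}")
```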

There should be a balance between open-ended and closed-ended questions. The balance varies with program goals and what the client wants the interaction to look like. Some clients strive for an interview that conveys warmth, appreciation and goodwill to the customer. Since approaching the customer tactfully is a primary objective in these interactions, they will involve more yes-and-no questions, which are easier and less time-consuming for the customer to answer. Other clients want a more straightforward interview. Rating questions lend themselves to a straightforward, businesslike format. Every program is unique, but most call for a mix of yes/no and rating questions with at least one open-ended question at the end, usually asking what could be improved.
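As a purely hypothetical illustration of that mix, a short satisfaction questionnaire might be laid out as follows, ending with the open-ended improvement question; the wording and order are invented for the example.

```python
# Hypothetical questionnaire blueprint mixing yes/no, rating and
# open-ended questions.
questionnaire = [
    {"type": "yes/no",     "text": "Did you reach a representative on your first call?"},
    {"type": "rating",     "text": "How would you rate the courtesy of the representative?", "scale": "1-10"},
    {"type": "rating",     "text": "Overall, how satisfied are you with the service you received?", "scale": "1-10"},
    {"type": "open-ended", "text": "What could we do to improve?"},
]

for number, question in enumerate(questionnaire, start=1):
    print(f"Q{number} ({question['type']}): {question['text']}")
```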

Testing the script

Once an interview is scripted, it is tested and revised. We test scenarios, script length and the effectiveness of individual questions. If a question consistently generates scores of 98 and 99, it probably will not produce results that are useful or actionable. One of our primary goals is to design questions that will help the client improve performance and increase customer satisfaction and retention. We pre-test the script by conducting a small number of interviews, perhaps 25, with live customer respondents. A project director usually monitors these calls to evaluate whether the questions are understandable, whether the length is right and whether the script flows smoothly.
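One way to spot a question that consistently generates scores of 98 and 99 is to check the spread of the pre-test responses; the sketch below flags questions whose pilot scores barely vary. The question labels, scores and cut-off are hypothetical.

```python
from statistics import mean, pstdev

# Hypothetical pilot (pre-test) scores, one list per question, 0-100 scale.
pilot_scores = {
    "overall satisfaction": [98, 99, 98, 99, 98],
    "time to reach a representative": [72, 85, 60, 91, 78],
}

for question, scores in pilot_scores.items():
    spread = pstdev(scores)
    flag = "little variation; consider rewording or dropping" if spread < 2 else "discriminates"
    print(f"{question}: mean {mean(scores):.1f}, std dev {spread:.1f} -> {flag}")
```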

Timely revisions

After the interview has been pre-tested and any kinks ironed out, it should be used consistently throughout the program, so as not to skew results. Questionnaires are sometimes revised over the course of a program, but never within a reporting period. Sometimes, in the course of speaking with customers, we learn about new satisfaction drivers the client may want to measure. In this way, the interview and overall program can evolve in response to new information. Nevertheless, one should make every effort to design the best possible interview in the first place so program results provide a good basis for benchmarking over time.