Real-time competitive benchmarking via the Internet

Editor's note: John Chisholm is president of CustomerSat.com, a Menlo Park, Calif.-based customer satisfaction measurement and market research firm.

In 1997, four companies in the customer relationship management (CRM) software industry wondered how their customer satisfaction and loyalty compared with those of their closest competitors. To answer that question, these companies - Aurum Software, Inc., Clarify Inc., Onyx Software Corporation, and The Vantive Corporation - engaged our firm to assess their customer satisfaction and loyalty relative to industry-wide benchmarks. Since the companies together represent a large share of the market, their own aggregate scores would be indicative of industry averages. And since the companies' customers are technology-savvy businesses, it made sense to use the Internet to gather the data. Ensuring the integrity and confidentiality of customer data collected for four direct competitors, and providing rapid feedback to an industry whose raison d'être is enhancing responsiveness to customers, were an opportunity and a challenge that we could not pass up.

Customer relationship management software (also known as front-office systems, customer-interaction systems, or customer asset management systems) comprises enterprise-wide systems that manage the relationships between a company and its customers. These systems handle customer support automation, customer order tracking, sales force automation, customer problem tracking, and other sales, marketing, and support functions. CRM systems are typically used in call centers, where the software enables customer support and telesales representatives to access customer histories; look up product, pricing, and problem resolution information; take orders; send messages to colleagues; respond to customer inquiries by e-mail; and resolve or escalate customer problems.

By working closely with the four software vendors over several months, we discovered they shared deep, corporate-wide commitments to customer satisfaction and loyalty.

The benchmark initiative had three parts: determining the appropriate performance metrics for the companies and composing survey questionnaires that reflected those metrics; securely deploying the surveys via e-mail and the Web; and promptly delivering confidential and actionable survey results to each vendor.

Performance metrics and questionnaires

CustomerSat.com conducted interviews with the four vendors' customers, managers, and call center service representatives to identify customer satisfaction and loyalty attributes to include in the questionnaires. To allow results to be aggregated, most questions were identical across the four questionnaires. The overall benchmark study, which is conducted on an ongoing basis, included:

1) Performance benchmarks: how customers rate their satisfaction with different attributes of their CRM vendors and software. Forty-five performance dimensions encompassed such areas as product quality, sales force knowledge and effectiveness, support quality and effectiveness, ease of doing business, and pricing.

2) Importance benchmarks: the importance of different attributes in determining customers' overall loyalty and satisfaction.

3) Market positioning: perceived positioning of each vendor and its competitors along multiple market dimensions, by both each vendor's customers and all of the vendors' customers in aggregate.

4) Demographics: the composition of each vendor's customers vs. the overall market, by industry segment, size of customer company, server platform, database, size of installation, geographical region, and other dimensions.

Our goal was to ensure that the four vendors enjoyed not just operational advantages from the survey results, but strategic advantages through early insights into customer and market requirements and perceptions. The final Web questionnaires had approximately 150 questions each (Figure 1).

Inviting respondents by e-mail

All of the vendors had customer databases that included e-mail addresses. (We encourage all companies, if they do not do so already, to ask for e-mail addresses when customer contact information is collected. Doing so quickly pays dividends as more and more customer communications - both research and direct marketing - can shift from conventional media to e-mail and the Web.) We worked with each vendor to select a representative sample of customer e-mail addresses.

We invited the respondents to the Web surveys by e-mail. The invitations are personalized (Mr. John Smith, ABC Company, Dear John Smith:) and contain the Uniform Resource Locator (URL) or address (a string of characters in the form http://www.something.com/xxx) of the Web survey page (Figure 2). To access the survey, responding customers either click on the address or copy and paste it into their Web browser, depending upon whether their e-mail software is Web-enabled or not. We have learned through experience that the e-mail invitation should come (or be made to appear to come) from the client, and that the Web survey should be hosted on the survey research provider's site. This combination helps assure the respondent that the survey is legitimate and that respondent confidentiality will be protected.
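The invitation process described above can be sketched as follows. This is an illustrative sketch only: the survey URL, customer record fields, and code format are assumptions, not CustomerSat.com's actual implementation.

```python
# Sketch of generating a personalized e-mail invitation whose survey
# address carries a unique one-time access code (hypothetical names).
import secrets

SURVEY_URL = "http://surveys.example.com/crm"  # assumed survey host

def make_invitation(customer):
    """Build a personalized invitation body and its unique access code."""
    # An 8-character one-time code, comparable in spirit to the
    # "52Z87W52" example shown in Figure 2 (hex-only here).
    code = secrets.token_hex(4).upper()
    body = (
        f"{customer['name']}, {customer['company']}\n"
        f"Dear {customer['name']}:\n\n"
        "Please click the address below (or paste it into your browser)\n"
        "to complete our customer satisfaction survey:\n"
        f"{SURVEY_URL}/{code}\n"
    )
    return code, body

code, body = make_invitation({"name": "John Smith", "company": "ABC Company"})
```

In practice the generated code would also be stored server-side so the survey page can verify it when the respondent arrives.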

Guarding against ballot-box stuffing

An issue for any self-administered survey is ensuring that only authorized respondents can respond and that they cannot respond more than once (i.e., no ballot-box stuffing). Addressing these concerns is especially important when direct competitors' surveys are being fielded - or in the case of Web-based research, "hosted" - at the same time. To address these concerns, the benchmark study required that we use new technology we developed last year called Positive Respondent Identification™ (PRI).

PRI ensures that stray Web surfers cannot access surveys and that authorized respondents can complete a survey only once. In each customer's e-mail invitation, a unique password is appended to the Web survey address ("52Z87W52" in Figure 2). On the Web server, a program reads the password and, through a database, confirms both that it is valid and that it has not previously been used. If the PRI code is valid, the survey is displayed in the customer's browser. If the PRI code is not valid, either the message "Sorry, we could not find you in our database" or "Sorry, your ID code has already been used" is displayed. After the respondent completes the survey, the database is updated to disallow further use of the password. Unlike conventional passwords, PRI leaves respondents nothing to type in or remember, which increases response rates.
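The server-side check just described follows a simple pattern. Below is a minimal sketch of that logic, using an in-memory dictionary in place of the actual server database; the function names are illustrative, not part of PRI itself.

```python
# One-time access-code validation, as described for PRI:
# reject unknown codes, reject already-used codes, otherwise show the survey.

codes = {"52Z87W52": False}  # code -> already_used? (stand-in for the database)

def check_code(code):
    """Return (message, show_survey) for an incoming survey request."""
    if code not in codes:
        return "Sorry, we could not find you in our database", False
    if codes[code]:
        return "Sorry, your ID code has already been used", False
    return "OK", True

def mark_used(code):
    """Called after the completed survey is submitted."""
    codes[code] = True

msg, show = check_code("52Z87W52")  # valid and unused: survey is shown
mark_used("52Z87W52")
msg2, _ = check_code("52Z87W52")    # second attempt is rejected
```

The same pattern scales to a real database by replacing the dictionary lookups with queries, with the validity check and the used-flag update performed in one transaction to close any race between simultaneous submissions.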

First drafts of the HTML of the four surveys were composed with Decisive Survey software from Decisive Technology. CustomerSat.com Web programmers then formatted the raw HTML into concise, attractive tables, attached program scripts for PRI and real-time generation of individual company and aggregate results, and posted the surveys in private locations on the Web (Figure 3).

Our goal was 100 completed responses for each vendor within 30 days. As incentives, we offered random drawings for the 3Com PalmPilot personal organizer, a popular device for business professionals that can be used by virtually anyone with a PC.

The Web site hosted the four surveys, databases for PRI, and selected results generated in real-time. Other analyses and reports, shown at the bottom of the figure, were generated by conventional means.

Over 400 companies respond

After the invitations were e-mailed, approximately half of the responses to each survey arrived within 24-36 hours. To give all invited customers ample time to respond, the surveys were hosted for 30 days, with reminders e-mailed to non-respondents after approximately 10 days. Over 400 companies worldwide that are customers of the four vendors responded, yielding a 35-40 percent response rate for each vendor. The responding companies are estimated to represent over half of the worldwide users of enterprise CRM software.

Respondents were enthusiastic about the process. According to John S. Townsend, senior director, Network Operations Support at Intermedia Communications Inc., and one of the customers who responded, "The Web survey was great. I was able to key in my answers and comments and then just click to send. No extra paper on my desk, no envelopes or stamps to worry about. A great time saver."

Real-time results

In the fast-paced CRM software industry, as increasingly in all sectors of the economy, vendors need to be able to react to customer feedback very quickly. To address this requirement, we offered the CRM software vendors the option of receiving their survey results in real-time: as customers completed the Web surveys, up-to-the-minute frequency distributions, selected crosstabs and verbatim open-ended responses appeared on password-protected Web pages. The results can be viewed from any Web browser, with no special software required (Figure 4).
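The up-to-the-minute frequency distributions mentioned above amount to maintaining running counts as each response arrives. A minimal sketch, assuming 1-5 satisfaction ratings (the rating scale and function names are assumptions for illustration):

```python
# Running frequency distribution, updated as each survey response arrives.
from collections import Counter

freq = Counter()

def record_response(rating):
    """Tally one incoming rating (e.g., 1-5 satisfaction scale, assumed)."""
    freq[rating] += 1

def distribution():
    """Percentage of responses at each rating level, for display."""
    total = sum(freq.values())
    return {r: round(100 * n / total, 1) for r, n in sorted(freq.items())}

for r in (5, 4, 5, 3):  # simulated incoming responses
    record_response(r)

print(distribution())  # {3: 25.0, 4: 25.0, 5: 50.0}
```

Rendering such a table to a password-protected Web page on each submission is what makes the results viewable from any browser with no special software.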

Web-based results are a rapid and effective way to disseminate survey results throughout an enterprise. Anyone authorized to view the data may be provided the Web page addresses and password. Subsets of survey results may be published on different password-protected Web pages for different groups.

Conventional "off-line" methods were used for complete crosstabs and factor and regression analysis. Performance benchmarks and ratings for each vendor were determined for multiple demographic segments as well as for the market overall. Findings of the ongoing CRM software benchmark study are compiled for the vendors individually and in aggregate annually or semi-annually.

Vendor performance ratings are confidential to each vendor; aggregate benchmarks are shared by all of the vendors. Selected aggregate findings from the study have been published. For example, the customers most satisfied overall with their CRM systems and vendors were:

  • between $100 million and $1 billion in revenue;
  • in the telecommunications industry segment;
  • in the eastern region of North America.

For more details on survey results, visit http://www.CustomerSat.com/pressrel980303.htm.

Benchmarking satisfaction with transactions

The real-time benchmarking initiative has now advanced in two directions. First, we have made real-time benchmarking services available to the CRM software vendors' customers. Users of CRM software can assess the satisfaction of their customers relative to comparable companies by industry, size, or geographical region. Second, satisfaction can now be measured in real-time not just with customer-vendor relationships, but with transactions as well. While relationship-oriented surveys measure customers' satisfaction over an extended period, transaction-oriented surveys measure satisfaction with the handling of specific sales or service events.

For real-time transaction-based benchmarking, CRM systems are linked to a secure CustomerSat.com Web site, which e-mails Web survey invitations to a sample of customers immediately after their transactions are recorded by the CRM software. Real-time survey results include trend lines that enable call center managers to track customer satisfaction by day, week or month, and by product line, geographical region, service rep, or any other variable (Figure 5).

E-mail alerts can be automatically generated to call center managers if average customer satisfaction falls below a specified level, or if customers request that someone contact them. As a result, call center managers can focus their efforts and resources to improve customer satisfaction and loyalty faster and more effectively than ever before.
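The alert rule described above can be expressed in a few lines. This is a sketch under stated assumptions: the 1-5 scale, the threshold value, and the record fields are illustrative, not the product's actual configuration.

```python
# Sketch of the e-mail-alert trigger: flag the call center manager if
# average satisfaction in a batch of responses falls below a threshold,
# or if any customer has asked to be contacted.

ALERT_THRESHOLD = 3.5  # on an assumed 1-5 satisfaction scale

def needs_alert(responses):
    """responses: list of dicts with 'rating' and 'contact_me' keys (assumed)."""
    if any(r["contact_me"] for r in responses):
        return True  # a customer requested contact: alert regardless of scores
    avg = sum(r["rating"] for r in responses) / len(responses)
    return avg < ALERT_THRESHOLD

batch = [{"rating": 4, "contact_me": False},
         {"rating": 2, "contact_me": False}]
# average is 3.0, below the 3.5 threshold, so an alert would be generated
```

In a deployed system this check would run on each new batch of transaction surveys, with the alert delivered by e-mail to the responsible manager.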

In short, customer satisfaction measurement is increasingly becoming a proactive, real-time tool for management. Concluded an executive of one of the participating vendors, "Establishing benchmarks to raise quality and service standards benefits the entire CRM market, and helps us fulfill our corporate-wide commitment to customer satisfaction."