It’s automated and controlled

Editor’s note: Anthony Pichnarcik is global VOC leader, Honeywell Automation and Control Systems, Phoenix, Ariz.

Automation and Control Solutions (ACS), an $8 billion unit of Honeywell, Phoenix, Ariz., has 40,000 employees who apply sensing and control expertise to create safer, more comfortable and productive environments. My team, global voice of customer (VOC), gathers intelligence that helps ACS better understand buyer behavior, and facilitates and drives action in response to that intelligence. Our specific objectives are:

  • Pre-sales: to understand brand awareness, perception and vendor selection criteria so that we can gain access to and win more business opportunities and revenue.
  • Post-sales: to identify continuous improvement opportunities in ACS project, installation and support services, and to facilitate and drive those improvements so as to enhance customer satisfaction, loyalty, recurring revenue and profits.

Before 2003, our customer intelligence was rudimentary: the VOC team relied on telephone surveys and static reports. This approach had severe limitations: 1) phone surveys were labor-intensive and expensive; 2) tying reports back to customers for follow-up was difficult and time-consuming; 3) reports arrived many weeks after surveys were conducted; and 4) the VOC team could neither track nor manage follow-up. Without a coordinated, systematic and timely way to link customer feedback to mainstream ACS operations, we were limited in our ability to respond to data showing that service cancellations were increasing and profitability was suffering.

To address these concerns, from 2003 to the present, we have partnered with CustomerSat, Inc., a Mountain View, Calif., research firm, to institute six online feedback programs. Three of these feedback programs are continuous and event-driven for strategic customer touch points: global win/loss, project installation, and service and support. Three are periodic: customer value analysis (CVA) and brand awareness surveys of our entire market, and a customer survey of preferred communications channels.

These programs correspond to the five stages of our customers’ experience with ACS (depicted in Figure 1): awareness and perception of our brand and our competitive positioning (stage 1); why customers chose ACS or our competitors (stage 2); our customers’ satisfaction with all aspects of our solutions and services (stages 3 and 4); and why customers chose to continue, expand, or discontinue service (stage 5).

Taking our customer feedback program online was critical:

  • Online data collection, dashboards, alerts and case management let us distribute intelligence to the right ACS people in real time, let them act on it immediately and, most importantly, track and manage follow-up actions.
  • Our customers include building owners, facilities managers and engineering contractors. Some are at plants and construction sites during the day and are unavailable to respond to phone surveys; others prefer the phone. To accommodate everyone, our online surveys automatically escalate to call centers for non-respondents to e-mail invitations and for those without e-mail.
  • Since the phone interviewers enter responses into the same online surveys to which we invite other respondents by e-mail, we are no longer tied to one call center supplier, giving us new vendor independence and negotiating leverage. The balance of power has shifted from vendor to ACS.
  • Our online database makes it easier to apply sampling rules across surveys and respondents, aggregate results across surveys, and better predict customer retention.
  • Gathering and disseminating data online is less expensive than doing so by phone.

We now provide more than 200 ACS professionals with dashboards, automatic action alerts, and cases driven by survey feedback (Figures 2a and 2b).

After six months of using this customer feedback system, cancellations were 40 percent lower in a pilot group that received surveys and follow-up driven by alerts and cases than in a control group that did not, preserving several million dollars in service contract revenue. ROI was over 100 percent per year, because online feedback costs are small while the assets leveraged - the entire customer-facing organization - are large.

Implementation

In mid-2003, we interviewed over a hundred ACS customers by phone to 1) identify key loyalty metrics for each of our business processes, such as the percentage of preventative maintenance on HVAC systems completed on schedule, responsiveness, and technical expertise; and 2) determine what quantitative level of service on each metric customers considered excellent, satisfactory, acceptable, and poor. These definitions were used to calibrate and link survey results to operational metrics (a simplified illustration follows). In the future, we plan to conduct these pre-survey interviews via moderated online discussions, allowing more customers to participate and enabling them to interact with each other.
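
To make the calibration step concrete, here is a minimal sketch (in Python) of linking an operational metric to customer-defined performance bands. The metric names and thresholds are invented for illustration only; they are not ACS’s actual calibration values.

# Hypothetical illustration only: the metric names and thresholds below are
# invented, not ACS's actual calibration values.
PERFORMANCE_BANDS = {
    # metric: thresholds for (excellent, satisfactory, acceptable); anything worse is "poor"
    "pm_completed_on_schedule_pct": (98.0, 95.0, 90.0),   # higher is better
    "emergency_response_hours":     (2.0, 4.0, 8.0),      # lower is better
}

LOWER_IS_BETTER = {"emergency_response_hours"}

def grade(metric: str, value: float) -> str:
    """Map an operational metric value onto the customer-defined performance bands."""
    excellent, satisfactory, acceptable = PERFORMANCE_BANDS[metric]
    better = (lambda v, t: v <= t) if metric in LOWER_IS_BETTER else (lambda v, t: v >= t)
    if better(value, excellent):
        return "excellent"
    if better(value, satisfactory):
        return "satisfactory"
    if better(value, acceptable):
        return "acceptable"
    return "poor"

print(grade("pm_completed_on_schedule_pct", 96.5))   # -> satisfactory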

Also in 2003, we conducted two large-scale, blind (Honeywell not identified), market-wide online surveys. A brand awareness study told us the top attributes customers look for when selecting a service provider. A customer value analysis survey confirmed that the findings of the telephone interviews apply to our entire market. To help design the CVA study, we engaged Bradley Gale as a consultant.

These studies provided the foundation for an integrated system of six surveys (Figure 3).

For the brand awareness and CVA surveys, samples were obtained from both internal sources and external sample providers. For the event-based surveys and the customer communications survey, samples are drawn from our sales and service tracking systems. As of June 2005, more than 12,000 responses have been received across all surveys; response rates average 30 percent.

All six surveys drive key ACS actions and insights (Figure 4). For example, as a result of the CVA study, we 1) adjusted some of our targets for customer-facing metrics to comply with thresholds that customers deemed to be “excellent” performance; 2) began tracking certain metrics for the first time; and 3) reviewed service area coverage to ensure that emergency response times could be met.

Through their dashboards, managers can “pull” information from the system 24/7 to use in strategic planning, presentations, and account reviews. In addition, my team and I report to the organization quarterly, reviewing survey findings with executive management, regional leadership, and individual regions. With offline feedback and reporting, my group was isolated; with online feedback systems, we have become integral to business operations and help facilitate them.

Innovative approach

Our approach to obtaining ongoing customer feedback is innovative in several ways:

  • Integrated e-mail and phone. We integrate self-administered e-mail and interviewer-assisted phone surveys in two ways (a simplified routing sketch appears after this list). First, our system recognizes whether a customer can be surveyed online via e-mail or has asked to be surveyed by phone, and automatically routes the customer record either to the e-mail invitation queue or to a phone interviewer’s queue with autodialer support. Second, if a customer does not respond to an e-mail invitation and reminder, the record is escalated to the phone interviewer queue. This integrated approach has given us the highest overall response rates by the most economical means. A flag available for crosstabbing tracks whether each survey was conducted via e-mail or phone. Because the mix is gradually shifting toward e-mail, we report e-mail and phone mean scores separately to eliminate mode bias. For the service and support survey, for example, the mean score for overall satisfaction by telephone was significantly higher than for surveys conducted by e-mail; the difference is significant at the 90 percent confidence level.
  • Call center productivity, speed, and independence. 1) Since our outsourced call centers access our online surveys from their interviewers’ PCs, no CATI software or programming/set-up is required at the call centers - just Internet access. The call centers become productive faster, and we do not need two separate systems for e-mail and phone. 2) With conventional CATI, results are at least a day old; with online surveys, results are available within minutes of interviewers keying in responses. 3) Our surveys are now easily transportable from call center to call center, which helps us manage our suppliers. We could now use hundreds of different phone center suppliers if we wished, rather than a select few.
  • “Bounce” processing. When we send out e-mail invitations, we receive large volumes of “bounced” e-mail messages: out of office, mailbox full, user unknown, etc. Some of these messages are critical for updating e-mail databases; others can be ignored. Managing them is labor-intensive and time-consuming. CustomerSat provides the VOC team with an automated bounce e-mail manager that receives, interprets, categorizes and files these bounced messages. Parsing technology sorts messages of different types into separate e-mail boxes that we can access directly (a generic categorization sketch follows this list). This system eliminates labor-intensive tasks, improving productivity and enhancing the quality of service the VOC team delivers to internal customers.
  • Customer “touch” rules. With multiple surveys running concurrently, we need controls to protect customers from being over-surveyed. CustomerSat implemented survey-specific and global invitation rules ensuring, for example, that certain customers are invited to any survey no more than once every 180 days (see the eligibility sketch below). Doing so helps preserve the goodwill of our customers and increases our overall response rates.
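
The routing and escalation described in the integrated e-mail and phone item above can be sketched as follows (in Python). This is a minimal illustration; the field names, queue names, and single-reminder rule are assumptions, not the production system’s schema.

# Minimal sketch of e-mail/phone routing and escalation; field and queue names
# and the single-reminder rule are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SurveyRecord:
    email: Optional[str]       # None if no e-mail address is on file
    prefers_phone: bool        # customer asked to be surveyed by phone
    reminders_sent: int = 0
    responded: bool = False

def route_invitation(rec: SurveyRecord) -> str:
    """Send the record to the e-mail queue or the phone interviewer queue."""
    if rec.prefers_phone or not rec.email:
        return "phone_interviewer_queue"   # autodialer-supported call center queue
    return "email_invitation_queue"

def escalate_if_unresponsive(rec: SurveyRecord) -> Optional[str]:
    """After an invitation and reminder go unanswered, escalate to phone."""
    if not rec.responded and rec.reminders_sent >= 1:
        return "phone_interviewer_queue"
    return None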
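
Bounce categorization is handled by CustomerSat’s product; purely as a generic sketch of the approach (not their implementation), incoming bounce messages can be sorted by pattern matching:

# Generic sketch of bounce-message categorization; the categories and patterns
# are illustrative and do not describe CustomerSat's proprietary parser.
import re

BOUNCE_PATTERNS = {
    "user_unknown":  re.compile(r"user unknown|no such user|recipient not found", re.I),
    "mailbox_full":  re.compile(r"mailbox (is )?full|over quota", re.I),
    "out_of_office": re.compile(r"out of (the )?office|auto[- ]?reply", re.I),
}

# Only some categories require updating the e-mail database; the rest can be filed.
ACTIONABLE = {"user_unknown"}

def categorize_bounce(subject: str, body: str) -> str:
    """Return the mailbox/category a bounced message should be filed under."""
    text = subject + "\n" + body
    for category, pattern in BOUNCE_PATTERNS.items():
        if pattern.search(text):
            return category
    return "other"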
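
The 180-day touch rule amounts to a simple eligibility check before any invitation is queued. This sketch assumes we track each customer’s most recent invitation date; the window length comes from the example above.

# Sketch of a global customer "touch" rule: no customer is invited to any survey
# more than once in a 180-day window. The record-keeping is an assumption.
from datetime import date, timedelta
from typing import Optional

GLOBAL_QUIET_PERIOD = timedelta(days=180)

def eligible_for_invitation(last_invited: Optional[date], today: date) -> bool:
    """Return True if the customer may receive another survey invitation."""
    if last_invited is None:
        return True
    return today - last_invited >= GLOBAL_QUIET_PERIOD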

Driving action

Our online survey system directly drives and coordinates ACS-wide action through Web browser-based analytics dashboards, real-time action alerts, and feedback-driven case management.

  • Information-rich dashboards. Online crosstabs, trend lines, correlations (Figure 5), open-ended suggestions, significance testing, positioning charts (Figure 6), and other statistics are generated and updated in real time through Web browser-based dashboards tailored to each of the 200+ ACS sales, service, and account management professionals. Filter tools allow users to slice and dice results any way they wish. A high degree of coordination and alignment arises when so many of our professionals are armed with real-time data feeds. Online analytics allow drill-down from ratings in frequency distributions to summaries of the responses comprising those ratings to detailed individual scores and open-ended responses. Results can be analyzed by region, branch office, field service leader, or product line. Decision-makers can examine factors that affect customer satisfaction and loyalty at the global, regional, state and local levels.

  • Alerts. When a customer satisfaction score falls below thresholds that we specify, or when the customer asks to be contacted, the system does two things. First, a detailed action alert is automatically e-mailed to the BlackBerry, laptop, or desktop PC of the people responsible for that customer, including field service leaders, customer care advocates, sales representatives, and regional general managers. Alerts highlight the question response(s) that triggered them and contain links that let recipients view the entire survey response and the associated respondent details (such as customer name, address, phone, and contract size) for contacting the customer, if appropriate.
  • Cases. Second, the system automatically opens cases and, using business rules, assigns them to case managers and teams. Online case management enables team members to share information and coordinate responsive actions. Case severity and deadline are based on satisfaction scores and, for some customers, on the customer’s “tier,” a surrogate for total customer value (a simplified sketch of these rules follows this list). The case manager closes the case when the customer concern is addressed; cases are automatically escalated if not closed before their deadlines. As a result, online feedback directly drives action throughout ACS. In the past, case management has been part of CRM systems that track and manage purchases, inquiries, and other customer behaviors. We see case management as equally critical to feedback systems that track and manage customer attitudes.
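
A minimal sketch of how such alert and case rules can be expressed is shown below; the threshold, tier names, and deadlines are invented for illustration and are not ACS’s actual business rules.

# Illustrative alert/case rules; the threshold, tier names, and deadlines are assumptions.
from datetime import datetime, timedelta

ALERT_THRESHOLD = 3    # e.g., overall satisfaction of 3 or lower on a 5-point scale

def needs_alert(overall_satisfaction: int, requested_contact: bool) -> bool:
    """Trigger an action alert on a low score or an explicit contact request."""
    return overall_satisfaction <= ALERT_THRESHOLD or requested_contact

def open_case(overall_satisfaction: int, customer_tier: str, now: datetime) -> dict:
    """Set case severity and deadline from the score and the customer's tier."""
    if overall_satisfaction <= 2 or customer_tier == "tier_1":
        return {"severity": "high", "deadline": now + timedelta(days=2), "status": "open"}
    return {"severity": "normal", "deadline": now + timedelta(days=7), "status": "open"}

def escalate_if_overdue(case: dict, now: datetime) -> dict:
    """Escalate any case still open past its deadline."""
    if case["status"] == "open" and now > case["deadline"]:
        case["severity"] = "escalated"
    return case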

To date, over 750 cases have been generated. One example: survey ratings from a government customer receiving HVAC mechanical maintenance from ACS generated an action alert and case. Our customer care advocate (CCA) called the customer to investigate, which led to a face-to-face meeting among the customer, the CCA, and the field service leader responsible for the account. The meeting produced a solution to the initial problem and an opportunity to upgrade the customer’s entire system, and the case was closed. Rapid response has thus also generated upgrade revenue.

Measuring ROI

Measuring ROI was critical to ensure ongoing funding and management support. We used two methods:

1) A pilot with a control group to measure the impact on customer renewals. After six months, cancellations were 40 percent lower in the pilot group that received surveys and follow-up driven by alerts and cases than in the control group that did not, preserving several million dollars in service contract revenue.

2) A formal ROI calculation in which we modeled revenue as the sum of revenue from new customers won, from existing customers retained, and from customers lost during a six-month period. The impact of online VOC was highest on revenue from customers who would otherwise have been lost; next highest on revenue from retained customers; and lowest, but still significant, on revenue from new customers, which benefits from the improved word of mouth and referenceability of existing customers. We estimated the change in each of these revenue components attributable to online feedback and used the net present value (NPV) of the discounted cash flows to calculate ROI rigorously; the result was over 100 percent per year (a simplified sketch of the calculation structure follows). The reason: online feedback costs are small, while the assets leveraged - our entire customer-facing organization - are large.
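
Purely as a sketch of the structure of that calculation, in Python: the monthly figures, six-month horizon, and 10 percent discount rate below are placeholders, not the actual model inputs.

# Illustrative ROI calculation using the NPV of monthly cash flows; all figures
# are placeholders, not the actual ACS model inputs.
def npv(cash_flows, annual_discount_rate=0.10):
    """Net present value of monthly cash flows."""
    monthly_rate = (1 + annual_discount_rate) ** (1 / 12) - 1
    return sum(cf / (1 + monthly_rate) ** t for t, cf in enumerate(cash_flows, start=1))

months = 6
# Hypothetical incremental revenue per month attributed to online VOC, split into
# the three components described above (largest impact first).
saved_from_cancellation = [150_000] * months
retained_and_expanded   = [ 90_000] * months
new_customer_referrals  = [ 60_000] * months
program_cost            = [200_000] * months   # hypothetical monthly program cost

benefit = npv([a + b + c for a, b, c in
               zip(saved_from_cancellation, retained_and_expanded, new_customer_referrals)])
cost = npv(program_cost)

roi_six_months = (benefit - cost) / cost
print(f"Six-month ROI: {roi_six_months:.0%}")   # about 50% here, on the order of 100% per year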

These two measures of success provided a strong validation of our online feedback approach.