
Driving change at your organization: 4 key questions to answer 

Editor's note: This article is an automated speech-to-text transcription, edited lightly for clarity.   

How can you successfully launch a customer satisfaction program? 

Ed Kahn, principal, customer experience insights at Mutual of America, encourages researchers to answer four key questions when planning to launch a customer satisfaction initiative.  

Learn more about driving change at your organization in this session from the 2025 Quirk’s Event – Virtual Global.

Session transcript

Marlen Ramirez 

Hi everyone, and welcome to the session “Driving change at your organization: 4 key questions to answer.” I'm Quirk’s News and Content Editor, Marlen Ramirez. Before we get started, let's quickly go into the ways you can participate in today's discussion. You can use the chat tab to interact with other attendees during the session, and you can use the Q&A tab to submit questions for the presenters during the session. Our session is presented to you by Mutual of America. Enjoy the presentation.

Ed Kahn 

Hi everyone. I'm Ed Kahn, principal for customer experience insights at Mutual of America, and today I'm going to share with you what we learned from a very successful insights initiative. I hope that the session encourages you, inspires you and provides some ideas that will help you achieve your own success with insights. 

First, I'll tell you about my company, Mutual of America. We were founded in 1945. We are a small retirement plan services provider, and we serve small and mid-size organizations. As you can see, we support a variety of organization types under the value-based header; if you haven't seen that term before, just think about churches, for example. The customer experience team was created in 2019, and in 2020 they hired me to lead customer experience insights.  

So, what was our initiative? We launched an NPS customer satisfaction program at this 75-year-old company, which had never had an insights function prior to my being hired. As you can see from some of these comments, we have pushed the customer insights culture forward. And the question, of course, is, how did we make all this happen? For those familiar with agile methodology, the story emerged from our retrospective. We asked ourselves what truly moved this company from the quotes you see on the left side of the slide to the quotes that you see on the right-hand side.  

So, what we realized looking back is that we had answered four key questions as we progressed through this initiative without even realizing the questions were being asked. So, my idea here is I'm going to show you how we answered these questions for our initiative, and I'm hoping that will help you think about how you're going to answer these same questions for your own insights initiative, whatever that might be.  

So, here's the first question, and this really has to be question one. You can address all the other questions in any order that you want, but if you can't clearly articulate to yourself and everyone else in your organization why you want to do what you want to do, you're not going to get very far. 

So, what are we looking at here? We are looking at a page from a usability test. This webpage is from a customer portal where someone saving money in a 401(k) plan can see how much has been saved, among other features. You see the big numbers in the middle; that's how much money has been saved so far. But look at what someone said when I pulled up this image for the first time and asked that person: What do you see?  

To be honest, I had not even noticed that we had three shades of blue on that screen. Regardless, that's not the first thing we want someone to say when they log in and see their savings, but it was the first thing this person said, and this was not the only person who mentioned it. So, here's the first reason we wanted to run a customer satisfaction survey and get feedback from customers. We may think we know how a customer will react to something, but we don't.  

Remember one of those quotes from the left-hand side of my earlier slide: survey anyone, but not customers. I did this usability study with our customer service people. So, I started there, and nobody mentioned the blues. They all said the screen was fine, and they told me that our customers would like it and understand it.  

Then I went out and I interviewed our competitors’ customers and their reactions were quite different. And this one immediately called out three shades of blue. So, here's lesson one, right? We don't know how customers will react to things.  

Let's take a look at the second quote and think about that customer trying to save more money from their paycheck each month. This is a moment that could be good for that person, and it's good for us as an organization because our fees are based on how much money gets saved into the plans. What's happening here is that our little buttons were making it difficult to move forward. Our buttons, which nobody had given any thought to, could actually have a long-term impact on the success of our business. We want people to save more. We want them to confirm their way through the process of doing that, and the design of that little pop-up window could help us or could hurt us.  

So, lesson one was we don't know everything. Lesson two is that something that we don't know can have an impact on our business and it can be a negative impact on our business, and that is a very significant issue. So, this was our articulation of why we want to do a customer satisfaction survey. Because we don't know everything there is to know about our customers and what we don't know can hurt us.  

The next question that we answered was how to go small. In this case, for us, this was about using the process internally. We were having difficulty rolling out a new customer relationship management program, or CRM as we refer to it, so we surveyed the CRM users and provided that feedback to the product owner team, which was supervising the evolution and the development work. As you might guess, when you look at the lower left-hand corner with that really low score, a CRM is not very useful to people if they can't enter customer information or write customer reports.  

As you can see, this insight proved valuable in a pretty short timeframe. I actually selected this slide from a much later run of the study to demonstrate that this kind of cycle of collecting feedback and using insights to prioritize changes can be an addictive cycle for the people who are interested in evolving a system or a platform or whatever it is that they're working on. This is fundamentally what the satisfaction process is about.  

Interestingly, as you look at the long-term trend on the slide, you get a little bit of an indication that something at a higher level was a bit of a problem. And actually, the conclusion we have most recently drawn from this is that the product ownership committee we had in place was really not the most effective way to manage the evolution and development around this platform. Very recently we have changed that, and we'll be trying out a different model.  

So, let's get to the third question, which is really more about minimizing risk. Starting small is a great way to minimize risk, and it's a very important dynamic. But it's important to ask yourself the question: What else can we do to control for risk? Risk is scary. Change is scary for people in an organization. What we discovered in this particular case, and again, this can be different for any particular initiative, was an education gap. We were getting a lot of questions from all different kinds of people, and what you see here is my effort to improve transparency around what it was that we were really proposing, helping people understand how things were really going to work.  

So, this is a process flow map, for the purists in the house. It's not exactly the way process flow diagrams are supposed to be drawn, but it did the job. It shows the steps, actually not of our ultimate satisfaction program but of a different survey program that we were developing, and it also shows where the data flows are going to be. Folks in the compliance and legal worlds had questions about what data we were capturing and where it was going to be stored. This proved to be very helpful and a great confidence builder for people, because they could see what was going to be happening, when it was happening and what would be happening next, so they knew what to expect.  

They also got a sense of the management of the program and how that was going to operate. So, in case something happens that we don't expect to happen, how are we going to troubleshoot? How are we going to know something was off in any way? Having these steps all laid out in a flow was very, very helpful for building confidence among quite a number of people, which then enabled us to move forward.  

This last question is an important one. We started with clearly explaining what you want to do and why. You may or may not also be asked how confident you are that it's going to work. Remember, we are the insights professionals. We are the experts. We understand methodologies, data, qualitative output or whatever it is that we're trying to change. We also know that research is never perfect and that there are no guarantees. We also tend to be our own toughest critics. But what I'm sharing here is some data from a customer survey that I was actually able to run. We were building credibility step by step, and it was another opportunity for us to go small before we went big.  

The portal that I shared before was actually part of a larger platform migration that we made, and I was able to survey some customers who had gone through that change. I tested a few different customer satisfaction questions; likelihood to recommend was one of them, but we tried a couple of other ones. I was also able to work with our finance department to track customer attrition occurring after the platform migration. You may view these data differently from how I view them, but for me, the chart was persuasive. This gave me confidence that we were going to learn something that we didn't know about customers, which would help our business in an important way.  
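
Editor's note: For readers unfamiliar with the NPS metric Ed references, below is a minimal sketch of how a Net Promoter Score is typically calculated from 0-10 likelihood-to-recommend responses, under the standard convention of promoters (scores 9-10) minus detractors (scores 0-6). The code and figures are illustrative only and are not drawn from Mutual of America's study.

# A minimal sketch of a standard NPS calculation; sample data is hypothetical.
def net_promoter_score(responses):
    """Return NPS from 0-10 likelihood-to-recommend scores."""
    if not responses:
        raise ValueError("No responses provided")
    promoters = sum(1 for score in responses if score >= 9)   # scores 9-10
    detractors = sum(1 for score in responses if score <= 6)  # scores 0-6
    return 100 * (promoters - detractors) / len(responses)

# Hypothetical survey results
sample = [10, 9, 9, 8, 7, 7, 6, 5, 10, 3]
print(f"NPS: {net_promoter_score(sample):.0f}")  # prints "NPS: 10"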

So, you may have to find your own confidence to support your initiative from another organization. It might not be an internal kind of test run or pilot. It may be something from another company. It may be something from a different industry. It may be academic research. The important point is that you should be confident. You should truly believe that what you are proposing is going to work and that it's going to benefit your organization.  

So, to summarize: at the end of the day, this is what worked for us, right? This is how we got there. This is how we were able to move forward and ultimately launch our NPS customer satisfaction program here at Mutual of America.  

We did not recognize as we were going through the process that these questions were being asked, but with the benefit of having taken a step back and done that retrospective analysis, we put this together, and this is our template for moving forward. We are pursuing new and different insights initiatives here. It continues to be a challenge. It always is. Change is always challenging at an organization, but this template of questions that we are working against is proving itself to be a valuable methodology. It's working for us, and I am confident that it will work for you. So, I hope that you will give it a try. I hope that you will have similar success driving change at your organization. And thank you very much for joining today.