Editor’s note: Jayne Krahn is vice president of research operations for market research firm Kantar Media, Austin, Texas.

Having replaced our online data presentation system last year, I want to share five lessons we learned along the way. A few were painful but, by being flexible and correcting our mistakes, we are now positioned to better serve existing customers and have grown our business by entering new markets.

Our division produces syndicated research studies, which typically means a large number of respondents and extensive content. Historically, our data was uploaded into third-party media-planning and crosstabulation software that clients worked with on their own to complete various analyses – based on their business needs – throughout the year.

Although we had been using online dashboards for certain clients for some time, we found them very time-consuming to set up and quality-control. All data had to be analyzed in SPSS or another crosstab program and copied into the database with various lookups. In addition, all possible charts and filters of interest (and there are many with syndicated studies and different types of clients) had to be pre-determined in the design phase. Even a combination of only two conditions – like high blood pressure and diabetes – had to be specified in advance. Nothing could be done on the fly. Adding new filters for a particular client required extensive rework for every table and chart in the dashboard. Designing a dashboard from scratch would take weeks and even updating a new wave of data often took many days.
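To make concrete how quickly that pre-determination grows, here is a minimal sketch in Python; the 200 candidate filter variables are an illustrative assumption, not a figure from our studies.

    from math import comb

    # Illustrative only: suppose a study exposes 200 candidate filter variables.
    n_filters = 200

    # Every two-condition combination (e.g., high blood pressure AND diabetes)
    # would have to be anticipated and built at design time.
    print(comb(n_filters, 2))  # 19900 pre-built pairs, before any three-way combinations

Even at that modest scale, pre-building every pair is impractical, which is why on-the-fly filtering mattered so much to us.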

One of the first things we grasped when looking to implement a new online presentation system was that we had been accommodating the deficiencies of our online dashboards and could benefit from one designed specifically for our industry. Further, we realized that our client base, their needs and our study content had evolved in recent years. There is more data available but clients want insights they can easily read and interpret. Instead of performing deep-dive analyses to find the specific information that interests them, they want to get the pertinent information directly.

It’s easy to be lured in by bells and whistles but at the end of the day, behind the charts and graphs, most of what we saw were databases that would leave us with the same challenges, including how to get respondent-level survey data into the system and easily perform analysis. In our case, it’s crucial that any study measuring media audiences can be replicated and weighted accurately. In the end, while many of the graphics and online platforms we evaluated were appealing, we chose Dapresy Pro from Dapresy.

We overcame many challenges in deciding to replace our online data presentation system and ultimately completing the transition, and we hope you can benefit from the five lessons we learned during the experience.

Lesson No. 1: Spend enough time with a tool and you may not even realize that you are accommodating it, rather than the reverse.

Despite our initial planning process, we tried loading all our data into the new software and using it like a crosstab program. Our studies can have over 20,000 respondents, each with 6,000 data variables. That is a massive dataset, so it isn’t realistic to use the tool that way – nor is there any reason we would want to. So while we had thought through our tool requirements, we certainly hadn’t spent enough time reevaluating our processes. First, we recommend creating a storyboard around the data you want to include and the salient points you want to make. Next, share it with others – sales, marketing, friendly clients – for feedback. As data geeks we believe everything we produce is vital but a dashboard needs to communicate the key points. Taking this time really helped us fine-tune and reduce the data we needed to load into the tool.
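For a rough sense of the scale involved, here is a back-of-the-envelope sketch in Python; the 500-variable shortlist is a hypothetical figure used only to illustrate the effect of storyboarding.

    respondents = 20_000
    variables = 6_000
    print(respondents * variables)      # 120,000,000 raw data points in the full study

    storyboarded = 500                  # hypothetical shortlist after storyboarding
    print(respondents * storyboarded)   # 10,000,000, a far more manageable load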

Lesson No. 2: Step back and consider revising past processes to take advantage of what your new software can do for you.

After a few months of delays, we had the data loaded into our new dashboard tool but we still hadn’t taken advantage of what it could offer. In fact, our initial dashboard looked like a replica of what we had been delivering years before: boring and flat. When we presented our sales team with samples of how we envisioned our new product, they kicked it back to us and told us there needed to be more interesting analysis and improved graphics. That meant that we, as a small research team, needed to scale up our graphics and analytic skills.

Lesson No. 3: The research industry historically hasn’t needed sophisticated graphics. With the advent of infographics, clients expect more. Agencies would do well to hire staff with graphics skills.

Once we reset our thinking and figured out what our clients wanted, the solution was straightforward. Luckily, we were able to hire someone with an advertising degree who loved data. You will want someone who can work with and create images in a program such as Adobe Illustrator. In fact, two of our staff, working part-time, completely redesigned our online presentation system in less than three months.

Toward the end of this process, we began internal testing, conducted a soft launch with a small number of key clients and continued to obtain feedback from our vendor.

Lesson No. 4: Roll out new systems slowly and seek feedback from trusted sources before going live.

In the fall of 2014, we released our first three studies with the new online presentation system. Since our clients weren’t familiar with the new software, we developed user guides for each study, with screen shots and detailed instructions on how to find the data they were accustomed to as well as the new analyses. In addition, we maintained bi-weekly reminder trainings for our own staff.

Lesson No. 5: If you don’t use it, you’ll lose it! Consider regular training to maintain everyone’s efficiency and effectiveness.

Our team still meets quarterly with our sales and client service teams to review current dashboards and brainstorm ways to improve what we deliver. They are the ones who work with clients and get feedback from our users on a regular basis. The dashboard team appreciates this and it gives them a chance to expand their skills. In addition, we contact our vendor’s support team regularly to learn what is possible with the new system.

Looking back, it’s surprising to see some of our hiccups. Isn’t 20/20 hindsight wonderful?
