Editor’s note: Jen Golden and Ashley Harrington are project managers at market research and consulting firm CMB, Boston. This is an edited version of a post that originally appeared here under the title, “5 key takeaways from The Quirk’s Event.”

Last week, we spent a few days networking with and learning from some of the industry’s best and brightest at The Quirk’s Event. At the end of the day, a few key ideas stuck out to us, and we wanted to share them with you.

1. Insights need to be actionable: This point may seem obvious, but multiple presenters at the conference hammered it home. Corporate researchers are shifting from a largely separate entity to a more consultative role within the organization, so they need to deliver insights that directly inform business decisions (vs. passing along a 200-slide data dump). This mindset should flow through the entire life span of a project – starting with a questionnaire that truly speaks to the business decisions at hand (and cuts out the fluff that may be nice to have but is not actionable) and carrying through to thoughtful analysis and reporting. Taking this approach helps ensure final deliverables aren’t left collecting dust and are instead used to drive engagement across the organization.

2. Allocate time and resources to socializing insights throughout the organization: All too often, insightful findings are left sitting on a shelf when they have the potential to be useful across an organization. Several presenters shared creative approaches to socializing the data so that it lives on long after the project ends. From transforming a conference room with life-size cut-outs of key customer segments to creating an app employees can use to access data points quickly and on the go, researchers and their partners are getting creative with how they share findings. Effective researchers treat research results as a product to be marketed to their stakeholders.

3. Leverage customer data to help validate primary research: Most organizations have a plethora of data to work with, ranging from internal customer databases to secondary sources and primary research. These various sources can be leveraged to paint a full picture of the consumer (and help validate findings). Etsy – a peer-to-peer e-commerce site – talked about comparing data from its customer database to its own primary research to see whether what buyers and sellers said they did on the site aligned with what they actually did. For Etsy, past self-reported behaviors (e.g., number of purchases, number of times someone “favorites” a shop) aligned strongly with its internal database, but stated future behavior (e.g., likelihood to buy from Etsy in the future) did not. Future behavior may not be something we can reliably predict by asking directly in a survey, but that data can still be useful as another way to gauge customer loyalty or advocacy. A note of caution: if you plan on doing this type of data comparison, make sure the wording in your questionnaire aligns with the fields in your existing database. This ensures you’re getting an apples-to-apples comparison.
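In practice, this kind of validation is a straightforward join of survey responses against database records on a common identifier. The sketch below is a minimal, hypothetical illustration (the column names, respondent IDs and mismatch threshold are all invented for the example, not drawn from Etsy’s actual work):

```python
import pandas as pd

# Hypothetical self-reported purchase counts from a survey
survey = pd.DataFrame({
    "respondent_id": [1, 2, 3, 4],
    "reported_purchases": [5, 0, 12, 3],
})

# Matching records pulled from the internal customer database
database = pd.DataFrame({
    "respondent_id": [1, 2, 3, 4],
    "actual_purchases": [4, 0, 11, 9],
})

# Join the two sources on the shared respondent identifier
merged = survey.merge(database, on="respondent_id")

# Flag respondents whose self-report differs sharply from the database
# (threshold of 2 purchases is arbitrary for this illustration)
merged["gap"] = (merged["reported_purchases"]
                 - merged["actual_purchases"]).abs()
merged["mismatch"] = merged["gap"] > 2

# A simple agreement metric: correlation between the two measures
agreement = merged["reported_purchases"].corr(merged["actual_purchases"])
print(merged[["respondent_id", "reported_purchases",
              "actual_purchases", "mismatch"]])
print(f"reported vs. actual correlation: {agreement:.2f}")
```

The same join only works if the survey question and the database field measure the same thing over the same time window – which is exactly why the questionnaire wording needs to align with the database definitions.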

4. Be cautious when comparing cross-country data: Stakeholders in a multi-country study will typically ask for a “global overview” or cross-country comparison, but this can lead to inaccurate recommendations. Most researchers are aware of cultural response biases such as extreme response – e.g., Brazilian respondents often rate toward the high end of rating scales while Japanese respondents tend to rate toward the low end – and acquiescence – e.g., Chinese respondents often have a propensity to want to please the interviewer. Keep these biases in mind when delving into the final data. A better indication of performance is to compare a brand to its in-country competitors or to look at in-country trend data.
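One simple way to make the in-country comparison concrete is to index each brand’s score against the average of all brands measured in that country, so performance is judged relative to local rivals rather than on a raw scale distorted by response style. The numbers and brand names below are invented for illustration:

```python
import pandas as pd

# Hypothetical satisfaction ratings (0-10) for our brand and two
# competitors, collected in two countries with different response styles
ratings = pd.DataFrame({
    "country": ["Brazil"] * 3 + ["Japan"] * 3,
    "brand":   ["Ours", "CompA", "CompB"] * 2,
    "score":   [8.9, 8.4, 8.6, 6.1, 6.5, 5.8],
})

# Raw cross-country comparison is misleading: Brazilian respondents
# rate everything higher. Index each brand against the in-country
# average instead (100 = at parity with the local market).
ratings["country_avg"] = (
    ratings.groupby("country")["score"].transform("mean")
)
ratings["index_vs_market"] = (
    ratings["score"] / ratings["country_avg"] * 100
)
print(ratings[["country", "brand", "index_vs_market"]])
```

In this made-up data, the raw Brazilian scores dwarf the Japanese ones, yet the index shows the brand leading its local competitors in Brazil while sitting slightly below parity in Japan – the kind of story a raw cross-country table would hide.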

5. Remember your results are only as useful as your design is solid: A large number of stakeholders invested in a study’s outcome can lead to a project designed by committee, since each stakeholder will inevitably have different needs, perspectives and even vocabularies. A presenter shared an example from a study that asked recent mothers, “How long was your baby in the hospital?” Some respondents thought the question referred to the baby’s length, so they answered in inches. Others thought the question referred to the baby’s duration in the hospital, so they answered in days. Throughout the process, it’s our job to ensure that all of the feedback and input from multiple stakeholders adheres to the fundamentals of good questionnaire design: clarity, answerability, ease of response and lack of bias.