Integrating survey programs with behavioral data
Editor's note: This article is an automated speech-to-text transcription, edited lightly for clarity.
Roku created a survey program that allows for behavioral data to be integrated into the survey. Now, the team can easily target groups based on actions or personal information.
Alex Strauss, senior consumer insights specialist at Roku, explained the process of creating this program, its benefits and drawbacks, and what Roku plans to do next during this 2025 Quirk’s Event – Virtual Global session.
Session transcript
Marlen Ramirez
Hi everyone and welcome to the session, “Integrating survey programs with behavioral data.” I'm Quirk’s News and Content Editor, Marlen Ramirez.
Before we get started, let's quickly go over the ways you can participate in today's discussion. You can use the chat tab to interact with other attendees during the session, and you can use the Q&A tab to submit questions for the presenters during the session.
Our session is presented to you by Roku. Enjoy the presentation!
Alex Strauss
Hello. Thank you, everyone, for joining today's session. And thank you so much to Quirk’s for inviting me to present on this topic.
My name is Alex Strauss from Roku. I work on our UX research team and today I'm going to be presenting on how we at Roku have connected our behavioral data, what our users do with our email survey program and some of the benefits of building that program for collecting meaningful insights.
A little bit about Roku. If you are not familiar, Roku is the number one selling TV operating system in the U.S. I imagine many of you participating in today's call have a TV home screen that looks something like the picture on the right there.
About me. I've been at Roku about five years and in insights for a bit over 10. I've kind of run the research gamut, starting at a brand marketing research agency, then moving to a small social sports startup, then to ESPN focusing on content research. And now, at Roku, I focus on UX research. Across the board at those different companies, I've worked to build out their email survey programs, which has given me a lot of the learnings that I'll go through today. Within Roku, I focus primarily on the Roku Channel and our Roku Sports offering.
Today's presentation, I'm going to go through a few different topics just starting with why do we even set up this type of program, a few use cases for these types of research programs, how you go about building them and some best practices for including that data and maintaining privacy. Then at the end I'll call out a few watch outs and learnings and kind of next steps, where we're going with our program and I'll leave a minute or two for questions at the end.
Let's start with the ‘why.’ Why is it important that we integrate our behavioral data into our email surveys and what advantages are at play?
To start, there are several major pitfalls that come up when conducting typical email surveys.
First is simply targeting users or customers. When you're targeting sample, you might send out a lot of invitations and only get a minimal response, and if you're looking for a really specific and nuanced audience, it can be very expensive to reach them.
Beyond that, everyone knows that you only get a moment of people's time, especially if it's unincentivized. So, it's really important that you keep your surveys short and sweet and make sure they're not bogged down by behavioral questions. It's really hard when you have to ask a whole screener of questions before you can even get to the important insights you're after.
The last thing is, I think we can all sympathize with, it's really hard for any user or customer to recall exactly what they did days, weeks, months ago. I don't even remember what I had for lunch yesterday. How am I supposed to tell you what I streamed or what actions I took on the Roku platform a month ago or three weeks ago? That can be really challenging for any customer or user.
On the other side of the coin, now more than ever, there is a wealth of data available to most companies. It can be as simple as basic PII about people, or the actions they've taken on your platforms, how often they're engaging with you and what they're spending with your product or service.
There's so much data being collected, not just by Roku but across so many brands now, and it's important that you take advantage of it, because doing so solves for many of these issues.
On the issue of targeting difficult-to-reach audiences, when you have your behavioral data linked, you can look at the specific users who took a specific action, pull them out and send them a survey. That way you're not guessing and hoping that you're talking to the right people; you know for certain that you are.
As for survey length, by appending that behavioral data on the backend, you can shorten your surveys because you no longer have to ask those questions. Once you've freed up that space, you can get into the meat of your survey and ask the deeper, more perceptual or emotional questions you may not have had room for if you had to run a long, detailed screener.
The last thing is that you're less reliant on your users to provide feedback that may or may not be accurate. If you're connecting real data to the backend of your survey, you know for sure that your user took that specific action, how many times and for how long, and you don't have to guess and hope they gave you the right answer when you're cutting data or making assumptions based on actions taken.
When you're building these programs, there's a few places throughout the research cycle where they can be really valuable and help to inform and generate meaningful insights.
Upfront, it can be really helpful for foundational studies when you want to target users who are taking a certain action to see how you can improve and get better.
An example at Roku is that we looked at people who are watching our free live channels, and we wanted to see how we could make that experience better. So, we targeted those viewers to see where the pitfalls in their experience were and where there were opportunities for us to develop features that make the experience better or closer to cable.
Then further in the cycle, once you're starting to develop features or roll things out to market, you can get direct feedback from those who are engaging with them. So, if you see someone bought a new product that you just released or engaged with a feature that was just put on the platform, you can target those people and make sure that you understand, among those who took the action you care about, what their initial perceptions are.
Then on the backend, after launch, you can make sure that you're measuring and keeping tabs on your users and the actions they're taking. You can see whether perceptions vary between heavy and lighter users of a feature, and keep tabs on that over time as you make additional iterations and improvements.
A really good case study of how we've built this program at Roku is with our A/B testing program.
On the Roku platform, we run thousands of A/B tests a year that users may or may not explicitly notice or acknowledge, but some users do have a propensity to notice them. So, when we run those A/B tests, it's important that we not only understand how their actions change as a result of that test, but also how their perceptions change.
Getting the behavioral data to see how their actions change is only a piece of the pie. It is very possible we could develop a feature that moves the needle when it comes to revenue or streaming engagement, but users hate it. So it's our job as researchers to make sure we have a pulse on user sentiment and make sure we're not releasing features that are going to create a negative experience.
So what we do is we target people who are allocated to these A/B tests and we send them a survey. We capture in the control cell versus the test cell what their perceptions are of Roku as well as the specific feature that we're testing, and we observe those shifts. So, while we're observing shifts in behavioral data, we're also observing shifts in their perceptions as well, and making sure that we keep that pulse. In our mind, we’re the last line of defense for whether or not we release a feature. If we're shifting the needle on the metrics that we want to shift, that's great, but if it's not creating a positive user experience, then we're the ones that are going to say, let's pause and iterate and see if we can do it a little bit better.
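For illustration, here is a minimal Python sketch of that control-versus-test read on the survey side; the cells, scale and scores are made up, not Roku's actual metrics.

```python
# A minimal sketch of the control-vs-test read on survey data.
# The cells, scale and scores here are illustrative, not Roku's actual metrics.
import pandas as pd

responses = pd.DataFrame({
    "cell": ["control"] * 3 + ["test"] * 3,
    "feature_satisfaction": [4, 3, 4, 5, 4, 5],  # hypothetical 1-5 ratings
})

# Alongside the behavioral read on engagement, this keeps a pulse on sentiment.
print(responses.groupby("cell")["feature_satisfaction"].mean())
```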
Let's talk about how we built this program at Roku. It does take a lot of steps. It takes a lot of collaboration, but when you do all of these things upfront, it really sets you up for success.
So, there's four key areas I've identified as far as the teams and the people that you need to collaborate with for success.
The first, and probably most important, is data and analytics. To build that bridge between your behavioral data and your survey data, you need to understand where that data is coming from and work with that team to pull it through into however you're sending and collecting data for these surveys.
The next team that's really important to engage with is your CRM and/or marketing teams. You need to deliver communications to these users in some way. Working closely with your CRM or marketing team makes sure that you're connecting and contacting users in a meaningful way, in an engaging way and in an on-brand way. So, they're an important team to engage with.
The least fun one is legal. You definitely have to engage with legal just to make sure that you're in compliance with policies. When you're collecting and connecting user data, it's really important that you comply with privacy policies. Those can vary across countries. And so, it's really important that legal is brought in from the start and then kept in as your process develops.
Then the last is the research vendor. Whether you're running it yourself or using a full-service vendor, making sure you're compliant with the data you're sharing and the connections you're building, and that it's all above board, is a really important part of the process.
Let's talk about those a little bit more in detail, particularly analytics. So the first thing that you're going to want to set up with them is how do you pull out a list of users from your database?
You want to establish a process of how you're going to query your database and understand the variables that you can target, and how you're going to pull them out. So that's the front end.
Then on the back end, you're going to want to figure out what data you can connect to those users to help put that ‘what’ and ‘why’ together.
In any survey that we send, we're not only targeting the users based on what actions they've taken. We're also adding several columns of behavioral data to our file that we can connect, so that we can understand when we're doing our analysis, what they said, and also what they did.
So, when building that up, my easy tip here is just make friends with analytics because you're going to need a few favors to set this up, and so you're going to need to ask nicely.
The other big thing that you need to set up with analytics is what your data looks like. How is it configured? How are you able to bring it through in a way that's meaningful, easy for your analysis and easy to repeat when you're sending out multiple surveys? Setting up that framework is really important.
What does that data look like to be able to target the users? What are some consistent variables you're going to want to pull through to be able to weight your data so that you make sure that your audience that's answering your surveys is aligned with what the actual population you're intending to represent looks like? What does the data you're trying to append look like?
And so, when you're doing that, there's some considerations.
One of the things that we will sometimes pull through for our surveys is someone's tenure on Roku: how long has their account existed? We break that into four buckets that we've seen as representative of different cohorts in the life cycle. So not only do we pull through tenure in an absolute fashion, we also cohort it into buckets to make profiling easier on the backend.
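As an illustration of that bucketing step, a minimal Python sketch might look like the following; the cutoffs, labels and column names are hypothetical, not Roku's actual cohorts.

```python
# A sketch of keeping tenure both as an absolute value and as a cohort bucket.
# The cutoffs, labels and column names are hypothetical, not Roku's actual cohorts.
import pandas as pd

users = pd.DataFrame({
    "account_id": ["a1", "a2", "a3", "a4"],
    "tenure_days": [45, 200, 700, 2100],
})

bins = [0, 90, 365, 1095, float("inf")]                      # illustrative cutoffs
labels = ["0-3 months", "3-12 months", "1-3 years", "3+ years"]
users["tenure_cohort"] = pd.cut(users["tenure_days"], bins=bins, labels=labels)

print(users)
```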
Another consideration for this data is how often it's refreshed. You want to make sure you're pulling through the latest and greatest data, but you also don't want to try to pull data that's not yet available.
Another key tip at this stage, especially with PII considerations, is that you want some type of blinded unifier. We call ours Account ID. It lets you connect your survey data with your behavioral data, so that if you want to add additional data after the fact, you have a way to figure out who that user was and append the data post hoc.
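Here is a hedged sketch of what that post hoc connection can look like, assuming a blinded account ID on both files; the column names and values are made up for illustration.

```python
# A sketch of appending behavioral data post hoc via a blinded unifier.
# Column names and values are made up for illustration.
import pandas as pd

survey = pd.DataFrame({
    "account_id": ["a1", "a2"],          # the blinded unifier, no PII
    "satisfaction": [4, 2],
})
behavior = pd.DataFrame({
    "account_id": ["a1", "a2", "a3"],
    "streaming_hours_30d": [35.5, 4.0, 12.0],
})

# Left join: every survey response keeps its row; behavioral columns come along.
combined = survey.merge(behavior, on="account_id", how="left")
print(combined)
```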
Then the last step in working with analytics is figuring out how you get the data into your survey platform to field the surveys. There are a few ways that we do it at Roku.
One, and I'd say the least sustainable, is to just have analytics pull the list for you, but that requires a lot of ongoing support. The way that we've worked with analytics more seamlessly is by creating a number of standardized queries so that a SQL novice like myself can do it. I'm not a coding master, but I'm able to go into SQL myself, leverage these standardized queries that we've built in partnership with analytics and pull the list myself.
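As a sketch of the "standardized query" idea, the template below keeps the query fixed while only the parameters change per study; the table and column names are hypothetical, and SQLite stands in for a real data warehouse.

```python
# A sketch of the "standardized query" idea: a fixed template where only the
# parameters change per study. The table and column names are hypothetical,
# and SQLite stands in for a real data warehouse.
import sqlite3

LIST_PULL_TEMPLATE = """
SELECT account_id, email, tenure_days, streaming_hours_30d
FROM user_activity
WHERE feature_used = ?
  AND last_active_date >= ?
"""

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE user_activity (
    account_id TEXT, email TEXT, tenure_days INT,
    streaming_hours_30d REAL, feature_used TEXT, last_active_date TEXT)""")
conn.execute(
    "INSERT INTO user_activity VALUES ('a1', 'u1@example.com', 200, 35.5, 'live_tv', '2025-01-10')"
)

# A SQL novice only supplies the parameters; the query itself stays standardized.
rows = conn.execute(LIST_PULL_TEMPLATE, ("live_tv", "2025-01-01")).fetchall()
print(rows)
```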
The other thing we have at Roku is a tool we built for our marketing program, called our DMP, which they use to send out marketing emails to targeted audiences. We were able to leverage that same tool to target audiences for surveys. So, if something like that is already built in house, it can be possible and beneficial for a novice SQL user like myself to leverage it.
When I do hit those pitfalls, ChatGPT has been very helpful for fixing some of my code. I will say it sometimes over-engineers things, which can be challenging. So, just make sure you run any ChatGPT-generated queries by a professional analytics person to make sure nothing is erroring.
So when it comes to CRM and marketing, I found that there's two ways to approach this.
The first is to leverage the email service through the vendor you're using for research.
So, we use Qualtrics, and we used to send our emails directly through Qualtrics. The benefits on that side are that it's easy, it's built into the program and there's a layer of protection because you're sending via them and not your own platform, so you're not hurting your users as much.
The challenges on that side are that the surveys are more standardized and there's less flexibility to brand them. There's also less general email support. They're a survey platform, not an email platform, so they just don't have the same level of handholding that you might need.
The other option, which is what we do now and find has been working well, is working with whatever platform your CRM or marketing team uses.
The reason that can be really beneficial is that if your CRM and marketing teams are sending millions or billions of emails a year, the number of emails you're sending to invite people to take surveys is going to be a tiny fraction of that. You can create consistent branding across all of your marketing messaging, and you have a little bit more support.
The main liability there is that you don't want to do anything that's going to hurt the user experience. Also, make sure that marketing’s deliverability rates are not going to be impacted. The last thing you want is an angry email from marketing, so you want to make sure that you're in the loop with them.
The other key thing in working with CRM and marketing is making sure that you're sending from a company domain. That can be super helpful.
Keep an eye on how often you're sending things and the quality of your emails. Is the verbiage you're using consistent, and is it going to get flagged as spam? If someone opts out of your surveys, make sure you're not hitting them multiple times. That's a good way to hurt your deliverability.
Make sure it's engaging. Make sure that there's some kind of call to action that makes it easy for them to click into your survey. My little tip here is to have a liberal exclusion policy.
At Roku, we don't contact anyone more than once a quarter, so we have a 90-day exclusion policy so that no one's receiving more than four survey invites in a year. In most instances, people are only receiving one, maybe two in a year.
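A minimal sketch of how such an exclusion rule can be applied when assembling a send list; the field names and dates are hypothetical.

```python
# A sketch of applying a 90-day exclusion rule before pulling a send list.
# Field names and dates are hypothetical.
from datetime import date, timedelta

candidates = [
    {"account_id": "a1", "last_invited": date(2025, 1, 5)},
    {"account_id": "a2", "last_invited": date(2024, 8, 1)},
    {"account_id": "a3", "last_invited": None},  # never invited before
]

cutoff = date(2025, 3, 1) - timedelta(days=90)

eligible = [
    c for c in candidates
    if c["last_invited"] is None or c["last_invited"] < cutoff
]
print([c["account_id"] for c in eligible])  # only a2 and a3 remain
```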
With legal, there are a lot of considerations. And again, it's not the most fun part, but you have to take care of these types of things. You want to make sure that things like the language you're using in your emails, the language you're using in your questions, how you're connecting the data, where you're storing the data and who you're allowed to contact are compliant with international policies.
There are so many different things to consider, and it's important that you work with legal to understand, through that process, where you need to watch out and what you need to do. Especially if you're offering sweepstakes or asking sensitive questions about things that may be part of an unreleased feature, you might need to set up an NDA with your research vendors to make sure they're compliant, which we'll get to in one of the subsequent slides. It's important that legal is involved throughout this process to make sure you're not in trouble.
The last tip here is that policies change, GDPR and those types of things. There are constant updates to privacy policies and laws, so it's important that you check in with legal periodically to make sure that whatever you set up is still a compliant process.
Then when it comes to the vendors, you just want to make sure that you're protecting your data. This kind of goes hand in hand with legal but is a little bit unique. So just when you're pulling your data through from your platform to a survey platform, you want to make sure that it's secure, that you're not sharing what you're not allowed to, and make sure that data is protected and properly encrypted. Make sure the NDAs and data privacy agreements are all set up properly.
There are a lot of things that can go wrong when passing PII back and forth. It's important that you figure out the minimum viable pathway for passing that data so there's as little chance for leaks as possible.
Once your program is set up, let's talk about actually executing an email lifecycle when you're sending these surveys out with the data connected.
There's kind of four steps in this process.
First, as I mentioned in the analytics part, you have to identify the people and pull that list. The second part is programming the survey and making sure the data's properly connected and protected. The next part is reaching out to the people. The last part is connecting the data and putting it all together at the end.
So, when you're setting it up, you have to define your audience. Who are the people that you're looking for? What actions do they take? What kind of profile of people are you looking for in this audience? And so you want to make sure you set up the proper parameters to pull those in.
On the back end, I've found it's much easier and involves fewer steps if you connect the data you want upfront, as opposed to trying to connect it after the fact, which can cause a number of other issues. So, it's better to think proactively about the data you're going to need for your analysis. When you pull a list of users, you pull it one time with all the data you need, you upload that and then you're done engaging with analytics and your database.
Once you're programming the study, you need to figure out how you're going to connect that data.
The best proxy I can give is what we do in Qualtrics with embedded data fields. We set up embedded data fields to tell Qualtrics to pull through the behavioral data columns, and that's how we connect each survey response to the behavioral data so we can do that combined analysis.
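As a rough illustration (not Roku's actual pipeline), a contact file can carry behavioral columns alongside the contact fields; how those extra columns are mapped to embedded data fields is configured in the survey platform itself, and the column names below are hypothetical.

```python
# A rough illustration of a contact file that carries behavioral columns
# alongside contact fields. Mapping those extra columns to embedded data
# fields is configured in the survey platform; column names are hypothetical.
import csv

contacts = [
    {"email": "u1@example.com", "account_id": "a1",
     "tenure_cohort": "1-3 years", "streaming_hours_30d": 35.5},
    {"email": "u2@example.com", "account_id": "a2",
     "tenure_cohort": "0-3 months", "streaming_hours_30d": 4.0},
]

with open("survey_contact_list.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(contacts[0].keys()))
    writer.writeheader()
    writer.writerows(contacts)
```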
However you're programming your study, it's important that you're able to create some linkage to that data and properly pull it in. Then, when you contact the users, it's really important, and I mentioned this as one of the major advantages of these email programs, that you don't have to ask as many behavioral questions. But you still should ask some, because at the end of the day you want to validate that the person answering the survey is the person who took the action. So, we always ask some validating questions.
Also, if you make assumptions, that can come off as very creepy. It's like, "We heard you watched this movie last week, tell us about your experience." That's not really how you want to come across to people, and it can really hurt your response rates.
Then lastly, when you're setting up these emails, you really want to create templates that maximize response. We've played around with our subject line, body and email domain to maximize how many people respond to our surveys. It's important that you pick and choose and A/B test for yourself to maximize response. Obviously, if you can include incentives or sweepstakes, that can be really beneficial as well.
The pitfall here, especially if you're not offering incentives, is that if you send out a hundred thousand invites, you usually only hear back from about a thousand people. It's a 1% to 2% response rate for these types of initiatives.
Then where the rubber meets the road, where you really deliver on these studies, is when you conduct the analysis and put it all together at the end.
As I alluded to when discussing how you configure your data, weighting is really important. One of the first steps we take in any analysis is to apply weights using those behavioral data flags so that respondents match the population we selected for the study.
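A minimal sketch of that weighting step, assuming a single hypothetical behavioral flag (tenure cohort) and made-up shares: the cell weight is simply the population share divided by the respondent share.

```python
# A sketch of cell weighting on one hypothetical behavioral flag: weight equals
# population share divided by respondent share, so weighted respondents match
# the population that was selected for the study. Shares are made up.
import pandas as pd

respondents = pd.DataFrame({"tenure_cohort": ["3+ years"] * 60 + ["0-3 months"] * 40})
population_share = {"3+ years": 0.45, "0-3 months": 0.55}  # from the behavioral pull

respondent_share = respondents["tenure_cohort"].value_counts(normalize=True)
weights = {c: population_share[c] / respondent_share[c] for c in population_share}
respondents["weight"] = respondents["tenure_cohort"].map(weights)

# Weighted cohort shares now match the targeted population (0.45 / 0.55).
print(respondents.groupby("tenure_cohort")["weight"].sum() / len(respondents))
```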
When you're using that behavioral data in your study, it's also important that it's used as filters, not findings. You're only using a subset of that behavioral data, so it's not something you want to report on its own. You want to report on what you found in the survey and leverage the behavioral data to explain why people who behave a certain way feel that way.
Then, when you're downloading the data, make sure you're really careful about any PII or anything else you're passing through, because you don't want to store emails or other sensitive information on your computer after the fact.
So, a few last notes here. There's a few things that we've encountered over the years with these programs and want to call those out.
One, when you're sending to your own users versus a panel sample, you can get a much more biased set of people choosing to answer your survey.
Ours typically skew older. So, look for ways to manage your quotas, offer incentives and deliver surveys in different ways that will diversify who's responding.
Another key consideration, and I alluded to this as well, is that when you're sending a survey you can have an account-versus-user problem. You're targeting people based on the actions they took, but for Roku, there are often many people watching a TV. So, the person who took the action may not be the account holder who receives the email. It's important that you validate that whoever is answering the survey is the one who took the action that caused them to receive it.
Then the last consideration is that, though I mentioned it's typically 1% to 2%, that can vary a lot based on who you're trying to talk to. If you're talking to your most engaged users about a very specific feature, you might have a higher response rate. If you're talking to people who lapsed and are no longer active on your platform, you should probably expect a fraction of that, because you've already kind of lost them. So, it's important that when you're pulling the list of people, you accommodate for how many you're going to need based on who you're trying to talk to.
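A back-of-the-envelope sizer for that list pull, under whatever response rate you expect for the audience in question; the numbers are illustrative only.

```python
# A back-of-the-envelope sizer: how many invites to pull given the completes
# you need and the response rate you expect for that audience.
import math

def invites_needed(target_completes: int, expected_response_rate: float) -> int:
    return math.ceil(target_completes / expected_response_rate)

print(invites_needed(1000, 0.01))    # ~100,000 invites at a 1% response rate
print(invites_needed(300, 0.005))    # lapsed users may respond at a fraction of that
```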
My last note here is just some of the areas that we're trying to continue to optimize.
We're trying to automate a lot of these processes, trying to leverage AI to standardize the reporting or the building of these surveys, things like that.
We're also thinking about different ways that we can deliver these surveys, whether that's directly on our platform or through mobile execution. Again, trying to diversify response and make sure we're talking to all different types of people, but there's more iterations that we're trying to do to make this a diverse and meaningful experience.
Thank you so much everyone. I'll be in the chat answering any questions you have, and please enjoy the rest of the conference. Thank you.