Sara Belt and Peter Gilks respectively lead the Creator and Free Revenue Product Insights teams at Spotify. In this article, Sara will explore the practice of User Research at Spotify, and Peter will lay out how Data Science and User Research work together to drive product decisions.
Part 1. User research at Spotify
Sara Belt, Head of Creator Product Insights
When I say I work in user research at Spotify, folks’ minds tend to travel in two directions: they figure I research either the kinds of music people listen to or the music itself: melodies, harmonies, rhythms, and how they impact people. Because, you know, what else is there to research with the world’s biggest music player?
Over the past few years, Spotify has grown to be much more than that, and the research scope has grown with it. My team, for example, is focused on artists and the music industry ecosystem—how Spotify can help artists grow an audience, express their creativity, and thrive. We research fandom and how it manifests on and outside of Spotify. We study the creative process and the daily hustle of being a musical artist. Spotify’s product insights team is over 100 people strong and we explore topics ranging from intuitive music interfaces for kids to differences in musical traditions of India and Brazil. We study successful strategies for brands to leverage music in ads and how personal taste should be reflected in music recommendation algorithms.
The evolution of Spotify from a music player to a creative platform mirrors a trend we see throughout technology. As the concept of a product expands and the way we build products changes, we encounter a set of challenges and opportunities for user research as a practice that require new tools and ways of working. Let’s take a look at six themes that define research at Spotify.
- Designing interactions between people. We see the rise of products that facilitate interactions and exchanges between people who take on very different but, often, intertwined roles: teachers and students, hosts and guests, drivers and passengers, doctors and patients, writers and readers. At Spotify, we design experiences for music lovers, artists, small businesses, record label employees, and internal playlist editors who all cross paths within a single product. Beyond deeply understanding “the user”, research is faced with complex optimization questions across different types of users with diverse incentives, needs, and abilities. Instead of focusing most of our energy on honing the interactions between people and technology, we work on understanding and enabling interactions between people.
- Moving beyond transactional products. Products are growing beyond the transactional box: tools for a job that can be perfected for easy task completion. Technology is increasingly designed for lingering; it shapes our culture and our society as it creates new behaviors and new economies. The palette of traditional design research methodologies is optimal for building tools that automate a known task but can fall short when designing experiences intended to facilitate a wider sphere of life or culture.
- Studying implicit interactions. The practice of user research has shifted from a core focus on understanding explicit user interactions to understanding implicit motivation and contextual expectations. In addition to interacting with a device through tapping or typing, technology is increasingly experienced in the background without explicit attention or intention. This, of course, is nothing new; from ubiquitous computing to the internet of things, we’ve been talking about the pervasiveness of technology for decades. What is new is that those ideas have traveled from academia to the everyday jobs of user researchers. Spotify is a background experience: the majority of the time, Spotify is consumed through a phone in the pocket or over speakers in diverse contexts. This means that, often, the only signal we get from the user through logs is what is being played. Research in the lab, or even the field, requires creativity and investment in longitudinal and naturalistic techniques, among others.
- Personalizing experiences. Through the expansion of artificial intelligence, a sea of new opportunities opens up not only for experience design but for user research as well. We are beginning to abandon the idea that a product is singular and static—one average experience catering to diverse needs. Instead, the research question becomes: “what are the dimensions that make our users different?” One of the distinctive experiences that Spotify is known for is its personalized playlists like Discover Weekly and Daily Mixes. Even though machine learning and algorithms play a central role in creating such playlists, what ultimately matters to the listeners is whether that playlist resonates with them in a given moment in time. Achieving resonance requires an understanding of why people listen to music, how they intuit a good listening experience, and how this understanding can translate into ML models that provide personalized experiences at scale.
- Rapid and incremental cycles. Development of digital products, nowadays, is characterized by nonlinearity and speed. Instead of long development cycles resulting in the release of complete products, atomic components of products are released quickly to accumulate learnings about the intended and unintended outcomes. Having the opportunity to conduct extensive field research prior to a release is rare, but the need to understand and represent people comprehensively throughout ‘piecemeal’ product launches is greater than ever.
- Accountability through measurable outcomes. Measurability of outcomes, and the immediacy of the feedback loop, have a significant impact on our research practice. Researchers not only have to become comfortable with quantitative data and learn to triangulate that with qualitative insight, but also are now empowered and accountable for producing hypotheses that can be tested at scale within weeks or months.
Given this context, what are the strategies we employ in building a research practice at Spotify?
We believe in mixed methods and diverse teams. Not only have we taken the step to merge our data science and user research teams into one (Peter will describe exactly how in a bit), we are investing in heterogeneity within the insights disciplines as well as literacy and collaboration across them. The most important part of an insights practice is our ability to identify the right research questions and then lean on the community for collaboration and methodological expertise. We believe that mixed methods yield comprehensive answers: blind spots and caveats in specific approaches can be tackled through mixing methods. Triangulation allows us to have greater confidence and richer insights than is possible to achieve through a single method alone. We aspire to form a comprehensive narrative of what we know about the current and future users of our products rather than methodologically siloed insights.
We invest in creativity and experimentation. While maintaining the standard for validity and reliability, we encourage and celebrate creativity and experimentation with research designs and tools. Rather than mandating an insights process, we frequently try out novel approaches to gain insight into types of questions we haven’t explored before, or shed light on parts of the experience we haven’t studied in the past. We accept failure and expect inconclusive results from time to time—and believe this to be a valuable part of maturing and learning as an insights organization.
We double down on frameworks that describe our users and on storytelling around the insights. Producing high-quality insights is one thing; disseminating them across an organization in a way that evokes empathy, changes minds, and sets direction is another. Writing and reading reports is expensive, and large insights organizations produce overwhelming quantities of data. We invest in mixed media storytelling, interactive insights, and employ typologies and illustrations that attempt to encapsulate knowledge beyond isolated findings.
Part 2. The power of cross-disciplinary teams
Peter Gilks, Head of Free Product Insights
Our approach to insights at Spotify is centered on our belief in triangulation, in mixing methods and in cross-pollinating ideas by bringing together Spotifiers of different backgrounds and expertise. We are highly invested in this approach and reflect it in our organizational structure.
At Spotify, two important things differentiate how we approach insights for product development: our Data Scientists and User Researchers form a single discipline that we call Product Insights, and Product Insights is not a centralized function. Our insights teams are embedded with product teams so that we can work seamlessly alongside product managers, designers, and engineers.
I will expand here on the first point and talk about how this approach works in practice and the benefits it brings.
Building a cross-disciplinary insights team
A typical Product Insights team at Spotify contains people who, in most cases, fall into one of two disciplinary families—User Researchers and Data Scientists. We make no title distinctions between people with different areas of expertise within these groups, and you will find individuals with varied skill sets. Some of our user researchers will be experts in evaluative research methods, some in formative methods, some more qual and some more quant, but most will use a variety of tools, as we believe that defining the right questions comes before selecting how they will be answered. Similarly, our data scientists wear a number of hats and can be found running A/B tests, building data pipelines, conducting exploratory analysis, building inferential models, and designing visualizations.
There are a couple of very important things that are consistent across these roles. First, the primary job of everyone in Product Insights is to drive evidence-based decision making. This, at the end of the day, is the goal; the methods are just how we get there. Second, all insights teams have both data scientists and user researchers working together, and reporting lines are not split by discipline. We have managers who came up through a data science route leading user researchers, and vice versa. In fact, the two authors of this article typify our approach—Sara comes from a user research background and I come from a data science background—but we lead similar teams, and we are both proactive in encouraging cross-disciplinary work and learning.
Generating hypotheses through qualitative data
Of course, organizing people from different disciplines into a single team is a good first step, but what does working together look like in practice and what benefits do we see? One type of collaboration that is common is the use of qualitative findings to generate and clearly express hypotheses that we can then test quantitatively.
Testing hypotheses is one of the most fundamental functions of our Product Insights teams. We don’t want to invest in developing a new product or feature without testing the waters first, and we certainly don’t want to deploy it to production without thorough A/B testing. But a good hypothesis doesn’t just come out of thin air—it’s backed by evidence and it is clear about the expected effect to be measured.
Now you can of course achieve this with separate teams, but there are many advantages to working together. As the qualitative findings develop, the data scientist can begin searching for data points that reflect the human activities we are witnessing. This yields four advantages:
- The time it takes to start any A/B testing is reduced as we are already preparing data sets and building pipelines.
- If the data can translate the impact of an issue we witnessed with a handful of users in person into its impact across millions of users, the argument to act becomes much more compelling.
- Our data scientists and user researchers don’t have to ‘stay in their lane’. If they have ideas that might improve the impact of each other’s work, they will share these ideas and debate them naturally as part of a team, iterating on each side of the research as they go and developing as professionals.
- We develop a shared language for discussing insights that is focused on outcome (e.g., does this feature drive product satisfaction?) rather than method (e.g., run a logistic regression on this satisfaction data).
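To make the quantitative half of this workflow concrete: once a qualitative finding has been expressed as a clear hypothesis ("this feature lifts the share of satisfied sessions"), an A/B test reduces to comparing two proportions. The sketch below is a generic two-proportion z-test with made-up numbers—it illustrates the statistical idea, not Spotify's actual experimentation tooling or metrics.

```python
import math

def two_proportion_ztest(success_a, n_a, success_b, n_b):
    """Two-sided z-test comparing rates in control (A) vs. treatment (B)."""
    p_a = success_a / n_a
    p_b = success_b / n_b
    # Pooled proportion under the null hypothesis of no difference
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via the error function)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical counts: satisfied sessions out of all sessions in each arm
z, p = two_proportion_ztest(success_a=4_210, n_a=50_000,
                            success_b=4_530, n_b=50_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

Because the hypothesis already names the expected effect and its direction, the test result maps straight back to the qualitative finding that motivated it.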
Explaining data through rich qualitative analysis
Another typical example of cross-disciplinary work is when we see something in the data we don’t really understand and we need user research to investigate what is really going on.
This can work sequentially—for example we might be doing some exploratory data analysis and spot that a certain group of users is behaving in a way that differentiates them from the majority. We may have some hypotheses as to why this is happening, or we may simply have no idea, but we identify and target users whose behavior displays these traits and conduct research with them—perhaps through usability sessions, through a diary study or through a survey. Our findings can have a profound effect on what we decide to do with the product and how we expect it to perform.
Now that we’ve done this a number of times and have some idea what to expect, we often preempt the need for follow-up research and plan ahead as best we can to run it in parallel. As an example, we recently performed some research on skippable ads, a test feature rolled out to our entire Australian user base. To understand how users would discover, perceive, and adopt this feature, we conducted a large-scale diary study. But instead of going into this blind, we were able to sample a range of users whom we knew from our data had differing behaviors, and we could ensure that we asked the participants to talk about some of the more interesting points we could see from their log data as the study progressed.
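The behavioral targeting described above—spotting a group whose log data sets them apart, then recruiting a few of them for qualitative work—can be sketched in a few lines. Everything here is an illustrative assumption: the field names, the synthetic skip-rate data, and the simple "two standard deviations above the mean" outlier rule stand in for whatever segmentation a real analysis would use.

```python
import random
import statistics

random.seed(42)  # deterministic synthetic data for this sketch

# Hypothetical per-user behavioral summaries; fields are illustrative,
# not an actual logging schema.
users = [{"user_id": i,
          "skip_rate": random.betavariate(2, 8)}  # synthetic stand-in for log data
         for i in range(10_000)]

rates = [u["skip_rate"] for u in users]
mean, sd = statistics.mean(rates), statistics.stdev(rates)

# Flag the outlying segment: users who skip far more often than the majority.
outliers = [u for u in users if u["skip_rate"] > mean + 2 * sd]

# Recruit a handful of them for usability sessions or a diary study.
recruits = random.sample(outliers, k=min(5, len(outliers)))
print(f"{len(outliers)} outlying users; recruiting {len(recruits)} for follow-up")
```

The quantitative step narrows millions of users down to a segment worth understanding; the qualitative step then explains what the segment's behavior actually means.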
Representing users through storytelling
In order to ensure evidence-based decision making we need not only to collect impactful insights, but also to present them, and the importance of each one to our user base, in a compelling way.
Here the cross-disciplinary nature of our team really shines. When working with our colleagues in design, engineering and product management roles, our usual approach to knowledge sharing is to give the holistic picture whenever possible. Charts are accompanied by videos, survey results go side by side with A/B test results, quotes live alongside metrics and key takeaways are built from a triangulation of sources. This has great power in building a case, helps to engender user empathy and means that our findings appeal to a broad audience who may among them have different learning styles or be most interested in different aspects of the findings.
I would also add that this has been one of the most enjoyable aspects of managing a cross-disciplinary team. Seeing the creativity that is used in bringing these insights to life, with video, audio, interactive web design, data visualization, illustrations and more, is a delight and enables all of our team members to bring their own personality to what they deliver.
Even with all the benefits listed above, we continue to refine and tweak our practices. If you’d like to talk about our experience in more detail or share your own, please keep an eye out for us at EPIC2018 in October and we’ll be more than happy to chat.
*Editor’s Note: Spotify is an EPIC2018 Sponsor. EPIC is a volunteer-run, nonprofit organization and the support of sponsoring organizations makes our incredible international conference possible. Conference papers, case studies, PechaKucha, ethnographic film, and gallery installations are selected through a blind peer review process managed by our conference committee, which is independent from our board and sponsors.
With a background in engineering and human-computer interaction, Sara Belt leads the research team that powers Spotify’s product development for the artist community; helping artists grow their audience, express their creativity, and thrive. Prior to Spotify, Sara held research positions at Microsoft, HERE Technologies and at Nokia; exploring topics ranging from future of productivity to how people navigate the world with maps.
Peter Gilks leads the insights team responsible for growing the user and advertiser ecosystem of Spotify’s free tier. His background and passions lie in utilizing the power of statistics and data visualization to better understand the patterns and behaviors we observe in the world. Prior to Spotify, Peter held positions at Imperial College London, Barclays and Slalom Consulting. Originally from Northern England, Peter now lives in New York City and spends his free time playing pinball and learning the guitar.