Case Study—This case study provides an inside look at what occurs when methods from the data science and ethnographic fields are mixed to solve perennial customer service problems within the call center and cruise industries. The paper details how this particular blend of ethnographic practitioners and a data scientist resulted in changes to design approaches, debunking myths about qualitative and quantitative research methods being at odds and altering team members’ perspectives about the value of both. The project also led to the creation of innovative blended design research and data science methods to discover and leverage the right customer data to the benefit of both customers and the call center agents who serve them. This paper offers insight into the untold value design teams can unlock when data scientists and ethnographers work together to solve a problem. The result was a design solution that gives a top-performing company an edge to grow even better by leveraging the millions of data records housed in its warehouse to the benefit of its customers.
It is a capital mistake to theorize before one has data. Insensibly one begins to twist facts to suit theories, instead of theories to suit facts.
– Sherlock Holmes, in Arthur Conan Doyle’s “A Scandal in Bohemia”
BACKGROUND AND CONTEXT
Anyone who has ever called a call center more than once knows the process can be painful. Customers call with a quick question only to be forced to provide a slew of non-relevant information, most often to automated systems. Only once their question goes unanswered do they get to talk to a person. But by then they’re forced to repeat the irrelevant information to a call center agent before they can actually ask their question.
Even getting a human agent doesn’t guarantee the question gets answered if the information needed isn’t at the agent’s fingertips. Good luck if that call gets interrupted. The customer calls back and the frustrating process begins again. The process can be equally unpleasant for call center employees. Call center workers routinely complain of customer aggression, and one study shows 1 out of 5 calls to contact center employees are from angry customers, an average of 10 calls a day. Aggression from customers has risen over the years, so much so that the turnover rate within call centers has climbed from 19% in 2008 to 24% and rising today. (Dixon, Ponomareff & Turner 2017)
For every call center interaction between an agent and a customer, there’s an equal and corresponding data creation and storage action. Each phone call a customer makes. Every email an agent sends to a customer. Each time a customer goes to a website and enters search words for some product they want or exotic locale they want to visit, data is collected and often stored. All of these actions create data points. These data points, or digital bits as they’re known in the industry, make up what’s called the digital universe – all the bits of data that are created, replicated, and consumed by people and businesses. (Gantz & Reinsel 2013) But even though all this data is available, call centers can’t seem to recognize who is calling, why they’re calling or how to solve their problem more quickly. This not only frustrates customers but annoys agents as well. It’s like there’s data everywhere but not a drop that’s useful. It’s a common problem across industries. A lot of data is being created, collected and stored, but that data is incredibly difficult to track back to individuals. There are many reasons for this, but one of the central ones is data quality: 91 percent of companies globally report they have inaccurate data, according to a survey of more than 1,200 global companies commissioned by Experian. (SourceMedia Research 2018) The most common ingredients that lead to data inaccuracies include incomplete or missing data, outdated data and wrong data.
In addition to being notoriously inaccurate, data on individuals is collected through multiple channels that are rarely connected. A company may have data on one individual collected from several different channels, including email, a mobile application, a phone call or a text. A customer’s information is split piece by piece into three, four, five, even 10 different databases, never giving companies a true “360-degree” view of the customer.
Cruise Company’s Customer Service Woes
This challenge of having lots of information but not really knowing how to use it is one many companies face, including, recently, a major international cruise line. Like many companies, the cruise line had collected millions of data records on customers and potential customers through multiple channels. Yet despite this, customers were still experiencing immense frustration when booking a cruise. Though the cruise line was wildly successful, its parent company’s new CEO wasn’t entirely happy. Rookie cruisers set on having fun in exotic locales were finding booking cruises extremely difficult and not fun at all. The CEO decided to conduct some “mystery shopping” calls to his company’s call center. He didn’t like what he experienced. So he issued a challenge to the largest of the company’s operating cruise lines: improve the customer service. The design challenge was to improve the cruise line’s customer service so that the booking process mirrored the fun cruisers had onboard the ships.
The company believed the solution was to be found in its call center. There was a list of metrics it wanted to improve, including average call time, handle time and the number of repeat calls. Fix those, the company said, and the problem goes away. There was also a cursory interest in using customer data to help improve the customer experience. The company had several work streams focused on the familiar territory of trying to piece together a “360-degree view” of its customers. In an effort to improve its customer service, the cruise company hired the global design firm IDEO.
IDEO teams working on the project would soon discover that to truly solve the problem, to make sure customers felt heard, understood and confident when buying a cruise, the solution lay not just with data, but with people as well. During the first of several research phases, the team discovered that possessing data alone, it seems, couldn’t satisfy the very human desire to be heard. A challenge most companies face is how to surface the right data to the right person at the right time. That’s why companies crave the counsel of a data scientist. They seek someone to take largely unorganized and unruly data sets, wrangle them and come up with salient insights that will give them a business edge. But the team discovered rather quickly that data wrangling wasn’t the answer; at least not at first.
Some Level-Setting Definitions
Before explaining the methods used during this particular design project, it’s helpful to establish some level-setting definitions. For this paper, let’s define data science and data scientist as they are practiced at IDEO. Data scientists at IDEO are adept at shaping data as a resource for human-centered design. They sketch in pencil and in code, shape and reshape data, know what data can and can’t tell us, help design feedback mechanisms, and prototype machine learning algorithms. They do all this to improve human experiences through the design of intelligent products, services, and systems. Data is the paint they use in the art of creating intelligent products.
The most easily understandable example of how data scientists and designers come together to create products is the Nest. This intelligent thermostat system, developed by famed Apple designer Tony Fadell and now owned by Google, is a smart device. The Nest processes data from a number of sources, including the Internet and the body heat of people inside a room, and adjusts a home’s temperature accordingly. To create such smart devices, a company needs people who understand and can develop devices that use data to sense, listen (adjust) and then act. These people are data scientists. Generally, data scientists are seen as people who can wrangle large data sets. But data scientists can also be designers, people who use data as their art to create a different, more improved world. And yet, like most designers, data scientists often find it difficult to know where to start, because without context, millions of data bytes sit dormant, unable to be rendered useful. That’s where ethnographic research can illuminate pathways forward.
For this paper, let’s define ethnographic research in the context of design research. In this context, ethnographic research is the study of how people live their lives in order to better understand their behavior, motivations, needs and aspirational wants, so as to inspire new design. The approach comprises various methods, including interviews, observations, role-playing games and journey mapping. This paper will show that using ethnographic research at the very beginning of the project transformed the data scientist’s initial assumptions about how to use data to solve the problem.
METHODOLOGY
To better prepare the team for upcoming field research, the design researcher on the project did a literature review of the cruising industry and cruisers. Using that information, she created an empathy exercise to help the team understand the reality of people who booked cruises over the phone.
Only two of the team’s six members had ever been on a cruise, and no team member had gone through the entire cruise booking process. Among the team, and in the broader media, the typical persona of someone who goes on cruises is a silver-haired, often retired, affluent couple looking to relax. A quick literature review of the cruise industry painted a very different picture from what team members had envisioned. The average cruiser is a 49-year-old employed, married, college graduate with a six-figure income. (Cruise Lines International Association 2017) In addition, “Generation Xers” and “Millennials” were fast outpacing Baby Boomers as cruisers. In one study, Millennials were twice as likely to have taken a cruise as their Baby Boomer parents. (Sheivachman 2018)
Repeatedly, articles about cruising noted that the idea of seeing multiple locations without ever having to unpack a bag made cruising appealing to all age groups. But exploring the data collected by the client couldn’t answer the central question: why was booking a cruise so darn difficult in the first place? Once that question was answered, the team felt confident it could design a satisfactory solution.
IDEO conducted two large-scale projects, Project Traveler* and Project Board* (not the projects’ real names), in multiple phases and with multiple design teams, to diagnose, prototype and design solutions for a better cruise booking experience. There were 27 weeks of work, though the projects spanned more than a year. Only two designers remained constant throughout all phases: the data scientist and the design researcher. The design teams used iterative research and design approaches that included a variety of methods. The project essentially had two user groups: customers who were booking cruises and the call center agents who were helping them book. The complexity and multivariate nature of the problem required an extensive array of design and data science research methods to illuminate a path toward design solutions.
First Phase: Diagnosing the Problem
In the first phase, which lasted five months, the team conducted in-depth interviews with first-time cruisers, cruise-curious people and repeat cruisers in their homes, as well as with cruisers onboard a ship, to understand their pain points, challenges and aspirations when it came to booking a cruise. These interviews were conducted in three rounds: before, during and after concept design. During the interviews, team members used card sorting and role-playing games to help gain a better understanding of customers’ beliefs, motivations and aspirational needs before, during and after the booking process. The number of customers and experts interviewed during this phase totaled 20.
For its second user group, call center agents, the team not only conducted in-depth interviews but also did on-site observations and multiple co-design sessions with agents. For the on-site observations, the team split into pairs and sat with agents at their desks while they took phone calls. Using headphones specially made for listening in on calls, team members were able to hear both the customers calling in and the agents’ responses. The team made multiple trips to the client’s call center to listen in on phone calls. Interviews with agents had to be kept short, because every minute agents were taken from the floor to talk to the team was a minute of lost opportunity to make money or serve a customer. Because the team didn’t want to inconvenience agents or disrupt their jobs, it leaned heavily on observation to learn what was needed.
In addition, the team conducted several co-design sessions, one of which included a gallery walk of proposed design concepts installed on-site at the call center for a weeklong feedback session. Agents were able to view the design concepts at their leisure and vote on the ones they liked most. The team interviewed 40 agents over two rounds of on-site observations and interviews. To help client executives and agents better understand insights gathered through field research with consumers, the team also conducted several role-playing and empathy exercises.
Second Phase: Testing Solution Prototypes
Once the design direction became more concrete (read about how and why in the Findings section below), the team also conducted two live prototyping sessions. In the first, the team and the client designed a two-week prototype that altered the organizational structure of the call center to better meet customers’ needs. In the second, the team conducted a large-scale usability test to help validate the desire for a proposed software solution. The team tested the proposed new software application live with 11 agents using anonymized customer data. To ensure that the usability tests with agents were as realistic as possible, the team used real customer data with all personally identifying details expunged but with real information agents would recognize. The team created various behavioral typologies reconstructed from real customer calls and hired four actors from the Chicago-based Second City comedy troupe to embody those typologies and call into the call center to book a cruise.
Agents knew they were testing a new software application, but they had no idea the calls weren’t real until after the “customers” hung up. The actors didn’t receive scripts; rather, the team used various behavioral data points to create behavioral frameworks for the actors to embody when they made their phone calls. Four behavioral frameworks were constructed from anonymized customer data the team collected and analyzed from the client. The behavioral frameworks used during the agent user testing phase stemmed from a new mixed-method approach developed by the data scientist and the ethnographers on the team.
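As a sketch of the underlying idea (the features, thresholds and typology names below are hypothetical illustrations, not the team’s actual frameworks), behavioral typologies can be derived from anonymized call records with even very simple rules:

```python
def typology(call):
    """Assign a hypothetical behavioral typology to an anonymized call record."""
    if call["num_prior_calls"] >= 3 and not call["booked"]:
        return "researcher"            # calls repeatedly, gathering information
    if call["questions_asked"] > 5:
        return "anxious first-timer"   # many questions, needs reassurance
    if call["booked"] and call["call_minutes"] < 10:
        return "decisive repeat cruiser"
    return "browser"

# Two invented, anonymized call records:
calls = [
    {"num_prior_calls": 4, "booked": False, "questions_asked": 2, "call_minutes": 12},
    {"num_prior_calls": 0, "booked": False, "questions_asked": 8, "call_minutes": 25},
]
labels = [typology(c) for c in calls]  # → ["researcher", "anxious first-timer"]
```

In practice the frameworks were richer qualitative constructs, but the principle is the same: each typology the actors embodied was grounded in observable data points from real calls rather than in a script.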
Mixed-Method Creations: Data/Human Journey Mapping
It is customary for data scientists to map where data is housed and how it flows throughout systems. (Loukides 2018) This is done for a variety of reasons, but most acutely so data scientists can pinpoint how data is created and transformed as it moves from one system to another. Mapping data flows helps to pinpoint flaws, biases and errors in data. It can also yield opportunities for creating new models that take what is known as unstructured data, data that is notoriously difficult to analyze and make sense of, and turn it into something useful for an intelligent model. Plus, it’s just fundamental to know where data comes from before a data scientist starts massaging and working with it.
For example, a data scientist may map a system and see that a company doesn’t track customer complaints in software but does record all its phone calls. Locating those audio recordings within the company’s systems is paramount for a data scientist who wants to detect patterns and anomalies in customer speech. While the data scientist may request all the audio recordings, exactly who stores and archives them may be a missing point of information. Often, who enters and extracts data is left out of the data mapping process.
During this project, the team decided not only to map the data, but to map who created, transferred, transformed and extracted it. The team also mapped at which point in the flow data was created, transferred or transformed, for what purpose and, of course, by whom. This method was dubbed the “Data/Human Journey Map.” The detailed mapping not only told the team what data was being stored, but where it came from and how it was used by agents. It also allowed the design team to pinpoint the exact data that mattered most to both customers and agents and the systems where that data was housed, and it became a key part of the resulting design of a new software application. Pairing flow mapping with people also makes data mapping an essential ethical exercise. A data/human journey map is simply a journey map that shows the input, transfer and transformation of data throughout a company’s information technology systems and who creates, touches, transfers or transforms that data as it flows through the system. Creating such a map from the user’s perspective works best when ethnographic methods such as observations and user interviews are combined with data exploration. While journey maps have always included people and systems, this map included people, systems and data: the actual information points created, stored and retrieved by customers and agents as part of the journey.
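To illustrate the structure of such a map (all actors, systems and data points below are invented for illustration, not drawn from the client’s systems), each step can be recorded as a touchpoint pairing a person with the data they handle:

```python
from dataclasses import dataclass

@dataclass
class Touchpoint:
    """One step in a data/human journey map."""
    actor: str    # who creates, transfers, transforms or extracts the data
    system: str   # where the data lives at this step
    datum: str    # the actual information point
    action: str   # "create", "transfer", "transform" or "extract"
    purpose: str  # why this step happens

# A hypothetical journey for two pieces of customer information:
journey = [
    Touchpoint("customer", "phone call", "home city", "create", "start a booking"),
    Touchpoint("agent", "CRM", "home city", "transfer", "log the caller's profile"),
    Touchpoint("agent", "personal notebook", "travel fears", "create", "remember context"),
    Touchpoint("agent", "(discarded)", "travel fears", "transform", "lost after the call"),
]

# Flag data that never reaches a durable system -- the kind of loss
# on-site observation can surface but data exploration alone cannot.
lost = [t.datum for t in journey if t.system in ("personal notebook", "(discarded)")]
```

Reading the map by actor shows who touches each datum; reading it by system shows where the flow dead-ends before the data is ever stored.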
For example, when a customer called and gave their name and where they were from, that information was mapped to the database or system an agent transferred it to or retrieved it from. Roundtable interviews about the client’s systems were also key, as they helped the team determine what data was stored where within the company. Those discussions gave the team insight into the data and systems the client currently used. They also allowed the team to find ways to leverage the client’s existing stored data and systems in new and innovative ways without significant expense or technology investment. This would prove key in getting buy-in from the client on design recommendations.
“Having a data scientist on the team, as a form of research, with the understanding and the ability to leverage what data exists was huge,” the project leader said. “As designers, we always think to start from scratch. But the [data scientist] pushed us to think about what data we could leverage to make the experience better. She pushed us to think, what are the systems that our client is using, how do they operate. Normally we wouldn’t consider that until later on.”
In all, the team spent the better part of 18 months researching, concepting, prototyping and user testing more than 21 different design concepts. Teams interviewed or observed more than 70 stakeholders, customers and agents in homes, offices and onboard a cruise.
GENERAL FINDINGS
The team’s in-depth interviews quickly led them to discover a universal truth–no one cruises alone. The complexity of the booking process compounds this truth. Even if a person is going on the cruise solo, he or she usually has to consult someone else to make the decision. One person may be on the phone, but more than 10 people could be involved in making the final decision. The team had imagined the cruise booking process was simpler than it actually was. After conducting interviews with both first-time cruisers and those who had not been on a cruise yet but were actively looking, it became apparent that cruising ain’t easy. Take Michelle (not her real name). Michelle is a 37-year-old mother of two who lives on the Florida coast. The team interviewed her at her home just as she was in the midst of booking her next cruise. Even though she has been cruising since she was a teenager, Michelle says she’s still a novice when it comes to booking a cruise.
“While I’m booking [a cruise], I’m stressed,” she said softly as the team sat at her dining room table in rattan chairs, the stress of the process weighing on her face as she listed all the decisions she had to make. “I need to figure out the date, the room type, do they have it, is it the right port that I want to go to, so it’s kind of stressful.” “I’m not an expert,” she continued. “I’ve done a lot of cruising, but not so much recently. I know about the tipping procedures and what you can bring on board and what you can’t. I mean, I think I do. Who knows? It might have changed since then.”
Even though Michelle had been through the booking process repeatedly (she had been on nine cruises), she still felt like a novice. Her insight that the booking process is ever changing, making it difficult for anyone, rookie or not, to navigate, was particularly inspiring to the team. The team sat in Michelle’s living room, outfitted with rattan chairs and a beautiful glass table covered with a white sheet to protect its shine, as she made a phone call to a cruise line. She had said she wanted to call two different cruise lines to get pricing and availability for her next cruise. On the day the team went to interview her, Michelle called the client’s cruise line. (Researchers knew about the call beforehand and asked Michelle’s permission to listen in. It was entirely coincidental that she called the client’s cruise line.) The call was painful. Cruise line stakeholders visibly cringed as they listened to Michelle struggle to get her questions answered. She had to repeat her destination desires multiple times and never really did get the answers she was seeking. Shortly after that call, she called another cruise line, the client’s competitor. The call went better, Michelle thought, but she was still unsatisfied.
“I feel okay. I don’t feel fantastic with either of them. I guess I feel better from this call. The other guy seemed just kind of like, you know, wanting to talk about his own trip. … But both of them, I don’t have all my questions answered, so…” and her voice trailed off in soft disappointment.
Cruise Booking: The Super Bowl of Decision-Making
Michelle’s experience was hardly isolated. As the team sat in its project space and debriefed each interview with cruisers, the room seemed to get brighter from the collective light bulbs going off. Selecting the right cruise is the Super Bowl of decision-making. The complicated booking process turned the simple information processing humans unconsciously do in milliseconds to, say, decide whether to buy a new dress from Amazon, into a long, drawn-out string of second-guesses and “I don’t knows.”
There was the destination decision. The room decision. Add to that the decision about the location of the room: front or back of the boat? Need two beds or four? Want a view or not? What about dinner? Eat at 5 p.m. or after 8 p.m.? Double those decisions when taking along a friend or a spouse. What about the kids? Don’t live in Florida or one of the many coastal states where cruises embark? On average, people who cruise take between three and 18 months to plan and book a cruise. (Cruise Lines International Association 2017) That’s an extremely long sales cycle. Add airline and departure information to the list of decisions and the process gets even more complicated. In all, a rookie cruiser would have to make about 10 decisions, including how much to pay, just to get past the basics. And that doesn’t even include what to do on the cruise. Imagine doing all of that on a phone call during a lunch break at work.
In addition to talking to rookie cruisers, the design researcher on the team also conducted an autoethnography, reflecting on the exceedingly long and difficult journey she took to book a cruise and rooms for all team members. (Four members did end up taking a cruise as part of this exercise.)
In the age of Kayak, Expedia, Priceline and Orbitz, which have made traveling as easy as ordering a pizza, booking a cruise remains maddeningly difficult. Some quotes from our field research:
“It’s overwhelming.” … “It was bad. The first time was bad ’cause I got frustrated… Because I thought you could just go online, Google, there you go. No, it’s not like that.”
“My head’s hurting. You know like you’re reading something so long, or you’re studying so long that you give yourself a headache? Just put everything to the side and I’ll come back to that.”
“[With groups] …it’s like herding cats…too much frustration. I decided not to do that anymore.”
It was at this moment, just weeks into a multi-month project, as the team processed what it had heard from the field along with the reflections of a team member’s own experience booking a cruise, that the team’s data scientist planted the seed that grew like a foundational vine, unifying one of the team’s eventual design solutions.
“If there is an entire space of possible ideas and ways to get there, the way a data scientist would think would be to say, ‘What are the pie-in-the-sky ideas? What does the future look like?’” Ann*, the team’s data scientist, said. “But this is where having a collaborative team with multiple disciplines, including ethnographic researchers, can make a difference. The collaborative team comes at it from the point of where the human pain points are, followed by where the business opportunities are, and data comes at it from another angle: how can we make systems more intelligent. We’re looking through all those lenses together to serve a very specific human problem.”
The field interviews yielded some interesting “truths” about customer exchanges with cruise call centers:
- Rookie cruisers didn’t book their cruises in one call. They often made repeated calls because questions popped up as they learned about cruising.
- Rookie cruisers rarely talked to the same person when they called about their trips, which forced them to repeat the same information multiple times.
- Rookie cruisers were often calling for advice but got a sales pitch instead.
After the field interviews, the team was left feeling as frustrated as the customers it interviewed. Why was this process so difficult? It didn’t take long to decide that it wasn’t the website. It wasn’t the app. It wasn’t the call center. It was all of the above. The team couldn’t wait to reconcile what they heard in the field with what went on at the client’s call center. And they were able to do just that when they visited the company’s call center for some marathon observation sessions.
Ethnographic Methods Make the Intelligent World Understandable
Walking into the bright, spacious building of the cruise line’s call center was akin to jumping on a merry-go-round going full tilt. Dozens and dozens of call center agents sat row after row, some standing, others leaning on their chairs, some even bouncing, as they cheerily but assertively talked to customers about booking cruises. There were shouts, and bells rang when cabins were booked and sold. Leaderboards adorned the entire back wall, the names of top sales agents beckoning the competition to come and get them. Sales coaches and managers paced the floors, walking up and down the cubicle aisles giving advice. It was a call center filled with extroverts, deal-making and fun, LOTS of fun. Timing and scheduling limited the team to just three days on site at the call center. Yet that was enough time for the center’s infectious energy to engulf the team. Being on-site allowed the team to see the company’s culture, understand employee incentives and see where service played a part in the sales journey. Listening to frontline agents take and make calls to customers was, by far, one of the most valuable research activities the team did. The phone calls told a truth that was buried, hidden and obscured by disparate systems, human nature and a lack of transparency. After a couple of hours of listening to phone calls, it was clear that calling a cruise line’s 1-800 number was an adventure. But not a fun one. It didn’t take long listening to agents’ phone calls with real customers to hear echoes of the frustrations expressed by cruisers during the earlier field research phase.
Team members noticed that the calls weren’t great for agents either. When a customer called in, agents often didn’t know anything about the customer beyond name, address and phone number. Agents would have no clue the customer had called in yesterday asking questions about a Caribbean cruise. They wouldn’t know that the customer was afraid of flying, had booked a hotel for two days before the cruise date and needed directions from the nearest train station.
Ethnographic Methods Make Data Science More Powerful
Adhering to the research plan, which prompted team members to document information exchange and transfer between agents, customers and systems, members noticed that agents had to search through multiple software systems to retrieve crucial information about a customer’s history with the company. This lack of understanding about who was calling into the call center, and why, was a major roadblock to the customer experience. It led to long wait times as agents sifted through half a dozen software applications to find the exact information needed to help the customer. It was also obvious that the enterprise systems were a barrier to agents having information readily accessible to help customers.
During these observations, the team also discovered something that never would have been found through data exploration alone–the sheer amount of relevant customer data that was being captured by agents but not stored by the cruise line. The team observed frontline agents repeatedly recording information from customers in a notebook or on their desktops in Notepad or Word documents, only to delete or throw away that information after a call was completed.
Sometimes the information was thought to be unnecessary. Other times the agent didn’t really know what to do with the information. And sometimes the agent didn’t see the information as valuable because it didn’t lead to a sale. The team also saw agents using these notebooks to “double check” their sales. Though all sales were marked in the system when a customer booked, some agents weren’t sure they were getting credit for all their sales. So they recorded every sale in their own notebooks. This told the team that agents didn’t trust their systems to record data accurately. Notably, many customers didn’t either.
This observation was something the data science designer found invaluable to see before crafting an intelligent model for a solution. Such human data exclusions and manipulations are virtually undetectable through a data scientist’s usual methodology for dissecting and analyzing data. Data scientists usually see only the output of data; they see data without context. Seeing how data is created and transformed by humans is essential to devising useful data-driven solutions for businesses. Pairing what the team saw in observation with the data mapping exercise allowed the team to see opportunities for design. The pairing revealed the system barriers agents and customers faced when creating or extracting information for use. Mixing the traditional data mapping exercise with the human data interaction exercise allowed the team to actually “see” how data began and ended in the client’s technology systems. It was also a great way to see how user data was actually used by agents. Visualizing this journey was key to finding the root cause of the pain points both agents and customers felt during calls:
- Agents spent a lot of time on calls manually pulling customer information from multiple applications
- Agents struggled to build rapport with customers because personal information about customers was missing or incomplete
- Agents lacked visibility into customers’ past interactions with the company, forcing customers to repeat information frequently
- Agents wanted to take ownership of their guest relationships but felt they had to depend upon systems they didn’t wholly trust.
After the observations, the team developed a “behavioral data map” that allowed it to see the connective tissue the customer’s information traveled through during a phone call. The team discovered that the customer’s information was split between three main information systems, and it took a considerable amount of effort and system know-how to connect them into a coherent and contextual understanding of a customer’s need, if the agent could do that at all. With this map, the team could pinpoint design opportunities to leverage existing data to the benefit of the customer. Without the on-site observations, the data flow mapping would have given the team an incomplete picture.
“Mapping the people to the data and the journey both take with data makes it easier to identify data priorities, redundancies and how data is transformed and changed throughout the system,” said Ann*, the data scientist on the project. “This gives data scientists the edge they need to focus on the information that really matters and find opportunities to augment human intelligence, instead of just taking a bunch of data without context and plugging it into some new technology model and seeing what comes out.”
Frontline inbound sales agents, the agents who answered the cruise line’s 1-800 number, were often the ones who recorded the data. The amount of personal detail these agents captured on customers that never ended up in the cruise line’s databases was remarkable. Information gathered on calls where a customer did not actually book a cruise was simply “lost,” preventing other agents from “picking up where another agent left off” when a customer called back.
“For a company that is very focused on conversion and selling quickly for them to have an agent to capture that information and take the time out to capture that information is huge,” Cheryl*, the project lead said. “As designers, we need to show them the value, to show them how much that information is worth and why it’s worth collecting.”
Using Empathy for Paradigm Shifts
Because the client collected a lot of data about its current and prospective customers–basically anyone who contacted them through their call center or website–the team determined that leveraging the data that lay dormant and often inaccessible to agents could go a long way toward shoring up some of their customer service issues. But just as there was a separation of sales and service agents, there was a separation of sales and service data. This left the data the company collected on consumers as fragmented as the service experience.
Still one truth remained–ethnographic research revealed that leveraging data alone wouldn’t solve most of the customer service problems. Optimizing the company’s ability to use data without changes to the culture and organizational structure wouldn’t give agents the freedom and incentive to use that data to serve customers. Infusing agents who were focused only on sales, not service, with customer data might benefit the agents, but it would do nothing to improve the customer experience. But how to tell a client expecting some tweaks to a script or a series of new training modules that they needed to change their entire employee organizational structure before they could give their information systems a modern facelift to serve customers better? Not an easy task. But the team decided that building empathy for the customer’s pain would give the client a sense of urgency that made them receptive to large-scale change.
After returning from the research trip in the field, the team holed up in a project space to process what it had learned, with just a few weeks to synthesize the findings. Plastering the walls with quotes and notes from field debriefing sessions, the team searched for themes and patterns. In addition, the team’s researcher branched off to read transcripts and listen to audio from the various observation sessions. This dissection of transcripts led to the “Anatomy of the Call,” an exercise that broke down customer calls to agents section by section, marking the information and data used during those calls. This allowed the team to see what data was most frequently requested and used during phone calls. Subsequent interviews with agents, data analysts and software engineers confirmed the team’s hunches about the information most used by agents and most requested by customers. In addition to synthesizing data capture, the team also worked through patterns found in customer interviews and call listening. Clustering quotes and deconstructing their meaning, the team centered on four actionable insights:
- People want support that acknowledges who they are and adapts to where they are in the journey.
- People seek confidence in their decisions, and rushing to commit before they’re ready makes them feel insecure.
- People are delighted by a streamlined process and deflated by repetition.
- People naturally collaborate during the travel planning process, but they feel overwhelmed by managing this task.
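The “Anatomy of the Call” exercise described earlier amounts to tallying which data fields come up in each segment of a call and ranking them by frequency. A minimal sketch of that tally in Python follows; the segment labels and field names here are hypothetical, not the team’s actual coding scheme:

```python
from collections import Counter

# Hypothetical annotations: each call is a list of (segment, data_field)
# pairs marked by a researcher while reading the transcript.
annotated_calls = [
    [("greeting", "name"), ("inquiry", "destination"), ("booking", "past_cruises")],
    [("greeting", "name"), ("inquiry", "destination"), ("inquiry", "cabin_type")],
    [("inquiry", "destination"), ("booking", "loyalty_tier")],
]

def tally_fields(calls):
    """Count how often each data field is requested across all calls,
    most frequent first."""
    counts = Counter(field for call in calls for _segment, field in call)
    return counts.most_common()

ranking = tally_fields(annotated_calls)
# "destination" appears in every call, so it ranks first.
```

A ranking like this is what lets subsequent interviews with agents, analysts and engineers confirm (or refute) hunches about which information matters most during calls.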
The team felt the insights led to good design opportunities and was excited to share them with the client. But team members knew it would be difficult for the client to act upon the insights without winning the hearts-and-minds argument. The idea of sales and service being separate was entrenched in the client’s culture. The team had to figure out a way to get the client to see how disruptive this split was to the customer experience. The stakeholders needed to see their roles and the roles of the agents within their organization differently, from the customer’s view. Sharing what customers said was one way, but the design researcher and the team wanted their clients to actually feel the barriers that guests felt when accessing their call center. So, in true design research form, the design team designed a series of empathy exercises that put the client core team through a one-hour timed game that simulated the way their customers felt. In this newly stylized version of the television game show “The $10,000 Pyramid,” players were positioned back-to-back and asked to guess words based upon clues given by their partner.
One partner could see the word and had to give clues to the other partner without saying the actual word. On average, there were three to six words to guess. Like a phone call to a call center, partners couldn’t see each other. Each round got harder as the rules changed. The “rules” mimicked many of the barriers customers encountered when calling the call center: communication constraints (only three clues allowed), system constraints (players couldn’t skip words) and, of course, time constraints. Players were rushed to make decisions. The giggles were equally matched by groans as player after player failed to win. Older players used clues that younger players didn’t understand. “Who is Mick Jagger?” One player went down in flames because he couldn’t think of the word “Limbo,” though he kept slinking down in his chair as if sliding under a phantom pole. As the words got increasingly abstract, the players sounded like they were speaking foreign languages to each other. Only one pair guessed all the words in the allotted time. It was a fun but poignant way to have them walk in their customers’ shoes. Through the exercise the executives felt much like their customers did on the phone: frustrated, misunderstood and ignored. It was a powerful illustration of just how frustrating the entire booking process could be. And it exhibited the true uniqueness of design research. The client had collected raw data about customer satisfaction, but going through an exercise where they were treated like their customers helped them understand the feelings behind the data. As one executive put it after the meeting:
“I feel like the shackle has fallen from my eyes. No matter what becomes of this project, I know that I will not think the way I did in the past again. This process has changed my whole outlook.” – Cruise Executive
This meeting was a turning point for the project. It marked the point where the client core team fully came on board. The sincere desire to change exhibited by executive leadership at that meeting gave the team license to expand design opportunities in new and innovative ways. It allowed the team to take some bold design risks, including live-prototyping a new organizational structure that teamed up sales and service agents to take calls together. But by far the most exciting change the client glommed onto was an opportunity to leverage existing data to make its call center agents more knowledgeable and responsive to customers’ needs in a quick and efficient way. Though the one design recommendation the client wanted to implement right away was a data-driven solution, it was the tried-and-true ethnographic method of empathy-building that created the opportunity for this bold new design.
LESSONS LEARNED
Convinced of the new opportunities presented, the client embraced the expansive design solutions proposed. The team landed on four design recommendations to help the company improve its customer experience.
Three of them focused on organizational and communications changes. For example, the team’s research suggested that the way inbound agents were incentivized was a huge barrier to the customer experience, so one recommendation devised a different way to organize the two sales divisions and one service department to better meet customers’ needs when they call in. However, one design recommendation sprang solely from collaboration with a data scientist: a concept that envisioned a new way to organize and surface customer information to agents, designed to make customers feel heard, understood and served. It is called “The Board.”
Through design research, the team had mapped three parallel journeys–the customer’s, the data’s and the systems’. Using text from customer phone calls, the team matched all the information that customers talked about during a phone call to the software or enterprise system where that data was retrieved or stored. The team did this to see if there was a better and quicker way to surface this information during phone calls to help eliminate the frustrations customers felt during calls.
Real People, Real Data Usher in Ethics Realities
But now that the laser focus was on real customer data, another challenge surfaced: just what data was needed, and exactly how would consumers feel about agents having a fuller picture of them? As human-centered designers, the team felt the idea of surfacing more data on potential and current customers to agents deserved more than a passing thought. Sure, the design idea was to serve customers better, but would having too much data on customers make customers feel vulnerable? How would customers feel about agents already knowing who they were and why they were calling? What data was off limits? What data was OK?
Designing these testing exercises with real data forced the team to have several discussions about the ethical use of data. As a matter of course, the design team decided, as a team and a company, not to take custody of personally identifiable information (PII) of cruise line customers. PII includes name, address, birthdate, social security number and any other information that could immediately identify a person. This decision stems from a universal truth of handling data ethically: take only as much data as one needs to solve a problem, and no more. The team, thanks to input from the data scientist and a software engineer guide on the project, reasoned that it didn’t need the caller’s real name, phone number, or exact address to analyze the calls. Instead the team focused on the content of what the caller was saying, where they wanted to book a cruise and what systems and tools they used to learn about those destinations.
Ethnographic researchers are used to getting consent from everyone they engage, which is feasible at that scale. But when it comes to data, teams may have to engage hundreds, if not thousands, of people through their data, and getting consent for that engagement on an individual basis is nearly impossible. So, as a practical and ethical practice, it’s good to ask only for the data needed to actually solve the problem and no more. Some tips on doing this include:
- Scrub the PII from the data and use the rest, all but the personal information, for data simulations, prototypes or data visualization models.
- Create new data that simulates the characteristics of real data without the PII to run data simulations, prototypes or data visualization models.
- In rare cases, become an approved vendor with all the data compliance requirements met and transfer raw (not scrubbed) data to the company’s servers to run simulations and visualization models. This is very rare and not recommended because it increases liability and is often unnecessary.
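The first tip–scrubbing PII while keeping the analytically useful content–can be sketched in a few lines. This is a minimal illustration, not the team’s actual tooling; the field names and the hashed pseudonym are assumptions:

```python
import hashlib

# Hypothetical field names; real call-record schemas will differ.
PII_FIELDS = {"name", "phone", "address", "birthdate", "ssn"}

def scrub_record(record, pii_fields=PII_FIELDS):
    """Drop PII fields, replacing the direct identifier with a one-way
    hash so repeat callers can still be linked without being identified."""
    scrubbed = {k: v for k, v in record.items() if k not in pii_fields}
    if "phone" in record:
        digest = hashlib.sha256(record["phone"].encode()).hexdigest()
        scrubbed["caller_id"] = digest[:12]
    return scrubbed

call = {
    "name": "Jane Doe",
    "phone": "555-0100",
    "destination": "Alaska",
    "channel": "inbound 1-800",
}
clean = scrub_record(call)
# "name" and "phone" are gone; content fields and a stable pseudonym remain.
```

The one-way hash preserves the ability to see that the same person called twice–the very “picking up where another agent left off” problem the research surfaced–without the analyst ever holding the phone number itself.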
The team also took its data ethics discussion to cruisers through concepts that visualized the use of customer data in design. The team talked to current cruisers to get their sense of what data was too much information for their cruise company to have. Far from finding it invasive, customers said they would welcome their cruise company using personal information to serve them. Specifically, customers felt intimate details like their birthdays or whom they usually travel with weren’t off limits to their cruise company. They felt having such data would shorten the process and get them closer to the fun quicker. Again, none of the data the team proposed to use was new; it was all already collected by the cruise company. The team just wanted to use it in service of the customer. As a cruising couple said to the team, “We’ve been cruising with them for 12 years, they already have all this information on us. Why not use it to our advantage?”
The team’s research revealed an important insight about using customer data: as long as customers felt they were receiving a benefit from the use of their personal data, its use seemed more sanctioned. Even with customers’ willingness to have cruise companies use their data to their benefit, the team still wanted to ensure that what they designed would truly help customers first, not just benefit the cruise line. This point of view came from the team’s dedication to human-centered design. An example of this dedication involved financial data on customers, specifically historical data on how much money cruisers spend on board the ship.
After concepting a few versions of a new design that would surface various data points, including how much onboard spending a customer had done on previous cruises, the team decided not to include this financial data in the dashboard for sales agents. After much discussion with the client core team, it was decided that showing such financial data would put customers at a disadvantage against an aggressive salesperson. A sales agent could use the onboard spending average as a barometer of a person’s wealth, tempting the agent to oversell a package that did not align with what the customer actually wanted. The team didn’t want customers to feel agents had inside information that could be used against them. Data should be leveraged for the customer’s benefit, not the agent’s. This type of discussion is what the design firm later called “white hat/black hat.”
The exercise looks at the data to be used or the model to be created and asks: what is the worst thing that could happen if a bad actor got hold of it? If the consequences would harm people, then the team needs to adjust the model, create safeguards, or even abandon it for a new model.
Data Science Methods Amplify Design Solutions
Using the data mapping and system mapping methods of the data scientist, the team gained the ability not only to define the data both customers and agents used most, but also to map that data to internal systems so it could be resurfaced and visualized in a new way. In addition, the data scientist offered new ways to bring in external data, such as local weather, to make calls feel more personal and less transactional. The result, Project Board, was a newly visualized software application that surfaced the most used information to agents during customer calls while excluding sensitive information that could categorize people by personal spending habits.
The Board draws from three main cruise line systems to create a more complete picture of customers when they call. Results from usability testing showed near-unanimous agreement among agents that the experience was better than their current system. Both agents and callers felt the new “Dashboard” made for a smoother and better customer conversation.
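The paper does not detail The Board’s implementation, but the aggregation step can be imagined as a join across the three systems on a shared customer key, with the ethically excluded fields filtered out. A minimal sketch in Python, with invented system and field names:

```python
# Invented stand-ins for the three cruise line systems The Board draws from.
booking_system = {"C123": {"upcoming_cruise": "Alaska, July"}}
loyalty_system = {"C123": {"loyalty_tier": "Gold", "onboard_spend_avg": 1450}}
contact_system = {"C123": {"last_call_topic": "shore excursions"}}

# Fields deliberately excluded per the team's ethics decision.
EXCLUDED = {"onboard_spend_avg"}

def board_view(customer_id, *systems):
    """Merge one customer's records from several systems into a single
    view for the agent, dropping any excluded sensitive fields."""
    view = {}
    for system in systems:
        record = system.get(customer_id, {})
        view.update({k: v for k, v in record.items() if k not in EXCLUDED})
    return view

profile = board_view("C123", booking_system, loyalty_system, contact_system)
# The agent sees cruise, tier and last topic, but never the spending data.
```

Filtering at the aggregation layer, rather than in the interface, reflects the “white hat/black hat” thinking: the sensitive field never reaches the agent’s screen at all.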
“The feedback I get just from looking at a page like that is awesome,” one agent said. “It provided a lot of information that I usually have to go looking for so it makes it easier to respond to guest,” another said. “This is everything,” another agent exclaimed.
The design also included room for the cruise line to incorporate more sophisticated data modeling including machine learning to help improve a customer’s experience over time.
DATA AND ETHNOGRAPHY SHARE EQUAL BILLING IN PROJECT OUTCOMES
It took about eight months and another project to bring the “Dashboard” to fruition, but the seed of this new software application was planted a year earlier in a small project room. That’s when a design researcher, a data scientist, an environments designer, a communication designer, an organizational designer and a business designer all pored over transcripts of customer calls to the call center.
They all felt the pain both agents and customers experienced with the current systems and processes, and the data scientist saw that same pain and offered a way to leverage data to make the process better. An entire design team then worked together to create an innovative software product that the client could implement into existing systems to get customer service value right away. There’s no way to know, of course, whether the results would have been as good without the mix of a multidisciplinary team. But it’s a certainty that it took both ethnographic research and data science to pull off the right solution.
“That’s where the design research comes in,” the data scientist said. “We got an accurate deep learning of what people need today and what the pie in the sky was for tomorrow and we worked backward from there. We created the first version given the data limitations they have today but we created it in such a way they can get further and more sophisticated in their data use over time. We created the foundation for an intelligent dashboard. And it’s not just a dashboard that surfaces basic customer information, it’s collecting that information such that when the company is ready to invest in something that’s more intelligent, that uses machine learning for example, they have the foundation to do so.”
Project Board is an example of pairing the highly effective skill of call center agents with the processing power of software to get a better result than either alone. For this reason–not the data that is integral to the design, but the people who make use of the surfaced information–Project Board is a good example of the future of human-centered AI. It’s not about the data. It’s about the people, and how surfacing the data made agents better at their jobs and happier doing them, and customers happier when interacting with customer service agents.
Ovetta Sampson is a design research lead at IDEO in Chicago, where she uses a variety of qualitative and quantitative methods to inspire innovation in industries including IoT, transportation and service. She works with clients to visualize ways to serve people ethically and with compassion through digital and intelligent products. She is also an adjunct professor in the School of Design at DePaul University in Chicago. Contact her at ovetta@ovetta-sampson.com.
NOTES
The paper’s author would like to acknowledge the many people who helped make this publication possible. This includes the leader of the project and data scientists who worked on both projects, both of whom are not named at the request of the author’s employer. In addition, the author would like to acknowledge the many data scientist and design research colleagues at IDEO and at DePaul University who helped her highlight the most illuminating and innovative parts of this very complex and unique project. This includes Jess Freaner, Dan Perkel, Lisa Nash, and many others. Special thanks to EPIC Mentor Ryoko Imani for your selfless commitment to this first-time EPIC author and your endless patience answering all of my incessant questions.
The information included in this paper is accurate but does not represent the official case study about the project from the author’s employer, IDEO.
REFERENCES
Cruise Lines International Association
2017 “Cruise Travel Report.” Annual Survey. 2017.
Dixon, Matthew, Lara Ponomareff, Scott Turner, and Rick DeLisi
2017 “Kick-Ass Customer Service.” Harvard Business Review, January–February 2017. Accessed 25 September 2018. <https://hbr.org/2017/01/kick-ass-customer-service>.
Gantz, John, and David Reinsel
2012 The Digital Universe in 2020: Big Data, Bigger Digital Shadows, and Biggest Growth in the Far East – United States. IDC Country Brief. Framingham, MA: EMC/IDC, 2012.
Loukides, Mike
2018 “The Ethics of Data Flow.” O’Reilly, 11 September 2018. Accessed 18 September 2018. <https://www.oreilly.com/ideas/the-ethics-of-data-flow>.
Neff, Jack
2018 Ad Age, 11 July 2018. Accessed September 2018. <https://adage.com/article/news/ai-models-real-consumers-reveal-research-answers/314137/>.
Sheivachman, Andrew
2017 “Millennials Now Enjoy Cruising More than Boomers.” Skift, 8 May 2017. Accessed 8 July 2018. <https://skift.com/2017/05/08/millennials-now-enjoy-cruising-more-than-boomers/>.
SourceMedia Research.
2018 The State of Data Quality in Enterprise 2018. New York, 1 January 2018. Report.