PART I: AN EXPLOSION OF VOICES, BUT LITTLE SENSE-MAKING
With the rise of social networking sites like MySpace and Facebook, the success of YouTube, and the popularity of blogs, never before have so many voices been heard on so many topics. Personal blogs, many of which contain writing, photos and video, are kept by 12 million Americans and read by 57 million Americans (Brown 2007). YouTube is a beacon site on the Web, a much-touted success story since its $1.6B acquisition by Google in November 2006. At the time of the acquisition, 100 million videos were being watched on the Web every day. A BBC report in June of 2007 stated that “every minute of every day, six hours of fresh video are uploaded” (Waters 2007).
These numbers point to an explosion of personal stories, in text, pictures and video, available for any and all to digest. The ability to wander from one person’s story to another linked story to another and so on is infinite. It is easier than ever before to join the conversation and add your story to the mix. Consider this colorful entry from a YouTube contributor called “thecaster.” He’s a young man, possibly still in high school, with very decided opinions, and his video was featured on YouTube one day. He calls his video blogs “the philosophical scientifical half-journalal just-add-water video blog…rant…talk show…thinger” and “it’s basically some intellectual stuff (science, skepticism, philosophy) with some funny stuff, and my connection to the outside world.” Thecaster is very aware that he is in fact connecting to thousands of people. He actively cultivates interaction by inviting video and text reactions from viewers. Other YouTube users video their responses to his arguments and upload their clips to his page, which increases the visibility of all voices participating in the current topic of discussion.
“This is a video essay conversion on an essay I wrote before on profanity. Tell me what you think, I’d appreciate it, mac.thecast@gmail.com, there’s also a comment box down there and stuff. What’s wrong with profanity? There’s nothing. Fucking shit, this essay is about how profanity is a load of fucking bullshit. You can say god’s balderdash humbug. These were once vulgar words and if the wrong person heard it you would have been beheaded. Profanity, as we see it today, is a remnant of before the Christian religion was reformed and evolved.” (thecaster 2007)
Thecaster’s enthusiasm and indignation animate him as someone who is learning of injustices for the first time and broadcasting his awakening to anyone who will listen. At the time of this paper’s writing, his profanity video had drawn 17 video responses and 2,502 text comments.
The recognition of all this sharing and networked interaction culminated in You (or us) being named Time’s Person of the Year for 2006 (Grossman 2006). Our collaborating, sharing, linking and exposing was certified as bona fide cultural significance, with weighty impact on the shape and future of world affairs. Technology has become the great enabler for anyone wanting to tell a story. But there’s something missing from this picture. While the act of storytelling may be stimulating and certainly seems to be inspiring, there is a correlated lack of sense-making accompanying this explosion. Very little time or attention has been applied to understanding and making meaning of the myriad voices. With what Linda Stone, a former Microsoft researcher, has called “continuous partial attention” being paid across media and technology spectrums, it is easy to surf onto the next topic, the next news story, the next issue before any kind of resolution or understanding has been reached (Stone, L 2006).
We see the same phenomenon occurring in the design research field because our clients are now seeking individual stories. Business executives no longer need the big sell to undertake a research project for product design, online experience, or marketing and advertising strategy. With the growth of user-centered design, and the attention user research has received in the media and business press, clients expect user research to be part of the business process (Wharton 2004). Industry analysts such as those at Forrester rate and rank the user experience practices of interactive agencies and advise clients that the user is of utmost importance in design and marketing projects. Like You (or us), the User or Customer is now considered the engine of innovation and differentiation, looked upon as a co-creator who helps create commercials, spread marketing messages, and build a community of partner consumers dedicated to particular brands.
“In the Current TV V-Cam campaign, viewers can enter video for any of seven campaigns and get paid $1000 if their spot is chosen to run on the network. Toyota wants ads for its new Yaris car, L’Oreal Paris is marketing its High-Intensity-Pigments line of cosmetics… L’oreal Paris is also sponsoring a “You Make the Commercial Contest” on the teen entertainment site Varsity World.com…. Nike-owned Converse is asking amateur ad makers for original 24-second videos inspired by the Chuck Taylor AllStar Converse sports shoe… MasterCard is opening up its “Priceless” ad campaign to the public.” (Petrecca 2007; Mills 2007)
In such a climate, the user’s voice is not just heard but actively solicited. Yet while demand for the user’s voice and perspective is high, thorough analysis of the meaning behind that user’s experience is severely limited. Clients request user research but question the time and cost it takes to fulfill a research engagement. Many don’t understand the time it takes to achieve real insight from the data; far too many expect “insights” within a couple of days to a week after the fieldwork phase is complete. What these clients are looking for are the raw data: the pictures and quotes, or “verbatims,” that research produces. Consequently, the time allocated to analysis in project plans is questioned and curtailed. On the agency side, opportunities are missed to educate the client about the differences between data and analysis. Many business development people are not aware of these differences themselves, and projects come back sold but completely under-scoped in the research phase. The data stemming from the research, the pictures and video, get baked into deliverables and are used almost as symbols representing the fact that the voice of the consumer is being heard. Far too often the analysis accompanying the data is absent or never given enough time to coalesce into real meaning. What we are left with, then, are individual users’ stories, not much different from thecaster’s, with no framework that exposes the experiences in which these stories have context. We have the impact of a story, but no analysis of multiple stories that would create meaning.
PART II: LACK OF SENSE-MAKING TIED TO LACK OF TIME AND ATTENTION
“To pay continuous partial attention is to pay partial attention – continuously. It is motivated by a desire to be a LIVE node on the network. Another way of saying this is that we want to connect and be connected. We want to effectively scan for opportunity and optimize for the best opportunities, activities, and contacts, in any given moment. To be busy, to be connected, is to be alive, to be recognized, and to matter.” Linda Stone (2007)
To be a live node, to always be on, emphasizes the need to continually project status, to send out more than is taken in. To recognize someone else’s presence in the network is a brief interruption, subtracting time and attention from other tasks. Linda Stone coined the phrase “continuous partial attention” in 1997, but it resonates even more strongly today as newer technologies siphon our attention in ever more directions. One of the latest is Twitter, a service that lets its members send out blurts of information via the web, SMS text, or IM, telling others in their networks “what they are doing right now.” With Twitter and the tools supporting it, people can continually project their status, sending short updates throughout their networks. This is an excerpt from “Biz,” an employee of Obvious, the company behind Twitter (Stone, B 2007):
Getting an iced soy latte at joes before heading to a rehearsal about 4 hrs ago from txt
Ordering sesame medallions at zen palate about 5 hours ago from txt
Yowza that west village sangria packs a punch about 16 hours ago from txt
In a cab on my way to tapas about 21 hours ago from txt
A quick swim! 01:54 PM July 21, 2007 from txt
Walking through farmers market 07:33 AM July 21, 2007 from txt
Make that angelicas kitchen 06:52 PM July 20, 2007 from txt
Grabbing vegge grub at angelicas lichen 06:51 PM July 20, 2007 from txt
Checking in 03:01 PM July 20, 2007 from txt
Making our way into manhattan 01:46 PM July 20, 2007 from txt
Heading to NYC 07:54 AM July 20, 2007 from txt
To Biz’s friends back in California, he is not away on a trip that they’ll hear about when he returns. Biz is on the plane, Biz is in a taxi heading into Manhattan, Biz is eating at Angelica’s Kitchen, Biz is tipping back sangria in a tapas place in the West Village. When Biz returns to California, they will never need to “catch up”; they’ll already have markers in their memories for what Biz’s trip entailed. But by relying on Twitter, they will never know what it felt like to be a vegetarian in New York, how easy or difficult it was to find veggie-friendly food, or what a quick swim felt like in the July heat on the East Coast. In short, they will have the outlines of Biz’s trip but nothing about what Biz’s experiences meant to him.
Frog Design’s Ian Curry, on his blog (2007), relates Twitter to the phatic function, one of the six functions in Roman Jakobson’s (1960) model of the factors necessary for communication. The phatic function establishes the contact over which communication can begin. Twitter, similarly, is a constant mic-check: given the medium’s constraints, little of real substance can be passed along. Presumably, Twitter’s users turn to other technologies to establish deeper relationships and convey more meaningful messages.
The effect of continuous partial attention is that active listening, and the full attention required to make meaning of what is heard, doesn’t occur. Our ability to truly listen and focus on the meaning in the messages we hear has been drastically reduced and fragmented (Mark, Gonzalez, & Harris 2005). The technologies we communicate with and use to monitor our networks of people and information further fragment the time we spend on any particular task. Making sense of data (stories, images, people’s activities) requires time and focus to understand what is being communicated and its context. As nodes on a network, we are continually projecting our own status, scanning the surface and alighting only briefly, for short intervals of time.
Qualitative research reduced to a node
Qualitative data (a person’s words or images) used in a deliverable in place of the analysis of an experience functions as a node on the network. The images of users and sound-bites of their experiences project a certain status: that this person has been heard, that they have a story to tell, that the story is relevant to the client in a particular way. The data becomes symbolic of meaning, oftentimes taking the place of real meaning.
The desire to see representations of people’s stories is not blameworthy. Narratives, the foundation of qualitative data, are fundamental and seductive. It’s what drives those 57 million US Internet users to read other people’s blogs. “The narratives of the world are without number…the narrative is present at all times, in all places, in all societies; the history of the narrative begins with the history of mankind; there does not exist, and never has existed, a people without narratives.” (Barthes, 1996)
People understand and enjoy the texture of stories: the words, pictures and video that illustrate them. Stories establish cause and effect; they are one of the main ways in which we interact with the world. In essence, all people establish, negotiate, repair, and terminate relationships with other people through reasons (Tilly, 2006). When people navigate the actions and consequences in their daily lives and interact with others, they give reasons for occurrences. Reason-giving is a social process: it connects people to each other, manages inequities, provides rationales for behavior and resolves emotional conflicts. In Tilly’s work there are four kinds of reasons, one of which is stories (a brief tagging sketch in code follows the list):
- Conventions – socially accepted rules of appropriateness that explain dereliction, deviation and good fortune, e.g., “stuck in traffic” or “caught a lucky break”
- Stories – explanatory narratives that provide cause and effect accounts of unexpected, problematic, dramatic, or exemplary events
- Codes – procedures and rules that govern actions
- Technical accounts – explanations of cause and effect that are grounded in systematic, specialized disciplines rather than everyday knowledge
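To make Tilly’s taxonomy concrete for working researchers, here is a minimal sketch, in Python, of how interview excerpts might be tagged with the four reason types during coding. The excerpts, speaker IDs and class names are our own illustration, not part of Tilly’s work.

```python
# A minimal sketch: tagging excerpts with Tilly's four reason types.
# The excerpts and class names are hypothetical, for illustration only.
from dataclasses import dataclass
from enum import Enum


class ReasonType(Enum):
    CONVENTION = "convention"        # socially accepted rules of appropriateness
    STORY = "story"                  # cause-and-effect narrative of an event
    CODE = "code"                    # procedures and rules that govern actions
    TECHNICAL_ACCOUNT = "technical"  # cause and effect grounded in a discipline


@dataclass
class Excerpt:
    speaker: str
    text: str
    reason: ReasonType


excerpts = [
    Excerpt("P01", "I was stuck in traffic, so I missed the demo.",
            ReasonType.CONVENTION),
    Excerpt("P02", "After the site lost my cart twice, I stopped trusting it "
            "and started calling the branch instead.", ReasonType.STORY),
]

for e in excerpts:
    print(f"[{e.reason.value}] {e.speaker}: {e.text}")
```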
As researchers, we probe explanatory narratives, looking for the cause and effect which in turn renders people’s behaviors intelligible. Qualitative methods specifically elicit stories (as well as codes and technical accounts) that best elucidate attitudes, needs, preferences and actions. But our methods typically do not stop at the story-telling level. To fully grasp the significance an experience has for the innovation or evolution of a product or service, there must be multiple examined stories, layered to reveal context, patterns, divergences and opportunities. Whether due to the power of story-telling or its symbolic representation, many clients and agencies do not fully grasp that the analysis of those stories contains the seeds for decisive action and new ideas. A handful of stories can be interesting, even inspiring, but a few stories are not a fully-functioning representation of an experience. Meaning has to be built up and dissected so as to pinpoint where a business can take action, when it should act and how. Stories without analysis tell us about effects without insight into causes; they tell us about behavior without providing reasons.
There are various representations of meaning that stand in for analysis. Sometimes it is just a straight transfer of high-level insights or top-line themes. Others we’ve seen include:
- Market research reports – Many clients have done extensive market research and have reams of reports segmenting their customers. While this information is helpful in understanding how they market and interact with their customers, it doesn’t take the place of analytic insight into behaviors and goals.
- Web analytics without other methods – Knowing where a user is clicking doesn’t take the place of understanding what they’re doing and what they need.
- Satisfaction surveys – Data on how a customer feels about a particular product doesn’t translate to insights into how they’re using it.
- Intuition or hypothesis based on self – Sometimes clients already have a path they want to take or an outcome they’d like to see. This is usually backed up with the argument that, since they are also customers, “I know what’s needed.”
In many interactive agencies, the final deliverable from a research engagement is the user persona. Personas are another type of “analysis” that could benefit from better contextual understanding of experiences. User personas can be built from either qualitative or quantitative data; they represent segments of users who are differentiated by their goals, behaviors, and attitudes (Laurel & Lunenfeld 2003). These personas are influential in the design process in that they represent the core audience and its needs for the new or improved product or service. Personas are helpful when based on real data and not, as can sometimes happen, merely intuited or hypothesized. However, like the node blinking in a network, personas quite often reduce experience to only the dominant characteristics of individuals. Many research processes go straight from qualitative research data to the creation of personas without ever undertaking an analysis of the experience in total. Design personas can be helpful in the short term, and they cut the time required to complete the overall research process, but this stops well short of being able to capitalize on opportunities and mine new approaches. It only serves to replicate someone’s story, not evolve or innovate from it.
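For readers unfamiliar with the form, here is a minimal, hypothetical sketch of a persona record built from the differentiators named above (goals, behaviors, attitudes); real personas carry far more evidence behind each field.

```python
# A minimal sketch of a persona record, differentiated by goals, behaviors,
# and attitudes as described above. The example content is invented; note how
# little of the underlying experience such a record preserves.
from dataclasses import dataclass


@dataclass
class Persona:
    name: str
    goals: list[str]
    behaviors: list[str]
    attitudes: list[str]


persona = Persona(
    name="The Reluctant Enroller",
    goals=["get enrolled with minimal paperwork"],
    behaviors=["abandons the online form at the identity check"],
    attitudes=["distrusts entering financial details online"],
)
print(persona.name)
```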
There are various reasons for the kinds of faux-analysis we’ve seen:
- The time pressures of projects
- A client wanting to control the outcome of a project
- Lack of history of contextual analysis in traditional marketing research
- Clients assuming that user stories in and of themselves are complete and don’t need analysis
- Lack of understanding about what qualitative analysis is and can do
Qualitative is not quantitative
The last point is something that we can address. What makes qualitative data so captivating is often its undoing as a foundation for actionable recommendations and decision-making. While stories bring resonance and identification, they are also considered malleable, subjective, and tainted with emotion. Those who don’t do research for a living, and who aren’t aware of qualitative analysis processes or methods, question decisive insight drawn from what they consider to be “just stories.”
Quantitative data are not as captivating as qualitative data but are believed to be more objective and factual. The results of a quantitative study are often taken to be “the truth.” There are various reasons for this, but in an appreciative assessment of why most clients sign on for a research project, one of the more compelling is that they want an answer and a direction to take. Whether someone is a Senior Brand Manager, the VP of Marketing or the CEO, they will be held accountable for the decisions they make. Numbers confer authority. Quantitative data take the unknown and render it comprehensible in discrete chunks, in percentages. Also, there is no debate among quantitative researchers about what their practice is or who gets to do it: survey methods and the tools used to “crunch” the data are well established, standardized, and used throughout both academia and the business world.
Qualitative methods and outcomes are often compared against quantitative research and its outcomes without regard for the different assumptions underlying these two modes of inquiry. Quantitative research is born out of the positivist philosophy that all things are measurable, and that facts have an objective reality. Quantitative results aim to be reliable and generalizable so that predictions can be made. Qualitative inquiries, on the other hand, proceed from an interpretivist philosophy, which asserts that reality is complex, interwoven and difficult to measure because it is constructed from each individual’s unique experiences. Qualitative data must be interpreted in order to establish a context through which behavior can be understood (Siegle 2007).
There are clear instances in which to use one or the other method depending on the research objectives. It’s possible to combine them for an even more robust set of results. The “interplay between descriptive richness and experimental precision can bring accounts of social phenomena to progressively greater levels of clarity” (Cupchik 2001). Yet too often in the business world, the differences in methods are not teased out and applied appropriately to the various types of projects that come our way. For example, we’ve seen “usability research” routinely scoped for all types of projects when it’s most appropriate for product optimizations or evolutions rather than product concepting or ideation. Online surveys continue to be used to ask how many people perform a certain action even when the objective is to understand the steps involved in a process.
In general, qualitative research is viewed as the weaker of the two, since its outcomes do not carry the singular authority that a statistic can confer (Myers 2000). For one thing, the qualitative method of eliciting stories lends itself to the false assumption that anyone could do this kind of research (Forsythe 1999). A second issue is that the analysis of the data rests in the hands of the researcher, who must elicit and make sense of patterns. The qualitative researcher is the instrument of both data collection and analysis, whereas with quantitative data, surveys and questionnaires are the instruments of data collection and statistics packages “crunch” the numbers. These differences underlie the perceived fallibility of humans as against the objective authority of numbers. Combine this with a general lack of understanding about how qualitative researchers actually do analysis, and it’s no wonder that skittish managers want the clarity and simplicity that numbers offer.
Within our field, the skills and methods for qualitative analysis are not as well defined or touted as the skills and methods for gathering the data. There are now many books and articles devoted to how to do “ethnography” and how qualitative research methods can be applied to understanding the “user experience.” However, a quick glance through the tables of contents of most of these books reveals very little guidance on how to analyze the data once it’s been gathered. Many in our community are still focused on redefining ethnography in the business context, who gets to do it and how it should be carried out. There are still online debates over who “owns” ethnography (anthrodesign 2007). This remains the dominant thread even though it ignores the power and opportunity in making sense of the data. Very few debates begin over what it means to do analysis and to make sense of the data; analytic processes are rarely surfaced. The methods used in academic contexts don’t translate well to the pace and demands of current business settings. As a result, a client’s understanding of qualitative analysis is minimal. They don’t understand the process and must take it on the researcher’s word that the data has been transformed into well-grounded recommendations.
Interpretation and data analysis are where research meets the actionable objectives that businesses have. Ignoring this or minimizing its role leaves the field open to others who would claim the “actionable” part for themselves. Other disciplines already take on the mantle of analysis and offer solutions to business questions based on the data gathered by researchers. Companies need analysis so that they can reach the outcomes they want, but we’re beginning to see this function given to those who are perceived to be “doers & makers”: visual/product designers, information architects and internet strategists (Nussbaum 2005). Current job descriptions for designers and strategists include more responsibility for research. Take, for example, a job description from IDEO for a conceptual designer: “‘You bring…a holistic approach to process: Formulating cultural and user insights, mapping opportunity spaces through strategic frameworks, and expressing compelling solutions.’ Ask yourself this: Who in your company at this moment is mapping out opportunity spaces through strategic frameworks?” (Nussbaum 2005). If we, as professional researchers, do not take this trend seriously, we may find ourselves left to gather data and hand it over to designers and strategists.
We wonder about this lack of focus on analysis. Is it tied to the larger environment in which we live, where the time and appetite for sense-making is in little evidence? Where our news is displayed in tickers scrolling across screens while “experts” yell over one another – neither making any sense but definitely being heard? Are we, in this field, acting as a mirror for US society at large?
PART III: HOW TO ADDRESS LACK OF SENSE-MAKING
While we may not be able to change the larger societal forces working against sense-making, there are ways to address the problem within our own field.
1. (Re)define research as sense-making
Sense-making, the core activity of research, is inherently an anti-node activity. It is focused on the layers of stories rather than the single, blinking utterance. When we consolidate nodes into a coherent whole, we are able to define the entire experience and create a more complete context for the question under investigation. The key feature of sense-making is transforming these layered stories into meanings, reconstructing and abstracting the experience into models that can be used as springboards for discussion and insight. Without meaning, the foundation for actions, needs and priorities is missing.
2. (Re)define us as sense-makers
The responsibility rests with researchers to be sense-makers in the fullest sense of the term. A researcher’s work involves more than gathering data; it includes extracting meaning from the data and showing how action can be taken from what the data indicates. We are professionals who craft sense and meaning. When we make this assertion, we shift attention away from our ability to simply ask questions and toward our ability to reveal answers in a way that is attractive to both our business sponsors and ourselves.
As sense-makers we are interested in the whole; therefore we employ complete and continuous attention to all nodes. In addition, we take in, or are at least aware of, all of Jakobson’s communication functions: the referential, emotive, conative, poetic, metalingual and phatic. Sense-making requires that we cast our net wide in order to place actions in their correct context.
Models of research findings have an additional benefit in that they help to dispel the perception that researchers are not doers, that they do not make anything. Researchers must claim the territory of sense-making and combine it with experience modeling in order to distinguish it from form creation (for designers) or business design (for strategists).
We also need to be more aggressive about promoting sense-making as decision definition, direction and support. What we call ourselves needs to highlight that we are more closely aligned with strategic or analytic functions. When we elevate the importance and usefulness of analysis, we are able to demonstrate to the client that we know what needs to be done.
3. Lead with good protocol design
Sense-making should be applied at the very beginning of a research project, when the research plan is being formulated. When, as is often the case, our clients are panicked about a particular problem, they will pressure us to begin work without a clear idea of what the actual problem may be. Sense-making requires a clearly defined objective and a short list of key questions that need to be answered. It also requires that we learn from the knowledge that has already been accumulated: secondary reports, clients’ market research studies, surveys, and so on. Being free to match the method to the objective is another component of good protocol design. As more clients develop internal research capabilities and become exposed to a variety of research methods, we find them dictating inappropriate methods for their particular questions; we should be especially wary of this tendency. Good research planning also requires that we be very selective about study subjects and that we work closely with the client to define how the research findings will be packaged.
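To make these components concrete, here is a minimal sketch, assuming a simple Python dataclass; the field names restate the checklist in the paragraph above, and the example values are invented, not any standard schema.

```python
# A minimal sketch of a protocol skeleton; the fields restate the checklist
# above (objective, key questions, prior knowledge, method, packaging), and
# the example values are invented for illustration.
from dataclasses import dataclass


@dataclass
class ResearchProtocol:
    objective: str               # one clearly defined objective
    key_questions: list[str]     # a short list of questions to answer
    prior_knowledge: list[str]   # secondary reports, market studies, surveys
    method: str                  # matched to the objective, not dictated
    packaging: str               # how findings will be delivered, agreed up front


protocol = ResearchProtocol(
    objective="Understand how first-time users decide whether to enroll online",
    key_questions=[
        "What triggers the decision to enroll?",
        "Where does the current flow break down?",
    ],
    prior_knowledge=["2006 segmentation study", "Q2 satisfaction survey"],
    method="in-home contextual interviews",
    packaging="experience model plus prioritized opportunity map",
)
print(protocol.objective)
```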
Good protocol design is the precursor to the sense-making that occurs in analysis. Creating hypotheses and frameworks of an experience to test in the fieldwork phase should be commonplace. As professional researchers we know a good deal about many processes: information-gathering, decision-making, relationship-building, etc. These are the foundations of how people act in the world and accomplish goals. Using these processes to create and guide protocols is good practice and provides a jump-start on the analysis. In a recent book, researchers at Cheskin lay out 15 “meanings” that they have consistently found in their research into the types of meaningful experiences people value (Diller, Shedroff & Rhea 2006). That kind of insight going into fieldwork is a huge boon to the analytic process. They also apply an “experience framework” with foundational components appropriate to any type of innovation project. The framework explicitly states the objective and then details the benefits derived from the research according to “functional value,” “economic value,” “emotional value” and “identity-creation.” This is an example of mining practices and outcomes so that sense-making is a smooth and predictable part of the overall process.
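As a rough, hypothetical illustration of how such a framework front-loads analysis, the sketch below groups fieldwork findings under the four value types named above; the structure and the example findings are our own, not Cheskin’s actual tooling.

```python
# A minimal sketch: grouping findings under the value types named in the
# experience framework above. The structure and findings are our own
# illustration, not Cheskin's actual framework.
from collections import defaultdict

findings = [
    ("functional value", "One-click reorder saves a weekly trip to the store"),
    ("economic value", "Bundled plan is cheaper than the two current bills"),
    ("emotional value", "Paper statements feel safer to older customers"),
    ("identity-creation", "Early adopters display the device prominently at home"),
]

framework = defaultdict(list)
for value_type, finding in findings:
    framework[value_type].append(finding)

for value_type, items in framework.items():
    print(f"{value_type}: {len(items)} finding(s)")
```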
4. Choose the most appropriate analytic method
Qualitative researchers are fortunate to have a wide variety of analytic frameworks at their disposal. A quick review of the landscape reveals 15 to 20 different approaches, which, compared to the single standardized toolkit of numerical analysis, is a bit daunting. It’s no surprise, then, that the individual qualitative researcher usually specializes in one analytic method, or two at most. These options are in addition to the numerous software programs we can use as data-organizing tools. Fundamentally, quantitative analysis involves reducing people (as observed directly or through their texts) to numbers, while qualitative analysis involves reducing people to words.
Mixed types for social scientists – Dr. Russell Bernard believes that “social researchers should be fluent in the full range of methods for collecting and analyzing data. The questions we ask, rather than ideological commitments, drive the choice of one method over another in any particular situation” (Bernard 1996). His assertion is that both qualitative and quantitative analysis should be applied to data. He points out that coding is just one of the steps in what is often called qualitative data analysis, or QDA: “Deciding on themes or codes is an unmitigated, qualitative act of analysis in the conduct of a particular study, guided by intuition and experience about what is important and what is unimportant. Once data are coded, statistical treatment is a matter of data processing, followed by further acts of data analysis” (Bernard 1996). Analysis, in his view, is an iterative process that combines both qualitative and quantitative methods.
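Bernard’s division of labor can be shown in a few lines: choosing the codes is the qualitative act; once segments are coded, counting them is routine processing. A minimal sketch, with invented segments and codes:

```python
# A minimal sketch of Bernard's point: deciding the codes is a qualitative act;
# once segments are coded, counting them is routine data processing.
# The segments and codes below are invented for illustration.
from collections import Counter

coded_segments = [
    ("P01", "I never read the statements, I just check the balance", "avoidance"),
    ("P02", "My daughter set the account up for me", "delegation"),
    ("P03", "I print everything, paper feels safer", "distrust"),
    ("P04", "I just glance at the balance on my phone", "avoidance"),
]

counts = Counter(code for _, _, code in coded_segments)
for code, n in counts.most_common():
    print(f"{code}: {n} of {len(coded_segments)} segments")
```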
Four types for health care – “Many ethnographers have spent years trying to make sense of her or his field notes. In health care we do not have that luxury or agony” (Stevens 2007). For healthcare researchers, Dr. Stevens advocates four techniques well suited to symptom research: Content Analysis, Grounded Theory, Narrative Summary Analyses and Triangulation. She also suggests protocols for ensuring the reliability and validity of results, because findings need to be compared across cases, institutions, populations, settings and time. In most business settings, we too lack the luxury of years to analyze our field notes.
Custom types for design – In the design research arena, social science methods have been further adapted to suit the needs of researchers operating under even tighter deadlines. For example, the “think, do, use” model and the analytic frame of AEIOU (Actions, Environment, Interactions, Objects and Users) employed by those at Doblin and E-Lab have since been handed down to others. These tools focus the analysis to construct meaning for design purposes. The Institute of Design offers graduate degrees in design research methods that include training in analytic frameworks such as Activity Life Cycle Analysis, Value Webs, and Insight Matrix.
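As an illustration of how such a frame structures raw observation, here is a minimal sketch of an AEIOU-shaped field note; the class and its contents are our own illustration, not Doblin’s or E-Lab’s actual tooling.

```python
# A minimal sketch of an AEIOU-structured field note; the class and contents
# are our own illustration, not Doblin's or E-Lab's actual tooling.
from dataclasses import dataclass


@dataclass
class AEIOUNote:
    actions: list[str]        # what people are doing
    environment: list[str]    # where the activity takes place
    interactions: list[str]   # exchanges between people, or people and things
    objects: list[str]        # artifacts present or in use
    users: list[str]          # who is present, and in what roles


note = AEIOUNote(
    actions=["comparing two phones side by side"],
    environment=["crowded retail floor, heavy signage"],
    interactions=["asks the clerk to repeat the plan pricing twice"],
    objects=["demo phones tethered to the table", "printed plan brochure"],
    users=["shopper in her 50s", "sales clerk"],
)
print(note.actions[0])
```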
5. Make analysis more visible to the client
Analysis gains value with our clients when they are able to see it as a distinct function offered by a researcher and as a proprietary tool that they pay for. Currently, analysis of research data in the business context is viewed as a bit of a black box. Our clients have likely accompanied us into the field, or at the very least attended sessions in a research facility, and have perhaps seen some of our raw interview notes and photos. What they rarely see, and what remains a complete mystery to them, is researchers putting the data through the analysis that allows us to draw conclusions and come up with insights. This needs to change. As long as our clients lack the understanding that it is analysis that transforms single narratives and imagery into holistic meaning, we will be fighting an uphill battle to justify the time and cost of our work.
6. Tie research to action outcomes
While we need to highlight the importance of analysis and the activities involved in analyzing our data, the primary and most important outcome of analysis is meaning. The meaning of the data is most effectively illustrated by models, diagrams and maps (McCotter 2001). Meaning visualization is a skill-set not often thought of or talked about in relation to researchers, but this too should change.
Models – When the whole experience has been revealed, illustrative models can satisfy the “urge for story” that is so often supplied, in error, by “node” photos and verbatims. Models of experience can also provide the springboard for determining where, how and when to take action. They can point to the solutions that would be most effective for particular audiences and to the moments when a need must be met in the context of the experience. When we diagram or model our findings, we are able to show the client how to meet their business objectives. We are also able to show more clearly how the research they undertook defines the actions they must take, clarifies the priorities they must set, and in some cases provides the surprise insights they were hoping for.
7. Visualize meaning to tell better stories
Researchers should become more fluent in meaning visualization, which is distinct from data visualization and its usual association with simple charts and graphs. We should be able to create process maps, decision trees and experience models that are visually pleasing and easy for both our clients and our design teams to understand. Finally, as researchers we should take advantage of the “urge for story” and use it to draw attention to the current experience as a whole as well as to the story of the future context.
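As one small example, the sketch below emits a Graphviz DOT description of a hypothetical experience map, so a finding can be handed off as a rendered diagram rather than a list of quotes; the stages are invented, and any DOT renderer (such as the standard dot command) can draw the output.

```python
# A minimal sketch: emitting a Graphviz DOT process map from coded stages so a
# finding can be rendered as a diagram. The stages are hypothetical; pipe the
# output to a DOT renderer (e.g., `dot -Tpng`) to draw it.
stages = [
    ("Hears about service", "Researches online"),
    ("Researches online", "Stalls at pricing page"),
    ("Stalls at pricing page", "Calls a friend for advice"),
    ("Calls a friend for advice", "Enrolls in branch, not online"),
]

lines = ["digraph experience {", "  rankdir=LR;"]
for src, dst in stages:
    lines.append(f'  "{src}" -> "{dst}";')
lines.append("}")
print("\n".join(lines))
```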
The explosion of stories presents us with a challenge, but one we are well equipped to tackle. We have before us the opportunity to show how seemingly disparate stories are indeed connected, and how meaning, decisions and actions can be derived from them, if only we take the time to conduct adequate analysis. Sense-making, as a core competency of research, could and should be better storytelling.
NOTES
Acknowledgments – the authors would like to thank Alex Taylor for his review of the early version of this paper and Rachel Lovinger for her assistance and expert guidance with the final drafts of this paper.
Bridget Regan is a Senior Researcher at Avenue A | Razorfish in New York City. She has conducted ethnographic research into a wide variety of experiences for a wide range of clients, most recently Ford Motor Company, Capital One, and XM Satellite Radio. She has previously worked as a Senior Experience Modeler with Sapient as well as an Independent Consultant for numerous design research, advertising, and branding projects.
Ajay Revels is a Senior User Researcher at Avenue A | Razorfish in New York responsible for work studies, process mapping and usability testing for clients such as Merrill Lynch, New England Journal of Medicine, Lionsgate and MTV. Prior work includes “ketai” studies in Japan and research for The Webby Awards and Qualcomm.
REFERENCES
Barthes, Roland
1996 Introduction to the Structural Analysis of the Narrative, Occasional Paper, Centre for Contemporary Cultural Studies, University of Birmingham
Bernard, Russell
1996 Qualitative Data, Quantitative Analysis, The Cultural Anthropology Methods Journal, Vol. 8, No. 1
Boztepe, Susan
2007 User Value: Competing Theories and Models, International Journal of Design, Vol. 1, No. 2
Cupchik, Gerald
2001 “Constructivist Realism: An Ontology that Encompasses Positivist and Constructivist Approaches to the Social Sciences”, Forum: Qualitative Social Research, Vol. 2, No. 1, February
Diller, Steve, Shedroff, Nathan and Rhea, Darrel
2006 Making Meaning: How Successful Businesses Deliver Meaningful Customer Experiences, New Riders Press
Forsythe, Diana E.
1999 “It’s Just a Matter of Common Sense”: Ethnography as Invisible Work, Computer Supported Cooperative Work, Vol. 8, No. 1-2, pp. 127-145
Grossman, Lev
Dec 13, 2006 Time’s Person of the Year: You, Time Magazine
Jakobson, R.
1960 Closing Statements: Linguistics and Poetics, in T. A. Sebeok (ed.), Style in Language, MIT Press, Cambridge
Laurel, Brenda and Lunenfeld, Peter
2003 Design Research: Methods and Perspectives, MIT Press
Lenhart, Amanda; Fox, Susannah
July 19, 2006 Bloggers: A portrait of the internet’s new storytellers, Pew Internet & American Life Project
Mark, Gloria; Gonzalez Victor M.; Harris, Justin
April 2-7, 2005 No Task Left Behind? Examining the Nature of Fragmented Work. CHI: Take a Number, Stand in Line
McCotter, Suzanne Schwartz
2001 The Journey of a Beginning Researcher, The Qualitative Report, Vol. 6, No. 2
Myers, Margaret
March 2000 Qualitative Research and the Generalizability Question: Standing Firm with Proteus, The Qualitative Report, Vol. 4, No. 3/4
Rae, Jeneanne
Nov 27, 2006 The Importance of Great Customer Experiences…And the Best Ways to Deliver Them, BusinessWeek
Tilly, Charles
2006 Why? What happens when people give reasons…and why, Princeton University Press
Wasson, Christina
2000 Ethnography in the Field of Design, Human Organization 59(4):377-388
WEB RESOURCES
Anthrodesign http://tech.groups.yahoo.com/group/anthrodesign/, accessed 20 Sept 2007
Brown, Jeffrey http://www.pbs.org/newshour/bb/business/july-dec06/youtube_10-10.html, accessed 22 July 2007.
Curry, Ian www.frogdesign.com/frogblog/twitter-the-missing-messenger.html, accessed 22 July 2007.
thecaster http://www.youtube.com/thecaster, accessed 22 July 2007.
Knowledge@Wharton http://knowledge.wharton.upenn.edu/article.cfm?articleid=971&CFID=26271218&CFto, accessed 31 August 2007
MasterCard http://www.priceless.com/us/personal/en/pricelesstv/index.html, accessed 20 September 2007.
Mills, Elinor http://news.com.com/Advertisers+look+to+grassroots+marketing/2100-1024-6057300.html, accessed 22 July 2007.
Nussbaum, Bruce http://www.businessweek.com/bwdaily/dnflash/mar2005/nf2005037_4086.htm, accessed 20 September 2007.
Petrecca, Laura http://www.usatoday.com/money/advertising/2007-06-20-cannes-cover-usat_N.htm, accessed 20 September 2007.
Siegle, Del http://www.gifted.uconn.edu/siegle/research/Qualitative/qualquan.htm, accessed 22 July 2007.
Stevens, Marguerite M http://symptomresearch.nih.gov/chapter_7/sec4_5/cmss45pg1.htm, accessed 20 September 2007.
Stone, Biz http://twitter.com/biz, accessed 22 July 2007.
Stone, Linda http://www.continuouspartialattention.jot.com/wikihome, accessed 22 July 2007.
Waters, Darren http://news.bbc.co.uk/2/hi/technology/6221588.stm, accessed 20 September 2007.