
Organizational Culture as Lazy Sensemaking: What Ethnographers Can Do about Fundamental Attribution Error


This essay represents the opinion of the author and not any of her employers, past or present.

The Fundamental Attribution Error

Lately I’ve been ruminating on the fundamental attribution error, also known as correspondence bias. Think of it as a lazy kind of sensemaking, a just-so story that lets us place more agency in individuals than is probably warranted. It’s a common category error (at least among the Western psych undergrads who volunteer for experimental lab credits): the tendency to attribute undesirable outcomes to individual traits without attending to the role of situational factors in shaping behavior and decision-making. Victim-blaming is an obvious example of attribution error: Really, what kind of idiot shops at that particular convenience store in that particular neighborhood at 2 a.m.? (Answer: A young single mother who works two jobs, one of which lets her off after midnight, and she knows she’s about to run out of diapers. That’s who.)

Fundamental attribution error can be difficult to detect and critique when it’s a subtle thread in complex social narratives of risk, personal responsibility, and Bad Things Happening. For example, at the height of the Madoff scandal, the news was full of pundits admonishing investors to remember that returns that seem too good to be true probably are too good to be true. Yes, Madoff was supremely unethical, and the Securities and Exchange Commission failed in its regulatory functions; but if Madoff’s victims had been appropriately skeptical, perhaps they wouldn’t have lost their savings. In contrast, more thoughtful analyses examined how several decades of social, political, and regulatory trends drew an unprecedented wave of investors into a weakly regulated marketplace. A great example is Michael Hiltzik’s 2009 essay in the LA Times about how systemic dysfunction created an environment in which Madoff’s claims seemed quite reasonable.

Cultural Attribution Error?

I think there’s a similar kind of category error in organizational discourse, something along the lines of a cultural attribution error. I’ve encountered this trope in contexts where people are engaged in sensemaking about some salient aspect of organizational behavior that isn’t readily attributed to obvious factors, like performance incentives or funding cycles. Now, I should point out that I’m not a social psychologist, so I don’t feel comfortable asserting a strong claim to the existence of a “cultural attribution error.” Actually, I’m not even sure it’s a weak claim. It’s a metaphor—and like most metaphors, it breaks in interesting ways.

For example, unlike the fundamental attribution error, what I’m calling “cultural attribution” isn’t invoked only to explain undesirable outcomes. A couple of weeks ago, I was walking down the hall of an office building and encountered a couple of staff members who had just finished sorting and tagging a room full of Christmas gifts for a local charity. Toys, books, clothes, and kitchen supplies were piled literally waist-high around them. When I commented on the remarkable success of the gift drive, one of them said, “Every year, people outdo themselves. This is just part of our culture.” Explaining our colleagues’ behavior in terms of “our culture” might not have much analytic purchase, but it expressively reinforces our sense of corporate identity. In this sense, deploying “our culture” becomes a way of telling ourselves stories about “who we are and how we do things around here,” as sociologist Penny Edgell Becker puts it.

Culture as Just-So Story

But attributing corporate behavior to “corporate culture” can be problematic, particularly when individuals in leadership positions are the ones engaged in attribution. I’m thinking here of management narratives that explain organizational dysfunction in terms of organizational “culture,” squarely locating causality in some posited set of internal values, beliefs, and/or attitudes driving workforce behavior in undesirable ways.

If you dig around a bit, you can find plenty of “cultural just-so” stories out there. Here’s an example from the National Safety Council’s website:

A safety culture can only grow when everyone in the company embraces safety as a key component in their everyday work.

Now, I don’t know about you, but I have yet to meet anyone who doesn’t believe that safety is a “key component” of everyday work. And to their credit, the National Safety Council does seem to recognize that reducing accident rates requires more than changing beliefs, attitudes, and values. But it seems to me that putting responsibility for safety primarily on the internal state and motivation of the worker is a limiting construct, one that doesn’t lead us to ask questions about the external, contextual factors that shape human understanding and behavior. Messaging is important, but industrial safety is a complicated contextual design problem that requires more than encouraging people to embrace “safety” as a normative construct. Creating a safe workplace requires careful analysis of the physical environment, policies, and processes that make up a context of work, informed by a good dose of practical wisdom from our colleagues in human factors.

“It’s the dysfunctional culture, stupid!”

To my mind, the most spectacular example of a cultural attribution error comes from Los Alamos National Laboratory (LANL) in the mid-2000s. I spent six years at Los Alamos between 1997 and 2003, first as a graduate student and then as a staff member, during one of the most tumultuous periods in the Laboratory’s history. Readers might remember that LANL experienced a series of espionage/information security crises in the late 1990s and early aughts, culminating dramatically in the reported disappearance of two compact disks containing classified information in 2004. Upon learning of the alleged loss, then-Laboratory director Peter Nanos halted work and excoriated staff (in writing) for perpetuating a “culture” replete with academic “cowboys” and “buttheads” who viewed security regulations as silly and compliance as a joke.

Having spent those six years at Los Alamos as an ethnographer, I consider myself pretty knowledgeable about the Laboratory during that period. I wasn’t there when this drama unfolded, but I can unequivocally state that I never saw the “culture” of willful rule-flouting that Nanos described. What I did see was a conscientious population of professionals under tremendous stress, working in an institution that had weathered a major change in its mission space and fifteen years’ worth of federal investigations into environmental, occupational safety, and information security practices, topped off in the spring of 2000 by a catastrophic forest fire that came close to destroying the entire town.

Ironically, the disks that evoked such fury were illusory: Seven weeks after international headlines announced that LANL had yet again misplaced classified national security data, Laboratory leadership quietly attributed the “loss” to an error involving bar codes accidentally assigned to nonexistent storage media. Maladaptive employee attitude wasn’t the issue—instead, the error seems to have stemmed from an overly complicated inventory management system.

What’s an Ethnographer to Do?

I’m sure I’m not the only person who grits her teeth upon hearing management consultants offer advice about changing the beliefs, attitudes, values, and whatnot that purportedly comprise organizational “culture.” As ethnographers and anthropologists, we should be prepared to explain that “culture,” when understood as a set of internal states or traits within a population, is neither explanation nor answer for organizational problems. Moreover, there is no such thing as a specialized “culture consultant” who holds a magic key to changing the internal beliefs, understandings, and attitudes of a population.

In my own organization, I am trying to challenge the idea of “culture” as a holding place for a) stuff managers haven’t yet worked to understand and b) stuff they are likely to associate with the attitudes, beliefs, values, etc. of their workforce. Instead of culture, I am trying to articulate concrete constructs and practical methods that can help my colleagues get a working handle on context (which can include everything from the physical environment to organizational timekeeping policies) and its relationship to individual and collective behavior. Ethnographic practice offers the richest set of constructs and tools for understanding context: through collaborative, participatory engagement, we bring empathetic-but-detached perspectives on organizational behavior, leading to practical design outcomes that can humanize the workplace for everyone.

Image: Illustration from Just So Stories (c. 1912) via Wikimedia


Laura A. McNamara, PhD, is an organizational anthropologist and Principal Member of Technical Staff at Sandia National Laboratories. She has spent her career partnering with computer scientists, software engineers, physicists, human factors experts, I/O psychologists, and analysts of all sorts. Most of her projects involve challenges in analytic technology adoption. She enjoys working with computer science and software teams in design strategies that enhance the future usability, utility, and adoptability of analytic software. She is co-editor of Anthropologists in the SecurityScape and Dangerous Liaisons: Anthropologists and the National Security State, and lead editor of the Elsevier Science Publications series Studies in Multidisciplinarity.


