Cultural cognition
The cultural cognition of risk, sometimes called simply cultural cognition, is the hypothesized tendency of persons to form perceptions of risk and related facts that cohere with their self-defining values. Research examining this phenomenon draws on a variety of social science disciplines including psychology, anthropology, political science, sociology, and communications. The stated objectives of this research are both to understand how values shape political conflict over facts (like whether climate change exists, whether gun control increases crime, whether vaccination of school girls for HPV threatens their health) and to promote effective deliberative strategies for resolving such conflicts consistent with sound empirical data.
Theory and evidence
The cultural cognition hypothesis holds that individuals are motivated by a variety of psychological processes to form beliefs about putatively dangerous activities that match their cultural evaluations of them. Persons who subscribe to relatively individualistic values, for example, tend to value commerce and industry and are inclined to disbelieve that such activities pose serious environmental risks. Persons who subscribe to relatively egalitarian and communitarian values, in contrast, readily credit claims of environmental risks, which is consistent with their moral suspicion of commerce and industry as sources of inequality and symbols of excessive self-seeking.[1][2]
Scholars have furnished two types of evidence to support the cultural cognition hypothesis. The first consists of general survey data that suggest that individuals’ values more strongly predict their risk perceptions than do other characteristics such as race, gender, economic status, and political orientations.[3][4]
The second type of evidence consists of experiments that identify discrete psychological processes connecting individuals’ values to their beliefs about risk and related facts.[5] Such experiments suggest, for example, that individuals selectively credit or dismiss information in a manner that reinforces beliefs congenial to their values.[6] They also show that individuals tend to be more persuaded by policy experts perceived to hold values similar to their own than by ones perceived to hold different values.[7] Such processes, the experiments suggest, often result in divisive forms of cultural conflict over facts, but can also be managed in ways that reduce such disagreement.[8]
Cultural cognition project at Yale Law School
Funded by governmental and private foundation grants, much of the work on cultural cognition has been performed by an interdisciplinary group of scholars affiliated with the Cultural Cognition Project.[9] There are currently over a dozen project members from a variety of universities. Two members of the project—Dan Kahan and Douglas Kysar—are Yale Law School faculty, although other members (such as Donald Braman of George Washington University Law School and Geoffrey Cohen of Stanford University) were previously affiliated with Yale Law School or Yale University. Students from Yale University also contribute to Project research.
Significant findings
Science comprehension and cultural polarization
A study conducted by Cultural Cognition Project researchers (using a nationally representative U.S. sample) found that ordinary members of the public do not become more concerned about climate change as their science comprehension increases.[10] Instead, the degree of polarization among cultural groups with opposing predispositions increases.
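The pattern reported in that study can be summarized as a cross-over interaction between science comprehension and cultural outlook. The sketch below is a minimal illustration of that pattern only; the function name and all coefficients are hypothetical and are not the study's actual estimates.

```python
# Illustrative sketch (hypothetical coefficients) of the polarization pattern
# reported by Kahan et al. (2012): average concern about climate change stays
# roughly flat as science comprehension rises, while the gap between opposing
# cultural groups widens.

def predicted_concern(science_literacy, group):
    """Toy linear model of climate-change concern on a 0-10 scale.

    science_literacy -- 0.0 (lowest) to 1.0 (highest)
    group            -- "egalitarian communitarian" or "hierarchical individualist"
    The coefficients are made up for illustration; they are not estimates
    from the study.
    """
    if group == "egalitarian communitarian":
        return 5.5 + 2.0 * science_literacy   # concern rises with comprehension
    return 4.5 - 2.5 * science_literacy       # concern falls with comprehension

for literacy in (0.0, 0.5, 1.0):
    gap = (predicted_concern(literacy, "egalitarian communitarian")
           - predicted_concern(literacy, "hierarchical individualist"))
    print(f"science literacy {literacy:.1f}: gap between groups = {gap:.2f}")
# The printed gap grows from 1.00 to 5.50, while the average of the two
# groups barely moves -- the signature of increasing cultural polarization.
```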
Nanotechnology
The Cultural Cognition Project has conducted a series of studies on public perceptions of nanotechnology risks and benefits. Combining survey and experimental methods, the studies present evidence that individuals culturally predisposed to be skeptical of environmental risks are both more likely to seek out information on nanotechnology and more likely to infer from that information that nanotechnology’s benefits will outweigh its risks. Individuals culturally predisposed to credit environmental risks, when exposed to that same information in the lab, construe it as implying that nanotechnology’s risks will predominate.[6] The studies also present evidence that individuals tend to credit expert information on nanotechnology, regardless of its content, according to whether they share the perceived cultural values of the expert communicator.[11] The studies were issued by the Project on Emerging Nanotechnologies at the Woodrow Wilson International Center for Scholars, one of the research sponsors.
"Scientific Consensus"
The same dynamics that motivate individuals of diverse cultural outlooks to form competing perceptions of risks are likely to cause them to form opposing perceptions of "scientific consensus," cultural cognition researchers have concluded.[12] In an experimental study, the researchers found that subjects were substantially more likely to count a scientist (of elite credentials) as an "expert" in his field of study when the scientist was depicted as taking a position consistent with the one associated with the subjects’ cultural predispositions than when the scientist took a contrary position. A related survey showed that members of opposed cultural groups hold highly divergent impressions of what most scientific experts believe on various matters, a finding consistent with the ubiquity of culturally biased recognition of who counts as an "expert." Across a range of diverse risks (including climate change, nuclear waste disposal, and private handgun possession), the study found that no cultural group was more likely than the others to hold perceptions of scientific consensus that consistently matched the positions taken in "expert consensus reports" issued by the U.S. National Academy of Sciences.
This finding has been contested by other scholars, however. For example, Lewandowsky, Gignac & Vaughan (2012) and the Gateway Belief Model of van der Linden et al. (2015) found that exposing people to the scientific consensus on human-caused climate change actually depoliticizes the issue and reduces political polarization. These studies suggest that the public (liberals and conservatives alike) use consensus information as an influential "gateway" heuristic to guide their beliefs and decisions on the issue.
Law
Scholars have also applied the cultural cognition of risk to legal issues. One such study examined how individuals reacted to a videotape of a high-speed police chase. In Scott v. Harris,[13] the U.S. Supreme Court (by a vote of 8–1) had held that no reasonable jury could view the tape and fail to find that the driver posed a lethal risk to the public large enough to justify deadly force by the police (namely, ramming the fleeing driver's vehicle, causing it to crash). The majority of study subjects agreed with the Court, but there were significant divisions along cultural lines.[14] Other studies have found that individuals' cultural worldviews influence their perceptions of consent in an acquaintance or date rape scenario,[15] and of the imminence of violence and other facts in self-defense cases involving either battered women or interracial confrontations.[16]
Relationship to other risk perception theories
Cultural cognition is a descendant of two other theories of risk perception. The first is the cultural theory of risk associated with anthropologist Mary Douglas and political scientist Aaron Wildavsky.[17] The cultural cognition hypothesis is derived from Douglas and Wildavsky's claim, advanced most notably in their controversial book Risk and Culture: An Essay on the Selection of Technical and Environmental Dangers (1982), that individuals selectively attend to risks in a manner that expresses and reinforces their preferred way of life.
Cultural cognition researchers, along with other scholars who have investigated Douglas and Wildavsky's theory empirically,[18] use attitudinal scales that reflect Douglas's worldview typology. That typology characterizes worldviews, or preferences about how society should be organized, along two cross-cutting dimensions: "group", which refers to how individualistic or group-oriented a society should be; and "grid", which refers to how hierarchical or egalitarian a society should be.[19]
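To illustrate how such scales can be used, the sketch below assigns hypothetical respondents to one of the four worldview quadrants implied by the typology. It is a minimal sketch only: the item scoring, midpoint cutoff, and function name are assumptions for illustration, not the actual instruments or scoring rules used by cultural cognition researchers, who typically analyze the two scales as continuous measures.

```python
# Minimal, hypothetical sketch of grid-group worldview classification.
# The items, scoring, and midpoint cutoff are illustrative only; they are
# not the actual scales used by the Cultural Cognition Project.

from statistics import mean

def classify_worldview(grid_items, group_items, midpoint=3.0):
    """Place a respondent in one of four worldview quadrants.

    grid_items  -- ratings (e.g., 1-5) on hierarchy-leaning statements;
                   higher mean = more hierarchical, lower = more egalitarian.
    group_items -- ratings on individualism-leaning statements;
                   higher mean = more individualistic, lower = more communitarian.
    """
    grid_label = "hierarchical" if mean(grid_items) >= midpoint else "egalitarian"
    group_label = "individualist" if mean(group_items) >= midpoint else "communitarian"
    return f"{grid_label} {group_label}"

# A respondent who rejects hierarchy items but endorses individualism items:
print(classify_worldview(grid_items=[2, 1, 2], group_items=[4, 5, 4]))
# -> egalitarian individualist
```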
The second theory is the "psychometric paradigm", to which Paul Slovic, a member of the Cultural Cognition Project, has made significant contributions. The psychometric paradigm links risk perceptions to various cognitive and social mechanisms that are not captured by the simpler rational-choice models associated with economics.[20][21] Cultural cognition theory posits that these mechanisms mediate between individuals' cultural values and their perceptions of risk and other policy-relevant beliefs.
Combining the cultural theory of risk and the psychometric paradigm, cultural cognition, its exponents claim, remedies difficulties with each.[22] The mechanisms featured in the psychometric paradigm (and in social psychology generally) furnish a cogent explanation of why individuals adopt states of mind that fit and promote the aims of groups, including ones featured in Douglas’s culture theory. They do so, moreover, in a manner that avoids "functionalism," a criticized form of analysis that identifies group interests, rather than individual ones, as a cause for human action.[23][24] At the same time cultural theory, by asserting the orienting role of values, explains how the mechanisms featured in the psychometric paradigm can result in differences in risk perception among persons who hold different values. The interrelationship between individual values and perceptions of risk also calls into doubt the depiction of risk perceptions deriving from these mechanisms as products of irrationality or cognitive defect.[25]
Criticisms
Cultural cognition has been subjected to criticism from a variety of sources. The most comprehensive critique of cultural cognition to date was published by the Princeton social psychologist van der Linden (2015), who suggests that cultural cognition lacks a true "cultural" dimension and that most of its theoretical constructs are not properly defined and differentiated (e.g., values vs. culture vs. worldviews), resulting in a so-called "strange loop". Similarly, Cass Sunstein argues that "cultural cognition" is used as a heuristic, an imperfect mechanism, without a deeper explanation of the role of "culture".[26] Sunstein writes: "by definition, the idea of cultural cognition is to illuminate risk perceptions only for those risks that are culturally contested" (p. 17).[26] Van der Linden argues that this results in a strange self-reinforcing loop (i.e., cultural polarization is predicted based on the observation that cultural polarization is present to begin with, and analyzed by artificially dividing individuals into two polarized groups). Van der Linden also contends that the cultural cognition thesis greatly exaggerates the extent to which the proposed cognitive mechanisms (e.g., motivated reasoning) generalize across groups. The rational-choice economists Fremling & Lott (2003) and the psychologist Sjöberg (1998) have also suggested that the theory (like others based on the cultural theory of risk) explains only a small fraction of the variation in popular risk perceptions. Mary Douglas herself has criticized cultural cognition for a conception of values that is too tightly modeled on American political disputes and that implicitly disparages the "hierarchical" worldview.[27]
See also
- Cognitive biases
- Cognitive dissonance
- Cultural bias
- Cultural theory of risk
- Information deficit model
- Outrage factor
Notes
1. Kahan (2010a), p. 296.
2. Kahan et al. (2006), pp. 1083–84.
3. Kahan & Braman (2006), pp. 155–158.
4. Kahan et al. (2006), pp. 1086–87.
5. Kahan et al. (2007), p. ?.
6. Kahan et al. (2009), p. ?.
7. Kahan et al. (2010), p. ?.
8. Kahan (2007), pp. 138–139.
9.
10. Kahan et al. (2012), p. ?.
11. Kahan et al. (2008), pp. 5, 14.
12. Kahan, Jenkins-Smith & Braman (2010), p. 27.
13. 550 U.S. 372 (2007).
14. Kahan, Hoffman & Braman (2009), p. 837.
15. Kahan (2010b).
16. Kahan & Braman (2008).
17. Rayner (1992), p. ?.
18. For example, Dake (1991), Jenkins-Smith (2001), and Peters & Slovic (1996).
19. Rayner (1992), pp. 87–91.
20. Slovic (2000), p. ?.
21. Kahneman, Slovic & Tversky (1982), p. ?.
22. Kahan (2008), p. ?.
23. Boholm (1996), pp. 68, 79–80.
24. Kahan & Braman (2006), p. 252.
25. Kahan et al. (2006), pp. 1088–1106.
26. Sunstein (2006).
27. Douglas (2003).
References
- Boholm, Åsa (1996), "Risk perception and social anthropology: critique of cultural theory", Ethnos, 61 (1–2): 64–84
- Dake, Karl (1991), "Orienting Dispositions in the Perception of Risk: An Analysis of Contemporary Worldviews and Cultural Biases", Journal of Cross-Cultural Psychology, 22: 61–82, doi:10.1177/0022022191221006
- Douglas, Mary (2003), "Being Fair to Hierarchists", University of Pennsylvania Law Review, 151 (4): 1349–1370, doi:10.2307/3312933
- Fremling, Gertrud M.; Lott, John R. (2003), "The Surprising Finding That "Cultural Worldviews" Don't Explain People's Views On Gun Control.", University of Pennsylvania Law Review, 151 (4): 1341–1348, doi:10.2307/3312932
- Jenkins-Smith, Hank (2001), "Modeling Stigma: An Empirical Analysis of Nuclear Waste Images of Nevada.", in Flynn, James; Slovic, Paul; Kunreuther, Howard, Risk, Media, and Stigma, London; Sterling, VA: Earthscan, pp. 107–132
- Kahan, Dan (2007), "The Cognitively Illiberal State", Stanford Law Review, 60: 115–154
- Kahan, Dan (2008), "Cultural Cognition as a Conception of the Cultural Theory of Risk", Cultural Cognition Project Working Paper No. 73
- Kahan, Dan (2010a), "Fixing the Communications Failure", Nature, 463 (7279): 296–297, doi:10.1038/463296a
- Kahan, Dan (2010b), "Culture, Cognition, and Consent: Who Perceives What, and Why, in Acquaintance Rape Cases", University of Pennsylvania Law Review, 158: 729–812
- Kahan, Dan; Braman, Donald (2006), "Cultural Cognition of Public Policy", Yale Journal of Law and Public Policy, 24: 147–170
- Kahan, Dan; Braman, Donald (2008), "The Self-defensive Cognition of Self-defense", American Criminal Law Review, 45 (1): 1–65
- Kahan, Dan; Braman, Donald; Slovic, Paul; Gastil, John; Cohen, Geoffrey (2007), The Second National Risk and Culture Study: Making Sense of–and Making Progress in–the American Culture War of Fact, Harvard Law School Program on Risk Regulation Research Paper, 08-26
- Kahan, Dan; Braman, Donald; Cohen, Geoffrey; Slovic, Paul; Gastil, John (2010), "Who Fears the HPV Vaccine, Who Doesn't, and Why: An Experimental Study of the Mechanisms of Cultural Cognition", Law and Human Behavior, 34 (6): 501–516, doi:10.1007/s10979-009-9201-0
- Kahan, Dan; Hoffman, D.A.; Braman, Donald (2009), "Whose Eyes are You Going to Believe? Scott v. Harris and the Perils of Cognitive Illiberalism", Harvard Law Review, 122 (3): 837–906
- Kahan, Dan; Jenkins-Smith, Hank; Braman, Donald (2010), "Cultural Cognition of Scientific Consensus", Journal of Risk Research, 14: 147–174, doi:10.1080/13669877.2010.511246
- Kahan, Dan; Peters, Ellen; Wittlin, Maggie; Slovic, Paul; Ouellette, Lisa Larrimore; Braman, Donald; Mandel, Gregory (2012), "The polarizing impact of science literacy and numeracy on perceived climate change risks", Nature Climate Change, 2: 732–735, doi:10.1038/nclimate1547
- Kahan, Dan; Slovic, Paul; Braman, Donald; Cohen, Geoffrey; Gastil, John (2009), "Cultural Cognition of the Risks and Benefits of Nanotechnology", Nature Nanotechnology, 4 (2): 87–90, doi:10.1038/nnano.2008.341
- Kahan, Dan; Slovic, Paul; Braman, Donald; Gastil, John (2006), "Fear of Democracy: A Cultural Critique of Sunstein on Risk", Harvard Law Review, 119: 1071–1109
- Kahan, Dan; Slovic, Paul; Braman, Donald; Gastil, John; Cohen, Geoffrey; Kysar, Douglas (2008), Biased Assimilation, Polarization, and Cultural Credibility: An Experimental Study of Nanotechnology Risk Perceptions, Project on Emerging Nanotechnologies Research Brief (3)
- Kahneman, Daniel; Slovic, Paul; Tversky, Amos (1982), Judgment Under Uncertainty: Heuristics and Biases, Cambridge; New York: Cambridge University Press
- Peters, Ellen; Slovic, Paul (1996), "The Role of Affect and Worldviews as Orienting Dispositions in the Perception and Acceptance of Nuclear Power", Journal of Applied Social Psychology, 26 (16): 1427–1453, doi:10.1111/j.1559-1816.1996.tb00079.x
- Rayner, Steve (1992), "Cultural Theory and Risk Analysis.", in Krimsky, Sheldon; Golding, Dominic, Social Theories of Risk, Westport, Conn: Praeger, pp. 83–115
- Sjöberg, Lennart (1998), "World Views, Political Attitudes, and Risk Perception", Risk: Health, Safety and Environment, 9: 137–152
- Slovic, Paul (2000), The Perception of Risk, London; Sterling, VA: Earthscan Publications
- Sunsein, CR (2006), "Misfearing: A Reply" (PDF), Harvard Law Review, 119 (4): 110–1125
- Sunsein, CR (2006), "On the Divergence of American Reactions to Terrorism and Climate Change", John M. Olin Law and Economics Working Paper, 295
- van der Linden, Sander (2015), "A Conceptual Critique of the Cultural Cognition Thesis", Science Communication, 38 (1): 128–138, doi:10.1177/1075547015614970
- van der Linden, Sander; Leiserowitz, Anthony; Feinberg, Geoff; Maibach, Edward (2015), "The Scientific Consensus on Human-Caused Climate Change as a Gateway Belief: Experimental Evidence", PLOS ONE, 10 (2): e0118489, doi:10.1371/journal.pone.0118489
- Lewandowsky, Stephan; Gignac, Gilles; Vaughan, Samuel (2012), "The pivotal role of perceived scientific consensus in acceptance of science", Nature Climate Change, 3: 399–404, doi:10.1038/nclimate1720
Further reading
- Bailey, R. The Culture War on Facts: Are You Entitled to Your Own Truth? Reasonline, Oct. 9, 2007.
- Bailey, R. Everyone Who Knows What They're Talking About Agrees with Me Reasonline, Feb. 23, 2010.
- Bond, M. How to Keep Your Head in Scary Situations. New Scientist, Aug. 27, 2008.
- DiMaggio, P (1997). "Culture and Cognition". Annual Review of Sociology. 23: 263–287.
- Douglas, Mary., & Wildavsky, A. B. (1982). Risk and Culture : An Essay on the Selection of Technical and Environmental Dangers. Berkeley: University of California Press.
- Finucane, M.; Slovic, P.; Mertz, C. K.; Flynn, J.; Satterfield, T. A. (2000). "Gender, Race, and Perceived Risk: The "White Male" Effect". Health, Risk, and Society. 3 (2): 159–172.
- Flynn, J.; Slovic, P.; Mertz, C. K. (1994). "Gender, Race, and Perception of Environmental Health Risk". Risk Analysis. 14 (6): 1101–1108.
- Jones, R. Fearing the Fear of Nanotechnology. Nature News, Dec. 9, 2008.
- Joyce, C. Belief In Climate Change Hinges On Worldview. NPR: All Things Considered, Feb. 23, 2010.
- Kahan, D., Braman, D., Gastil, J., Slovic, P., & Mertz, C. K. (2007). Culture and Identity-Protective Cognition: Explaining the White-Male Effect in Risk Perception. Journal of Empirical Legal Studies, 4(3), 465-505.
- National Science Foundation. New Studies Reveal Differing Perceptions of Nature-Altering Science, Dec. 11, 2008.
- National Science Foundation. Why "Scientific Consensus" Fails to Persuade, Sept. 13, 2010.
- Palmer, C. (2003). Risk Perception: Another Look at the "White Male Effect." Health, Risk & Society.
- Shea, Christopher. The Ninth Annual Year in Ideas: Cognitive Illiberalism. N.Y. Times Sunday Magazine, Dec. 10, 2009.
- Vedantam, Shankar. Why Voters Play Follow-the-Leader. Washington Post, Feb. 4, 2008, A3.
- Weber, Bruce. The Deciders: Umpires v. Judges, New York Times, July 12, 2009, WK1.
External links
- Cultural Cognition Project website
- Public Lecture on Cultural Cognition by Dan Kahan, University of Florida, Oct. 6, 2009