The Truth About Denial
Bias and Self-Deception in Science, Politics, and Religion
ADRIAN BARDON
Oxford University Press is a department of the University of Oxford. It furthers the University’s objective of excellence in research, scholarship, and education by publishing worldwide. Oxford is a registered trade mark of Oxford University Press in the UK and certain other countries.
Published in the United States of America by Oxford University Press, 198 Madison Avenue, New York, NY 10016, United States of America.
© Oxford University Press 2020
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, without the prior permission in writing of Oxford University Press, or as expressly permitted by law, by license, or under terms agreed with the appropriate reproduction rights organization. Inquiries concerning reproduction outside the scope of the above should be sent to the Rights Department, Oxford University Press, at the address above.
You must not circulate this work in any other form and you must impose this same condition on any acquirer.
CIP data is on file at the Library of Congress
ISBN 978–0–19–006227–9 (pbk.)
ISBN 978–0–19–006226–2 (hbk.)
ISBN 978–0–19–006229–3 (epub.)
Contents
Preface
Acknowledgments
1. Bias and Belief
1.1 What Is Denial?
1.2 “Hot Cognition”
1.3 Mechanisms of Motivated Reasoning
1.4 The Origins of Denial
1.5 Pathological Ideology, and Denialism as a Social Phenomenon
2. Science Denial
2.1 Climate Science Denial
2.2 Personality, System, and Status Quo
2.3 The Asymmetry Thesis
2.4 Science and Societal Change
3. Pride, Prejudice, and Political Economy
3.1 Political Economics
3.2 Poverty and the Fundamental Attribution Error
3.3 Classism and Racial Stereotyping
3.4 The Liberty Argument
3.5 Asymmetry Again
4. Religion
4.1 Reasons to Believe
4.2 Is It Even a Belief?
4.3 The Origins of Religiosity
4.4 Tenacity
4.5 The Problem(s) with Religion
4.6 The Retreat into Abstraction
Afterword: Directions for Science Communication
A.1 More Information?
A.2 Message Framing and Delivery
Index
Preface
Eighteenth-century philosopher David Hume is justly celebrated for treating the mind as part of nature, rather than as something that belongs to some spiritual realm separate from the material world. “Naturalism” about the mind just means that the mind’s operations are natural phenomena that obey rules like any other phenomenon described by science. According to naturalism, the proper study of our mental operations is through natural sciences like biology, neuroscience, and psychology. In offering explanations for various phenomena, the sciences center on the search for causal regularities. In seeking to understand belief as a naturally occurring phenomenon, Hume therefore decided to focus on what causes one to hold a given belief, rather than on what justifies one in holding a belief. This might seem the wrong approach if one felt that (a) one’s beliefs about the world are typically well grounded in reasons, and (b) the process of arriving at—and acting on—those beliefs is accurately described as one that involves a dispassionate weighing of evidence before reaching a conclusion. But human beings rarely, if ever, operate this way. Hume argued that reason is an essential tool for achieving our ends, but it remains a “slave” to the passions, in that our actions are ultimately explained by our motives—and reason by itself does not motivate.
The often-insidious influence of unconscious motives on our actions is most apparent in the area of ideology. Philosophers, historians, psychologists, political scientists, and other students of human nature have long noted that we exhibit an enormous susceptibility to unconscious bias in our beliefs. Our interests and emotional needs affect not just our values and choices but also our factual picture of the world around us. In his book on self-deception, psychologist Harry Triandis discussed the many reasons
to conclude that “people often see what they wish to see, and believe what they wish to believe.”1 Our picture of the world is distorted by self-interest, peer influence, prejudice, fear, and favoritism, and we are often not aware of the influence our motives have on our factual understanding of the evidence for our conclusions. Triandis described how we tend to prefer (again, often without self-awareness) explanations of phenomena that conform to our favored view of things. We seek out evidence and opinions that tend to confirm our prejudices; we ignore or avoid unwanted information. At the same time, we routinely view ourselves as more objective in our judgments than our ideological opponents.
An approach to understanding the mind’s operations that focuses on motives seems particularly apt when our subject of study is ideological or doctrinal belief. Political, religious, and other worldviews include certain ideals and prescriptions, but such worldviews themselves rest on a bedrock of factual claims about the world. Because we are not dispassionate about our ideological commitments, it is exceedingly difficult (indeed, almost unheard of) to be entirely dispassionate in the way we account for them using facts and evidence. The justifications we offer for our ideological positions are suffused by unconscious, implicit bias, and are maintained by selective attention to evidence. “Denial” is a word we sometimes use in describing the psychological state of those who are self-deceived about the real causes for the beliefs they hold. Any economist will tell you that human behavior is all about incentives. We are increasingly coming to understand that factual belief can work much the same way.
Those with an interest in manipulating public opinion are happy to exploit this aspect of human nature by spinning the truth in ways that appeal to existing prejudices. As a result, ideological partisans wind up disagreeing not just on policy preferences but even on basic facts.
Of late, observers of the U.S. political landscape have been commenting more and more on the alarming ways in which Americans of different political persuasions and cultural, racial, and other identity groups seem not just to disagree on issues but
also to be living in different realities. One area where this situation has significant consequences is in the way people can interpret reports of scientific consensus differently, depending on their prejudices and allegiances. Different people, for example, may hear about the science on the human causes of climate change and— sincerely—perceive either certainty, uncertainty, or outright hoax. This phenomenon undercuts public discourse on matters where public policy grounded in solid science has never been more essential.
This phenomenon is on a continuum with the way in which different people can look at those living in poverty, and see them either as victims of unfair circumstances or as people who are complicit in a culture of irresponsibility and dependency. Different people will consider a given refugee population, and see either an alien threat to our way of life or deserving potential members of our society. Different people will see a video of a police shooting; some will see justification and others will see murder.
An environment of polarization, prejudice, bias, and willful self-deception, combined with an often misleading political and media environment, is toxic for political discourse. Polarization on matters of fact is affecting progress on matters of critical public importance, such as action on climate.
Research on denial has exploded over just the last few years. This includes game-changing work from social, political, cognitive, and evolutionary psychology, as well as from sociology, communication studies, political science, history, and philosophy. My goal has been to bring this diverse work together for the reader while, I hope, convincing readers of the urgent importance of gaining a better understanding of unconscious bias and self-deception. Denial concerns all of us—both as victims and as perpetrators—and so this work is intended not just for an academic audience; it is for everyone.
1 Harry Triandis, Fooling Ourselves: Self-Deception in Politics, Religion, and Terrorism (Westport, CT: Praeger, 2009).
Acknowledgments
This wide-ranging project would have been impossible without the contributions of others. I benefited from discussions with Emily Austin, Mark Bedau, Melissa Harris-Perry, Ana Iltis, Justin Jennings, Ralph Kennedy, Win-Chiat Lee, Christian Miller, Naomi Oreskes, Keith Payne, Jedediah Purdy, Maura Tumulty, and Alan Wilson. A 2015 seminar here at Wake Forest on “The Science of Science Denial” allowed me to have extremely productive interactions with Peter Ditto, Heather Douglas, Erin Hennes, Neil Van Leeuwen, Aaron McCright, Mark Navin, Brendan Nyhan, Jay Odenbaugh, Vanessa Schweizer, Elizabeth Suhay, and Sara Yeo. A number of these scholars read portions of my manuscript and made really helpful comments. (Peter Ditto and Elizabeth Suhay need to be given special mention as providing comments, advice, and guidance above and beyond the call of duty.) The amazing Alex Madva read drafts of several chapters and made many great suggestions.
I would like to thank Oxford University Press executive editor Peter Ohlin for his support and advice. I also benefited greatly from the comments and suggestions made by six anonymous referees for Oxford University Press. Peer review is a time-consuming job with little reward, yet it is of absolutely central importance to research and scholarship.
Jacque Acierno helped with research work for the project, and Kathryn Dillin and Ally Howell each provided editorial assistance. Tyler Pruitt rendered the two charts in chapter 3 using data I supplied.
I am grateful for the support I received, over the final year of this project, as a Fellow with the Humility and Conviction in Public Life project at the University of Connecticut.
I have been very fortunate to be able to discuss the issues covered in this book with my wife, Janna Levin; she also checked the manuscript for errors. Dr. Levin made many sacrifices so I could have the time to work on this project over the last few years. This book is dedicated to her, as well as to my two wonderful boys, Zev and Max.
Bias and Belief
1.1 What Is Denial?
In his 1689 book An Essay Concerning Human Understanding, English philosopher John Locke laments the human tendency to close the mind off to unwanted conclusions:
Let ever so much probability hang on one side of a covetous man’s reasoning, and money on the other; it is easy to foresee which will outweigh. Earthly minds, like mud walls, resist the strongest batteries: and though, perhaps, sometimes the force of a clear argument may make some impression, yet they nevertheless stand firm, and keep out the enemy, truth, that would captivate or disturb them. Tell a man passionately in love that he is jilted; bring a score of witnesses of the falsehood of his mistress, it is ten to one but three kind words of hers shall invalidate all their testimonies. Quod volumus, facile credimus; what suits our wishes, is forwardly believed, is, I suppose, what every one hath more than once experimented: and though men cannot always openly gainsay or resist the force of manifest probabilities that make against them, yet yield they not to the argument.1
This observation about human nature is pretty uncontroversial. Indeed, as social psychologist Peter Ditto puts it, the pervasive influence of our hopes and fears on our judgment “would likely seem so obvious to the average person as to defy the need for empirical confirmation.”2 Individual factual beliefs often derive not from a cold assessment of probabilities but, rather, from a psychological phenomenon sometimes simply called denial. Denial
involves the emotionally motivated rejection (or embrace) of a factual claim in the face of strong evidence to the contrary. Easily recognizable examples include denying one’s spouse is being unfaithful despite ample evidence that he or she is cheating; denying that one has a terminal illness despite diagnoses to that effect; or denying one is an alcoholic despite a history of heavy drinking with destructive consequences. In such cases we colloquially describe the person as being “in denial.” (The word “denial” suggests disbelief rather than a positive assertion, but as a misrepresentation of reality, denial can be expressed in terms of either denying something true or affirming something false: The person denying he is an alcoholic may say “I am not an alcoholic,” or, affirmatively, “I can stop drinking anytime I like.” The candidate down 20 points in the polls who privately insists that she can still win the election may be in denial.3)
We may find applications of the concept of being “in denial” most familiar in cases of personal difficulties like those just mentioned.4 But such cases are structurally identical to many instances of belief of much more public import. One pressing example of tendentious belief in the face of contrary evidence is the sincere denial of the reality, severity, and/or urgency of anthropogenic global warming (AGW). Given adequate information about the clear scientific consensus on the overall situation, no one should be denying AGW with confidence. Just as in the more common, more personal instances of denial, the selective representation of the climate consensus is based on a preexisting, affective attachment to a particular conclusion. Another significant example is the not uncommon belief in the inherent superiority of one’s own race or ethnicity—or in the inferiority of another’s. When sincerely articulated by someone who is sane and moderately well informed, this sort of denial of reality—just as in the case of someone who wants to disbelieve one’s spouse has been unfaithful, or who wants to believe he or she has many more years to live— derives from wanting the world to be a certain way that it evidently isn’t. Any sincere statement like “My husband would never do that to me,” or “The Armenian genocide is a myth,” or “Vaccines
frequently cause injury,” or “President Obama was born in Kenya,” or “My financial success has had nothing to do with my inheritance,” or “The Cowboys are definitely going to win the Super Bowl this year” is an indication that the sincere speaker is in denial when the speaker (a) has little reason, all things considered, to believe the claim; (b) has been exposed to good reasons, all things considered, to doubt it; and (c) has some emotional need to believe it that accounts for the belief (i.e., if the emotional need weren’t there, the belief wouldn’t be either).
Beliefs like these are not purely self-generated. Powerful political or economic elites, through their paid agents or media surrogates, may be motivated to deliberately misinform the public on various issues. Such efforts are no doubt helped along by ignorance. Many Americans are uninformed about science, the economy, and many other issues relevant to social and economic policy. Obviously, many Americans deny the reality and severity of climate change. A majority deny the evolution of human beings by natural selection.5 When U.S. adults are asked what percentage of the federal budget goes to foreign aid, the median response is 25% (the real foreign aid figure is less than 1%); Americans also grossly overestimate how much those from the U.S. middle class pay in federal income taxes.6 Despite the fact that violent crime in the United States has fallen by over 50% since 1992,7 year after year a majority of Americans report an overall increase in violent crime.8 When asked, in a 2016 Ipsos-MORI poll, what percentage of wealth is held by the bottom 70% of Americans, U.S. respondents guessed 28%, whereas the actual figure is about 7%.9 In the same poll, the average U.S. respondent’s guess as to the Muslim population in the United States was 17%, whereas the actual figure is about 1%. Survey after survey shows that voters know very little about political party platforms, and yet voters’ own policy preferences are heavily influenced by what the party elites favor. But to bring the public along, lies and demagoguery need to find fertile ground. Hitler’s claims that the Jews were responsible for Germany’s economic problems were only effective because they catered to a baseline anti-Semitism on the part of a substantial
portion of the German population. False claims about climate science by vested interests and their allies find a receptive audience in those with preexisting anti-government inclinations. Doctrines upholding the special, divinely chosen status of some particular religious or ethnic group persist because they satisfy powerful emotional needs for affirmation, status, security, and/or meaning. This is why the study of problems caused by the public and private misunderstanding of reality needs to look not just at misinformation but also at the murky psychological processes that allow bias and self-deception to thrive.
The purpose of this book is to examine the pervasive human tendency to deny uncomfortable truths and to discuss how this tendency affects public discourse—as well as private life—on an exceedingly wide range of important topics. The phenomenon of denial, as we shall see, is dependent on motivated cognition. “Motivated cognition” refers to the “unconscious tendency of individuals to process information in a manner that suits some end or goal extrinsic to the formation of accurate beliefs.”10 Motivated cognition happens behind the scenes, but is closely tied to the more overt rationalization of belief, which I shall define as the process of retroactively inventing defensive justifications for holding those beliefs formed via motivated cognition. Motivated cognition is about belief formation, whereas rationalization is about maintaining and defending beliefs. Rationalization is thus a kind of second stage for motivated cognition. Unlike motivated cognition, explicit rationalization is a conscious process, though we are often not consciously aware of our motives when we engage in it. (I shall use the familiar phrase motivated reasoning—the popular use of which doesn’t generally distinguish between initial motivated cognition and the second-stage rationalization of that way of thinking—to denote the whole process wherein implicit, motivated cognition is followed by the generation of spurious reasons to maintain those sincerely held beliefs formed via motivated cognition.)
Let’s get a little clearer on exactly what “denial” does and does not include, for purposes of this discussion. It does not refer, for instance, simply to being misinformed. I wish to examine denial
strictly in that sense of being “in denial” wherein the denier is exhibiting a kind of emotionally self-protective self-deception. (Denial is often misattributed to ignorance; as I shall discuss further, there is good reason to think that the real issue is motivated reasoning.) Denial, in this context, presumes some exposure to relevant—and unwelcome—facts and constitutes a kind of reaction to them. This sort of self-deception is different from mendacity, wherein one purposefully lies to others about the existence of evidence for something, or deliberately misrepresents the evidence. One might know perfectly well, for example, that one’s oil company is responsible for a toxic spill, and respond by actively and consciously engaging in a cover-up and public denial of responsibility.11
Neither am I talking about “spin,” or what philosopher Harry Frankfurt has termed bullshit.12 The bullshitter’s intent is not to lie but, rather, to influence or to create a certain reality; he or she is simply indifferent as to whether his or her claims are true or false. The job of the trial attorney, the political operative, or the commercial advertiser is neither to uphold the truth nor to lie; rather, the job is to represent one’s client in the best light possible.
Being in denial is also to be distinguished from wishful thinking. What wishful thinking has in common with denial is that each fulfills an emotional need of some kind. However, with wishful thinking, there is a belief without solid evidence for a conclusion one way or the other. You might wishfully believe, for example, that an acquaintance is romantically interested in you, despite having no clear positive indication of this. This becomes denial only if you come to discredit strong evidence to the contrary, such as the knowledge that the object of your affections is romantically involved with someone else, or is only attracted to members of a different sex, and so on. Unlike beliefs arising from denial, beliefs arising from wishful thinking can even become what philosopher Neil Van Leeuwen calls “self-fulfilling beliefs”: An otherwise unwarranted confidence in, say, romantic or athletic prospects can sometimes contribute to the actual fulfillment of those prospects.13 These are also sometimes called “positive
illusions,” and a tendency to experience them may be adaptive, in that a stubborn disposition to maintain a particular belief in the face of contrary evidence might sometimes work in one’s favor.14 Negative emotions can hamper our ability to function, and some ability to automatically discount the factual sources of some negative emotions may be adaptive. As psychologist Timothy Wilson puts it, people are “equipped with powerful psychological defenses that operate offstage, rationalizing, reinterpreting, and distorting negative information in ways that ameliorate its impact.”15 (He calls this our “psychological immune system.”) A tendency to unrealistically positive self-appraisal may give us the confidence to overcome daunting challenges. Undue discounting of the odds against us when faced with, say, an external threat, may expedite a productive response by heading off paralyzing fear. Belief in a benevolent higher power may solidify group membership, or provide the comfort we need to endure loss. Unfortunately, what in some contexts might be an adaptive—even charming—facet of human nature can have very bad consequences. Wishful thinking can easily morph into denial when the evidence turns against you, at which point the failure to respond appropriately to the facts can be destructive on a personal, societal, or global scale.
We sometimes use the word “delusional” as a derogatory term for people who we think are in the grips of motivationally biased thinking, but there is an important distinction between denial and delusion. Delusions arise from illness or psychiatric disorder (like schizophrenia), or from injury (e.g., phantom limb syndrome), rather than from emotional need.16 Neurologist Robert Burton describes known delusional conditions like Cotard’s syndrome (in which a person suffers from an unshakeable conviction that he or she is already dead) or Capgras syndrome (in which one suffers from an unshakeable conviction that a loved one has been replaced by an imposter), which often directly result from acute brain-related events like stroke or viral meningitis.17 In rare cases we may also find some distinct, accompanying emotional need to believe such things, but even if so, we would not attribute
the delusion to that need. Further, a delusional belief need not be based on a rationale the believer actually expects everyone else to accept. In cases of delusion, the victim may not even attempt to rationalize the delusional belief.18 Someone with Capgras, or who suffers from the delusion that he is in communication with aliens, may not necessarily also believe that others ought to be able to come to the same conclusion by considering publicly available evidence. By contrast, the person in denial is rational in the sense that evidence still matters. Typically, as psychologist Ziva Kunda argued in an influential essay on motivated reasoning, even those who are “motivated to arrive at a particular conclusion attempt to be rational and to construct a justification of their desired conclusion that would persuade a dispassionate observer.”19 Motivated reasoners are neither divorced from nor indifferent to reality; their perception of reality is just motivationally skewed in nonconscious ways. Nor do we think of people in denial as literally incapable of revising their beliefs, unlike people whose false beliefs can be traced to acute brain injury. Delusion is unusual, abnormal, and pathological, whereas denial is common, normal, and requires no malfunction.
People are motivated to deny reality for many different reasons, including self-interest, a desire to avoid feelings of insecurity or loss of control, or a desire to defend one’s cultural or political identity. Cognitive dissonance is the state of mind one experiences when one encounters information that is inconsistent with one’s beliefs. This is cognitively disruptive simply because it forces a reassessment of some accustomed representation of reality. Groundbreaking studies on dissonance and dissonance resolution were executed by Leon Festinger in the 1950s, wherein he studied the psychological effects of new, inconsistent information on one’s existing beliefs or worldview.20 He observed a natural, psychological resistance to belief revision as a result of dissonant information. One likely explanation for some of this resistance is an evolved cognitive heuristic telling us that, other things being equal, mental representations of the world built up over time are more likely to be accurate, and so should be favored over new
information up to a point. But Festinger noted an emotional component in subjects’ responses. Cognitively dissonant information can also be experienced as personally disruptive—undermining the comfort one feels in thinking one has a good grasp of things—and therefore be anxiety inducing. This discomfort spurs an unconscious drive to resolve the dissonance by discounting or otherwise dismissing information that contradicts existing beliefs. In important confirming studies of subjects in an induced dissonant state, social psychologists Andrew Elliot and Patricia Devine demonstrated that “dissonance is experienced as psychological discomfort.”21 Further studies by psychologist Eddie Harmon-Jones tested whether the discomfort is caused by the dissonance itself, or rather simply by some perception of the consequences of being wrong. He confirmed that “dissonance is associated with increased feelings of negative affect even in situations void of aversive consequences.”22 Most dissonant information one encounters is not particularly emotionally threatening in terms of its content (e.g., “I was sure that only Australia had marsupials, but now I hear that American opossums are marsupials,” or “I thought that low-fat diets were better for losing weight than low-carb diets, and now this magazine article is saying that’s not true”). Such new information just spurs brief confusion, followed, in some cases, by dismissal of the new claim or, in others, a not terribly disruptive update of one’s obsolete beliefs and/or behaviors. In his original studies, however, Festinger found that cognitive dissonance can produce intense emotional discomfort, when the particular change in thinking demanded by the dissonant information threatens a representation of reality to which the subject is emotionally attached.
Information can be threatening to the self because it conflicts with one’s desires, expectations, sense of control, or cultural or political identity (e.g., “I was expecting the Rapture on this day, but it didn’t happen,” or “I was sure Hillary Clinton was going to win the election, but she didn’t”).23 In other words, cognitive dissonance can refer either to the “plain vanilla” dissonant effect of unexpected information or to the “extra spicy” dissonance experienced upon receiving unwanted information. The effects of the latter feelings of dissonance are
much more dramatic than the effects of the former. It is when it represents some sort of threat to a state of affairs the individual prefers to believe in, or to some system of thought with which the individual identifies, that dissonant information will frequently lead to outright denial. (The cultists Festinger was studying chose to believe not that they had been wrong about the apocalypse but, rather, that they had headed off the apocalypse by their devotion. After the 2016 U.S. presidential election, Hillary Clinton supporters called for recounts in the states they unexpectedly lost, and blamed foreign interference for electoral losses—while at the same time, Donald Trump denied that he had lost the popular vote, insisting that millions of people voted illegally for his opponent.) What is quite clear is that, when dissonance arises, the extent to which one is emotionally committed to maintaining a certain belief or worldview now under threat both increases the negative response to the dissonance and affects how the discrepancy is resolved.24 (The ancient Greek word amathia [“not-learning”] is sometimes taken to refer specifically not to ignorance but to the state of unwillingness to learn—typically, when one is motivated to maintain a certain belief or worldview in the face of recalcitrant evidence.25)
It might be helpful here also to mention compartmentalization, which is a way of doing a kind of cognitive judo move on dissonant beliefs and feelings: When we compartmentalize, we somehow manage—at least temporarily—simply to avoid thinking about one side of the inconsistency in our beliefs and behavior. For example, consider the environmental activist who blithely makes flight reservations, thinking only about the vacation destination and not about his or her knowledge of the contribution passenger jet travel makes to global warming. The liberal opposing unfair labor practices overseas can walk into Walmart and think only about the low prices; someone who is revolted by factory farming practices can grab a burger at a fast-food restaurant without the inconsistency this decision represents ever coming to mind. Mere moments after sincerely avowing a duty to the poor, a devout Christian exiting his church can stroll right by a homeless beggar without giving it a thought. Though it is also an unconscious defensive response to
distressing cognitive dissonance, I would distinguish compartmentalization from denial in that, with compartmentalization, the inconvenient information in question is suppressed rather than denied; dissonance is thus avoided rather than defeated. The compartmentalizer’s beliefs do not change; there is simply a temporarily unrecognized, hypocritical inconsistency between his or her avowed beliefs and his or her behavior. It is a different story when we twist our representation of the facts to suit our needs (e.g., “I’m sure Apple has fixed the labor practices in their factories by now,” or “That guy probably just wants money for drugs”).26 This sort of manipulation is characteristic of denial. The compartmentalizer might change his or her behavior when called out on the inconsistency, but the person in denial has eliminated inconsistency by altering his or her (perceived) reality. The denier is harder to dislodge because he or she has devoted cognitive resources to eliminating dissonance, rather than just ignoring it.
Denial manifests itself in a wide spectrum of contexts, both private and public. Believing in fate or luck, in supernatural powers, in the fidelity of one’s spouse, or in one’s own competence at auto repair can (depending on circumstances) be examples of denial. In a forthcoming book, sociologist Keith Kahn-Harris suggests a further distinction between denial and denialism:
Denialism is an expansion, an intensification, of denial. At root, denial and denialism are simply a subset of the many ways humans have developed to use language to deceive others and themselves. Denial can be as simple as refusing to accept that someone else is speaking truthfully. Denial can be as unfathomable as the multiple ways we avoid acknowledging our weaknesses and secret desires. Denialism is more than just another manifestation of the humdrum intricacies of our deceptions and self-deceptions. It represents the transformation of the everyday practice of denial into a whole new way of seeing the world and—most important—a collective accomplishment. Denial is furtive and routine; denialism is combative and extraordinary. Denial hides from the truth, denialism builds a new and better truth.27
Denialism is the (usually collective) building of a worldview that both derives from and supports the denial of some inconvenient truth. Some forms of denialism have significant public policy implications: for example, holding a belief—despite having good reason not to—in inherent racial superiority, in the status of others as dangerous nonbelievers or apostates, in the efficacy of destructive authoritarian policies, in the claims of vaccine opponents, or in the claims of anthropogenic global warming deniers. These forms of denialism will typically be linked to the believer’s ideology, or ideological worldview.
“Ideology” is another general term without a fully agreed-upon meaning. Roughly, it refers to a set of factual beliefs—together with some evaluative attitudes pertaining to those facts—that gives rise to some broader social, cultural, political, economic, or religious viewpoint. An ideology combines a kind of factual, explanatory theory of (some aspect of) the world with some prescriptive conclusions based on the factual picture presented by the theory. For example, one might believe that (a) self-interested bankers exert disproportionate control over the world economy, and that (b) this control has proven harmful to the interests of much of the world’s population. This picture, in turn, is the (alleged) factual basis for the broader evaluation that unregulated globalized capitalism is a problem, and therefore we ought to have greater governmental control over international banking.
There is nothing inherently wrong with having an ideology. However, in practice, ideology is often tied up with denial, and denial is a primary reason for the intractability of ideological conflict: Someone in denial, by definition, is not receptive to disconfirming evidence or argument, and is highly resistant to change or compromise. This is why “ideologue” tends to be used as a pejorative term, equivalent to characterizing someone as closed-minded. Denial can be a cause of ideological positions, or a product of an emotional attachment to such a position. It is interesting to note that the most influential early discussion of ideology was that of Karl Marx, who saw ideology primarily as a vehicle for misconceptions on the part of the majority as to where its economic interests lie. (Friedrich Engels’ famous term “false consciousness” was coined in reference to this phenomenon.28)
It has been much lamented in recent years that opposing political factions in the United States seem increasingly unable to agree on basic facts, such as the fact that the Earth’s surface is warming due to human activity.29 Ideological conflict is often thought of as primarily a matter of conflicting value systems. Yet, it is a dispute over factual claims, rather than a difference in values, that should be expected to be the primary arena for ideological disagreement in liberal democratic contexts. The defining feature of a liberal democracy, in the Lockean, “classical” sense, is that individual rights trump any particular, sectarian conception of the good (i.e., what way of life and/or belief system is best). By contrast, in, say, an authoritarian communist or theocratic context, the state claims the authority to impose a particular conception of the greater good on its citizens; these latter kinds of regime are thus “illiberal” in the classical sense.* In an open society, mainstream public policy positions at least ostensibly rely on factual claims about the expected benefits of certain social, economic, or foreign policies, rather than explicitly relying on sectarian or doctrinal claims as justifications in themselves.30
The boundaries between factual beliefs and moral values can be fuzzy, partly due to some mutual dependence of one on the other.31 Yet, ideological debates over right and wrong (in liberal democratic contexts) almost always involve some disagreement over factual claims. Consider opposition to marriage equality for same-sex couples. Clearly, some oppose marriage equality because they evaluate homosexuality as wrong, as immoral. But which comes first—value or fact? This moral evaluation of homosexuality is predicated on a set of factual claims, including (a) that God exists and (b) that God condemns homosexuality. In the public sphere we hear the claim that marriage equality would undermine the institution of marriage and/or that “traditional” heterosexual marriages provide a more stable environment for children.32 These are claims of fact, and appealing to these alleged facts in opposing any expansion of the legal definition of marriage implies an appeal to shared evaluative beliefs: that marriage is a positive social institution and that stable families are best for children. Even genocide and Holocaust denial can be understood as resting on factual beliefs about the innate qualities of members of other races: genocide apologists value peace and humanity while excluding some inconvenient group, such as Jews or Kurds, from the ranks of the fully human.
Or consider the mid-2000s debate over the morality of the Bush administration’s “enhanced interrogation” techniques, as practiced on insurgents captured in Afghanistan and elsewhere. Most members of each side would quickly agree with the evaluative statement that “torture is wrong.” The primary dimension of public disagreement was on historical and legal precedents as to what constitutes torture. The Bush administration produced lawyers who were willing to claim that causing physical pain does not constitute torture, so long as it falls short of pain associated with “serious physical injury, such as organ failure, impairment of bodily function, or even death.”33 Under this definition of torture, the U.S. government could argue that waterboarding is not torture, even though, after World War II, Japanese officers were executed by representatives of the U.S. government for having used waterboarding as an interrogation tactic.34
Nothing exemplifies ideological “fact polarization” like the debate over economic policy. During the last few decades in the United States, this debate has revolved around whether taxation and spending policies should reflect Keynesian demand-side economics or small-government supply-side economics. Proponents of the former emphasize fairness and equality of opportunity; proponents of the latter emphasize liberty and personal responsibility. Yet each side typically would agree that, strictly speaking, these are all positive values; the main issue is the factual, empirical question as to what is lost or gained by relatively high taxation and a strong social safety net, versus the reverse policy of low taxation and limited social spending.35
In each of these cases, ideological conflict has centered on a disagreement over facts, not values. Sociologist John Levi Martin observes:
[D]ifferences in ideology seem to correlate much more strongly with differences in descriptive statements than they do with differences in purely prescriptive ones. . . . And this is because . . . the thing about values is that they are all good, considered singly. It’s only in tradeoffs that people begin to distinguish themselves. So people can agree with one another in their value commitments, while still having diametrically opposed opinions.36
In one respect, this is an optimistic view of ideological conflict: If ideological conflict is a matter of irreducible and irreconcilable value systems, there is no possible resolution but for one side to suppress or dominate the other; by contrast, factual disputes are resolvable, in principle, without violence. So there is a puzzle here: In the face of roughly the same available information, how is it that different individuals or groups can come to such wildly different conclusions about reality? If most ideological differences rest on factual disputes, and if factual disputes can (in principle) be resolved by appeal to evidence and reasoning, then most ideological disputes should be resolvable simply by study and debate among reasonable and open-minded persons. Yet in practice this is not what happens. Why not? A vast amount of—mostly quite recent—evidence from social psychology, sociology, political science, and allied fields points to the answer: motivated reasoning and denial. Answering the following questions is therefore essential: (a) What are the motivators behind motivated reasoning? and (b) What are the psychological mechanisms that can turn a defensive emotional impulse into a sincere factual representation?
Much of the academic, philosophical literature on belief has focused on the necessary and sufficient conditions of a belief’s being justified. The phenomenon of denial spotlights the ways in which evidence and justification can be effectively irrelevant as to whether someone believes something or not. In this context, then, the correct line of inquiry (as David Hume proposed) would seem to be one directed at what causes or explains belief, rather than one