
Defending Science Deniers

In this post, Alex Davies (University of Tartu) discusses his recent paper in the Journal of Applied Philosophy where he urges caution when the conclusions of political psychologists tempt us to blame the audience for failures in science communication.


A slew of newspaper articles were published in the 2010s with titles like: “The facts on why facts alone can’t fight false beliefs” and “Why Facts Don’t Change Our Minds — New discoveries about the human mind show the limitations of reason”. They promoted a common idea: if a person doesn’t conform to the scientific majority, it’s because she forms beliefs on scientific questions in order to achieve social goals (to fit in with people of her kind, to make her social life more comfortable) instead of engaging in an earnest hunt for the truth. Rational persuasion doesn’t work with her. To change her mind, science communicators must become more paternalist. They must adopt methods of persuasion that bypass her awareness—the arts of the marketeer, the ad man. Drawing upon ideas from my recent paper, I want to convince you not to take these articles so seriously.

Glamorous experiment

The newspaper articles draw upon the rhetorical power of psychological experiments like the following. Participants are presented with a statement by a researcher with impeccable credentials. The participants differ in their political leaning, conservative or liberal, and each leaning is associated with a different take on climate change: conservatives are sceptics, liberals are believers. The experimenters found the following pattern. If you're conservative and you're told that the researcher with impeccable credentials says there is much support for the existence of climate change, you'll deny that the credentials show him to be an expert on climate change. But if you're conservative and you're told that he says there is much support against the existence of climate change, you'll accept that the credentials show him to be an expert (the same goes for liberals, just in the reverse direction). The experimenters concluded that people change their estimates of the quality of evidence (is he an expert on climate change?) depending upon their political leaning.

How the newspaper articles misled

The trouble is that, for all experiments like this can tell us, something which merely correlates with participants' political values could be what causes them to respond differently to the same evidence: namely, their background beliefs. If you are a liberal, you don't just have a set of liberal values. You also have a characteristic set of beliefs about how the world works. For example, you believe that any scientist who denies the reality of climate change is a fringe scientist—a belief that conservatives are likely to lack. Such differences in background belief easily explain why the liberal and the conservative respond differently to the same evidence. The experiments do not force us to accept that anyone is forming beliefs simply to make social life more comfortable.

Defenders of the paternalistic position knew this already. Although the articles from the 2010s might have you believe otherwise, the psychologists don’t reject the relevance of background beliefs because of their experiments. They do so for another reason.

Politicized scientific questions are often logically independent of each other: whether climate change is human-made has nothing to do with the efficacy of gun policy or the safety of nuclear power. So why would liberals develop a unified view on these questions, and conservatives another, unless their political leanings were causing them to form the beliefs they do? If political leanings weren’t playing a causal role here, we’d have to say this is all just a wild coincidence. But wild coincidences aren’t good explanations. It’s better to conclude that people’s belief-behaviour is being driven by their political leanings.

A wild coincidence?

For this reasoning to be sound, we need to know what we should expect if conservatives and liberals were forming their beliefs in a way that is not influenced by their political values: we need a theory of rational belief. If we don't have one, then we don't know whether the pattern of beliefs we actually witness really would be all that unusual (“a wild coincidence”) were it to result simply from perfectly rational belief-forming practices (indeed, the philosopher Endre Begby has recently produced a book-length statement of this position).

Two simple observations suggest that any such theory worth its salt will make the pattern of conservatives' and liberals' beliefs expectable. First: any such theory will make what a person should believe depend on what they already believe; what we already believe shapes what it seems reasonable to believe, given new information. Second: it is no secret that we live in an information environment where people of different political leanings draw upon different sources of information. Put these two observations together and we should expect that, because of the differences in the information conservatives and liberals acquire (across a range of logically independent scientific questions), they will probably react to the same evidence in markedly different ways—even if they're not forming beliefs just in order to fit in with a preferred social group.

Not yet a finding

Think of all those people who read those newspaper articles from the 2010s and believe them. They go away thinking that they understand why people disagree with them (on, say, climate change) better than those people do themselves: “you just think that because…” But the content of those articles wasn't a finding when they were published, and it isn't one now. Our exchanges “across the aisle” would surely benefit from a proper appreciation of this fact.

The Journal of Applied Philosophy is a unique forum for philosophical research that seeks to make a constructive contribution to problems of practical concern. Open to the expression of diverse viewpoints, it brings the identification, justification, and discussion of values to bear on a broad spectrum of issues in environment, medicine, science, policy, law, politics, economics and education. The journal publishes in all areas of applied philosophy, and posts accessible summaries of its recent articles on Justice Everywhere.


1 Comment

  1. Bill Gilmour

    The simple answer to why individuals in groups share a portfolio of similar beliefs is that doing so lets them socialise with other members of the group. To retain our status in the group, we are prepared to find a way around any obstacle.

    The more fundamental question is, why do people join these groups? Why are some liberal and others conservative? How does that come about?
