Book review – The Misinformation Age: How False Beliefs Spread

“A lie can travel halfway around the world while the truth is putting on its shoes”. This oft-misattributed quote highlights a persistent problem in our world. Why do false ideas spread so easily? Sure, blame people’s ignorance or stupidity, but philosophers Cailin O’Connor and James Owen Weatherall write that the problem is far more insidious. Through a combination of case studies and modelling work, they convincingly argue that the same social dynamics by which truth spreads are inherently vulnerable to exploitation. But first, some vegetable lamb.

The Misinformation Age

“The Misinformation Age: How False Beliefs Spread”, written by Cailin O’Connor and James Owen Weatherall, published by Yale University Press in January 2019 (hardback, 266 pages)

Fake news and lies have a long history (see e.g. Bunk: The Rise of Hoaxes, Humbug, Plagiarists, Phonies, Post-Facts, and Fake News and the forthcoming Truth: A Brief History of Total Bullsh*t). Take, for example, the Vegetable Lamb of Tartary. The idea that lambs grow on trees seems absurd now, but for centuries it was one of the many hybrid beasts populating mediaeval bestiaries. Learned men were convinced of its existence, based on nothing more than hearsay. And that problem remains, in spite of communication technologies allowing the rapid, global spread of information. Our knowledge base long ago became too vast for people to construct their worldviews from first principles. So all of us, scientists included, largely have to trust what other people tell us. And that is where things can get messy.

I was prepared for a lot of concerned hand-wringing. But The Misinformation Age offers something far better than that: an incisive analysis in four chunky chapters of how social interactions influence false beliefs, starting with scientists. “Wait now,” I hear you cry, “aren’t scientists supposed to be the good guys?” They are, and that is exactly why we start with them. After all, here is a community of well-informed, highly trained information gatherers and analysts, who dedicate their lives to the pursuit of knowledge – “the closest we have to ideal inquirers” in the words of the authors. And even they are fallible.

By modelling simple communication networks, i.e. individuals exchanging information to decide between two options, the authors show how consensus is reached, but also how social factors quickly complicate the picture. Evidence presented by others is judged not just on its merits, but also on the trust we have in the presenter. And we are prey to the psychological phenomenon of conformity bias: we are uncomfortable disagreeing with others and like to fit in. Both can rapidly lead to polarisation, with groups settling into different convictions.
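To give a flavour of the kind of model at work here, below is a minimal sketch in Python of a two-option “bandit” setup of the sort used in network epistemology, the tradition the authors draw on. This is my own illustration, not the authors’ code, and every parameter value and helper name is made up for the example.

```python
import random

# Illustrative toy model only (made-up numbers, not from the book).
# Agents choose between option A (a known 50% success rate) and option B,
# which is secretly better. Agents who lean towards B test it and share
# their results; everyone then updates their credence that B is better.

N_AGENTS = 10              # a complete network: every agent hears every other agent
TRIALS = 20                # experiments per believer per round
P_GOOD, P_BAD = 0.6, 0.4   # B's success rate if it is better / worse than A
TRUE_P = P_GOOD            # in this toy world, B really is the better option

def likelihood(successes, trials, p):
    """Binomial likelihood of the data (constant factor omitted, it cancels)."""
    return p ** successes * (1 - p) ** (trials - successes)

def update(credence, successes, trials):
    """Bayesian update of the credence that B is the better option."""
    good = credence * likelihood(successes, trials, P_GOOD)
    bad = (1 - credence) * likelihood(successes, trials, P_BAD)
    return good / (good + bad)

credences = [random.random() for _ in range(N_AGENTS)]  # random starting beliefs

for _ in range(50):
    # Only agents who already lean towards B bother to test it.
    evidence = []
    for credence in credences:
        if credence > 0.5:
            successes = sum(random.random() < TRUE_P for _ in range(TRIALS))
            evidence.append((successes, TRIALS))
    # Everyone updates on all the evidence their neighbours share.
    for successes, trials in evidence:
        credences = [update(c, successes, trials) for c in credences]

print([round(c, 3) for c in credences])
```

Run it a few times: the group usually converges on the better option, but if too many agents start out sceptical, nobody ever tests option B and a false consensus locks in. That bare dynamic is the starting point; trust in the presenter, conformity bias and deliberate propaganda are then layered on top.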

“Our knowledge base long ago became too vast for people to construct their worldviews from first principles. So all of us […] have to trust what other people tell us.”

And this is before we consider the real-life complication of industry interests that will bend the truth to further their own fortunes. “Doubt is our product”, a tobacco company executive once wrote in an unsigned memo, and it is one of the best-known examples of how industries sow discord and confusion (see more in e.g. Doubt is Their Product: How Industry’s Assault on Science Threatens Your Health, the famous Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming, and Creating Scientific Controversies: Uncertainty and Bias in Science and Society). Data fabrication is the blunt strategy, but, as O’Connor and Weatherall show, there is an insidious sliding scale of ever more subtle forms of propaganda: from biased production (industries funding or doing their own research but reporting only the desired results) and boosting the productivity of favoured research groups through funding, to quoting academics out of context or selectively publicising only certain research findings. And this is before we get to the weaponisation of people’s professional reputations (see also Bending Science: How Special Interests Corrupt Public Health Research and Tainted: How Philosophy of Science Can Expose Bad Science).

Particularly problematic is that industries can simply exploit existing weaknesses in current scientific practice. Academic journals are biased towards publishing novel or positive results. And there is a host of factors stimulating salami science: the publication of more but smaller and statistically underpowered studies, rather than fewer but larger and more powerful ones. These include limited funding, limited time due to short tenures, and the importance attached to publication volume and citation metrics when hiring scientists. The resulting reproducibility crisis and the temptation of doctoring data offer easy pressure points for industry interests (see also my reviews of Stepping in the Same River Twice: Replication in Biological Research and Fraud in the Lab: The High Stakes of Scientific Research).

“‘Doubt is our product’ […] is one of the best-known examples of how industries sow discord and confusion.”

Meanwhile, in the “real world”, many of these mechanisms play out, often amplified, in how society at large forms its beliefs. The authors highlight journalism, whose ethical framework of fairness and representing all sides of a debate can backfire spectacularly. In the UK, for example, the BBC has been lambasted for giving equal weight to lobbyists and scientists in its coverage of climate change, creating the illusion of a debate where there is none. And social media sites such as Twitter and Facebook can isolate us in so-called filter bubbles (see The Filter Bubble: What The Internet Is Hiding From You, but also Are Filter Bubbles Real?), though I found the authors’ coverage of the algorithms driving these sites fairly limited (for introductions see e.g. Outnumbered: From Facebook and Google to Fake News and Filter-bubbles – The Algorithms That Control Our Lives and my review of Rage Inside the Machine: The Prejudice of Algorithms, and How to Stop the Internet Making Bigots of Us All).

You would be forgiven for thinking that two philosophers reporting on modelling work could make for a boring read, but nothing could be further from the truth. O’Connor and Weatherall write in a breezy style, making it very easy to follow their argument, and they make good use of diagrams when discussing their network models. Furthermore, they nicely balance the book with interesting and relevant case studies. I found the finer details of the controversies surrounding ozone depletion and Lyme disease particularly fascinating.

“When applied to scientifically informed decisions, democracy is a ‘tyranny of ignorance’. Evidence is simply not up for a vote.”

The authors are outspoken when offering recommendations on how to combat false beliefs. They dismiss as dangerous and patently false the notion that truth will triumph when allowed to compete with other ideas in the proverbial “marketplace of ideas”. Scientists could organise themselves better (yes), journalistic standards can be improved (sure), and legislative frameworks such as defamation and libel laws could be extended to prohibit industries from spreading misinformation (why not). But then, in the last three (!) pages of the book: “Isn’t it time to reimagine democracy?”

Right, I was not prepared for that one.

They follow Philip Kitcher (see his books Science, Truth, and Democracy and Science in a Democratic Society) in arguing that, when applied to scientifically informed decisions, democracy is a failure. Most voters have no idea what they are talking about, making democracy a “tyranny of ignorance” or worse, as people are often actively misinformed and manipulated. Evidence, they say, is simply not up for a vote. Given that I have Jason Brennan’s Against Democracy sitting on my shelf (which champions the idea of an epistocracy, rule by the knowledgeable), I was all ears.

The Misinformation Age is fantastically readable and makes a convincing case for the importance of social factors in the spread of knowledge. Whether you are interested in the communication of science or worried about the epidemic of false beliefs, this book comes highly recommended.

Disclosure: The publisher provided a review copy of this book. The opinion expressed here is my own, however.

You can support this blog using the affiliate links below; as an Amazon Associate I earn from qualifying purchases:

The Misinformation Age paperback, hardback, ebook or audiobook

