The rationality paradox
Attributing credulity to sloppy thinking or ignorance is tempting, but doesn't advance our understanding of post-truths and conspiracy theories
Steven Pinker’s new book is all over the news these days. In our era of fake news, deliberate misinformation, superstition, post-truths and alternative facts, the message of the book, as conveyed by its title, Rationality: What It Is, Why It Seems Scarce, Why It Matters, appears clear and relevant. Rationality, described as “a kit of cognitive tools that can attain particular goals in particular worlds”, ought indeed to be the “lodestar” of everything we think and do. And yet it isn’t. Something doesn’t work. Or at least it hasn’t worked. Resources for reasoning and information are abundantly available today, and yet people choose to ignore them, or to use them in questionable ways. Why?
I haven’t read Pinker’s book, so I cannot say much about the answers he gives to this question. I have only found a small excerpt here. In it, Pinker uses the example of the San people of the Kalahari Desert to emphasise what he describes as their “scientific” mindset, which they have organically and successfully employed in order to sustain themselves for millennia in that rather inhospitable place. “They reason their way from fragmentary data to remote conclusions”, he writes, “with an intuitive grasp of logic, critical thinking, statistical reasoning, correlation and causation, and game theory.”
If they can do it, why can’t we? What stops us? That’s the paradox Pinker identifies.
Let’s call it the Rationality Paradox.
I have tried to approach such questions before. I don’t have a clear answer yet, but, to be honest, whenever I read about “scientific” this or that, my mind returns to an example my teachers at school employed in their somewhat unsuccessful efforts to convince us of the merits of calculus. It involved foxes, rabbits, pursuit curves and a number of increasingly confusing differential equations. Apparently, this is what you need to do in order to follow in the steps of the fox chasing the rabbit. I always found this rather amusing. Just think of the headlines: Rabbit escapes thanks to fox’s differential equation error.
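For the curious, the classic textbook setup (a sketch of the standard version, not necessarily the one my teachers used) goes something like this: the rabbit runs up the y-axis at speed v, while the fox starts at the point (a, 0), runs at speed kv, and always aims directly at the rabbit. The fox’s path y(x) then satisfies the pursuit-curve equation:

```latex
% Classic pursuit-curve ODE for the fox's trajectory y(x).
% k is the ratio of the fox's speed to the rabbit's speed;
% the fox catches the rabbit only when k > 1.
x \, \frac{d^2 y}{dx^2} = \frac{1}{k} \sqrt{1 + \left(\frac{dy}{dx}\right)^2}
```

Hardly the sort of computation one imagines a fox performing mid-chase, which is rather the point of the joke.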
In other news, I recently came across Amy Coney Barrett’s speech at the University of Louisville’s McConnell Center. Barrett, you will recall, is the newest addition to the Supreme Court of the US, chosen by D. Trump in late 2020 to take the place of the late Ruth Bader Ginsburg. “Judicial philosophies are not the same as political parties”, Barrett stressed. “It’s not my job to decide cases based on the outcome I want”.
A few days later another member of the Supreme Court, Justice Clarence Thomas, expressed a very similar view. “Sometimes, I don't like the results of my decisions. But it's not my job to decide cases based on the outcome I want. Judicial philosophies are not the same as political parties”, he said, echoing Barrett’s turn of phrase. That’s a relief, you might think. It should be reassuring that the members of the SCOTUS are human but can still rise above human failings. Or so they say, at least.
They might be worried that we don’t believe them.
Some months ago, I wrote about Plato’s Protagoras and the innovative literary device he used, of the personified argument which would laugh at Socrates’ and Protagoras’ disagreement and gradual confusion of conclusions. In Plato’s view, premises are connected to conclusions with unique and traceable pathways, which the human intellect must follow – or else risk becoming the argument’s mockery. “This”, I wrote, “has been the hope and promise of modernity, namely the belief in the inherent rationality of the world, as such, and also of the human intellect qua observer in this world.”
Pinker, A. Coney Barrett and Cl. Thomas argue from this exact basic premise. They all firmly believe that the world is inherently rational, and claim that “scientific” reasoning is our best compass for navigating its troubled waters. In other words, we are like the fox (or the rabbit), with the added benefit that we know what differential equations are and how useful they can be (admittedly less so how to solve them).
So, how do we explain the Rationality Paradox? What makes people blind, what stops them from accepting solid evidence and clear inferences? What stops them from using their minds?
Granted, that’s not an easy question. For one thing, it’s not accurate that they don’t use their minds. They do, very much so, but not in the way we’d like. Perhaps the difficulty is ours, then. Perhaps it is that we are not able to present the problem, or paradox, correctly.
Let’s try another way then.
What makes people suspicious when they hear Amy Coney Barrett explaining that her Catholic faith should not be seen as interfering with her legal reasoning? Is it a justified suspicion? Perhaps yes, given the backstory of Barrett’s nomination. But then, how can we convincingly differentiate between this “justified” suspicion and, say, the suspicion evident in the thinking of all those vaccine deniers and Covid pandemic “truthers”? Or behind any other run-of-the-mill conspiracy theory, for that matter? Can we differentiate?
I reach the same dead end: it’s tempting to attribute conspiracy theories, credulousness and post-truths to sloppy thinking or ignorance alone, but doing so does not advance our understanding of such phenomena.
The rationality paradox requires reflection, not a sloppy explaining away.
This piece was originally published in the October edition of Splinters.