NEW HAVEN, CONNECTICUT - Humans are built to make mistakes. Admitting them is crucial to a functioning democracy
When’s the last time you were wrong? I don’t mean wrong about which way the shortcut goes or wrong about which direction the market will move, but wrong about something fundamental, something that you believe deeply and passionately. You probably have to think about that one a little bit, but the correct answer is almost certainly “today.”
Chances are that all of us, right now, are wrong about lots of what we believe. And that’s a good thing. Centuries from now — maybe decades from now — our descendants will have themselves a good laugh over our treasured moral certainties.
This isn’t a new insight. One of my undergraduate philosophy professors used to say it all the time. But it’s a proposition worth remembering at a time when many on the left and right alike are so unshakable in their convictions that they’re willing to cast those who disagree with them into the outer darkness. They seem not to realize that the larger the number of issues on which they’re certain — and they’re certain on a lot! — the more likely they are to be wrong.
I produce a column every week, often taking positions on controversial issues. I’m wrong a lot. I have to be. For me to believe otherwise, I’d have to imagine myself endowed with a godlike intellectual perfection. In truth, of course, I’m as mortal and fallible as anyone else. That’s one reason I refuse to dismiss and deride those who disagree with me. I accept that I’m capable of error. That’s why I believe in dialogue. In the give and take between points of view, we have the chance to improve our own understanding of the world. Put simply, if I listen to you, I might change my mind.
There’s an episode of the television show “Star Trek: The Next Generation” in which a member of the crew has a broken leg, and the fancy 24th-century gizmo that’s supposed to repair the bone isn’t working. The young doctor goes to his more experienced boss, who orders him to splint the leg. The young doctor replies, stiffly, “That’s not practicing medicine.”
We’re meant to smile at the reminder that scientific knowledge is always changing. We all understand that today’s technology will tomorrow be in a museum display honoring the ancients. But we tend not to feel that way about moral knowledge. When it comes to right and wrong, we have a lot of confidence in our ideas.
Almost certainly too much.
The always thoughtful Charles Chu published a very fine essay on the subject last month, a frank call for what is sometimes called cognitive humility — the recognition of how regularly we err:
“Ten years ago, almost everything I believed was wrong. If this is the case, then I have a really bad track record. Sure, I might be a little smarter, but there’s a good chance that I’m still wrong about almost everything today.”
Chu borrows from the economist Bryan Caplan the idea of an “ideological Turing test”: If we can’t mimic the arguments of those on the other side, then we don’t understand them well enough to disagree with them. Chu warns against attacking only “the weakest version of someone’s argument.” If you can’t find the strongest version, chances are you’re not really trying. I’ve heard colleagues of mine — serious, educated people — say things like “nothing that defends that position is worth reading.” When I assign articles or books in my seminars that take positions unpopular on campus, I find that all too many students are unable to reproduce the arguments in any but the thinnest and flimsiest way.
Chu’s piece led me to an excellent 2010 book I had overlooked — “Being Wrong: Adventures in the Margin of Error” by Kathryn Schulz, a Pulitzer Prize-winning journalist now at The New Yorker. Just about everything Schulz says … seems right. I’d even go so far as to suggest that her book should be required reading in these angry and divided days.
Schulz argues that error is good for us — that we are biologically programmed to make mistakes. “Far from being a sign of intellectual inferiority,” she writes, “the capacity to err is crucial to human cognition.” To put the matter most simply, that’s how we learn. We’re designed to jump to conclusions that the evidence doesn’t warrant. Our perceptions cannot take in everything, so we make simplifying assumptions. In other words, by definition, we’re usually wrong.
In a sense, we suffer from a sort of cognitive mismatch. Brains that once solved the concrete tasks of finding water and avoiding predators have evolved to process enormous amounts of abstract information: when human life begins, how wealth should be distributed, whether to speak to the neighbor who voted for the other candidate. If we’re often inaccurate in doing the things we were designed to do — say, perceiving the world — we must surely be inaccurate in doing the things we weren’t.
According to Schulz, the problem isn’t that we’re capable of error. It’s that we have a hard time appreciating the point. Even when we realize we were wrong before, now we think we’re right. Which, she reminds us, feels nice. And the consequence of our sense of rightness is simple, and tragic: “Like toddlers and tyrants, we are quick to take our own stories for the infallible truth, and to dismiss as wrongheaded or wicked anyone who disagrees.”
Schulz’s excellent book, of course, was published before the current era, in which true believers on both sides believe they should meet outrage with outrage. But she still has a great deal to teach us about the humility with which we should approach public argument — a humility, I would maintain, that is crucial to a functioning democracy. If we listen hard enough to the arguments against us, we might even change our minds.
I know I have.
Don’t get me wrong. I’m not suggesting that we should never fight for our convictions. Nor do I believe we should take the view that we’re not right about anything. I’ve argued for decades that each of us, as moral, reasoning beings, should possess a handful of principles on which we will brook no dissent. In a democracy, however, that list should be quite short.
My ideal number of non-debatable principles is three. If we have many more, we’re simply refusing to accept what should be obvious: Every minute of every day, chances are that lots of what we believe will turn out to be wrong. It’s a humbling realization, but it’s a democratic one.
Stephen L. Carter is a Bloomberg Opinion columnist. He is a professor of law at Yale University and was a clerk to U.S. Supreme Court Justice Thurgood Marshall. His novels include “The Emperor of Ocean Park,” and his nonfiction includes “Civility.”