I Might Be Wrong
I might be wrong
I might be wrong
I could’ve sworn I saw a light coming on
–Radiohead
In the Black Mirror episode “Bête Noire,” Maria, a successful culinary researcher at a chocolate company, begins to experience strange schisms with reality after her company hires a woman she knew from high school. For example, she and her colleagues get into an argument about the name of a now-closed fast food chicken restaurant–was it called Barnies or Bernies? Maria swears it was Bernies, and states she knows this for an absolute fact (and so do we), because her partner used to work there and still wears a Bernies hat (which we see for ourselves in an earlier scene). But when Maria and her colleagues Google the company to settle the debate, Maria is proven wrong. Several incidents like this follow, and Maria’s sense of the truth (and ours) begins to unravel. The point of the episode is not the ways our senses deceive us or the ultimately unknowable nature of the universe (spoiler alert: the former high school acquaintance has built a quantum computer that can shift reality to alternative timelines, and she is using it to avenge a humiliating rumor Maria spread in high school and essentially destroy Maria’s life). Rather, the horror of the episode operates on our own discomfort with uncertainty and the unsettling feeling of being wrong when we were 100% certain that we were right.
The morning after I watched this episode, I was cleaning my house and enlisted my three-year-old to help me put away the puzzles we’d done together on the living room floor. As he dismantled one and I the other, I was careful not to mix up the pieces and to return each one to its correct box. I picked up one piece–the blue and green of Earth’s ocean and land–and started to put it in his box. But he stopped me.
“That doesn’t go in my box, it goes in yours,” he said.
“No, Leo,” I said, confidently, “this piece goes in your box.”
“No, it doesn’t!” he insisted.
“Yes,” I said, trying to stay calm, “it does.”
“No it doesn’t!” he cried, putting the piece in my box.
“Leo,” I said, “I promise you this is your piece.”
“No!”
“Leo,” I said, tired of arguing with a toddler yet again. “Look–it’s an Earth piece. Your puzzle is the hungry caterpillar on Earth, see?” I said, showing him the box. “My puzzle is Elmo in space and there’s no Earth,” I said, pointing at my box. But when I looked at the box, there was Earth in the lower left-hand corner. And the piece in question, of course, belonged to my puzzle.
I was stunned. This was not an alternate reality in the multiverse. I was just wrong. Aware that I now needed to model some intellectual humility, and somehow recover the situation, I apologized to my son and told him he was right and I was wrong. And I thanked him for pointing out my mistake and insisting that I see my error. “Yeah,” he said casually, “I was right.” Then he continued to put away the puzzle as if nothing had happened.
But I was shaken (and grateful that he didn’t feel compelled to rub it in). I had been so sure of myself. I promised him I was right! But when I break down the situation, it’s not hard to see why I made the mistake. Though I’ve done these puzzles with him a dozen times, I don’t pay attention to them the way he, a three-year-old, does; as an adult, I’m accustomed to, in general, being right, and so I assume I will be again; and I had a thousand other things swimming in my head at that moment, and a long to-do list I was working through. (Maybe there were other factors at play that I’m still oblivious to.)
This is not the first time I have been so indisputably wrong about something, and I hope (for my sake) that I’m not alone and that this is a universal human experience: we have all been startled by an instance of utter wrongness. Now, I can laugh about how right Leo was and how wrong I was, but at first I felt as unsettled as I did watching that Black Mirror episode; it’s terrifying to think we have such a tenuous grasp on reality (especially when you are the adult in the room and responsible for a small life). Mistaking a puzzle piece is no big deal, but it’s hard not to wonder whether I could be just as wrong when the stakes actually matter.
Black Mirror episode aside, I was primed to think more acutely about my wrongness because I am reading Kathryn Schulz’s book Being Wrong, in which she explores all the ways we can be wrong and why, how our culture thinks about being wrong, and why we hold the attitudes we do about error. As I mentioned in my last post, I want to begin my thinking about what a popular model of critical thinking might entail from a place of serious intellectual humility; in other words, thinking about all the ways we can be wrong generally, and that I might be wrong specifically. Schulz’s work is providing a really helpful framework for this task.
She writes: “Of all the things we are wrong about, this idea of error might well top the list. It is our meta-mistake: we are wrong about what it means to be wrong. Far from being a sign of intellectual inferiority, the capacity to err is crucial to human cognition. Far from being a moral flaw, it is inextricable from some of our most humane and honorable qualities: empathy, optimism, imagination, conviction, and courage. And far from being a mark of indifference or intolerance, wrongness is a vital part of how we learn and change. Thanks to error, we can revise our understanding of ourselves and amend our ideas about the world” (5).
Schulz unpacks the history, terrain, and complexity of error, what it means, and how we feel about it. As she points out, in some contexts error is one of the worst things that can happen to a person, and she cites some of the greatest errors made by humans and their awful ramifications (Alan Greenspan and the 2008 financial crisis, to name one). But in other contexts, error can be a source of human connection. I think one of the most liberating things about becoming an adult, as compared to being an adolescent (when the stakes of everything feel impossibly high and you don’t yet have any real perspective to tell you that all of this will be over and probably forgotten very soon), is realizing that you and basically everyone you know are wrong a lot of the time. And that we can hold our fallibility with grace and humor.
After I got over the initial embarrassment of my mistake about the puzzle piece, I was excited to tell my husband what happened: “I had a Black Mirror moment with Leo this morning.” There was something cathartic about telling him my error, laughing about it, and letting it go. In fact, now that I think about it, perhaps my favorite genre of storytelling with friends is of the “you’ll never believe this stupid thing I did” variety (particularly over a strong cocktail at the end of a long week). It’s such a relief to be honest and real about the ways I mess up.
And yet, so much of my time is spent trying very hard to be right about things. As it should be–as a college instructor, I should be teaching my students correct information, and as a parent, I should be trying to teach my son what is true about the world. Something would be seriously wrong if I were walking through life without attempting to be right about anything. But none of that means I can’t hold my inevitable errors and foibles (and others’ errors) with some kindness and understanding. I think this is the hard part: walking the line of trying to be right while also accepting the likelihood of being wrong. It’s our orientation to being right/wrong that causes the problem, not necessarily being right/wrong. It’s so easy to hold our own, limited understanding of the world too tightly, to believe ourselves, and specifically the chatter that flows through our minds, with too much conviction.
David Foster Wallace makes this point in “This Is Water” (originally delivered as a commencement speech at Kenyon College in 2005), in which he recounts multiple examples of the ways we are constantly trapped inside our own heads, our own immediate experiences of the world, without consideration for others, or for what we don’t know about others’ experiences. It’s easy and automatic, he says, to judge, for example, the “fat, dead-eyed, over-made-up lady who just screamed at her kid in the checkout line.” But, he counters: “Maybe she’s not usually like this. Maybe she’s been up three straight nights holding the hand of a husband who is dying of bone cancer.”
Just one example like this is enough to throw my whole understanding of people into sharp relief, and I’m overwhelmed by how much I don’t know. And, to me, this feels good. Being right–or really, being attached to being right–is a heavy weight to carry. It’s so much easier to move through the world in the lightness of accepting what you don’t know. Which, again, is not giving up on trying to be right or on having an accurate, intelligent understanding of the world. It’s just to always be open to the possibility: I might be wrong.
Wallace ends his speech–which he presented as an exploration of what it means to teach students how to think, not what to think–by stating: “The point here is that I think this is one part of what teaching me how to think is really supposed to mean. To be just a little less arrogant. To have just a little critical awareness about myself and my certainties. Because a huge percentage of the stuff that I tend to be automatically certain of is, it turns out, totally wrong and deluded. I have learned this the hard way.” I have too.
(In my life, I have always felt the most cognizant of what other humans might be going through, unbeknownst to me, when I am suffering or grieving. When my dad died, almost two years ago now, I remember walking through the days and weeks that followed with a huge, open, raw heart. When I am suddenly in the midst of that kind of pain and have to push on (for the sake of my kid, or my students, or just to avoid sobbing in front of strangers in the grocery store), I find it effortless to be aware of the pain that other humans are feeling and hiding from me too. I have to make an effort to remember this the rest of the time.)
I used to think that “critical thinking” began with a search for evidence, followed by analysis and reasoning, etc. But what if it began with a robust exploration of all the things we don’t know, all the ways we might be wrong? What if it started from a place of humility? Most of the books I’ve read so far about critical thinking, especially those focused on how to teach it, mention intellectual humility, but usually as an afterthought–something you acknowledge later. What would it look like to start with intellectual humility, not simply pay lip service to it? Or perhaps we need to treat it as recursive, something we come back to again and again as we collect and weigh and analyze evidence/data. Perhaps it’s assumed that this should be the case, but if we don’t make it explicit, it’s easy to forget–just like it’s easy to forget how much we don’t know about what other people are going through on a daily basis.
I’m assuming that most people identify with what I’ve shared from my own experiences above–that to a certain extent, being honest about our errors and limitations feels better than acting like we know everything. I think most people would acknowledge they’ve been wrong about a lot of things over the course of their life, and even acknowledge that they’re probably wrong about something right now and don’t know it yet. If that is true, if I can assume that most humans’ emotional machinery works like mine, why then is this kind of intellectual humility not a larger part of how we think together in public?
For example, recall when Charlie Kirk was murdered in September. Within the first few days, all sorts of explanations and speculations were being offered for what happened, from news outlets, political pundits, and your average social media user. How many of those sources were carefully detailing the things we did not yet know, outlining all of the questions, pointing to our limitations and the gaps in our knowledge, and exercising some restraint in judgment until we had more information? I’d wager that sources leading with intellectual humility were in the quiet minority.
Of course, I understand the desire to immediately get a grasp on an event like this. I, too, was shocked and curious and wanted answers. But I think part of being a strong critical thinker means resisting these very natural impulses. As Kathryn Schulz describes clearly, the same parts of our brain and psychology that have led to the survival and advancement of the human species are also the ones responsible for our muddy, confused, and irrational thinking–a paradox I’m eager to explore next time…
Thanks for reading, and happy new year!