Some of the most terrifying research of which I’m aware relates to the nature of human cognition, and the question of how we make decisions.
You and I assume that we are rational organisms, which is to say: when we are faced with contradictory data, or with a complex choice, we consider the facts involved, weigh probabilities and consequences, think things through, and then come out the other end with what we think is the best answer. The implication is that our decisions are perfectible, which is to say: if we get something wrong, it’s because we misjudged the data, or had bad inputs, or there was something wrong with the mechanism of calculation. If we just do a better job of screening out irrelevancies, or consult better sources, or work on being smarter, we can make a better decision next time.
All of these pleasant possibilities are thrown into chaos by the neuroscientific research of Benjamin Libet, and the subsequent work that builds on it. To greatly simplify (and maybe oversimplify) his conclusions, Libet claims that, by measuring the electrical activity of the brain of a person in the act of making a decision, it can be shown that the portions of the brain governing action activate before the portions of the brain responsible for conscious thought. The implication is: the conscious “thinking” we do is not decision-making. It is after-the-fact justification for a decision that is being made by some other, more opaque part of ourselves. We will never get better at making decisions, because the part of us that makes decisions is beyond our understanding or control. We will remain idiots forever.
#
It’s possible to read too much into Libet’s work. Not all decisions are necessarily made in the same way, and clinical trials that measure one type of decision-making may not accurately account for factors present in different decision contexts. And some of the successor studies are sketchy, and a number of the media accounts of these studies are very obviously massaging the data to justify an ideologically convenient conclusion.
Ah, but there's the rub. What keeps popping up, in Libet’s work, and in the later work, and even in the indictments of the later work, is that we believe what we want to believe. Which is to say: we are good at rationalizing in support of our pre-existing world-view, and equally good at rationalizing away inconvenient evidence.
Scientific American posted a terrific article full of examples of this. The ability of vaccination opponents to continue to justify claims about autism that were based on a study which has been revealed to be an outright forgery. The ability of UFO cults to preserve their beliefs even in the face of having specific predictions of specific events on specific days empirically falsified. The resilience of 9/11 Truthers, or of people who believe that Barack Obama was an Indonesian Muslim agent. I would add certain beliefs prevalent on the left to this list, for instance: the belief that testosterone affects every aspect of human development that occurs below the neck but nothing that goes on above it, or that human behavior is almost entirely the product of environmental influences, with the exception of sexual orientation, which is carved in stone in the womb. If any of the above statements alienate you, fine: choose the irrational predilections of your preferred outgroup, and pretend those are the only ones I referenced.
My point is: we are good at building up walls against facts and narratives which challenge the core of who we are. I’m no exception.
#
I increasingly worry that my life as a high school debate coach has been lived in the service of a lie. Specifically: people in my profession like to believe that we train young people to think. If Libet and his successors are correct, it might be more accurate to say that we train young people to rationalize. Good debaters are skilled at marshaling data and anecdotes; GREAT debaters are skilled at framing arguments, which is to say, they learn to leverage data to activate the core narratives that govern the behavior of the people listening. But these skills have little to do with the critical investigation of ideas.
Being good at saying “that guy over there is wrong and here’s why” is a useful skill for a variety of professional applications. Persuading neutral observers of the truth of a proposition is probably less so; there seems to be very little communication these days between parties who genuinely and fundamentally disagree, and precious few neutral observers to be found. Still, I can see how that skill might conceivably be valuable in a pinch. But I’m increasingly convinced that the most important dialogue in which we can engage is internal: a process of calling into question our own deep-seated narratives of how the world works in a spirit of true openness to change. Personal improvement must, by definition, begin with a single assertion: I might be wrong.
And debate as an activity, and debaters as individuals, are terrified of those words. “I might be wrong” is a statement fundamental to the building of successful relationships, but it has no utility in the context of a competitive argument with a designated winner and loser. Perhaps the ugliest habit debate coaches build in the young people under our care is the cultivation of certainty at all costs.
I have long trained my first-year debaters to respond to questions asked in cross-examination that they don’t know the answers to by saying, “I don’t know.” Don’t lie or brazen your way through it, I tell them. If the question is unimportant, point that out. Write the question down. Bring it to me after the round and we’ll see if we can’t reason our way through it together.
The community of debate judges—experienced competitors and laypeople alike—decisively repudiates my advice on this issue. When my kids say “I don’t know,” they lose, and the fact that they said it is cited as a primary reason why. In this way, the community reinforces the idea in my students’ minds that while intelligence is useful, certainty is essential. If you don’t know, they are told, pretend that you do.
It’s terrible advice. False certainty is poison.
As a child, I thought my parents knew everything. I assumed that knowledge would descend upon me in a cloud, possibly slowly in stages, but certainly that by my eighteenth birthday I would have attained what children’s author David Wisniewski called “The Secret Knowledge of Grown-Ups.” I suspect that the dawning awareness that my parents were not omniscient, and the resulting horror at the fact that maybe nobody in the world had any idea what they were doing, may have had something to do with my teenage petulance.
And my subsequent adult petulance, as well. Because it is readily apparent to me that the world is run by adults who (1) get selected as leaders because they’re marvelously good at pretending that they know what they’re doing, and at denying any possibility that they don’t, and (2) are lying through their teeth.
The best of them may have the advantage of good personal judgment, or an intelligent willingness to surround themselves with people who have strong knowledge bases in one field or another and to defer to those people. But specific recent evidence would suggest that the sort of person who is best at projecting an aura of absolute certainty is, in fact, a person who IS absolutely certain, which is to say: a fool. And that the more insistent we are that our leaders project certainty, the less likely we are to wind up with leaders who defer to intelligent subordinates, or who…and here’s a radical thought…dispense with central control entirely, and instead respect the ability of individual citizens to make decisions in their own interest.
#
I think it is urgently necessary to rediscover the beauty of the phrase “I don’t know.” I think we need to learn to respect intellectual humility as a virtue.
I think we need to think about all of those elaborate, carefully constructed systems created by the most intelligent people, with the purest of intentions, which produced spectacular misery and utter catastrophe, and which could not be abandoned because to admit a mistake would have been to undo the core not just of the leaders’ authority, but of their reasons for existence.
I think we need to reflect on all those juries, who evaluated the evidence presented to them by skilled advocates, and the testimony of witnesses credible and incredible, and who retired to review the evidence collectively, and who emerged with carefully considered unanimous verdicts that subsequently turned out to be 100% objectively wrong. We have shielded ourselves with the belief that those people were emotional idiots and that we ourselves, rational beings through-and-through, would do differently. But we rational beings keep wandering into jury boxes and fucking up spectacularly, over and over, and I think it may be time to contemplate the possibility that those jurors might have been people very much like ourselves, who were as certain in their decisions as we are in our own.
I think we need to understand that we ourselves, like other people, are inclined to buy into narratives that support our own worldview, and to treat as “facts” stories which reinforce those narratives. And I think we need to do a better job of policing ourselves in situations where our core beliefs are being activated.
#
For instance.
Let’s say you are confronted with two very different narratives, both of them concerning the events of a night thirty-five years ago. The narratives are incompatible. One of the parties involved says: I was at a party, and I was accosted by a pair of young men who intended to rape me and possibly to kill me, and that’s one of them right there. And the accused says: not only was it not me, but the party never happened, and I have never engaged in behavior remotely similar to that which is ascribed to me.
Let’s say, for the sake of argument, that there is no actual physical evidence of any sort presented, and that no contemporaneous reports of wrongdoing were produced. Instead, you are presented with emotionally intense testimony by both parties, and a parade of character witnesses. Who do you choose to believe?
If your core narrative is that sexual assault is an American epidemic, that men in America wield power capriciously, treat women as a means to the end of their own desires and are never held to account, and that the political party of which the accused is a member is interested in extending and indeed doubling down on that pernicious reality, then you will tend to believe, and to treat as credible, the views of the accuser. You will believe that the accused is at best engaged in willful denial enabled by alcohol-induced amnesia and at worst just straight lying through his teeth.
If your core narrative is that public concern about sexual assault has transformed over time into a witch hunt, in which evidence is considered irrelevant and the presumption of innocence inconvenient, and that the Democrats have their backs to the wall and will at this point say literally anything to perpetuate the blindness of the legal system to the butchery of one million babies every year, then you will believe, and treat as credible, the views of the accused. You will believe that the accuser was perhaps assaulted by someone else and has subsequently, over thirty-five long years, superimposed the face of the accused over that of an assailant whom she couldn’t identify. Or maybe instead you decide that she is part of a broad-based conspiracy to bring down an innocent man, and that the ever-wilder accounts we’re hearing of the accused’s misbehavior by a growing list of accusers are proof of this conspiracy.
You'll believe her. Or you'll believe him. You will believe so strongly as to be certain. But your certainty will be unjustified. In neither case will you be reasoning based on physical evidence or specific facts about the night in question. You will be superimposing your favorite narrative on that event, and placing the two very real human beings involved in this horrific public drama in roles within that narrative.
And if somebody reacts to the whole spectacle by saying that they don’t know what happened on that night in 1982, you will perhaps revile them even more than people on the opposite side of the debate, because it will seem that they are abdicating even the basic level of moral responsibility involved in taking a side; that they are using waffling as a cop-out for their utter lack of any principle whatsoever; that they are willfully blind and trash humans and of no use to anyone, not even worth the trouble of engaging with.
But it will remain true that the people you watched on television today were actual humans. They are not paid actors. To them, this was real. And to reduce them to placeholders in your narrative is to dehumanize them entirely.
And to pretend certainty about events of which there is no physical evidence, and to which there were no witnesses, is to tell yourself a soothing lie: the lie that your narrative is correct on all occasions, and that so long as you cling to it, you are a soldier on the side of righteousness.
The worst monsters in history were people not very different from you and me. And the belief that their narrative was always correct, and that there could be no incorrect action congruent with it, was the elixir that they drank that transformed them into monsters.
#
I don’t know.