Rather be certain than right?

Faculty member Linda Riebel, Ph.D., discusses why—today more than ever—we must understand the dangers of the illogical “self-sealing doctrine” in religion and politics.

By Dr. Linda Riebel

In today’s polarized climate, some people deny the glaringly obvious—insisting, for instance, that there is no climate change. More insidiously, they may twist inconvenient facts into support for their own beliefs.

At a commencement speech in 2017, President Donald Trump said, “The more that a broken system tells you that you’re wrong, the more certain you should be that you must keep pushing ahead. … You aren’t going to let other people tell you what you believe, especially when you know that you are right.”

To him, the very existence of opposition means “you are right.” This is a self-sealing maneuver, in which every inconvenient fact is twisted to mean the belief is right, protecting it from disproof. When the long-promised repeal of the Affordable Care Act failed, Mick Mulvaney (the White House’s budget director) made an excuse.

“What happened is that Washington won,” he said, according to NBC News. “I think the one thing we learned this week is that Washington is a lot more broken than President Trump thought that it was.”

Trump added, “The best thing that could happen is exactly what happened.”

The first statement is simple blame-shifting; the second goes beyond making the best of a bad situation: it repackages the facts so that Trump turns out to have been right all along.

Winning the debate or manipulating the results?

Unfortunately, Trump is not the only one to use this technique. More than 100 years ago, intelligence tests were being devised and put into practice—but some of the test results were unsettling to researchers.

When black children scored higher than white children, the following reason was given: “The apparent mental attainments of children of inferior races may be due to lack of inhibition and so witness precisely to a deficiency in mental growth.”

Another researcher in that era found that whites performed more slowly than blacks, and concluded, “Their [whites’] reactions were slower because they belonged to a more deliberate and reflective race.”

So in spite of results showing superior performance by blacks, these researchers made the evidence fit their beliefs, protecting their feelings of racial superiority. This maddening imperviousness to truth has been called a “self-sealing” maneuver, in that the belief system turns disconfirming truths around to mean they support the beliefs—“sealing” the system against disproof.

How cults choose beliefs over facts

I became interested in the vagaries of dangerous thinking when my brother joined a religious cult, distanced himself from the family, and adopted activities that were puzzling and self-destructive. Trying to understand what was happening, I learned that cults are closed groups run with sophisticated mental manipulation—prisons without walls.

Naïve people (often confused young adults) are lured into cults with promises of respect, membership in a “special” group that knows secret truths, and relief from their uncertainties. After the initial “love-bombing” phase, the group and its leaders indoctrinate them in their belief system, which is often bizarre. They are kept in the group by harangues about the evils of the outside world, even though the cult’s own lifestyle may consist of suspicious in-group intrigue, isolation from loved ones, and endless unpaid labor.

Why do they stay? One reason (of many) is that cults explain away inconvenient truths so that if you try to reason with cult members, they have ready-made armor for repelling your facts. Perhaps no truth is more inconvenient or obvious than the failure of the world to end on the date that the leader prophesied. Given how many times the end has been predicted and failed to materialize, it’s surprising to me that eccentric people continue to predict it.

Fascinating research has been done on the mental maneuvers that disappointed cultists may engage in to explain away the non-appearance of the apocalypse. In one classic case, researchers infiltrated a group whose leader claimed that extraterrestrials would rescue her and her followers before the earth’s imminent destruction. The deadline predicted by the leader passed, and the group decided it had only been an “alert.” Second and third deadlines passed.

After blaming the clock, the leader concluded, “Suppose they gave us a wrong date. Well, this only got into the newspapers on Thursday and people had only 72 hours to get ready to meet their maker. Let’s suppose it happens next year or 2 or 3 or 4 years from now … Maybe people will say it was this little group spreading light here that prevented the flood.”

A few hours later she added, “Not since the beginning of time upon this Earth has there been such a force of good and light as now floods this room.” This is a comforting rationalization.

Another favorite religious explanation for an untoward event is that God (or the leader) allowed it to happen “in order to test our faith” or “to give us a lesson.” Bhagwan Shree Rajneesh was the Indian guru who came to grief in America. His followers explained away the contrast between his teachings of love and the paranoid, violent atmosphere of his compound in Oregon.

“They were, for example, able to convince themselves that the watch towers and the 150-member police force armed with semiautomatic weapons were devices employed by Rajneesh to make them aware of their aggressive impulses by showing them what could happen when such impulses were exaggerated.”

Though he cautions against lumping all fundamentalists together, scholar C.O. Lundberg observed: “The fundamentalist version of truth has a self-sealing insularity: it is not troubled by specifics of context, history, or contrary evidence. … The idea of a strongly held truth, of fidelity to a fundamental tenet of faith authorizing true believers to ignore epistemic [evidential] challenges, is undoubtedly a characteristic of religious fundamentalism.”

Creating proof even when there is none

Self-sealing doctrines can be found in other places too—some of them uncomfortably close to home. Psychoanalysis made some valuable contributions to understanding the human psyche, but its founder was convinced that his theories were universally true, and any challenge was simply further proof of them.

One commentator wrote, “Once Sigmund Freud was convinced of the universality of the Oedipus complex, he looked for it in every patient, whether they admitted it or not … Anyone who failed to agree with Freud’s interpretation of these cases either lacked the special expertise of the initiated or else was repressing the obvious because of personal unresolved complexes.”

Here Freud says the other person is “part of the problem” and therefore not qualified to criticize. A literal example of this comes from one of his published lectures about sex: “Nor will you have escaped worrying over this problem, those of you who are men. To those of you who are women this does not apply—you are yourselves the problem.”

Thus he disqualifies in advance the opinions of those he is discussing; whatever they may say in refutation of his ideas will not be admitted because “you are yourselves the problem.”

Continuing to protect his theory, Freud went on: “For the ladies, whenever some comparison seemed to turn out unfavorable to their sex, were able to utter a suspicion that we, male analysts, had been unable to overcome certain deeply-rooted prejudices against what was feminine, and that this was being paid for in the partiality of our researches. We, on the other hand, standing on the ground of bisexuality, had no difficulty in avoiding impoliteness. We had only to say, ‘This doesn’t apply to you. You’re the exception. On this point you are more male than female.’”

This argument is armed against refutation. If women dispute his point, they are “part of the problem” (what problem?); if they accept it, they again prove his point because they are exceptions to the theory. And they are not simply exceptions but “more male than female,” so the admirable qualities are still ascribed to masculinity. Freud also introduced the social convention of “politeness” into what purports to be a scientific discussion.

In another example from early psychoanalytic thinking, Alfred Adler was told by another analyst that he had found evidence of the Oedipus complex in dogs. His puppy, it seemed, liked to sleep with its mother. Learning that the mother dog had a larger basket than the father dog, Adler asked the analyst to test his theory by switching the baskets. Not surprisingly, the puppy climbed into the familiar larger basket, now occupied by its father. Undaunted, the analyst exclaimed, “Shouldn’t that prove to you that the puppy has now reached the second stage of sexual growth and become homosexual?”

See how this works? You simply claim that the inconvenient evidence is further support for your theory. Of course, you may have to make your cherished theory increasingly complex to account for disconfirming evidence.

The biggest difference between politics, religion, and science

In fact, Thomas Kuhn, in The Structure of Scientific Revolutions (1970), an influential history of science, argued that this is exactly how genuine science works—a theory is a valuable tool for collecting observations and explaining them but not the ultimate truth. Over time, disconfirming evidence is noticed. Resisted at first, and then reluctantly worked into the theory in increasingly byzantine corollaries, the disconfirming evidence eventually topples the old theory—or rather, the old “paradigm,” meaning an interconnected set of theories.

Then, scientists “shift” to a new paradigm (partly because adherents of the earlier one have died off). Of course, Kuhn was referring to the historic evolution of ideas taking place over decades and centuries.

Astrophysicist and science educator Carl Sagan offered a more immediate and sympathetic view from the standpoint of a practicing scientist.

“In science it often happens that scientists say, ‘You know that's a really good argument; my position is mistaken,’ and then they would actually change their minds and you never hear that old view from them again,” Sagan says. “They really do it. It doesn't happen as often as it should, because scientists are human and change is sometimes painful. But it happens every day. I cannot recall the last time something like that happened in politics or religion.”

Unfortunately, resistance to unwelcome findings has occurred even among scientists, and even in the field of psychology. The racist interpretation quoted earlier in this essay came from none other than E. L. Thorndike, an eminent pioneer of American psychology.

Commenting on psychoanalysis, physicist Robert Oppenheimer wrote that “a self-sealing system … has a way of almost automatically discounting evidence that might bear adversely on the doctrine. The whole point of science is just the opposite: to invite the detection of error and to welcome it.”

A diligent researcher is wise to treat critics as unexpected friends: they’ll save you from embarrassing yourself, and they may even help your career.

Georg von Békésy, a medical researcher, recommended that you should “have friends who are willing to spend the time necessary to carry out a critical examination of the experimental design beforehand and the results after the experiments have been completed.” Békésy, who was awarded the 1961 Nobel Prize in Physiology or Medicine, went on: “An even better way is to have an enemy. An enemy is willing to devote a vast amount of time and brain power to ferreting out errors both large and small, and this without any compensation. The trouble is that really capable enemies are scarce; most of them are only ordinary. Another trouble with enemies is that they sometimes develop into friends and lose a good deal of their zeal. It was in this way that the writer lost his three best enemies.”

What’s missing from the self-sealing mindset

A huge commitment to a cult, an ego unable to accept defeat, or a paranoid tendency may all operate to make it impossible for a person to admit being mistaken.

Trump’s biographer Michael D’Antonio, the author of “Never Enough” and “The Truth About Trump,” told Yahoo! News: “I think Trump is temperamentally inclined toward conspiracy theories and, at the same time, disinclined to do the work of studying matters fully.”

Delving deeper, he added, “It takes a flexible, curious mind to seek out competing ideas and weigh them. Then it takes even more rigor to fashion a complex solution to a vexing problem. It’s much easier to listen to one or two voices who affirm your preconceptions and dismiss all others because (you think) they are somehow against you.”

The self-sealing doctrine temporarily makes life easier. Its users never have to seriously entertain inconvenient evidence; they can even turn it aside in ways that make them feel better.

Taking the high road in politics

How do we avoid such damaging attachment to false beliefs and such twisting of any evidence that disproves them? I believe we need the courage to question our theories, to take inconvenient evidence seriously, and to resist the temptation to use self-sealing maneuvers. This is easy to say but not so easy to do if it’s your own belief that is being challenged.

In his autobiography, physicist and Nobel laureate Richard Feynman offered an inspiring perspective: “It is our responsibility as scientists, knowing the great progress which comes from a satisfactory philosophy of ignorance [and] the great progress which is the fruit of freedom of thought, to proclaim the value of this freedom; to teach how doubt is not to be feared but welcomed and discussed; and to demand this freedom as our duty to all coming generations.”

The self-sealing doctrine is indeed a rich topic. This little verse of mine captures its essence:

Protected from having to question and wonder,
They stifle their doubts, pressing discomfort under.
So whether they follow a leader, or sit all alone at night,
The tragedy is that they’d rather be certain than right. 

Linda Riebel received her Ph.D. from Saybrook (then called the Humanistic Psychology Institute) in 1981 and has been on Saybrook’s faculty since 1993. Originally a clinician, she now serves on the Transformative Social Change and Creativity Studies faculties. Her volunteer efforts involve opera, wildlife, and environmental causes. Do you know a case of the self-sealing doctrine? Please send it to her at [email protected]