Sunday, April 6, 2014

I know that I am not right about everything, and yet I am simultaneously convinced that I am.

From “From creationism to ESP: Why believers ignore science” by Laura Miller.
Fortunately for Storr, he wanted to write and he’s really good at it. Each chapter in “The Unpersuadables” plunges the author into a peculiar subculture, and each has its own narrative flavor. The account of a family who discovers that their late, estranged daughter and sister had fallen into the clutches of therapists convinced of the existence of networks of Satanic ritual child abusers, murderers and cannibals works like a detective story. Another on Lord Monckton, a famous British climate-change denier, is a profile in reactionary nostalgia as a way of life. The Morgellons chapter is, of course, a medical thriller, although the results are not vindicating when Storr finds a doctor willing to analyze some fibers for a sufferer. (Most medical professionals view even the request to have such lab work done as a symptom of a psychological disorder, but this particular physician had been afflicted with a rare parasite himself and sympathized.) And Storr tries — very, very hard, in a chapter resembling a courtroom drama — to get to the bottom of some troubling disputes surrounding the presiding saint of the skeptics movement, James Randi.

Running through all these stories is Storr’s growing uncertainty about certainty. In the first chapter, he presents his readers with a conundrum: “I consider — as everyone surely does — that my opinions are the correct ones,” yet to assume that he really is right about everything “would mean that I possess a superpower: a clarity of thought that is unique among human beings. Okay, fine. So accept that I am wrong about things — I must be wrong about them.” Yet when Storr surveys his own views, again, they all strike him as spot-on. “I know that I am not right about everything, and yet I am simultaneously convinced that I am. I believe these two things completely, and yet they are in catastrophic logical opposition to each other.”

Such rumination undermines Storr’s faith in his convictions, rooted as they once were in the rather quaint confidence that human beings make up their minds rationally. Instead, exploring recent developments in neuroscience, he learns that we believe first — engaging mental models formed early in life and rarely amenable to change — and come up with the reasons for it afterward. By the now-familiar process of confirmation bias, we ignore what doesn’t support our most favored notions, and shine a brilliant spotlight on the ones that do. Our minds operate unconsciously to a flabbergasting degree, while our consciousness is forced to tag along after, cooking up convincing rationales. “We do not get to choose our most passionately held views, as if we are selecting melons in a supermarket,” is Storr’s provocative conclusion.

I like Storr's articulation of the paradox of people's confidence: “I know that I am not right about everything, and yet I am simultaneously convinced that I am.” If we logically accept that we are not right about everything, then the task is to identify which of our beliefs are untrue and eradicate them. But we don't do it.

None of us is consciously irrational or illogical or stupid. Consequently, everything we believe must be true; otherwise we would have corrected it. Therefore, when we make decisions, we must be making good decisions. And when things don't turn out the way they ought to have, it must be the fault of someone else and not ourselves.

The reality is that we do tolerate illogic, irrationality, and dubious facts; we just work very hard to ignore that. I guess the resolution is that we do know we are not right about everything, but we characterize the things about which we are incorrect as trivial enough not to warrant the effort of investigation. We discount the risk-adjusted cost of cognitive mismatch (the mismatch between our chosen belief and the measured reality). If we understood the real cost of holding ill-supported beliefs, we would likely invest more effort in assessing them. But because we don't understand that cost, and because we have built-in psychological mechanisms for attributing those costs to the failures of others, we end up never improving our decision processes.

I wonder if this isn't behind the long-run success of Protestant Christianity in general (accountability to God), Calvinist-based Christianity in particular, Judaism, and the Confucian-based cultures of northeast Asia. With their focus on individual agency, responsibility, and accountability, they discourage individuals from blaming others for their failures and thereby force them to examine their own suppositions, creating an epistemological feedback mechanism for improving knowledge and logic. I guess the counterfactual question would be: are there any cultures marked by long-term success that also disavow personal responsibility for cognitive hygiene?
