12 April 2010

Closed-Minded Investigators Circle the Wagons Against Wild Truths

There is always a tension, as [AF: an investigator], between asking open-ended questions that allow an interview subject to explain something and pressing or challenging them on accuracy or details. But if you think you already know the subject, or already have a story angle half-formed in your head, it's easy to overlook the first part. _Atlantic

Pundits, journalists, investigators, and researchers of all types most frequently go wrong when they begin an investigation with a pre-conceived opinion, a pre-fabricated conclusion. This bias is most obvious in mainstream climate science, but it is also abundantly clear in almost any mainstream media investigation of a politically charged topic. If a journalist assumes a person is stupid, the resulting stories will display abundant evidence of the person's stupidity. If the journalist considers a person brilliant, the story will be built around the subject's "evident" brilliance. Bias, bigotry, inflexible prejudice. And these people are the gatekeepers of "the truth."

In his new book, How We Decide, Jonah Lehrer cites a research study done by U.C. Berkeley professor Philip Tetlock. Tetlock questioned 284 people who made their living "commenting or offering advice on political and economic trends," asking them to make predictions about future events. Over the course of the study, Tetlock collected quantitative data on over 82,000 predictions, as well as information from follow-up interviews with the subjects about the thought processes they'd used to come to those predictions.

His findings were surprising. Most of Tetlock's questions about future events were put in the form of specific multiple-choice questions with three possible answers, so blind guessing alone would have been correct roughly a third of the time. Yet for all their expertise, the pundits' predictions turned out to be correct less than 33% of the time, which meant, as Lehrer puts it, that a "dart-throwing chimp" would have had a higher rate of success. Tetlock also found that the least accurate predictions were made by the most famous experts in the group.
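
To make the chance baseline concrete, here is a minimal sketch -- illustrative Python, not Tetlock's actual data; the question count merely echoes his roughly 82,000 scored predictions -- showing that a "dart-throwing chimp" guessing at random among three answers lands near 33%, the figure the pundits failed to beat:

```python
import random

# Illustrative simulation (assumed numbers, not Tetlock's data): with
# three possible answers per question, blind guessing -- the
# "dart-throwing chimp" -- is right about one time in three. Any pundit
# scoring below that baseline is doing worse than chance.

random.seed(42)                # reproducible run
N_QUESTIONS = 82_000           # echoes Tetlock's ~82,000 scored predictions

correct = [random.randrange(3) for _ in range(N_QUESTIONS)]  # true outcomes
guesses = [random.randrange(3) for _ in range(N_QUESTIONS)]  # chimp's darts

hits = sum(g == c for g, c in zip(guesses, correct))
print(f"Chance baseline: {hits / N_QUESTIONS:.1%}")          # prints ~33.3%
```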

Why was that? According to Lehrer,

"The central error diagnosed by Tetlock was the sin of certainty, which led the 'experts' to impose a top-down solution on their decision-making processes ... When pundits were convinced that they were right, they ignored any brain areas that implied they might be wrong."

Tetlock himself, Lehrer says, concluded that "The dominant danger [for pundits] remains hubris, the vice of closed-mindedness, of dismissing dissonant possibilities too quickly." _Atlantic

In part, this is the phenomenon of the True Believer. Humans are social animals who crave reassurance that they are valued members of the group. It is also a manifestation of mental laziness: it takes effort to change one's mind, and a person's entire life and lifestyle may be overturned by a justified yet seemingly simple change of opinion. In addition, as individuals age, they sink more deeply into the mental architecture they have constructed.

Personal opinions are fortified to protect the individual from the wild and unpredictable threats "outside." Stray too far from the safe, warm confines of personal prejudice, and cognitive dissonance will swiftly set in. Most modern humans are unequipped to deal with high levels of cognitive dissonance. They quickly retreat to the familiar. They circle the wagons against the wild and unruly truths that howl in the night.

This is our world: a world where college professors indoctrinate rather than educate, where journalists roam in packs and savage anyone who threatens the dominant social and political memes, where scientists latch onto a theme that is popular with grant agencies and publishers -- and run with it despite all objective reality.

What would you like to do about it?

H/T Chicago Boyz


2 Comments:

Blogger read it said...

Reminds me of when I was young and I feared losing the ability to change my mind.

Monday, 12 April, 2010  
Blogger al fin said...

Yes. And then one grows older and learns that changing one's mind can lead to significant changes in the way that others regard one.

If one joins the herd -- out of a need for security and a sense of belonging -- one can very easily lose the ability to change one's mind.

That happens with members of private and public sector unions, law enforcement and the military, and other herd institutions that promise some level of long term security. Tenure of any type and at any level can easily lead to the herd mind.

Wednesday, 14 April, 2010  

