12 December 2011

Everything You Think You Know, Just Ain't So

Here are two ways in which we typically go wrong:
1. Cognitive hubris: each of us believes that his map of the world is more accurate than it really is.

2. Radical ignorance: when it comes to complex social phenomena, our maps are highly inaccurate. _American

We have learned from cognitive psychologists such as Daniel Kahneman that our intuition -- no matter how solid it feels -- is often built on a shaky foundation.

What's interesting is that many a time people have intuitions that they're...confident about except they're wrong. That happens through the mechanism that I call "the mechanism of substitution." You've been asked a question, and instead you answer another question, but that answer comes by itself with complete confidence, and you're not aware that you're doing something that you're not an expert on, because you have one answer. Subjectively, whether it's right or wrong, it feels exactly the same. Whether it's based on a lot of information or a little information is something that you may step back and have a look at. But the subjective sense of confidence can be the same for intuitions that arise from expertise and for intuitions that arise from heuristics -- from substitution, from answering a different question. _Edge

This phenomenon of "false expertise" is extremely common -- particularly among college professors, journalists, politicians, and others who are not typically held to a high standard of performance and proof. It is also a common part of everyday existence for virtually everyone.
Suppose you were to ask yourself how well you understand the world around you. How accurate is your map of reality?

If you interrogate System Two [slow, logical mind], it might reply, “There are many phenomena about which I know little. In the grand scheme of things, I am just blindly groping through a world that is far too complex for me to possibly understand.”

However, if you were to interrogate System One [fast, intuitive mind], it might reply, “My map is terrific. Why, I am very nearly omniscient!”

Evidently, in order to perform its function, System One has to have confidence in its map. Indeed, elsewhere Kahneman has told a story of a group of Swiss soldiers who were lost in the Alps because of bad weather. One of them realized he had a map. Only after they had successfully climbed down to safety did anyone discover that it was a map of the Pyrenees. Kahneman tells that story in the context of discussing economic and financial models. Even if those maps are wrong, we still feel better when using them. [2]

In fact, a number of the cognitive biases that Kahneman and other psychologists have documented would appear to serve as defense mechanisms, enabling the individual to hold onto the view that his map is the correct one. For example, there is confirmation bias, which is the tendency to be less skeptical toward evidence in support of one's views than toward contrary evidence.

System Two is evidently not able to overcome cognitive hubris, even in situations where one would expect System Two to be invoked, such as forecasting the difficulty of a major undertaking. _American

We have to rely upon our fast, intuitive, unconscious mind in order to get through a normal day. So much of our lives is carried out on "autopilot" simply because we cannot reason through every split second of our waking hours. It would be a tremendous waste of very expensive conscious attention to do so.

And yet, much of the time when we should give careful conscious attention to an action, a choice, or a verbal or written expression, we fail to do so. We are accustomed to trusting our intuitions, and we generally muddle through well enough. But not always. And who can teach us when to take the time and make the effort to apply conscious reasoning, and when we can safely and effortlessly slide by on unconscious intuition? That is where wisdom comes in.

Arnold Kling applies Kahneman's ideas to the political realm:
When two ideological opponents wind up on different hilltops, neither can believe that the other has sincerely arrived at a different conclusion based on the evidence. As Friedman puts it,

Consider the most reviled pundit on the other side of the political spectrum from yourself. To liberal ears, a Rush Limbaugh or a Sean Hannity, while well informed about which policies are advocated by conservatives and liberals, will seem appallingly ignorant of the arguments and evidence for liberal positions. The same goes in reverse for a Frank Rich or a Paul Krugman, whose knowledge of the “basics” of liberalism and conservatism will seem, in the eyes of a conservative, to be matched by grave misunderstandings of the rationales for conservative policies. [5]

Indeed, our cognitive hubris is so strong that, according to David McRaney, people believe they understand other people better than those people understand themselves. He calls this phenomenon “asymmetric insight.” [6]

The illusion of asymmetric insight makes it seem as though you know everyone else far better than they know you, and not only that, but you know them better than they know themselves. You believe the same thing about groups of which you are a member. As a whole, your group understands outsiders better than outsiders understand your group, and you understand the group better than its members know the group to which they belong.

In our context, this would mean that liberals believe that they understand better than conservatives how conservatives think, and conservatives believe that they understand better than liberals how liberals think. According to McRaney, such beliefs have indeed been found in studies by psychologists Emily Pronin and Lee Ross at Stanford along with Justin Kruger at the University of Illinois and Kenneth Savitsky at Williams College. _Arnold Kling

These are very important insights that should be applied to our own everyday thinking. They could save us a great deal of embarrassment and unnecessary interpersonal friction. But the fact that "conservatives" and "liberals" make the same types of mistakes when judging the other side does not make them equally right or equally wrong on every issue. Look for people who have changed their minds, and try to understand their reasons for changing.

Some changes may be a quasi-cohort effect. The old saying, "If you are a conservative at age 20, then you have no heart. But if you are not a conservative at age 30, you have no head," is a reflection of a common tendency for the "hard knocks" of life to beat a bit of conservatism into almost anyone, over time. If one is born into wealth, achieves early success, commits irrevocably to a cause in his youth, or acquires a sinecured position at a fairly early age, he is more likely to be able to avoid many of the uncomfortable changes in perspective which others may feel compelled to go through.

Simple observation over time demonstrates to most intelligent people that utopian ideas -- when implemented -- do not bring about the grand results they promise, but generally produce the opposite effect.

More to come on why this topic is important.


2 Comments:

Sojka's Call said...

Only recently have I become conscious of asymmetric insight, and having terminology to describe this behavior is very helpful. Thanks for this post - a good gentle slap in the face this morning to help maintain my humility.

Wednesday, 14 December, 2011  
Sojka's Call said...

Also, to try to understand the reality of political policy and its effects on the economy and society, I just purchased and have read a few pages of Presimetrics, which came highly recommended by an engineer friend who prizes using data to make decisions. Hopefully it will help my System Two mind get some perspective. http://presimetrics.com/?page_id=6

We'll see.....

Wednesday, 14 December, 2011  
