15 January 2013

Everything You Think You Know, Just Ain't So

Humans cannot perceive reality directly. Instead, the outside world is filtered through our various senses. Significant filtering and pre-processing occur in the sensory organs themselves. Inside the brain, "primary" areas are devoted to each of the senses, where further filtering and pre-processing of information take place -- under the influence of feedback from "higher" brain areas. From the primary sensory cortex, the information proceeds to "associative" sensory areas and other cortical and sub-cortical parts of the brain, where an incredible free-for-all of clashing and matching information occurs deep below the level of consciousness.

All of these sensory and mental processes find themselves immersed within a sea of emotions, influenced by the hormonal, biochemical, physiological, and genetic state of the individual. Every moment of time contains its own mix of emotions, frequently linked and tagged subconsciously to subtle sensory cues, such as fragrances, colours, or musical phrases.

This insistent background will shade our "knowing" in different ways, at different times.


Within the great tumbling scrum of the subconscious, many of the brain's default mechanisms operate below the level of awareness, detectable only by careful testing. But when put under the magnifying glass, the mind betrays its inner workings in the form of sensory illusions and cognitive biases.
Many cognitive biases arise from subconscious laziness, a kind of ongoing conservation of energy by the brain. Others are simply quirks resulting from the basic design of brain circuits and function.

But even after information has survived these minefields -- of sensory filters, perceptual illusions, emotional biasing, and cognitive biasing -- there are other types of information distortion which must be traversed. At a more conscious level, humans must contend with a large number of "logical fallacies" that are incorporated into verbal communication for many reasons -- some of them honest, and some of them quite devious.

Always remember that descriptions and explanations of logical fallacies may in themselves be fallacious and misleading.


Top 20 Logical Fallacies

Ad hominem
An ad hominem argument is any argument that attempts to counter another’s claims or conclusions by attacking the person, rather than addressing the argument itself. True believers often commit this fallacy when they counter the arguments of skeptics by declaring that skeptics are closed-minded. Skeptics, on the other hand, may fall into the trap of dismissing the claims of UFO believers, for example, by stating that people who believe in UFOs are crazy or stupid.

A common form of this fallacy is also frequently present in the arguments of conspiracy theorists (who also rely heavily on ad-hoc reasoning). For example, they may argue that the government must be lying because they are corrupt.

It should be noted that simply calling someone a name or otherwise making an ad hominem attack is not in itself a logical fallacy. It is only a fallacy to claim that an argument is wrong because of a negative attribute of the person making it. (“John is a jerk” is not a fallacy; “John is wrong because he is a jerk” is.)

The term “poisoning the well” also refers to a form of ad hominem fallacy. This is an attempt to discredit the argument of another by implying that they possess an unsavory trait, or that they are affiliated with other beliefs or people that are wrong or unpopular. A common form of this also has its own name – Godwin’s Law or the reductio ad Hitlerum. This refers to an attempt at poisoning the well by drawing an analogy between another’s position and Hitler or the Nazis.
Ad ignorantiam
The argument from ignorance basically states that a specific belief is true because we don’t know that it isn’t true. Defenders of extrasensory perception, for example, will often overemphasize how much we do not know about the human brain. It is therefore possible, they argue, that the brain may be capable of transmitting signals at a distance.

UFO proponents probably commit this fallacy more often than anyone. Almost all UFO eyewitness evidence is ultimately an argument from ignorance: lights or objects sighted in the sky are unknown, and therefore they are alien spacecraft.

Intelligent design is almost entirely based upon this fallacy. The core argument for intelligent design is that there are biological structures that have not been fully explained by evolution, therefore a powerful intelligent designer must have created them.

In order to make a positive claim, however, positive evidence for the specific claim must be presented. The absence of another explanation only means that we do not know – it doesn’t mean we get to make up a specific explanation.
Argument from authority
The basic structure of such arguments is as follows: Professor X believes A, Professor X speaks from authority, therefore A is true. Often this argument is implied by emphasizing the many years of experience, or the formal degrees, held by the individual making a specific claim. The converse of this argument is sometimes used as well: that someone does not possess authority, and therefore their claims must be false. (This may also be considered an ad hominem logical fallacy – see above.)

In practice this can be a complex logical fallacy to deal with. It is legitimate to consider the training and experience of an individual when examining their assessment of a particular claim. Also, a consensus of scientific opinion does carry some legitimate authority. But it is still possible for highly educated individuals, and even a broad consensus, to be wrong – speaking from authority does not make a claim true.

This logical fallacy crops up in more subtle ways also. For example, UFO proponents have argued that UFO sightings by airline pilots should be given special weight because pilots are trained observers, are reliable characters, and are trained not to panic in emergencies. In essence, they are arguing that we should trust the pilot’s authority as an eyewitness.

There are many subtypes of the argument from authority, essentially referring to the implied source of authority. A common example is the argument ad populum – a belief must be true because it is popular, essentially assuming the authority of the masses. Another example is the argument from antiquity – a belief has been around for a long time and therefore must be true.
Argument from Final Consequences
Such arguments (also called teleological) are based on a reversal of cause and effect, because they argue that something is caused by the ultimate effect that it has, or the purpose that it serves. Christian creationists have argued, for example, that evolution must be wrong because if it were true it would lead to immorality.

One type of teleological argument is the argument from design. For example: the universe has all the properties necessary to support life, therefore it was designed specifically to support life (and therefore had a designer).
Argument from Personal Incredulity
I cannot explain or understand this, therefore it cannot be true. Creationists are fond of arguing that they cannot imagine the complexity of life resulting from blind evolution, but that does not mean life did not evolve.
Begging the Question
The term “begging the question” is often misused to mean “raises the question” (and common use will likely change, or at least add, this new definition). However, the intended meaning is to assume a conclusion in one’s question. This is similar to circular reasoning, in that the argument tries to slip a conclusion into a premise or question – but it is not the same as circular reasoning, because the question being begged can be a separate point, whereas with circular reasoning the premise and the conclusion are the same.

The classic example of begging the question is to ask someone if they have stopped beating their wife yet. Of course, the question assumes that they ever beat their wife.

In my appearance on the Dr. Oz show I was asked: what are alternative medicine skeptics (termed “holdouts”) afraid of? This is a double feature of begging the question. By using the term “holdout,” the question assumes that acceptance of alternative medicine has already become the majority position and is inevitable. But Oz also begged the question that skeptics are “afraid.” This also created a straw man (see below) of our position, which is in fact based on a dedication to reasonable standards of science and evidence.
Confusing association with causation
This is similar to the post-hoc fallacy in that it assumes cause and effect for two variables simply because they occur together. This fallacy is often used to give a statistical correlation a causal interpretation. For example, during the 1990s both religious attendance and illegal drug use were on the rise. It would be a fallacy to conclude that religious attendance therefore causes illegal drug use. It is also possible that drug use leads to an increase in religious attendance, or that both drug use and religious attendance are increased by a third variable, such as an increase in societal unrest. It is also possible that the two variables are independent of one another, and it is mere coincidence that they are both increasing at the same time.

This fallacy, however, has a tendency to be abused, or applied inappropriately, to deny all statistical evidence. In fact this constitutes a logical fallacy in itself: the denial of causation. This abuse takes two basic forms. The first is to deny the significance of correlations that are demonstrated with prospective controlled data, such as would be acquired during a clinical experiment. The problem with assuming cause and effect from mere correlation is not that a causal relationship is impossible; it is that other variables must be considered and not ruled out a priori. A controlled trial, however, by its design attempts to control for as many variables as possible in order to maximize the probability that a positive correlation is in fact due to causation.

The second is to dismiss purely epidemiological, or statistical, evidence out of hand. Yet even with such evidence it is still possible to build a strong scientific case for a specific cause. The way to do this is to look at multiple independent correlations to see if they all point to the same causal relationship. For example, it was observed that cigarette smoking correlates with getting lung cancer. The tobacco industry, invoking the “correlation is not causation” logical fallacy, argued that this did not prove causation. They offered as an alternate explanation “factor X,” a third variable that causes both smoking and lung cancer. But we can make predictions based upon the smoking-causes-cancer hypothesis. If this is the correct causal relationship, then duration of smoking should correlate with cancer risk, quitting smoking should decrease cancer risk, smoking unfiltered cigarettes should carry a higher cancer risk than smoking filtered cigarettes, and so on. If all of these correlations turn out to be true, which they do, then we can triangulate to the smoking-causes-cancer hypothesis as the most likely causal relationship, and it is not a logical fallacy to conclude from this evidence that smoking probably causes lung cancer.
Confusing currently unexplained with unexplainable
The fact that we do not currently have an adequate explanation for a phenomenon does not mean that it is forever unexplainable, or that it therefore defies the laws of nature or requires a paranormal explanation. An example of this is the "God of the Gaps" strategy of creationists: whatever we cannot currently explain is declared unexplainable, and therefore an act of God.
False Analogy
Analogies are very useful, as they allow us to draw lessons from the familiar and apply them to the unfamiliar. Life is like a box of chocolates – you never know what you’re going to get.

A false analogy is an argument based upon an assumed similarity between two things, people, or situations, when in fact the things being compared are not similar in the manner invoked. Saying that the probability of a complex organism evolving by chance is the same as the probability of a tornado ripping through a junkyard and creating a 747 by chance is a false analogy. Evolution, in fact, does not work by chance but is the non-random accumulation of favorable changes.

Creationists also make an analogy between life and your home, invoking the notion of thermodynamics, or entropy. Over time your home will become messy, and things will start to break down. The house does not spontaneously become cleaner or better repaired.

The false analogy here is that a home is an inanimate collection of objects, whereas life uses energy to grow and reproduce; the addition of energy to the system of life allows for a local reduction in entropy, and for evolution to happen.

Another way in which false analogies are invoked is to make an analogy between two things that are in fact analogous in many ways – just not the specific way being invoked in the argument. Just because two things are analogous in some ways does not mean they are analogous in every way.
False Continuum
The idea that because there is no definitive demarcation line between two extremes, the distinction between the extremes is not real or meaningful: there is a fuzzy line between cults and religions, therefore they are really the same thing.
False Dichotomy
Arbitrarily reducing a set of many possibilities to only two. For example, evolution is not possible, therefore we must have been created (assumes these are the only two possibilities). This fallacy can also be used to oversimplify a continuum of variation to two black and white choices. For example, science and pseudoscience are not two discrete entities, but rather the methods and claims of all those who attempt to explain reality fall along a continuum from one extreme to the other.
Genetic Fallacy
The term “genetic” here does not refer to DNA and genes, but to history (and therefore a connection through the concept of inheritance). This fallacy assumes that something’s current utility is dictated by and constrained by its historical utility. This is easiest to demonstrate with words: a word’s current use may be entirely unrelated to its etymological origins. For example, if I use the term “sunset” or “sunrise” I am not implying belief in a geocentric cosmology in which the sun revolves around the Earth and literally “rises” and “sets.”
Inconsistency
Applying criteria or rules to one belief, claim, argument, or position but not to others. For example, some consumer advocates argue that we need stronger regulation of prescription drugs to ensure their safety and effectiveness, but at the same time argue that medicinal herbs should be sold with no regulation for either safety or effectiveness.
No True Scotsman
This fallacy is a form of circular reasoning, in that it attempts to include a conclusion about something in the very definition of the word itself. It is therefore also a semantic argument.

The term comes from the example: if Ian claims that all Scotsmen are brave, and you provide a counterexample of a Scotsman who is clearly a coward, Ian might respond, "Well, then, he's no true Scotsman." In essence Ian claims that all Scotsmen are brave by including bravery in the definition of what it is to be a Scotsman. This argument does not establish any facts or new information, and is limited to Ian's definition of the word "Scotsman."
Non-Sequitur
In Latin this term translates to "doesn't follow". This refers to an argument in which the conclusion does not necessarily follow from the premises. In other words, a logical connection is implied where none exists.
Post-hoc ergo propter hoc
This fallacy follows the basic format: A preceded B, therefore A caused B. It assumes cause and effect for two events just because they are temporally related (the Latin translates to "after this, therefore because of this").
Reductio ad absurdum
In formal logic, the reductio ad absurdum is a legitimate argument. It follows the form that if the premises are assumed to be true, they necessarily lead to an absurd (false) conclusion, and therefore one or more of the premises must be false. The term is now often used to refer to the abuse of this style of argument, by stretching the logic in order to force an absurd conclusion. For example, a UFO enthusiast once argued that if I am skeptical about the existence of alien visitors, I must also be skeptical of the existence of the Great Wall of China, since I have not personally seen either. This is a false reductio ad absurdum because he is ignoring evidence other than personal eyewitness evidence, and also logical inference. In short, being skeptical of UFOs does not require rejecting the existence of the Great Wall.
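Schematically, the legitimate formal pattern can be rendered as a minimal sketch (the notation below is only one illustrative way of writing it, not part of the original formulation): if assuming a claim P, alongside the accepted premises, leads to a contradiction, then P must be rejected.

    \documentclass{article}
    \usepackage{amssymb}
    \begin{document}
    % Reductio ad absurdum, schematic form (illustrative only):
    % if assuming P together with the accepted premises Gamma yields a
    % contradiction (bottom), then P must be rejected.
    \[
      \Gamma,\, P \;\vdash\; \bot
      \quad\Longrightarrow\quad
      \Gamma \;\vdash\; \neg P
    \]
    \end{document}

The abusive version described above skips the derivation step entirely: an allegedly absurd conclusion is simply asserted, rather than actually shown to follow from the premises.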
Slippery Slope
This logical fallacy is the argument that a position is not consistent or tenable because accepting the position means that the extreme of the position must also be accepted. But moderate positions do not necessarily lead down the slippery slope to the extreme.
Special pleading, or ad-hoc reasoning
This is a subtle fallacy which is often difficult to recognize. In essence, it is the arbitrary introduction of new elements into an argument in order to patch it up so that it appears valid. A good example of this is the ad hoc dismissal of negative test results. For example, one might point out that ESP has never been demonstrated under adequate test conditions, and therefore ESP is not a genuine phenomenon. Defenders of ESP have attempted to counter this argument by introducing the arbitrary premise that ESP does not work in the presence of skeptics. This fallacy is often taken to ridiculous extremes, with more and more bizarre ad hoc elements added to explain away experimental failures or logical inconsistencies.
Straw Man
A straw man argument attempts to counter a position by attacking a different position – usually one that is easier to counter. The arguer invents a caricature of his opponent’s position – a “straw man” – that is easily refuted, but not the position that his opponent actually holds.

For example, defenders of alternative medicine often argue that skeptics refuse to accept their claims because they conflict with their world-view. If “Western” science cannot explain how a treatment works, then it is dismissed out-of-hand. If you read skeptical treatment of so-called “alternative” modalities, however, you will find the skeptical position much more nuanced than that.

Claims are not dismissed a priori because they are not currently explained by science. Rather, in some cases (like homeopathy) there is a vast body of scientific knowledge that says that homeopathy is not possible. Having an unknown mechanism is not the same thing as being demonstrably impossible (at least as best as modern science can tell). Further, skeptical treatments of homeopathy often thoroughly review the clinical evidence. Even when the question of mechanism is put aside, the evidence shows that homeopathic remedies are indistinguishable from placebo – which means they do not work.
Tautology
Tautology in formal logic refers to a statement that must be true in every interpretation by its very construction. In rhetorical logic, it is an argument that utilizes circular reasoning, which means that the conclusion is also its own premise. Typically the premise is simply restated in the conclusion, without adding additional information or clarification. The structure of such arguments is A=B therefore A=B, although the premise and conclusion might be formulated differently so it is not immediately apparent as such. For example, saying that therapeutic touch works because it manipulates the life force is a tautology because the definition of therapeutic touch is the alleged manipulation (without touching) of the life force.
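For contrast with the rhetorical sense, the formal sense of the term can be sketched roughly as follows (illustrative notation only): a logical tautology is true under every possible assignment of truth values, while the circular, rhetorical version simply restates its premise as its conclusion.

    \documentclass{article}
    \usepackage{amssymb}
    \begin{document}
    % A formal tautology is true under every truth assignment,
    % e.g. the law of the excluded middle:
    \[
      A \lor \lnot A
    \]
    % The rhetorical (circular) version: the conclusion merely restates
    % the premise, so the argument adds no new information.
    \[
      A \;\therefore\; A
    \]
    \end{document}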
The Fallacy Fallacy
As I mentioned near the beginning of this article, just because someone invokes an unsound argument for a conclusion, that does not necessarily mean the conclusion is false. A conclusion may happen to be true even if an argument used to support it is not sound. I may argue, for example, that Obama is a Democrat because the sky is blue – an obvious non-sequitur. But the conclusion, that Obama is a Democrat, is still true.

Related to this, and common in the comments sections of blogs, is the position that because some random person on the internet is unable to defend a position well, the position is therefore false. All that has really been demonstrated is that the one person in question cannot adequately defend their position.

This is especially relevant when the question is highly scientific, technical, or requires specialized knowledge. A non-expert likely does not have the knowledge at their fingertips to counter an elaborate, but unscientific, argument against an accepted science. “If you (a lay person) cannot explain to me,” the argument frequently goes, “exactly how this science works, then it is false.”

Rather, such questions are better handled by actual experts. And, in fact, intellectual honesty requires that at least an attempt should be made to find the best evidence and arguments for a position, articulated by those with recognized expertise, and then account for those arguments before a claim is dismissed.
The Moving Goalpost
A method of denial that arbitrarily moves the criteria for “proof” or acceptance out of range of whatever evidence currently exists. If new evidence comes to light meeting the prior criteria, the goalpost is pushed back further – keeping it out of range of the new evidence. Sometimes impossible criteria are set up at the start – moving the goalpost impossibly out of range – for the purpose of denying an undesirable conclusion.
Tu quoque
Literally, "you too." This is an attempt to justify a wrong action on the grounds that someone else also does it: "My evidence may be invalid, but so is yours."
__Skeptics Guide to Logical Fallacies


This is just a bare introduction to a few of the ways that human knowledge is filtered, processed, shaded, twisted, obscured, and misled -- just in the everyday course of events. Over at the blog Al Fin, the Next Level, they will be looking at some of these issues more closely from the standpoint of Dangerous Child training.

More: Thinking Skills for the Dangerous Child



4 Comments:

Blogger Snake Oil Baron said...

I really don't think one is allowed to distribute this sort of thing. I once tried to find out why my local public school system didn't teach informal logic/critical thinking in schools as a mandatory (or even elective) subject, and I was told that they probably do teach it in "language arts". They don't, but the lack of interest is telling.

Teaching this sort of thing makes campaigning for votes harder while making the marketing of products and ideologies more time consuming and expensive. It contributes to a diminishing of power for religions and lobbyists and, in short, threatens both the status and the egos of our slack-jawed elites of politics, academia, media and culture.

You might get fined for distributing disruptive concepts.

Tuesday, 15 January, 2013  
Blogger al fin said...

Baron, you may well be correct. Perhaps I will hire an extra guard to monitor the blog syndicate campus perimeter after hours.

As for fines and other government harassment, what do you think I pay all of these lawyers to do?

;-)

Tuesday, 15 January, 2013  
Blogger Stephen said...

There is also that favorite of the idiot masses: "Everyone knows it is so, thus it must be true!"

Wednesday, 16 January, 2013  
Blogger Seth Dunoff said...

This is ironic, but in the two-dimensional vs three-dimensional picture (http://4.bp.blogspot.com/-mc7SAQ60ax8/UPV42X_nIlI/AAAAAAAANzY/xgwXP7vltYY/s1600/Kahneman+p33_optical_illusion.jpg), the images are not actually the same size. The image on the left is not as wide. The height of the people is not the same either--the big guy at the top of the drawing on the right is taller. I am not sure what the text claims, but either way it is wrong.

Friday, 25 January, 2013  


“During times of universal deceit, telling the truth becomes a revolutionary act” _George Orwell
