Schick and Vaughn Chapter 6


Evidence and Inference

People often draw conclusions (make inferences) from evidence. In other words, they construct arguments or explanations. This chapter describes some typical mistakes people make when they draw conclusions from evidence. Some mistakes are logical errors. Interestingly, some logical errors result from ordinary (and often beneficial) psychological tendencies.

Intellectual Inertia: concluding that obviously disconfirming evidence isn’t really disconfirming at all. People are essentially conservative about adopting new beliefs if the old ones work reasonably well. We tend to ignore or misinterpret evidence that conflicts with our current views. This is an ordinary psychological tendency, and its evolutionary usefulness is easy to understand. If it’s not broken, don’t fix it. So if the sun always comes back after we shoot flaming arrows into the sky to re-ignite it following an eclipse, why should we stop shooting the arrows? Who wants to risk stopping, just to find out whether the arrows are really needed?

Confirmation Bias: looking for confirming evidence, and failing to look for disconfirming evidence. Again, this tendency’s selective advantage is clear. Primitive humans had to make split-second life-and-death decisions about what was so. (Is there a lion behind that bush?) So they developed an instinct to look first for confirming cues and do a complete investigation later (if at all). But in our day we usually have the luxuries of comparative leisure and safety; we can afford to slow down. Scientists do slow down. Most people still don’t.

RULE: “When evaluating a claim, look for disconfirming as well as confirming evidence.” (137)

We should be particularly careful about confirmation bias when the claim being “confirmed” is vague. Vagueness and confirmation bias combine to produce especially potent and psychologically compelling effects: the “weasel” claim that can accommodate numerous interpretations, or, even worse, the claim compatible with every possible state of affairs.

The Logic of Confirmation: confirming evidence does not give conclusive proof, but disconfirming evidence does give good grounds for rejecting a claim. Note the usefulness of modus ponens (MP) and modus tollens (MT), and the siren song of affirming the consequent (AC) and denying the antecedent (DA): two-thirds of people not trained in logic think these invalid forms are valid.
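To see why, suppose a hypothesis H predicts an observation O. Inferring H from the occurrence of O is affirming the consequent (mere confirmation, not proof), while inferring not-H from the failure of O is modus tollens (genuine disconfirmation). Here are the four forms written out schematically, as a brief LaTeX sketch (P and Q are placeholder statements, not examples from the text; the amsmath and amssymb packages are assumed):

% Schematic forms only; P and Q stand for any statements (assumes amsmath and amssymb).
\[
\begin{array}{ll}
\text{MP (modus ponens, valid):}               & P \rightarrow Q,\;\; P \;\;\therefore\;\; Q \\
\text{MT (modus tollens, valid):}              & P \rightarrow Q,\;\; \lnot Q \;\;\therefore\;\; \lnot P \\
\text{AC (affirming the consequent, invalid):} & P \rightarrow Q,\;\; Q \;\;\therefore\;\; P \\
\text{DA (denying the antecedent, invalid):}   & P \rightarrow Q,\;\; \lnot P \;\;\therefore\;\; \lnot Q \\
\end{array}
\]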

HINT: to avoid confirmation bias, try to keep more than one explanatory hypothesis in mind.

The Availability Error: basing conclusions on evidence that’s vivid and memorable, dramatic, emotionally charged, easy to visualize, etc. Such evidence might be psychologically compelling, but it’s not necessarily logically compelling. Paradigm fallacies that illustrate this error include hasty generalization and dubious analogy. Again, the naturalness of these fallacies is clear: it makes sense to avoid the water hole where you once saw a lion. It’s wise to suspect all grown-ups are monsters if your parents are monsters.

The Availability Error and Superstition

People often suppose a cause-and-effect relation between two logically unrelated events. Superstitions result from these fallacies of false cause, very often because the events being related are obvious and memorable or unusual. Think of the guru character in the film All of Me, unfamiliar with both phones and flush toilets, who comes to believe he can make the phone ring by flushing the toilet.

The Argument from Unnecessary Restrictions

The data that supposedly confirm a claim should not be unnecessarily restricted. For example, if Uri Geller claims he can bend metal with his mind alone, why does he provide only evidence of spoon-bending and key-bending? Why not pennies, or crowbars? Why does he have to first touch the objects he is going to bend?

Other examples: if there really are people who can regularly communicate with the dead or read minds, why don’t historians and detectives regularly consult them? Why don’t scientists eager to win the Nobel Prize flock to these people to figure out how they do these amazing things? If claims about communicating with the dead or mind-reading were true and verifiable, then surely someone would already be on to scientifically HUGE discoveries, plus fame, glory, riches, etc.

If there really are people who can predict the future, why don’t they just go get rich off the stock market? Why don’t they advise movie studios about what films will be hits? Why aren’t venture capitalists lining up to fund money-making enterprises based on these unique abilities? Surely some people would pay a lot to know in advance who they’re going to marry or when they’re going to die.

If aliens are regularly abducting earthlings and probing their minds for information, why do they pick exactly the people who seem least likely to furnish information (e.g., rural, uneducated people with emotional problems)? Why don’t the aliens probe the minds of Nobel Prize winners or great artists, for example?

The Representativeness Heuristic

We tend to employ certain rules of thumb for making sense of our experience. Such rules are called heuristics. One very common heuristic is “like goes with like.” This is called the representativeness heuristic. It forms the basis for argument by analogy, in which we predict unknown similarities on the basis of known similarities. Astrology makes use of the representativeness heuristic: e.g., people born under the sign of the bull are thought to be aggressive and dominant, people born under the sign of the twins are thought to be two-faced, and people born under the sign of the virgin are supposed to be modest. The heuristic has also inspired odd medical treatments: foxes’ lungs for people with breathing problems, ground-up bats for people with vision problems. (You are what you eat.)

The representativeness heuristic also makes us tend to think that all members of a class should resemble the paradigm, which leads to the conjunction fallacy (described on p. 145): judging a conjunction to be more probable than one of its own conjuncts, because the conjunction better fits the paradigm. And it makes us tend to think that effects should resemble their causes (e.g., the belief in the Middle Ages that sperm consisted of lots of small people, or that drinking red wine made you strong because red is the color of blood).
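The probability fact behind the conjunction fallacy, sketched in LaTeX below (A and B are placeholder events, not examples from the text): a conjunction can never be more probable than either of its conjuncts, no matter how representative the added detail makes the combination seem.

% For any events A and B, the conjunction is at most as probable as each conjunct.
\[
P(A \wedge B) \;\le\; P(A)
\qquad\mbox{and}\qquad
P(A \wedge B) \;\le\; P(B)
\]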


 

 

