Friday, September 29, 2006

self-deception - The Skeptic's Dictionary

This is fun!

Sailom

http://skepdic.com/selfdeception.html

Ninety-four percent of university professors think they are better at their jobs than their colleagues.

Twenty-five percent of college students believe they are in the top 1% in terms of their ability to get along with others.

Seventy percent of college students think they are above average in leadership ability. Only two percent think they are below average.
--Thomas Gilovich, How We Know What Isn't So


Eighty-five percent of medical students think it is improper for politicians to accept gifts from lobbyists. Only 46 percent think it's improper for physicians to accept gifts from drug companies.
--Dr. Ashley Wazana, JAMA, Vol. 283, No. 3, January 19, 2000


A Princeton University research team asked people to estimate how susceptible they and "the average person" were to a long list of judgmental biases; the majority of people claimed to be less biased than the majority of people.

A 2001 study of medical residents found that 84 percent thought that their colleagues were influenced by gifts from pharmaceutical companies, but only 16 percent thought that they were similarly influenced. --Daniel Gilbert, "I'm OK; you're biased"


People tend to hold overly favorable views of their abilities in many social and intellectual domains....This overestimation occurs, in part, because people who are unskilled in these domains suffer a dual burden: Not only do these people reach erroneous conclusions and make unfortunate choices, but their incompetence robs them of the metacognitive ability to realize it.
--"Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments," by Justin Kruger and David Dunning Department of Psychology Cornell University, Journal of Personality and Social Psychology December 1999 Vol. 77, No. 6, 1121-1134.

Our capacity for self-deception has no known limits. -- Michael Novak

Self-deception is the process or fact of misleading ourselves to accept as true or valid what is false or invalid. Self-deception, in short, is a way we justify false beliefs to ourselves.

When philosophers and psychologists discuss self-deception, they usually focus on unconscious motivations and intentions. They also usually consider self-deception a bad thing, something to guard against. To explain how self-deception works, they focus on self-interest, prejudice, desire, insecurity, and other psychological factors that unconsciously and negatively affect the will to believe. A common example is that of a parent who believes his child is telling the truth even though the objective evidence strongly supports the claim that the child is lying. The parent, it is said, deceives him or herself into believing the child because the parent desires that the child be truthful. A belief so motivated is usually considered more flawed than one due to a lack of ability to evaluate evidence properly. The former is considered a kind of moral flaw, a kind of dishonesty, and irrational. The latter is considered a matter of fate: some people are simply not gifted enough to make proper inferences from the data of perception and experience.

However, it is possible that the parent in the above example believes the child because he or she has intimate and extensive experience with the child but not with the child's accusers. The parent may be unaffected by unconscious desires and may be reasoning on the basis of what he or she knows about the child but does not know about the others involved. The parent may have very good reasons for trusting the child and not trusting the accusers. In short, an apparent act of self-deception may be explicable in purely cognitive terms, without any reference to unconscious motivations or irrationality. The self-deception may be neither a moral nor an intellectual flaw. It may be the inevitable existential outcome for a basically honest and intelligent person who knows his or her child extremely well, knows that things are not always as they appear to be, has little or no knowledge of the child's accusers, and thus has insufficient reason for doubting the child. An independent party might examine the situation and agree that the evidence that the child is lying is overwhelming; yet if the parent turned out to be wrong, we would say the parent was mistaken, not self-deceived. We consider the parent self-deceived only when we assume that he or she is not simply mistaken but is being irrational. How can we be sure?

A more interesting case would be one where (1) a parent has good reason to believe that his or her child is likely to tell the truth in any given situation, (2) the objective evidence points to innocence, (3) the parent has no reason to especially trust the child's accusers, but (4) the parent believes the child's accusers anyway. Such a case is so defined as to be practically impossible to explain without assuming some sort of unconscious and irrational motivation (or brain disorder) on the part of the parent. However, if cognitive incompetence is allowed as an explanation for apparently irrational beliefs, then appeals to unconscious psychological mechanisms are not necessary even in this case.

Fortunately, it is not necessary to know whether self-deception is due to unconscious motivations or not in order to know that there are certain situations where self-deception is so common that we must systematically take steps to avoid it. Such is the case with belief in paranormal or occult phenomena such as ESP, prophetic dreams, dowsing, therapeutic touch, facilitated communication, and a host of other topics taken up in the Skeptic's Dictionary.

In How We Know What Isn't So, Thomas Gilovich describes the details of many studies which make it clear that we must be on guard against the tendencies to

  1. misperceive random data and see patterns where there are none (a short simulation after this list shows how common such "patterns" are);
  2. misinterpret incomplete or unrepresentative data and give extra attention to confirmatory data while drawing conclusions without attending to or seeking out disconfirmatory data;
  3. make biased evaluations of ambiguous or inconsistent data, tending to be uncritical of supportive data and very critical of unsupportive data.
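
The first tendency is easy to demonstrate for oneself. The Python sketch below (my illustration, not anything from Gilovich's book) simulates sequences of fair coin flips and counts how often a sequence contains a streak of six or more identical results. Long streaks turn out to be the norm in purely random data, yet observers readily read them as hot hands, cycles, or hidden causes.

    import random

    def longest_run(flips):
        """Return the length of the longest streak of identical outcomes."""
        best = current = 1
        for prev, nxt in zip(flips, flips[1:]):
            current = current + 1 if nxt == prev else 1
            best = max(best, current)
        return best

    random.seed(42)  # fixed seed so the demo is repeatable

    TRIALS = 10_000
    FLIPS = 100
    streaky = sum(
        longest_run([random.choice("HT") for _ in range(FLIPS)]) >= 6
        for _ in range(TRIALS)
    )

    print(f"Sequences of {FLIPS} fair flips containing a streak of 6+: "
          f"{streaky / TRIALS:.0%}")
    # Typically around 80%: long streaks are simply what randomness
    # looks like, even though nothing but chance is at work.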

It is because of these tendencies that scientists require clearly defined, controlled, double-blind, randomized, repeatable, publicly presented studies. Otherwise, we run a great risk of deceiving ourselves and believing things that are not true. It is also because of these tendencies that in trying to establish beliefs non-scientists ought to try to imitate science whenever possible. In fact, scientists must keep reminding themselves of these tendencies and guard against pathological science.
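
As a rough illustration of what such a protocol buys us, here is a toy version (invented numbers throughout) of the kind of randomized, pre-specified test skeptics run on claims like dowsing. The trial count, the randomization, and the success criterion are all fixed before any data arrive:

    import random

    random.seed(7)  # repeatable demo

    N_TRIALS = 100  # number of trials, fixed in advance
    N_SPOTS = 10    # water hidden under one of ten spots per trial

    hits = 0
    for _ in range(N_TRIALS):
        target = random.randrange(N_SPOTS)  # placed by a blinded third party
        guess = random.randrange(N_SPOTS)   # a dowser performing at chance
        hits += (guess == target)

    print(f"Hits: {hits} of {N_TRIALS}; chance predicts about "
          f"{N_TRIALS // N_SPOTS}.")
    # With the protocol fixed beforehand, a chance-level performer
    # cannot be dressed up as a success by reinterpreting misses
    # or by stopping the test at a lucky moment.

Because nothing about the test can be renegotiated once it starts, the result speaks for itself.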

Many people believe, however, that as long as they guard themselves against wishful thinking they are unlikely to deceive themselves. Actually, if one believes that all one must be on guard against is wishful thinking, then one may be more rather than less liable to self-deception. For example, many intelligent people have invested in numerous fraudulent products that promised to save money, the environment, or the world, not because they were guilty of wishful thinking but because they weren't. Since they were not guilty of wishful thinking, they felt assured that they were correct in defending their product. They could easily see the flaws in critical comments. They were adept at finding every weakness in opponents. They were sometimes brilliant in defense of their useless devices.

Their errors were cognitive, not emotional. They misinterpreted data. They gave full attention to confirmatory data, but were unaware of or oblivious to disconfirmatory data. They sometimes were not aware that the way in which they were selecting data made it impossible for contrary data to have a chance to occur. They were adept at interpreting data favorably when either the goal or the data itself was ambiguous or vague. They were sometimes brilliant in arguing away inconsistent data with ad hoc hypotheses. Yet, had they taken the time to design a clear test with proper controls, they could have saved themselves a great deal of money and embarrassment.

The defenders of the DKL LifeGuard and the many defenders of perpetual motion machines and free energy devices are not necessarily driven by the desire to believe in their magical devices. They may simply be the victims of quite ordinary cognitive obstacles to critical thinking. Likewise for all those nurses who believe in therapeutic touch and those defenders of facilitated communication, ESP, astrology, biorhythms, crystal power, dowsing, and a host of other notions that seem to have been clearly refuted by the scientific evidence. In short, self-deception is not necessarily a weakness of will, but may be a matter of ignorance, laziness, or cognitive incompetence.
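
A toy simulation, with invented numbers, of the selection effect described above: if every success is written down but most failures are explained away and never logged, the record flatters a device that in fact works only at chance.

    import random

    random.seed(3)  # repeatable demo

    TRUE_RATE = 0.10  # the device actually succeeds 10% of the time
    N = 1000

    logged_hits = logged_trials = 0
    for _ in range(N):
        success = random.random() < TRUE_RATE
        # Biased bookkeeping: successes are always recorded, while
        # failures are logged only one time in five (the rest get
        # blamed on bad conditions or interference).
        if success or random.random() < 0.2:
            logged_trials += 1
            logged_hits += success

    print(f"True success rate: {TRUE_RATE:.0%}")
    print(f"Success rate in the notebook: {logged_hits / logged_trials:.0%}")
    # Roughly 36% on paper for a 10% process: the contrary data
    # never had a fair chance to appear in the record.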

On the other hand, self-deception may not always be a flaw and may even be beneficial at times. If we were too brutally honest and objective about our own abilities and about life in general, we might become debilitated by depression.

See also ad hoc hypothesis, cold reading, communal reinforcement, confirmation bias, control study, Occam's razor, pathological science, placebo effect, post hoc fallacy, selective thinking, subjective validation, testimonials, and wishful thinking.

further reading

Dupuy, Jean-Pierre. Editor. Self-Deception and Paradoxes of Rationality (Cambridge University Press, 1998).

Fingarette, Herbert. Self-Deception (University of California Press, 2000).

Gilovich, Thomas. How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life (New York: The Free Press, 1993).

Kahane, Howard. Logic and Contemporary Rhetoric: The Use of Reason in Everyday Life, 8th edition (Wadsworth, 1997).

Kruger, Justin and David Dunning. "Unskilled and Unaware of It: How Difficulties in Recognizing One's Own Incompetence Lead to Inflated Self-Assessments," Journal of Personality and Social Psychology, December 1999, Vol. 77, No. 6, pp. 1121-1134.

McLaughlin, Brian P. and Amélie Oksenberg Rorty. Editors. Perspectives on Self-Deception (University of California Press, 1988).

Mele, Alfred R. Self-Deception Unmasked (Princeton University Press, 2001).

Taylor, Shelley E. Positive Illusions: Creative Self-Deception and the Healthy Mind (New York: Basic Books, 1989).

Wiseman, Richard. Deception & Self-Deception: Investigating Psychics (Prometheus, 1997).
