Saturday, April 28, 2007

Some Fallacies, Biases, & Tendencies towards Error in Argumentation


Argumentation is integral to communication. Without argumentation, we are left with mere assertions and opinions. It isn't just that there is a lack of valid argumentation in online discussions, but a general lack of any argumentation, good or bad, valid or invalid.

When most people think of arguments, they think of the emotionally charged shouting matches that are not only not arguments in the formal sense, but are antithetical to them. Those exchanges lack the qualities that make something an argument: clear thinking, valid reasoning, and a sound conclusion reached through the application of these. Argumentation makes the reasoning involved explicit. Communication depends on argumentation in that it is the only way to communicate with someone who does not share our assumptions, schema, and methodology.

In his book A Rulebook for Arguments, Anthony Weston describes argumentation as a form of inquiry. This is central to the philosophical tradition. Argumentation can be an initial inquiry, that is, determining whether a conclusion can be supported. It can also be an inquiry into an accepted conclusion or belief, to see whether and how it may be valid, and what such a conclusion rests upon.

Argumentation is much more a form of inquiry than it is a form of persuasion. It is as a form of persuasion that it most often loses its way, because a well-constructed argument is by no means the most effective means of persuasion. Fallacies continually crop up because they are often more effective ways to persuade, even while they fail as arguments. However, a valid argument does more than persuade: it adds to knowledge and understanding. An argument is a careful construction that explicitly shows relationships between thought and data that are otherwise obscured. Thus argumentation is the way much of scholarship is done.

This brief consideration is not meant to cover argumentation or all of the fallacies and tendencies to error in argumentation. We will consider only some that seem particularly relevant to our context. Readers are directed to the works cited and other works on argumentation and critical thinking for more information.

Summaries & “Antidotes”

The following quotes are from Tools for critical thinking: Metathoughts for psychology by David Levy, which features not only logical fallacies but also tendencies toward bias and error known from psychological research. It also offers “antidotes,” active ways to resist these tendencies in our own thinking and arguments.

Differentiating Dichotomous and Continuous Variables: Dichotomous phenomena can be classified into either of two, mutually exclusive categories. Continuous phenomena, in contrast, can be placed somewhere along a particular dimension, depending on their frequency and magnitude.

Antidotes: Remember that most person-related phenomena (especially psychological constructs) lie along a continuum; thus, it is both artificial and inaccurate to group them into categories. (Levy, 1997, p. 212)

This helps us to avoid the all-too-common mistake of false dichotomies. A false dichotomy imposes a logical “exclusive or” on phenomena—only one or the other is the case, with no nuances or overlap. This is sometimes characterized as “black and white thinking.” An example is “you are either for us, or against us.” Such a position excludes a nuanced view in which someone might be in favor of some parts of an agenda, not in favor of others, and indifferent about the rest.

This view is often imposed by means of assuming the argument: the proponent of a false dichotomy assumes that anything they associate with one side of the dichotomy indicates an argument for that side and against the other. For example, someone who believes in a false dichotomy between institutions of religion and individually pursued spirituality may encounter someone who makes a statement in support of an institution of religion, and then assume that the statement or the individual opposes individually pursued spirituality. This is a position conjured out of the expectations of the reader, without any basis in the original statement itself.
The Consequence-Intentionality Fallacy: We have a propensity to assume that the effect of people's behavior reflects the intent of their behavior. However, consequences alone are not sufficient proof of intentionality. That is, we cannot determine others' intentions solely by the effects of their actions.

Antidotes: Make an active effort to consider other plausible causes or pathways of behavior in addition to the ones implied directly by its consequences; in short, consider alternative intents. (Levy, 1997, p. 217)

This fallacy shows a very common tendency to try to demonstrate intention from effect. Doing so ignores not only our own experience, but also the Rolling Stones lyric “you can't always get what you want.” In day-to-day life, producing only the consequences we intend is seemingly rare, and when writ large it is rarer still.

To go back to the previous example: the existence of instances of religious or spiritual oppression resulting from institutions of religion does not indicate an intent to do so.

An opposite application of this fallacy is the claim that, since a result was not intended as a consequence of an action, the action cannot have that result. Although obviously flawed, such a defense is by no means rare in online discourse.
The “If I feel it, it must be true” Fallacy: One's experience of emotional comfort or discomfort is not necessarily a valid gauge for differentiating what is true from what is false.

Antidotes: Do not rely on your emotions as the sole barometer for distinguishing truths from falsehoods. There may be certainty in what you are feeling, just not in what it “proves.” (Levy, 1997, p. 217)

There are many variations of this fallacy. A common one is the identification fallacy that goes “I identify with x, I do not identify with y therefore y cannot be x.” We see this as a common method of argument in the online “Gnostic scene.” In that context the argument has the explicit or underlying form of: “I am a Gnostic” that is to say, the individual identifies with Gnosticism. “I do not like/agree with/embrace _____” that is, the person does not identify with what is being considered. “Therefore, ______ is not Gnostic” or, if the person identifies with one thing and not the other they have been “proven” to be essentially different in nature.

This fallacy also reminds us not to think of Gnosis in terms of personal preferences, ego-identity, or emotional comfort. While Gnosis may affect all of these things, they are not in and of themselves indicators of validity and should never be equated with Gnosis. Gnostic scripture bids us to attain self-gnosis. The path of Gnosis is a process of transforming oneself, not a process of trying to make the world conform to oneself. It is not a path of identification, nor of psychological comfort or reassurance.
The Spectacular Explanation Fallacy: The tendency to seek extraordinary explanations for extraordinary events.

Antidotes: Keep in mind that very ordinary causes are capable of producing very extraordinary effects. Whenever you are confronted with instances of human behavior that are particularly unusual, rare, spectacular, or odd, make a deliberate effort to consider ordinary, commonplace, or mundane causes or explanations. (Levy, 1997, p. 218)

This is a tendency that often shows up around spiritual experiences. For example, instead of considering a spiritual experience to be a part of human experience, an explanation invoking the physical presence of a supernatural being, or perhaps an extraterrestrial one, is adopted. Alternatively, an extraordinary explanation involving psychopathology might be invoked to account for spiritual experience: that the person experiencing it is mentally ill. Another variation on this theme is seeking extraordinary explanations, such as extraterrestrials, for extraordinary things like Stonehenge or the Great Pyramids.
The Assimilation Bias: A schema is a mental structure that organizes our preconceptions, thereby providing a framework for understanding new events and future experiences. Accommodation means to modify our schema to fit incoming data; assimilation, in contrast, means to fit incoming data into our schema. In general, we are more prone to assimilate than to accommodate, even if this entails altering or distorting the data. Thus, assimilation can profoundly bias our perceptions of reality.

Antidotes: Don't underestimate the extent to which your prior beliefs, knowledge, and expectations (schemata) can affect your current experience, impressions, and perceptions. Try to become as aware as possible of schemata that are important to you; awareness of schemata increases your ability to modify them. Experiment with temporarily lowering or altering your “perceptual filters” or “schema-colored glasses” by attempting to understand someone else's subjective (phenomenological) experience. Learn to differentiate your use of assimilation versus accommodation, particularly when you are faced with a discrepancy between your beliefs (schemas) and the information (data). Beware of the general tendency to assimilate, rather than to accommodate. Prod yourself to accommodate when, out of habit, reflex, or just sheer laziness, you would typically be inclined to automatically assimilate. Strive toward flexibility; guard against “hardening of the categories.” (Levy, 1997, p. 220)

This is a very big issue because it concerns how we go about making sense of the world. Assumptions about this are rarely explicit, rarely even conscious, and there is a natural resistance to making them conscious so that they can be examined or stated explicitly. This is the psychological mechanism that lies behind the often overused phrase “paradigm shift.” We often focus on the accepted paradigm or schemata to the point of ignoring data. If we really seek to understand, we must be willing to change our schema when holding on to it means choosing an idea over the reality.
The Confirmation Bias: We more actively seek out information that will confirm our prior beliefs and expectations than information that might refute them.

Antidotes: Be aware that the ways in which you search for evidence, such as the questions that you ask, may lead you to arrive selectively only at those conclusions that corroborate your initial beliefs. Make it a point to seek out evidence that could, in principle, disconfirm your prior expectations. (Levy, 1997, pp. 220-221)

This is a ubiquitous tendency that not only filters the evidence we seek out, but also shapes our sources of evidence and, more broadly, the sources of data we use for understanding the world as a whole. We can see it both as a bias in active searching and as a bias in the types of information we are exposed to through various channels. One of the reasons for the extreme political polarization in the US is the many sources of information that are “pre-biased,” in that they present only information that confirms their intended audience's beliefs and expectations. The recent influence of the neo-conservative movement grew out of plans that take advantage of this bias: think-tanks were funded to provide content, and various channels of information were turned into soapboxes for this viewpoint and for information and opinion that confirmed it. This did the disservice of effectively undermining serious consideration of disconfirming and critical information.

This bias takes us back to the consideration of Gnosis as transformative, as it is another tendency to avoid data and information that may lead to or require a transformation, if only of our beliefs.
The Belief Perseverance Effect: We have a tendency to stubbornly cling to our beliefs, sometimes even in the face of disconfirming evidence. This is especially likely to occur when we feel personally invested in our beliefs. Thus, when these beliefs are challenged, we feel impelled to protect them, almost as if we were protecting ourselves. One consequence of this phenomenon is that it generally requires much more compelling evidence to change our beliefs than it did to create them in the first place.

Antidotes: Keep an open mind to different, and especially challenging, points of view. Remind yourself (and others as well) to think carefully about how you evaluate evidence and to monitor your biases closely as you formulate your conclusions. Make it a point to actively counterargue your preexisting beliefs. That is, ask yourself directly in what ways your beliefs might be wrong. When faced with a discrepancy between your beliefs and the facts, resist the natural tendency to assume that your beliefs are right and the facts must somehow be wrong. (Levy, 1997, p. 221)

A vast preponderance of evidence against a belief or position is rarely enough to provoke serious reconsideration by those who hold it. The reason is often identity: we identify with a belief or position and thereafter defend it as if it were ourselves. This can also be exploited strategically as a type of fallacy, by pretending that a belief or position is established as part of a communal identity. We see this in accepting the statements of authority as if we had been convinced by them as arguments. So we need to be careful not only about the beliefs or positions we may hold individually, but also about those we may hold by virtue of our identification with, or participation in, a communal identity.

A further technique for resisting this bias is to avoid basing our identity on belonging to a group and participating in its communal identity. Even basing one's identity on being an independent or non-conformist evokes this bias.

Also bear in mind that this effect involves more than consciously adopted or considered beliefs. We can see it in the strong influence of cradle creeds on subsequent religious considerations. This is the effect of one's original religious upbringing or culture on both the contents of religious beliefs and beliefs about religion. These survive in many respects regardless of whether or not the original religion was rejected—shaping the way any religion is approached.

The most compelling and frequent evidence one is likely to encounter is the holding of positions critical of religious groups that were rivals of the cradle creed, or of religious aspects rejected by it. One example is someone opposed to official clergy in a neo-pagan setting, not realizing that this stems from the rejection of official clergy in their Quaker upbringing. Another is someone rejecting the use of statues and incense in a Buddhist temple, not realizing that this stems from a Protestant rejection of such things as associated with Roman Catholic practice in another culture and religion altogether, remaining as a personal belief from childhood. One way to recognize when this is happening is when the reasons given for the beliefs are the same as those of the cradle creed. If the belief were the result of independent consideration, why would it have such ties to the reasoning of another religious tradition?

Remember that the “truth” of a belief is only one reason for holding it, and is often less compelling than other reasons for maintaining it. Be careful about what you choose to believe and how you choose to believe it. A provisional or pragmatic attitude toward knowledge can reduce the “costs” of maintaining a belief: rather than being “true” in some absolute sense, a belief may be “the best account” of the situation and therefore subject to improvement.

Also remember the limitations of context on constructing knowledge. A “truth” in one area does not make a universal “Truth,” for example.


Hopefully, these brief considerations show not only the importance of critical thinking and sound argumentation, but also their real power of emancipation. Their real purpose isn't to “win” arguments or to avoid appearing intellectually lazy; it is to serve as a tool for liberation. To avoid fallacies and counteract biases isn't just to have good form, it is to be free from their traps. Gnostic texts speak of the Archons, Greek for “the powers,” who keep us ignorant and in their service. What the texts meant by this was much broader, but we can also see it in this context. If we do not set ourselves free from outworn beliefs, from schemata that blind us, from what defends us against change and transformation, and from other errors of thought and judgment, then we are under their power and more easily kept ignorant.

We may attempt not to be persuaded by the beliefs and arguments of others, but even then we are still under the influence of our own. Freeing our thinking minds is aided by arming ourselves with tools such as those outlined above, and by consciously considering our schemata (paradigms or frameworks for understanding) and our beliefs. Thinking is only a part of who we are, and for some of us it is more central to the living of our lives than for others. Thinking is one of the four functions identified by Jung; it is one way to assess and to judge. So, if we are among those who are more guided by it, we must be all the more diligent in obtaining what freedom we can for it.

Levy, D. A. (1997). Tools for critical thinking: Metathoughts for psychology. Long Grove, IL: Waveland Press.

Weston, A. (2000). A rulebook for arguments (3rd ed.). Indianapolis: Hackett Publishing.
Highly recommended. An affordable, well-written, and concise book that covers the construction and forms of arguments, as well as logical fallacies.
