


BARRIERS AND BLOCKS to Critical Thinking

By Carla Thomson
Professor of Critical Reading, Palomar College


Blocks to Critical Thinking *

Blocks to critical thinking impede us from arriving at a reasonable basis for belief. They are obstacles that we must not only be aware of but work zealously to avoid following blindly. Otherwise they will thwart our efforts to become more effective thinkers. There are far too many such blocks to give a complete list. Instead we’ll focus on the primary ones, which include cultural conditioning, reliance on authority, hasty moral judgment, black and white thinking, labels, resistance to change, and frame of reference. Although these categories overlap, they are sufficiently different from one another to warrant separate coverage.

Cultural Conditioning and Egocentric Thinking

Cultural conditioning refers to the process by which society’s attitudes and values are passed on to its members. Although the precise nature of cultural conditioning, including how it operates and what lasting effects it has, is uncertain, cultures unquestionably condition their members. In other words, you and I are to a large degree the products of the particular place and time in which we were raised. We have been significantly shaped by the customary beliefs, social forms, and material traits of that setting. The assumptions derived from this conditioning are so embedded in our view of things that we probably aren’t even aware of most of them.

For example, most Americans “believe in” individuality. Individuality is a value toward which they have a favorable attitude. Likewise, many of us still believe in “the American dream”: Through sheer hard work, we can become just about anything we want to become. And many Americans remain firmly committed to the idea of “progress” as measured by such things as an increasing gross national product, a higher standard of living, and a growing population. The list of cultural assumptions that we’ve inherited could be extended much further.



* Except as otherwise indicated, the following information is adapted from:
The Ninth Annual & Seventh International Conference on Critical Thinking and Educational Reform, Invitation to Critical Thinking, by Vincent E. Barry

The list goes on and on. In each case it is not so much observed specifics that lead us to an assumption as accepted beliefs regarding a particular aspect of culture: social behavior, sexual roles, politics, religion, economics, and so on.

What’s the connection between cultural assumptions and critical thinking? Very simply, blindly accepting such assumptions leads to selective perception --- to seeing only what we want to see. But critical thinking requires objectivity, a dispassionate and impartial examination of the evidence that confirms or disconfirms a position. When I think critically, I must try to put aside my preconceptions and biases about an issue. I must attempt to keep an open mind. While none of us can ever be completely objective, we can be impartial enough to allow our views to conform to the evidence instead of making the evidence conform to our views. There may be good reasons for holding any cultural assumption. But when we leave the assumption unexamined and allow it to bias our thinking, we violate a prerequisite of critical thinking: objectivity. 

To see how cultural conditioning can block critical thinking, consider this simple statement:  “Democracy is the best form of government because it allows people to participate in their governance.”  Probably not many Americans would question this statement because it’s based on our cultural assumptions that allowing people to participate in their governance is good and desirable. “But,” you may be thinking, “Isn’t it? Doesn’t everyone know that?”  No.  Communists don’t believe that such a system of government is good and desirable.  Neither do Fascists.  Nor did Plato or Aristotle, to name just two of Western civilization’s preeminent thinkers.  If you’re serious about critical thinking, then you must at some point examine such an assumption.  What’s to be said for it, what against it?  Only then can you start thinking critically about the statement made above.

The Problem of Egocentric Thinking
From Critical Thinking: Concepts and Tools, by Richard Paul and Linda Elder (2001), Foundation for Critical Thinking, www.criticalthinking.org

Egocentric thinking comes from the unfortunate fact that humans do not naturally consider the rights and needs of others, nor do we naturally appreciate the point of view of others or the limitations of our own point of view. We become explicitly aware of our egocentric thinking only if trained to do so. We do not naturally recognize our egocentric assumptions, the egocentric way we use information, the egocentric way we interpret data, the source of our egocentric concepts and ideas, the implications of our egocentric thought. We do not naturally recognize our self-serving perspective.

As humans we live with the unrealistic but confident sense that we have fundamentally figured out the way things actually are, and that we have done this objectively. We naturally believe in our intuitive perceptions—however inaccurate. Instead of using intellectual standards in thinking, we often use self-centered psychological (rather than intellectual) standards to determine what to believe and what to reject. Here are the most commonly used psychological standards in human thinking.

"IT'S TRUE BECAUSE I BELIEVE IT." Innate egocentrism: I assume that what I believe is true even though I have never questioned the basis for many of my beliefs.

"IT'S TRUE BECAUSE WE BELIEVE IT." Innate sociocentrism: I assume that the dominant beliefs within the groups to which I belong are true even though I have never questioned the basis for many of these beliefs.

"IT'S TRUE BECAUSE I WANT TO BELIEVE IT." Innate wish fulfillment: I believe in, for example, accounts of behavior that put me (or the groups to which I belong) in a positive rather than a negative light even though I have not seriously considered the evidence for the more negative account. I believe what "feels good," what supports my other beliefs, what does not require me to change my thinking in any significant way, what does not require me to admit that I have been wrong.

"IT'S TRUE BECAUSE I HAVE ALWAYS BELIEVED IT." Innate self-validation: I have a strong desire to maintain beliefs that I have long held, even though I have not seriously considered the extent to which those beliefs are justified, given the evidence.

"IT'S TRUE BECAUSE IT IS IN MY SELFISH INTEREST TO BELIEVE IT." Innate selfishness: I hold fast to beliefs that justify my getting more power, money, or personal advantage even though these beliefs are not grounded in sound reasoning or evidence.

Because humans are naturally prone to assess thinking in keeping with the above criteria, it is not surprising that we, as a species, have not developed a significant interest in establishing and teaching legitimate intellectual standards. It is not surprising that our thinking is often flawed. We are truly the "self-deceived animal."

Reliance on Authority

Authority is an expert outside ourselves.  The expert can be a single individual (a parent, a teacher, a celebrity, a clergy member, the President), a group of individuals (doctors, educators, a peer group, a national consensus), or even an institution (a religion, a government agency, an educational establishment).  Whatever its form, authority is a common source of belief and knowledge.

Just think about everything you claim to know that is based on authority. Examples might include facts and opinions about world history, the state of your health, the direction of the economy, the events of the day, the existence of God and an afterlife --- the list seems endless, the topics unbounded.  In fact, without relying on authority, we would know very little of what we ordinarily take for granted.

But there’s a danger.  We can so rely on authority that we stop thinking for ourselves.  Puzzled about something, we might invoke some authority to decide the answer for us.  When dealing with a controversial issue, we might find out what the majority thinks and, looking no further, adopt the same position.  Following authority blindly is a block to critical thinking as well as an evasion of autonomy.

To get some idea of how influential authority can be, consider a series of experiments conducted by psychologist Stanley Milgram in the early 1960s.  You may know that Milgram’s experiment consisted of asking subjects to administer strong electrical shocks to people whom the subjects couldn’t see. The subjects supposedly could control the shock’s intensity by means of a shock generator with thirty clearly marked voltages, ranging from 15 to 450 volts and labeled from “Slight Shock (15)” to “XXX --- Danger: Severe Shock (450).”

Before you think that psychologist Milgram was sadistic, I should point out that the entire experiment was a setup.  No one was actually administering or receiving shocks.  The subjects were led to believe that the “victims” were being shocked as part of an experiment to determine the effects of punishment on memory.  The “victims,” who were in fact confederates of the experimenters, were strapped in their seats with electrodes attached to their wrists “to avoid blistering and burning.”  They were told to make no noise until a “300-volt shock” was administered, at which point they were to make noise loud enough for the subjects to hear (for example, pounding on the walls as if in pain).  The subjects were reassured that the shocks, though extremely painful, would cause no permanent tissue injury.

When asked, a number of psychologists predicted that no more than 10 percent would heed the request to administer a 450-volt shock.  In fact, well over half did --- twenty-six out of forty.  Even after hearing the “victims” pounding, 87.5 percent of the subjects (35 out of 40) applied more voltage. The conclusion seems unmistakable: A significant number of people, when asked by legitimate authority, will hurt others.
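As a quick arithmetic check, the percentages just quoted follow directly from the counts reported; only the figures 26, 35, and 40 come from the text, and the variable names are my own:

```python
# Counts quoted in the passage above for Milgram's obedience experiment.
obeyed_to_maximum = 26         # subjects who administered the full 450-volt shock
continued_after_pounding = 35  # subjects who applied more voltage after the pounding
total_subjects = 40

print(f"{obeyed_to_maximum / total_subjects:.1%} gave the maximum shock")         # 65.0%
print(f"{continued_after_pounding / total_subjects:.1%} continued past protests") # 87.5%
```

So "well over half" is 65 percent, and 35 of 40 is indeed the 87.5 percent the text cites.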

Authority not only influences the behavior of people, it also affects their judgment, perhaps even more so.  For example, consider three lines of clearly different lengths, labeled A, B, and C, and a separate standard line.  [Figures not reproduced.]

Which of the three matches the standard line?

Undoubtedly, B.  Do you think you could ever be persuaded to choose A or C?  Maybe not, but experiments indicate that some individuals can be persuaded to alter their judgments, even when their judgments are obviously correct.  These experiments, conducted by psychologist Solomon Asch, involved several hundred individuals who were asked to match lines just as you did.  In each group, however, one and only one subject was naïve --- that is, unaware of the nature of the experiment.  The others were confederates of the experimenter, who had instructed them to make incorrect judgments in about two-thirds of the cases and to pressure the dissenting naïve subject to alter his or her correct judgment.

The results:  When subjects were not exposed to pressure, they invariably judged correctly.  But when the confederates pressured them, the naïve subjects generally changed their responses to conform with the unanimous majority judgments.  When one confederate always gave the correct answers, naïve subjects maintained their positions three-fourths of the time.  But when the honest confederate switched to the majority view in later trials, the errors made by naïve subjects rose to about the same level as that of subjects who stood alone against a unanimous majority.

Make no mistake about it.  Authority cows us.  We are impressed, influenced, and intimidated by authority --- so much so that, under the right conditions, we might even shuck our own values, beliefs, and judgments.

Certainly none of this is intended to imply that authority is never a legitimate source of opinion and belief.  When (1) the authority asserting a given view is indeed an expert in the field, (2) authorities are generally agreed on that view, and (3) I can, at least in theory, find out myself whether the view is valid, then I have solid grounds for relying on authority.  But even then I must realize that the authority gets its force from the weight of evidence on which the view it asserts is based.

Authority plays such an influential role in our thinking that I will say more about it later in our study.  Here it’s enough to note that a slavish reliance on authority is foreign to critical thinking.  When we make up our minds and base our actions not on the weight of evidence but on the say-so of someone influential, we are mere lackeys, not critical thinkers.

Hasty Moral Judgment

A moral judgment is an evaluation of someone or something as good or bad, right or wrong.  We make moral judgments all the time, and these are largely influenced by cultural conditioning.  We term the child abuser an “evil person” and child abuse “heinous” or “reprehensible.”  We denounce naked aggression and welcome efforts to rebuff it.  We deplore the liar and admire the truth teller.  We take positions on issues such as abortion, capital punishment, and pornography that reflect approval or disapproval.  “Abortion should [or should not] be legalized.”  “Capital punishment is [or is not] the most effective way of deterring certain serious crimes.”  “Pornography should [or should not] be carefully controlled.”  Value judgments all.

Often we make such judgments hastily.  For example, we judge people on the basis of their looks, background, or associations.  We base such judgments not on careful consideration of factual evidence, but on emotion, prejudice, preconception, intolerance, or self-righteousness.  Because hasty moral judgments are essentially nonrational --- that is, unreasoned --- they blunt the goals of critical thinking: insight and understanding.

Unquestionably, hasty moral judgments are deeply ingrained blocks to thinking critically, for the values upon which they are based typically are imparted very early in our lives, well before we can seriously examine them and the belief systems they spawn.  Thus, before we’re mature enough to think intelligently about things like religion, race, sex, and politics, we have been exposed to the values of parents, teachers, clergy members, and others who participate in our moral development.  As a result we tend to acknowledge what supports our acquired moral value systems and to ignore or dismiss what doesn’t.  We wall ourselves off from disclosure contrary to our preconceptions, thereby purchasing security at the price of insight and understanding.

This doesn’t at all mean you shouldn’t have strong moral beliefs.  But there’s a big difference between moral convictions that precede deliberation and those that follow it.  The former are hasty; the latter, well considered.  Presumably, you want your moral beliefs to be as carefully thought out as any other kind --- economic, political, educational, and social.  As we saw earlier, a primary goal of critical thinking is the formulation of justified belief.  Since hasty moral judgments are not such beliefs, they thwart critical thinking.

Black and White Thinking

Black and white thinking refers to the tendency to place things in either/or categories, ignoring the complexity of an issue.  Here are some expressions of black and white thinking:

You’re either for me or against me.
A person is either 100 percent American or he/she isn’t.
Most people are either honest or crooked.
There is only one right way to do anything.
America --- love it or leave it.
Guns don’t kill people, people do.
There are two kinds of girls: those that do it and those that don’t.

Several factors account for black and white thinking. One is the human desire for certainty. If we see things in mutually exclusive categories, we stand a far better chance of attaining certainty than if we consider the complexities of an issue. Black and white thinking also results from the pervasive tendency to confuse negatives and opposites. When two things are genuine negatives, they exclude any middle ground. For example, “cold” and “not cold,” “red” and “not red,” and “conservative” and “nonconservative” are negatives. They exclude any gradations between their extremes. If the room is cold, it cannot be not cold; if the cloth is red, it cannot be not red; if the politician is conservative on an issue, he or she cannot be nonconservative on the same issue.

But negatives are not opposites, which allow for a number of gradations. “Cold” and “hot,” “black” and “white,” and “conservative” and “liberal” are opposites. There is plenty of middle ground between each term and its opposite. With negatives one of the two extremes must be true and the other false, but with opposites both extremes may be false. Thus the room may be neither hot nor cold, but pleasantly warm. The cloth may be neither black nor white, but green. The politician may be neither conservative nor liberal, but moderate.
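The logical difference between negatives and opposites can be made concrete with a short sketch; the temperature thresholds below are illustrative assumptions of mine, not part of the text:

```python
room = 68  # a pleasantly warm room, in degrees Fahrenheit

def is_cold(t):
    # Illustrative threshold: below 50 F counts as "cold"
    return t < 50

def is_hot(t):
    # Illustrative threshold: above 80 F counts as "hot"
    return t > 80

# Negatives ("cold" / "not cold"): one of the pair must hold -- no middle ground.
assert is_cold(room) or not is_cold(room)  # true at any temperature

# Opposites ("cold" / "hot"): both extremes can be false at once.
assert not is_cold(room) and not is_hot(room)
print("68 F is 'not cold' without being 'hot': opposites leave middle ground.")
```

The first assertion can never fail, no matter the temperature; the second fails only at the extremes, which is exactly the middle ground that black and white thinking ignores.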

Failing to perceive the difference between negatives and opposites, people apply a false logic to such concepts as democracy and justice, such states of mind as love and hate, or to human institutions, issues, and behavior. They may falsely believe that if they feel anger or hate toward someone, they cannot at the same time love that person. Or they may believe that if individuals are to be held accountable for their actions, then such actions cannot at the same time be regarded as the outcome of social or environmental forces. Or they may hold that if welfare is abused by some recipients, it can’t at the same time be unabused by other recipients. Such wrong-headedness can produce equally illogical proposals. For example: Regarding welfare fraud, either weed out all cheaters or abolish the system. In order to ensure world peace, either arm to the teeth or disarm. To deal with the problem of abortion, either disallow it or allow it on demand.

Still another explanation for the prevalence of black and white thinking can be found in the nature of our language. English is replete with polar opposites: “right/wrong,” “good/bad,” “happy/sad,” “conservative/liberal,” “love/hate,” “smart/dumb,” “sane/insane,” “drunk/sober,” “true/false.” Some semanticists believe that the availability of such opposites encourages black and white thinking; it inclines us to view the world as clear-cut when in fact subtle variations abound. Whatever the explanations for black and white thinking, one thing is clear: It can be a block to critical thinking.

Flaws in Reasoning and Arguments: Black & White Thinking
The following is from:  http://atheism.about.com/od/logicalflawsinreasoning/a/blackwhite.htm

Another aspect of irrationality is absolutism. That is, seeing issues in terms of absolute black and white:
"You are either with us or you are against us."
"You are either part of the solution, or part of the problem."
"If you are not a super-patriotic right-wing conservative like us, then you are a godless liberal."
"Absolutely ALL of our leader's teachings are correct. He never makes any mistakes."
The very word "rational" comes from "ratio," a fraction. Absolutists hate fractional and proportional terms. They love absolute words like "always," "never," "all," and "none." They dislike words like "usually," "seldom," "mostly," "generally," and "few," which admit that there are often exceptions to the rule. All of which means that absolute rules are not always right, and you might actually have to think, rather than just let stereotypes and slogans and simplistic answers rattle around in your head. Fanatics will say, "Quit trying to confuse me," when you point out the exceptions to their absolute rules.
I find it amusing that William Randolph Hearst, who was possibly the most successful newspaper publisher in the history of the USA, said that it seemed that forcing the American people to think was the greatest torture to which you could subject them. So Hearst became a very rich man by publishing newspapers that didn't require people to think.

Consider also: 

Distorting reality: Seven ways to misinterpret what is happening

Black and White Thinking


Labels

Labels are essential for communication. They make it possible for us to communicate a complex situation a piece at a time. The use of labels helps us react specifically to some part of the environment and to deal with new and unfamiliar environments by picking out familiar features. For example, there are billions of entities in the world today corresponding to our label “the human race.” We can’t possibly deal individually with so many human beings. We can’t even individuate the dozens we encounter daily. Instead we must group them, drawing the many into a single unit by means of a label.

But as useful as labels are, they can be blocks to thinking. First by lumping things into categories, labels ignore individual differences. For example, to call Gloria Steinem a “feminist” is to ignore other aspects of her identity, for Steinem is also a “voter,” “licensed driver,” “taxpayer,” “consumer,” “author,” and so on. Labeling her a “feminist” encourages us to see her exclusively in terms of that label. The result can be a distortion.

Linguist Irving Lee gives a graphic example of how the very act of labeling causes us to overlook all other features of an entity, many of which might offer a more accurate representation than the label we choose.

          I knew a man who had lost the use of both eyes. He was called “blind man.”
He could also be called an expert typist, a conscientious worker, a good student,
a careful listener, a man who wanted a job. But he couldn’t get a job in the
department store order room where employees sat and typed orders which came
over the telephone. The personnel man was impatient to get the interview over.
“But you’re a blind man,” he kept saying, and one could almost feel his silent
assumption that somehow the incapacity in one aspect made the man incapable
in every other. So blinded by the label was the interviewer that he could not be
persuaded to look beyond it.

Besides causing us to overlook individual differences, labels encourage polarization. That is, they encourage us to view things as grouped in opposing factions, for example: “democracy/totalitarianism,” “democrat/republican,” “pro-capital punishment/anti-capital punishment,” “capitalist/socialist,” “people who love America/people who do not,” “people who pull their own weight/people who do not,” and so on. Hence labels are one of the great blocks to finding alternative ways of thinking about the world. The fixed patterns of thought are not altered by incoming information, but instead alter it. How many articles have you read that glibly use such labels as “profitability,” “productivity,” “justice,” “equality,” “human rights,” “individualism,” “free enterprise,” and similar abstractions? There’s nothing necessarily wrong with such labels. But they can be used too conveniently to justify our views. Rather than allowing us to think critically, they inhibit analysis by obscuring issues, eliciting knee-jerk reactions, and forcing us to take sides.

Take, for example, the term “patriotism.” “Patriotism” is so shrouded by notions of heroics, duty, and virtue, by “my country right or wrong,” that we tend to regard it as either honorable or dangerous. Of course national spirit, in terms of individual culture or economic growth, is important, but because the term “patriotism” polarizes, it’s of little use in discussing the important subject of national spirit. In fact it can block the intelligent, searching examination that the subject requires.

How can we escape labels? Challenge them, try to do without them, even establish new labels. In challenging labels ask yourself why you are using them. What do the labels really mean? Are they essential to your communication? Or are they clichés you are using because they are convenient? Watch out for the labels used by others. Ask yourself why you must accept them. In challenging a label, you are not rejecting it. You are simply refusing to accept it blindly. Your purpose is to generate a new view of something, a view that the label may be blocking. In trying to avoid labels you are trying to discover what lies buried beneath them. You may find that there is much or little there. The key is realizing that in removing the block that labels create, you are allowing the free flow of information you need for effective thinking.

Resistance to Change

A big block to thinking effectively is the tendency to cling to preconceived notions, to set ways of viewing and doing things. As a simple example, consider chain smokers. How do some chain smokers contend with the blizzard of statistics that connect smoking to disease? They simply avoid such evidence, consider themselves exceptions to the rule or so rationalize their habit that, in the end, they have more reasons for smoking than not.

Similarly, not many people regularly read political journals that present views contrary to the ones they hold. Probably even fewer have ever seriously investigated religious views incompatible with their own. And precious few ever consider alternatives to their views of what’s right and wrong, good and bad. Indeed many of us react to beliefs, values, and attitudes that challenge our own with self-righteous contempt.

The fact is that most of us not only avoid views contrary to our own but systematically expel them from our experience. We resist change. Why?

We resist change partly because we perceive it as a threat to who and what we are, and partly because we believe in the superiority of our own culture --- in the view that “mine is better: my ideas, my values, my race, my country, my religion.”

The history of science is rife with examples of this kind of resistance to change. For example, Galileo’s astronomical treatise, the Dialogue Concerning the Two Chief World Systems (1632), was a thoughtful and devastating attack on the traditional geocentric view of the universe proposed by the ancient Greek Ptolemy (second century A.D.) and accepted by most scholars and scientists of Galileo’s time. Galileo’s treatise was therefore an attack not only on the views of these authorities but also on their self-concepts. Predictably enough, they reacted violently. Pope Urban VIII was persuaded that Simplicio, the butt of the whole dialogue, was intended to represent himself. His vanity bitterly wounded, the Pope ordered Galileo to appear before the Inquisition. Although never formally imprisoned, Galileo was threatened with torture and forced to renounce what he had written. In 1633 he was banished to his country estate. His Dialogue, together with the works of Kepler and Copernicus, was placed on the Index of Forbidden Books, from which it was not withdrawn until 1835.

In general, we block out new information that threatens the existing beliefs with which we are secure. For example, all of us like to think of our presidents as honorable and virtuous men whose actions are always motivated by the highest regard for the national welfare. As a result, many of us dismiss the emergence of contrary information about a president as the brainchild of a crank or a sensationalist bent on personal gain. Perhaps it is. But we often form this opinion before even considering the evidence because it threatens our “taken-for-granteds.”

How can you deal with such resistance to change? A good way is to become aware of two danger signals. Whenever you find yourself or others responding in one of these two ways, you or they are probably letting resistance to change block rational inspection of the situation.

One danger signal is what psychologists term “reaction formation,” which refers to an immediate, strong, and emotional reaction against something. A dramatic example of reaction formation is “white flight,” a term used to describe the mass exodus of whites from a neighborhood when a black family moves in. Often white flight is preceded by an equally emotional and irrational response prior to the sale of the house. Thus, upon learning that a party is about to sell a house to a black family, neighbors have been known to exert intense pressure on the seller to stop the sale. The pressure takes various forms, ranging from “moral suasion” to threats of reprisal. Unable to stop the sale and convinced that property values will plummet, or repelled by the notion of having a black neighbor, white homeowners abandon the neighborhood en masse. Racism aside, the irrationality of the behavior lies in the fact that, ironically, the white homeowners themselves reduce the market value of their property by advertising it as undesirable --- which they effectively do by bailing out.

A similar problem is now occurring in the area of mental health. New laws have made it increasingly difficult to institutionalize persons or to keep them institutionalized indefinitely. In addition, the latest professional thinking holds that an institutional environment is not always the best way to treat the emotionally disturbed. In many cases a treatment program that allows the patient to function in as nearly normal a social environment as possible seems far more therapeutic. The upshot of these new developments is the establishment of residential homes for carefully screened patients. One of the major obstacles to establishing such centers is that neighborhood residents initially react with a mixture of fear, anger, and hostility. In a word, they feel threatened, so much so that they sometimes refuse even to discuss the matter or will threaten court action to prevent the establishment of these homes. And yet where these centers have been established, for the most part they have meshed beautifully with the neighborhoods around them.

Of course, reaction formation need not be confined to phenomena like these. Strong emotional reactions also can be directed against some abstraction. For example, anything that even remotely smacks of socialism likely will elicit passionate disapproval from most Americans. This explains in part why proposals for such things as day-care centers, guaranteed minimum income, national health insurance, rent control, and a raft of other social welfare programs and proposals meet with immediate vitriolic opposition. Perhaps such programs shouldn’t be acted upon. But if not, it should be because they are ineffective or inadequate solutions to social problems --- something that can be determined only by examining them and the alternatives. The problem with an immediate, emotional reaction against something is that it precludes the analysis and evaluation of the issues.

A second danger signal is “primary certitude,” which refers to an immediate, strong, emotional feeling about something. Individuals who engage in primary certitude are convinced that they have a corner on the truth and hence are unwilling to entertain any contradictory facts. You may have encountered people whose reactions, in effect, reflect this attitude: “Don’t confuse me with facts, my mind is already made up” or “I don’t care what you say, the fact of the matter is that ….” Such reactions are dead giveaways of primary certitude.

You can find evidence of primary certitude in people’s responses to almost any controversial issue. Take, for example, the rising incidence of violent crimes. Countermeasures typically recommended include appointing tougher judges, imposing stiffer penalties, and restoring the death penalty. While these recommendations are understandable, no causal relation has ever been established between the severity of punishment and the crime rate. Similarly, a common proposal for dealing with the decline in U.S. productivity and competitiveness is to establish import quotas and tariffs. Yet experts tell us that the problem is not that simple. At the root of declining productivity and competitiveness are obsolete equipment and plants, failure to anticipate and meet consumer needs, a less pronounced work ethic among younger workers, increasing labor costs, and inflation.

As with reaction formation, primary certitude cuts off the thorough and intelligent examination of an issue. It blocks effective thinking.
See also: Overcoming Resistance to Change: Top Ten Reasons for Change Resistance by A. J. Schuler, Psy.D. http://www.schulersolutions.com/resistance_to_change.html


Limited Frame of Reference

All of us have a tendency to see ourselves and the world according to our own frame of reference, that is, according to the organized body of accumulated knowledge and experience we rely on to interpret new experience and guide our behavior. This frame of reference limits our perception. Perception refers to the process by which we give meaning to sensory stimuli. Although we speak of “seeing” an airplane or “hearing” music, we really are seeing light waves and hearing sound waves. The process by which we give meaning to light and sound waves is perception. We see light waves, but perceive airplanes, houses, trees, and other people; we hear sound waves, but we perceive a symphony, a baby crying, a dog barking, and a tree creaking.

Perception, the giving of meaning to sensory stimuli, occurs only in terms of the information in our frame of reference. For example, some African natives with no knowledge of airplanes perceived jet planes as big birds. In fact, some natives refer to jets as ndege mkubwa, Swahili for “big bird.” Those who are unfamiliar with jets can’t possibly perceive them as such. Instead they perceive them in terms of the information stored in their frame of reference. Similarly, the first automobiles were called “horseless carriages,” and those many unidentified objects reportedly observed in the sky and variously described as saucer- or disc-shaped were termed “flying saucers.”

Several important implications for critical thinking follow from the limitations to perception created by our frame of reference. First, in limiting perception, the frame of reference limits our ability to recognize problems. For instance, persons unaware of cancer’s danger signals can’t recognize its presence in their bodies. Second, the frame of reference limits the acquisition of new knowledge because knowledge is as necessary to acquiring knowledge as money is to making money. For instance, a student who knows nothing about geometry most certainly will flounder in a trigonometry course, and those of us with little or no knowledge of aerodynamics and thermodynamics can’t begin to understand the scientific aspects of launching, sustaining, and recovering spacecraft. Third, false information in the frame of reference can be worse than no information at all. Individuals who are convinced that sucking on nectarine pits will cure cancer deprive themselves of potentially therapeutic medical treatment. Employers and labor unions who are certain that fatter paychecks are the only way to motivate workers overlook what may be more effective motivational devices, such as restructuring work and improving job atmosphere.

Lacking a rich and accurate frame of reference, we cannot think critically. If you want to think effectively, then start stocking your mind with information. Of course this poses a problem because the amount of information currently available is awesome. None of us can hope to acquire more than a fraction of it. Nevertheless, there are steps you can take to increase the information in your frame of reference.

The selection of information for your frame of reference largely depends on the kind of person you are and will become. More specifically, you might categorize the information you need in terms of career aspirations, social roles, and human potential.

If you wish to succeed in your career, you will surely need to acquire the necessary technical knowledge and to keep abreast of current developments in your field. But you will be more than a professional; you will also be a person who functions in numerous social roles – the role of spouse, parent, consumer, citizen, and so on. To be effective in these roles, you will need just as much information as you need to succeed in a career – in some cases more. In general this calls for wide reading and study of the social, political, biological, and physical sciences. Beyond this you presumably wish to become everything you are capable of becoming: You wish to realize as much of your potential as possible. If, besides success as a professional and a player of various social roles, you desire fulfillment as a human being, you must at least expose yourself to what are termed the humanities, those branches of learning having primarily a cultural character, for example, philosophy, art, music, and literature.

But there is something further you must know – something more basic than what you need to succeed in a career or social role or as a human being. You need the knowledge of how to act freely as a human being. Specifically, you must first be able to learn for yourself. Lacking this knowledge, you remain a slave to the ideas of others and the machines programmed by them. You must also know how to think for yourself. If you don’t, you can never go beyond what others have learned or thought and, again, you remain enslaved to the ideas of others. In short, you must know how to learn and think for yourself. Frankly, I can think of no better way to make this discovery than with a study of critical thinking. Indeed, it’s safe to say that lacking critical thinking skills, you will never know how to learn and think for yourself.

(back to JUST READ IT)