Critical thinking

We’ve already established that information can be biased. Now it’s time to look at our own bias.

Studies have shown that we are more likely to accept information when it fits into our existing worldview, a phenomenon known as confirmation or myside bias (for examples see Kappes et al., 2020; McCrudden & Barnes, 2016; Pilditch & Custers, 2018). Wittebols (2019) defines it as a “tendency to be psychologically invested in the familiar and what we believe and less receptive to information that contradicts what we believe” (p. 211). Quite simply, we may reject information that doesn’t support our existing thinking.

This can manifest in a number of ways; Hahn and Harris (2014) suggest four main behaviours:

  • Searching only for information that supports our held beliefs
  • Failing to critically evaluate information that supports our held beliefs - accepting it at face value - while explaining away or being overly critical of information that might contradict them
  • Becoming set in our thinking, once an opinion has been formed, and deliberately ignoring any new information on the topic
  • Being overconfident about the validity of our held beliefs.

Peters (2020) also suggests that we’re more likely to remember information that supports our way of thinking, further cementing our bias. Taken together, the research suggests that bias has a huge impact on the way we think. To learn more about how and why bias can impact our everyday thinking, watch this short video.

Filter bubbles and echo chambers

The theory of filter bubbles emerged in 2011, proposed by the Internet activist Eli Pariser. He defined a filter bubble as “your own personal unique world of information that you live in online” (Pariser, 2011, 4:21). At the time, Pariser focused on the impact of the algorithms behind social media platforms and search engines, which prioritised content and personalised results based on an individual’s past online activity, suggesting “the Internet is showing us what it thinks we want to see, but not necessarily what we should see” (Pariser, 2011, 3:47). Watch his TED talk if you’d like to know more.

Our understanding of filter bubbles has since expanded to recognise that individuals also select and create their own filter bubbles. This happens when you seek out like-minded individuals or sources, or follow your friends or people you admire on social media; these are people with whom you’re likely to share common beliefs, points of view, and interests. Barack Obama (2017) addressed the concept of filter bubbles in his presidential farewell address:

For too many of us it’s become safer to retreat into our own bubbles, whether in our neighbourhoods, or on college campuses, or places of worship, or especially our social media feeds, surrounded by people who look like us and share the same political outlook and never challenge our assumptions… Increasingly we become so secure in our bubbles that we start accepting only information, whether it’s true or not, that fits our opinions, instead of basing our opinions on the evidence that is out there. (Obama, 2017, 22:57)

Filter bubbles are not unique to the social media age. Previously, the term echo chamber was used to describe the same phenomenon in the news media, where different channels exist to cater to different points of view. Within an echo chamber, people are able to seek out information that supports their existing beliefs without encountering information that might challenge, contradict, or oppose those beliefs.

Other forms of bias

There are many different ways in which bias can affect the way you think and how you process new information. Try the quiz below to discover some additional forms of bias, or check out Buzzfeed’s 2017 article on cognitive bias.

Robert Evans Wilson Jr.

Cognitive Bias Is the Loose Screw in Critical Thinking

Recognizing your biases enhances understanding and communication.

Posted May 17, 2021 | Reviewed by Jessica Schrader

  • People cannot think critically unless they are aware of their cognitive biases, which can alter their perception of reality.
  • Cognitive biases are mental shortcuts people take in order to process the mass of information they receive daily.
  • Cognitive biases include confirmation bias, anchoring bias, bandwagon effect, and negativity bias.

When I was a kid, I was enamored of cigarette-smoking movie stars. When I was a teenager, some of my friends began to smoke; I wanted to smoke too, but my parents forbade it. I was also intimidated by the ubiquitous anti-smoking commercials I saw on television warning me that smoking causes cancer. As much as I wanted to smoke, I was afraid of it.

When I started college as a pre-med major, I also started working in a hospital emergency room. I was shocked to see that more than 90% of the nurses working there were smokers, but that was not quite enough to convince me that smoking was OK. It was the doctors: 11 of the 12 emergency room physicians I worked with were smokers. That was all the convincing I needed. If actual medical doctors thought smoking was safe, then so did I. I started smoking without concern because I had fallen prey to an authority bias, which is a type of cognitive bias. Fortunately for my health, I wised up and quit smoking 10 years later.

It's Likely You're Unaware of These Habits

Have you ever thought someone was intelligent simply because they were attractive? Have you ever dismissed a news story because it ran in a media source you didn’t like? Have you ever thought or said, “I knew that was going to happen!” in reference to a team winning, a stock going up in value, or some other unpredictable event occurring? If you replied “yes” to any of these, then you may be guilty of relying on a cognitive bias.

In my last post, I wrote about the importance of critical thinking, and how in today’s information age, no one has an excuse for living in ignorance. Since then, I recalled a huge impediment to critical thinking: cognitive bias. We are all guilty of leaning on these mental crutches, even though we don’t do it intentionally.

What Are Cognitive Biases?

The Cambridge English Dictionary defines cognitive bias as “the way a particular person understands events, facts, and other people, which is based on their own particular set of beliefs and experiences and may not be reasonable or accurate.”

PhilosophyTerms.com calls it a bad mental habit that gets in the way of logical thinking.

PositivePsychology.com describes it this way: “We are often presented with situations in life when we need to make a decision with imperfect information, and we unknowingly rely on prejudices or biases.”

And, according to Alleydog.com, a cognitive bias is an involuntary pattern of thinking that produces distorted perceptions of people, surroundings, and situations around us.

In brief, a cognitive bias is a shortcut to thinking. And, it’s completely understandable; the onslaught of information that we are exposed to every day necessitates some kind of time-saving method. It is simply impossible to process everything, so we make quick decisions. Most people don’t have the time to thoroughly think through everything they are told. Nevertheless, as understandable as depending on biases may be, it is still a severe deterrent to critical thinking.

Here's What to Watch Out For

Wikipedia lists 197 different cognitive biases. I am going to share with you a few of the more common ones so that in the future, you will be aware of the ones you may be using.

Confirmation bias is when you prefer to attend to media and information sources that align with your current beliefs. People do this because it helps maintain their confidence and self-esteem when the information they receive supports their knowledge set. Exposing oneself to opposing views and opinions can cause cognitive dissonance and mental stress. On the other hand, exposing yourself to new information and different viewpoints helps open up new neural pathways in your brain, which will enable you to think more creatively (see my post: Surprise: Creativity Is a Skill, Not a Gift!).

Anchoring bias occurs when you become committed or attached to the first thing you learn about a particular subject. A first impression of something or someone is a good example (see my post: Sometimes You Have to Rip the Cover Off). Similar to anchoring is the halo effect, which is when you assume that a person’s positive or negative traits in one area will be the same in some other aspect of their personality. For example, you might think that an attractive person will also be intelligent without seeing any proof to support it.


Hindsight bias is the inclination to see some events as more predictable than they are; also known as the “I knew it all along" reaction. Examples of this bias would be believing that you knew who was going to win an election, a football or baseball game, or even a coin toss after it occurred.

Misinformation effect is when your memories of an event can become affected or influenced by information you received after the event occurred. Researchers have proven that memory is inaccurate because it is vulnerable to revision when you receive new information.

Actor-observer bias is when you attribute your actions to external influences and other people's actions to internal ones. You might think you missed a business opportunity because your car broke down, but your colleague failed to get a promotion because of incompetence.

False consensus effect is when you assume more people agree with your opinions and share your values than actually do. This happens because you tend to spend most of your time with others, such as family and friends, who actually do share beliefs similar to yours.

Availability bias occurs when you believe the information you possess is more important than it actually is. This happens when you watch or listen to media news sources that tend to run dramatic stories without sharing any balancing statistics on how rare such events may be. For example, if you see several stories on fiery plane crashes, you might start to fear flying because you assume they occur with greater frequency than they actually do.

Bandwagon effect, also known as herd mentality or groupthink, is the propensity to accept beliefs or values because many other people hold them. This is a conformity bias that occurs because most people desire acceptance, connection, and belonging with others, and fear rejection if they hold opposing beliefs. Most people will not think through an opinion and will assume it is correct because so many others agree with it.

Authority bias is when you accept the opinion of an authority figure because you believe they know more than you. You might assume that they have already thought through an issue and made the right conclusion. And, because they are an authority in their field, you grant more credibility to their viewpoint than you would for anyone else. This is especially true in medicine where experts are frequently seen as infallible. An example would be an advertiser showing a doctor, wearing a lab coat, touting their product.

Negativity bias is when you pay more attention to bad news than good. This is a natural bias that dates back to humanity’s prehistoric days when noticing threats, risks, and other lethal dangers could save your life. In today’s civilized world, this bias is not as necessary (see my post Fear: Lifesaver or Manipulator).

Illusion of control is the belief that you have more control over a situation than you actually do. An example of this is when a gambler believes he or she can influence a game of chance.

Understand More and Communicate Better

Learning these biases, and being on the alert for them when you make a decision to accept a belief or opinion, will help you become more effective at critical thinking.

Source: Cognitive Bias Codex by John Manoogian III/Wikimedia Commons

Robert Wilson is a writer and humorist based in Atlanta, Georgia.



Humanities LibreTexts

2.2: Overcoming Cognitive Biases and Engaging in Critical Reflection

Nathan Smith et al.

Learning Objectives

By the end of this section, you will be able to:

  • Label the conditions that make critical thinking possible.
  • Classify and describe cognitive biases.
  • Apply critical reflection strategies to resist cognitive biases.

To resist the potential pitfalls of cognitive biases, we have taken some time to recognize why we fall prey to them. Now we need to understand how to resist easy, automatic, and error-prone thinking in favor of more reflective, critical thinking.

Critical Reflection and Metacognition

To promote good critical thinking, put yourself in a frame of mind that allows critical reflection. Recall from the previous section that rational thinking requires effort and takes longer. However, it will likely result in more accurate thinking and decision-making. As a result, reflective thought can be a valuable tool in correcting cognitive biases. The critical aspect of critical reflection involves a willingness to be skeptical of your own beliefs, your gut reactions, and your intuitions. Additionally, the critical aspect engages in a more analytic approach to the problem or situation you are considering. You should assess the facts, consider the evidence, try to employ logic, and resist the quick, immediate, and likely conclusion you want to draw. By reflecting critically on your own thinking, you can become aware of the natural tendency for your mind to slide into mental shortcuts.

This process of critical reflection is often called metacognition in the literature of pedagogy and psychology. Metacognition means thinking about thinking and involves the kind of self-awareness that engages higher-order thinking skills. Cognition, or the way we typically engage with the world around us, is first-order thinking, while metacognition is higher-order thinking. From a metacognitive frame, we can critically assess our thought process, become skeptical of our gut reactions and intuitions, and reconsider our cognitive tendencies and biases.

To improve metacognition and critical reflection, we need to encourage the kind of self-aware, conscious, and effortful attention that may feel unnatural and may be tiring. Typical activities associated with metacognition include checking, planning, selecting, inferring, self-interrogating, interpreting an ongoing experience, and making judgments about what one does and does not know (Hacker, Dunlosky, and Graesser 1998). By practicing metacognitive behaviors, you are preparing yourself to engage in the kind of rational, abstract thought that will be required for philosophy.

Good study habits, including managing your workspace, giving yourself plenty of time, and working through a checklist, can promote metacognition. When you feel stressed out or pressed for time, you are more likely to make quick decisions that lead to error. Stress and lack of time also discourage critical reflection because they rob your brain of the resources necessary to engage in rational, attention-filled thought. By contrast, when you relax and give yourself time to think through problems, you will be clearer, more thoughtful, and less likely to rush to the first conclusion that leaps to mind. Similarly, background noise, distracting activity, and interruptions will prevent you from paying attention. You can use this checklist to try to encourage metacognition when you study:

  • Check your work.
  • Plan ahead.
  • Select the most useful material.
  • Infer from your past grades to focus on what you need to study.
  • Ask yourself how well you understand the concepts.
  • Check your weaknesses.
  • Assess whether you are following the arguments and claims you are working on.

Cognitive Biases

In this section, we will examine some of the most common cognitive biases so that you can be aware of traps in thought that can lead you astray. Cognitive biases are closely related to informal fallacies. Both fallacies and biases provide examples of the ways we make errors in reasoning.

CONNECTIONS

See the chapter on logic and reasoning for an in-depth exploration of informal fallacies.

Watch the video to orient yourself before reading the text that follows.

Cognitive Biases 101, with Peter Bauman


Confirmation Bias

One of the most common cognitive biases is confirmation bias , which is the tendency to search for, interpret, favor, and recall information that confirms or supports your prior beliefs. Like all cognitive biases, confirmation bias serves an important function. For instance, one of the most reliable forms of confirmation bias is the belief in our shared reality. Suppose it is raining. When you first hear the patter of raindrops on your roof or window, you may think it is raining. You then look for additional signs to confirm your conclusion, and when you look out the window, you see rain falling and puddles of water accumulating. Most likely, you will not be looking for irrelevant or contradictory information. You will be looking for information that confirms your belief that it is raining. Thus, you can see how confirmation bias—based on the idea that the world does not change dramatically over time—is an important tool for navigating in our environment.

Unfortunately, as with most heuristics, we tend to apply this sort of thinking inappropriately. One example that has recently received a lot of attention is the way in which confirmation bias has increased political polarization. When searching for information on the internet about an event or topic, most people look for information that confirms their prior beliefs rather than what undercuts them. The pervasive presence of social media in our lives is exacerbating the effects of confirmation bias since the computer algorithms used by social media platforms steer people toward content that reinforces their current beliefs and predispositions. These multimedia tools are especially problematic when our beliefs are incorrect (for example, they contradict scientific knowledge) or antisocial (for example, they support violent or illegal behavior). Thus, social media and the internet have created a situation in which confirmation bias can be “turbocharged” in ways that are destructive for society.

Confirmation bias is a result of the brain’s limited ability to process information. Peter Wason (1960) conducted early experiments identifying this kind of bias. He asked subjects to identify the rule that applies to a sequence of numbers—for instance, 2, 4, 6. Subjects were told to generate examples to test their hypothesis. He found that once a subject settled on a particular hypothesis, they were much more likely to select examples that confirmed their hypothesis rather than negated it. As a result, they were unable to identify the real rule (any ascending sequence of numbers) and failed to “falsify” their initial assumptions. Falsification is an important tool in the scientist’s toolkit when they are testing hypotheses and is an effective way to avoid confirmation bias.
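
To make the positive-test pattern concrete, here is a minimal sketch in Python. The specific hypothesis ("numbers increase by a constant step"), the candidate triples, and the function names are illustrative assumptions, not part of Wason's materials; the point is only that confirming tests teach the subject nothing, while a triple that breaks the hypothesis can expose it as too narrow.

```python
def true_rule(triple):
    """The experimenter's hidden rule: any strictly ascending sequence."""
    a, b, c = triple
    return a < b < c

def hypothesis(triple):
    """A typical first guess (assumed): numbers increase by a constant step."""
    a, b, c = triple
    return (b - a) == (c - b) and (b - a) > 0

# Positive tests: triples chosen to CONFIRM the guess. Rule and guess both
# say True, so the subject learns nothing new from any of them.
for t in [(2, 4, 6), (10, 20, 30), (5, 10, 15)]:
    print(t, "rule:", true_rule(t), "hypothesis:", hypothesis(t))

# Falsifying tests: triples that break the guess. The rule still says True
# while the guess says False, revealing that the guess is too narrow.
for t in [(1, 2, 4), (3, 5, 100)]:
    print(t, "rule:", true_rule(t), "hypothesis:", hypothesis(t))
```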

In philosophy, you will be presented with different arguments on issues, such as the nature of the mind or the best way to act in a given situation. You should take your time to reason through these issues carefully and consider alternative views. What you believe to be the case may be right, but you may also fall into the trap of confirmation bias, seeing confirming evidence as better and more convincing than evidence that calls your beliefs into question.

Anchoring Bias

Confirmation bias is closely related to another bias known as anchoring. Anchoring bias refers to our tendency to rely on initial values, prices, or quantities when estimating the actual value, price, or quantity of something. If you are presented with a quantity, even if that number is clearly arbitrary, you will have a hard time discounting it in your subsequent calculations; the initial value “anchors” subsequent estimates. For instance, Tversky and Kahneman (1974) reported an experiment in which subjects were asked to estimate the number of African nations in the United Nations. First, the experimenters spun a wheel of fortune in front of the subjects that produced a random number between 0 and 100. Let’s say the wheel landed on 79. Subjects were asked whether the number of nations was higher or lower than the random number. Subjects were then asked to estimate the real number of nations. Even though the initial anchoring value was random, people in the study found it difficult to deviate far from that number. For subjects receiving an initial value of 10, the median estimate of nations was 25, while for subjects receiving an initial value of 65, the median estimate was 45.
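
One simple way to see the direction of this effect is a toy "insufficient adjustment" model, sketched below in Python. The 0.5 adjustment factor and the unanchored guess of 35 are made-up illustrative numbers, not parameters from the study; the sketch shows only that starting at a low anchor and adjusting part-way yields a low estimate, while a high anchor yields a high one, matching the ordering of the reported medians.

```python
# Toy "insufficient adjustment" model of anchoring. Assumptions: subjects
# start at the anchor and move only half the distance toward the estimate
# they would have given with no anchor (here assumed to be 35).

def anchored_estimate(anchor, unanchored_guess, adjustment=0.5):
    """Start at the anchor and adjust only part-way toward the real guess."""
    return anchor + adjustment * (unanchored_guess - anchor)

unanchored_guess = 35  # hypothetical anchor-free estimate

for anchor in (10, 65):
    estimate = anchored_estimate(anchor, unanchored_guess)
    print(f"anchor={anchor}: estimate={estimate:.1f}")
# anchor=10 -> 22.5 (low), anchor=65 -> 50.0 (high): estimates are dragged
# toward whichever anchor was shown, as with the reported medians of 25 and 45.
```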

In the same paper, Tversky and Kahneman described the way that anchoring bias interferes with statistical reasoning. In a number of scenarios, subjects made irrational judgments about statistics because of the way the question was phrased (i.e., they were tricked when an anchor was inserted into the question). Instead of expending the cognitive energy needed to solve the statistical problem, subjects were much more likely to “go with their gut,” or think intuitively. That type of reasoning generates anchoring bias. When you do philosophy, you will be confronted with some formal and abstract problems that will challenge you to engage in thinking that feels difficult and unnatural. Resist the urge to latch on to the first thought that jumps into your head, and try to think the problem through with all the cognitive resources at your disposal.

Availability Heuristic

The availability heuristic refers to the tendency to evaluate new information based on the most recent or most easily recalled examples. The availability heuristic occurs when people take easily remembered instances as being more representative than they objectively are (i.e., based on statistical probabilities). In very simple situations, the availability of instances is a good guide to judgments. Suppose you are wondering whether you should plan for rain. It may make sense to anticipate rain if it has been raining a lot in the last few days since weather patterns tend to linger in most climates. More generally, scenarios that are well-known to us, dramatic, recent, or easy to imagine are more available for retrieval from memory. Therefore, if we easily remember an instance or scenario, we may incorrectly think that the chances are high that the scenario will be repeated. For instance, people in the United States estimate the probability of dying by violent crime or terrorism much more highly than they ought to. In fact, these are extremely rare occurrences compared to death by heart disease, cancer, or car accidents. But stories of violent crime and terrorism are prominent in the news media and fiction. Because these vivid stories are dramatic and easily recalled, we have a skewed view of how frequently violent crime occurs.

Another more loosely defined category of cognitive bias is the tendency for human beings to align themselves with groups with whom they share values and practices. The tendency toward tribalism is an evolutionary advantage for social creatures like human beings. By forming groups to share knowledge and distribute work, we are much more likely to survive. Not surprisingly, human beings with pro-social behaviors persist in the population at higher rates than human beings with antisocial tendencies. Pro-social behaviors, however, go beyond wanting to communicate and align ourselves with other human beings; we also tend to see outsiders as a threat. As a result, tribalistic tendencies both reinforce allegiances among in-group members and increase animosity toward out-group members.

Tribal thinking makes it hard for us to objectively evaluate information that either aligns with or contradicts the beliefs held by our group or tribe. This effect can be demonstrated even when in-group membership is not real or is based on some superficial feature of the person—for instance, the way they look or an article of clothing they are wearing. A related bias is called the bandwagon fallacy. The bandwagon fallacy can lead you to conclude that you ought to do something or believe something because many other people do or believe the same thing. While other people can provide guidance, they are not always reliable. Furthermore, just because many people believe something doesn’t make it true. Watch the video below to improve your “tribal literacy” and understand the dangers of this type of thinking.

The Dangers of Tribalism, Kevin deLaplante

Sunk Cost Fallacy

Sunk costs refer to the time, energy, money, or other costs that have been paid in the past. These costs are “sunk” because they cannot be recovered. The sunk cost fallacy is thinking that attaches a greater value to things in which you have already invested resources than those things have today. Human beings have a natural tendency to hang on to whatever they invest in and are loath to give something up even after it has been proven to be a liability. For example, a person may have sunk a lot of money into a business over time, and the business may clearly be failing. Nonetheless, the businessperson will be reluctant to close shop or sell the business because of the time, money, and emotional energy they have spent on the venture. This is the behavior of “throwing good money after bad”: continuing to irrationally invest in something that has lost its worth because of emotional attachment to the failed enterprise. People will engage in this kind of behavior in all kinds of situations and may continue a friendship, a job, or a marriage for the same reason—they don’t want to lose their investment even when they are clearly headed for failure and ought to cut their losses.

A similar type of faulty reasoning leads to the gambler’s fallacy , in which a person reasons that future chance events will be more likely if they have not happened recently. For instance, if I flip a coin many times in a row, I may get a string of heads. But even if I flip several heads in a row, that does not make it more likely I will flip tails on the next coin flip. Each coin flip is statistically independent, and there is an equal chance of turning up heads or tails. The gambler, like the reasoner from sunk costs, is tied to the past when they should be reasoning about the present and future.
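
The statistical independence the gambler ignores is easy to check by simulation. The sketch below is a rough illustration (the streak length of five and the trial count are arbitrary choices): it estimates the probability of heads on the flip immediately following a run of five heads, and the result comes out near 0.5, just like any other flip.

```python
import random

def prob_heads_after_streak(streak_len=5, trials=200_000):
    """Estimate P(heads) on the flip immediately after streak_len heads."""
    streak = 0
    outcomes_after_streak = []
    for _ in range(trials):
        heads = random.random() < 0.5  # fair coin: True = heads
        if streak >= streak_len:
            # This flip follows a run of at least streak_len heads.
            outcomes_after_streak.append(heads)
        streak = streak + 1 if heads else 0
    return sum(outcomes_after_streak) / len(outcomes_after_streak)

random.seed(42)
print(f"P(heads after 5 heads in a row) ~ {prob_heads_after_streak():.3f}")
# Prints a value near 0.500: a streak of heads does not make tails "due".
```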

There are important social and evolutionary purposes for past-looking thinking. Sunk-cost thinking keeps parents engaged in the growth and development of their children after they are born. Sunk-cost thinking builds loyalty and affection among friends and family. More generally, a commitment to sunk costs encourages us to engage in long-term projects, and this type of thinking has the evolutionary purpose of fostering culture and community. Nevertheless, it is important to periodically reevaluate our investments in both people and things.

In recent ethical scholarship, there is some debate about how to assess the sunk costs of moral decisions. Consider the case of war. Just-war theory dictates that wars may be justified in cases where the harm imposed on the adversary is proportional to the good gained by the act of defense or deterrence. It may be that, at the start of the war, those costs seemed proportional. But after the war has dragged on for some time, it may seem that the objective cannot be obtained without a greater quantity of harm than had been initially imagined. Should the evaluation of whether a war is justified estimate the total amount of harm done or prospective harm that will be done going forward (Lazar 2018)? Such questions do not have easy answers.

Table 2.1 summarizes these common cognitive biases.

Table 2.1 Common Cognitive Biases (the five discussed above: confirmation bias, anchoring bias, the availability heuristic, tribal thinking and the bandwagon fallacy, and the sunk cost fallacy)

Think Like A Philosopher

As we have seen, cognitive biases are built into the way human beings process information. They are common to us all, and it takes self-awareness and effort to overcome the tendency to fall back on biases. Consider a time when you have fallen prey to one of the five cognitive biases described above. What were the circumstances? Recall your thought process. Were you aware at the time that your thinking was misguided? What were the consequences of succumbing to that cognitive bias?

Write a short paragraph describing how that cognitive bias allowed you to make a decision you now realize was irrational. Then write a second paragraph describing how, with the benefit of time and distance, you would have thought differently about the incident that triggered the bias. Use the tools of critical reflection and metacognition to improve your approach to this situation. What might have been the consequences of behaving differently? Finally, write a short conclusion describing what lesson you take from reflecting back on this experience. Does it help you understand yourself better? Will you be able to act differently in the future? What steps can you take to avoid cognitive biases in your thinking today?


Article • 8 min read

Critical Thinking

Developing the right mindset and skills.

By the Mind Tools Content Team

We make hundreds of decisions every day and, whether we realize it or not, we're all critical thinkers.

We use critical thinking each time we weigh up our options, prioritize our responsibilities, or think about the likely effects of our actions. It's a crucial skill that helps us to cut out misinformation and make wise decisions. The trouble is, we're not always very good at it!

In this article, we'll explore the key skills that you need to develop your critical thinking skills, and how to adopt a critical thinking mindset, so that you can make well-informed decisions.

What Is Critical Thinking?

Critical thinking is the discipline of rigorously and skillfully using information, experience, observation, and reasoning to guide your decisions, actions, and beliefs. You'll need to actively question every step of your thinking process to do it well.

Collecting, analyzing and evaluating information is an important skill in life, and a highly valued asset in the workplace. People who score highly in critical thinking assessments are also rated by their managers as having good problem-solving skills, creativity, strong decision-making skills, and good overall performance. [1]

Key Critical Thinking Skills

Critical thinkers possess a set of key characteristics which help them to question information and their own thinking. Focus on the following areas to develop your critical thinking skills:

Curiosity

Being willing and able to explore alternative approaches and experimental ideas is crucial. Can you think through "what if" scenarios, create plausible options, and test out your theories? If not, you'll tend to write off ideas and options too soon, so you may miss the best answer to your situation.

To nurture your curiosity, stay up to date with facts and trends. You'll overlook important information if you allow yourself to become "blinkered," so always be open to new information.

But don't stop there! Look for opposing views or evidence to challenge your information, and seek clarification when things are unclear. This will help you to reassess your beliefs and make a well-informed decision later. Read our article, Opening Closed Minds, for more ways to stay receptive.

Logical Thinking

You must be skilled at reasoning and extending logic to come up with plausible options or outcomes.

It's also important to emphasize logic over emotion. Emotion can be motivating but it can also lead you to take hasty and unwise action, so control your emotions and be cautious in your judgments. Know when a conclusion is "fact" and when it is not. "Could-be-true" conclusions are based on assumptions and must be tested further. Read our article, Logical Fallacies, for help with this.

Use creative problem solving to balance cold logic. By thinking outside of the box you can identify new possible outcomes by using pieces of information that you already have.

Self-Awareness

Many of the decisions we make in life are subtly informed by our values and beliefs. These influences are called cognitive biases and it can be difficult to identify them in ourselves because they're often subconscious.

Practicing self-awareness will allow you to reflect on the beliefs you have and the choices you make. You'll then be better equipped to challenge your own thinking and make improved, unbiased decisions.

One particularly useful tool for critical thinking is the Ladder of Inference. It allows you to test and validate your thinking process, rather than jumping to poorly supported conclusions.

Developing a Critical Thinking Mindset

Combine the above skills with the right mindset so that you can make better decisions and adopt more effective courses of action. You can develop your critical thinking mindset by following this process:

Gather Information

First, collect data, opinions and facts on the issue that you need to solve. Draw on what you already know, and turn to new sources of information to help inform your understanding. Consider what gaps there are in your knowledge and seek to fill them. And look for information that challenges your assumptions and beliefs.

Be sure to verify the authority and authenticity of your sources. Not everything you read is true! Use this checklist to ensure that your information is valid:

  • Are your information sources trustworthy? (For example, well-respected authors, trusted colleagues or peers, recognized industry publications, websites, blogs, etc.)
  • Is the information you have gathered up to date?
  • Has the information received any direct criticism?
  • Does the information have any errors or inaccuracies?
  • Is there any evidence to support or corroborate the information you have gathered?
  • Is the information you have gathered subjective or biased in any way? (For example, is it based on opinion, rather than fact? Is any of the information you have gathered designed to promote a particular service or organization?)

If any information appears to be irrelevant or invalid, don't include it in your decision making. But don't omit information just because you disagree with it, or your final decision will be flawed and biased.

Now observe the information you have gathered, and interpret it. What are the key findings and main takeaways? What does the evidence point to? Start to build one or two possible arguments based on what you have found.

You'll need to look for the details within the mass of information, so use your powers of observation to identify any patterns or similarities. You can then analyze and extend these trends to make sensible predictions about the future.

To help you to sift through the multiple ideas and theories, it can be useful to group and order items according to their characteristics. From here, you can compare and contrast the different items. And once you've determined how similar or different things are from one another, Paired Comparison Analysis can help you to analyze them.

The final step involves challenging the information and rationalizing its arguments.

Apply the laws of reason (induction, deduction, analogy) to judge an argument and determine its merits. To do this, it's essential that you can determine the significance and validity of an argument to put it in the correct perspective. Take a look at our article, Rational Thinking , for more information about how to do this.

Once you have considered all of the arguments and options rationally, you can finally make an informed decision.

Afterward, take time to reflect on what you have learned and what you found challenging. Step back from the detail of your decision or problem, and look at the bigger picture. Record what you've learned from your observations and experience.

Critical thinking involves rigorously and skillfully using information, experience, observation, and reasoning to guide your decisions, actions and beliefs. It's a useful skill in the workplace and in life.

You'll need to be curious and creative to explore alternative possibilities, but rational to apply logic, and self-aware to identify when your beliefs could affect your decisions or actions.

You can demonstrate a high level of critical thinking by validating your information, analyzing its meaning, and finally evaluating the argument.

Critical Thinking Infographic

See Critical Thinking represented in our infographic: An Elementary Guide to Critical Thinking.



Cognitive Bias: How We Are Wired to Misjudge

Charlotte Ruhl


Have you ever been so busy talking on the phone that you don’t notice the light has turned green and it is your turn to cross the street?

Have you ever shouted, “I knew that was going to happen!” after your favorite baseball team gave up a huge lead in the ninth inning and lost?

Or have you ever found yourself only reading news stories that further support your opinion?

These are just a few of the many instances of cognitive bias that we experience every day of our lives. But before we dive into these different biases, let’s backtrack first and define what bias is.


What is Cognitive Bias?

Cognitive bias is a systematic error in thinking, affecting how we process information, perceive others, and make decisions. It can lead to irrational thoughts or judgments and is often based on our perceptions, memories, or individual and societal beliefs.

Biases are unconscious and automatic processes designed to make decision-making quicker and more efficient. Cognitive biases can be caused by many things, such as heuristics (mental shortcuts) , social pressures, and emotions.

Broadly speaking, bias is a tendency to lean in favor of or against a person, group, idea, or thing, usually in an unfair way. Biases are natural — they are a product of human nature — and they don’t simply exist in a vacuum or in our minds — they affect the way we make decisions and act.

In psychology, there are two main branches of biases: conscious and unconscious. Conscious or explicit bias is intentional — you are aware of your attitudes and the behaviors resulting from them (Lang, 2019).

Explicit bias can be good because it helps provide you with a sense of identity and can lead you to make good decisions (for example, being biased towards healthy foods).

However, these biases can often be dangerous when they take the form of conscious stereotyping.

On the other hand, unconscious bias, or cognitive bias, represents a set of unintentional biases — you are unaware of your attitudes and the behaviors resulting from them (Lang, 2019).

Cognitive bias is often a result of your brain’s attempt to simplify information processing: we receive roughly 11 million bits of information per second, but we can only process about 40 bits of information per second (Orzan et al., 2012).

Therefore, we often rely on mental shortcuts (called heuristics) to help make sense of the world with relative speed. As such, these errors tend to arise from problems related to thinking: memory, attention, and other mental mistakes.

Cognitive biases can be beneficial because they do not require much mental effort and can allow you to make decisions relatively quickly, but like conscious biases, unconscious biases can also take the form of harmful prejudice that serves to hurt an individual or a group.

Although it may feel like there has been a recent rise of unconscious bias, especially in the context of police brutality and the Black Lives Matter movement, this is not a new phenomenon.

Thanks to Tversky and Kahneman (and several other psychologists who have paved the way), we now have an existing dictionary of our cognitive biases.

Again, these biases occur as an attempt to simplify the complex world and make information processing faster and easier. This section will dive into some of the most common forms of cognitive bias.


Confirmation Bias

Confirmation bias is the tendency to interpret new information as confirmation of your preexisting beliefs and opinions while giving disproportionately less consideration to alternative possibilities.

Real-World Examples

Since Wason’s 1960 experiment, real-world examples of confirmation bias have gained attention.

This bias often seeps into the research world when psychologists selectively interpret data or ignore unfavorable data to produce results that support their initial hypothesis.

Confirmation bias is also incredibly pervasive on the internet, particularly with social media. We tend to read online news articles that support our beliefs and fail to seek out sources that challenge them.

Various social media platforms, such as Facebook, help reinforce our confirmation bias by feeding us stories that we are likely to agree with – further pushing us down these echo chambers of political polarization.

Some examples of confirmation bias are especially harmful, specifically in the context of the law. For example, a detective may identify a suspect early in an investigation, seek out confirming evidence, and downplay falsifying evidence.

Experiments

The confirmation bias dates back to 1960 when Peter Wason challenged participants to identify a rule applying to triples of numbers.

People were first told that the sequence 2, 4, 6 fit the rule, and they then had to generate triples of their own and were told whether each sequence fit the rule. The rule was simple: any ascending sequence.

But not only did participants have an unusually difficult time discovering this rule, instead devising overly complicated hypotheses, they also generated only triples that confirmed their preexisting hypothesis (Wason, 1960).

Explanations

But why does confirmation bias occur? It’s partially due to the effect of desire on our beliefs. In other words, certain desired conclusions (ones that support our beliefs) are more likely to be processed by the brain and labeled as true (Nickerson, 1998).

This motivational explanation is often coupled with a more cognitive theory.

The cognitive explanation argues that because our minds can only focus on one thing at a time, it is hard to parallel process (see information processing for more information) alternate hypotheses, so, as a result, we only process the information that aligns with our beliefs (Nickerson, 1998).

Another theory explains confirmation bias as a way of enhancing and protecting our self-esteem.

As with the self-serving bias (see more below), our minds choose to reinforce our preexisting ideas because being right helps preserve our sense of self-esteem, which is important for feeling secure in the world and maintaining positive relationships (Casad, 2019).

Although confirmation bias has obvious consequences, you can still work towards overcoming it by being open-minded and willing to look at situations from a different perspective than you might be used to (Luippold et al., 2015).

Even though this bias is unconscious, training your mind to become more flexible in its thought patterns will help mitigate the effects of this bias.

Hindsight Bias

Hindsight bias refers to the tendency to perceive past events as more predictable than they actually were (Roese & Vohs, 2012). There are cognitive and motivational explanations for why we ascribe so much certainty to knowing the outcome of an event only once the event is completed.

 Hindsight Bias Example

When sports fans know the outcome of a game, they often question certain decisions coaches make that they otherwise would not have questioned or second-guessed.

And fans are also quick to remark that they knew their team was going to win or lose, but, of course, they only make this statement after their team actually did win or lose.

Although research studies have demonstrated that the hindsight bias isn’t necessarily mitigated by pure recognition of the bias (Pohl & Hell, 1996), you can still make a conscious effort to remind yourself that you can’t predict the future and motivate yourself to consider alternate explanations.

It’s important to do all we can to reduce this bias because when we are overly confident about our ability to predict outcomes, we might make future risky decisions that could have potentially dangerous outcomes.

Building on Tversky and Kahneman’s growing list of heuristics, researchers Baruch Fischhoff and Ruth Beyth-Marom (1975) were the first to directly investigate the hindsight bias in the empirical setting.

The team asked participants to judge the likelihood of several different outcomes of former U.S. president Richard Nixon’s visit to Beijing and Moscow.

After Nixon returned to the States, participants were asked to recall the likelihood of each outcome they had initially assigned.

Fischhoff and Beyth found that for events that actually occurred, participants greatly overestimated the initial likelihood they assigned to those events.

That same year, Fischhoff (1975) introduced a new method for testing the hindsight bias – one that researchers still use today.

Participants are given a short story with four possible outcomes, and they are told that one is true. When they are then asked to assign the likelihood of each specific outcome, they regularly assign a higher likelihood to whichever outcome they have been told is true, regardless of how likely it actually is.

But hindsight bias does not only exist in artificial settings. In 1993, Dorothee Dietrich and Matthew Olsen asked college students to predict how the U.S. Senate would vote on the confirmation of Supreme Court nominee Clarence Thomas.

Before the vote, 58% of participants predicted that he would be confirmed, but after his actual confirmation, 78% of students said that they thought he would be approved – a prime example of the hindsight bias. And this form of bias extends beyond the research world.

From the cognitive perspective, hindsight bias may result from distortions of memories of what we knew or believed to know before an event occurred (Inman, 2016).

It is easier to recall information that is consistent with our current knowledge, so our memories become warped in a way that agrees with what actually did happen.

Motivational explanations of the hindsight bias point to the fact that we are motivated to live in a predictable world (Inman, 2016).

When surprising outcomes arise, our expectations are violated, and we may experience negative reactions as a result. Thus, we rely on the hindsight bias to avoid these adverse responses to certain unanticipated events and reassure ourselves that we actually did know what was going to happen.

Self-Serving Bias

Self-serving bias is the tendency to take personal responsibility for positive outcomes and blame external factors for negative outcomes.

You would be right to ask how this is similar to the fundamental attribution error (Ross, 1977), which identifies our tendency to overemphasize internal factors for other people’s behavior while attributing external factors to our own.

The distinction is that the self-serving bias is concerned with valence, that is, with how good or bad an event or situation is, and it applies only to events in which you are the actor.

In other words, if a driver cuts in front of you as the light turns green, the fundamental attribution error might cause you to think that they are a bad person and not consider the possibility that they were late for work.

On the other hand, the self-serving bias is exercised when you are the actor. In this example, you would be the driver cutting in front of the other car, which you would tell yourself is because you are late (an external attribution to a negative event) as opposed to it being because you are a bad person.

From sports to the workplace, self-serving bias is incredibly common. For example, athletes are quick to take responsibility for personal wins, attributing their successes to their hard work and mental toughness, but point to external factors, such as unfair calls or bad weather, when they lose (Allen et al., 2020).

In the workplace, people attribute being hired for a job to internal factors but being fired to external ones (Furnham, 1982). And in the office itself, workplace conflicts are given external attributions, while successes, whether a persuasive presentation or a promotion, are given internal explanations (Walther & Bazarova, 2007).

Additionally, self-serving bias is more prevalent in individualistic cultures, which place emphasis on self-esteem levels and individual goals, and it is less prevalent among individuals with depression (Mezulis et al., 2004), who are more likely to take responsibility for negative outcomes.

Overcoming this bias can be difficult because it is at the expense of our self-esteem. Nevertheless, practicing self-compassion – treating yourself with kindness even when you fall short or fail – can help reduce the self-serving bias (Neff, 2003).

The leading explanation for the self-serving bias is that it is a way of protecting our self-esteem (similar to one of the explanations for the confirmation bias).

We are quick to take credit for positive outcomes and divert the blame for negative ones to boost and preserve our individual ego, which is necessary for confidence and healthy relationships with others (Heider, 1982).

Another theory argues that self-serving bias occurs when surprising events arise. When certain outcomes run counter to our expectations, we ascribe external factors, but when outcomes are in line with our expectations, we attribute internal factors (Miller & Ross, 1975).

An extension of this theory asserts that we are naturally optimistic, so negative outcomes come as a surprise and receive external attributions as a result.

Anchoring Bias

Anchoring bias is closely related to the decision-making process. It occurs when we rely too heavily on either pre-existing information or the first piece of information (the anchor) when making a decision.

For example, if you first see a T-shirt that costs $1,000 and then see a second one that costs $100, you’re more likely to see the second shirt as cheap than you would if the first shirt you saw had cost $120. Here, the price of the first shirt influences how you view the second.

 Anchoring Bias Example

Sarah is looking to buy a used car. The first dealership she visits has a used sedan listed for $19,000. Sarah takes this initial listing price as an anchor and uses it to evaluate prices at other dealerships.

When she sees another similar used sedan priced at $18,000, that price seems like a good bargain compared to the $19,000 anchor price she saw first, even though the actual market value is closer to $16,000.

When Sarah finds a comparable used sedan priced at $15,500, she continues perceiving that price as cheap compared to her anchored reference price.

Ultimately, Sarah purchases the $18,000 sedan, overlooking that all of the prices seemed like bargains only in relation to the initial high anchor price.

The key elements that demonstrate anchoring bias here are:

  • Sarah establishes an initial reference price based on the first listing she sees ($19k)
  • She uses that initial price as her comparison/anchor for evaluating subsequent prices
  • This biases her perception of the market value of the cars she looks at after the initial anchor is set
  • She makes a purchase decision aligned with her anchored expectations rather than a more objective market value

Multiple theories seek to explain the existence of this bias.

One theory, known as anchoring and adjustment, argues that once an anchor is established, people insufficiently adjust away from it to arrive at their final answer, and so their final guess or decision is closer to the anchor than it otherwise would have been (Tversky & Kahneman, 1992).

And when people experience a greater cognitive load – the amount of information the working memory must hold at a given time, as in a difficult decision rather than an easy one – they are more susceptible to the effects of anchoring.

Another theory, selective accessibility, holds that even though we assume the anchor is not a suitable answer (or a suitable price, returning to the T-shirt example), we evaluate the second stimulus by looking for ways in which it is similar to or different from the anchor, which produces the anchoring effect (Mussweiler & Strack, 1999).

A final theory posits that providing an anchor shifts a person's attitudes to be more favorable toward the anchor, which then biases future answers toward characteristics similar to those of the initial anchor (Wegener et al., 2001).

Although there are many different theories for why we experience anchoring bias, they all agree that it affects our decisions in real ways.

This bias first came to light in one of Tversky and Kahneman's (1974) initial experiments. They asked participants to compute the product of the numbers 1 through 8 in five seconds, either as 1x2x3… or as 8x7x6…

Participants did not have enough time to calculate the answer, so they had to estimate based on their first few calculations.

They found that those who computed the small multiplications first (i.e., 1x2x3…) gave a median estimate of 512, but those who computed the larger multiplications first gave a median estimate of 2,250 (although the actual answer is 40,320).

This demonstrates how the initial few calculations influenced the participants' final answers.
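A few lines of Python reproduce the setup. The three-step cutoff is an assumption for illustration; the original study does not specify how many partial products participants managed in five seconds.

    import math

    digits = list(range(1, 9))

    def partial_products(seq, steps=3):
        """The first few running products, as a rushed participant might compute."""
        prod, out = 1, []
        for n in seq[:steps]:
            prod *= n
            out.append(prod)
        return out

    print(partial_products(digits))        # [1, 2, 6]    -> low anchor
    print(partial_products(digits[::-1]))  # [8, 56, 336] -> high anchor
    print(math.prod(digits))               # 40320, the true product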

Availability Bias

Availability bias (also commonly referred to as the availability heuristic) refers to the tendency to think that examples of things that readily come to mind are more common than is actually the case.

In other words, information that comes to mind faster influences the decisions we make about the future. And just like with the hindsight bias, this bias is related to an error of memory.

But instead of being a memory fabrication, it is an overemphasis on a certain memory.

In the workplace, if someone is being considered for a promotion but their boss recalls one bad thing from years ago that left a lasting impression, that one event might have an outsized influence on the final decision.

Another common example is buying lottery tickets: the lifestyle and benefits of winning – and the emotions associated with winning or seeing others win – come to mind far more readily than the complex probability calculation of actually winning the lottery (Cherry, 2019).
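That probability calculation is easy to compute but hard to feel. As a hedged illustration, assuming a generic pick-6-of-49 lottery (the text names no particular game), the odds work out as follows:

    from math import comb

    # Number of distinct tickets in a hypothetical 6-of-49 draw.
    possible_tickets = comb(49, 6)
    print(f"{possible_tickets:,}")        # 13,983,816
    print(f"{1 / possible_tickets:.1e}")  # 7.2e-08 chance per ticket

A vivid memory of a jackpot winner is far more available to the mind than a one-in-fourteen-million figure.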

A final common example used to demonstrate the availability heuristic describes how seeing several television shows or news reports about shark attacks (or anything else sensationalized by the news, such as serial killers or plane crashes) might make you think such incidents are relatively common when they are not.

Regardless, this thinking might make you less inclined to go in the water the next time you go to the beach (Cherry, 2019).

As with most cognitive biases, the best way to overcome them is by recognizing the bias and being more cognizant of your thoughts and decisions.

And because we fall victim to this bias when our brain relies on quick mental shortcuts in order to save time, slowing down our thinking and decision-making process is a crucial step to mitigating the effects of the availability heuristic.

Researchers think this bias occurs because the brain is constantly trying to minimize the effort necessary to make decisions, and so we rely on certain memories – ones that we can recall more easily – instead of having to endure the complicated task of calculating statistical probabilities.

Two main types of memories are easier to recall: 1) those that more closely align with the way we see the world and 2) those that evoke more emotion and leave a more lasting impression.

This first type of memory was identified in 1973, when Tversky and Kahneman, our cognitive bias pioneers, conducted a study in which they asked participants if more words begin with the letter K or if more words have K as their third letter.

Although many more words have K as their third letter, 70% of participants said that more words begin with K, because recalling words by their first letter is not only easier but also aligns more closely with how they see the world (we index words by their first letter far more readily than by their third).
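You can check the underlying frequencies yourself. The sketch below assumes a newline-delimited English word list at /usr/share/dict/words, a common location on Unix systems; any large word list will do.

    # Count words starting with "k" vs. words with "k" as the third letter.
    with open("/usr/share/dict/words") as f:
        words = [w.strip().lower() for w in f]

    first_letter_k = sum(w.startswith("k") for w in words)
    third_letter_k = sum(len(w) >= 3 and w[2] == "k" for w in words)

    print(first_letter_k, third_letter_k)
    # On typical word lists the third-letter count is larger, as Tversky and
    # Kahneman reported, yet first-letter examples are far easier to recall.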

In terms of the second type of memory, the same duo ran an experiment in 1983, ten years later, in which half the participants were asked to estimate the likelihood that a massive flood would occur somewhere in North America, while the other half estimated the likelihood of a flood caused by an earthquake in California (Tversky & Kahneman, 1983).

Although the latter is much less likely, participants still rated it as more probable, because they could recall specific, emotionally charged earthquakes hitting California, largely due to the news coverage such events receive.
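This is an instance of the conjunction rule: a flood caused by a California earthquake is a subset of "a flood somewhere in North America", so it can never be the more probable event. A tiny sketch with made-up numbers (illustrative only, not estimates from the study):

    # P(flood and earthquake-cause) = P(flood) * P(earthquake-cause | flood),
    # which can never exceed P(flood), since no probability exceeds 1.
    p_flood = 0.010             # flood somewhere in North America (made up)
    p_quake_given_flood = 0.05  # share of such floods caused by a CA quake (made up)

    p_quake_and_flood = p_flood * p_quake_given_flood
    assert p_quake_and_flood <= p_flood
    print(p_quake_and_flood)    # 0.0005 -- necessarily the rarer event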

Together, these studies highlight how memories that are easier to recall greatly influence our judgments and perceptions about future events.

Inattentional Blindness

A final popular form of cognitive bias is inattentional blindness. This occurs when a person fails to notice a stimulus that is in plain sight because their attention is directed elsewhere.

For example, while driving a car, you might be so focused on the road ahead of you that you completely fail to notice a car swerve into your lane of traffic.

Because your attention is directed elsewhere, you aren’t able to react in time, potentially leading to a car accident. Experiencing inattentional blindness has its obvious consequences (as illustrated by this example), but, like all biases, it is not impossible to overcome.

Many theories seek to explain why we experience this form of cognitive bias. In reality, it is probably some combination of these explanations.

One explanation, conspicuity, holds that certain sensory stimuli (such as bright colors) and cognitive stimuli (such as something familiar) are more likely to be processed, so stimuli that fit neither category might be missed.

The mental workload theory describes how when we focus a lot of our brain’s mental energy on one stimulus, we are using up our cognitive resources and won’t be able to process another stimulus simultaneously.

Similarly, some psychologists explain how we attend to different stimuli with varying levels of attentional capacity, which might affect our ability to process multiple stimuli simultaneously.

In other words, an experienced driver might be able to see that car swerve into the lane because they are using fewer mental resources to drive, whereas a beginner driver might be using more resources to focus on the road ahead and unable to process that car swerving in.

A final explanation argues that because our attentional and processing resources are limited, our brain dedicates them to what fits into our schemas or our cognitive representations of the world (Cherry, 2020).

Thus, when an unexpected stimulus comes into our line of sight, we might not be able to process it on the conscious level. The following example illustrates how this might happen.

The most famous study to demonstrate the inattentional blindness phenomenon is the invisible gorilla study (Most et al., 2001). This experiment asked participants to watch a video of two groups passing a basketball and count how many times the white team passed the ball.

Participants were generally able to report the number of passes accurately, but what they failed to notice was a gorilla walking directly through the middle of the circle.

Because this would not be expected, and because our brain is using up its resources to count the number of passes, we completely fail to process something right before our eyes.

A real-world example of inattentional blindness occurred in 1995 when Boston police officer Kenny Conley was chasing a suspect and ran by a group of officers who were mistakenly holding down an undercover cop.

Conley was convicted of perjury and obstruction of justice because he supposedly saw the fight between the undercover officer and the other officers and lied about it to protect them. He stood by his word that he really hadn't seen it – due to inattentional blindness – and was ultimately exonerated (Pickel, 2015).

The key to overcoming inattentional blindness is to maximize your attention by avoiding distractions such as checking your phone. It is also important to pay attention to what other people might not notice: if you are that driver, don't assume that others can see you.

By working on expanding your attention and minimizing unnecessary distractions that will use up your mental resources, you can work towards overcoming this bias.

Preventing Cognitive Bias

As we know, recognizing these biases is the first step to overcoming them. But there are other small strategies we can follow in order to train our unconscious mind to think in different ways.

From strengthening our memory and minimizing distractions to slowing down our decision-making and improving our reasoning skills, we can work towards overcoming these cognitive biases.

An individual can evaluate his or her own thought process – a practice known as metacognition ("thinking about thinking") – which provides an opportunity to combat bias (Flavell, 1979).

This multifactorial process involves (Croskerry, 2003):

  • acknowledging the limitations of memory,
  • seeking perspective while making decisions,
  • being able to self-critique, and
  • choosing strategies to prevent cognitive error.

Many of the debiasing strategies described here are also known as cognitive forcing strategies: mental tools used to force unbiased decision-making.

The History of Cognitive Bias

The term cognitive bias was first coined in the 1970s by Israeli psychologists Amos Tversky and Daniel Kahneman, who used this phrase to describe people’s flawed thinking patterns in response to judgment and decision problems (Tversky & Kahneman, 1974).

Tversky and Kahneman’s research program, the heuristics and biases program, investigated how people make decisions given limited resources (for example, limited time to decide which food to eat or limited information to decide which house to buy).

As a result of these limited resources, people are forced to rely on heuristics or quick mental shortcuts to help make their decisions.

Tversky and Kahneman wanted to understand the biases associated with this judgment and decision-making process.

To do so, the two researchers relied on a research paradigm that presented participants with some type of reasoning problem with a computed normative answer (they used probability theory and statistics to compute the expected answer).

Participants’ responses were then compared with the predetermined solution to reveal the systematic deviations in the mind.

After running several experiments with countless reasoning problems, the researchers were able to identify numerous norm violations that result when our minds rely on these cognitive biases to make decisions and judgments (Wilke & Mata, 2012).

Key Takeaways

  • Cognitive biases are unconscious errors in thinking that arise from problems related to memory, attention, and other mental mistakes.
  • These biases result from our brain’s efforts to simplify the incredibly complex world in which we live.
  • Confirmation bias, hindsight bias, the mere exposure effect, self-serving bias, the base rate fallacy, anchoring bias, availability bias, the framing effect, inattentional blindness, the ecological fallacy, and the false consensus effect are some of the most common examples of cognitive bias.
  • Cognitive biases directly affect our safety, interactions with others, and how we make judgments and decisions in our daily lives.
  • Although these biases are unconscious, there are small steps we can take to train our minds to adopt a new pattern of thinking and mitigate the effects of these biases.

Allen, M. S., Robson, D. A., Martin, L. J., & Laborde, S. (2020). Systematic review and meta-analysis of self-serving attribution biases in the competitive context of organized sport. Personality and Social Psychology Bulletin, 46 (7), 1027-1043.

Casad, B. (2019). Confirmation bias . Retrieved from https://www.britannica.com/science/confirmation-bias

Cherry, K. (2019). How the availability heuristic affects your decision-making . Retrieved from https://www.verywellmind.com/availability-heuristic-2794824

Cherry, K. (2020). Inattentional blindness can cause you to miss things in front of you . Retrieved from https://www.verywellmind.com/what-is-inattentional-blindness-2795020

Dietrich, D., & Olson, M. (1993). A demonstration of hindsight bias using the Thomas confirmation vote. Psychological Reports, 72 (2), 377-378.

Fischhoff, B. (1975). Hindsight is not equal to foresight: The effect of outcome knowledge on judgment under uncertainty. Journal of Experimental Psychology: Human Perception and Performance, 1 (3), 288.

Fischhoff, B., & Beyth, R. (1975). I knew it would happen: Remembered probabilities of once—future things. Organizational Behavior and Human Performance, 13 (1), 1-16.

Furnham, A. (1982). Explanations for unemployment in Britain. European Journal of Social Psychology, 12 (4), 335-352.

Heider, F. (1982). The psychology of interpersonal relations . Psychology Press.

Inman, M. (2016). Hindsight bias . Retrieved from https://www.britannica.com/topic/hindsight-bias

Lang, R. (2019). What is the difference between conscious and unconscious bias? : Faqs. Retrieved from https://engageinlearning.com/faq/compliance/unconscious-bias/what-is-the-difference-between-conscious-and-unconscious-bias/

Luippold, B., Perreault, S., & Wainberg, J. (2015). Auditor’s pitfall: Five ways to overcome confirmation bias . Retrieved from https://www.babson.edu/academics/executive-education/babson-insight/finance-and-accounting/auditors-pitfall-five-ways-to-overcome-confirmation-bias/

Mezulis, A. H., Abramson, L. Y., Hyde, J. S., & Hankin, B. L. (2004). Is there a universal positivity bias in attributions? A meta-analytic review of individual, developmental, and cultural differences in the self-serving attributional bias. Psychological Bulletin, 130 (5), 711.

Miller, D. T., & Ross, M. (1975). Self-serving biases in the attribution of causality: Fact or fiction? Psychological Bulletin, 82 (2), 213.

Most, S. B., Simons, D. J., Scholl, B. J., Jimenez, R., Clifford, E., & Chabris, C. F. (2001). How not to be seen: The contribution of similarity and selective ignoring to sustained inattentional blindness. Psychological Science, 12 (1), 9-17.

Mussweiler, T., & Strack, F. (1999). Hypothesis-consistent testing and semantic priming in the anchoring paradigm: A selective accessibility model. Journal of Experimental Social Psychology, 35 (2), 136-164.

Neff, K. (2003). Self-compassion: An alternative conceptualization of a healthy attitude toward oneself. Self and Identity, 2 (2), 85-101.

Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon in many guises. Review of General Psychology, 2 (2), 175-220.

Orzan, G., Zara, I. A., & Purcarea, V. L. (2012). Neuromarketing techniques in pharmaceutical drugs advertising. A discussion and agenda for future research. Journal of Medicine and Life, 5 (4), 428.

Pickel, K. L. (2015). Eyewitness memory. The handbook of attention , 485-502.

Pohl, R. F., & Hell, W. (1996). No reduction in hindsight bias after complete information and repeated testing. Organizational Behavior and Human Decision Processes, 67 (1), 49-58.

Roese, N. J., & Vohs, K. D. (2012). Hindsight bias. Perspectives on Psychological Science, 7 (5), 411-426.

Ross, L. (1977). The intuitive psychologist and his shortcomings: Distortions in the attribution process. In Advances in experimental social psychology (Vol. 10, pp. 173-220). Academic Press.

Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5 (2), 207-232.

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185 (4157), 1124-1131.

Tversky, A., & Kahneman, D. (1983). Extensional versus intuitive reasoning: The conjunction fallacy in probability judgment. Psychological Review, 90 (4), 293.

Tversky, A., & Kahneman, D. (1992). Advances in prospect theory: Cumulative representation of uncertainty. Journal of Risk and Uncertainty, 5 (4), 297-323.

Walther, J. B., & Bazarova, N. N. (2007). Misattribution in virtual groups: The effects of member distribution on self-serving bias and partner blame. Human Communication Research, 33 (1), 1-26.

Wason, P. C. (1960). On the failure to eliminate hypotheses in a conceptual task. Quarterly Journal of Experimental Psychology, 12 (3), 129-140.

Wegener, D. T., Petty, R. E., Detweiler-Bedell, B. T., & Jarvis, W. B. G. (2001). Implications of attitude change theories for numerical anchoring: Anchor plausibility and the limits of anchor effectiveness. Journal of Experimental Social Psychology, 37 (1), 62-69.

Wilke, A., & Mata, R. (2012). Cognitive bias. In Encyclopedia of human behavior (pp. 531-535). Academic Press.

Further Information

Test yourself for bias.

  • Project Implicit (IAT Test), from Harvard University
  • Implicit Association Test, from the Social Psychology Network
  • Test Yourself for Hidden Bias, from Teaching Tolerance
  • How the Concept of Implicit Bias Came Into Being, with Dr. Mahzarin Banaji, Harvard University, author of Blindspot: Hidden Biases of Good People (5:28 minutes; includes transcript)
  • Understanding Your Racial Biases, with John Dovidio, PhD, Yale University, from the American Psychological Association (11:09 minutes; includes transcript)
  • Talking Implicit Bias in Policing, with Jack Glaser, Goldman School of Public Policy, University of California Berkeley (21:59 minutes)
  • Implicit Bias: A Factor in Health Communication, with Dr. Winston Wong, Kaiser Permanente (19:58 minutes)
  • Bias, Black Lives and Academic Medicine, Dr. David Ansell on Your Health Radio, August 1, 2015 (21:42 minutes)
  • Uncovering Hidden Biases, Google talk with Dr. Mahzarin Banaji, Harvard University
  • Impact of Implicit Bias on the Justice System (9:14 minutes)
  • Students Speak Up: What Bias Means to Them (2:17 minutes)
  • Weight Bias in Health Care, from Yale University (16:56 minutes)
  • Gender and Racial Bias in Facial Recognition Technology (4:43 minutes)

Journal Articles

  • Mitchell, G. (2018). An implicit bias primer. Virginia Journal of Social Policy & the Law, 25, 27–59.
  • Nosek, B. A., Greenwald, A. G., & Banaji, M. R. (2007). The Implicit Association Test at age 7: A methodological and conceptual review. Automatic Processes in Social Thinking and Behavior, 4, 265-292.
  • Hall, W. J., Chapman, M. V., Lee, K. M., Merino, Y. M., Thomas, T. W., Payne, B. K., … & Coyne-Beasley, T. (2015). Implicit racial/ethnic bias among health care professionals and its influence on health care outcomes: A systematic review. American Journal of Public Health, 105 (12), e60-e76.
  • Burgess, D., Van Ryn, M., Dovidio, J., & Saha, S. (2007). Reducing racial bias among health care providers: Lessons from social-cognitive psychology. Journal of General Internal Medicine, 22 (6), 882-887.
  • Boysen, G. A. (2010). Integrating implicit bias into counselor education. Counselor Education & Supervision, 49 (4), 210–227.
  • Christian, S. (2013). Cognitive biases and errors as cause—and journalistic best practices as effect. Journal of Mass Media Ethics, 28 (3), 160–174.
  • Whitford, D. K., & Emerson, A. M. (2019). Empathy intervention to reduce implicit bias in pre-service teachers. Psychological Reports, 122 (2), 670–688.


Contextual Debiasing and Critical Thinking: Reasons for Optimism

  • Published: 26 April 2016
  • Volume 37, pages 103–111 (2018)


Vasco Correia


In this article I argue that most biases in argumentation and decision-making can and should be counteracted. Although biases can prove beneficial in certain contexts, I contend that they are generally maladaptive and need correction. Yet critical thinking alone seems insufficient to mitigate biases in everyday contexts. I develop a contextualist approach, according to which cognitive debiasing strategies need to be supplemented by extra-psychic devices that rely on social and environmental constraints in order to promote rational reasoning. Finally, I examine several examples of contextual debiasing strategies and show how they can contribute to enhance critical thinking at a cognitive level.


Notes

Despite mounting criticism, Meliorism remains the dominant position among philosophers and psychologists. See for example Elster (2007), Evans (2007), Kahneman (2011), Kenyon and Beaulac (2014), Stanovich (2011), Larrick (2004), Croskerry et al. (2013a), Tetlock (2005), Wilson and Brekke (1994).

Taylor (1989, p. 237) explicitly acknowledges this aspect: “Unrealistic optimism might lead people to ignore legitimate risks in their environment and to fail to take measures to offset those risks”.

See, for example, Aberdein (2010) and Cohen (2009).

Psychologists distinguish between two kinds of cognitive illusions: motivational (or “hot”) biases, on the one hand, which stem from the influence of emotions and interests on cognitive processes, and cognitive (or “cold”) biases, on the other, which stem from inferential errors due to cognitive malfunctioning (Kunda 1990; Nisbett 1993).

Cf. Fisher (2011, p. 4), Lau (2011, p. 2), Siegel (1988, p. 32).

The “tools” metaphor can also be found in other approaches that stress the importance of non-cognitive (or extra-psychic) devices as means to promote rationality: Soll et al. (2015) refer to “debiasing tools”, Hogarth (2001) to “decision-making tools”, Elster (1989) to the “toolbox of mechanisms”, and Gigerenzer and Selten (2002) to the “adaptive toolbox”.

See, for example, Kenyon and Beaulac (2014), Larrick (2004), Soll et al. (2015).

References

Aberdein A (2010) Virtue in argument. Argumentation 24(2):165–179


Ainslie G (2005) Précis of breakdown of will. Behav Brain Sci 28:635–673


Anderson C, Sechler E (1986) Effects of explanation and counterexplanation on the development and use of social theories. J Pers Soc Psychol 50:24–54

Arkes H (1981) Impediments to accurate clinical judgment and possible ways to minimize their impact. J Consult Clin Psychol 49:323–330

Arkes H (1991) Costs and benefits of judgment errors. Psychol Bull 110(13):486–498

Brest P, Krieger L (2010) Problem solving, decision making and professional judgment. Oxford University Press, Oxford

Budden A, Tregenza T, Aarssen L, Koricheva J, Leimu R, Lortie CJ (2008) Double-blind review favours increased representation of female authors. Trends Ecol Evol 23(1):4–6

Cohen J (1981) Can human irrationality be experimentally demonstrated? In: Adler J, Rips L (eds) Reasoning. Cambridge University Press, Cambridge

Cohen D (2009) Keeping an open mind and having a sense of proportion as virtues in argumentation. Cogency 1(2):49–64

Croskerry P, Singhal G, Mamede S (2013a) Cognitive debiasing 1: origins of bias and theory of debiasing. Qual Saf 22(2):58–64

Croskerry P, Singhal G, Mamede S (2013b) Cognitive debiasing 2: impediments to and strategies for change. Qual Saf 22(2):65–72

Davidson D (1985) Incoherence and irrationality. Dialectica 39(4):345–353

Dick Cheney’s Suite Demands (2006) Retrieved January 8, 2016, from http://www.thesmokinggun.com/file/dick-cheneys-suite-demands

Dunning D (2009) Disbelief and the neglect of environmental context. Behav Brain Sci 32:517–518

Elster J (1989) Nuts and bolts for the social sciences. Cambridge University Press, Cambridge


Elster J (2007) Explaining social behavior. Cambridge University Press, Cambridge

Engel P (ed) (2000) Believing and accepting. Kluwer, Dordrecht

Evans J (2007) Hypothetical thinking. Psychology Press, New York

Fischhoff B (1982) Debiasing. In: Kahneman D, Slovic P, Tversky A (eds) Judgment under uncertainty. Cambridge University Press, Cambridge

Fischhoff B (2002) Heuristics and biases in application. In: Gilovich T, Griffin D, Kahneman D (eds) Heuristics and biases. Cambridge University Press, Cambridge

Fisher A (2011) Critical thinking: an introduction. Cambridge University Press, Cambridge

Galinsky A, Moskowitz G, Gordon B (2000) Perspective taking. J Pers Soc Psychol 784:708–724

Gigerenzer G (2008) Rationality for mortals. Oxford University Press, Oxford

Gigerenzer G, Selten R (2002) Bounded rationality. MIT Press, Cambridge

Gigerenzer G, Todd P (2000) Précis of simple heuristics that make us smart. Behav Brain Sci 23:727–780

Hirt E, Markman K (1995) Multiple explanation: a consider-an-alternative strategy for debiasing judgments. J Pers Soc Psychol 69:1069–1086

Hogarth R (2001) Educating intuition. University of Chicago Press, Chicago

Johnson R, Blair A (2006) Logical self-defense. International Debate Association, New York

Kahneman D (2011) Thinking, fast and slow. Farrar, Straus and Giroux, New York

Kenyon T, Beaulac G (2014) Critical thinking education and debiasing. Informal Log 34(4):341–363

Kunda Z (1990) The case for motivated reasoning. Psychol Bull 108(3):480–498

Larrick R (2004) Debiasing. In: Koehler D, Harvey N (eds) The Blackwell handbook of judgment and decision making. Blackwell Publishing, Oxford

Lau J (2011) An introduction to critical thinking and creativity. Wiley, New Jersey

Lilienfeld S, Ammirati R, Landfield K (2009) Giving debiasing away. Perspect Psychol Sci 4(4):390–398

Lord G, Lepper R, Preston E (1984) Considering the opposite: a corrective strategy for social judgment. J Pers Soc Psychol 47:1231–1243

McKay R, Dennett D (2009) The evolution of misbelief. Behav Brain Sci 32:493–561

Mercier H, Sperber D (2011) Why do humans reason? Behav Brain Sci 34:57–111

Mussweiler T, Strack F, Pfeiffer T (2000) Overcoming the inevitable anchoring effect. Pers Soc Psychol Bull 26:1142–1150

Myers D (1975) Discussion-induced attitude-polarization. Hum Relat 28:699–714

Nisbett R (ed) (1993) Rules for reasoning. Erlbaum, Hillsdale

Oaksford M, Chater N (2009) Précis of Bayesian Rationality. Behav Brain Sci 32:69–120

Paluk E, Green D (2009) Prejudice reduction: what works? A review and assessment of research and practice. Annu Rev Psychol 60:339–367

Paul W (1986) Critical thinking in the strong and the role of argumentation in everyday life. In: Eemeren F, Grootendorst R, Blair A, Willard C (eds) Argumentation. Foris Publications, Dordrecht

Pelham B, Neter E (1995) The effect of motivation of judgment depends on the difficulty of the judgment. J Pers Soc Psychol 68(4):581–594

Pronin E, Lin D, Ross L (2002) The bias blind spot: perceptions of bias in self versus others. Pers Soc Psychol Bull 28:369–381

Rawls J (2000) Lectures on the history of political philosophy. Harvard University Press, Cambridge

Sanna L, Schwarz N, Stocker S (2002) When debiasing backfires. J Exp Psychol 28:497–502

Siegel H (1988) Educating reason. Routledge, New York

Soll J, Milkman K, Payne J (2015) Outsmart your own biases. Harv Bus Rev 93:65–71

Stanovich K (2005) The robot’s rebellion. The University of Chicago Press, Chicago

Stanovich K (2011) Rationality and the reflective mind. Oxford University Press, New York

Stanovich K, West R (2008) On the relative independence of thinking biases and cognitive ability. J Pers Soc Psychol 94:672–695

Stein E (1996) Without good reason. Clarendon Press, Oxford

Stich S (1990) The fragmentation of reason. MIT Press, Cambridge

Sunstein C (2003) Why societies need dissent. Harvard University Press, Harvard

Sunstein C, Schkade D, Ellman L (2004) Ideological voting on federal courts of appeal. Va Law Rev 90(1):301–354

Taber C, Lodge M (2006) Motivated skepticism in the evaluation of political beliefs. Am J Polit Sci 50(3):755–769

Taylor S (1989) Positive illusions. Basic Books, New York

Taylor S, Brown J (1988) Illusion and well-being. Psychol Bull 103(2):193–210

Tetlock P (2002) Intuitive politicians, theologians, and prosecutors. In: Gilovich T, Griffin D, Kahneman D (eds) Heuristics and biases. Cambridge University Press, Cambridge

Tetlock P (2005) Expert political judgment. Princeton University Press, Princeton

Tetlock P, Boettger R (1989) Accountability. J Pers Soc Psychol 57:388–398

Thagard P (2011) Critical thinking and informal logic. Informal Log 31(3):152–170

Thaler R, Sunstein C (2008) Nudge. Yale University Press, New Haven

Tversky A, Kahneman D (2008) Extensional versus intuitive reasoning. In: Gilovich T, Griffin D, Kahneman D (eds) Heuristics and biases. Cambridge University Press, Cambridge

Willingham D (2007) Critical thinking: why is it so hard to teach? Am Educ 31(2):8–19

Wilson T, Brekke N (1994) Mental contamination and mental correction. Psychol Bull 116(1):117–142

Wilson T, Centerbar D, Brekke N (2002) Mental contamination and the debiasing problem. In: Gilovich T, Griffin D, Kahneman D (eds) Heuristics and biases. Cambridge University Press, Cambridge


Acknowledgments

I would like to thank the editor and two anonymous reviewers for their constructive comments. Work on this article was conducted under the grant SFRH/BPD/101744/2014 by the “Portuguese Foundation for Science and Technology” (FCT), as part of the project “Values in argumentative discourse” (PTDC/MHC-FIL/0521/2014).

Author information

Authors and Affiliations

ArgLab, IFILNOVA, Nova Institute of Philosophy, Universidade Nova de Lisboa, Av. De Berna 26, 4º piso, 1069-061, Lisbon, Portugal

Vasco Correia


About this article

Correia, V. Contextual Debiasing and Critical Thinking: Reasons for Optimism. Topoi 37, 103–111 (2018). https://doi.org/10.1007/s11245-016-9388-x


Keywords: Contextualism · Critical thinking · Rationality

Kendall College of Art & Design

Critical Thinking & Evaluating Information


What is Bias?

[Image: sources of bias bubble]

Biases also play a role in how you approach all information. The short video below provides definitions of 12 types of cognitive biases.

Two forms of bias are of particular importance in today's information-laden landscape: implicit bias and confirmation bias.

Implicit Bias & Confirmation Bias

Implicit / Unconscious Bias 

"Original definition (neutral) - Any personal preference, attitude, or expectation that unconsciously affects a person's outlook or behaviour.

Current definition (negative) - Unconscious favouritism towards or prejudice against people of a particular race, gender, or group that influences one's actions or perceptions; an instance of this."

"unconscious bias, n." OED Online, Oxford University Press, December 2020, www.oed.com/view/Entry/88686003 .

"Thoughts and feelings are “implicit” if we are unaware of them or mistaken about their nature. We have a bias when, rather than being neutral, we have a preference for (or aversion to) a person or group of people. Thus, we use the term “implicit bias” to describe when we have attitudes towards people or associate stereotypes with them without our conscious knowledge." 

https://perception.org/research/implicit-bias/

Confirmation Bias – "Originating in the field of psychology; the tendency to seek or favour new information which supports one’s existing theories or beliefs, while avoiding or rejecting that which disrupts them." 

This definition was added to the Oxford English Dictionary in 2019.

"confirmation, n." OED Online, Oxford University Press, December 2020, www.oed.com/view/Entry/38852. 

Simply put, confirmation bias is the tendency to seek out and/or interpret new information as confirmation of one's existing beliefs or theories, and to exclude contradictory or opposing information or points of view.

Put Bias in Check!

[Image: who, what, when, where, why, how blocks]

Now that you are aware of bias – your personal biases and the bias that can be found in sources of information – you can put it in check. Approach information objectively and neutrally, and evaluate it critically. Numerous tools included in this course can help you do this, like the critical thinking cheat sheet in the previous module.



13 Types of Common Cognitive Biases That Might Be Impairing Your Judgment

Which of these sway your thinking the most?

Kendra Cherry, MS, is a psychosocial rehabilitation specialist, psychology educator, and author of the "Everything Psychology Book."


In this article:

  • The confirmation bias
  • The hindsight bias
  • The anchoring bias
  • The misinformation effect
  • The actor-observer bias
  • The false consensus effect
  • The halo effect
  • The self-serving bias
  • The availability heuristic
  • The optimism bias
  • Other kinds of cognitive bias

Although we like to believe that we're rational and logical, the fact is that we are continually under the influence of cognitive biases. These biases distort thinking, influence beliefs, and sway the decisions and judgments that people make each and every day.

Sometimes, cognitive biases are fairly obvious. You might even find that you recognize these tendencies in yourself or others. In other cases, these biases are so subtle that they are almost impossible to notice.

At a Glance

Attention is a limited resource. This means we can't possibly evaluate every possible detail and event when forming thoughts and opinions. Because of this, we often rely on mental shortcuts that speed up our ability to make judgments, but this can sometimes lead to bias. There are many types of biases—including the confirmation bias, the hindsight bias, and the anchoring bias, just to name a few—that can influence our beliefs and actions daily.

The following are just a few types of cognitive biases that have a powerful influence on how you think, how you feel, and how you behave.

Tara Moore / Getty Images

The confirmation bias is the tendency to listen more often to information that confirms our existing beliefs. Through this bias, people tend to favor information that reinforces the things they already think or believe.

Examples include:

  • Only paying attention to information that confirms your beliefs about issues such as gun control and global warming
  • Only following people on social media who share your viewpoints
  • Choosing news sources that present stories that support your views
  • Refusing to listen to the opposing side
  • Not considering all of the facts in a logical and rational manner

There are a few reasons why this happens. One is that only seeking to confirm existing opinions helps limit mental resources we need to use to make decisions. It also helps protect self-esteem by making people feel that their beliefs are accurate.

People on two sides of an issue can listen to the same story and walk away with different interpretations that they feel validates their existing point of view. This is often indicative that the confirmation bias is working to "bias" their opinions.

The problem with this is that it can lead to poor choices, an inability to listen to opposing views, or even contribute to othering people who hold different opinions.

Things that we can do to help reduce the impact of confirmation bias include being open to hearing others' opinions, specifically looking for and researching opposing views, reading full articles (and not just headlines), questioning the source, and doing the research yourself to see whether the source is reliable.

The hindsight bias is a common cognitive bias that involves the tendency to see events, even random ones, as more predictable than they are. It's also commonly referred to as the "I knew it all along" phenomenon.

Some examples of the hindsight bias include:

  • Insisting that you knew who was going to win a football game once the event is over
  • Believing that you knew all along that one political candidate was going to win an election
  • Saying that you knew you weren't going to win after losing a coin flip with a friend
  • Looking back on an exam and thinking that you knew the answers to the questions you missed
  • Believing you could have predicted which stocks would become profitable

Classic Research

In one classic psychology experiment, college students were asked to predict whether they thought then-nominee Clarence Thomas would be confirmed to the U.S. Supreme Court.

Prior to the Senate vote, 58% of the students thought Thomas would be confirmed. The students were polled again following Thomas's confirmation, and a whopping 78% of students said they had believed Thomas would be confirmed.  

The hindsight bias occurs for a combination of reasons, including our ability to "misremember" previous predictions, our tendency to view events as inevitable, and our tendency to believe we could have foreseen certain events.

The effect of this bias is that it causes us to overestimate our ability to predict events. This can sometimes lead people to take unwise risks.

The anchoring bias is the tendency to be overly influenced by the first piece of information that we hear. Some examples of how this works:

  • The first number voiced during a price negotiation typically becomes the anchoring point from which all further negotiations are based.
  • Hearing a random number can influence estimates on completely unrelated topics.
  • Doctors can become susceptible to the anchoring bias when diagnosing patients. The physician’s first impressions of the patient often create an anchoring point that can sometimes incorrectly influence all subsequent diagnostic assessments.

While the existence of the anchoring bias is well documented, its causes are still not fully understood. Some research suggests that the source of the anchor information may play a role. Other factors such as priming and mood also appear to have an influence.

Like other cognitive biases, anchoring can have an effect on the decisions you make each day. For instance, it can influence how much you are willing to pay for your home. However, it can sometimes lead to poor choices and make it more difficult for people to consider other factors that might also be important.

The misinformation effect is the tendency for memories to be heavily influenced by things that happened after the actual event itself. A person who witnesses a car accident or crime might believe that their recollection is crystal clear, but researchers have found that memory is surprisingly susceptible to even very subtle influences.

For example:

  • Research has shown that simply asking questions about an event can change someone's memories of what happened.
  • Watching television coverage may change how people remember the event.
  • Hearing other people talk about a memory from their perspective may change your memory of what transpired.

Classic Memory Research

In one classic experiment by memory expert Elizabeth Loftus , people who watched a video of a car crash were then asked one of two slightly different questions: “How fast were the cars going when they hit each other?” or “How fast were the cars going when they smashed into each other?”  

When the witnesses were then questioned a week later whether they had seen any broken glass, those who had been asked the “smashed into” version of the question were more likely to report incorrectly that they had seen broken glass.

There are a few factors that may play a role in this phenomenon. New information may get blended with older memories. In other cases, new information may be used to fill in "gaps" in memory.

The effects of misinformation can range from the trivial to much more serious. It might cause you to misremember something you thought happened at work, or it might lead to someone incorrectly identifying the wrong suspect in a criminal case.

The actor-observer bias is the tendency to attribute our actions to external influences and other people's actions to internal ones. The way we perceive others and how we attribute their actions hinges on a variety of variables, but it can be heavily influenced by whether we are the actor or the observer in a situation.

When it comes to our own actions, we are often far too likely to attribute things to external influences. For example:

  • You might complain that you botched an important meeting because you had jet lag.
  • You might say you failed an exam because the teacher posed too many trick questions.

When it comes to explaining other people’s actions, however, we are far more likely to attribute their behaviors to internal causes. For example:

  • A colleague screwed up an important presentation because he’s lazy and incompetent (not because he also had jet lag).
  • A fellow student bombed a test because they lack diligence and intelligence (and not because they took the same test as you with all those trick questions).

While there are many factors that may play a role, perspective plays a key role. When we are the actors in a situation, we are able to observe our own thoughts and behaviors. When it comes to other people, however, we cannot see what they are thinking. This means we focus on situational forces for ourselves, but guess at the internal characteristics that cause other people's actions.

The problem with this is that it often leads to misunderstandings. Each side of a situation is essentially blaming the other side rather than thinking about all of the variables that might be playing a role.

The false consensus effect is the tendency people have to overestimate how much other people agree with their own beliefs, behaviors, attitudes, and values. For example:

  • Thinking that other people share your opinion on controversial topics
  • Overestimating the number of people who are similar to you
  • Believing that the majority of people share your preferences

Researchers believe that the false consensus effect happens for a variety of reasons. First, the people we spend the most time with, our family and friends, do often tend to share very similar opinions and beliefs. Because of this, we start to think that this way of thinking is the majority opinion even when we are with people who are not among our group of family and friends.

Another key reason this cognitive bias trips us up so easily is that believing that other people are just like us is good for our self-esteem. It allows us to feel "normal" and maintain a positive view of ourselves in relation to other people.

This can lead people not only to incorrectly think that everyone else agrees with them—it can sometimes lead them to overvalue their own opinions. It also means that we sometimes don't consider how other people might feel when making choices.

The halo effect is the tendency for an initial impression of a person to influence what we think of them overall. Also known as the "physical attractiveness stereotype" or the "what is beautiful is good" principle, the halo effect influences us, and we use it to influence others, almost every day. For example:

  • Thinking people who are good-looking are also smarter, kinder, and funnier than less attractive people
  • Believing that products marketed by attractive people are also more valuable
  • Thinking that a political candidate who is confident must also be intelligent and competent

One factor that may influence the halo effect is our tendency to want to be correct. If our initial impression of someone was positive, we want to look for proof that our assessment was accurate. It also helps people avoid experiencing cognitive dissonance , which involves holding contradictory beliefs.

This cognitive bias can have a powerful impact in the real world. For example, job applicants perceived as attractive and likable are also more likely to be viewed as competent, smart, and qualified for the job.

The self-serving bias is the tendency for people to give themselves credit for successes but lay the blame for failures on outside causes. When you do well on a project, you probably assume that it's because you worked hard. But when things turn out badly, you are more likely to blame it on circumstances or bad luck.

Some examples of this:

  • Attributing good grades to being smart or studying hard
  • Believing your athletic performance is due to practice and hard work
  • Thinking you got the job because of your merits

The self-serving bias can be influenced by a variety of factors. Age and sex have been shown to play a part. Older people are more likely to take credit for their successes, while men are more likely to pin their failures on outside forces.  

This bias does serve an important role in protecting self-esteem. However, it can often also lead to faulty attributions such as blaming others for our own shortcomings.

The availability heuristic is the tendency to estimate the probability of something happening based on how many examples readily come to mind. Some examples of this:

  • After seeing several news reports of car thefts in your neighborhood, you might start to believe that such crimes are more common than they are.
  • You might believe that plane crashes are more common than they really are because you can easily think of several examples.

It is essentially a mental shortcut designed to save us time when we are trying to determine risk. The problem with relying on this way of thinking is that it often leads to poor estimates and bad decisions.

Smokers who have never known someone to die of a smoking-related illness, for example, might underestimate the health risks of smoking. In contrast, if you have two sisters and five neighbors who have had breast cancer, you might believe it is even more common than statistics suggest.

The optimism bias is a tendency to overestimate the likelihood that good things will happen to us while underestimating the probability that negative events will impact our lives. Essentially, we tend to be too optimistic for our own good.

For example, we may assume that negative events such as accidents, illness, or job loss simply won't happen to us.

The optimism bias has roots in the availability heuristic. Because you can probably think of examples of bad things happening to other people it seems more likely that others will be affected by negative events.

This bias can lead people to take health risks like smoking, eating poorly, or not wearing a seat belt. The bad news is that research has found that this optimism bias is incredibly difficult to reduce.

There is good news, however. This tendency toward optimism helps create a sense of anticipation for the future, giving people the hope and motivation they need to pursue their goals.

Other Kinds of Cognitive Bias

Many other cognitive biases can distort how we perceive the world. Just a partial list:

  • Status quo bias reflects a desire to keep things as they are.
  • Apophenia is the tendency to perceive patterns in random occurrences.
  • Framing is presenting a situation in a way that gives a certain impression.

Keep in Mind

The cognitive biases above are common, but this is only a sampling of the many biases that can affect your thinking. These biases collectively influence much of our thoughts and ultimately, decision making.

Many of these biases are inevitable. We simply don't have the time to evaluate every thought in every decision for the presence of any bias. Understanding these biases is very helpful in learning how they can lead us to poor decisions in life.

Dietrich D, Olson M. A demonstration of hindsight bias using the Thomas confirmation vote. Psychol Rep. 1993;72(2):377-378. doi:10.2466/pr0.1993.72.2.377

Lee KK.  An indirect debiasing method: Priming a target attribute reduces judgmental biases in likelihood estimations .  PLoS ONE . 2019;14(3):e0212609. doi:10.1371/journal.pone.0212609

Saposnik G, Redelmeier D, Ruff CC, Tobler PN. Cognitive biases associated with medical decisions: A systematic review .  BMC Med Inform Decis Mak . 2016;16(1):138. doi:10.1186/s12911-016-0377-1

Furnham A., Boo HC. A literature review of anchoring bias .  The Journal of Socio-Economics.  2011;40(1):35-42. doi:10.1016/j.socec.2010.10.008

Loftus EF.  Leading questions and the eyewitness report .  Cognitive Psychology . 1975;7(4):560-572. doi:10.1016/0010-0285(75)90023-7

Challies DM, Hunt M, Garry M, Harper DN. Whatever gave you that idea? False memories following equivalence training: a behavioral account of the misinformation effect .  J Exp Anal Behav . 2011;96(3):343-362. doi:10.1901/jeab.2011.96-343

Miyamoto R, Kikuchi Y.  Gender differences of brain activity in the conflicts based on implicit self-esteem .  PLoS ONE . 2012;7(5):e37901. doi:10.1371/journal.pone.0037901

Weinstein ND, Klein WM.  Resistance of personal risk perceptions to debiasing interventions .  Health Psychol . 1995;14(2):132–140. doi:10.1037//0278-6133.14.2.132

Gratton G, Cooper P, Fabiani M, Carter CS, Karayanidis F. Dynamics of cognitive control: theoretical bases, paradigms, and a view for the future . Psychophysiology . 2018;55(3). doi:10.1111/psyp.13016


Bias and Critical Thinking

Note: The German version of this entry can be found here: Bias and Critical Thinking (German)

Note: This entry revolves more generally around Bias in science. For more thoughts on Bias and its relation to statistics, please refer to the entry on Bias in statistics .

In short: This entry discusses why science is never objective, and what we can really know.

  • 1 What is bias?
  • 2 Design criteria
  • 3 Bias in gathering data, analysing data and interpreting data
  • 4 Bias and philosophy
  • 5 Critical Theory and Bias
  • 6 Further Information

What is bias?

"The very concept of objective truth is fading out of the world." - George Orwell

A bias is “the action of supporting or opposing a particular person or thing in an unfair way, because of allowing personal opinions to influence your judgment” (Cambridge Dictionary). In other words, bias clouds our judgment, and often our actions, so that we act wrongly. We are all biased, because we are individuals with individual experiences, unconnected from other individuals and groups, or at least believing ourselves to be unconnected.

Recognising bias in research is highly relevant, because bias exposes the myth of the objectivity of research and enables a better recognition and reflection of our flaws and errors. One could add that understanding bias in science matters beyond the empirical, since bias can also highlight flaws in our perceptions and actions as humans. To this end, acknowledging bias is understanding the limitations of oneself. Prominent examples are gender bias and racial bias, which are often rooted in our societies and can be deeply buried in our subconscious.

To be critical researchers it is our responsibility to learn about the diverse biases we have, yet it is beyond this text to explore the subjective human bias we need to overcome. Just so much about the ethics of bias: many would argue that overcoming our biases requires the ability to learn and to question our privileges. Within research we need to recognise that science has been severely and continuously biased against ethnic minorities, women, and many other groups. Institutional and systemic bias are part of the current reality of the system, and we need to do our utmost to change this: there is a need for debiasing science, and our own actions.

While it should not go unnoticed that institutions and systems have already changed to some degree, injustices and inequalities still exist. Most research is conducted in the global north, posing a neo-colonial problem that we are far from solving. Much of academia is still far from having a diverse understanding of people, and systemic and institutional discrimination are part of our daily reality. We are on the path of a very long journey, and there is much to be done concerning bias in constructed institutions.

All this being said, let us now shift our attention to bias in empirical research. Here, we take three different perspectives in order to enable a more reflexive understanding of bias. The first is understanding how different forms of bias relate to the design criteria of scientific methods. The second is the question of which stage in the application of methods – data gathering, data analysis, and interpretation of results – is affected by which bias, and how. The third is to look at the three principal theories of Western philosophy – reason, social contract, and utilitarianism – and to disentangle which of the three can be related to which bias. Many methods are influenced by bias, and recognising which bias affects which design criteria, research stage, and principal philosophical theory in the application of a method can help to make empirical research more reflexive.


Design criteria

While qualitative research is often considered prone to many biases, it is also often more reflexive in recognising its limitations. Many qualitative methods are defined by a strong subjective component – that of the researcher – and clear documentation can thus help make an existing bias more transparent. Many quantitative approaches have a reflexive canon that focuses on the specific biases relevant to a given approach, such as sampling bias or reporting bias. These biases nevertheless receive less consideration than in qualitative methods, since quantitative methods are still – falsely – considered to be more objective. This is not true. While one could argue that the goal of reproducibility may lead to a better taming of bias, this is not necessarily so, as the replication crisis in psychology clearly shows. Both quantitative and qualitative methods are potentially strongly affected by several cognitive biases, as well as by bias in academia in general, which includes for instance funding bias or the preference for open access articles. While all this is not surprising, it is still that much harder to solve.

Another general differentiation can be made between inductive and deductive approaches. Many deductive approaches are affected by biases associated with sampling, while inductive approaches are more prone to bias during interpretation. Deductive approaches are often built around designed experiments, whereas the strength of inductive approaches is being less bound by methodological designs - which can also make bias more hidden and thus harder to detect. This is one reason why qualitative approaches often place a strong emphasis on concise documentation.

The connection between spatial scales and bias is rather straightforward: an individual focus is related to cognitive biases, while system scales are more associated with prejudices, biases in academia and statistical biases. The impact of temporal scales is less explored, but forecast bias is a prominent example when it comes to future predictions; another error is applying our cultural views and values to past humans, which has yet to be clearly named as a bias. What can be said about both spatial and temporal scales is that we are often irrationally biased when it comes to very distant entities - in space or time. We are, for instance, inclined to reject the importance of a distant future scenario, although it may have much the same odds of becoming reality as a near-future one. Almost everybody would rather win the lottery tomorrow than in 20 years, irrespective of their chances of living to see it, or of the longer time they would then have to spend with their prize. Humans are peculiarly constructed beings, and we are notorious for acting irrationally. The same holds for spatial distance: we may care irrationally more for people who are close to us than for people who are very distant, even independent of shared experience (e.g. with friends) or shared history (e.g. with family). Again, this implies a bias that we can become aware of, but which first has to be named. No doubt current social developments will further increase our capacity to recognise our biases, as all these phenomena also affect scientists.

The following table categorises different types of bias, as listed in the Wikipedia entry on bias, according to two levels of the design criteria of methods.

Bias in gathering data, analysing data and interpreting data

The three steps of the application of a method are clearly worth investigating, as they allow us to pinpoint at which stage we may introduce a bias into our application of a method. Gathering data is strongly associated with cognitive biases, but also with statistical biases and partly even with some biases in academia. Biases associated with sampling can be linked to a subjective perspective as well as to systematic errors rooted in previous results.

This can also affect the analysis of data, yet here one has to highlight that quantitative methods are less affected by bias during analysis than qualitative methods. This is not a normative judgement, and the effect can clearly be countered by sound documentation of the analytical steps. We should nevertheless not forget that there are differing assumptions about the steps of analysis even in such an established field as statistics, where different schools of thought constantly clash over the optimal approach to analysis, sometimes even arriving at different results. This exemplifies that methodological analysis can be quite normative, underlining the need for a critical perspective. The same holds for qualitative methods, yet there it strongly depends on the specific method, as these methods are more diverse.

Concerning the interpretation of scientific results, the number and diversity of biases is clearly the highest - in other words, the worst. While this is related to the cognitive biases we hold as individuals, it is also related to prejudices, biases in academia and statistical biases. Overall, we need to recognise that some methods are less prone to certain biases because the norms of their application are more established, while other methods are newer and less tested by the academic community. When it comes to bias, there is at least a weak effect of safety - although not diversity - in numbers. More, and more diverse, methods may offer new insights into biases, since one method may reveal a bias that another cannot; methodological plurality may thus reduce bias. For a fully established method, the understanding of its biases is often greater, simply because it has been applied more often. This is especially - though not always - true for the analysis step, and in part also for methodological designs concerned with sampling. Clear documentation, however, remains key to making bias more visible across the three stages.

Bias and philosophy

The last and by far most complex point concerns the root theories associated with bias. Reason, social contract and utilitarianism are the three key theories of Western philosophy relevant for empiricism, and all biases can be associated with at least one of these three foundational theories. Many cognitive biases are linked to reason - or rather, to unreasonable behaviour. Much of the bias relating to prejudice and society can be linked to the wide field of the social contract. Lastly, some biases are clearly associated with utilitarianism. Surprisingly, utilitarianism is associated with comparatively few biases, yet it should be noted that the problem of causality within economic analysis is still up for debate: much of economic management is rooted in correlative understandings, which are often mistaken for clear-cut causal relations. Psychology also clearly illustrates that investigating a bias is different from unconsciously introducing a bias into your research. Consciousness of bias is the basis for its recognition: if you are not aware of a bias, you cannot take it into account in your knowledge production. While it may thus not seem directly helpful to associate empirical research and its biases with the three general foundational theories of philosophy - reason, social contract and utilitarianism - we should still take this into account, not least because it leads us to one of the most important developments of the 20th century: Critical Theory.

Critical Theory and Bias

Out of the growing empiricism of the Enlightenment grew a concern that we came to call Critical Theory. At the heart of Critical Theory is the focus on critiquing and changing society as a whole, in contrast to merely observing or explaining it. Originating with Marx, Critical Theory clearly distances itself from previous theories in philosophy - and in the social sciences - that try only to understand or explain. By embedding society in its historical context (Horkheimer) and by focussing on a continuous and interchanging critique (Benjamin), Critical Theory is a first and bold step towards a more holistic perspective in science. Remembering the Greeks and also some Eastern thinkers, one could say it is the first step back towards holistic thinking. From a methodological perspective, Critical Theory is radical because it seeks to distinguish itself not only from previously existing philosophy, but more importantly from the widely dominant empiricism and its societal as well as scientific consequences. A Critical Theory should thus be explanatory, practical and normative - and, what makes it more challenging, it needs to be all three of these things at once (Horkheimer). Through Habermas, Critical Theory became embedded in democratic theory, yet with a critical view of what we could understand as globalisation and its complex realities. The reflexive empowerment of the individual is as clear a goal as one would expect, not least because of the normative link to the political.

Critical Theory is thus a vital step towards a wider integration of diverse philosophies, but it is also essential from a methodological standpoint, since it allowed for the emergence of a true and holistic critique of everything empirical. While this may be read as an attack, it can also be interpreted as a necessary step, since the arrogance of empiricism and its claim to truth can be seen as a deep danger, and not only to methods. Popper does not offer a true solution to positivism, even though he was much celebrated by many; his thought that the holy grail of knowledge can ultimately never be truly reached generates problems of its own. He can still be admired for calling on scientists to be radical, while acknowledging that most scientists are not. In addition, from a post-modernist perspective, Critical Theory can be seen as a necessary step to prevent an influence of empiricism that might pose a threat to and by humankind itself, be it through nuclear destruction, the unachievable and feeble goal of a growth economy (my wording), the naive and technocratic hoax of the eco-modernists (also my wording), or any other paradigm that is short-sighted or naive. In other words, we arrive at the postmodern.

Critical Theory is now developing connections to other facets of the discourse, and some may argue that its focus on the social sciences can be seen as critical in itself, or at least as a normative choice that is clearly anthropocentric, has a problematic relationship with the empirical, and has mixed relations with its diverse offspring, which include gender research, the critique of globalisation, and many other normative domains increasingly explored today. Building on Popper's three worlds (the physical world, the mental world, and human knowledge), we should note another possibility: Critical Realism. Roy Bhaskar proposed three ontological domains, or strata of knowledge: the real (everything there is), the actual (everything we can grasp), and the empirical (everything we can observe). Over the last decades, humankind has unlocked ever more strata of knowledge, and hence much of the actual has become empirical to us. We have to acknowledge that some strata of knowledge are hard to relate to each other, or may even be unrelatable, which has consequences for our methodological understanding of the world. Some methods may unlock certain strata of knowledge but not others; some may be specific, some vague; and some may only unlock new strata through novel combinations. What is most relevant here, however, is that we may look for causal links, but need to remain critical, since new strata of knowledge may render them obsolete. Consequently, there are no universal laws we can strive for, but instead endless strata to explore.

Coming back to bias, Critical Theory appears as an antidote to bias - and some may argue Critical Realism even more so, as it combines criticality with the certain humbleness necessary when exploring the empirical and the causal. The explanatory character allowed by Critical Realism might be good enough for the pragmatist, the practical may speak to the modern engagement of science with and for society, and the normative is aware of - well - all things normative, including the critical. Hence a door was opened to a new mode of science, focussing on the situatedness and locatedness of research within the world. This arguably began with Kant, who opened the globe to the world of methods. There is, however, a critical link in Habermas, who highlighted the duality of the rational individual on a small scale and the role of global societies as part of the economy (Habermas 1987). This underlines a crucial link to the original three foundational theories in philosophy, albeit in a dramatic and focused interpretation of modernity. Habermas himself was well aware of the tensions between these two approaches - the critical and the empirical - yet we owe it to Critical Theory and its continuations that a practical and reflexive knowledge production can be conducted within deeply normative systems such as modern democracies.

Linking this to the historical development of methods, we can thus claim that Critical Theory (and Critical Realism) opened a new domain or mode of thinking, and its impact can be felt far beyond the social sciences and the philosophy that it affected directly. Coming back to bias, however, we will not follow the path of an almost universal rejection of empiricism here. Instead, we need to return to the three foundational theories of philosophy and acknowledge that reason, social contract and utilitarianism are the foundation of the first empirical disciplines that are at their core normative (e.g. psychology, social and political science, and economics). Since bias can be partly related to these three theories, and consequently to specific empirical disciplines, we need to recognise that there is an overarching methodological bias. This methodological bias has a signature rooted in specific design criteria, which are in turn related to specific disciplines. Consequently, this methodological bias is a disciplinary bias - all the more so since methods may be shared among scientific disciplines, yet most disciplines claim either priority or superiority when it comes to the ownership of a method.

The disciplinary bias of modern science thus creates a deeply normative methodological bias, which some disciplines try to take into account while others clearly do not. In other words, the dogmatic selection of methods within disciplines has the potential to create deep flaws in empirical research, and we need to be aware of and reflexive about this. The largest bias concerning methods is the choice of methods per se. A critical perspective is thus relevant not only from the standpoint of societal responsibility, but equally from a view on the empirical. Clear documentation and reproducibility of research are important but limited stepping stones in a critique of the methodological; they cannot replace a critical perspective, only amend it. Empirical knowledge will only ever look at parts - or strata, following Roy Bhaskar - of reality, yet philosophy can offer a generalisable perspective or theory. Critical Theory, Critical Realism and other current developments in philosophy can be seen as a striving towards an integrated and holistic philosophy of science, which may ultimately link to an overarching theory of ethics (Parfit). If the empirical and the critical inform us, then a philosophy of science together with ethics may tell us how to act based on our perceptions of reality.

Further Information

  • Some words on Critical Theory
  • A short entry on critical realism

The author of this entry is Henrik von Wehrden.



Critical Thinking and Decision-Making: What is Critical Thinking?

Lesson 1: What is Critical Thinking?

Critical thinking is a term that gets thrown around a lot. You've probably heard it used often throughout the years, whether it was in school, at work, or in everyday conversation. But when you stop to think about it, what exactly is critical thinking and how do you do it?

Watch the video below to learn more about critical thinking.

Simply put, critical thinking is the act of deliberately analyzing information so that you can make better judgements and decisions. It involves using things like logic, reasoning, and creativity to draw conclusions and generally understand things better.


This may sound like a pretty broad definition, and that's because critical thinking is a broad skill that can be applied to so many different situations. You can use it to prepare for a job interview, manage your time better, make decisions about purchasing things, and so much more.

The process

illustration of "thoughts" inside a human brain, with several being connected and "analyzed"

As humans, we are constantly thinking . It's something we can't turn off. But not all of it is critical thinking. No one thinks critically 100% of the time... that would be pretty exhausting! Instead, it's an intentional process , something that we consciously use when we're presented with difficult problems or important decisions.

Improving your critical thinking


In order to become a better critical thinker, it's important to ask questions when you're presented with a problem or decision, before jumping to any conclusions. You can start with simple ones like What do I currently know? and How do I know this? These can help to give you a better idea of what you're working with and, in some cases, simplify more complex issues.  

Real-world applications

[Illustration: a smartphone displaying an article headlined "Study: Cats are better than dogs"]

Let's take a look at how we can use critical thinking to evaluate online information. Say a friend of yours posts a news article on social media and you're drawn to its headline. If you were to use your everyday automatic thinking, you might accept it as fact and move on. But if you were thinking critically, you would first analyze the available information and ask some questions:

  • What's the source of this article?
  • Is the headline potentially misleading?
  • What are my friend's general beliefs?
  • Do their beliefs inform why they might have shared this?

illustration of "Super Cat Blog" and "According to survery of cat owners" being highlighted from an article on a smartphone

After analyzing all of this information, you can draw a conclusion about whether or not you think the article is trustworthy.

Critical thinking has a wide range of real-world applications . It can help you to make better decisions, become more hireable, and generally better understand the world around you.




Recognizing Bias: A Problem Solving and Critical Thinking Skills Guide

In today's world, it is becoming increasingly important to recognize bias and how it can affect our decision-making. Bias can cloud our judgement, lead us to make decisions that are not in our best interests, and limit our ability to solve problems effectively. In this guide, we will explore the concept of recognizing bias and how it can be used as a tool for developing critical thinking and problem-solving skills. We will discuss the various types of biases, why recognizing them is important, and how to identify and counteract them.

Types of Bias

Two of the most commonly discussed forms are confirmation bias, the tendency to seek out information that confirms our existing beliefs, and cognitive bias more broadly, the mental shortcuts that distort our judgment. These biases can lead to unfair judgments or decisions. Other common types of bias include cultural bias, which is the tendency to favor one's own culture or group; and political bias, which is the tendency to favor one's own political party or beliefs. In order to identify and address bias in oneself and others, it is important to be aware of potential sources of bias. This includes personal opinions, values, and preconceived notions. Being mindful of these potential sources of bias can help us become more aware of our own biases and recognize them in others.

Additionally, it is important to be open-minded and willing to consider alternative perspectives. It is also helpful to challenge our own assumptions and beliefs by questioning them and seeking out evidence that supports or refutes them.

Implications of Not Recognizing or Addressing Bias

The potential implications of not recognizing or addressing bias are significant. If left unchecked, biases can lead to unfair decisions or judgments, as well as inaccurate conclusions. This can have serious consequences for individuals and organizations alike.

Strategies for Identifying and Addressing Bias

Recognizing bias in oneself and others is an important part of making informed decisions. There are several strategies that can be used to identify and address bias. One of the most effective strategies is to take a step back and look at the situation objectively. This involves examining the facts and assumptions that are being used to make decisions.

It can also involve assessing the potential impact of decisions on multiple stakeholders. By removing personal biases from the equation, it is possible to make more informed decisions. Another important strategy for identifying and addressing bias is to question the sources of information. It is important to consider the credibility of sources, as well as any potential biases that may be present.

Fact-checking sources and considering multiple perspectives can help identify any potential biases in the information being used. In addition, it is important to remain aware of our own biases. We all have preconceived notions about certain topics that can affect our decision-making process. By being mindful of our biases, we can avoid making decisions that are influenced by them. Finally, it is important to be open to other perspectives and willing to engage in meaningful dialogue with others.

What is Bias?

Bias can be an unconscious preference that influences decision making and can lead to adverse outcomes. It is important to recognize bias because it can have a negative impact on our ability to make sound decisions and engage in problem solving and critical thinking. Bias can manifest itself in various ways, from subtle mental shortcuts to overt prejudices. Types of bias include confirmation bias, where we seek out information that confirms our existing beliefs; availability bias, where we base decisions on the information that is most readily available; and representativeness bias, where we assume that two events or objects are related because they share similar characteristics. Other forms of bias include the halo effect, where a single positive quality or trait can influence the perception of an entire person; and stereotyping, which is the tendency to make judgments about individuals based on their perceived membership in a certain group. It is important to recognize bias in ourselves and others so that we can make informed decisions and engage in problem solving and critical thinking.

Sources of Bias

Bias can have a profound effect on decisions, leading to outcomes that are not based on facts or evidence. Personal opinions and values can lead to biased decision-making; they can be shaped by past experiences, cultural background, and other personal factors. For example, someone's opinion about a certain topic may be based on what they have previously heard or read. Similarly, preconceived notions can also lead to biased conclusions. Cultural norms can also play a role in creating bias.

For instance, people may be more likely to believe information from a source they trust or respect, even if it is not based on fact. Similarly, people may be more likely to make decisions that conform to the expectations of their culture or society. In addition, people can also be influenced by their own prejudices or stereotypes. This type of bias can lead to unfair treatment of certain individuals or groups of people. Finally, it is important to be aware of the potential for confirmation bias, where people will seek out information that confirms their existing beliefs and disregard any contradictory evidence. By recognizing and understanding these sources of bias, people can make more informed decisions and engage in more effective problem solving and critical thinking.

In conclusion, recognizing and addressing bias is an essential part of problem solving and critical thinking. Bias can come from many sources, including our own beliefs, cultural norms, and past experiences. Knowing the types of bias and the strategies for identifying and addressing them can help us make informed decisions and better engage in critical thinking. Taking time to reflect on our own biases is also important for making unbiased decisions. Ultimately, recognizing and addressing bias will improve our problem-solving and critical thinking skills.


Master cognitive biases and improve your critical thinking


If you are seeking to improve your critical thinking abilities, you need to learn how to recognize, reduce, and redirect cognitive biases. Learn more about the most common cognitive biases, strategies for combatting cognitive bias, and the benefits of beating cognitive bias.

What is a cognitive bias?

A cognitive bias is an unconscious error in judgment you make due to a built-in inclination or prejudice. Cognitive biases exist because our brains need to take shortcuts to process the vast amounts of information needed to make decisions. Most of the time, those shortcuts are helpful.

Occasionally, our brains create shortcuts that lead to irrational thinking. Cognitive biases are the irrational ways we search for, interpret, evaluate, and use information to make decisions. Some people believe that humans will always experience cognitive biases because our ability to think rationally is bounded by the capabilities of our brains.

What is the difference between a logical fallacy and a cognitive bias?

A logical fallacy is an error in a line of reasoning, whereas a cognitive bias is an error in thought. An argument made with logical fallacies has faults in either the premises of the argument or the conclusions drawn from those premises. An argument made from a cognitive bias may be totally logically sound, but reflect some distortions in thinking.

For example, someone might argue “I burn everything I cook, therefore I am a bad cook.” There is no problem with the logic of the argument, so it is not a logical fallacy. However, it’s likely that the person is subconsciously over-emphasizing how frequently they burn food. If the person doesn’t actually burn their food every time, this is an example of a cognitive bias.

What are the signs of cognitive bias?

The easiest way to think of a cognitive bias is “a simplification that leads to exaggeration.” All cognitive biases are forms of distortion of the truth. To recognize cognitive biases, look for hyperbole and oversimplification.

Which cognitive biases are the most common?

Confirmation Bias

Confirmation bias is when you are more attuned to information that confirms what you already believe to be true. You may write off or ignore information that conflicts with your existing belief.

Hindsight Bias

Hindsight bias is when you overestimate your ability to predict events. You remember the events you predicted accurately and forget about the predictions you made that didn’t come true.

Anchoring Bias

Anchoring bias is when you rely too heavily on the earliest information you learned about a topic as a reference point. Marketers are aware of the anchoring bias, which is why some stores seem to be perpetually on sale. We are more likely to buy something if our original perception of its value is high.

Optimism Bias / Pessimism Bias

Optimism bias and pessimism bias are two sides of the same coin. In these biases, you are predisposed to assume good outcomes are more likely (optimism bias) or that things will go wrong (pessimism bias).

Overconfidence from the optimism bias leads to reckless choices that may put you at risk. Conversely, being overly pessimistic can cause you to shy away from great opportunities because you assume that you will fail.

Bandwagon Effect

The bandwagon effect is when you are more likely to do or believe something because others are doing the same thing. “If all your friends jumped off a cliff, would you?” is a challenge to the bandwagon effect. This bias is also known as “herd mentality.”

Halo Effect

The halo effect is the assumption that if a person has positive attributes in one area, they will have positive attributes in other, unrelated areas. A common example of the halo effect is the assumption that better-looking people are more likely to be intelligent or talented.

The reverse halo effect is the assumption that one negative trait applies more universally. The halo effect can apply to our perceptions of brands or social groups, not just individuals.

Framing Effect

The framing effect refers to the way that perception can be altered by the way information is presented or framed. For example, you would be more likely to purchase a weight loss supplement advertised as "75% of users lost weight in 6 months" than one advertised as "25% of users did not lose weight in 6 months," even though the two statements describe exactly the same result.

Availability Heuristic

The availability heuristic is the tendency to judge how likely an event is based on how easily you can recall other examples of that event. For example, people tend to overestimate the frequency of shark attack deaths because shark attacks are prominent in both pop culture and news coverage.

How can I reduce cognitive biases?

First, practice noticing thoughts that are distorted. Your emotions can clue you into times when your anxiety, stress, or frustration may heighten irrational thinking. Try using the Socratic method on your own thoughts. Ask yourself “how did I come to that conclusion?”

Once you start digging into a thought that seems irrational, your knowledge of common cognitive biases can help you detect which biases are present in that thought. It might be helpful to write down which cognitive biases you notice frequently in your thoughts. Don’t be afraid to challenge yourself.

Re-visiting a decision or thought process can help you reduce cognitive bias. By removing yourself from the emotions of the initial thought process, you may be able to spot bias in your previous thinking. This is why “before you make a big decision, sleep on it” is actually good advice!

How will reducing cognitive biases improve my critical thinking?

Being able to recognize and reduce your own cognitive biases will allow you to make better decisions. You will recognize your own strengths and limitations more accurately. You will begin to notice which situations spike your irrational judgments, and which situations allow your rational thinking to prevail.

When you become familiar with cognitive bias, you will be able to evaluate the premises of your arguments for bias, not just logical fallacies. This could make you a more convincing person. It might open you up to opportunities you had ruled out because of previous biases.



May 4, 2024

Implicit Bias Hurts Everyone. Here’s How to Overcome It

The environment shapes stereotypes and biases, but it is possible to recognize and change them

By Corey S. Powell & OpenMind Magazine


We all have a natural tendency to view the world in black and white—to the extent that it's hard not to hear "black" and immediately think "white." Fortunately, there are ways to activate the more subtle shadings in our minds. Kristin Pauker is a professor of psychology at the University of Hawaiʻi at Mānoa who studies stereotyping and prejudice, with a focus on how our environment shapes our biases. In this podcast and Q&A, she tells OpenMind co-editor Corey S. Powell how researchers measure and study bias, and how we can use their findings to make a more equitable world. (This conversation has been edited for length and clarity.)

When I hear “bias,” the first thing I think of is a conscious prejudice. But you study something a lot more subtle, which researchers call “implicit bias.” What is it, and how does it affect us?

Implicit bias is a form of bias that influences our decision-making, our interactions and our behaviors. It can be based on any social group membership, like race, gender, age, sexual orientation or even the color of your shirt. Often we’re not aware of the ways in which these biases are influencing us. Sometimes implicit bias gets called unconscious bias, which is a little bit of a misnomer. We can be aware of these biases, so it's not necessarily unconscious. But we often are not aware of the way in which they're influencing our behaviors and thoughts.


You make it sound like almost anything can set us off. Why is bias so deeply ingrained in our heads?

Our brain likes to categorize things because it makes our world easier to process. We make categories as soon as we start learning about something. So we categorize fruits, we categorize vegetables, we categorize chairs, we categorize tables for their function—and we also categorize people. We know from research that categorization happens early in life, as early as 5 or 6, in some cases even 3 or 4. Categorization creates shortcuts that help us process information faster, but that also can lead us to make assumptions that may or may not hold in particular situations. What categories we use are directed by the environment that we're in. Our environment already has told us certain categories are really important, such as gender, age, race and ethnicity. We quickly form an association when we’re assigned to a particular group.


In your research, you use a diagnostic tool called an “ implicit association test .” How does it work, and what does it tell you?

Typically someone would show you examples of individuals who belong to categories, and then ask you to categorize those individuals. For example, you would see faces and you would categorize them as black and white. You’re asked to make a fast categorization, as fast as you can. Then you are presented with words that could be categorized as good or bad, like “hero” and “evil,” and again asked to categorize the words quickly. The complicated part happens when, say, good and white are paired together or bad and black are paired together. You're asked to categorize the faces and the words as you were before. Then it's flipped, so that bad and white are paired together, and good and black are paired together. You’re asked to make the categorizations once again with the new pairings.

The point of the test is, how quickly do you associate certain concepts together? Oftentimes if certain concepts are more closely paired in your mind, then it will be easier for you to make that association. Your response will be faster. When the pairing is less familiar to you or less closely associated, it takes you longer to respond. Additional processing needs to occur.
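To make that scoring logic concrete, here is a minimal Python sketch of IAT-style scoring. It is not the algorithm Pauker's lab uses (published IAT scoring involves a more elaborate D-score with error penalties); the reaction times are invented, and the simplified score below only illustrates the core idea that faster responses under one pairing suggest a closer association.

```python
from statistics import mean, stdev

# Invented reaction times (milliseconds) for one hypothetical participant.
# "congruent" = block where the paired categories matched a common stereotype;
# "incongruent" = block where the pairing was flipped.
congruent_rts = [512, 498, 530, 505, 521, 489, 540]
incongruent_rts = [611, 642, 598, 655, 603, 629, 617]

def association_effect(congruent, incongruent):
    """Mean slowdown in the incongruent block, in milliseconds.

    A positive value means the participant responded faster when the
    stereotype-consistent categories shared a response key, i.e. those
    concepts are more closely paired in memory.
    """
    return mean(incongruent) - mean(congruent)

def simplified_d_score(congruent, incongruent):
    """Mean difference scaled by the pooled standard deviation.

    The published IAT algorithm adds error penalties and block-wise
    details; this stripped-down version only conveys the idea.
    """
    return association_effect(congruent, incongruent) / stdev(congruent + incongruent)

print(f"Mean slowdown: {association_effect(congruent_rts, incongruent_rts):.0f} ms")
print(f"Simplified D-score: {simplified_d_score(congruent_rts, incongruent_rts):.2f}")
```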

When you run this implicit association test on your test subjects or your students, are they often surprised by the results?

We’ve done it as a demonstration in the classroom, and I've had students come up and complain saying, “There’s something wrong with this test. I don't believe it.” They’ll try to poke all kinds of holes in the test because it gave them a score that wasn’t what they felt it should be according to what they think about themselves. This is the case, I think, for almost anyone. I've taken an implicit association test and found that I have a stronger association with men in science than women in science. And I'm a woman scientist! We can have and hold these biases because they’re prevalent in society, even if they’re biases that may not be beneficial to the group we belong to.

Studies show that even after you make people aware of their implicit biases, they can’t necessarily get rid of them. So are we stuck with our biases?

Those biases are hard to change and control, but that doesn't mean that they are uncontrollable and unchangeable. It’s just that oftentimes there are many features in our environment that reinforce those biases. I was thinking about an analogy. Right now I’m struggling with weeds growing in my yard, invasive vines. It’s hard because there are so many things supporting the growth of these vines. I live in a place that has lots of sun and rain. Similarly, there’s so much in our environment that is supporting our biases. It’s hard to just cut them off and be like, OK, they're gone. We have to think about ways in which we can change the features of our environment—so that our weeds aren’t so prolific.

Common programs aimed at reducing bias, such as corporate diversity training workshops, often seem to stop at the stage of making people aware that bias exists. Is that why they haven’t worked very well?

If people are told that they’re biased, the reaction that many of them have is, “Oh, that means I'm a racist? I'm not a racist!” Very defensive, because we associate this idea of being biased with a moral judgment that I'm a bad person. Because of that, awareness-raising can have the opposite of the intended effect. Being told that they're biased can make people worried and defensive, and they push back against that idea. They're not willing to accept it.

A lot of the diversity training models are based on the idea that you can just tell people about their biases and then get them to accept them and work on them. But, A, some people don't want to accept their biases. B, some people don't want to work on them. And C, the messaging around how we talk about these biases creates a misunderstanding that they can’t be changed. We talk about biases that are unconscious, biases that we all hold, that are formed early in life—it creates the idea, “Well, there’s nothing I can do, so why should I even try?”

How can we do better in talking about bias, so that people are more likely to embrace change instead of becoming defensive or defeated?

Some of it is about messaging. Biases are hard to change, but we should be discussing the ways in which these biases can change, even though it might take some time and work. You have to emphasize the idea that these things can change, or else why would we try? There is research showing that if you just give people their bias score, normally that doesn't result in them becoming more aware of their bias. But if you combine that score with a message that this is something controllable, people are less defensive and more willing to accept their biases.

What about concrete actions we can take to reduce the negative impact of implicit bias?

One thing is thinking about when we do interventions. A lot of times we’re trying to make changes in the workplace. We should be thinking more about how we're raising our children. The types of environments we're exposing them to, and the features that are in our schools, are good places to think about creating change. Prejudice is something that’s malleable.

Another thing is not always focusing on the person. So much of what we do in these interventions is try to change individual people's biases. But we can also think about our environment. What are the ways in which our environments are communicating these biases, and how can we make changes there? A clever idea people have been thinking about is trying to change consequences of biases. There's a researcher, Jason A. Okonofua, who talks about this and calls it “sidelining bias.” You're not targeting the person and trying to get rid of their biases. You're targeting the situations that support those biases. If you can change that situation and kind of cut it off, then the consequences of bias might not be as bad. It could lead to a judgment that is not so influenced by those biases.

There’s research showing that people make fairer hiring decisions when they work off tightly structured interviews and qualification checklists, which leave less room for subjective reactions. Is that the kind of “sidelining” strategy you’re talking about?

Yes, that’s been shown to be an effective way to sideline bias. If you set those criteria ahead of time, it's harder for you to shift a preference based on the person that you would like to hire. Another good example is finding ways to slow down the processes we're working on. Biases are more likely to influence our decision-making when we have to make really quick decisions or when we are stressed—which is the case for a lot of important decisions that we make.
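As a toy illustration of how fixing criteria in advance "sidelines" bias, consider the following Python sketch. The criteria, weights and ratings are all invented for illustration; the point is only that the scoring rule is committed to before any candidate is seen, so a later preference for a particular person cannot quietly reshape what counts as a qualification.

```python
# Weights agreed on before any interviews take place (invented values).
CRITERIA_WEIGHTS = {
    "relevant_experience": 0.4,
    "technical_exercise": 0.4,
    "communication": 0.2,
}

def score_candidate(ratings: dict) -> float:
    """Weighted score from per-criterion ratings on a 1-5 scale.

    Looking each criterion up directly means a missing rating raises
    a KeyError, so an interviewer cannot silently skip a dimension
    for a favoured candidate.
    """
    return sum(weight * ratings[criterion]
               for criterion, weight in CRITERIA_WEIGHTS.items())

# Hypothetical structured-interview ratings for two candidates.
candidates = {
    "Candidate A": {"relevant_experience": 4, "technical_exercise": 3, "communication": 5},
    "Candidate B": {"relevant_experience": 5, "technical_exercise": 4, "communication": 3},
}

for name, ratings in sorted(candidates.items(),
                            key=lambda item: score_candidate(item[1]),
                            reverse=True):
    print(f"{name}: {score_candidate(ratings):.1f}")
```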

Jennifer Eberhardt does research on these kinds of implicit biases. She worked with NextDoor (a neighborhood monitoring app) when they noticed a lot of racial profiling in the things people were reporting in their neighborhood. She worked with them to change the way that people report a suspicious person. Basically they added some extra steps to the checklist when you report something. Rather than just reporting that someone looks suspicious, a user had to indicate what about the behavior itself was suspicious. And then there was an explicit warning that they couldn't just say the reason for the suspicious behavior was someone's race. Including extra check steps slowed down the process and reduced the profiling.
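The reporting change Eberhardt describes can be pictured as extra validation steps in the submission flow. The sketch below is purely hypothetical (it is not NextDoor's actual code, and the field names and rules are invented); it only shows the general pattern of refusing a report until a concrete behavior is described and a profiling warning has been acknowledged.

```python
PROFILING_WARNING = (
    "A person's race, ethnicity, or appearance is not, by itself, "
    "a reason to report them as suspicious."
)

def validate_report(report: dict) -> list:
    """Return the problems blocking submission; an empty list means it can proceed."""
    problems = []
    if not report.get("suspicious_behavior", "").strip():
        problems.append("Describe what the person was doing that seemed suspicious.")
    if not report.get("acknowledged_warning", False):
        problems.append(PROFILING_WARNING)
    return problems

# A hasty, appearance-only draft is blocked; the extra steps slow the
# reporter down and force a behavior-based justification.
draft = {"suspicious_behavior": "", "acknowledged_warning": False}
for issue in validate_report(draft):
    print("Blocked:", issue)
```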

It does feel like we’re making progress in addressing bias but, damn, it’s been a slow process. Where can we go from here?

A big part that’s missing in the research on implicit bias is creating tools that are useful for people. We still don’t know a lot about bias, but we know a lot more than we're willing to put into practice. For instance, creating resources for parents to be able to have conversations about bias, and to be aware that the everyday things we do are really important. This is something that many people want to tackle, but they don’t know how to do it. Just asking questions about what is usual and what is unusual has really interesting effects. We’ve done that with our son. He’d say something and I would ask, “Why is that something that only boys can do? You say girls can't do that, is that really the case? Can you think of examples where the opposite is true?”

This Q&A is part of a series of OpenMind essays, podcasts and videos supported by a generous grant from the Pulitzer Center 's Truth Decay initiative.

This story originally appeared on OpenMind , a digital magazine tackling science controversies and deceptions.


Does Critical Thinking and Logic Education Have a Western Bias? The Case of the Nyāya School of Classical Indian Philosophy


Anand Jayprakash Vaidya, Does Critical Thinking and Logic Education Have a Western Bias? The Case of the Nyāya School of Classical Indian Philosophy, Journal of Philosophy of Education , Volume 51, Issue 1, February 2017, Pages 132–160, https://doi.org/10.1111/1467-9752.12189


In this paper I develop a cross-cultural critique of contemporary critical thinking education in the United States, the United Kingdom, and those educational systems that adopt critical thinking education from the standard model used in the US and UK. The cross-cultural critique rests on the idea that contemporary critical thinking textbooks completely ignore contributions from non-western sources, such as those found in the African, Arabic, Buddhist, Jain, Mohist and Nyāya philosophical traditions. The exclusion of these traditions leads to the conclusion that critical thinking educators, by using standard textbooks, are implicitly sending the message to their students that there are no important contributions to the study of logic and argumentation that derive from non-western sources. As a case study I offer a sustained analysis of the so-called Hindu Syllogism that derives from the Nyāya School of classical Indian philosophy. I close with a discussion of why contributions from non-western sources, such as the Hindu Syllogism, belong in a Critical Thinking course as opposed to an area studies course, such as Asian Philosophy.

One question in the philosophy of education is the question concerning education for democracy, EDQ: How should public education enable the ethical implementation and proper functioning of democratic processes, such as voting on the basis of public and civic discourse? At least one plausible answer is that a public education should provide citizens of a political body with basic skills in public discourse, which is inclusive of critical thinking and civic debate. That is, education for democracy should have an element that enables ethical public discourse on topics of shared concern. This answer is grounded on two ideas. First, democratic processes, such as voting, take into account the will of the people through reflective deliberation and the exchange of ideas on matters of public concern, such as prison reform, marriage, taxation and gun control. Second, critical thinking through civic engagement allows for the expression of individual autonomy on a matter of public concern. Call this general answer to EDQ the critical thinking and civic debate response, CTCD. Two important questions the CTCD response faces are the content question: What exactly are critical thinking, civic debate and ethical public discourse? And the normative question: What are the appropriate forms, norms and intellectual virtues by which we should engage in critical thinking, civic debate and ethical public discourse? 1

On March 24, 2014 at the Cross Examination Debate Association (CEDA) Championships at Indiana University, two Towson University students, Ameena Ruffin and Korey Johnson, became the first African-American women to win a national college debate tournament, for which the resolution asked whether the US president's war powers should be restricted. Rather than address the resolution straight on, Ruffin and Johnson, along with other teams of African-Americans, attacked its premise. The more pressing issue, they argued, is how the US government is at war with poor black communities. In the final round, Ruffin and Johnson squared off against Rashid Campbell and George Lee from the University of Oklahoma, two highly accomplished African-American debaters with distinctive dreadlocks and dashikis. Over four hours, the two teams engaged in a heated discussion of concepts like “nigga authenticity” and performed hip-hop and spoken-word poetry in the traditional timed format. At one point during Lee's rebuttal, the clock ran out but he refused to yield the floor. “Fuck the time!” he yelled. His partner Campbell, who won the top speaker award at the National Debate Tournament two weeks later, had been unfairly targeted by the police at the debate venue just days before, and cited this experience as evidence for his case against the government's treatment of poor African-Americans.
In the 2013 championship, two men from Emporia State University, Ryan Walsh and Elijah Smith, employed a similar style and became the first African-Americans to win two national debate tournaments. Many of their arguments, based on personal memoir and rap music, completely ignored the stated resolution, and instead asserted that the framework of collegiate debate has historically privileged straight, white, middle-class students (emphasis added). 3

Although there are many important features that these cases bring to light, here I want to draw attention to three features that help us understand the importance of both the content question and the normative question. First, the kind of evidence that is appealed to does not consist only in objective facts, such as what the law states, and reasoning deductively or inductively from a set of premises; it also contains personal experience. Second, the mode of engagement used does not consist simply in rational argumentation through the use of the standard format of ethical theory, followed by premise, application and finally a conclusion. Rather, it includes poetry and hip-hop that bring both a reason-based approach and an emotional and musical element into play. Third, the norm of engagement used does not see deference to rules as trumping either the importance of what is talked about or the length of time one talks about it. We might summarise a caricature reaction to these students by a hypothetical critic as follows: how rude of these students to not debate the issue, to disregard the rules, and to fail to take into consideration the kinds of evidence required for public debate and discourse on matters of social and political concern. In light of these cases and the caricature, the critical thinking and civic debate community faces an important and unexamined question: does critical thinking and civic debate education rest on an uncritical examination of its very foundation? Is the foundation perhaps insensitive to race, class, gender and non-western traditions of critical thinking and debate? Call this question the meta-critical question about critical thinking.

The meta-critical question about critical thinking and civic debate education is extremely important to any education policy that embraces the CTCD response to EDQ. Furthermore, it is central to the project of coming to understand how public discourse is possible in a community that has diverse individuals with non-overlapping conceptions of the good life. In the next section, I present, explain, and defend the central argument leading to the conclusion that critical thinking education should include contributions from non-western sources. As a case study I present some material, well known in the classical Indian philosophical and comparative philosophy community, concerning contributions to logic and critical thinking deriving from the Nyāya tradition of orthodox Indian philosophy. My examination of these contributions aims to establish that there are things critical thinking education can take on board from non-western traditions that are important and valuable to critical thinking and logic education. In the third section, I present and respond to the objection that contributions to critical thinking from non-western traditions should not be taught in a Critical Thinking course but rather in an area studies course, such as Asian Philosophy.

At present, critical thinking and debate education suffers from a social blindspot in the US, the UK and those countries that use the standard model originating from the US and UK. In short, the blindspot is this: critical thinking and debate education is insensitive to variation over what could count as critical thinking and civic debate, a variation that comes into view once non-western contributions to critical thinking and debate are examined. The neglect of these traditions is largely due to the fact that those who work on critical thinking, logic and debate, such as members of the informal logic community, are generally not historically informed about non-western contributions to critical thinking, because they do not engage with those who work on Asian and comparative philosophy. Simply put, institutional separation has led to an impoverished educational package for critical thinking education over the past 100 years, the period in which the modern university has developed. The central argument I will develop to expose the problem is as follows.

Premise 1: Critical thinking, and practice in ethical civic debate, is important to public discourse on social and political issues that citizens of a democratic body vote on when making policy decisions that concern all members of the public.

Premise 2: The current model for critical thinking and civic debate education is dominated by a western account of informal logic, formal logic, debate rules and intellectual virtues.

Premise 3: Critical thinking education should include contributions from non-western philosophers.

∴ Conclusion: Critical thinking education should be revised so as to be inclusive of contributions from non-western thinkers.

Why should we accept the premises of this argument?

Premise 1: Critical thinking, and practice in ethical civic debate, is important to public discourse on social and political issues that citizens of a democratic body vote on when making policy decisions that concern all members of the public.

The main objection to Premise 1 derives from Michael Huemer's (2005) paper 'Is Critical Thinking Epistemically Responsible?' In this work he provides an argument against the epistemic responsibility of critical thinking. The core idea is that if one is forming a belief on an issue of public concern, one ought to do so responsibly. Given that one ought to form the belief in a responsible way, one might consider the different strategies open to the person for how to form the belief. Consider the following belief-forming strategies.

Credulity: In forming a belief a person is to canvass the opinions of a number of experts and adopt the belief held by most of them. In the best case, the person finds a poll of the experts; failing that, the person may look through several reputable sources, such as scholarly books and peer-reviewed journal articles, and identify the conclusions of the experts.

Skepticism: In forming a belief a person is to form no opinion on the matter; that is, the person is to withhold judgement about the issue.

Critical Thinking: In forming a belief a person is to gather arguments and evidence that are available on the issue, from all sides, and assess them. The person tries thereby to form some overall impression on the issue. If the person forms such an impression, then she bases her belief on it. Otherwise, the person suspends judgement.

Now, where P is a specific controversial and publicly debated issue, and C_b(P) is the context of belief formation for P, the central argument against the epistemic responsibility of critical thinking is the following.

1. Adopting Critical Thinking about P in C_b(P) is epistemically responsible only if Critical Thinking about P is the most reliable strategy from the available strategies in C_b(P).
2. One ought always to use the most reliable strategy available in forming a belief about an issue of public concern.
3. Critical Thinking about P is not the most reliable strategy from the available strategies in C_b(P).
∴ It is not the case that Critical Thinking about P in C_b(P) is epistemically responsible.
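For clarity, the inferential core of the argument can be displayed as a modus tollens. The abbreviations and notation below are mine, not Huemer's, and are offered only as a gloss:

\[
\begin{array}{ll}
1. & \mathrm{Resp}(\mathit{CT}) \rightarrow \mathrm{MostRel}(\mathit{CT}) \\
2. & \neg\,\mathrm{MostRel}(\mathit{CT}) \\
\hline
\therefore & \neg\,\mathrm{Resp}(\mathit{CT})
\end{array}
\]

Here $\mathrm{Resp}(\mathit{CT})$ abbreviates 'adopting Critical Thinking about P in C_b(P) is epistemically responsible' and $\mathrm{MostRel}(\mathit{CT})$ abbreviates 'Critical Thinking about P is the most reliable strategy from the available strategies in C_b(P)'. Line 1 corresponds to premise 1 of the prose version and line 2 to premise 3; premise 2, that one ought always to use the most reliable strategy, serves to motivate the conditional in line 1.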

In Vaidya (2013), I offered an extensive argument against Huemer's position. That argument depends on a theory of what constitutes an autonomous critical identity, and of why forming a critical identity is valuable for a person. As a consequence, I will not go into a sustained response to Huemer's argument here. Rather, I will note a simple set of points that can be used to assuage the initial force of what is being argued.

First, insofar as the claim is that critical thinking about an issue is not to be preferred over taking the view of an expert on the issue, it is clear that choosing who the expert is on the issue is itself a matter of critical thinking. In order to identify someone as an expert, one must understand how to track through sources for the appropriate identification of experts. So, in general, deference to experts actually depends on critical thinking, since deference is a choice one must make. The core idea is that one cannot outsource all cognition to an alternative source, since outsourcing is itself a decision that has to be made.

Secondly, it is important to distinguish between the different types of issues on which deference to experts can be made. For example, there is a difference between an argument that contains a scientific conclusion, with mathematical and scientific premises, and an argument that contains moral premises and has a moral conclusion. Given this difference, it is plausible to maintain that deference to experts in the scientific and mathematical case is not the same as deference in the moral case. While I can defer to a moral expert to tell me what a moral theory says, such as what the details of consequentialism, as opposed to deontology, are, I can't defer to a moral expert on the issue of what the correct moral conclusion is, independently of the adoption of a specific moral view. By contrast, I can defer to a scientist or a mathematician as to which conclusion to believe on the basis of the premises.

Premise 2: The current model for critical thinking and civic debate education is dominated by a western account of informal logic, formal logic, debate rules and intellectual virtues.

The key defence I will offer for Premise 2 relies on an examination of two of the most commonly used textbooks for critical thinking in the US and UK, Patrick Hurley's A Concise Introduction to Logic and Lewis Vaughn's The Power of Critical Thinking: Effective Reasoning about Ordinary and Extraordinary Claims. The guiding idea of the argument is that if our main textbooks for teaching critical thinking and logic at the introductory level do not engage non-western philosophy, we can reasonably infer that those who use the textbooks are not teaching critical thinking and logic by way of engaging non-western sources. Of course there will be those who supplement the texts, perhaps for the very reason I am presenting here: that the texts lack non-western ideas. But we can safely examine and entertain the claim that I am defending as being primarily about textbooks as opposed to variable classroom practice. More importantly, if the textbooks are widely used, which they are, we can ask: what do they represent about critical thinking and civic debate education?

I will focus my examination of non-western contributions to critical thinking on two places in critical thinking education where adjustments can be made. The point of this presentation is to show that our main textbooks can be altered to include non-western sources in specific ways. Although there are many contributions I could discuss, for simplicity I will focus on contributions from the Nyāya School of classical Indian philosophy concerning the nature of argumentation. In future work I will discuss other cases, such as those deriving from Africana, Arabic, Jaina, and Jewish philosophy. I begin my investigation of the Nyāya by answering a basic question: Is there a distinction in Indian philosophy between different kinds of discussions that allows us to isolate critical discourse from non-critical discourse?

Is there any critical thinking in Indian philosophy?

Of course, this question cannot be seriously entertained by anyone who works in Indology or Asian and comparative philosophy. However, for those who are not in the know, a presentation and defence of an affirmative answer must be made. For if one is to include non-western ideas about critical thinking in a textbook that is eventually used to teach the subject, one needs to show that non-western traditions are in fact engaging in critical thinking. To do that, we need to look at competing views of what critical thinking is, the content question, in order to locate critical thinking outside of the west.

The Skill View holds that critical thinking is exhausted by the acquisition and proper deployment of critical thinking skills.
The Character View holds that critical thinking involves the acquisition and proper deployment of specific skills as well as the acquisition of specific character traits, dispositions, attitudes, and habits of mind. These components are aspects of the “critical spirit” (Siegel, 1993, pp. 163–165).

Given this distinction, where does classical Indian philosophy fall? We have three options: Indian philosophical traditions take the Skill View, they take the Character View, or there is no discussion at all of either view. I will show that some Indian philosophical traditions make a distinction between various kinds of discussion, one of which is a critical discussion, and that there is evidence for the character view of critical thinking.

Consider first the three kinds of discussion distinguished in the Nyāya Sūtras, in Sinha's (1990) translation:

Discussion is the adoption of one of two opposing sides. What is adopted is analyzed in the form of the five members, and defended by the aid of any of the means of right knowledge, while its opposite is assailed by confutation, without deviation from the established tenets (Sinha, 1990, p. 19).
Wrangling, which aims at gaining victory, is the defense or attack of a proposition in the manner aforesaid, by quibbles, futilities, and other processes which deserve rebuke (Sinha, 1990, p. 20).
Cavil is a kind of wrangling, which consists in mere attacks on the opposite side (Sinha, 1990, p. 20).

Matilal maintains, on the basis of Akṣapāda's Nyāya Sūtras, that there are three distinct kinds of discussions. Vāda is an honest debate where both sides, proponent and opponent, are seeking the truth, that is, wanting to establish the right view. Jalpa, by contrast, is a discussion/debate in which one tries to win by any means, fair or unfair. Vitaṇḍā is a discussion in which one aims to destroy or demolish the opponent no matter how. One way to explain the distinctions is as follows: (i) vāda is an honest debate for the purposes of finding the truth; (ii) jalpa is a debate aimed at victory where one propounds a thesis; (iii) vitaṇḍā is a debate aimed at victory, where no thesis is defended and one simply aims to demolish the view propounded by the proponent.4 The distinction between these three kinds of discussions grounds the claim that classical Indian philosophers were aware of different kinds of discussions based on the purpose of the discussion, and that critical thinking, for the purposes of finding the truth on an issue, was not at all a foreign idea.

Evidence for the character view can be found in the following passages from Van Loon's (2002) Handbook of Ayurveda:

One who has acquired the knowledge (given by the authoritative text) based on various reasons and refuting the opponent's view in debates, does not get fastened by the pressure of the opponent's arguments nor does he get subdued by their arguments (Van Loon, 2002, p. 115).
Discussion with specialists: promotes pursuit and advancement of knowledge, provides dexterity, improves power of speaking, illumines fame, removes doubt in scriptures, if any, by repeating the topics, and it creates confidence in case there is any doubt, and brings forth new ideas. The ideas memorized in study from the teacher, will become firm when applied in (competitive) discussion (Van Loon, 2002, pp. 115–116).
Discussion with specialists is of two types—friendly discussion and hostile discussion. The friendly discussion is held with one who is endowed with learning, understanding and the power of expression and contradiction, devoid of irritability, having uncensored knowledge, without jealousy, able to be convinced and convince others, enduring and adept in the art of sweet conversation. While in discussion with such a person one should speak confidently, put questions unhesitatingly, reply to the sincere questioner with elaborateness, not be agitated with fear of defect, not be exhilarated on defeating the partner, nor boast before others, not hold fast to his solitary view due to attachment, not explain what is unknown to him, and convince the other party with politeness and be cautious in that. This is the method of friendly discussion (Van Loon, 2002, pp. 117–118, emphasis added).

The passages from the Handbook of Ayurveda, especially the emphasised portion, substantiate the idea that the character view is in play in one of the oldest recorded presentations of critical reasoning and how it is to be executed.

Furthermore, in his Indian Logic, Jonardon Ganeri (2004) presents a picture of argumentation and critical thinking in ancient India by turning to the classic dialogue of the Buddhist tradition: Milinda-pañha (Questions for King Milinda). Ganeri presents an important passage on discussion and critical thinking.5

Milinda: Reverend Sir, will you discuss with me again?

Nāgasena: If your Majesty will discuss (vāda) as a scholar, well, but if you will discuss as a king, no.

Milinda: How is it that scholars discuss?

Nāgasena: When scholars talk a matter over one with another, then there is a winding up, an unraveling, one or other is convicted of error, and he then acknowledges his mistake; distinctions are drawn, and contra-distinctions; and yet thereby they are not angered. Thus do scholars, O King, discuss.

Milinda: And how do kings discuss?

Nāgasena: When a king, your Majesty, discusses a matter, and he advances a point, if any one differ from him on that point, he is apt to fine him, saying “Inflict such and such a punishment upon that fellow!” Thus, your Majesty, do kings discuss.

Milinda: Very well. It is as a scholar, not as a king, that I will discuss. (As quoted in Ganeri, 2004, p. 17)

The crucial sentence bears repeating:

When scholars talk a matter over one with another, then is there a winding up, an unraveling, one or other is convicted of error, and he then acknowledges his mistake; distinctions are drawn, and contra-distinctions; and yet thereby they are not angered (as quoted in Ganeri, 2004, p. 17, emphasis added).

One reading of this claim is that Nāgasena is pointing out that a good discussion requires not only that certain moves are made, ‘a winding up’ and an ‘unraveling’, but that the persons involved in making those moves have a certain epistemic temper. Participants in a good debate, moreover, have the capacity, and exercise the capacity, to (i) acknowledge mistakes, and (ii) not become angered by the consequences of where the inquiry leads. Nāgasena's answer to King Milinda suggests that Buddhist accounts of critical thinking also adopt the character view as opposed to the skill view. It is not enough to simply know how to ‘make moves’, ‘destroy’ or ‘demolish’ an opponent by various techniques. What is central to an honest debate is that a participant must also have an attitude and character that exemplifies a specific epistemic temper.

If one agrees with the character view, then this simple passage from Milinda-pañha could be compared with other passages, such as from the Meno, to teach critical thinking students what critical thinking is about.6

The tale of two syllogisms

But once we have introduced students to what critical thinking is, we are often faced with having to show them how to present their ideas for the purposes of a critical discussion. This takes us to the normative question: what are the appropriate forms, norms and intellectual virtues by which we should engage in critical thinking and civic debate? Many contemporary introductory level textbooks, such as Hurley's Concise Introduction to Logic and Vaughn's The Power of Critical Thinking, contain a section where they present and discuss how an argument should be put into what is often called standard form. The notion of a standard form is normative. It suggests that there is a way an argument should be presented for the purposes of engaging someone in a dialectical inquiry. Often discussion of standard form takes place either in the context of the presentation of how to identify an argument, or in the area where Aristotelian Categorical Logic is presented. However, the presentation of what constitutes a good argument, in either Hurley or Vaughn, is not given comparatively by considering other traditions. It is simply presupposed that there is no alternative way in which one could present an argument. In contrast to the Aristotelian picture, the Hindu Syllogism has a different structure. It was developed and debated in classical Hindu, Buddhist and Jain philosophy for centuries. What is the basic contrast between the Aristotelian Syllogism and the Hindu Syllogism?

Aristotle was the ancient Greek philosopher who first codified logic for the western tradition. Students of logic and critical thinking are often brought into the topic of the syllogism and the standard form of reasoning by the following example from Aristotle.

Major Premise: All men are mortal.

Minor Premise: Socrates is a man.

Conclusion: Socrates is mortal.

Akṣapāda Gautama was the founding father of the Nyāya School of philosophy. Like Aristotle he was also concerned with the proper form of how an argument should be displayed. The most commonly discussed argument in Indian philosophy deriving from his work, and perhaps even earlier, is the following:

Thesis: The hill has fire.

Reason/Mark: Because of smoke.

Rule/Examples: Wherever there is smoke, there is fire, as in a kitchen.

Application: This is such a case, i.e., the hill has smoke pervaded by fire.

Conclusion: Therefore it is so, i.e., the hill has fire. 7

There are many differences between the two examples. Two of the most important, highlighted by Matilal (1985, pp. 6–7), are: (i) Aristotle's Syllogism is in subject-predicate form, while Akṣapāda's Syllogism is in property-location form; (ii) Aristotle's study of syllogistic inference is primarily about universal and particular form propositions, while Akṣapāda's study involves singular propositions in the thesis and conclusion. Even though there are these differences, both examples have a similar normative force. They are both offered as a case of good reasoning, and they are both examples of how one should present one's argument in a debate.
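Matilal's first contrast can be made vivid with a rough formal gloss. The notation is mine and is meant only to be suggestive, not to capture Nyāya semantics:

% Aristotle: subject-predicate form, a predicate applied to a subject
\[ \mathrm{Mortal}(\mathit{socrates}) \]

% Nyāya: property-location form, a property occurring at a locus
\[ \mathrm{occurs}(\mathit{fire}, \mathit{hill}) \]

% The rule behind the Reason step: wherever smoke occurs, fire occurs
\[ \forall x\,\big(\mathrm{occurs}(\mathit{smoke}, x) \rightarrow \mathrm{occurs}(\mathit{fire}, x)\big) \]

On the first rendering a predicate is true of a subject; on the second a property is located at a locus, and the rule quantifies over loci rather than over individuals falling under a subject term.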

Given that neither Hurley nor Vaughn discusses the Hindu Syllogism, we might ask the following questions. Does it make sense as an argument form? Is there any benefit to teaching it? What do we gain by including it?

The western gaze on classical Indian logic

To answer these questions we need to look at the history of the reception of the Hindu Syllogism and how to correct the colonialist interpretation of it.

In the study of classical Indian logic from the Anglo-European point of view it is well known that the Hindu Syllogism received a great deal of criticism and was often presented as being inferior to the Aristotelian Syllogism. Jonardon Ganeri (2001) has compiled a list of some of these critiques in his work on Indian Logic:

[Western philosophy] looks outward and is concerned with Logic and with the presuppositions of scientific knowledge; [Indian philosophy] looks inward, into the ‘deep yet dazzling darkness’ of the mystical consciousness (as quoted in Ganeri, 2001, p. 1).
I have a great doubt of [Indian Logical] views becoming of any value whatever in the cause of general knowledge or science, or of ever having any fair claim to be admitted as an integral part of the Catholic philosophy of mankind. It is absurd to conceive that a logic can be of any value from a people who have not a single sound philosophical principle, nor any intellectual power whatever to work out a problem connected with human nature in a manner that is at all rational or intelligent. Reasoning at least in the higher forms of it among such semi-barbarous nations, must be at its lowest ebb; [and there] does [not] seem to be any intellectual stamina, in such races of men, to impart to it more vigour and rationality (as quoted in Ganeri, 2001, p. 7, brackets added).
One point alone appears certain, and that is, that they [the Nyāya] can lay but slight claims to accuracy of exposition. This is proved clearly enough by the form of their syllogism, which is made to consist of five instead of three parts. Two of these are manifestly superfluous, while by the introduction of an example in the third the universality of the conclusion is vitiated (as quoted in Ganeri, 2001, p. 9).
That Hindu philosophy will have any great influence on the development of European philosophy and mediately of European civilization must be denied. You are compelled to think by reading the works of the Greeks, they introduce you to the process of their thoughts, and by this force you to accompany them with your own thoughts, until you arrive as it were by your own mind at the principles of their systems … The Hindus, on the other hand, are dogmatical. They commence synthetically with a statement of their principles, yet do not condescend to unfold the train of thought which has led to them (as quoted in Ganeri, 2001, p. 14).

As a consequence of these attitudes, one can see how and why it may have been acceptable to exclude Indian contributions to logic for the purposes of teaching. The guiding idea is that if the Hindu Syllogism is actually confused and not a good form of reasoning, then we ought not to teach it in a critical thinking course. Thus, one needs to defend the plausibility of teaching the Hindu Syllogism through a partial defence of what is valuable in it. Below I offer an account of some of the criticisms of the Hindu Syllogism, based on the work of Ganeri (1996, 2001). From there I proceed to a defence of the Hindu Syllogism through an examination of J. L. Shaw's (2010, 2016a, 2016b) work on the distinction between inference for oneself and inference for another, and Gaṅgeśa's notion of relevance, and my own distinction between different models through which we can understand a piece of reasoning.

The criticisms of the so-called Hindu Syllogism come largely from having two important figures in western logic in mind when thinking about it: Aristotle and Mill. The former is important for his work on the codification of deductive patterns of inference; the latter for his work on inductive inference. Here are some common criticisms of the Hindu Syllogism:

1. It is redundant, since the Thesis and the Conclusion say the same thing.
2. It is superfluous, since the Application step is unnecessary.
3. It is a convoluted hybrid of two distinct types of reasoning: inductive and deductive.

In particular, the argument can be broken down as follows.

Deductive component:

All locations where there is smoke are locations where there is fire.
There is smoke on the hill.
∴ There is fire on the hill.

Notice this has the same form as Aristotle's argument: All men are mortal. Socrates is a man. ∴ Socrates is mortal.

Inductive component:

In a kitchen a fire is followed by smoke.
∴ In all cases fire is followed by smoke.
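For readers who want the contrast fully explicit, the two components can be rendered in modern notation. The formalization is mine, offered only as a gloss on the western reading:

% Deductive component: a valid universal instantiation
\[ \forall x\,\big(S(x) \rightarrow F(x)\big), \quad S(h) \;\vdash\; F(h) \]

% Inductive component: a generalisation from a single observed instance
\[ S(k) \wedge F(k) \;\nvdash\; \forall x\,\big(S(x) \rightarrow F(x)\big) \]

Here $S$ reads 'has smoke', $F$ reads 'has fire', $h$ names the hill and $k$ names the kitchen. The first inference is deductively valid; the second is deductively invalid and, read as an induction from one case, weak. That is precisely the critic's complaint.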

Given that the argument can be broken down into two independent and distinct arguments, it can be argued that: (i) the good part is simply the deductive version offered by Aristotle, and (ii) the bad part is contributed by Akṣapāda when the inductive component is combined with the deductive component. The inductive component joined to the deductive component is bad because a single instance is never capable of proving a universal rule. On an available western interpretation, the critical question is: why does Akṣapāda think that the observation of fire in the kitchen followed by smoke is enough to justify the claim that all locations where there is smoke are locations where there is fire? This question can be amplified into an argument that suggests that we should not teach the Hindu Syllogism in critical thinking, since it would confuse students about the difference between inductive and deductive reasoning. Although that critique assumes that we have the right account of how to divide different kinds of reasoning, I will forgo challenging that claim and simply show how the Hindu Syllogism can be defended even with that account in place. Thus, it will be useful to decode the western gaze on the Hindu Syllogism by providing an interpretative lens that is available from within western philosophy and classical Indian philosophy.8

A corrective lens for the western gaze on the Hindu Syllogism

Not all western philosophers saw classical Indian philosophy in a negative light. Some who were more careful readers of the tradition saw that something drastically different could be going on. These interpretations are largely in line with how several philosophers in the Nyāya tradition see the steps in the set-up of the Hindu Syllogism. I will build a defence of the Hindu Syllogism based on some of these ideas, in conjunction with the following idea. Tarka-vidyā is the science of debate. It is an open question whether we should think of the classical Indian tradition of engaging in debate, logic and philosophy along the same lines as we find in Plato's presentation of philosophy as distinct from rhetoric, and Aristotle's codification of logic separately from rhetoric. In particular, there is nothing in classical Indian philosophy that speaks to the issue of separating ‘philosophy’ from the art of persuasion. All traditions of Indian philosophy are steeped in debate, and have their own competing manuals of debate.

Following the work of J. L. Shaw (2016a), it is worth noting three points. First, the theory of inference in classical Indian philosophy is largely based on the idea of how to cause a specific cognition (the conclusion) to arise on the basis of steps leading to the conclusion. Second, a good argument is one that is free of specific kinds of defects that can block the conclusion from arising in the correct way. Third, it is central to understanding inference in classical Indian philosophy to pay attention to the distinction between inference for oneself vs. inference for another, and to the concept of relevance as it pertains to questions and answers.9 Finally, as a helpful comparative guide, we can distinguish between three models of reasoning:

In the manipulation model, reasoning is fundamentally about manipulating a person's mind so that they believe what you want them to believe, no matter how that is brought about through reasoning.
In the veritic model, reasoning is fundamentally about finding the truth.
In the erotetic model, reasoning is fundamentally about engaging questions that arise from natural doubts or through dialectical inquiry.

Given the different models of reasoning we might ask: what base model of reasoning is at play in the western interpretation of the Hindu Syllogism? If the answer comes from philosophers and logicians in the Anglo-European tradition, the veritic model is likely in use, since that model is the one associated with philosophy and with the science of logic. Logic is about what follows from what. For example, propositional logic is about what we can conclude about the truth of a compound formula, such as (P ∧ Q), on the basis of the truth-values of its components, such as that P is true and Q is true. However, we can look at the Hindu Syllogism from another perspective.
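Before turning to that perspective, the veritic picture can be illustrated with the standard truth table for conjunction:

\[
\begin{array}{cc|c}
P & Q & P \wedge Q \\
\hline
\mathrm{T} & \mathrm{T} & \mathrm{T} \\
\mathrm{T} & \mathrm{F} & \mathrm{F} \\
\mathrm{F} & \mathrm{T} & \mathrm{F} \\
\mathrm{F} & \mathrm{F} & \mathrm{F}
\end{array}
\]

On the veritic model, this table settles everything there is to settle about (P ∧ Q). The erotetic reading of the Hindu Syllogism, by contrast, asks what question each step of a piece of reasoning answers.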

According to the Nyāya each of the sentences in an inference for others is an answer to a question and each of them, except the last one, will give rise to a question. Moreover, each of them is used to generate a cognition in the hearer (Shaw, 2010, p. 45).
In an inference for others, all the five sentences are needed, because each of them is an answer to a different question and gives some new information. But in an inference for oneself all of them are not required and there is no need to use a sentence. Hence a deaf and a mute person can also have an inferential cognition (Shaw, 2010, p. 46).
According to Gaṅgeśa, a Navya-Nyāya philosopher, there are several types of relevance (saṅgati). The three kinds important for inference are: (i) justification (upodghāta), (ii) cause-effect (kāryatva), and (iii) cessation of objectionable questions (avasara). These three concepts of relevance are tied to an epistemic account of inference in terms of answering certain questions that arise from doubt. The steps in an inference aim to provide justification that puts an end to questions, including questions about the sequence of steps. This conception is central for understanding the classical Indian conception of dialectical reasoning. This account of relevance deals with erotetic ordering effects. The core idea is that by sequencing statements in a certain way relative to certain intellects, we can lead one to the conclusion we want and end questions that arise either from doubt or objections (Shaw, 2016b, pp. 286–293).

Let us look at what happens when we reinterpret the Hindu Syllogism as an inference for others under the erotetic model, bearing in mind the sequencing of statements.

According to Shaw (2016a, pp. 92–100), the correct account of the Hindu Syllogism is the following:

(pratijñā): The hill has fire. (The thesis is an answer to a question that arises on the basis of doubt. The question is: what is to be established?)

(hetu): Because of smoke. (The reason is an answer to the question: what signifies what is to be established? In this case smoke signifies what is to be established.)

(udāharaṇa): Wherever there is smoke there is fire, as in a kitchen when one is cooking and observes fire followed by smoke. (The rule/example is an answer to the question: why should one consider a to be a signifier of b? In this case: why is smoke a signifier of fire? The answer is given by stating a rule along with examples.)

(upanaya): The case of smoke on the hill is like the case of smoke in the kitchen. (The application step answers the question: Is the hill characterised by the particular and relevant kind of smoke?)

(nigamana): The hill has fire. (The conclusion is an answer to the question: Is the fire, which is the significate of this kind of smoke, present on the hill? This is how the conclusion removes the doubt expressed in the thesis.)
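One way to summarise the erotetic reading is as an ordered sequence of question-answer pairs, where each step answers the question raised by its predecessor. The schema below is my own summary of Shaw's account, not his notation:

\[
\begin{array}{l|l|l}
\text{step} & \text{answers} & \text{raises} \\
\hline
\text{pratijñā} & \text{What is to be established?} & \text{What signifies it?} \\
\text{hetu} & \text{What signifies it?} & \text{Why is that a signifier?} \\
\text{udāharaṇa} & \text{Why is that a signifier?} & \text{Does this case fall under the rule?} \\
\text{upanaya} & \text{Does this case fall under the rule?} & \text{Is the significate present here?} \\
\text{nigamana} & \text{Is the significate present here?} & \text{(none: the doubt is removed)}
\end{array}
\]

On this schema the Thesis and the Conclusion are not redundant: they answer different questions, the first fixing what is in doubt, the last recording that the doubt has been removed.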

Under the erotetic/inference-for-others model, nothing is redundant and nothing is superfluous: the steps follow naturally from a series of questions that one would ask one's interlocutor for the purposes of understanding why they believe that there is a fire on the hill. And importantly, the erotetic model does not preclude the discovery of truth. Rather, it aims at it through the investigation of questions. The main point is that one is led into critiquing the Hindu Syllogism as redundant and superfluous by failing to draw a distinction between an inference for oneself vs. an inference for others, as well as by failing to distinguish between different models we can use to understand a representation of reasoning. Who or what is the representation for?

Furthermore, Shaw informs us that some theorists in classical Indian philosophy hold that how many steps there should be in an inference for others is relative to one person's understanding of another's intellect.10 For example, some scholars point out that for a sharp intellect one might only use the third and fourth steps, and for a middle intellect one might only use the third, fourth and fifth steps, while for a soft intellect one should use all five steps. This kind of theory makes sense if we are focused on a causal account of how the conclusion is caused to be cognised in an individual. The core question is: what does it take to correctly cause someone to cognise the conclusion in a way in which they will understand it?

Thus, a complete understanding of the theory of cognition, and of the context in which the theory of argument and debate developed in India, is required for understanding why exactly the steps are given. Once that is in place, the initial objections go away.

To see the contrast one more time, consider the Inductive Veritic version of the example step:

I have observed fire followed by smoke in my kitchen. ∴ Wherever there is smoke there is fire.

Read veritically, this inference is hopeless: a single observed instance cannot establish a universal rule. By contrast, the Analogical Erotetic version looks good.

Asha: Why do you believe that wherever there is smoke there is fire?

Anu: I have observed fire followed by smoke in my kitchen. Have you observed that?

Asha: Yes, I have.

Anu: I think the case of fire followed by smoke in my kitchen is the same as what is going on over there where I see smoke on the hill.

Asha: Why do you think these cases are similar?

Anu: Well, I have never seen fire without smoke, nor have I ever seen smoke without fire. That is, I have always observed the co-presence of smoke and fire and the co-absence of smoke and fire.

In the dialogue the main point of offering the example is not to inductively offer support for the conclusion. Rather, the point is to offer an example that has the following properties:

1. The interlocutor is likely to have experienced the same thing.
2. The example has the properties of the universal claim.
3. One can move from the example to an understanding of why one would believe the universal claim.

We can see the force of the use of the example in the dialogue by paying attention to what would happen had the interlocutor, Asha, responded differently. Imagine the following alteration in the conversation.

Asha: No! I have never seen fire followed by smoke in my kitchen because I don't cook. So, I don't see any reason to think that there is fire on the hill over there because there is smoke on the hill, in a way that is similar to what is observed in a kitchen when one is cooking.

We can interpret Anu as giving the example of the kitchen, as opposed to citing fire followed by smoke in some other case, because there is a likelihood that Asha has experienced something similar that would allow her to see why Anu holds that wherever there is smoke there is fire. Now, when Asha answers in the negative, this puts Anu in the situation of having to produce another example, since the first example cannot be used to persuade or help Asha understand why one should or would believe that there is fire on the hill simply because there is smoke on the hill and wherever there is smoke there is fire.

Moreover, we should be sensitive to the difference between the following questions:

Argument: What is the argument?

Knowledge: How are the premises of the argument known?

Persuasion: How do I get someone to believe the conclusion?

A natural question to ask after we have identified an argument is: how are the premises known? While the tradition stemming from Aristotle tends to separate the identification of the argument from how the premises are known, and from how we should go about convincing someone in a debate, the tradition stemming from Akṣapāda does not. The Hindu Syllogism binds the logical, epistemic and persuasive aspects of reasoning together. And in fact, when we look at scientific reasoning, this is what we often see. In science we are always concerned with using induction and deduction together. The idea that an argument can be good in science independently of the knowability of the premises is anathema to scientific investigation. Thus, we can see that there are virtues to at least a comparative examination of what counts as a legitimate argument form, and that by introducing our students to what an argument is through a comparative examination we allow them to have an open mind about how discussion and argumentation can be conducted.

Premise 3: Critical thinking education should include contributions from non-western philosophers.

Even though I have argued that there are legitimate things we can teach from outside the western tradition of logic and critical thinking, it does not follow that we should include them in a course on critical thinking and logic. Thus, I will begin a defence of this premise by examining a change that has occurred in textbooks for critical thinking, and I will use that change as the basis for posing a critical question: how can we allow for one kind of change, and not another?

If we examine, for example, the 1st–10th editions of Hurley's Concise Introduction to Logic and compare them to the 12th edition, we will see some changes with respect to explanations and problem sets, but we will also note an additional stark contrast. While the earlier editions only discuss philosophical contributions from men, such as Aristotle, Boole, Venn, Frege, Quine and Kripke, the 12th edition includes discussion of Ruth Barcan Marcus and Ada Byron Lovelace.11 Why was the change made? One hypothesis is that there was external pressure on the author, from the public at large, the external reviewers, or the publisher, to change the fact that the book was representing critical thinking and logic as a field to which only men contributed. There are two basic ideas here. First, it is wrong to present logic solely through the contributions of men if in fact women did make contributions. Second, there might be something like an upward identity trajectory for women in logic and critical thinking when we present the field alongside the fact that women made important contributions to it. Another way to see the second point is as follows: by not presenting the works of women in logic, teachers and the book itself reinforced the already present idea that logic and critical thinking is for men, and not for women (more on this in the third section). But now to the critical question: why include women and leave out non-western thinkers? One way to show that there is no good reason to draw a difference is simply to examine a number of responses to this question and show how each is ineffective. The responses to the question will come by way of objections to the idea of including non-western sources.

Objection 1: Non-western thinkers do not belong in a logic and critical thinking textbook because they have no ideas that pertain to logic and critical thinking.

Response 1: In the prior section I defended the idea that the Nyāya School of classical Indian philosophy has important ideas that are contributions to logic and critical reasoning. So, at this point what is important to point out is that the Nyāya School is one of many traditions that could be appealed to. Contributions, to name a few, have also been made by the Africana, Jain, Buddhist, Arabic and Mohist traditions.

Objection 2: Non-western thinkers contributed ideas to logic and critical thinking, but all of their contributions are false, irrelevant, or not important.

Response 2: In the prior section I argued that while one can read the Hindu Syllogism as a confused bit of proto-logic that forms part of the general history of logic, this reading is not the only one available. Against the reading I offered a corrective lens, internal to the western tradition, based on the distinction between veritic and erotetic models of reasoning, which can be used to show how the Hindu Syllogism makes sense. Thus, the main response is that some of our thoughts about contributions from non-western philosophers are themselves confused, because we impose a singular western lens on those contributions when we interpret them. More importantly, we have the following situation. In some cases we could be interpreting a contribution from a non-western thinker as being incoherent because we are using the wrong lens for interpreting what is going on. In another case, it may be that the contribution counts as wrong only because we assume that there is only one correct understanding of western logic, as if no one in western logic has debated what the correct account of logic is. For example, independently of the contributions of non-western thinkers, there is a debate internal to western philosophy over whether the logical connectives should be given a classical, intuitionistic or paraconsistent interpretation. And that debate sits alongside the debate over whether logical monism or logical pluralism is correct, that is, the debate about whether there is more than one correct account of the consequence relation: B is a logical consequence of A. But perhaps it is too much to defend the claim that the contributions coming from non-western traditions are in fact correct or better than those found in a standard logic and critical thinking textbook. So, let's consider a stronger, and distinct, objection.

Objection 3: Logic and critical thinking textbooks should only contain information that is, to the best of our knowledge, true.

Response 3: The core of the objection is that we should only include contributions from non-western thinkers once they have been defended at a higher level and shown to be superior to, or at least as good as, the ideas that are presently discussed in an introductory level book. One argument for this is by way of analogy. Just as we don't include discussion of intuitionistic logic in an introduction to logic and critical thinking course, but rather only classical logic, we need not include ideas from non-western logic. Only the best, true ideas about logic and critical thinking should be in an introductory level book.

Of course, this objection would be powerful if it were in fact true, that is, if it were true that logic and critical thinking textbooks only contain true theories about how to reason. Let's consider one issue found in many introductory level textbooks: the inference from a universal proposition to a particular one, often discussed under the heading of existential import.

1. All men are mortal.
∴ 2. Some men are mortal.

Under Aristotle's interpretation the universal claim that All As are Bs entails the particular claim that Some As are Bs, because we can only be talking about categories that contain at least one instance. However, Boole disagrees, since some universal sentences articulate essential properties of entities, or definitions of entities, which are true, without there being anything that falls under one of the categories.

1. All unicorns are single-horned creatures.
∴ 2. Some unicorns are single-horned creatures.

On Boole's interpretation, universal claims, such as All As are Bs, need not imply particular claims, such as Some As are Bs, because there might not be any entities that fall under one or the other of the categories. The fact that we have a true statement about unicorns embedded in the sentence ‘All unicorns are single-horned creatures’ can be very useful even if there are no unicorns. For example, we may wonder whether there are any creatures of a certain kind, and then go search for them on the basis of the statement. Surely, we can discover that there are no creatures of the relevant kind. As a consequence, we would conclude that there are no unicorns.
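In modern notation the disagreement comes to this; the rendering is a standard one and is not specific to either textbook:

% Boole: the universal does not entail the particular
\[ \forall x\,\big(U(x) \rightarrow S(x)\big) \;\not\models\; \exists x\,\big(U(x) \wedge S(x)\big) \]

% Aristotle: the entailment holds, given the presupposition that the subject term is non-empty
\[ \forall x\,\big(U(x) \rightarrow S(x)\big),\; \exists x\,U(x) \;\models\; \exists x\,\big(U(x) \wedge S(x)\big) \]

Here $U$ reads 'is a unicorn' and $S$ reads 'is a single-horned creature'. When nothing satisfies $U$, the universal is vacuously true and the particular is false, which is exactly Boole's point.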
Thus, Boole's interpretation and Aristotle's both make sense, so it seems reasonable to teach them both. But then what is the objection to teaching the Hindu Syllogism alongside Aristotle's Syllogism? The fact is that logic and critical thinking textbooks do not teach (i) only things that are true, or (ii) only uncontroversial truths. For the most part they teach that which has been canonised. There is no reason a comparative presentation of the Hindu Syllogism and Aristotle's Syllogism cannot be taught in much the same way that we currently teach Aristotle's Square of Opposition comparatively with Boole's interpretation of it, where existential import fails. The fact is that even if the Hindu Syllogism were inferior to Aristotle's (which it isn't), we could still teach them comparatively, just as we already do in the case of teaching universal to particular inferences.

Conclusion: Critical thinking education should be revised so as to be inclusive of contributions from non-western thinkers.

Let me conclude my presentation of the argument by clarifying its conclusion, so as to block an immediate objection aimed at the conclusion itself, as opposed to the premises. One might simply object to the conclusion by pointing to the fact that there are textbooks available to educators that focus on cross-cultural critical thinking, or at least texts that are sensitive to ideas that come from outside of the western canon. For example, Wanda Teays's (1996) groundbreaking Second Thoughts: Critical Thinking from a Multicultural Perspective and Maureen Linker's (2015) Intellectual Empathy: Critical Thinking for Social Justice are two texts that include material from non-western traditions. However, this objection to the conclusion, based on pointing to texts like Teays's and Linker's, rests on a confusion between two ways in which critical thinking can be cross-culturally sensitive.

The multicultural approach to critical thinking takes critical thinking tools that originated in the west and applies them to the multicultural world we live in. By contrast, the cross-cultural approach to critical thinking aims to include tools that originated in non-western traditions in the actual curriculum of critical thinking, both for the purposes of improving the set of tools available and out of respect for the idea of inclusion in critical thinking. While the two approaches are distinct, they are not mutually exclusive. One could write a text that is both cross-cultural and multicultural. For example, meditation is a tool of critical thinking that derives from Hindu and Buddhist philosophy. It aims to help us gain critical self-understanding of our own mental states. It can be included in a critical thinking textbook as an example of critical thinking from outside the western canon. It is, for the most part, a non-western contribution to critical thinking.

Moreover, it should be clear that the argument offered here aims at the cross-cultural inclusion approach.

Suppose now that the argument for inclusion of non-western ideas into critical thinking education is good. One critical question we can ask is the following: where should non-western ideas about critical thinking and logic be taught? One response is simply this: of course non-western contributions to logic and critical thinking should be taught; however, they should be taught in an area studies course, such as Asian Philosophy. They do not belong in an introductory level course on logic and critical thinking, especially one that aims to help us understand how to think critically in the context of public policy and decision making through civic debate and public discourse.

To frame a response, contrast the character view introduced earlier with a more demanding alternative:

The Character View holds that critical thinking involves the acquisition and proper deployment of specific skills as well as the acquisition of specific character traits, dispositions, attitudes, and habits of mind. These components are aspects of the ‘critical spirit’.
The Comprehensive View holds that critical thinking involves (i) the development of the appropriate skills that are constitutive of critical thinking, (ii) along with the appropriate character traits, dispositions, attitudes, and habits of mind, which are constitutive of the ‘critical spirit’. However, it also requires (iii) that the skills/tools and the nature of the ‘critical spirit’ be derived from all traditions that have contributed to critical discourse. Finally, the view requires that at some point a critical thinker engage the meta-critical question about critical thinking, that is, that a critical thinker acquire a proper understanding and appreciation of the sources of critical discourse for the purposes of bringing harmony to all who participate in the activity.

On the basis of this distinction, the following argument can be made. At the introductory level the primary goal of a course on critical thinking and logic is to teach students thinking skills, since the skills are essential for college success, lifelong learning, civic engagement and public discourse. As a consequence, the historical source from which the skills derive is not important; the skill itself is important. One way to amplify the argument's force is to concede that it was a mistake to include references to western thinkers in the presentation of logic and critical thinking in the first place. The simple idea is that just as there is a difference between maths and the history of maths, there is a difference between logic and critical thinking and the history of it.

This argument is powerful, since there is so much need for students to learn critical thinking skills as opposed to the mere history of the discipline. But as soon as this point is made, a key presumption is revealed: that there are no skills that can be acquired through studying non-western contributions to logic, and that there is nothing to be gained critically by studying logic and critical thinking from a historically informed global perspective.

However, there is an interesting and substantial response that can be given to this point. The argument builds from the discussion in the prior section, where I explored the inclusion of women in critical thinking and logic textbooks in contrast to the absence of non-western thinkers. I will present the argument as an analogy:

1. Inclusion of women in critical thinking and logic textbooks, along with women role models for critical thinking and logic education, reduces stereotype threat.
2. The problem that women face in critical thinking and logic education is sufficiently similar to the case of minorities.
∴ Inclusion of minorities in critical thinking and logic textbooks, with minority role models for critical thinking and logic education, would reduce stereotype threat for minorities.

Stereotype threat occurs when a person believes they will be judged on the basis of some group-based stereotype. They do not need to believe the stereotype, and the stereotype need not even be prevalent in their environment. All that is necessary to activate this particular social identity threat is that a person believes that others will treat them negatively or evaluate them unfairly on the basis of one of their social identities. For example, a woman who thinks either that ‘women are not logical’ is true or that many other people believe this to be true may find that such a belief impacts her performance on logical tasks or enjoyment of these tasks (Lehan, 2015, pp. 3–4).
A female philosophy student will probably be in the minority as a woman in her department, and she'll almost certainly be in the minority as a woman if she takes classes in the more stereotypically male areas like (for example) logic, language and metaphysics. As she continues on to higher levels of study, the number of women will be steadily diminishing. In any class she takes other than feminist philosophy, she's likely to encounter a syllabus that consists overwhelmingly (often exclusively) of male authors. The people teaching most of the classes are also very likely to be male. All of these factors calling attention to low numbers of women are known to provoke stereotype threat. Since stereotype threat has its strongest effect on the most committed students, this means that the most committed women are likely to underperform (Saul, 2013, sec. 2.1, emphasis added).

Saul's point, in the emphasised text, is equally true of minority students and their upward trajectory in philosophy. The authors on the syllabi, and the people teaching the courses, will largely be white males.

[A] successful method for reducing stereotype threat is the introduction of counter-stereotype role models. One way to do this is to introduce students to members of the stereotyped group who have done well in the area. For example, “when female students are exposed to women that have performed successfully in mathematics and science related fields, they perform better than female students who do not have examples of women with such performance” … One study showed that reading essays about women who are successful in math can reduce the negative effects of stereotype threat … “Thus, direct and indirect exposure to women that have successfully navigated the field can be enough to reduce the negative impacts of stereotype threat for female students” … This suggests the importance of highlighting women in logic. “[T]he direction of [the] impact [of role model introduction] depends on the believed attainability of their success: Models of attainable success can be inspiring and self-enhancing, whereas models of unattainable success can be threatening and deflating”. In the interest of attainability, it is also extremely important to mention women currently working in logic such as Audrey Yap, Penelope Maddy, Dorothy Edgington, Susan Haack and many others conveniently listed on the Women in Logic list (Lehan, 2015, pp. 10–11).

Thus, given that the technique of including women in critical thinking textbooks, and as role models in the classroom, has successfully reduced stereotype threat for women, we can legitimately ask: would the same technique work for minorities? It seems that the relevant question to explore is: are the two cases similar enough? Are the stereotypes that women face the same as the stereotypes that minorities face? And, interestingly: what about the intersectional case of minority women? Here are some important considerations.

Unlike the category woman, the category minority is quite diverse, with the relevant stereotypes varying within the category. For example, do Asians face the same stereotype threat in a critical thinking and logic course that African Americans or Latin Americans face? Arguably they do not, given the model minority status often attributed to Asian Americans (Indian, Pakistani, Chinese, Korean or Japanese). The difference is that teachers in the US don't typically look at Asian Americans expecting them to do poorly in a critical thinking or logic course, as they might with an African American or Latin American student. But this opens up the intersectionality question: given that everyone who has a race also has a gender, could it be that the stereotype threat that women face applies without regard to racial differences? More specifically, do teachers operate with different implicit biases about Asian women than about African American women or Latin American women? And do these gender-race interactions alter the stereotype threat?

More research needs to be done on these questions; for the purposes of what I am arguing here, I cannot answer them. What is relevant to my argument is that there are two distinct questions in this area: one concerning performance, the other concerning retention. Suppose that Asian Americans, male or female, generally perform well in critical thinking and logic, so that they do not face a stereotype threat in the way that an African American male or a Latina female might. We might then say the following. Because of the stereotype threats that African Americans and Latin Americans face, they perform poorly, and their poor performance is one factor that accounts for why they do not stay in the field of philosophy. This cannot be the explanation in the case of Asian Americans, however, since there is no relevantly similar stereotype threat: many Asian women perform extremely well in first-year courses in logic and critical thinking. Yet retention remains a problem for them as well, and the explanation, I suggest, lies in a different stereotype, one concerning the content of Asian philosophy itself. Consider the following assessment:

Philosophy as a science could [not] originate among the Orientals, who, though susceptible of the elements of high culture, were content simply to retain them in a spirit of passive resignation (as quoted in Ganeri, 2001, p. 13).

The core idea is that showing interest in Asian philosophy is showing interest in something that is mystical, non-rational and not really philosophy or science. Asians are often pressured into performing well in the sciences as a sign of intelligence. Thus, studying Asian philosophy is seen as studying Asian religion, not as studying science. Anglophone philosophy focuses on logic and reason, and the stereotype of, for example, Chinese or Indian philosophy is that it does not, but is instead mystical in some form, in a bad sense. Consequently, Asian students typically adopt the dominant interests of western philosophers. The idea is that to be a real philosopher one must adopt an interest in western philosophy, since that is where one finds the true origins of rationality and science. In fact, one often finds that it is easier for non-Asians to show a genuine interest in Asian philosophy than it is for Asians, since Europeans face no stereotype threat when engaging with Asian philosophy; rather, they are seen as having an open-minded interest in other traditions.

As a consequence, the inclusion of non-western thinkers in critical thinking and logic education isn't just about informing others that non-western thinkers have contributed to critical thinking and logic in important ways. If it were about that, it could be accomplished by an area studies course. Rather, it is about altering perceptions, held by westerners and non-westerners alike, about the content of Asian philosophy. By introducing it in the context of an introduction to critical thinking and logic course, we do away with the idea that there is something separate called Buddhist logic or Chinese logic. We introduce students to critical thinking and logic through contributions from everyone who in fact contributed. In short:

We can make clear that critical thinking doesn't just come from the Greco-Roman-European tradition. It is part of the human condition. Many cultures contributed in interesting and controversial ways to what falls under the semantic range of the English phrase ‘critical thinking’. By introducing critical thinking through a cross-cultural lens, we can reduce the stereotype threat revolving around the idea that non-western cultures did not contribute to critical thinking, which is often touted as the prized reason for studying the humanities. We can help minority students who are interested stay in philosophy. We can help the dominant group come to a better understanding of the roots of critical thinking. We can point out that the way one person debates and discusses an issue of importance to their life doesn't always follow the way another person does, and that this kind of cross-cultural understanding is important for the possibility of meaningful public discourse, disagreement and the development of epistemic tolerance and temper: tolerance of other epistemic norms.

How should public education enable the ethical implementation and proper functioning of democratic processes, such as voting on the basis of public and civic discourse? Public education should provide citizens of a political body with basic skills in critical thinking, civic debate and ethical public discourse. We live in a multicultural world; the demographics of, for example, the US and the UK are now sufficiently diverse across persons of Indian, Chinese, Arabic, African … origin that ideas about critical discourse and discussion emanating from these traditions can no longer be left out. To present critical thinking as originating from the human condition, as opposed to the western condition, is to give proper place to each individual, in a diverse body of individuals, who participates in an ethical public exchange of ideas leading to an outcome that pertains to all.

Acknowledgements

I would like to thank Purushottama Bilimoria, Karin Brown, Janet Stemwedel, Rita Manning, Peter Hadreas, Tom Leddy, Krupa Patel, Jessica Kraft, Stephen Phillips, members of the Association for Informal Logic and Critical Thinking, The Society for Philosophy in the Contemporary World, San Jose State University's 2015 Buddhism Conference, and The Society for Asian and Comparative Philosophy for discussion of this piece. I would also like to thank two anonymous referees of JOPE for their generous and outstanding comments; the ideas in this paper were much improved by their suggestions and guidance. I owe a special intellectual debt to B. K. Matilal's (1985, 1998) and Jonardon Ganeri's (1996, 2001, 2004, 2011) outstanding work on Indian logic, and a deep debt of gratitude to Jaysankar Lal Shaw for his patience in explaining the importance of the precise formulation of the Nyāya account of inference, relevance and the distinction between inference for oneself and inference for others. Finally, I would like to thank my wife Manjula Rajan for helping me to see, through actual engagement, how the Hindu Syllogism works; her constant evaluation of examples, and pressing me for examples, has shown me how the method actually works. This paper is written in a form that mixes Aristotle's Syllogism and Akṣapāda's Syllogism.

Notes

It should be noted here that the distinction between the content question and the normative question is notional. One could argue that the content question either determines the answer to the normative question or restricts the acceptable answers to it. I separate the two notionally so as not to presuppose a specific answer to the question: how is the content of critical thinking related to the norms of civic debate and public discourse?

I take the present work to be an instance of the public-to-philosophy direction of fit, aided by Kraft's work on public debate in The Atlantic. Her work pushed me to examine the presuppositions of what is going on in critical thinking and logic education.

It should be clear that in pointing to the actions of these students I am in no way endorsing their behaviour. Rather, I am using their actions as an occasion for reflection on what constitutes critical thinking and what the norms for engaging in public discourse should be. Furthermore, it is important to note that there is more than one interpretation of what the students are trying to do by not engaging with the standard rules of civic debate that they were informed of prior to the competition. For example, it is possible to interpret their acts not as an engagement with an alternative model of critical thinking but as an act of civil disobedience. If their act is one of civil disobedience, then it is unlikely that we can claim that they are engaging in an alternative form of critical thinking. However, regardless of the multiple interpretations, it is possible to use an interpretation of their actions as a guide to the critical question: could they be engaging in critical thinking and civic debate, albeit an alternative form that may have its own merits?

Stephen Phillips's Critical Thinking in Service of Knowledge: Nyāya according to the Nyāya school of Classical Indian Philosophy, a currently unpublished presentation, contains a discussion of this way of drawing the distinction.

In this section and the next I borrow heavily from the work of Ganeri (1996, 2001, 2004). While there are many controversies surrounding what actually happens in classical Indian logic, for the purposes of this paper I have decided to present a picture showing that there are important contributions from Indian logic that can be used to teach critical thinking and logic at the introductory level. I take it that just as one can teach first-order logic while recognising that there are controversies concerning it, one can also teach portions of classical Indian logic while recognising that there are controversies concerning how to interpret it.

See Matilal, 1985, p. 11 for discussion of the comparative point about the division between different kinds of debate in Nyāya as compared with Plato's Meno.

The version I am offering of the standard example derives from the work of Jaysankar Lal Shaw in his Nyāya on the Sources of Knowledge (Shaw, 2016a) and from conversation. In that work he has articulated a sustained analysis of how the standard example of inference is to be presented. Although there is a debate in Indian philosophy, historically and in contemporary commentary, on the nature of inference, this should be no barrier to teaching the inference; if the existence of a debate were sufficient reason not to teach something, then we would not be teaching Aristotle either.

It is important to note, as B. K. Matilal (1985, pp. 2–3) does, that western Indologists and philosophers are not the only people to blame when it comes to confusions about the so-called ‘Hindu Syllogism’. Matilal critiques S. C. Vidyābhūṣaṇa's own article ‘Influence of Aristotle on the Development of the Syllogism in Indian Logic’, which appeared in the pioneering History of Indian Logic, published in 1920. In that article Vidyābhūṣaṇa attempts to show that there are commonalities between the syllogism in Indian logic and the ‘logical rules’ and syllogism found in Aristotle.

For a sustained presentation of the core ideas, see the Inference section in Shaw, 2016a, which provides an excellent discussion of the relevant points.

Shaw informs us that, according to Srinivasa Dasa in his book Yatīndramata-dīpikā, we must think of how many steps there are in a syllogism for others relative to our understanding of their intellect.

For example, compare Chapters 1 and 6 of the 12th edition of Hurley (2014) with Chapters 1 and 6 of the 8th edition. The chapters cover the same material in each edition, but only the 12th edition contains the presentation of Ruth Barcan Marcus (p. 35) and Ada Byron Lovelace (p. 353); the 8th edition contains neither. Yet both editions contain a discussion note on the history of logic: see p. 5 in the 8th edition and compare it to p. 5 in the 12th edition.

References

Ganeri, J. (1996) The Hindu Syllogism: Nineteenth-Century Perceptions of Indian Logical Thought, Philosophy East and West, 46.1, pp. 1–16.


Ganeri, J. (2001) Indian Logic: A Reader (New York, Routledge).

Ganeri, J. (2004) Indian Logic, in: D. Gabbay and J. Woods (eds.) Handbook of the History of Logic, Vol. 1 (Amsterdam, Elsevier).


Ganeri, J. (2011) The Lost Age of Reason: Philosophy in Early Modern India 1450–1700 (Oxford, Oxford University Press).

Huemer, M. (2005) Is Critical Thinking Epistemically Responsible?, Metaphilosophy, 36.4, pp. 522–531.

Hurley, P. (2014) A Concise Introduction to Logic, 12th edition (Boston, MA, Cengage-Wadsworth).

Kraft, J. (2014) Hacking Traditional College Debate's White Privilege Problem, The Atlantic, April 16, 2014. Available online at: http://www.theatlantic.com/education/archive/2014/04/traditional-college-debate-white-privilege/360746/ (last accessed: February 15, 2015).

Lehan, V. (2015) Reducing Stereotype Threat in First-Year Logic Classes, Feminist Philosophy Quarterly, 1.2.4, pp. 1–13.

Linker, M. (2014) Intellectual Empathy: Critical Thinking for Social Justice (Ann Arbor, MI, University of Michigan Press).

Matilal, B. K. (1985) Logic, Language, and Reality (Delhi, Motilal Banarsidass).

Matilal, B. K. (1998) The Character of Logic in India, J. Ganeri and H. Tiwari (eds.) (Albany, NY, State University of New York Press).

Saul, J. (2013) Implicit Bias, Stereotype Threat, and Women in Philosophy, in: F. Jenkins and K. Hutchinson (eds.) Women in Philosophy: What Needs to Change (Oxford, Oxford University Press).

Shaw, J. (2010) The Nyāya on Sources of Knowledge, in: P. Ghose (ed.) Materialism and Immaterialism in India and the West: Varying Vistas (Centre for Studies in Civilization, Ministry of Human Resources, Government of India, New Delhi), pp. 117–152.

Shaw, J. (2016a) The Nyāya on Sources of Knowledge, in: J. L. Shaw (ed.) The Collected Writings of Jaysankar Lal Shaw: Indian Analytic and Anglophone Philosophy (London, Bloomsbury).

Shaw, J. (2016b) The Concept of Relevance (Saṅgati) in Gaṅgeśa, in: J. L. Shaw (ed.) The Collected Writings of Jaysankar Lal Shaw: Indian Analytic and Anglophone Philosophy (London, Bloomsbury).

Siegel, H. (1993) Not by Skill Alone: The Centrality of Character to Critical Thinking, Informal Logic, 25.3, pp. 163–175.

Sinha, N. (1990) The Nyāya Sutras of Gotama, trans. M. M. Satisa Chandra Vidyābhūṣaṇa (Delhi, Motilal Banarsidass).

Teays, W. (1996) Second Thoughts: Critical Thinking from a Multicultural Perspective (Mayfield Publishing).

Vaidya, A. J. (2013) Epistemic Responsibility and Critical Thinking, Metaphilosophy, 44.4, pp. 533–556.

Van Loon, G. (2002) Caraka Saṃhitā: Handbook of Ayurveda (Varanasi, India, Chaukhambha Orientalia).

Vaughn, L. (2012) The Power of Critical Thinking: Effective Reasoning about Ordinary and Extraordinary Claims, 4th edition (Oxford, Oxford University Press).



Critical Thinking

Critical thinking is a widely accepted educational goal. Its definition is contested, but the competing definitions can be understood as differing conceptions of the same basic concept: careful thinking directed to a goal. Conceptions differ with respect to the scope of such thinking, the type of goal, the criteria and norms for thinking carefully, and the thinking components on which they focus. Its adoption as an educational goal has been recommended on the basis of respect for students’ autonomy and preparing students for success in life and for democratic citizenship. “Critical thinkers” have the dispositions and abilities that lead them to think critically when appropriate. The abilities can be identified directly; the dispositions indirectly, by considering what factors contribute to or impede exercise of the abilities. Standardized tests have been developed to assess the degree to which a person possesses such dispositions and abilities. Educational intervention has been shown experimentally to improve them, particularly when it includes dialogue, anchored instruction, and mentoring. Controversies have arisen over the generalizability of critical thinking across domains, over alleged bias in critical thinking theories and instruction, and over the relationship of critical thinking to other types of thinking.

Table of Contents

  • 1. History
  • 2. Examples and Non-Examples
    • 2.1 Dewey’s Three Main Examples
    • 2.2 Dewey’s Other Examples
    • 2.3 Further Examples
    • 2.4 Non-Examples
  • 3. The Definition of Critical Thinking
  • 4. Its Value
  • 5. The Process of Thinking Critically
  • 6. Components of the Process
  • 7. Contributory Dispositions and Abilities
  • 8. Critical Thinking Dispositions
    • 8.1 Initiating Dispositions
    • 8.2 Internal Dispositions
  • 9. Critical Thinking Abilities
  • 10. Required Knowledge
  • 11. Educational Methods
  • 12. Controversies
    • 12.1 The Generalizability of Critical Thinking
    • 12.2 Bias in Critical Thinking Theory and Pedagogy
    • 12.3 Relationship of Critical Thinking to Other Types of Thinking
  • Bibliography
  • Other Internet Resources
  • Related Entries

1. History

Use of the term ‘critical thinking’ to describe an educational goal goes back to the American philosopher John Dewey (1910), who more commonly called it ‘reflective thinking’. He defined it as

active, persistent and careful consideration of any belief or supposed form of knowledge in the light of the grounds that support it, and the further conclusions to which it tends. (Dewey 1910: 6; 1933: 9)

and identified a habit of such consideration with a scientific attitude of mind. His lengthy quotations of Francis Bacon, John Locke, and John Stuart Mill indicate that he was not the first person to propose development of a scientific attitude of mind as an educational goal.

In the 1930s, many of the schools that participated in the Eight-Year Study of the Progressive Education Association (Aikin 1942) adopted critical thinking as an educational goal, for whose achievement the study’s Evaluation Staff developed tests (Smith, Tyler, & Evaluation Staff 1942). Glaser (1941) showed experimentally that it was possible to improve the critical thinking of high school students. Bloom’s influential taxonomy of cognitive educational objectives (Bloom et al. 1956) incorporated critical thinking abilities. Ennis (1962) proposed 12 aspects of critical thinking as a basis for research on the teaching and evaluation of critical thinking ability.

Since 1980, an annual international conference in California on critical thinking and educational reform has attracted tens of thousands of educators from all levels of education and from many parts of the world. Also since 1980, the state university system in California has required all undergraduate students to take a critical thinking course. Since 1983, the Association for Informal Logic and Critical Thinking has sponsored sessions in conjunction with the divisional meetings of the American Philosophical Association (APA). In 1987, the APA’s Committee on Pre-College Philosophy commissioned a consensus statement on critical thinking for purposes of educational assessment and instruction (Facione 1990a). Researchers have developed standardized tests of critical thinking abilities and dispositions; for details, see the Supplement on Assessment . Educational jurisdictions around the world now include critical thinking in guidelines for curriculum and assessment. Political and business leaders endorse its importance.

For details on this history, see the Supplement on History .

2. Examples and Non-Examples

Before considering the definition of critical thinking, it will be helpful to have in mind some examples of critical thinking, as well as some examples of kinds of thinking that would apparently not count as critical thinking.

2.1 Dewey’s Three Main Examples

Dewey (1910: 68–71; 1933: 91–94) takes as paradigms of reflective thinking three class papers of students in which they describe their thinking. The examples range from the everyday to the scientific.

Transit : “The other day, when I was down town on 16th Street, a clock caught my eye. I saw that the hands pointed to 12:20. This suggested that I had an engagement at 124th Street, at one o'clock. I reasoned that as it had taken me an hour to come down on a surface car, I should probably be twenty minutes late if I returned the same way. I might save twenty minutes by a subway express. But was there a station near? If not, I might lose more than twenty minutes in looking for one. Then I thought of the elevated, and I saw there was such a line within two blocks. But where was the station? If it were several blocks above or below the street I was on, I should lose time instead of gaining it. My mind went back to the subway express as quicker than the elevated; furthermore, I remembered that it went nearer than the elevated to the part of 124th Street I wished to reach, so that time would be saved at the end of the journey. I concluded in favor of the subway, and reached my destination by one o’clock.” (Dewey 1910: 68-69; 1933: 91-92)
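The reasoning in Transit is at bottom a small arithmetic check. Made explicit, using only the times the student states, it runs:

\[
12{:}20 + \underbrace{60\ \text{min}}_{\text{surface car}} = 1{:}20 \quad (\text{twenty minutes late}), \qquad 1{:}20 - \underbrace{20\ \text{min}}_{\text{saved by express}} = 1{:}00.
\]

The surface car misses the one o’clock engagement by twenty minutes; the express, provided a station is close enough not to consume the savings, arrives exactly on time.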

Ferryboat : “Projecting nearly horizontally from the upper deck of the ferryboat on which I daily cross the river is a long white pole, having a gilded ball at its tip. It suggested a flagpole when I first saw it; its color, shape, and gilded ball agreed with this idea, and these reasons seemed to justify me in this belief. But soon difficulties presented themselves. The pole was nearly horizontal, an unusual position for a flagpole; in the next place, there was no pulley, ring, or cord by which to attach a flag; finally, there were elsewhere on the boat two vertical staffs from which flags were occasionally flown. It seemed probable that the pole was not there for flag-flying.

“I then tried to imagine all possible purposes of the pole, and to consider for which of these it was best suited: (a) Possibly it was an ornament. But as all the ferryboats and even the tugboats carried poles, this hypothesis was rejected. (b) Possibly it was the terminal of a wireless telegraph. But the same considerations made this improbable. Besides, the more natural place for such a terminal would be the highest part of the boat, on top of the pilot house. (c) Its purpose might be to point out the direction in which the boat is moving.

“In support of this conclusion, I discovered that the pole was lower than the pilot house, so that the steersman could easily see it. Moreover, the tip was enough higher than the base, so that, from the pilot's position, it must appear to project far out in front of the boat. Moreover, the pilot being near the front of the boat, he would need some such guide as to its direction. Tugboats would also need poles for such a purpose. This hypothesis was so much more probable than the others that I accepted it. I formed the conclusion that the pole was set up for the purpose of showing the pilot the direction in which the boat pointed, to enable him to steer correctly.” (Dewey 1910: 69-70; 1933: 92-93)

Bubbles : “In washing tumblers in hot soapsuds and placing them mouth downward on a plate, bubbles appeared on the outside of the mouth of the tumblers and then went inside. Why? The presence of bubbles suggests air, which I note must come from inside the tumbler. I see that the soapy water on the plate prevents escape of the air save as it may be caught in bubbles. But why should air leave the tumbler? There was no substance entering to force it out. It must have expanded. It expands by increase of heat, or by decrease of pressure, or both. Could the air have become heated after the tumbler was taken from the hot suds? Clearly not the air that was already entangled in the water. If heated air was the cause, cold air must have entered in transferring the tumblers from the suds to the plate. I test to see if this supposition is true by taking several more tumblers out. Some I shake so as to make sure of entrapping cold air in them. Some I take out holding mouth downward in order to prevent cold air from entering. Bubbles appear on the outside of every one of the former and on none of the latter. I must be right in my inference. Air from the outside must have been expanded by the heat of the tumbler, which explains the appearance of the bubbles on the outside. But why do they then go inside? Cold contracts. The tumbler cooled and also the air inside it. Tension was removed, and hence bubbles appeared inside. To be sure of this, I test by placing a cup of ice on the tumbler while the bubbles are still forming outside. They soon reverse” (Dewey 1910: 70–71; 1933: 93–94).
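Dewey’s student is implicitly reasoning with the gas laws, in particular what is now stated as Charles’s law: at constant pressure, the volume of a trapped gas is proportional to its absolute temperature. As a rough worked instance (the temperatures are illustrative assumptions, not figures from Dewey’s text), room air at 293 K warmed by the hot tumbler toward 330 K expands by

\[
\frac{V_2}{V_1} = \frac{T_2}{T_1} = \frac{330\ \mathrm{K}}{293\ \mathrm{K}} \approx 1.13,
\]

about 13 percent, enough to force some air past the soapy seal as bubbles. As the tumbler cools, the same proportionality runs in reverse and the contracting air draws bubbles inward, exactly as observed.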

2.2 Dewey’s Other Examples

Dewey (1910, 1933) sprinkles his book with other examples of critical thinking. We will refer to the following.

Weather: A man on a walk notices that it has suddenly become cool, thinks that it is probably going to rain, looks up and sees a dark cloud obscuring the sun, and quickens his steps (1910: 6–10; 1933: 9–13).

Disorder: A man finds his rooms on his return to them in disorder with his belongings thrown about, thinks at first of burglary as an explanation, then thinks of mischievous children as being an alternative explanation, then looks to see whether valuables are missing, and discovers that they are (1910: 82–83; 1933: 166–168).

Typhoid: A physician diagnosing a patient whose conspicuous symptoms suggest typhoid avoids drawing a conclusion until more data are gathered by questioning the patient and by making tests (1910: 85–86; 1933: 170).

Blur: A moving blur catches our eye in the distance, we ask ourselves whether it is a cloud of whirling dust or a tree moving its branches or a man signaling to us, we think of other traits that should be found on each of those possibilities, and we look and see if those traits are found (1910: 102, 108; 1933: 121, 133).

Suction pump: In thinking about the suction pump, the scientist first notes that it will draw water only to a maximum height of 33 feet at sea level and to a lesser maximum height at higher elevations, selects for attention the differing atmospheric pressure at these elevations, sets up experiments in which the air is removed from a vessel containing water (when suction no longer works) and in which the weight of air at various levels is calculated, compares the results of reasoning about the height to which a given weight of air will allow a suction pump to raise water with the observed maximum height at different elevations, and finally assimilates the suction pump to such apparently different phenomena as the siphon and the rising of a balloon (1910: 150–153; 1933: 195–198).
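The scientist’s data admit a one-line check: suction does not pull water up; atmospheric pressure pushes it up, so the pump works only to the height at which the weight of the water column balances that pressure. With standard sea-level values (assumed here for illustration):

\[
h_{\max} = \frac{P_{\text{atm}}}{\rho g} = \frac{101{,}325\ \mathrm{Pa}}{(1000\ \mathrm{kg/m^3})(9.81\ \mathrm{m/s^2})} \approx 10.3\ \mathrm{m} \approx 34\ \mathrm{ft},
\]

in agreement with the observed maximum of about 33 feet. At higher elevations \(P_{\text{atm}}\) is smaller, so \(h_{\max}\) falls, which is exactly the pattern the scientist set out to explain.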

2.3 Further Examples

Diamond: A passenger in a car driving in a diamond lane reserved for vehicles with at least one passenger notices that the diamond marks on the pavement are far apart in some places and close together in others. Why? The driver suggests that the reason may be that the diamond marks are not needed where there is a solid double line separating the diamond lane from the adjoining lane, but are needed when there is a dotted single line permitting crossing into the diamond lane. Further observation confirms that the diamonds are close together when a dotted line separates the diamond lane from its neighbour, but otherwise far apart.

Rash: A woman suddenly develops a very itchy red rash on her throat and upper chest. She recently noticed a mark on the back of her right hand, but was not sure whether the mark was a rash or a scrape. She lies down in bed and thinks about what might be causing the rash and what to do about it. About two weeks before, she began taking blood pressure medication that contained a sulfa drug, and the pharmacist had warned her, in view of a previous allergic reaction to a medication containing a sulfa drug, to be on the alert for an allergic reaction; however, she had been taking the medication for two weeks with no such effect. The day before, she began using a new cream on her neck and upper chest; against the new cream as the cause was the mark on the back of her hand, which had not been exposed to the cream. She began taking probiotics about a month before. She also recently started new eye drops, but she supposed that manufacturers of eye drops would be careful not to include allergy-causing components in the medication. The rash might be a heat rash, since she recently was sweating profusely from her upper body. Since she is about to go away on a short vacation, where she would not have access to her usual physician, she decides to keep taking the probiotics and using the new eye drops but to discontinue the blood pressure medication and to switch back to the old cream for her neck and upper chest. She forms a plan to consult her regular physician on her return about the blood pressure medication.

Candidate: Although Dewey included no examples of thinking directed at appraising the arguments of others, such thinking has come to be considered a kind of critical thinking. We find an example of such thinking in the performance task on the Collegiate Learning Assessment (CLA+), which its sponsoring organization describes as

a performance-based assessment that provides a measure of an institution’s contribution to the development of critical-thinking and written communication skills of its students. (Council for Aid to Education 2017)

A sample task posted on its website requires the test-taker to write a report for public distribution evaluating a fictional candidate’s policy proposals and their supporting arguments, using supplied background documents, with a recommendation on whether to endorse the candidate.

2.4 Non-Examples

Immediate acceptance of an idea that suggests itself as a solution to a problem (e.g., a possible explanation of an event or phenomenon, an action that seems likely to produce a desired result) is “uncritical thinking, the minimum of reflection” (Dewey 1910: 13). On-going suspension of judgment in the light of doubt about a possible solution is not critical thinking (Dewey 1910: 108). Critique driven by a dogmatically held political or religious ideology is not critical thinking; thus Paulo Freire (1968 [1970]) is using the term (e.g., at 1970: 71, 81, 100, 146) in a more politically freighted sense that includes not only reflection but also revolutionary action against oppression. Derivation of a conclusion from given data using an algorithm is not critical thinking.

3. The Definition of Critical Thinking

What is critical thinking? There are many definitions. Ennis (2016) lists 14 philosophically oriented scholarly definitions and three dictionary definitions. Following Rawls (1971), who distinguished his conception of justice from a utilitarian conception but regarded them as rival conceptions of the same concept, Ennis maintains that the 17 definitions are different conceptions of the same concept. Rawls articulated the shared concept of justice as

a characteristic set of principles for assigning basic rights and duties and for determining… the proper distribution of the benefits and burdens of social cooperation. (Rawls 1971: 5)

Bailin et al. (1999b) claim that, if one considers what sorts of thinking an educator would take not to be critical thinking and what sorts to be critical thinking, one can conclude that educators typically understand critical thinking to have at least three features.

  • It is done for the purpose of making up one’s mind about what to believe or do.
  • The person engaging in the thinking is trying to fulfill standards of adequacy and accuracy appropriate to the thinking.
  • The thinking fulfills the relevant standards to some threshold level.

One could sum up the core concept that involves these three features by saying that critical thinking is careful goal-directed thinking. This core concept seems to apply to all the examples of critical thinking described in the previous section. As for the non-examples, their exclusion depends on construing careful thinking as excluding jumping immediately to conclusions, suspending judgment no matter how strong the evidence, reasoning from an unquestioned ideological or religious perspective, and routinely using an algorithm to answer a question.

If the core of critical thinking is careful goal-directed thinking, conceptions of it can vary according to its presumed scope, its presumed goal, one’s criteria and threshold for being careful, and the thinking component on which one focuses.

As to its scope, some conceptions (e.g., Dewey 1910, 1933) restrict it to constructive thinking on the basis of one’s own observations and experiments, others (e.g., Ennis 1962; Fisher & Scriven 1997; Johnson 1992) to appraisal of the products of such thinking. Ennis (1991) and Bailin et al. (1999b) take it to cover both construction and appraisal.

As to its goal, some conceptions restrict it to forming a judgment (Dewey 1910, 1933; Lipman 1987; Facione 1990a). Others allow for actions as well as beliefs as the end point of a process of critical thinking (Ennis 1991; Bailin et al. 1999b).

As to the criteria and threshold for being careful, definitions vary in the term used to indicate that critical thinking satisfies certain norms: “intellectually disciplined” (Scriven & Paul 1987), “reasonable” (Ennis 1991), “skillful” (Lipman 1987), “skilled” (Fisher & Scriven 1997), “careful” (Bailin & Battersby 2009). Some definitions specify these norms, referring variously to “consideration of any belief or supposed form of knowledge in the light of the grounds that support it and the further conclusions to which it tends” (Dewey 1910, 1933); “the methods of logical inquiry and reasoning” (Glaser 1941); “conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication” (Scriven & Paul 1987); the requirement that “it is sensitive to context, relies on criteria, and is self-correcting” (Lipman 1987); “evidential, conceptual, methodological, criteriological, or contextual considerations” (Facione 1990a); and “plus-minus considerations of the product in terms of appropriate standards (or criteria)” (Johnson 1992). Stanovich and Stanovich (2010) propose to ground the concept of critical thinking in the concept of rationality, which they understand as combining epistemic rationality (fitting one’s beliefs to the world) and instrumental rationality (optimizing goal fulfillment); a critical thinker, in their view, is someone with “a propensity to override suboptimal responses from the autonomous mind” (2010: 227). These variant specifications of norms for critical thinking are not necessarily incompatible with one another, and in any case presuppose the core notion of thinking carefully.

As to the thinking component singled out, some definitions focus on suspension of judgment during the thinking (Dewey 1910; McPeck 1981), others on inquiry while judgment is suspended (Bailin & Battersby 2009), others on the resulting judgment (Facione 1990a), and still others on the subsequent emotive response (Siegel 1988).

In educational contexts, a definition of critical thinking is a “programmatic definition” (Scheffler 1960: 19). It expresses a practical program for achieving an educational goal. For this purpose, a one-sentence formulaic definition is much less useful than articulation of a critical thinking process, with criteria and standards for the kinds of thinking that the process may involve. The real educational goal is recognition, adoption and implementation by students of those criteria and standards. That adoption and implementation in turn consists in acquiring the knowledge, abilities and dispositions of a critical thinker.

Conceptions of critical thinking generally do not include moral integrity as part of the concept. Dewey, for example, took critical thinking to be the ultimate intellectual goal of education, but distinguished it from the development of social cooperation among school children, which he took to be the central moral goal. Ennis (1996, 2011) added to his previous list of critical thinking dispositions a group of dispositions to care about the dignity and worth of every person, which he described as a “correlative” (1996) disposition without which critical thinking would be less valuable and perhaps harmful. An educational program that aimed at developing critical thinking but not the correlative disposition to care about the dignity and worth of every person, he asserted, “would be deficient and perhaps dangerous” (Ennis 1996: 172).

4. Its Value

Dewey thought that education for reflective thinking would be of value to both the individual and society; recognition in educational practice of the kinship to the scientific attitude of children’s native curiosity, fertile imagination and love of experimental inquiry “would make for individual happiness and the reduction of social waste” (Dewey 1910: iii). Schools participating in the Eight-Year Study took development of the habit of reflective thinking and skill in solving problems as a means to leading young people to understand, appreciate and live the democratic way of life characteristic of the United States (Aikin 1942: 17–18, 81). Harvey Siegel (1988: 55–61) has offered four considerations in support of adopting critical thinking as an educational ideal. (1) Respect for persons requires that schools and teachers honour students’ demands for reasons and explanations, deal with students honestly, and recognize the need to confront students’ independent judgment; these requirements concern the manner in which teachers treat students. (2) Education has the task of preparing children to be successful adults, a task that requires development of their self-sufficiency. (3) Education should initiate children into the rational traditions in such fields as history, science and mathematics. (4) Education should prepare children to become democratic citizens, which requires reasoned procedures and critical talents and attitudes. To supplement these considerations, Siegel (1988: 62–90) responds to two objections: the ideology objection that adoption of any educational ideal requires a prior ideological commitment and the indoctrination objection that cultivation of critical thinking cannot escape being a form of indoctrination.

5. The Process of Thinking Critically

Despite the diversity of our 11 examples, one can recognize a common pattern. Dewey analyzed it as consisting of five phases:

  • suggestions, in which the mind leaps forward to a possible solution;
  • an intellectualization of the difficulty or perplexity into a problem to be solved, a question for which the answer must be sought;
  • the use of one suggestion after another as a leading idea, or hypothesis, to initiate and guide observation and other operations in collection of factual material;
  • the mental elaboration of the idea or supposition as an idea or supposition (reasoning, in the sense in which reasoning is a part, not the whole, of inference); and
  • testing the hypothesis by overt or imaginative action. (Dewey 1933: 106–107; italics in original)

The process of reflective thinking consisting of these phases would be preceded by a perplexed, troubled or confused situation and followed by a cleared-up, unified, resolved situation (Dewey 1933: 106). The term ‘phases’ replaced the term ‘steps’ (Dewey 1910: 72), thus removing the earlier suggestion of an invariant sequence. Variants of the above analysis appeared in (Dewey 1916: 177) and (Dewey 1938: 101–119).

The variant formulations indicate the difficulty of giving a single logical analysis of such a varied process. The process of critical thinking may have a spiral pattern, with the problem being redefined in the light of obstacles to solving it as originally formulated. For example, the person in Transit might have concluded that getting to the appointment at the scheduled time was impossible and have reformulated the problem as that of rescheduling the appointment for a mutually convenient time. Further, defining a problem does not always follow after or lead immediately to an idea of a suggested solution. Nor should it do so, as Dewey himself recognized in describing the physician in Typhoid as avoiding any strong preference for this or that conclusion before getting further information (Dewey 1910: 85; 1933: 170). People with a hypothesis in mind, even one to which they have a very weak commitment, have a so-called “confirmation bias” (Nickerson 1998): they are likely to pay attention to evidence that confirms the hypothesis and to ignore evidence that counts against it or for some competing hypothesis. Detectives, intelligence agencies, and investigators of airplane accidents are well advised to gather relevant evidence systematically and to postpone even tentative adoption of an explanatory hypothesis until the collected evidence rules out with the appropriate degree of certainty all but one explanation. Dewey’s analysis of the critical thinking process can be faulted as well for requiring acceptance or rejection of a possible solution to a defined problem, with no allowance for deciding in the light of the available evidence to suspend judgment. Further, given the great variety of kinds of problems for which reflection is appropriate, there is likely to be variation in its component events. Perhaps the best way to conceptualize the critical thinking process is as a checklist whose component events can occur in a variety of orders, selectively, and more than once. These component events might include (1) noticing a difficulty, (2) defining the problem, (3) dividing the problem into manageable sub-problems, (4) formulating a variety of possible solutions to the problem or sub-problem, (5) determining what evidence is relevant to deciding among possible solutions to the problem or sub-problem, (6) devising a plan of systematic observation or experiment that will uncover the relevant evidence, (7) carrying out the plan of systematic observation or experimentation, (8) noting the results of the systematic observation or experiment, (9) gathering relevant testimony and information from others, (10) judging the credibility of testimony and information gathered from others, (11) drawing conclusions from gathered evidence and accepted testimony, and (12) accepting a solution that the evidence adequately supports (cf. Hitchcock 2017: 485).
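The checklist reading can be made concrete with a small sketch. The Python fragment below is only an illustration of the conceptual point, not anything proposed in the sources cited here; the event names and the sample trace are invented labels for the twelve components just listed. It shows a process whose events occur out of numerical order, selectively, and more than once.

    from enum import Enum

    class Event(Enum):
        """The twelve component events of the checklist conception (cf. Hitchcock 2017)."""
        NOTICE_DIFFICULTY = 1
        DEFINE_PROBLEM = 2
        DIVIDE_PROBLEM = 3
        FORMULATE_SOLUTIONS = 4
        DETERMINE_RELEVANT_EVIDENCE = 5
        DEVISE_PLAN = 6
        CARRY_OUT_PLAN = 7
        NOTE_RESULTS = 8
        GATHER_TESTIMONY = 9
        JUDGE_CREDIBILITY = 10
        DRAW_CONCLUSIONS = 11
        ACCEPT_SOLUTION = 12

    # A hypothetical trace of Dewey's Bubbles example: the thinker skips the
    # testimony-gathering steps entirely and runs the hypothesis-test cycle twice
    # (first the heated-air hypothesis, then the ice-cube test of cooling).
    bubbles_trace = [
        Event.NOTICE_DIFFICULTY,
        Event.DEFINE_PROBLEM,
        Event.FORMULATE_SOLUTIONS,
        Event.DEVISE_PLAN,
        Event.CARRY_OUT_PLAN,
        Event.NOTE_RESULTS,
        Event.FORMULATE_SOLUTIONS,  # second hypothesis: cooling contracts the air
        Event.CARRY_OUT_PLAN,       # ice-cube test
        Event.NOTE_RESULTS,
        Event.ACCEPT_SOLUTION,
    ]

    used = set(bubbles_trace)
    print(f"distinct events used: {len(used)} of {len(Event)}")
    print(f"repetitions: {len(bubbles_trace) - len(used)}")
    print(f"skipped: {sorted(e.name for e in set(Event) - used)}")

Nothing about the trace’s order is fixed in advance, which is the sense in which the checklist is a checklist rather than an invariant sequence of steps.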

Checklist conceptions of the process of critical thinking are open to the objection that they are too mechanical and procedural to fit the multi-dimensional and emotionally charged issues for which critical thinking is urgently needed (Paul 1984). For such issues, a more dialectical process is advocated, in which competing relevant world views are identified, their implications explored, and some sort of creative synthesis attempted.

6. Components of the Process

If one considers the critical thinking process illustrated by the 11 examples, one can identify distinct kinds of mental acts and mental states that form part of it. To distinguish, label and briefly characterize these components is a useful preliminary to identifying abilities, skills, dispositions, attitudes, habits and the like that contribute causally to thinking critically. Identifying such abilities and habits is in turn a useful preliminary to setting educational goals. Setting the goals is in its turn a useful preliminary to designing strategies for helping learners to achieve the goals and to designing ways of measuring the extent to which learners have done so. Such measures provide both feedback to learners on their achievement and a basis for experimental research on the effectiveness of various strategies for educating people to think critically. Let us begin, then, by distinguishing the kinds of mental acts and mental events that can occur in a critical thinking process.

  • Observing: One notices something in one’s immediate environment (sudden cooling of temperature in Weather, bubbles forming outside a glass and then going inside in Bubbles, a moving blur in the distance in Blur, a rash in Rash). Or one notes the results of an experiment or systematic observation (valuables missing in Disorder, no suction without air pressure in Suction pump).
  • Feeling: One feels puzzled or uncertain about something (how to get to an appointment on time in Transit, why the diamonds vary in frequency in Diamond). One wants to resolve this perplexity. One feels satisfaction once one has worked out an answer (to take the subway express in Transit, diamonds closer when needed as a warning in Diamond).
  • Wondering: One formulates a question to be addressed (why bubbles form outside a tumbler taken from hot water in Bubbles, how suction pumps work in Suction pump, what caused the rash in Rash).
  • Imagining: One thinks of possible answers (bus or subway or elevated in Transit, flagpole or ornament or wireless communication aid or direction indicator in Ferryboat, allergic reaction or heat rash in Rash).
  • Inferring: One works out what would be the case if a possible answer were assumed (valuables missing if there has been a burglary in Disorder, earlier start to the rash if it is an allergic reaction to a sulfa drug in Rash). Or one draws a conclusion once sufficient relevant evidence is gathered (take the subway in Transit, burglary in Disorder, discontinue blood pressure medication and new cream in Rash).
  • Knowledge: One uses stored knowledge of the subject-matter to generate possible answers or to infer what would be expected on the assumption of a particular answer (knowledge of a city’s public transit system in Transit, of the requirements for a flagpole in Ferryboat, of Boyle’s law in Bubbles, of allergic reactions in Rash).
  • Experimenting: One designs and carries out an experiment or a systematic observation to find out whether the results deduced from a possible answer will occur (looking at the location of the flagpole in relation to the pilot’s position in Ferryboat, putting an ice cube on top of a tumbler taken from hot water in Bubbles, measuring the height to which a suction pump will draw water at different elevations in Suction pump, noticing the frequency of diamonds when movement to or from a diamond lane is allowed in Diamond).
  • Consulting: One finds a source of information, gets the information from the source, and makes a judgment on whether to accept it. None of our 11 examples include searching for sources of information. In this respect they are unrepresentative, since most people nowadays have almost instant access to information relevant to answering any question, including many of those illustrated by the examples. However, Candidate includes the activities of extracting information from sources and evaluating its credibility.
  • Identifying and analyzing arguments: One notices an argument and works out its structure and content as a preliminary to evaluating its strength. This activity is central to Candidate. It is an important part of a critical thinking process in which one surveys arguments for various positions on an issue.
  • Judging: One makes a judgment on the basis of accumulated evidence and reasoning, such as the judgment in Ferryboat that the purpose of the pole is to provide direction to the pilot.
  • Deciding: One makes a decision on what to do or on what policy to adopt, as in the decision in Transit to take the subway.

7. Contributory Dispositions and Abilities

By definition, a person who does something voluntarily is both willing and able to do that thing at that time. Both the willingness and the ability contribute causally to the person’s action, in the sense that the voluntary action would not occur if either (or both) of these were lacking. For example, suppose that one is standing with one’s arms at one’s sides and one voluntarily lifts one’s right arm to an extended horizontal position. One would not do so if one were unable to lift one’s arm, if for example one’s right side was paralyzed as the result of a stroke. Nor would one do so if one were unwilling to lift one’s arm, if for example one were participating in a street demonstration at which a white supremacist was urging the crowd to lift their right arm in a Nazi salute and one were unwilling to express support in this way for the racist Nazi ideology. The same analysis applies to a voluntary mental process of thinking critically. It requires both willingness and ability to think critically, including willingness and ability to perform each of the mental acts that compose the process and to coordinate those acts in a sequence that is directed at resolving the initiating perplexity.

Consider willingness first. We can identify causal contributors to willingness to think critically by considering factors that would cause a person who was able to think critically about an issue nevertheless not to do so (Hamby 2014). For each factor, the opposite condition thus contributes causally to willingness to think critically on a particular occasion. For example, people who habitually jump to conclusions without considering alternatives will not think critically about issues that arise, even if they have the required abilities. The contrary condition of willingness to suspend judgment is thus a causal contributor to thinking critically.

Now consider ability. In contrast to the ability to move one’s arm, which can be completely absent because a stroke has left the arm paralyzed, the ability to think critically is a developed ability, whose absence is not a complete absence of ability to think but absence of ability to think well. We can identify the ability to think well directly, in terms of the norms and standards for good thinking. In general, to be able to do well the thinking activities that can be components of a critical thinking process, one needs to know the concepts and principles that characterize their good performance, to recognize in particular cases that the concepts and principles apply, and to apply them. The knowledge, recognition and application may be procedural rather than declarative. It may be domain-specific rather than widely applicable, and in either case may need subject-matter knowledge, sometimes of a deep kind.

Reflections of the sort illustrated by the previous two paragraphs have led scholars to identify the knowledge, abilities and dispositions of a “critical thinker”, i.e., someone who thinks critically whenever it is appropriate to do so. We turn now to these three types of causal contributors to thinking critically. We start with dispositions, since arguably these are the most powerful contributors to being a critical thinker, can be fostered at an early stage of a child’s development, and are susceptible to general improvement (Glaser 1941: 175).

8. Critical Thinking Dispositions

Educational researchers use the term ‘dispositions’ broadly for the habits of mind and attitudes that contribute causally to being a critical thinker. Some writers (e.g., Paul & Elder 2006; Hamby 2014; Bailin & Battersby 2016) propose to use the term ‘virtues’ for this dimension of a critical thinker. The virtues in question, although they are virtues of character, concern the person’s ways of thinking rather than the person’s ways of behaving towards others. They are not moral virtues but intellectual virtues, of the sort articulated by Zagzebski (1996) and discussed by Turri, Alfano, and Greco (2017).

On a realistic conception, thinking dispositions or intellectual virtues are real properties of thinkers. They are general tendencies, propensities, or inclinations to think in particular ways in particular circumstances, and can be genuinely explanatory (Siegel 1999). Sceptics argue that there is no evidence for a specific mental basis for the habits of mind that contribute to thinking critically, and that it is pedagogically misleading to posit such a basis (Bailin et al. 1999a). Whatever their status, critical thinking dispositions need motivation for their initial formation in a child—motivation that may be external or internal. As children develop, the force of habit will gradually become important in sustaining the disposition (Nieto & Valenzuela 2012). Mere force of habit, however, is unlikely to sustain critical thinking dispositions. Critical thinkers must value and enjoy using their knowledge and abilities to think things through for themselves. They must be committed to, and lovers of, inquiry.

A person may have a critical thinking disposition with respect to only some kinds of issues. For example, one could be open-minded about scientific issues but not about religious issues. Similarly, one could be confident in one’s ability to reason about the theological implications of the existence of evil in the world but not in one’s ability to reason about the best design for a guided ballistic missile.

Critical thinking dispositions can usefully be divided into initiating dispositions (those that contribute causally to starting to think critically about an issue) and internal dispositions (those that contribute causally to doing a good job of thinking critically once one has started) (Facione 1990a: 25). The two categories are not mutually exclusive. For example, open-mindedness, in the sense of willingness to consider alternative points of view to one’s own, is both an initiating and an internal disposition.

8.1 Initiating Dispositions

Using the strategy of considering factors that would block people with the ability to think critically from doing so, we can identify as initiating dispositions for thinking critically attentiveness, a habit of inquiry, self-confidence, courage, open-mindedness, willingness to suspend judgment, trust in reason, wanting evidence for one’s beliefs, and seeking the truth. We consider briefly what each of these dispositions amounts to, in each case citing sources that acknowledge them.

  • Attentiveness : One will not think critically if one fails to recognize an issue that needs to be thought through. For example, the pedestrian in Weather would not have looked up if he had not noticed that the air was suddenly cooler. To be a critical thinker, then, one needs to be habitually attentive to one’s surroundings, noticing not only what one senses but also sources of perplexity in messages received and in one’s own beliefs and attitudes (Facione 1990a: 25; Facione, Facione, & Giancarlo 2001).
  • Habit of inquiry : Inquiry is effortful, and one needs an internal push to engage in it. For example, the student in Bubbles could easily have stopped at idle wondering about the cause of the bubbles rather than reasoning to a hypothesis, then designing and executing an experiment to test it. Thus willingness to think critically needs mental energy and initiative. What can supply that energy? Love of inquiry, or perhaps just a habit of inquiry. Hamby (2015) has argued that willingness to inquire is the central critical thinking virtue, one that encompasses all the others. It is recognized as a critical thinking disposition by Dewey (1910: 29; 1933: 35), Glaser (1941: 5), Ennis (1987: 12; 1991: 8), Facione (1990a: 25), Bailin et al. (1999b: 294), Halpern (1998: 452), and Facione, Facione, & Giancarlo (2001).
  • Self-confidence : Lack of confidence in one’s abilities can block critical thinking. For example, if the woman in Rash lacked confidence in her ability to figure things out for herself, she might just have assumed that the rash on her chest was the allergic reaction to her medication against which the pharmacist had warned her. Thus willingness to think critically requires confidence in one’s ability to inquire (Facione 1990a: 25; Facione, Facione, & Giancarlo 2001).
  • Courage : Fear of thinking for oneself can stop one from doing it. Thus willingness to think critically requires intellectual courage (Paul & Elder 2006: 16).
  • Open-mindedness : A dogmatic attitude will impede thinking critically. For example, a person who adheres rigidly to a “pro-choice” position on the issue of the legal status of induced abortion is likely to be unwilling to consider seriously the issue of when in its development an unborn child acquires a moral right to life. Thus willingness to think critically requires open-mindedness, in the sense of a willingness to examine questions to which one already accepts an answer but which further evidence or reasoning might cause one to answer differently (Dewey 1933; Facione 1990a; Ennis 1991; Bailin et al. 1999b; Halpern 1998; Facione, Facione, & Giancarlo 2001). Paul (1981) emphasizes open-mindedness about alternative world-views, and recommends a dialectical approach to integrating such views as central to what he calls “strong sense” critical thinking.
  • Willingness to suspend judgment : Premature closure on an initial solution will block critical thinking. Thus willingness to think critically requires a willingness to suspend judgment while alternatives are explored (Facione 1990a; Ennis 1991; Halpern 1998).
  • Trust in reason : Since distrust in the processes of reasoned inquiry will dissuade one from engaging in it, trust in them is an initiating critical thinking disposition (Facione 1990a: 25; Bailin et al. 1999b: 294; Facione, Facione, & Giancarlo 2001; Paul & Elder 2006). In reaction to an allegedly exclusive emphasis on reason in critical thinking theory and pedagogy, Thayer-Bacon (2000) argues that intuition, imagination, and emotion have important roles to play in an adequate conception of critical thinking that she calls “constructive thinking”. From her point of view, critical thinking requires trust not only in reason but also in intuition, imagination, and emotion.
  • Wanting evidence : One who is content to hold beliefs without regard to the evidence for or against them will see no need to think those beliefs through. Thus willingness to think critically requires wanting one’s beliefs to be supported by evidence, a desire closely allied to the truth-seeking disposition described next.
  • Seeking the truth : If one does not care about the truth but is content to stick with one’s initial bias on an issue, then one will not think critically about it. Seeking the truth is thus an initiating critical thinking disposition (Bailin et al. 1999b: 294; Facione, Facione, & Giancarlo 2001). A disposition to seek the truth is implicit in more specific critical thinking dispositions, such as trying to be well-informed, considering seriously points of view other than one’s own, looking for alternatives, suspending judgment when the evidence is insufficient, and adopting a position when the evidence supporting it is sufficient.

Some of the initiating dispositions, such as open-mindedness and willingness to suspend judgment, are also internal critical thinking dispositions, in the sense of mental habits or attitudes that contribute causally to doing a good job of critical thinking once one starts the process. But there are many other internal critical thinking dispositions. Some of them are parasitic on one’s conception of good thinking. For example, it is constitutive of good thinking about an issue to formulate the issue clearly and to maintain focus on it. For this purpose, one needs not only the corresponding ability but also the corresponding disposition. Ennis (1991: 8) describes it as the disposition “to determine and maintain focus on the conclusion or question”, Facione (1990a: 25) as “clarity in stating the question or concern”. Other internal dispositions are motivators to continue or adjust the critical thinking process, such as willingness to persist in a complex task and willingness to abandon nonproductive strategies in an attempt to self-correct (Halpern 1998: 452). For a list of identified internal critical thinking dispositions, see the Supplement on Internal Critical Thinking Dispositions.

Some theorists postulate skills, i.e., acquired abilities, as operative in critical thinking. It is not obvious, however, that a good mental act is the exercise of a generic acquired skill. Inferring an expected time of arrival, as in Transit, has some generic components but also uses non-generic subject-matter knowledge. Bailin et al. (1999a) argue against viewing critical thinking skills as generic and discrete, on the ground that skilled performance at a critical thinking task cannot be separated from knowledge of concepts and from domain-specific principles of good thinking. Talk of skills, they concede, is unproblematic if it means merely that a person with critical thinking skills is capable of intelligent performance.

Despite such scepticism, theorists of critical thinking have listed as general contributors to critical thinking what they variously call abilities (Glaser 1941; Ennis 1962, 1991), skills (Facione 1990a; Halpern 1998) or competencies (Fisher & Scriven 1997). Amalgamating these lists would produce a confusing and chaotic cornucopia of more than 50 possible educational objectives, with only partial overlap among them. It makes sense instead to try to understand the reasons for the multiplicity and diversity, and to make a selection according to one’s own reasons for singling out abilities to be developed in a critical thinking curriculum. Two reasons for diversity among lists of critical thinking abilities are the underlying conception of critical thinking and the envisaged educational level. Appraisal-only conceptions, for example, involve a different suite of abilities than constructive-only conceptions. Some lists, such as those in Glaser (1941), are put forward as educational objectives for secondary school students, whereas others are proposed as objectives for college students (e.g., Facione 1990a).

The abilities described in the remaining paragraphs of this section emerge from reflection on the general abilities needed to do well the thinking activities identified in section 6 as components of the critical thinking process described in section 5. The derivation of each collection of abilities is accompanied by citation of sources that list such abilities and of standardized tests that claim to test them.

Observational abilities : Careful and accurate observation sometimes requires specialist expertise and practice, as in the case of observing birds and observing accident scenes. However, there are general abilities of noticing what one’s senses are picking up from one’s environment and of being able to articulate clearly and accurately to oneself and others what one has observed. It helps in exercising them to be able to recognize and take into account factors that make one’s observation less trustworthy, such as prior framing of the situation, inadequate time, deficient senses, poor observation conditions, and the like. It helps as well to be skilled at taking steps to make one’s observation more trustworthy, such as moving closer to get a better look, measuring something three times and taking the average, and checking what one thinks one is observing with someone else who is in a good position to observe it. It also helps to be skilled at recognizing respects in which one’s report of one’s observation involves inference rather than direct observation, so that one can then consider whether the inference is justified. These abilities come into play as well when one thinks about whether and with what degree of confidence to accept an observation report, for example in the study of history or in a criminal investigation or in assessing news reports. Observational abilities show up in some lists of critical thinking abilities (Ennis 1962: 90; Facione 1990a: 16; Ennis 1991: 9). There are items testing a person’s ability to judge the credibility of observation reports in the Cornell Critical Thinking Tests, Levels X and Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005). Norris and King (1983, 1985, 1990a, 1990b) is a test of ability to appraise observation reports.

Emotional abilities : The emotions that drive a critical thinking process are perplexity or puzzlement, a wish to resolve it, and satisfaction at achieving the desired resolution. Children experience these emotions at an early age, without being trained to do so. Education that takes critical thinking as a goal needs only to channel these emotions and to make sure not to stifle them. Collaborative critical thinking benefits from ability to recognize one’s own and others’ emotional commitments and reactions.

Questioning abilities : A critical thinking process needs transformation of an inchoate sense of perplexity into a clear question. Formulating a question well requires not building in questionable assumptions, not prejudging the issue, and using language that in context is unambiguous and precise enough (Ennis 1962: 97; 1991: 9).

Imaginative abilities : Thinking directed at finding the correct causal explanation of a general phenomenon or particular event requires an ability to imagine possible explanations. Thinking about what policy or plan of action to adopt requires generation of options and consideration of possible consequences of each option. Domain knowledge is required for such creative activity, but a general ability to imagine alternatives is helpful and can be nurtured so as to become easier, quicker, more extensive, and deeper (Dewey 1910: 34–39; 1933: 40–47). Facione (1990a) and Halpern (1998) include the ability to imagine alternatives as a critical thinking ability.

Inferential abilities : The ability to draw conclusions from given information, and to recognize with what degree of certainty one’s own or others’ conclusions follow, is universally recognized as a general critical thinking ability. All 11 examples in section 2 of this article include inferences, some from hypotheses or options (as in Transit, Ferryboat, and Disorder), others from something observed (as in Weather and Rash). None of these inferences is formally valid. Rather, they are licensed by general, sometimes qualified substantive rules of inference (Toulmin 1958) that rest on domain knowledge: that a bus trip takes about the same time in each direction, that the terminal of a wireless telegraph would be located on the highest possible place, that sudden cooling is often followed by rain, that an allergic reaction to a sulfa drug generally shows up soon after one starts taking it. It is a matter of controversy to what extent the specialized ability to deduce conclusions from premisses using formal rules of inference is needed for critical thinking. Dewey (1933) locates logical forms in setting out the products of reflection rather than in the process of reflection. Ennis (1981a), on the other hand, maintains that a liberally-educated person should have the following abilities: to translate natural-language statements into statements using the standard logical operators, to use appropriately the language of necessary and sufficient conditions, to deal with argument forms and arguments containing symbols, to determine whether in virtue of an argument’s form its conclusion follows necessarily from its premisses, to reason with logically complex propositions, and to apply the rules and procedures of deductive logic. Inferential abilities are recognized as critical thinking abilities by Glaser (1941: 6), Facione (1990a: 9), Ennis (1991: 9), Fisher & Scriven (1997: 99, 111), and Halpern (1998: 452). Items testing inferential abilities constitute two of the five subtests of the Watson-Glaser Critical Thinking Appraisal (Watson & Glaser 1980a, 1980b, 1994), two of the four sections in the Cornell Critical Thinking Test Level X (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), three of the seven sections in the Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), 11 of the 34 items on Forms A and B of the California Critical Thinking Skills Test (Facione 1990b, 1992), and a high but variable proportion of the 25 selected-response questions in the Collegiate Learning Assessment (Council for Aid to Education 2017).
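
Schematically, the contrast between a formally valid inference and a warrant-licensed substantive inference can be set out as follows. This is our illustration, not one drawn from the cited sources; the labels D, W, and C are used in the manner of Toulmin’s (1958) datum, warrant, and claim, and the warrant is the one from the Weather example.

```latex
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

% A formally valid inference: the conclusion follows in virtue of logical
% form alone (modus ponens), whatever statements p and q may be.
\[
  p \rightarrow q, \quad p \quad \therefore \quad q
\]

% A substantive, defeasible inference, as in the Weather example: the step
% from datum D to claim C is licensed not by logical form but by the
% domain-specific warrant W, and it can be defeated by exceptions.
\begin{align*}
  D &: \text{the air has suddenly cooled} \\
  W &: \text{sudden cooling is often followed by rain} \\
  C &: \text{so, presumably, it will rain}
\end{align*}

\end{document}
```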

Experimenting abilities : Knowing how to design and execute an experiment is important not just in scientific research but also in everyday life, as in Rash. Dewey devoted a whole chapter of his How We Think (1910: 145–156; 1933: 190–202) to the superiority of experimentation over observation in advancing knowledge. Experimenting abilities come into play at one remove in appraising reports of scientific studies. Skill in designing and executing experiments includes the acknowledged abilities to appraise evidence (Glaser 1941: 6), to carry out experiments and to apply appropriate statistical inference techniques (Facione 1990a: 9), to judge inductions to an explanatory hypothesis (Ennis 1991: 9), and to recognize the need for an adequately large sample size (Halpern 1998). The Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005) includes four items (out of 52) on experimental design. The Collegiate Learning Assessment (Council for Aid to Education 2017) makes room for appraisal of study design in both its performance task and its selected-response questions.

Consulting abilities : Skill at consulting sources of information comes into play when one seeks information to help resolve a problem, as in Candidate. Ability to find and appraise information includes ability to gather and marshal pertinent information (Glaser 1941: 6), to judge whether a statement made by an alleged authority is acceptable (Ennis 1962: 84), to plan a search for desired information (Facione 1990a: 9), and to judge the credibility of a source (Ennis 1991: 9). Ability to judge the credibility of statements is tested by 24 items (out of 76) in the Cornell Critical Thinking Test Level X (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005) and by four items (out of 52) in the Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005). The Collegiate Learning Assessment’s performance task requires evaluation of whether information in documents is credible or unreliable (Council for Aid to Education 2017).

Argument analysis abilities : The ability to identify and analyze arguments contributes to the process of surveying arguments on an issue in order to form one’s own reasoned judgment, as in Candidate. The ability to detect and analyze arguments is recognized as a critical thinking skill by Facione (1990a: 7–8), Ennis (1991: 9) and Halpern (1998). Five items (out of 34) on the California Critical Thinking Skills Test (Facione 1990b, 1992) test skill at argument analysis. The Collegiate Learning Assessment (Council for Aid to Education 2017) incorporates argument analysis in its selected-response tests of critical reading and evaluation and of critiquing an argument.

Judging skills and deciding skills : Skill at judging and deciding is skill at recognizing what judgment or decision the available evidence and argument supports, and with what degree of confidence. It is thus a component of the inferential skills already discussed.

Lists and tests of critical thinking abilities often include two further abilities: identifying assumptions, and constructing and evaluating definitions.

In addition to dispositions and abilities, critical thinking needs knowledge: of critical thinking concepts, of critical thinking principles, and of the subject-matter of the thinking.

We can derive a short list of concepts whose understanding contributes to critical thinking from the critical thinking abilities described in the preceding section. Observational abilities require an understanding of the difference between observation and inference. Questioning abilities require an understanding of the concepts of ambiguity and vagueness. Inferential abilities require an understanding of the difference between conclusive and defeasible inference (traditionally, between deduction and induction), as well as of the difference between necessary and sufficient conditions. Experimenting abilities require an understanding of the concepts of hypothesis, null hypothesis, assumption and prediction, as well as of the concept of statistical significance and of its difference from importance. They also require an understanding of the difference between an experiment and an observational study, and in particular of the difference between a randomized controlled trial, a prospective correlational study and a retrospective (case-control) study. Argument analysis abilities require an understanding of the concepts of argument, premiss, assumption, conclusion and counter-consideration. Additional critical thinking concepts are proposed by Bailin et al. (1999b: 293), Fisher & Scriven (1997: 105–106), and Black (2012).
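
Because several of the abilities just listed presuppose a grasp of the difference between necessary and sufficient conditions, a minimal formal statement of that distinction may be helpful. The formalization and the geometric example are ours, not the cited authors’.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}

% A is a sufficient condition for B: whenever A holds, B holds.
\[
  A \text{ is sufficient for } B \quad\Longleftrightarrow\quad (A \rightarrow B)
\]

% A is a necessary condition for B: B cannot hold without A. The second
% equivalence is the contrapositive form.
\[
  A \text{ is necessary for } B \quad\Longleftrightarrow\quad (B \rightarrow A)
  \quad\Longleftrightarrow\quad (\neg A \rightarrow \neg B)
\]

% The two come apart: being a square is sufficient but not necessary for
% being a rectangle, while having four sides is necessary but not sufficient.

\end{document}
```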

According to Glaser (1941: 25), ability to think critically requires knowledge of the methods of logical inquiry and reasoning. If we review the list of abilities in the preceding section, however, we can see that some of them can be acquired and exercised merely through practice, possibly guided in an educational setting, followed by feedback. Searching intelligently for a causal explanation of some phenomenon or event requires that one consider a full range of possible causal contributors, but it seems more important that one implement this principle in one’s practice than that one be able to articulate it. What is important is “operational knowledge” of the standards and principles of good thinking (Bailin et al. 1999b: 291–293). But the development of such critical thinking abilities as designing an experiment or constructing an operational definition can benefit from learning their underlying theory. Further, explicit knowledge of quirks of human thinking seems useful as a cautionary guide. Human memory is not just fallible about details, as people learn from their own experiences of misremembering, but is so malleable that a detailed, clear and vivid recollection of an event can be a total fabrication (Loftus 2017). People seek or interpret evidence in ways that are partial to their existing beliefs and expectations, often unconscious of their “confirmation bias” (Nickerson 1998). Not only are people subject to this and other cognitive biases (Kahneman 2011), of which they are typically unaware, but it may be counter-productive to make oneself aware of them and to try consciously to counteract them, or to counteract social biases such as racial or sexual stereotypes (Kenyon & Beaulac 2014). It is helpful to be aware of these facts and of the superior effectiveness of blocking the operation of biases: for example, by making an immediate record of one’s observations, refraining from forming a preliminary explanatory hypothesis, blind refereeing, double-blind randomized trials, and blind grading of students’ work.

Critical thinking about an issue requires substantive knowledge of the domain to which the issue belongs. Critical thinking abilities are not a magic elixir that can be applied to any issue whatever by somebody who has no knowledge of the facts relevant to exploring that issue. For example, the student in Bubbles needed to know that gases do not penetrate solid objects like a glass, that air expands when heated, that the volume of an enclosed gas varies directly with its temperature and inversely with its pressure, and that hot objects will spontaneously cool down to the ambient temperature of their surroundings unless kept hot by insulation or a source of heat. Critical thinkers thus need a rich fund of subject-matter knowledge relevant to the variety of situations they encounter. This fact is recognized in the inclusion among critical thinking dispositions of a concern to become and remain generally well informed.

Experimental educational interventions, with control groups, have shown that education can improve critical thinking skills and dispositions, as measured by standardized tests. For information about these tests, see the Supplement on Assessment.

What educational methods are most effective at developing the dispositions, abilities and knowledge of a critical thinker? Abrami et al. (2015) found that, in the experimental and quasi-experimental studies that they analyzed, dialogue, anchored instruction, and mentoring each increased the effectiveness of the educational intervention, and that they were most effective when combined. They also found that in these studies a combination of separate instruction in critical thinking with subject-matter instruction in which students are encouraged to think critically was more effective than either by itself. However, the difference was not statistically significant; that is, it might have arisen by chance.

Most of these studies lack the longitudinal follow-up required to determine whether the observed differential improvements in critical thinking abilities or dispositions continue over time, for example until high school or college graduation. For details on studies of methods of developing critical thinking skills and dispositions, see the Supplement on Educational Methods.

12. Controversies

Scholars have denied the generalizability of critical thinking abilities across subject domains, have alleged bias in critical thinking theory and pedagogy, and have investigated the relationship of critical thinking to other kinds of thinking.

McPeck (1981) attacked the thinking skills movement of the 1970s, including the critical thinking movement. He argued that there are no general thinking skills, since thinking is always thinking about some subject-matter. It is futile, he claimed, for schools and colleges to teach thinking as if it were a separate subject. Rather, teachers should lead their pupils to become autonomous thinkers by teaching school subjects in a way that brings out their cognitive structure and that encourages and rewards discussion and argument. As some of his critics (e.g., Paul 1985; Siegel 1985) pointed out, McPeck’s central argument needs elaboration, since it has obvious counter-examples in writing and speaking, for which (up to a certain level of complexity) there are teachable general abilities even though they are always about some subject-matter. To make his argument convincing, McPeck needs to explain how thinking differs from writing and speaking in a way that does not permit useful abstraction of its components from the subject-matters with which it deals. He has not done so. Nevertheless, his position that the dispositions and abilities of a critical thinker are best developed in the context of subject-matter instruction is shared by many theorists of critical thinking, including Dewey (1910, 1933), Glaser (1941), Passmore (1980), Weinstein (1990), and Bailin et al. (1999b).

McPeck’s challenge prompted reflection on the extent to which critical thinking is subject-specific. McPeck argued for a strong subject-specificity thesis, according to which it is a conceptual truth that all critical thinking abilities are specific to a subject. (He did not, however, extend his subject-specificity thesis to critical thinking dispositions. In particular, he took the disposition to suspend judgment in situations of cognitive dissonance to be a general disposition.) Conceptual subject-specificity is subject to obvious counter-examples, such as the general ability to recognize confusion of necessary and sufficient conditions. A more modest thesis, also endorsed by McPeck, is epistemological subject-specificity, according to which the norms of good thinking vary from one field to another. Epistemological subject-specificity clearly holds to a certain extent; for example, the principles in accordance with which one solves a differential equation are quite different from the principles in accordance with which one determines whether a painting is a genuine Picasso. But the thesis suffers, as Ennis (1989) points out, from vagueness of the concept of a field or subject and from the obvious existence of inter-field principles, however broadly the concept of a field is construed. For example, the principles of hypothetico-deductive reasoning hold for all the varied fields in which such reasoning occurs. A third kind of subject-specificity is empirical subject-specificity, according to which as a matter of empirically observable fact a person with the abilities and dispositions of a critical thinker in one area of investigation will not necessarily have them in another area of investigation.

The thesis of empirical subject-specificity raises the general problem of transfer. If critical thinking abilities and dispositions have to be developed independently in each school subject, how are they of any use in dealing with the problems of everyday life and the political and social issues of contemporary society, most of which do not fit into the framework of a traditional school subject? Proponents of empirical subject-specificity tend to argue that transfer is more likely to occur if there is critical thinking instruction in a variety of domains, with explicit attention to dispositions and abilities that cut across domains. But evidence for this claim is scanty. There is a need for well-designed empirical studies that investigate the conditions that make transfer more likely.

It is common ground in debates about the generality or subject-specificity of critical thinking dispositions and abilities that critical thinking about any topic requires background knowledge about the topic. For example, the most sophisticated understanding of the principles of hypothetico-deductive reasoning is of no help unless accompanied by some knowledge of what might be plausible explanations of some phenomenon under investigation.

Critics have objected to bias in the theory, pedagogy and practice of critical thinking. Commentators (e.g., Alston 1995; Ennis 1998) have noted that anyone who takes a position has a bias in the neutral sense of being inclined in one direction rather than others. The critics, however, are objecting to bias in the pejorative sense of an unjustified favouring of certain ways of knowing over others, frequently alleging that the unjustly favoured ways are those of a dominant sex or culture (Bailin 1995). These ways favour:

  • reinforcement of egocentric and sociocentric biases over dialectical engagement with opposing world-views (Paul 1981, 1984; Warren 1988)
  • distancing from the object of inquiry over closeness to it (Martin 1992; Thayer-Bacon 1992)
  • indifference to the situation of others over care for them (Martin 1992)
  • orientation to thought over orientation to action (Martin 1992)
  • being reasonable over caring to understand people’s ideas (Thayer-Bacon 1993)
  • being neutral and objective over being embodied and situated (Thayer-Bacon 1995a)
  • doubting over believing (Thayer-Bacon 1995b)
  • reason over emotion, imagination and intuition (Thayer-Bacon 2000)
  • solitary thinking over collaborative thinking (Thayer-Bacon 2000)
  • written and spoken assignments over other forms of expression (Alston 2001)
  • attention to written and spoken communications over attention to human problems (Alston 2001)
  • winning debates in the public sphere over making and understanding meaning (Alston 2001)

A common thread in this smorgasbord of accusations is dissatisfaction with focusing on the logical analysis and evaluation of reasoning and arguments. While these authors acknowledge that such analysis and evaluation is part of critical thinking and should be part of its conceptualization and pedagogy, they insist that it is only a part. Paul (1981), for example, bemoans the tendency of atomistic teaching of methods of analyzing and evaluating arguments to turn students into more able sophists, adept at finding fault with positions and arguments with which they disagree but even more entrenched in the egocentric and sociocentric biases with which they began. Martin (1992) and Thayer-Bacon (1992) cite with approval the self-reported intimacy with their subject-matter of leading researchers in biology and medicine, an intimacy that conflicts with the distancing allegedly recommended in standard conceptions and pedagogy of critical thinking. Thayer-Bacon (2000) contrasts the embodied and socially embedded learning of her elementary school students in a Montessori school, who used their imagination, intuition and emotions as well as their reason, with conceptions of critical thinking as

thinking that is used to critique arguments, offer justifications, and make judgments about what are the good reasons, or the right answers. (Thayer-Bacon 2000: 127–128)

Alston (2001) reports that her students in a women’s studies class were able to see the flaws in the Cinderella myth that pervades much romantic fiction but in their own romantic relationships still acted as if all failures were the woman’s fault and still accepted the notions of love at first sight and living happily ever after. Students, she writes, should

be able to connect their intellectual critique to a more affective, somatic, and ethical account of making risky choices that have sexist, racist, classist, familial, sexual, or other consequences for themselves and those both near and far… critical thinking that reads arguments, texts, or practices merely on the surface without connections to feeling/desiring/doing or action lacks an ethical depth that should infuse the difference between mere cognitive activity and something we want to call critical thinking. (Alston 2001: 34)

Some critics portray such biases as unfair to women. Thayer-Bacon (1992), for example, has charged modern critical thinking theory with being sexist, on the ground that it separates the self from the object and causes one to lose touch with one’s inner voice, and thus stigmatizes women, who (she asserts) link self to object and listen to their inner voice. Her charge does not imply that women as a group are on average less able than men to analyze and evaluate arguments. Facione (1990c) found no difference by sex in performance on his California Critical Thinking Skills Test. Kuhn (1991: 280–281) found no difference by sex in either the disposition or the competence to engage in argumentative thinking.

The critics propose a variety of remedies for the biases that they allege. In general, they do not propose to eliminate or downplay critical thinking as an educational goal. Rather, they propose to conceptualize critical thinking differently and to change its pedagogy accordingly. Their pedagogical proposals arise logically from their objections. They can be summarized as follows:

  • Focus on argument networks with dialectical exchanges reflecting contesting points of view rather than on atomic arguments, so as to develop “strong sense” critical thinking that transcends egocentric and sociocentric biases (Paul 1981, 1984).
  • Foster closeness to the subject-matter and feeling connected to others in order to inform a humane democracy (Martin 1992).
  • Develop “constructive thinking” as a social activity in a community of physically embodied and socially embedded inquirers with personal voices who value not only reason but also imagination, intuition and emotion (Thayer-Bacon 2000).
  • In developing critical thinking in school subjects, treat as important neither skills nor dispositions but opening worlds of meaning (Alston 2001).
  • Attend to the development of critical thinking dispositions as well as skills, and adopt the “critical pedagogy” practised and advocated by Freire (1968 [1970]) and hooks (1994) (Dalgleish, Girard, & Davies 2017).

A common thread in these proposals is treatment of critical thinking as a social, interactive, personally engaged activity like that of a quilting bee or a barn-raising (Thayer-Bacon 2000) rather than as an individual, solitary, distanced activity symbolized by Rodin’s The Thinker. One can get a vivid description of education with the former type of goal from the writings of bell hooks (1994, 2010). Critical thinking for her is open-minded dialectical exchange across opposing standpoints and from multiple perspectives, a conception similar to Paul’s “strong sense” critical thinking (Paul 1981). She abandons the structure of domination in the traditional classroom. In an introductory course on black women writers, for example, she assigns students to write an autobiographical paragraph about an early racial memory, then to read it aloud as the others listen, thus affirming the uniqueness and value of each voice and creating a communal awareness of the diversity of the group’s experiences (hooks 1994: 84). Her “engaged pedagogy” is thus similar to the “freedom under guidance” implemented in John Dewey’s Laboratory School of Chicago in the late 1890s and early 1900s. It incorporates the dialogue, anchored instruction, and mentoring that Abrami et al. (2015) found to be most effective in improving critical thinking skills and dispositions.

What is the relationship of critical thinking to problem solving, decision-making, higher-order thinking, creative thinking, and other recognized types of thinking? One’s answer to this question obviously depends on how one defines the terms used in the question. If critical thinking is conceived broadly to cover any careful thinking about any topic for any purpose, then problem solving and decision making will be kinds of critical thinking, if they are done carefully. Historically, ‘critical thinking’ and ‘problem solving’ were two names for the same thing. If critical thinking is conceived more narrowly as consisting solely of appraisal of intellectual products, then it will be disjoint from problem solving and decision making, which are constructive.

Bloom’s taxonomy of educational objectives used the phrase “intellectual abilities and skills” for what had been labeled “critical thinking” by some, “reflective thinking” by Dewey and others, and “problem solving” by still others (Bloom et al. 1956: 38). Thus, the so-called “higher-order thinking skills” at the taxonomy’s top levels of analysis, synthesis and evaluation are just critical thinking skills, although they do not come with general criteria for their assessment (Ennis 1981b). The revised version of Bloom’s taxonomy (Anderson et al. 2001) likewise treats critical thinking as cutting across those types of cognitive process that involve more than remembering (Anderson et al. 2001: 269–270). For details, see the Supplement on History .

As to creative thinking, it overlaps with critical thinking (Bailin 1987, 1988). Thinking about the explanation of some phenomenon or event, as in Ferryboat, requires creative imagination in constructing plausible explanatory hypotheses. Likewise, thinking about a policy question, as in Candidate, requires creativity in coming up with options. Conversely, creativity in any field needs to be balanced by critical appraisal of the draft painting or novel or mathematical theory.

  • Abrami, Philip C., Robert M. Bernard, Eugene Borokhovski, David I. Waddington, C. Anne Wade, and Tonje Person, 2015, “Strategies for Teaching Students to Think Critically: A Meta-analysis”, Review of Educational Research , 85(2): 275–314. doi:10.3102/0034654314551063
  • Aikin, Wilford M., 1942, The Story of the Eight-year Study, with Conclusions and Recommendations , Volume I of Adventure in American Education , New York and London: Harper & Brothers. [ Aikin 1942 available online ]
  • Alston, Kal, 1995, “Begging the Question: Is Critical Thinking Biased?”, Educational Theory , 45(2): 225–233. doi:10.1111/j.1741-5446.1995.00225.x
  • –––, 2001, “Re/Thinking Critical Thinking: The Seductions of Everyday Life”, Studies in Philosophy and Education , 20(1): 27–40. doi:10.1023/A:1005247128053
  • American Educational Research Association, 2014, Standards for Educational and Psychological Testing / American Educational Research Association, American Psychological Association, National Council on Measurement in Education , Washington, DC: American Educational Research Association.
  • Anderson, Lorin W., David R. Krathwohl, Peter W. Airiasian, Kathleen A. Cruikshank, Richard E. Mayer, Paul R. Pintrich, James Raths, and Merlin C. Wittrock, 2001, A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives , New York: Longman, complete edition.
  • Bailin, Sharon, 1987, “Critical and Creative Thinking”, Informal Logic , 9(1): 23–30. [ Bailin 1987 available online ]
  • –––, 1988, Achieving Extraordinary Ends: An Essay on Creativity , Dordrecht: Kluwer. doi:10.1007/978-94-009-2780-3
  • –––, 1995, “Is Critical Thinking Biased? Clarifications and Implications”, Educational Theory , 45(2): 191–197. doi:10.1111/j.1741-5446.1995.00191.x
  • Bailin, Sharon and Mark Battersby, 2009, “Inquiry: A Dialectical Approach to Teaching Critical Thinking”, in Juho Ritola (ed.), Argument Cultures: Proceedings of OSSA 09 , CD-ROM (pp. 1–10), Windsor, ON: OSSA. [ Bailin & Battersby 2009 available online ]
  • –––, 2016, “Fostering the Virtues of Inquiry”, Topoi , 35(2): 367–374. doi:10.1007/s11245-015-9307-6
  • Bailin, Sharon, Roland Case, Jerrold R. Coombs, and Leroi B. Daniels, 1999a, “Common Misconceptions of Critical Thinking”, Journal of Curriculum Studies , 31(3): 269–283. doi:10.1080/002202799183124
  • –––, 1999b, “Conceptualizing Critical Thinking”, Journal of Curriculum Studies , 31(3): 285–302. doi:10.1080/002202799183133
  • Berman, Alan M., Seth J. Schwartz, William M. Kurtines, and Steven L. Berman, 2001, “The Process of Exploration in Identity Formation: The Role of Style and Competence”, Journal of Adolescence , 24(4): 513–528. doi:10.1006/jado.2001.0386
  • Black, Beth (ed.), 2012, An A to Z of Critical Thinking , London: Continuum International Publishing Group.
  • Bloom, Benjamin Samuel, Max D. Engelhart, Edward J. Furst, Walter H. Hill, and David R. Krathwohl, 1956, Taxonomy of Educational Objectives. Handbook I: Cognitive Domain , New York: David McKay.
  • Casserly, Meghan, 2012, “The 10 Skills That Will Get You Hired in 2013”, Forbes , Dec. 10, 2012. Available at https://www.forbes.com/sites/meghancasserly/2012/12/10/the-10-skills-that-will-get-you-a-job-in-2013/#79e7ff4e633d ; accessed 2017 11 06.
  • Center for Assessment & Improvement of Learning, 2017, Critical Thinking Assessment Test , Cookeville, TN: Tennessee Technological University.
  • Cohen, Jacob, 1988, Statistical Power Analysis for the Behavioral Sciences , Hillsdale, NJ: Lawrence Erlbaum Associates, 2nd edition.
  • College Board, 1983, Academic Preparation for College. What Students Need to Know and Be Able to Do , New York: College Entrance Examination Board, ERIC document ED232517.
  • Commission on the Relation of School and College of the Progressive Education Association, 1943, Thirty Schools Tell Their Story , Volume V of Adventure in American Education , New York and London: Harper & Brothers.
  • Council for Aid to Education, 2017, CLA+ Student Guide . Available at http://cae.org/images/uploads/pdf/CLA_Student_Guide_Institution.pdf ; accessed 2017 09 26.
  • Dalgleish, Adam, Patrick Girard, and Maree Davies, 2017, “Critical Thinking, Bias and Feminist Philosophy: Building a Better Framework through Collaboration”, Informal Logic , 37(4): 351–369. [ Dalgleish et al. available online ]
  • Dewey, John, 1910, How We Think , Boston: D.C. Heath. [ Dewey 1910 available online ]
  • –––, 1916, Democracy and Education: An Introduction to the Philosophy of Education , New York: Macmillan.
  • –––, 1933, How We Think: A Restatement of the Relation of Reflective Thinking to the Educative Process , Lexington, MA: D.C. Heath.
  • –––, 1936, “The Theory of the Chicago Experiment”, Appendix II of Mayhew & Edwards 1936: 463–477.
  • –––, 1938, Logic: The Theory of Inquiry , New York: Henry Holt and Company.
  • Dominguez, Caroline (coord.), 2018a, A European Collection of the Critical Thinking Skills and Dispositions Needed in Different Professional Fields for the 21st Century , Vila Real, Portugal: UTAD. Available at http://bit.ly/CRITHINKEDUO1 ; accessed 2018 04 09.
  • ––– (coord.), 2018b, A European Review on Critical Thinking Educational Practices in Higher Education Institutions , Vila Real: UTAD. Available at http://bit.ly/CRITHINKEDUO2 ; accessed 2018 04 14.
  • Dumke, Glenn S., 1980, Chancellor’s Executive Order 338 , Long Beach, CA: California State University, Chancellor’s Office. Available at https://www.calstate.edu/eo/EO-338.pdf ; accessed 2017 11 16.
  • Ennis, Robert H., 1958, “An Appraisal of the Watson-Glaser Critical Thinking Appraisal”, The Journal of Educational Research , 52(4): 155–158. doi:10.1080/00220671.1958.10882558
  • –––, 1962, “A Concept of Critical Thinking: A Proposed Basis for Research on the Teaching and Evaluation of Critical Thinking Ability”, Harvard Educational Review , 32(1): 81–111.
  • –––, 1981a, “A Conception of Deductive Logical Competence”, Teaching Philosophy , 4(3/4): 337–385. doi:10.5840/teachphil198143/429
  • –––, 1981b, “Eight Fallacies in Bloom’s Taxonomy”, in C. J. B. Macmillan (ed.), Philosophy of Education 1980: Proceedings of the Thirty-seventh Annual Meeting of the Philosophy of Education Society , Bloomington, IL: Philosophy of Education Society, pp. 269–273.
  • –––, 1984, “Problems in Testing Informal Logic, Critical Thinking, Reasoning Ability”. Informal Logic , 6(1): 3–9. [ Ennis 1984 available online ]
  • –––, 1987, “A Taxonomy of Critical Thinking Dispositions and Abilities”, in Joan Boykoff Baron and Robert J. Sternberg (eds.), Teaching Thinking Skills: Theory and Practice , New York: W. H. Freeman, pp. 9–26.
  • –––, 1989, “Critical Thinking and Subject Specificity: Clarification and Needed Research”, Educational Researcher , 18(3): 4–10. doi:10.3102/0013189X018003004
  • –––, 1991, “Critical Thinking: A Streamlined Conception”, Teaching Philosophy , 14(1): 5–24. doi:10.5840/teachphil19911412
  • –––, 1996, “Critical Thinking Dispositions: Their Nature and Assessability”, Informal Logic , 18(2–3): 165–182. [ Ennis 1996 available online ]
  • –––, 1998, “Is Critical Thinking Culturally Biased?”, Teaching Philosophy , 21(1): 15–33. doi:10.5840/teachphil19982113
  • –––, 2011, “Critical Thinking: Reflection and Perspective Part I”, Inquiry: Critical Thinking across the Disciplines , 26(1): 4–18. doi:10.5840/inquiryctnews20112613
  • –––, 2013, “Critical Thinking across the Curriculum: The Wisdom CTAC Program”, Inquiry: Critical Thinking across the Disciplines , 28(2): 25–45. doi:10.5840/inquiryct20132828
  • –––, 2016, “Definition: A Three-Dimensional Analysis with Bearing on Key Concepts”, in Patrick Bondy and Laura Benacquista (eds.), Argumentation, Objectivity, and Bias: Proceedings of the 11th International Conference of the Ontario Society for the Study of Argumentation (OSSA), 18–21 May 2016 , Windsor, ON: OSSA, pp. 1–19. Available at http://scholar.uwindsor.ca/ossaarchive/OSSA11/papersandcommentaries/105 ; accessed 2017 12 02.
  • –––, 2018, “Critical Thinking Across the Curriculum: A Vision”, Topoi , 37(1): 165–184. doi:10.1007/s11245-016-9401-4
  • Ennis, Robert H., and Jason Millman, 1971, Manual for Cornell Critical Thinking Test, Level X, and Cornell Critical Thinking Test, Level Z , Urbana, IL: Critical Thinking Project, University of Illinois.
  • Ennis, Robert H., Jason Millman, and Thomas Norbert Tomko, 1985, Cornell Critical Thinking Tests Level X & Level Z: Manual , Pacific Grove, CA: Midwest Publication, 3rd edition.
  • –––, 2005, Cornell Critical Thinking Tests Level X & Level Z: Manual , Seaside, CA: Critical Thinking Company, 5th edition.
  • Ennis, Robert H. and Eric Weir, 1985, The Ennis-Weir Critical Thinking Essay Test: Test, Manual, Criteria, Scoring Sheet: An Instrument for Teaching and Testing , Pacific Grove, CA: Midwest Publications.
  • Facione, Peter A., 1990a, Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction , Research Findings and Recommendations Prepared for the Committee on Pre-College Philosophy of the American Philosophical Association, ERIC Document ED315423.
  • –––, 1990b, California Critical Thinking Skills Test, CCTST – Form A , Millbrae, CA: The California Academic Press.
  • –––, 1990c, The California Critical Thinking Skills Test--College Level. Technical Report #3. Gender, Ethnicity, Major, CT Self-Esteem, and the CCTST , ERIC Document ED326584.
  • –––, 1992, California Critical Thinking Skills Test: CCTST – Form B, Millbrae, CA: The California Academic Press.
  • –––, 2000, “The Disposition Toward Critical Thinking: Its Character, Measurement, and Relationship to Critical Thinking Skill”, Informal Logic , 20(1): 61–84. [ Facione 2000 available online ]
  • Facione, Peter A. and Noreen C. Facione, 1992, CCTDI: A Disposition Inventory , Millbrae, CA: The California Academic Press.
  • Facione, Peter A., Noreen C. Facione, and Carol Ann F. Giancarlo, 2001, California Critical Thinking Disposition Inventory: CCTDI: Inventory Manual , Millbrae, CA: The California Academic Press.
  • Facione, Peter A., Carol A. Sánchez, and Noreen C. Facione, 1994, Are College Students Disposed to Think? , Millbrae, CA: The California Academic Press. ERIC Document ED368311.
  • Fisher, Alec, and Michael Scriven, 1997, Critical Thinking: Its Definition and Assessment , Norwich: Centre for Research in Critical Thinking, University of East Anglia.
  • Freire, Paulo, 1968 [1970], Pedagogia do Oprimido . Translated as Pedagogy of the Oppressed , Myra Bergman Ramos (trans.), New York: Continuum, 1970.
  • Glaser, Edward Maynard, 1941, An Experiment in the Development of Critical Thinking , New York: Bureau of Publications, Teachers College, Columbia University.
  • Halpern, Diane F., 1998, “Teaching Critical Thinking for Transfer Across Domains: Disposition, Skills, Structure Training, and Metacognitive Monitoring”, American Psychologist , 53(4): 449–455. doi:10.1037/0003-066X.53.4.449
  • –––, 2016, Manual: Halpern Critical Thinking Assessment , Mödling, Austria: Schuhfried. Available at https://drive.google.com/file/d/0BzUoP_pmwy1gdEpCR05PeW9qUzA/view ; accessed 2017 12 01.
  • Hamby, Benjamin, 2014, The Virtues of Critical Thinkers , Doctoral dissertation, Philosophy, McMaster University. [ Hamby 2014 available online ]
  • –––, 2015, “Willingness to Inquire: The Cardinal Critical Thinking Virtue”, in Martin Davies and Ronald Barnett (eds.), The Palgrave Handbook of Critical Thinking in Higher Education , New York: Palgrave Macmillan, pp. 77–87.
  • Haynes, Ada, Elizabeth Lisic, Kevin Harris, Katie Leming, Kyle Shanks, and Barry Stein, 2015, “Using the Critical Thinking Assessment Test (CAT) as a Model for Designing Within-Course Assessments: Changing How Faculty Assess Student Learning”, Inquiry: Critical Thinking Across the Disciplines , 30(3): 38–48. doi:10.5840/inquiryct201530316
  • Hitchcock, David, 2017, “Critical Thinking as an Educational Ideal”, in his On Reasoning and Argument: Essays in Informal Logic and on Critical Thinking , Dordrecht: Springer, pp. 477–497. doi:10.1007/978-3-319-53562-3_30
  • hooks, bell, 1994, Teaching to Transgress: Education as the Practice of Freedom , New York and London: Routledge.
  • –––, 2010, Teaching Critical Thinking: Practical Wisdom , New York and London: Routledge.
  • Johnson, Ralph H., 1992, “The Problem of Defining Critical Thinking”, in Stephen P. Norris (ed.), The Generalizability of Critical Thinking , New York: Teachers College Press, pp. 38–53.
  • Kahane, Howard, 1971, Logic and Contemporary Rhetoric: The Use of Reason in Everyday Life , Belmont, CA: Wadsworth.
  • Kahneman, Daniel, 2011, Thinking, Fast and Slow , New York: Farrar, Straus and Giroux.
  • Kenyon, Tim, and Guillaume Beaulac, 2014, “Critical Thinking Education and Debasing”, Informal Logic , 34(4): 341–363. [ Kenyon & Beaulac 2014 available online ]
  • Krathwohl, David R., Benjamin S. Bloom, and Bertram B. Masia, 1964, Taxonomy of Educational Objectives, Handbook II: Affective Domain , New York: David McKay.
  • Kuhn, Deanna, 1991, The Skills of Argument , New York: Cambridge University Press. doi:10.1017/CBO9780511571350
  • Lipman, Matthew, 1987, “Critical Thinking–What Can It Be?”, Analytic Teaching , 8(1): 5–12. [ Lipman 1987 available online ]
  • Loftus, Elizabeth F., 2017, “Eavesdropping on Memory”, Annual Review of Psychology , 68: 1–18. doi:10.1146/annurev-psych-010416-044138
  • Martin, Jane Roland, 1992, “Critical Thinking for a Humane World”, in Stephen P. Norris (ed.), The Generalizability of Critical Thinking , New York: Teachers College Press, pp. 163–180.
  • Mayhew, Katherine Camp, and Anna Camp Edwards, 1936, The Dewey School: The Laboratory School of the University of Chicago, 1896–1903 , New York: Appleton-Century. [ Mayhew & Edwards 1936 available online ]
  • McPeck, John E., 1981, Critical Thinking and Education , New York: St. Martin’s Press.
  • Nickerson, Raymond S., 1998, “Confirmation Bias: A Ubiquitous Phenomenon in Many Guises”, Review of General Psychology , 2(2): 175–220. doi:10.1037/1089-2680.2.2.175
  • Nieto, Ana Maria, and Jorge Valenzuela, 2012, “A Study of the Internal Structure of Critical Thinking Dispositions”, Inquiry: Critical Thinking across the Disciplines , 27(1): 31–38. doi:10.5840/inquiryct20122713
  • Norris, Stephen P., 1988, “Controlling for Background Beliefs When Developing Multiple-choice Critical Thinking Tests”, Educational Measurement: Issues and Practice , 7(3): 5–11. doi:10.1111/j.1745-3992.1988.tb00437.x
  • Norris, Stephen P. and Robert H. Ennis, 1989, Evaluating Critical Thinking (The Practitioners’ Guide to Teaching Thinking Series), Pacific Grove, CA: Midwest Publications.
  • Norris, Stephen P. and Ruth Elizabeth King, 1983, Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland.
  • –––, 1984, The Design of a Critical Thinking Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland. ERIC Document ED260083.
  • –––, 1985, Test on Appraising Observations: Manual , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland.
  • –––, 1990a, Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland, 2nd edition.
  • –––, 1990b, Test on Appraising Observations: Manual , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland, 2nd edition.
  • Obama, Barack, 2014, State of the Union Address , January 28, 2014. [ Obama 2014 available online ]
  • OCR [Oxford, Cambridge and RSA Examinations], 2011, AS/A Level GCE: Critical Thinking – H052, H452 , Cambridge: OCR. Information available at http://www.ocr.org.uk/qualifications/as-a-level-gce-critical-thinking-h052-h452/ ; accessed 2017 10 12.
  • OECD [Organization for Economic Cooperation and Development] Centre for Educational Research and Innovation, 2018, Fostering and Assessing Students’ Creative and Critical Thinking Skills in Higher Education , Paris: OECD. Available at http://www.oecd.org/education/ceri/Fostering-and-assessing-students-creative-and-critical-thinking-skills-in-higher-education.pdf ; accessed 2018 04 22.
  • Ontario Ministry of Education, 2013, The Ontario Curriculum Grades 9 to 12: Social Sciences and Humanities . Available at http://www.edu.gov.on.ca/eng/curriculum/secondary/ssciences9to122013.pdf ; accessed 2017 11 16.
  • Passmore, John Arthur, 1980, The Philosophy of Teaching , London: Duckworth.
  • Paul, Richard W., 1981, “Teaching Critical Thinking in the ‘Strong’ Sense: A Focus on Self-Deception, World Views, and a Dialectical Mode of Analysis”, Informal Logic , 4(2): 2–7. [ Paul 1981 available online ]
  • –––, 1984, “Critical Thinking: Fundamental to Education for a Free Society”, Educational Leadership , 42(1): 4–14.
  • –––, 1985, “McPeck’s Mistakes”, Informal Logic , 7(1): 35–43. [ Paul 1985 available online ]
  • Paul, Richard W. and Linda Elder, 2006, The Miniature Guide to Critical Thinking: Concepts and Tools , Dillon Beach, CA: Foundation for Critical Thinking, 4th edition.
  • Payette, Patricia, and Edna Ross, 2016, “Making a Campus-Wide Commitment to Critical Thinking: Insights and Promising Practices Utilizing the Paul-Elder Approach at the University of Louisville”, Inquiry: Critical Thinking Across the Disciplines , 31(1): 98–110. doi:10.5840/inquiryct20163118
  • Possin, Kevin, 2008, “A Field Guide to Critical-Thinking Assessment”, Teaching Philosophy , 31(3): 201–228. doi:10.5840/teachphil200831324
  • –––, 2013a, “Some Problems with the Halpern Critical Thinking Assessment (HCTA) Test”, Inquiry: Critical Thinking across the Disciplines , 28(3): 4–12. doi:10.5840/inquiryct201328313
  • –––, 2013b, “A Serious Flaw in the Collegiate Learning Assessment (CLA) Test”, Informal Logic , 33(3): 390–405. [ Possin 2013b available online ]
  • –––, 2014, “Critique of the Watson-Glaser Critical Thinking Appraisal Test: The More You Know, the Lower Your Score”, Informal Logic , 34(4): 393–416. [ Possin 2014 available online ]
  • Rawls, John, 1971, A Theory of Justice , Cambridge, MA: Harvard University Press.
  • Rousseau, Jean-Jacques, 1762, Émile , Amsterdam: Jean Néaulme.
  • Scheffler, Israel, 1960, The Language of Education , Springfield, IL: Charles C. Thomas.
  • Scriven, Michael, and Richard W. Paul, 1987, Defining Critical Thinking , Draft statement written for the National Council for Excellence in Critical Thinking Instruction. Available at http://www.criticalthinking.org/pages/defining-critical-thinking/766 ; accessed 2017 11 29.
  • Sheffield, Clarence Burton Jr., 2018, “Promoting Critical Thinking in Higher Education: My Experiences as the Inaugural Eugene H. Fram Chair in Applied Critical Thinking at Rochester Institute of Technology”, Topoi , 37(1): 155–163. doi:10.1007/s11245-016-9392-1
  • Siegel, Harvey, 1985, “McPeck, Informal Logic and the Nature of Critical Thinking”, in David Nyberg (ed.), Philosophy of Education 1985: Proceedings of the Forty-First Annual Meeting of the Philosophy of Education Society , Normal, IL: Philosophy of Education Society, pp. 61–72.
  • –––, 1988, Educating Reason: Rationality, Critical Thinking, and Education , New York: Routledge.
  • –––, 1999, “What (Good) Are Thinking Dispositions?”, Educational Theory , 49(2): 207–221. doi:10.1111/j.1741-5446.1999.00207.x
  • Simpson, Elizabeth, 1966–67, “The Classification of Educational Objectives: Psychomotor Domain”, Illinois Teacher of Home Economics , 10(4): 110–144, ERIC document ED0103613. [ Simpson 1966–67 available online ]
  • Skolverket, 2011, Curriculum for the Compulsory School, Preschool Class and the Recreation Centre , Stockholm: Ordförrådet AB. Available at http://malmo.se/download/18.29c3b78a132728ecb52800034181/pdf2687.pdf ; accessed 2017 11 16.
  • Smith, B. Othanel, 1953, “The Improvement of Critical Thinking”, Progressive Education , 30(5): 129–134.
  • Smith, Eugene Randolph, Ralph Winfred Tyler, and the Evaluation Staff, 1942, Appraising and Recording Student Progress , Volume III of Adventure in American Education , New York and London: Harper & Brothers.
  • Splitter, Laurance J., 1987, “Educational Reform through Philosophy for Children”, Thinking: The Journal of Philosophy for Children , 7(2): 32–39. doi:10.5840/thinking1987729
  • Stanovich Keith E., and Paula J. Stanovich, 2010, “A Framework for Critical Thinking, Rational Thinking, and Intelligence”, in David D. Preiss and Robert J. Sternberg (eds), Innovations in Educational Psychology: Perspectives on Learning, Teaching and Human Development , New York: Springer Publishing, pp 195–237.
  • Stanovich Keith E., Richard F. West, and Maggie E. Toplak, 2011, “Intelligence and Rationality”, in Robert J. Sternberg and Scott Barry Kaufman (eds.), Cambridge Handbook of Intelligence , Cambridge: Cambridge University Press, 3rd edition, pp. 784–826. doi:10.1017/CBO9780511977244.040
  • Tankersley, Karen, 2005, Literacy Strategies for Grades 4–12: Reinforcing the Threads of Reading , Alexandria, VA: Association for Supervision and Curriculum Development.
  • Thayer-Bacon, Barbara J., 1992, “Is Modern Critical Thinking Theory Sexist?”, Inquiry: Critical Thinking Across the Disciplines , 10(1): 3–7. doi:10.5840/inquiryctnews199210123
  • –––, 1993, “Caring and Its Relationship to Critical Thinking”, Educational Theory , 43(3): 323–340. doi:10.1111/j.1741-5446.1993.00323.x
  • –––, 1995a, “Constructive Thinking: Personal Voice”, Journal of Thought , 30(1): 55–70.
  • –––, 1995b, “Doubting and Believing: Both are Important for Critical Thinking”, Inquiry: Critical Thinking across the Disciplines , 15(2): 59–66. doi:10.5840/inquiryctnews199515226
  • –––, 2000, Transforming Critical Thinking: Thinking Constructively , New York: Teachers College Press.
  • Toulmin, Stephen Edelston, 1958, The Uses of Argument , Cambridge: Cambridge University Press.
  • Turri, John, Mark Alfano, and John Greco, 2017, “Virtue Epistemology”, in Edward N. Zalta (ed.), The Stanford Encyclopedia of Philosophy (Winter 2017 Edition). URL = <https://plato.stanford.edu/archives/win2017/entries/epistemology-virtue/>
  • Warren, Karen J., 1988, “Critical Thinking and Feminism”, Informal Logic , 10(1): 31–44. [ Warren 1988 available online ]
  • Watson, Goodwin, and Edward M. Glaser, 1980a, Watson-Glaser Critical Thinking Appraisal, Form A , San Antonio, TX: Psychological Corporation.
  • –––, 1980b, Watson-Glaser Critical Thinking Appraisal: Forms A and B; Manual , San Antonio, TX: Psychological Corporation.
  • –––, 1994, Watson-Glaser Critical Thinking Appraisal, Form B , San Antonio, TX: Psychological Corporation.
  • Weinstein, Mark, 1990, “Towards a Research Agenda for Informal Logic and Critical Thinking”, Informal Logic , 12(3): 121–143. [ Weinstein 1990 available online ]
  • –––, 2013, Logic, Truth and Inquiry , London: College Publications.
  • Zagzebski, Linda Trinkaus, 1996, Virtues of the Mind: An Inquiry into the Nature of Virtue and the Ethical Foundations of Knowledge , Cambridge: Cambridge University Press. doi:10.1017/CBO9781139174763
Other Internet Resources
  • Association for Informal Logic and Critical Thinking (AILACT)
  • Center for Teaching Thinking (CTT)
  • Critical Thinking Across the European Higher Education Curricula (CRITHINKEDU)
  • Critical Thinking Definition, Instruction, and Assessment: A Rigorous Approach (criticalTHINKING.net)
  • Critical Thinking Research (RAIL)
  • Foundation for Critical Thinking
  • Insight Assessment
  • Partnership for 21st Century Learning (P21)
  • The Critical Thinking Consortium
  • The Nature of Critical Thinking: An Outline of Critical Thinking Dispositions and Abilities , by Robert H. Ennis

Related Entries

abilities | bias, implicit | children, philosophy for | civic education | decision-making capacity | Dewey, John | dispositions | education, philosophy of | epistemology: virtue | logic: informal

Copyright © 2018 by David Hitchcock <hitchckd@mcmaster.ca>
