Radford University

Center for Innovation and Analytics


Problem Solving, Critical Thinking, and Analytical Reasoning Skills Sought by Employers

In this section:

  • Problem Solving
  • Critical Thinking
  • Analytical Reasoning


Critical thinking, analytical reasoning, and problem-solving skills are required to perform well on tasks expected by employers. 1 Having good problem-solving and critical thinking skills can make a major difference in a person’s career. 2

Every day, everyone from an entry-level employee to the Chairman of the Board needs to resolve problems. Whether they are solving a problem for a client (internal or external), supporting those who are solving problems, or discovering new problems to solve, the challenges they face may be simple or complex, easy or difficult.

A fundamental component of every manager's role is solving problems. So, helping students become confident problem solvers is critical to their success, and that confidence comes from possessing an efficient, well-practiced problem-solving process.

Employers want employees with well-founded skills in these areas, so they ask four questions when assessing a job candidate: 3

  • Evaluation of information: How well does the applicant assess the quality and relevance of information?
  • Analysis and Synthesis of information: How well does the applicant analyze and synthesize data and information?
  • Drawing conclusions: How well does the applicant form a conclusion from their analysis?
  • Acknowledging alternative explanations/viewpoints: How well does the applicant consider other options and acknowledge that their answer is not the only perspective?

When an employer says they want employees who are good at solving complex problems, they are saying they want employees possessing the following skills:

  • Analytical Thinking — A person who can use logic and critical thinking to analyze a situation.
  • Critical Thinking – A person who makes reasoned judgments that are logical and well thought out.
  • Initiative — A person who will step up and take action without being asked. A person who looks for opportunities to make a difference.
  • Creativity — A person who is an original thinker and has the ability to go beyond traditional approaches.
  • Resourcefulness — A person who will adapt to new/difficult situations and devise ways to overcome obstacles.
  • Determination — A person who is persistent and does not give up easily.
  • Results-Oriented — A person whose focus is on getting the problem solved.

Two of the major components of problem-solving skills are critical thinking and analytical reasoning. These two skills are at the top of the list of skills employers require of applicants.


Critical Thinking 4

“Mentions of critical thinking in job postings have doubled since 2009, according to an analysis by career-search site Indeed.com.” 5 Making logical and reasoned judgments that are well thought out is at the core of critical thinking. Using critical thinking, an individual will not automatically accept information, or the conclusions drawn from it, as factual, valid, true, applicable, or correct. “When students are taught how to use critical thinking to tap into their creativity to solve problems, they are more successful than other students when they enter management-training programs in large corporations.” 6

A strong applicant questions assumptions and wants to make evidence-based decisions. Employers want employees who ask things such as: “Is that a fact or just an opinion? Is this conclusion based on data or gut feel?” and “If you had additional data, could there be alternative possibilities?” Employers seek employees who possess the skills and abilities to conceptualize, apply, analyze, synthesize, and evaluate information to reach an answer or conclusion.

Employers require critical thinking in employees because it increases the probability of a positive business outcome. Employers want employees whose thinking is intentional, purposeful, reasoned, and goal-directed.

Recruiters say they want applicants with problem-solving and critical thinking skills. They “encourage applicants to prepare stories to illustrate their critical-thinking prowess, detailing, for example, the steps a club president took to improve attendance at weekly meetings.” 7

Analytical Reasoning

Employers want students to possess analytical reasoning/thinking skills — meaning they want to hire someone who is good at breaking down problems into smaller parts to find solutions. “The adjective, analytical, and the related verb analyze can both be traced back to the Greek verb, analyein — ‘to break up, to loosen.’ If a student is analytical, they are good at taking a problem or task and breaking it down into smaller elements in order to solve the problem or complete the task.” 9

Analytical reasoning connotes a person's general aptitude to arrive at a logical conclusion or solution to given problems. Just as with critical thinking, analytical thinking critically examines the different parts or details of something to fully understand or explain it. Analytical thinking often requires the person to use “cause and effect, similarities and differences, trends, associations between things, inter-relationships between the parts, the sequence of events, ways to solve complex problems, steps within a process, diagraming what is happening.” 10

Analytical reasoning is the ability to look at information and discern patterns within it. “The pattern could be the structure the author of the information uses to structure an argument, or trends in a large data set. By learning methods of recognizing these patterns, individuals can pull more information out of a text or data set than someone who is not using analytical reasoning to identify deeper patterns.” 11

Employers want employees to have the aptitude to apply analytical reasoning to problems faced by the business. For instance, “a quantitative analyst can break down data into patterns to discern information, such as if a decrease in sales is part of a seasonal pattern of ups and downs or part of a greater downward trend that a business should be worried about. By learning to recognize these patterns in both numbers and written arguments, an individual gains insights into the information that someone who simply takes the information at face value will miss.” 12
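
To make the quantitative analyst's task concrete, here is a minimal sketch, not taken from the source, of how one might separate a seasonal dip in sales from a longer-term decline. The sales figures are synthetic, and it assumes the pandas, NumPy, and statsmodels libraries are available.

import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import seasonal_decompose

# Hypothetical monthly sales: a gradual decline plus a repeating yearly cycle.
rng = np.random.default_rng(0)
months = pd.date_range("2019-01-01", periods=60, freq="MS")
trend = np.linspace(120, 100, 60)                      # slow downward drift
seasonal = 10 * np.sin(2 * np.pi * months.month / 12)  # seasonal ups and downs
sales = pd.Series(trend + seasonal + rng.normal(0, 2, 60), index=months)

# Split the series into trend, seasonal, and residual components so a drop in
# sales can be judged against the normal seasonal pattern.
parts = seasonal_decompose(sales, model="additive", period=12)

# If the trend component keeps falling after the seasonal effect is removed,
# the decline is more than an ordinary seasonal dip.
print(parts.trend.dropna().tail())

seasonal_decompose is only one way to separate these components; the point is the analytical step of distinguishing a recurring pattern from an underlying trend.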

Managers with excellent analytical reasoning abilities are considered good at “evaluating problems, analyzing them from more than one angle and finding a solution that works best in the given circumstances”. 13 Businesses want managers who can apply analytical reasoning skills to meet challenges and keep a business functioning smoothly.

A person with good analytical reasoning and pattern-recognition skills can spot trends in a problem much more easily than others.


Analytical thinking: what it is and why it matters more than ever

January 30, 2024


Welcome back to our high-impact workplace skills series. We really enjoyed the conversations happening in the comments section of last week’s top skills of 2023 issue, so be sure to check those out for perspectives and insights from fellow members of our Career Chat community.

One comment that’s been on our mind came from Kendra Vivian Lewis, who asked some thoughtful questions about the comparative importance of workplace and technical skills and whether there’s a way to forecast which skills will be important in the coming years. This week’s topic—analytical thinking, the number one skill on the list—is a great example as we explore both questions. Be sure to read to the end to discover a special offer that we’re running on Coursera Plus subscriptions through September 21.

What it means to think analytically

Analytical thinking involves using data to understand problems, identify potential solutions, and suggest the solution that’s most likely to have the desired impact. It’s similar to critical thinking skills, which are the skills you use to interpret information and make decisions.

In order to succeed as a strong analytical thinker, you also need to have strong technical skills in your field. Remember: technical skills describe the things you do, while workplace skills describe how you do them. So your workplace skills, used effectively, enhance your technical skills. That’s why we consider them to be high-impact—they stand to make your work more impactful than it would have been had you only used your technical skills.

To illustrate, suppose you just started a job as a data analyst for a think tank focused on climate change, and you’ve been tasked with raising community engagement in future climate action efforts.

You might start with your technical data analysis skills as you gather data from a few sources. Then, you’ll use your analytical thinking skills to determine the validity of each data source. Perhaps you’ll discard one source when you learn the research was funded by a firm with a financial stake in fossil fuel consumption. Your technical skills lead again as you clean data, and then you’ll return to your analytical thinking skills to analyze and interpret your findings, ultimately leading to your recommendation to start a transparency campaign to display water and energy use in the community.
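
As a rough illustration of how those technical and analytical steps might interleave in practice, here is a short sketch, not from the newsletter, using invented data; the source names, the conflict-of-interest flag, and the column names are all hypothetical.

import pandas as pd

# Hypothetical community data from three sources (all names and figures invented).
raw = {
    "survey_a": pd.DataFrame({"household_id": [1, 2, 3],
                              "energy_kwh": [310.0, 295.0, None],
                              "water_liters": [8200, 7900, 8100]}),
    "utility_b": pd.DataFrame({"household_id": [4, 5],
                               "energy_kwh": [405.0, 390.0],
                               "water_liters": [9000, 8700]}),
    "industry_c": pd.DataFrame({"household_id": [6, 7],
                                "energy_kwh": [150.0, 160.0],
                                "water_liters": [5000, 5200]}),
}

# Analytical-thinking step: judge the validity of each source. Here we drop the
# source whose research was funded by a party with a conflict of interest.
conflicted_sources = {"industry_c"}
trusted = {name: df for name, df in raw.items() if name not in conflicted_sources}

# Technical step: clean the data (drop incomplete records) and combine it.
frames = [df.dropna().assign(source=name) for name, df in trusted.items()]
combined = pd.concat(frames, ignore_index=True)

# Interpret the findings: average use per source, the kind of figure a
# transparency campaign could display to the community.
print(combined.groupby("source")[["energy_kwh", "water_liters"]].mean())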

Tell us in the comments: How do you use your analytical skills alongside your technical skills in your day-to-day work?

Why analytical skills top the list

To develop the skills list, the World Economic Forum surveyed 800+ global employers on their views of skills and jobs over the next five years, so this list is forward-looking. According to the Future of Jobs Report, employers believe analytical thinking skills will grow in importance by 72 percent in this timeframe.

The reason employers are keen to hire employees with strong analytical thinking skills is informed by trends in automation and technological advancements. While technical data analysis becomes easier with automation, reasoning and decision-making automation is advancing at a much slower pace—meaning employers anticipate that, within the next five years, we’ll have a wealth of data at our fingertips and too few people to interpret what that data means.

Where to begin

For a crash course in critical thinking, try the University of California, Davis’s Critical Thinking Skills for the Professional course. You can finish this beginner-level course in about 7 hours.

For a more comprehensive exploration of analytical thinking, try Duke University’s Introduction to Logic and Critical Thinking Specialization. Over four courses, you’ll learn how to effectively argue and reason using logic.

For a technical process to guide your analytical thinking, try Google’s Data Analytics Professional Certificate. Ground your analytical thinking skills in technical know-how in this eight-course series.

Interested in multiple programs? Don’t miss this special offer!

Through September 21, we’re offering $100 off annual Coursera Plus subscriptions for new subscribers. With this offer, you’ll pay less than $25 per month for one year of access to 6,100 courses, Specializations, and Professional Certificates with flexibility to start new courses and move between programs at your pace.

This offer is a great choice if you are frequently tempted to enroll in multiple courses at once or plan to complete a Specialization or Professional Certificate within the next year. If that sounds like you, take a closer look at the offer and the Coursera Plus course catalog.

That’s all for this week! Join us next week to talk about motivation and self-awareness skills.



Critical Thinking

Critical thinking is a widely accepted educational goal. Its definition is contested, but the competing definitions can be understood as differing conceptions of the same basic concept: careful thinking directed to a goal. Conceptions differ with respect to the scope of such thinking, the type of goal, the criteria and norms for thinking carefully, and the thinking components on which they focus. Its adoption as an educational goal has been recommended on the basis of respect for students’ autonomy and preparing students for success in life and for democratic citizenship. “Critical thinkers” have the dispositions and abilities that lead them to think critically when appropriate. The abilities can be identified directly; the dispositions indirectly, by considering what factors contribute to or impede exercise of the abilities. Standardized tests have been developed to assess the degree to which a person possesses such dispositions and abilities. Educational intervention has been shown experimentally to improve them, particularly when it includes dialogue, anchored instruction, and mentoring. Controversies have arisen over the generalizability of critical thinking across domains, over alleged bias in critical thinking theories and instruction, and over the relationship of critical thinking to other types of thinking.

  • 1. History
  • 2. Examples and Non-Examples
  • 2.1 Dewey’s Three Main Examples
  • 2.2 Dewey’s Other Examples
  • 2.3 Further Examples
  • 2.4 Non-Examples
  • 3. The Definition of Critical Thinking
  • 4. Its Value
  • 5. The Process of Thinking Critically
  • 6. Components of the Process
  • 7. Contributory Dispositions and Abilities
  • 8. Critical Thinking Dispositions
  • 8.1 Initiating Dispositions
  • 8.2 Internal Dispositions
  • 9. Critical Thinking Abilities
  • 10. Required Knowledge
  • 11. Educational Methods
  • 12. Controversies
  • 12.1 The Generalizability of Critical Thinking
  • 12.2 Bias in Critical Thinking Theory and Pedagogy
  • 12.3 Relationship of Critical Thinking to Other Types of Thinking
  • Other Internet Resources
  • Related Entries

1. History

Use of the term ‘critical thinking’ to describe an educational goal goes back to the American philosopher John Dewey (1910), who more commonly called it ‘reflective thinking’. He defined it as

active, persistent and careful consideration of any belief or supposed form of knowledge in the light of the grounds that support it, and the further conclusions to which it tends. (Dewey 1910: 6; 1933: 9)

and identified a habit of such consideration with a scientific attitude of mind. His lengthy quotations of Francis Bacon, John Locke, and John Stuart Mill indicate that he was not the first person to propose development of a scientific attitude of mind as an educational goal.

In the 1930s, many of the schools that participated in the Eight-Year Study of the Progressive Education Association (Aikin 1942) adopted critical thinking as an educational goal, for whose achievement the study’s Evaluation Staff developed tests (Smith, Tyler, & Evaluation Staff 1942). Glaser (1941) showed experimentally that it was possible to improve the critical thinking of high school students. Bloom’s influential taxonomy of cognitive educational objectives (Bloom et al. 1956) incorporated critical thinking abilities. Ennis (1962) proposed 12 aspects of critical thinking as a basis for research on the teaching and evaluation of critical thinking ability.

Since 1980, an annual international conference in California on critical thinking and educational reform has attracted tens of thousands of educators from all levels of education and from many parts of the world. Also since 1980, the state university system in California has required all undergraduate students to take a critical thinking course. Since 1983, the Association for Informal Logic and Critical Thinking has sponsored sessions in conjunction with the divisional meetings of the American Philosophical Association (APA). In 1987, the APA’s Committee on Pre-College Philosophy commissioned a consensus statement on critical thinking for purposes of educational assessment and instruction (Facione 1990a). Researchers have developed standardized tests of critical thinking abilities and dispositions; for details, see the Supplement on Assessment. Educational jurisdictions around the world now include critical thinking in guidelines for curriculum and assessment.

For details on this history, see the Supplement on History.

2. Examples and Non-Examples

Before considering the definition of critical thinking, it will be helpful to have in mind some examples of critical thinking, as well as some examples of kinds of thinking that would apparently not count as critical thinking.

2.1 Dewey’s Three Main Examples

Dewey (1910: 68–71; 1933: 91–94) takes as paradigms of reflective thinking three class papers of students in which they describe their thinking. The examples range from the everyday to the scientific.

Transit: “The other day, when I was down town on 16th Street, a clock caught my eye. I saw that the hands pointed to 12:20. This suggested that I had an engagement at 124th Street, at one o’clock. I reasoned that as it had taken me an hour to come down on a surface car, I should probably be twenty minutes late if I returned the same way. I might save twenty minutes by a subway express. But was there a station near? If not, I might lose more than twenty minutes in looking for one. Then I thought of the elevated, and I saw there was such a line within two blocks. But where was the station? If it were several blocks above or below the street I was on, I should lose time instead of gaining it. My mind went back to the subway express as quicker than the elevated; furthermore, I remembered that it went nearer than the elevated to the part of 124th Street I wished to reach, so that time would be saved at the end of the journey. I concluded in favor of the subway, and reached my destination by one o’clock.” (Dewey 1910: 68–69; 1933: 91–92)

Ferryboat: “Projecting nearly horizontally from the upper deck of the ferryboat on which I daily cross the river is a long white pole, having a gilded ball at its tip. It suggested a flagpole when I first saw it; its color, shape, and gilded ball agreed with this idea, and these reasons seemed to justify me in this belief. But soon difficulties presented themselves. The pole was nearly horizontal, an unusual position for a flagpole; in the next place, there was no pulley, ring, or cord by which to attach a flag; finally, there were elsewhere on the boat two vertical staffs from which flags were occasionally flown. It seemed probable that the pole was not there for flag-flying.

“I then tried to imagine all possible purposes of the pole, and to consider for which of these it was best suited: (a) Possibly it was an ornament. But as all the ferryboats and even the tugboats carried poles, this hypothesis was rejected. (b) Possibly it was the terminal of a wireless telegraph. But the same considerations made this improbable. Besides, the more natural place for such a terminal would be the highest part of the boat, on top of the pilot house. (c) Its purpose might be to point out the direction in which the boat is moving.

“In support of this conclusion, I discovered that the pole was lower than the pilot house, so that the steersman could easily see it. Moreover, the tip was enough higher than the base, so that, from the pilot’s position, it must appear to project far out in front of the boat. Moreover, the pilot being near the front of the boat, he would need some such guide as to its direction. Tugboats would also need poles for such a purpose. This hypothesis was so much more probable than the others that I accepted it. I formed the conclusion that the pole was set up for the purpose of showing the pilot the direction in which the boat pointed, to enable him to steer correctly.” (Dewey 1910: 69–70; 1933: 92–93)

Bubbles: “In washing tumblers in hot soapsuds and placing them mouth downward on a plate, bubbles appeared on the outside of the mouth of the tumblers and then went inside. Why? The presence of bubbles suggests air, which I note must come from inside the tumbler. I see that the soapy water on the plate prevents escape of the air save as it may be caught in bubbles. But why should air leave the tumbler? There was no substance entering to force it out. It must have expanded. It expands by increase of heat, or by decrease of pressure, or both. Could the air have become heated after the tumbler was taken from the hot suds? Clearly not the air that was already entangled in the water. If heated air was the cause, cold air must have entered in transferring the tumblers from the suds to the plate. I test to see if this supposition is true by taking several more tumblers out. Some I shake so as to make sure of entrapping cold air in them. Some I take out holding mouth downward in order to prevent cold air from entering. Bubbles appear on the outside of every one of the former and on none of the latter. I must be right in my inference. Air from the outside must have been expanded by the heat of the tumbler, which explains the appearance of the bubbles on the outside. But why do they then go inside? Cold contracts. The tumbler cooled and also the air inside it. Tension was removed, and hence bubbles appeared inside. To be sure of this, I test by placing a cup of ice on the tumbler while the bubbles are still forming outside. They soon reverse” (Dewey 1910: 70–71; 1933: 93–94).

2.2 Dewey’s Other Examples

Dewey (1910, 1933) sprinkles his book with other examples of critical thinking. We will refer to the following.

Weather: A man on a walk notices that it has suddenly become cool, thinks that it is probably going to rain, looks up and sees a dark cloud obscuring the sun, and quickens his steps (1910: 6–10; 1933: 9–13).

Disorder: A man finds his rooms on his return to them in disorder with his belongings thrown about, thinks at first of burglary as an explanation, then thinks of mischievous children as being an alternative explanation, then looks to see whether valuables are missing, and discovers that they are (1910: 82–83; 1933: 166–168).

Typhoid: A physician diagnosing a patient whose conspicuous symptoms suggest typhoid avoids drawing a conclusion until more data are gathered by questioning the patient and by making tests (1910: 85–86; 1933: 170).

Blur: A moving blur catches our eye in the distance, we ask ourselves whether it is a cloud of whirling dust or a tree moving its branches or a man signaling to us, we think of other traits that should be found on each of those possibilities, and we look and see if those traits are found (1910: 102, 108; 1933: 121, 133).

Suction pump: In thinking about the suction pump, the scientist first notes that it will draw water only to a maximum height of 33 feet at sea level and to a lesser maximum height at higher elevations, selects for attention the differing atmospheric pressure at these elevations, sets up experiments in which the air is removed from a vessel containing water (when suction no longer works) and in which the weight of air at various levels is calculated, compares the results of reasoning about the height to which a given weight of air will allow a suction pump to raise water with the observed maximum height at different elevations, and finally assimilates the suction pump to such apparently different phenomena as the siphon and the rising of a balloon (1910: 150–153; 1933: 195–198).
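
As a supplementary note, not part of Dewey's text, the roughly 33-foot figure follows from a simple balance: suction can raise water only until the weight of the water column equals atmospheric pressure, i.e.

\[
h_{\max} = \frac{P_{\text{atm}}}{\rho g} \approx \frac{101{,}325\ \text{Pa}}{(1000\ \text{kg/m}^3)(9.81\ \text{m/s}^2)} \approx 10.3\ \text{m} \approx 34\ \text{ft},
\]

so at higher elevations, where atmospheric pressure is lower, the maximum height is proportionally smaller.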

2.3 Further Examples

Diamond: A passenger in a car driving in a diamond lane reserved for vehicles with at least one passenger notices that the diamond marks on the pavement are far apart in some places and close together in others. Why? The driver suggests that the reason may be that the diamond marks are not needed where there is a solid double line separating the diamond lane from the adjoining lane, but are needed when there is a dotted single line permitting crossing into the diamond lane. Further observation confirms that the diamonds are close together when a dotted line separates the diamond lane from its neighbour, but otherwise far apart.

Rash: A woman suddenly develops a very itchy red rash on her throat and upper chest. She recently noticed a mark on the back of her right hand, but was not sure whether the mark was a rash or a scrape. She lies down in bed and thinks about what might be causing the rash and what to do about it. About two weeks before, she began taking blood pressure medication that contained a sulfa drug, and the pharmacist had warned her, in view of a previous allergic reaction to a medication containing a sulfa drug, to be on the alert for an allergic reaction; however, she had been taking the medication for two weeks with no such effect. The day before, she began using a new cream on her neck and upper chest; against the new cream as the cause was the mark on the back of her hand, which had not been exposed to the cream. She began taking probiotics about a month before. She also recently started new eye drops, but she supposed that manufacturers of eye drops would be careful not to include allergy-causing components in the medication. The rash might be a heat rash, since she recently was sweating profusely from her upper body. Since she is about to go away on a short vacation, where she would not have access to her usual physician, she decides to keep taking the probiotics and using the new eye drops but to discontinue the blood pressure medication and to switch back to the old cream for her neck and upper chest. She forms a plan to consult her regular physician on her return about the blood pressure medication.

Candidate: Although Dewey included no examples of thinking directed at appraising the arguments of others, such thinking has come to be considered a kind of critical thinking. We find an example of such thinking in the performance task on the Collegiate Learning Assessment (CLA+), which its sponsoring organization describes as

a performance-based assessment that provides a measure of an institution’s contribution to the development of critical-thinking and written communication skills of its students. (Council for Aid to Education 2017)

A sample task posted on its website requires the test-taker to write a report for public distribution evaluating a fictional candidate’s policy proposals and their supporting arguments, using supplied background documents, with a recommendation on whether to endorse the candidate.

2.4 Non-Examples

Immediate acceptance of an idea that suggests itself as a solution to a problem (e.g., a possible explanation of an event or phenomenon, an action that seems likely to produce a desired result) is “uncritical thinking, the minimum of reflection” (Dewey 1910: 13). On-going suspension of judgment in the light of doubt about a possible solution is not critical thinking (Dewey 1910: 108). Critique driven by a dogmatically held political or religious ideology is not critical thinking; thus Paulo Freire (1968 [1970]) is using the term (e.g., at 1970: 71, 81, 100, 146) in a more politically freighted sense that includes not only reflection but also revolutionary action against oppression. Derivation of a conclusion from given data using an algorithm is not critical thinking.

3. The Definition of Critical Thinking

What is critical thinking? There are many definitions. Ennis (2016) lists 14 philosophically oriented scholarly definitions and three dictionary definitions. Following Rawls (1971), who distinguished his conception of justice from a utilitarian conception but regarded them as rival conceptions of the same concept, Ennis maintains that the 17 definitions are different conceptions of the same concept. Rawls articulated the shared concept of justice as

a characteristic set of principles for assigning basic rights and duties and for determining… the proper distribution of the benefits and burdens of social cooperation. (Rawls 1971: 5)

Bailin et al. (1999b) claim that, if one considers what sorts of thinking an educator would take not to be critical thinking and what sorts to be critical thinking, one can conclude that educators typically understand critical thinking to have at least three features.

  • It is done for the purpose of making up one’s mind about what to believe or do.
  • The person engaging in the thinking is trying to fulfill standards of adequacy and accuracy appropriate to the thinking.
  • The thinking fulfills the relevant standards to some threshold level.

One could sum up the core concept that involves these three features by saying that critical thinking is careful goal-directed thinking. This core concept seems to apply to all the examples of critical thinking described in the previous section. As for the non-examples, their exclusion depends on construing careful thinking as excluding jumping immediately to conclusions, suspending judgment no matter how strong the evidence, reasoning from an unquestioned ideological or religious perspective, and routinely using an algorithm to answer a question.

If the core of critical thinking is careful goal-directed thinking, conceptions of it can vary according to its presumed scope, its presumed goal, one’s criteria and threshold for being careful, and the thinking component on which one focuses. As to its scope, some conceptions (e.g., Dewey 1910, 1933) restrict it to constructive thinking on the basis of one’s own observations and experiments, others (e.g., Ennis 1962; Fisher & Scriven 1997; Johnson 1992) to appraisal of the products of such thinking. Ennis (1991) and Bailin et al. (1999b) take it to cover both construction and appraisal. As to its goal, some conceptions restrict it to forming a judgment (Dewey 1910, 1933; Lipman 1987; Facione 1990a). Others allow for actions as well as beliefs as the end point of a process of critical thinking (Ennis 1991; Bailin et al. 1999b). As to the criteria and threshold for being careful, definitions vary in the term used to indicate that critical thinking satisfies certain norms: “intellectually disciplined” (Scriven & Paul 1987), “reasonable” (Ennis 1991), “skillful” (Lipman 1987), “skilled” (Fisher & Scriven 1997), “careful” (Bailin & Battersby 2009). Some definitions specify these norms, referring variously to “consideration of any belief or supposed form of knowledge in the light of the grounds that support it and the further conclusions to which it tends” (Dewey 1910, 1933); “the methods of logical inquiry and reasoning” (Glaser 1941); “conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication” (Scriven & Paul 1987); the requirement that “it is sensitive to context, relies on criteria, and is self-correcting” (Lipman 1987); “evidential, conceptual, methodological, criteriological, or contextual considerations” (Facione 1990a); and “plus-minus considerations of the product in terms of appropriate standards (or criteria)” (Johnson 1992). Stanovich and Stanovich (2010) propose to ground the concept of critical thinking in the concept of rationality, which they understand as combining epistemic rationality (fitting one’s beliefs to the world) and instrumental rationality (optimizing goal fulfillment); a critical thinker, in their view, is someone with “a propensity to override suboptimal responses from the autonomous mind” (2010: 227). These variant specifications of norms for critical thinking are not necessarily incompatible with one another, and in any case presuppose the core notion of thinking carefully. As to the thinking component singled out, some definitions focus on suspension of judgment during the thinking (Dewey 1910; McPeck 1981), others on inquiry while judgment is suspended (Bailin & Battersby 2009, 2021), others on the resulting judgment (Facione 1990a), and still others on responsiveness to reasons (Siegel 1988). Kuhn (2019) takes critical thinking to be more a dialogic practice of advancing and responding to arguments than an individual ability.

In educational contexts, a definition of critical thinking is a “programmatic definition” (Scheffler 1960: 19). It expresses a practical program for achieving an educational goal. For this purpose, a one-sentence formulaic definition is much less useful than articulation of a critical thinking process, with criteria and standards for the kinds of thinking that the process may involve. The real educational goal is recognition, adoption and implementation by students of those criteria and standards. That adoption and implementation in turn consists in acquiring the knowledge, abilities and dispositions of a critical thinker.

Conceptions of critical thinking generally do not include moral integrity as part of the concept. Dewey, for example, took critical thinking to be the ultimate intellectual goal of education, but distinguished it from the development of social cooperation among school children, which he took to be the central moral goal. Ennis (1996, 2011) added to his previous list of critical thinking dispositions a group of dispositions to care about the dignity and worth of every person, which he described as a “correlative” (1996) disposition without which critical thinking would be less valuable and perhaps harmful. An educational program that aimed at developing critical thinking but not the correlative disposition to care about the dignity and worth of every person, he asserted, “would be deficient and perhaps dangerous” (Ennis 1996: 172).

4. Its Value

Dewey thought that education for reflective thinking would be of value to both the individual and society; recognition in educational practice of the kinship to the scientific attitude of children’s native curiosity, fertile imagination and love of experimental inquiry “would make for individual happiness and the reduction of social waste” (Dewey 1910: iii). Schools participating in the Eight-Year Study took development of the habit of reflective thinking and skill in solving problems as a means to leading young people to understand, appreciate and live the democratic way of life characteristic of the United States (Aikin 1942: 17–18, 81). Harvey Siegel (1988: 55–61) has offered four considerations in support of adopting critical thinking as an educational ideal. (1) Respect for persons requires that schools and teachers honour students’ demands for reasons and explanations, deal with students honestly, and recognize the need to confront students’ independent judgment; these requirements concern the manner in which teachers treat students. (2) Education has the task of preparing children to be successful adults, a task that requires development of their self-sufficiency. (3) Education should initiate children into the rational traditions in such fields as history, science and mathematics. (4) Education should prepare children to become democratic citizens, which requires reasoned procedures and critical talents and attitudes. To supplement these considerations, Siegel (1988: 62–90) responds to two objections: the ideology objection that adoption of any educational ideal requires a prior ideological commitment and the indoctrination objection that cultivation of critical thinking cannot escape being a form of indoctrination.

5. The Process of Thinking Critically

Despite the diversity of our 11 examples, one can recognize a common pattern. Dewey analyzed it as consisting of five phases:

  • suggestions, in which the mind leaps forward to a possible solution;
  • an intellectualization of the difficulty or perplexity into a problem to be solved, a question for which the answer must be sought;
  • the use of one suggestion after another as a leading idea, or hypothesis, to initiate and guide observation and other operations in collection of factual material;
  • the mental elaboration of the idea or supposition as an idea or supposition (reasoning, in the sense in which reasoning is a part, not the whole, of inference); and
  • testing the hypothesis by overt or imaginative action. (Dewey 1933: 106–107; italics in original)

The process of reflective thinking consisting of these phases would be preceded by a perplexed, troubled or confused situation and followed by a cleared-up, unified, resolved situation (Dewey 1933: 106). The term ‘phases’ replaced the term ‘steps’ (Dewey 1910: 72), thus removing the earlier suggestion of an invariant sequence. Variants of the above analysis appeared in (Dewey 1916: 177) and (Dewey 1938: 101–119).

The variant formulations indicate the difficulty of giving a single logical analysis of such a varied process. The process of critical thinking may have a spiral pattern, with the problem being redefined in the light of obstacles to solving it as originally formulated. For example, the person in Transit might have concluded that getting to the appointment at the scheduled time was impossible and have reformulated the problem as that of rescheduling the appointment for a mutually convenient time. Further, defining a problem does not always follow after or lead immediately to an idea of a suggested solution. Nor should it do so, as Dewey himself recognized in describing the physician in Typhoid as avoiding any strong preference for this or that conclusion before getting further information (Dewey 1910: 85; 1933: 170). People with a hypothesis in mind, even one to which they have a very weak commitment, have a so-called “confirmation bias” (Nickerson 1998): they are likely to pay attention to evidence that confirms the hypothesis and to ignore evidence that counts against it or for some competing hypothesis. Detectives, intelligence agencies, and investigators of airplane accidents are well advised to gather relevant evidence systematically and to postpone even tentative adoption of an explanatory hypothesis until the collected evidence rules out with the appropriate degree of certainty all but one explanation. Dewey’s analysis of the critical thinking process can be faulted as well for requiring acceptance or rejection of a possible solution to a defined problem, with no allowance for deciding in the light of the available evidence to suspend judgment. Further, given the great variety of kinds of problems for which reflection is appropriate, there is likely to be variation in its component events. Perhaps the best way to conceptualize the critical thinking process is as a checklist whose component events can occur in a variety of orders, selectively, and more than once. These component events might include (1) noticing a difficulty, (2) defining the problem, (3) dividing the problem into manageable sub-problems, (4) formulating a variety of possible solutions to the problem or sub-problem, (5) determining what evidence is relevant to deciding among possible solutions to the problem or sub-problem, (6) devising a plan of systematic observation or experiment that will uncover the relevant evidence, (7) carrying out the plan of systematic observation or experimentation, (8) noting the results of the systematic observation or experiment, (9) gathering relevant testimony and information from others, (10) judging the credibility of testimony and information gathered from others, (11) drawing conclusions from gathered evidence and accepted testimony, and (12) accepting a solution that the evidence adequately supports (cf. Hitchcock 2017: 485).

Checklist conceptions of the process of critical thinking are open to the objection that they are too mechanical and procedural to fit the multi-dimensional and emotionally charged issues for which critical thinking is urgently needed (Paul 1984). For such issues, a more dialectical process is advocated, in which competing relevant world views are identified, their implications explored, and some sort of creative synthesis attempted.

6. Components of the Process

If one considers the critical thinking process illustrated by the 11 examples, one can identify distinct kinds of mental acts and mental states that form part of it. To distinguish, label and briefly characterize these components is a useful preliminary to identifying abilities, skills, dispositions, attitudes, habits and the like that contribute causally to thinking critically. Identifying such abilities and habits is in turn a useful preliminary to setting educational goals. Setting the goals is in its turn a useful preliminary to designing strategies for helping learners to achieve the goals and to designing ways of measuring the extent to which learners have done so. Such measures provide both feedback to learners on their achievement and a basis for experimental research on the effectiveness of various strategies for educating people to think critically. Let us begin, then, by distinguishing the kinds of mental acts and mental events that can occur in a critical thinking process.

  • Observing: One notices something in one’s immediate environment (sudden cooling of temperature in Weather, bubbles forming outside a glass and then going inside in Bubbles, a moving blur in the distance in Blur, a rash in Rash). Or one notes the results of an experiment or systematic observation (valuables missing in Disorder, no suction without air pressure in Suction pump).
  • Feeling: One feels puzzled or uncertain about something (how to get to an appointment on time in Transit, why the diamonds vary in spacing in Diamond). One wants to resolve this perplexity. One feels satisfaction once one has worked out an answer (to take the subway express in Transit, diamonds closer when needed as a warning in Diamond).
  • Wondering: One formulates a question to be addressed (why bubbles form outside a tumbler taken from hot water in Bubbles, how suction pumps work in Suction pump, what caused the rash in Rash).
  • Imagining: One thinks of possible answers (bus or subway or elevated in Transit, flagpole or ornament or wireless communication aid or direction indicator in Ferryboat, allergic reaction or heat rash in Rash).
  • Inferring: One works out what would be the case if a possible answer were assumed (valuables missing if there has been a burglary in Disorder, earlier start to the rash if it is an allergic reaction to a sulfa drug in Rash). Or one draws a conclusion once sufficient relevant evidence is gathered (take the subway in Transit, burglary in Disorder, discontinue blood pressure medication and new cream in Rash).
  • Knowledge: One uses stored knowledge of the subject-matter to generate possible answers or to infer what would be expected on the assumption of a particular answer (knowledge of a city’s public transit system in Transit, of the requirements for a flagpole in Ferryboat, of Boyle’s law in Bubbles, of allergic reactions in Rash).
  • Experimenting: One designs and carries out an experiment or a systematic observation to find out whether the results deduced from a possible answer will occur (looking at the location of the flagpole in relation to the pilot’s position in Ferryboat, putting an ice cube on top of a tumbler taken from hot water in Bubbles, measuring the height to which a suction pump will draw water at different elevations in Suction pump, noticing the spacing of diamonds when movement to or from a diamond lane is allowed in Diamond).
  • Consulting: One finds a source of information, gets the information from the source, and makes a judgment on whether to accept it. None of our 11 examples include searching for sources of information. In this respect they are unrepresentative, since most people nowadays have almost instant access to information relevant to answering any question, including many of those illustrated by the examples. However, Candidate includes the activities of extracting information from sources and evaluating its credibility.
  • Identifying and analyzing arguments: One notices an argument and works out its structure and content as a preliminary to evaluating its strength. This activity is central to Candidate. It is an important part of a critical thinking process in which one surveys arguments for various positions on an issue.
  • Judging: One makes a judgment on the basis of accumulated evidence and reasoning, such as the judgment in Ferryboat that the purpose of the pole is to provide direction to the pilot.
  • Deciding: One makes a decision on what to do or on what policy to adopt, as in the decision in Transit to take the subway.

7. Contributory Dispositions and Abilities

By definition, a person who does something voluntarily is both willing and able to do that thing at that time. Both the willingness and the ability contribute causally to the person’s action, in the sense that the voluntary action would not occur if either (or both) of these were lacking. For example, suppose that one is standing with one’s arms at one’s sides and one voluntarily lifts one’s right arm to an extended horizontal position. One would not do so if one were unable to lift one’s arm, if for example one’s right side was paralyzed as the result of a stroke. Nor would one do so if one were unwilling to lift one’s arm, if for example one were participating in a street demonstration at which a white supremacist was urging the crowd to lift their right arm in a Nazi salute and one were unwilling to express support in this way for the racist Nazi ideology. The same analysis applies to a voluntary mental process of thinking critically. It requires both willingness and ability to think critically, including willingness and ability to perform each of the mental acts that compose the process and to coordinate those acts in a sequence that is directed at resolving the initiating perplexity.

Consider willingness first. We can identify causal contributors to willingness to think critically by considering factors that would cause a person who was able to think critically about an issue nevertheless not to do so (Hamby 2014). For each factor, the opposite condition thus contributes causally to willingness to think critically on a particular occasion. For example, people who habitually jump to conclusions without considering alternatives will not think critically about issues that arise, even if they have the required abilities. The contrary condition of willingness to suspend judgment is thus a causal contributor to thinking critically.

Now consider ability. In contrast to the ability to move one’s arm, which can be completely absent because a stroke has left the arm paralyzed, the ability to think critically is a developed ability, whose absence is not a complete absence of ability to think but absence of ability to think well. We can identify the ability to think well directly, in terms of the norms and standards for good thinking. In general, to be able to do well the thinking activities that can be components of a critical thinking process, one needs to know the concepts and principles that characterize their good performance, to recognize in particular cases that the concepts and principles apply, and to apply them. The knowledge, recognition and application may be procedural rather than declarative. It may be domain-specific rather than widely applicable, and in either case may need subject-matter knowledge, sometimes of a deep kind.

Reflections of the sort illustrated by the previous two paragraphs have led scholars to identify the knowledge, abilities and dispositions of a “critical thinker”, i.e., someone who thinks critically whenever it is appropriate to do so. We turn now to these three types of causal contributors to thinking critically. We start with dispositions, since arguably these are the most powerful contributors to being a critical thinker, can be fostered at an early stage of a child’s development, and are susceptible to general improvement (Glaser 1941: 175).

8. Critical Thinking Dispositions

Educational researchers use the term ‘dispositions’ broadly for the habits of mind and attitudes that contribute causally to being a critical thinker. Some writers (e.g., Paul & Elder 2006; Hamby 2014; Bailin & Battersby 2016a) propose to use the term ‘virtues’ for this dimension of a critical thinker. The virtues in question, although they are virtues of character, concern the person’s ways of thinking rather than the person’s ways of behaving towards others. They are not moral virtues but intellectual virtues, of the sort articulated by Zagzebski (1996) and discussed by Turri, Alfano, and Greco (2017).

On a realistic conception, thinking dispositions or intellectual virtues are real properties of thinkers. They are general tendencies, propensities, or inclinations to think in particular ways in particular circumstances, and can be genuinely explanatory (Siegel 1999). Sceptics argue that there is no evidence for a specific mental basis for the habits of mind that contribute to thinking critically, and that it is pedagogically misleading to posit such a basis (Bailin et al. 1999a). Whatever their status, critical thinking dispositions need motivation for their initial formation in a child—motivation that may be external or internal. As children develop, the force of habit will gradually become important in sustaining the disposition (Nieto & Valenzuela 2012). Mere force of habit, however, is unlikely to sustain critical thinking dispositions. Critical thinkers must value and enjoy using their knowledge and abilities to think things through for themselves. They must be committed to, and lovers of, inquiry.

A person may have a critical thinking disposition with respect to only some kinds of issues. For example, one could be open-minded about scientific issues but not about religious issues. Similarly, one could be confident in one’s ability to reason about the theological implications of the existence of evil in the world but not in one’s ability to reason about the best design for a guided ballistic missile.

Facione (1990a: 25) divides “affective dispositions” of critical thinking into approaches to life and living in general and approaches to specific issues, questions or problems. Adapting this distinction, one can usefully divide critical thinking dispositions into initiating dispositions (those that contribute causally to starting to think critically about an issue) and internal dispositions (those that contribute causally to doing a good job of thinking critically once one has started). The two categories are not mutually exclusive. For example, open-mindedness, in the sense of willingness to consider alternative points of view to one’s own, is both an initiating and an internal disposition.

8.1 Initiating Dispositions

Using the strategy of considering factors that would block people with the ability to think critically from doing so, we can identify as initiating dispositions for thinking critically attentiveness, a habit of inquiry, self-confidence, courage, open-mindedness, willingness to suspend judgment, trust in reason, wanting evidence for one’s beliefs, and seeking the truth. We consider briefly what each of these dispositions amounts to, in each case citing sources that acknowledge them.

  • Attentiveness: One will not think critically if one fails to recognize an issue that needs to be thought through. For example, the pedestrian in Weather would not have looked up if he had not noticed that the air was suddenly cooler. To be a critical thinker, then, one needs to be habitually attentive to one’s surroundings, noticing not only what one senses but also sources of perplexity in messages received and in one’s own beliefs and attitudes (Facione 1990a: 25; Facione, Facione, & Giancarlo 2001).
  • Habit of inquiry: Inquiry is effortful, and one needs an internal push to engage in it. For example, the student in Bubbles could easily have stopped at idle wondering about the cause of the bubbles rather than reasoning to a hypothesis, then designing and executing an experiment to test it. Thus willingness to think critically needs mental energy and initiative. What can supply that energy? Love of inquiry, or perhaps just a habit of inquiry. Hamby (2015) has argued that willingness to inquire is the central critical thinking virtue, one that encompasses all the others. It is recognized as a critical thinking disposition by Dewey (1910: 29; 1933: 35), Glaser (1941: 5), Ennis (1987: 12; 1991: 8), Facione (1990a: 25), Bailin et al. (1999b: 294), Halpern (1998: 452), and Facione, Facione, & Giancarlo (2001).
  • Self-confidence: Lack of confidence in one’s abilities can block critical thinking. For example, if the woman in Rash lacked confidence in her ability to figure things out for herself, she might just have assumed that the rash on her chest was the allergic reaction to her medication against which the pharmacist had warned her. Thus willingness to think critically requires confidence in one’s ability to inquire (Facione 1990a: 25; Facione, Facione, & Giancarlo 2001).
  • Courage: Fear of thinking for oneself can stop one from doing it. Thus willingness to think critically requires intellectual courage (Paul & Elder 2006: 16).
  • Open-mindedness: A dogmatic attitude will impede thinking critically. For example, a person who adheres rigidly to a “pro-choice” position on the issue of the legal status of induced abortion is likely to be unwilling to consider seriously the issue of when in its development an unborn child acquires a moral right to life. Thus willingness to think critically requires open-mindedness, in the sense of a willingness to examine questions to which one already accepts an answer but which further evidence or reasoning might cause one to answer differently (Dewey 1933; Facione 1990a; Ennis 1991; Bailin et al. 1999b; Halpern 1998; Facione, Facione, & Giancarlo 2001). Paul (1981) emphasizes open-mindedness about alternative world-views, and recommends a dialectical approach to integrating such views as central to what he calls “strong sense” critical thinking. In three studies, Haran, Ritov, & Mellers (2013) found that actively open-minded thinking, including “the tendency to weigh new evidence against a favored belief, to spend sufficient time on a problem before giving up, and to consider carefully the opinions of others in forming one’s own”, led study participants to acquire information and thus to make accurate estimations.
  • Willingness to suspend judgment: Premature closure on an initial solution will block critical thinking. Thus willingness to think critically requires a willingness to suspend judgment while alternatives are explored (Facione 1990a; Ennis 1991; Halpern 1998).
  • Trust in reason: Since distrust in the processes of reasoned inquiry will dissuade one from engaging in it, trust in them is an initiating critical thinking disposition (Facione 1990a: 25; Bailin et al. 1999b: 294; Facione, Facione, & Giancarlo 2001; Paul & Elder 2006). In reaction to an allegedly exclusive emphasis on reason in critical thinking theory and pedagogy, Thayer-Bacon (2000) argues that intuition, imagination, and emotion have important roles to play in an adequate conception of critical thinking that she calls “constructive thinking”. From her point of view, critical thinking requires trust not only in reason but also in intuition, imagination, and emotion.
  • Seeking the truth: If one does not care about the truth but is content to stick with one’s initial bias on an issue, then one will not think critically about it. Seeking the truth is thus an initiating critical thinking disposition (Bailin et al. 1999b: 294; Facione, Facione, & Giancarlo 2001). A disposition to seek the truth is implicit in more specific critical thinking dispositions, such as trying to be well-informed, considering seriously points of view other than one’s own, looking for alternatives, suspending judgment when the evidence is insufficient, and adopting a position when the evidence supporting it is sufficient.

Some of the initiating dispositions, such as open-mindedness and willingness to suspend judgment, are also internal critical thinking dispositions, in the sense of mental habits or attitudes that contribute causally to doing a good job of critical thinking once one starts the process. But there are many other internal critical thinking dispositions. Some of them are parasitic on one’s conception of good thinking. For example, it is constitutive of good thinking about an issue to formulate the issue clearly and to maintain focus on it. For this purpose, one needs not only the corresponding ability but also the corresponding disposition. Ennis (1991: 8) describes it as the disposition “to determine and maintain focus on the conclusion or question”, Facione (1990a: 25) as “clarity in stating the question or concern”. Other internal dispositions are motivators to continue or adjust the critical thinking process, such as willingness to persist in a complex task and willingness to abandon nonproductive strategies in an attempt to self-correct (Halpern 1998: 452). For a list of identified internal critical thinking dispositions, see the Supplement on Internal Critical Thinking Dispositions.

Some theorists postulate skills, i.e., acquired abilities, as operative in critical thinking. It is not obvious, however, that a good mental act is the exercise of a generic acquired skill. Inferring an expected time of arrival, as in Transit, has some generic components but also uses non-generic subject-matter knowledge. Bailin et al. (1999a) argue against viewing critical thinking skills as generic and discrete, on the ground that skilled performance at a critical thinking task cannot be separated from knowledge of concepts and from domain-specific principles of good thinking. Talk of skills, they concede, is unproblematic if it means merely that a person with critical thinking skills is capable of intelligent performance.

Despite such scepticism, theorists of critical thinking have listed as general contributors to critical thinking what they variously call abilities (Glaser 1941; Ennis 1962, 1991), skills (Facione 1990a; Halpern 1998) or competencies (Fisher & Scriven 1997). Amalgamating these lists would produce a confusing and chaotic cornucopia of more than 50 possible educational objectives, with only partial overlap among them. It makes sense instead to try to understand the reasons for the multiplicity and diversity, and to make a selection according to one’s own reasons for singling out abilities to be developed in a critical thinking curriculum. Two reasons for diversity among lists of critical thinking abilities are the underlying conception of critical thinking and the envisaged educational level. Appraisal-only conceptions, for example, involve a different suite of abilities than constructive-only conceptions. Some lists, such as that of Glaser (1941), are put forward as educational objectives for secondary school students, whereas others are proposed as objectives for college students (e.g., Facione 1990a).

The abilities described in the remaining paragraphs of this section emerge from reflection on the general abilities needed to do well the thinking activities identified in section 6 as components of the critical thinking process described in section 5. The derivation of each collection of abilities is accompanied by citation of sources that list such abilities and of standardized tests that claim to test them.

  • Observational abilities: Careful and accurate observation sometimes requires specialist expertise and practice, as in the case of observing birds and observing accident scenes. However, there are general abilities of noticing what one’s senses are picking up from one’s environment and of being able to articulate clearly and accurately to oneself and others what one has observed. It helps in exercising them to be able to recognize and take into account factors that make one’s observation less trustworthy, such as prior framing of the situation, inadequate time, deficient senses, poor observation conditions, and the like. It helps as well to be skilled at taking steps to make one’s observation more trustworthy, such as moving closer to get a better look, measuring something three times and taking the average, and checking what one thinks one is observing with someone else who is in a good position to observe it. It also helps to be skilled at recognizing respects in which one’s report of one’s observation involves inference rather than direct observation, so that one can then consider whether the inference is justified. These abilities come into play as well when one thinks about whether and with what degree of confidence to accept an observation report, for example in the study of history or in a criminal investigation or in assessing news reports. Observational abilities show up in some lists of critical thinking abilities (Ennis 1962: 90; Facione 1990a: 16; Ennis 1991: 9). There are items testing a person’s ability to judge the credibility of observation reports in the Cornell Critical Thinking Tests, Levels X and Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005). The Test on Appraising Observations (Norris & King 1983, 1985, 1990a, 1990b) assesses the ability to appraise observation reports.

  • Emotional abilities: The emotions that drive a critical thinking process are perplexity or puzzlement, a wish to resolve it, and satisfaction at achieving the desired resolution. Children experience these emotions at an early age, without being trained to do so. Education that takes critical thinking as a goal needs only to channel these emotions and to make sure not to stifle them. Collaborative critical thinking benefits from ability to recognize one’s own and others’ emotional commitments and reactions.

  • Questioning abilities: A critical thinking process requires transforming an inchoate sense of perplexity into a clear question. Formulating a question well requires not building in questionable assumptions, not prejudging the issue, and using language that in context is unambiguous and precise enough (Ennis 1962: 97; 1991: 9).

  • Imaginative abilities: Thinking directed at finding the correct causal explanation of a general phenomenon or particular event requires an ability to imagine possible explanations. Thinking about what policy or plan of action to adopt requires generation of options and consideration of possible consequences of each option. Domain knowledge is required for such creative activity, but a general ability to imagine alternatives is helpful and can be nurtured so as to become easier, quicker, more extensive, and deeper (Dewey 1910: 34–39; 1933: 40–47). Facione (1990a) and Halpern (1998) include the ability to imagine alternatives as a critical thinking ability.

  • Inferential abilities: The ability to draw conclusions from given information, and to recognize with what degree of certainty one’s own or others’ conclusions follow, is universally recognized as a general critical thinking ability. All 11 examples in section 2 of this article include inferences, some from hypotheses or options (as in Transit, Ferryboat and Disorder), others from something observed (as in Weather and Rash). None of these inferences is formally valid. Rather, they are licensed by general, sometimes qualified substantive rules of inference (Toulmin 1958) that rest on domain knowledge—that a bus trip takes about the same time in each direction, that the terminal of a wireless telegraph would be located on the highest possible place, that sudden cooling is often followed by rain, that an allergic reaction to a sulfa drug generally shows up soon after one starts taking it. It is a matter of controversy to what extent the specialized ability to deduce conclusions from premisses using formal rules of inference is needed for critical thinking. Dewey (1933) locates logical forms in setting out the products of reflection rather than in the process of reflection. Ennis (1981a), on the other hand, maintains that a liberally-educated person should have the following abilities: to translate natural-language statements into statements using the standard logical operators, to use appropriately the language of necessary and sufficient conditions, to deal with argument forms and arguments containing symbols, to determine whether in virtue of an argument’s form its conclusion follows necessarily from its premisses, to reason with logically complex propositions, and to apply the rules and procedures of deductive logic (see the first sketch following this list for a minimal illustration). Inferential abilities are recognized as critical thinking abilities by Glaser (1941: 6), Facione (1990a: 9), Ennis (1991: 9), Fisher & Scriven (1997: 99, 111), and Halpern (1998: 452). Items testing inferential abilities constitute two of the five subtests of the Watson-Glaser Critical Thinking Appraisal (Watson & Glaser 1980a, 1980b, 1994), two of the four sections in the Cornell Critical Thinking Test Level X (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), three of the seven sections in the Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005), 11 of the 34 items on Forms A and B of the California Critical Thinking Skills Test (Facione 1990b, 1992), and a high but variable proportion of the 25 selected-response questions in the Collegiate Learning Assessment (Council for Aid to Education 2017).

  • Experimenting abilities: Knowing how to design and execute an experiment is important not just in scientific research but also in everyday life, as in Rash. Dewey devoted a whole chapter of his How We Think (1910: 145–156; 1933: 190–202) to the superiority of experimentation over observation in advancing knowledge. Experimenting abilities come into play at one remove in appraising reports of scientific studies. Skill in designing and executing experiments includes the acknowledged abilities to appraise evidence (Glaser 1941: 6), to carry out experiments and to apply appropriate statistical inference techniques (Facione 1990a: 9), to judge inductions to an explanatory hypothesis (Ennis 1991: 9), and to recognize the need for an adequately large sample size (Halpern 1998), an ability illustrated roughly in the second sketch following this list. The Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005) includes four items (out of 52) on experimental design. The Collegiate Learning Assessment (Council for Aid to Education 2017) makes room for appraisal of study design in both its performance task and its selected-response questions.

  • Consulting abilities: Skill at consulting sources of information comes into play when one seeks information to help resolve a problem, as in Candidate. Ability to find and appraise information includes ability to gather and marshal pertinent information (Glaser 1941: 6), to judge whether a statement made by an alleged authority is acceptable (Ennis 1962: 84), to plan a search for desired information (Facione 1990a: 9), and to judge the credibility of a source (Ennis 1991: 9). Ability to judge the credibility of statements is tested by 24 items (out of 76) in the Cornell Critical Thinking Test Level X (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005) and by four items (out of 52) in the Cornell Critical Thinking Test Level Z (Ennis & Millman 1971; Ennis, Millman, & Tomko 1985, 2005). The Collegiate Learning Assessment’s performance task requires evaluation of whether information in documents is credible or unreliable (Council for Aid to Education 2017).

  • Argument analysis abilities: The ability to identify and analyze arguments contributes to the process of surveying arguments on an issue in order to form one’s own reasoned judgment, as in Candidate. The ability to detect and analyze arguments is recognized as a critical thinking skill by Facione (1990a: 7–8), Ennis (1991: 9) and Halpern (1998). Five items (out of 34) on the California Critical Thinking Skills Test (Facione 1990b, 1992) test skill at argument analysis. The Collegiate Learning Assessment (Council for Aid to Education 2017) incorporates argument analysis in its selected-response tests of critical reading and evaluation and of critiquing an argument.

  • Judging skills and deciding skills: Skill at judging and deciding is skill at recognizing what judgment or decision the available evidence and argument supports, and with what degree of confidence. It is thus a component of the inferential skills already discussed.
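
Two of the abilities just listed can be made concrete with brief sketches. The first is a minimal Python illustration (my own construction, not anything prescribed by Ennis or the other sources cited) of the deductive ability mentioned under inferential abilities: determining whether, in virtue of an argument’s form, its conclusion follows necessarily from its premisses. It checks propositional argument forms by brute-force truth table; the bus-trip inference in Transit would fail such a check, since it is licensed not by form but by the substantive warrant that a bus trip takes about the same time in each direction. The function names are illustrative only.

    from itertools import product

    def formally_valid(premises, conclusion, variables):
        """Return True if every truth-value assignment that makes all premises true also makes the conclusion true."""
        for values in product([True, False], repeat=len(variables)):
            row = dict(zip(variables, values))
            if all(premise(row) for premise in premises) and not conclusion(row):
                return False  # counterexample found: premises true, conclusion false
        return True

    def implies(row):
        """The conditional p -> q under the given truth-value assignment."""
        return (not row["p"]) or row["q"]

    # Modus ponens (p -> q, p, therefore q) is formally valid.
    print(formally_valid([implies, lambda r: r["p"]], lambda r: r["q"], ["p", "q"]))  # True

    # Affirming the consequent (p -> q, q, therefore p) is not formally valid.
    print(formally_valid([implies, lambda r: r["q"]], lambda r: r["p"], ["p", "q"]))  # False

The second sketch bears on the experimenting ability of recognizing the need for an adequately large sample size. It is a rough illustration using the standard normal-approximation formula for a two-group comparison of means and hypothetical numbers, not a procedure drawn from any source cited above: it estimates how many participants per group are needed to detect a standardized effect of a given size at conventional error rates.

    from math import ceil
    from statistics import NormalDist

    def sample_size_per_group(effect_size, alpha=0.05, power=0.80):
        """Approximate n per group, using n = 2 * ((z_crit + z_power) / d) ** 2."""
        z_crit = NormalDist().inv_cdf(1 - alpha / 2)   # critical value for a two-sided test
        z_power = NormalDist().inv_cdf(power)          # quantile corresponding to the desired power
        return ceil(2 * ((z_crit + z_power) / effect_size) ** 2)

    # A large standardized effect (d = 0.8) needs far fewer participants than a small one (d = 0.2).
    for d in (0.8, 0.5, 0.2):
        print(f"effect size d = {d}: about {sample_size_per_group(d)} participants per group")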

Lists and tests of critical thinking abilities often include two more abilities: identifying assumptions and constructing and evaluating definitions.

In addition to dispositions and abilities, critical thinking needs knowledge: of critical thinking concepts, of critical thinking principles, and of the subject-matter of the thinking.

We can derive a short list of concepts whose understanding contributes to critical thinking from the critical thinking abilities described in the preceding section. Observational abilities require an understanding of the difference between observation and inference. Questioning abilities require an understanding of the concepts of ambiguity and vagueness. Inferential abilities require an understanding of the difference between conclusive and defeasible inference (traditionally, between deduction and induction), as well as of the difference between necessary and sufficient conditions. Experimenting abilities require an understanding of the concepts of hypothesis, null hypothesis, assumption and prediction, as well as of the concept of statistical significance and of its difference from importance. They also require an understanding of the difference between an experiment and an observational study, and in particular of the difference between a randomized controlled trial, a prospective correlational study and a retrospective (case-control) study. Argument analysis abilities require an understanding of the concepts of argument, premiss, assumption, conclusion and counter-consideration. Additional critical thinking concepts are proposed by Bailin et al. (1999b: 293), Fisher & Scriven (1997: 105–106), Black (2012), and Blair (2021).
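
The difference between statistical significance and importance, mentioned above, can likewise be shown with a small calculation. In the Python sketch below (hypothetical numbers of my own, not data from any study cited here), a half-point difference between group means on a test with a standard deviation of 15 points is of no practical importance, yet with ten thousand students per group it is statistically significant at the conventional 0.05 level.

    from math import sqrt
    from statistics import NormalDist

    def two_sided_p_value(mean_diff, sd, n_per_group):
        """Two-sided p-value for a difference between two group means (normal approximation)."""
        standard_error = sd * sqrt(2 / n_per_group)
        z = mean_diff / standard_error
        return 2 * (1 - NormalDist().cdf(abs(z)))

    p = two_sided_p_value(mean_diff=0.5, sd=15.0, n_per_group=10_000)
    print(f"p-value: {p:.3f}")                 # about 0.018, below the 0.05 threshold
    print(f"effect size d: {0.5 / 15.0:.3f}")  # about 0.033, far too small to matter in practice

A statistically significant result, in other words, indicates only that the observed difference is unlikely to have arisen by chance, not that the difference matters.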

According to Glaser (1941: 25), ability to think critically requires knowledge of the methods of logical inquiry and reasoning. If we review the list of abilities in the preceding section, however, we can see that some of them can be acquired and exercised merely through practice, possibly guided in an educational setting, followed by feedback. Searching intelligently for a causal explanation of some phenomenon or event requires that one consider a full range of possible causal contributors, but it seems more important that one implements this principle in one’s practice than that one is able to articulate it. What is important is “operational knowledge” of the standards and principles of good thinking (Bailin et al. 1999b: 291–293). But the development of such critical thinking abilities as designing an experiment or constructing an operational definition can benefit from learning their underlying theory. Further, explicit knowledge of quirks of human thinking seems useful as a cautionary guide. Human memory is not just fallible about details, as people learn from their own experiences of misremembering, but is so malleable that a detailed, clear and vivid recollection of an event can be a total fabrication (Loftus 2017). People seek or interpret evidence in ways that are partial to their existing beliefs and expectations, often unconscious of their “confirmation bias” (Nickerson 1998). Not only are people subject to this and other cognitive biases (Kahneman 2011), of which they are typically unaware, but it may be counter-productive for one to make oneself aware of them and try consciously to counteract them or to counteract social biases such as racial or sexual stereotypes (Kenyon & Beaulac 2014). It is helpful to be aware of these facts and of the superior effectiveness of blocking the operation of biases—for example, by making an immediate record of one’s observations, refraining from forming a preliminary explanatory hypothesis, blind refereeing, double-blind randomized trials, and blind grading of students’ work. It is also helpful to be aware of the prevalence of “noise” (unwanted unsystematic variability of judgments), of how to detect noise (through a noise audit), and of how to reduce noise: make accuracy the goal, think statistically, break a process of arriving at a judgment into independent tasks, resist premature intuitions, in a group get independent judgments first, favour comparative judgments and scales (Kahneman, Sibony, & Sunstein 2021). It is helpful as well to be aware of the concept of “bounded rationality” in decision-making and of the related distinction between “satisficing” and optimizing (Simon 1956; Gigerenzer 2001).

Critical thinking about an issue requires substantive knowledge of the domain to which the issue belongs. Critical thinking abilities are not a magic elixir that can be applied to any issue whatever by somebody who has no knowledge of the facts relevant to exploring that issue. For example, the student in Bubbles needed to know that gases do not penetrate solid objects like a glass, that air expands when heated, that the volume of an enclosed gas varies directly with its temperature and inversely with its pressure, and that hot objects will spontaneously cool down to the ambient temperature of their surroundings unless kept hot by insulation or a source of heat. Critical thinkers thus need a rich fund of subject-matter knowledge relevant to the variety of situations they encounter. This fact is recognized in the inclusion among critical thinking dispositions of a concern to become and remain generally well informed.

Experimental educational interventions, with control groups, have shown that education can improve critical thinking skills and dispositions, as measured by standardized tests. For information about these tests, see the Supplement on Assessment.

What educational methods are most effective at developing the dispositions, abilities and knowledge of a critical thinker? In a comprehensive meta-analysis of experimental and quasi-experimental studies of strategies for teaching students to think critically, Abrami et al. (2015) found that dialogue, anchored instruction, and mentoring each increased the effectiveness of the educational intervention, and that they were most effective when combined. They also found that in these studies a combination of separate instruction in critical thinking with subject-matter instruction in which students are encouraged to think critically was more effective than either by itself. However, the difference was not statistically significant; that is, it might have arisen by chance.

Most of these studies lack the longitudinal follow-up required to determine whether the observed differential improvements in critical thinking abilities or dispositions continue over time, for example until high school or college graduation. For details on studies of methods of developing critical thinking skills and dispositions, see the Supplement on Educational Methods.

12. Controversies

Scholars have denied the generalizability of critical thinking abilities across subject domains, have alleged bias in critical thinking theory and pedagogy, and have investigated the relationship of critical thinking to other kinds of thinking.

McPeck (1981) attacked the thinking skills movement of the 1970s, including the critical thinking movement. He argued that there are no general thinking skills, since thinking is always thinking about some subject-matter. It is futile, he claimed, for schools and colleges to teach thinking as if it were a separate subject. Rather, teachers should lead their pupils to become autonomous thinkers by teaching school subjects in a way that brings out their cognitive structure and that encourages and rewards discussion and argument. As some of his critics (e.g., Paul 1985; Siegel 1985) pointed out, McPeck’s central argument needs elaboration, since it has obvious counter-examples in writing and speaking, for which (up to a certain level of complexity) there are teachable general abilities even though they are always about some subject-matter. To make his argument convincing, McPeck needs to explain how thinking differs from writing and speaking in a way that does not permit useful abstraction of its components from the subject-matters with which it deals. He has not done so. Nevertheless, his position that the dispositions and abilities of a critical thinker are best developed in the context of subject-matter instruction is shared by many theorists of critical thinking, including Dewey (1910, 1933), Glaser (1941), Passmore (1980), Weinstein (1990), Bailin et al. (1999b), and Willingham (2019).

McPeck’s challenge prompted reflection on the extent to which critical thinking is subject-specific. McPeck argued for a strong subject-specificity thesis, according to which it is a conceptual truth that all critical thinking abilities are specific to a subject. (He did not however extend his subject-specificity thesis to critical thinking dispositions. In particular, he took the disposition to suspend judgment in situations of cognitive dissonance to be a general disposition.) Conceptual subject-specificity is subject to obvious counter-examples, such as the general ability to recognize confusion of necessary and sufficient conditions. A more modest thesis, also endorsed by McPeck, is epistemological subject-specificity, according to which the norms of good thinking vary from one field to another. Epistemological subject-specificity clearly holds to a certain extent; for example, the principles in accordance with which one solves a differential equation are quite different from the principles in accordance with which one determines whether a painting is a genuine Picasso. But the thesis suffers, as Ennis (1989) points out, from vagueness of the concept of a field or subject and from the obvious existence of inter-field principles, however broadly the concept of a field is construed. For example, the principles of hypothetico-deductive reasoning hold for all the varied fields in which such reasoning occurs. A third kind of subject-specificity is empirical subject-specificity, according to which as a matter of empirically observable fact a person with the abilities and dispositions of a critical thinker in one area of investigation will not necessarily have them in another area of investigation.

The thesis of empirical subject-specificity raises the general problem of transfer. If critical thinking abilities and dispositions have to be developed independently in each school subject, how are they of any use in dealing with the problems of everyday life and the political and social issues of contemporary society, most of which do not fit into the framework of a traditional school subject? Proponents of empirical subject-specificity tend to argue that transfer is more likely to occur if there is critical thinking instruction in a variety of domains, with explicit attention to dispositions and abilities that cut across domains. But evidence for this claim is scanty. There is a need for well-designed empirical studies that investigate the conditions that make transfer more likely.

It is common ground in debates about the generality or subject-specificity of critical thinking dispositions and abilities that critical thinking about any topic requires background knowledge about the topic. For example, the most sophisticated understanding of the principles of hypothetico-deductive reasoning is of no help unless accompanied by some knowledge of what might be plausible explanations of some phenomenon under investigation.

Critics have objected to bias in the theory, pedagogy and practice of critical thinking. Commentators (e.g., Alston 1995; Ennis 1998) have noted that anyone who takes a position has a bias in the neutral sense of being inclined in one direction rather than others. The critics, however, are objecting to bias in the pejorative sense of an unjustified favoring of certain ways of knowing over others, frequently alleging that the unjustly favoured ways are those of a dominant sex or culture (Bailin 1995). These ways favour:

  • reinforcement of egocentric and sociocentric biases over dialectical engagement with opposing world-views (Paul 1981, 1984; Warren 1988)
  • distancing from the object of inquiry over closeness to it (Martin 1992; Thayer-Bacon 1992)
  • indifference to the situation of others over care for them (Martin 1992)
  • orientation to thought over orientation to action (Martin 1992)
  • being reasonable over caring to understand people’s ideas (Thayer-Bacon 1993)
  • being neutral and objective over being embodied and situated (Thayer-Bacon 1995a)
  • doubting over believing (Thayer-Bacon 1995b)
  • reason over emotion, imagination and intuition (Thayer-Bacon 2000)
  • solitary thinking over collaborative thinking (Thayer-Bacon 2000)
  • written and spoken assignments over other forms of expression (Alston 2001)
  • attention to written and spoken communications over attention to human problems (Alston 2001)
  • winning debates in the public sphere over making and understanding meaning (Alston 2001)

A common thread in this smorgasbord of accusations is dissatisfaction with focusing on the logical analysis and evaluation of reasoning and arguments. While these authors acknowledge that such analysis and evaluation is part of critical thinking and should be part of its conceptualization and pedagogy, they insist that it is only a part. Paul (1981), for example, bemoans the tendency of atomistic teaching of methods of analyzing and evaluating arguments to turn students into more able sophists, adept at finding fault with positions and arguments with which they disagree but even more entrenched in the egocentric and sociocentric biases with which they began. Martin (1992) and Thayer-Bacon (1992) cite with approval the self-reported intimacy with their subject-matter of leading researchers in biology and medicine, an intimacy that conflicts with the distancing allegedly recommended in standard conceptions and pedagogy of critical thinking. Thayer-Bacon (2000) contrasts the embodied and socially embedded learning of her elementary school students in a Montessori school, who used their imagination, intuition and emotions as well as their reason, with conceptions of critical thinking as

thinking that is used to critique arguments, offer justifications, and make judgments about what are the good reasons, or the right answers. (Thayer-Bacon 2000: 127–128)

Alston (2001) reports that her students in a women’s studies class were able to see the flaws in the Cinderella myth that pervades much romantic fiction but in their own romantic relationships still acted as if all failures were the woman’s fault and still accepted the notions of love at first sight and living happily ever after. Students, she writes, should

be able to connect their intellectual critique to a more affective, somatic, and ethical account of making risky choices that have sexist, racist, classist, familial, sexual, or other consequences for themselves and those both near and far… critical thinking that reads arguments, texts, or practices merely on the surface without connections to feeling/desiring/doing or action lacks an ethical depth that should infuse the difference between mere cognitive activity and something we want to call critical thinking. (Alston 2001: 34)

Some critics portray such biases as unfair to women. Thayer-Bacon (1992), for example, has charged modern critical thinking theory with being sexist, on the ground that it separates the self from the object and causes one to lose touch with one’s inner voice, and thus stigmatizes women, who (she asserts) link self to object and listen to their inner voice. Her charge does not imply that women as a group are on average less able than men to analyze and evaluate arguments. Facione (1990c) found no difference by sex in performance on his California Critical Thinking Skills Test. Kuhn (1991: 280–281) found no difference by sex in either the disposition or the competence to engage in argumentative thinking.

The critics propose a variety of remedies for the biases that they allege. In general, they do not propose to eliminate or downplay critical thinking as an educational goal. Rather, they propose to conceptualize critical thinking differently and to change its pedagogy accordingly. Their pedagogical proposals arise logically from their objections. They can be summarized as follows:

  • Focus on argument networks with dialectical exchanges reflecting contesting points of view rather than on atomic arguments, so as to develop “strong sense” critical thinking that transcends egocentric and sociocentric biases (Paul 1981, 1984).
  • Foster closeness to the subject-matter and feeling connected to others in order to inform a humane democracy (Martin 1992).
  • Develop “constructive thinking” as a social activity in a community of physically embodied and socially embedded inquirers with personal voices who value not only reason but also imagination, intuition and emotion (Thayer-Bacon 2000).
  • In developing critical thinking in school subjects, treat as important neither skills nor dispositions but opening worlds of meaning (Alston 2001).
  • Attend to the development of critical thinking dispositions as well as skills, and adopt the “critical pedagogy” practised and advocated by Freire (1968 [1970]) and hooks (1994) (Dalgleish, Girard, & Davies 2017).

A common thread in these proposals is treatment of critical thinking as a social, interactive, personally engaged activity like that of a quilting bee or a barn-raising (Thayer-Bacon 2000) rather than as an individual, solitary, distanced activity symbolized by Rodin’s The Thinker. One can get a vivid description of education with the former type of goal from the writings of bell hooks (1994, 2010). Critical thinking for her is open-minded dialectical exchange across opposing standpoints and from multiple perspectives, a conception similar to Paul’s “strong sense” critical thinking (Paul 1981). She abandons the structure of domination in the traditional classroom. In an introductory course on black women writers, for example, she assigns students to write an autobiographical paragraph about an early racial memory, then to read it aloud as the others listen, thus affirming the uniqueness and value of each voice and creating a communal awareness of the diversity of the group’s experiences (hooks 1994: 84). Her “engaged pedagogy” is thus similar to the “freedom under guidance” implemented in John Dewey’s Laboratory School of Chicago in the late 1890s and early 1900s. It incorporates the dialogue, anchored instruction, and mentoring that Abrami et al. (2015) found to be most effective in improving critical thinking skills and dispositions.

What is the relationship of critical thinking to problem solving, decision-making, higher-order thinking, creative thinking, and other recognized types of thinking? One’s answer to this question obviously depends on how one defines the terms used in the question. If critical thinking is conceived broadly to cover any careful thinking about any topic for any purpose, then problem solving and decision making will be kinds of critical thinking, if they are done carefully. Historically, ‘critical thinking’ and ‘problem solving’ were two names for the same thing. If critical thinking is conceived more narrowly as consisting solely of appraisal of intellectual products, then it will be disjoint from problem solving and decision making, which are constructive.

Bloom’s taxonomy of educational objectives used the phrase “intellectual abilities and skills” for what had been labeled “critical thinking” by some, “reflective thinking” by Dewey and others, and “problem solving” by still others (Bloom et al. 1956: 38). Thus, the so-called “higher-order thinking skills” at the taxonomy’s top levels of analysis, synthesis and evaluation are just critical thinking skills, although they do not come with general criteria for their assessment (Ennis 1981b). The revised version of Bloom’s taxonomy (Anderson et al. 2001) likewise treats critical thinking as cutting across those types of cognitive process that involve more than remembering (Anderson et al. 2001: 269–270). For details, see the Supplement on History .

As to creative thinking, it overlaps with critical thinking (Bailin 1987, 1988). Thinking about the explanation of some phenomenon or event, as in Ferryboat, requires creative imagination in constructing plausible explanatory hypotheses. Likewise, thinking about a policy question, as in Candidate, requires creativity in coming up with options. Conversely, creativity in any field needs to be balanced by critical appraisal of the draft painting or novel or mathematical theory.

  • Abrami, Philip C., Robert M. Bernard, Eugene Borokhovski, David I. Waddington, C. Anne Wade, and Tonje Person, 2015, “Strategies for Teaching Students to Think Critically: A Meta-analysis”, Review of Educational Research , 85(2): 275–314. doi:10.3102/0034654314551063
  • Aikin, Wilford M., 1942, The Story of the Eight-year Study, with Conclusions and Recommendations , Volume I of Adventure in American Education , New York and London: Harper & Brothers. [ Aikin 1942 available online ]
  • Alston, Kal, 1995, “Begging the Question: Is Critical Thinking Biased?”, Educational Theory , 45(2): 225–233. doi:10.1111/j.1741-5446.1995.00225.x
  • –––, 2001, “Re/Thinking Critical Thinking: The Seductions of Everyday Life”, Studies in Philosophy and Education , 20(1): 27–40. doi:10.1023/A:1005247128053
  • American Educational Research Association, 2014, Standards for Educational and Psychological Testing / American Educational Research Association, American Psychological Association, National Council on Measurement in Education , Washington, DC: American Educational Research Association.
  • Anderson, Lorin W., David R. Krathwohl, Peter W. Airiasian, Kathleen A. Cruikshank, Richard E. Mayer, Paul R. Pintrich, James Raths, and Merlin C. Wittrock, 2001, A Taxonomy for Learning, Teaching and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives , New York: Longman, complete edition.
  • Bailin, Sharon, 1987, “Critical and Creative Thinking”, Informal Logic , 9(1): 23–30. [ Bailin 1987 available online ]
  • –––, 1988, Achieving Extraordinary Ends: An Essay on Creativity , Dordrecht: Kluwer. doi:10.1007/978-94-009-2780-3
  • –––, 1995, “Is Critical Thinking Biased? Clarifications and Implications”, Educational Theory , 45(2): 191–197. doi:10.1111/j.1741-5446.1995.00191.x
  • Bailin, Sharon and Mark Battersby, 2009, “Inquiry: A Dialectical Approach to Teaching Critical Thinking”, in Juho Ritola (ed.), Argument Cultures: Proceedings of OSSA 09 , CD-ROM (pp. 1–10), Windsor, ON: OSSA. [ Bailin & Battersby 2009 available online ]
  • –––, 2016a, “Fostering the Virtues of Inquiry”, Topoi , 35(2): 367–374. doi:10.1007/s11245-015-9307-6
  • –––, 2016b, Reason in the Balance: An Inquiry Approach to Critical Thinking , Indianapolis: Hackett, 2nd edition.
  • –––, 2021, “Inquiry: Teaching for Reasoned Judgment”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 31–46. doi: 10.1163/9789004444591_003
  • Bailin, Sharon, Roland Case, Jerrold R. Coombs, and Leroi B. Daniels, 1999a, “Common Misconceptions of Critical Thinking”, Journal of Curriculum Studies , 31(3): 269–283. doi:10.1080/002202799183124
  • –––, 1999b, “Conceptualizing Critical Thinking”, Journal of Curriculum Studies , 31(3): 285–302. doi:10.1080/002202799183133
  • Blair, J. Anthony, 2021, Studies in Critical Thinking , Windsor, ON: Windsor Studies in Argumentation, 2nd edition. [Available online at https://windsor.scholarsportal.info/omp/index.php/wsia/catalog/book/106]
  • Berman, Alan M., Seth J. Schwartz, William M. Kurtines, and Steven L. Berman, 2001, “The Process of Exploration in Identity Formation: The Role of Style and Competence”, Journal of Adolescence , 24(4): 513–528. doi:10.1006/jado.2001.0386
  • Black, Beth (ed.), 2012, An A to Z of Critical Thinking , London: Continuum International Publishing Group.
  • Bloom, Benjamin Samuel, Max D. Engelhart, Edward J. Furst, Walter H. Hill, and David R. Krathwohl, 1956, Taxonomy of Educational Objectives. Handbook I: Cognitive Domain , New York: David McKay.
  • Boardman, Frank, Nancy M. Cavender, and Howard Kahane, 2018, Logic and Contemporary Rhetoric: The Use of Reason in Everyday Life , Boston: Cengage, 13th edition.
  • Browne, M. Neil and Stuart M. Keeley, 2018, Asking the Right Questions: A Guide to Critical Thinking , Hoboken, NJ: Pearson, 12th edition.
  • Center for Assessment & Improvement of Learning, 2017, Critical Thinking Assessment Test , Cookeville, TN: Tennessee Technological University.
  • Cleghorn, Paul, 2021, “Critical Thinking in the Elementary School: Practical Guidance for Building a Culture of Thinking”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 150–167. doi: 10.1163/9789004444591_010
  • Cohen, Jacob, 1988, Statistical Power Analysis for the Behavioral Sciences , Hillsdale, NJ: Lawrence Erlbaum Associates, 2nd edition.
  • College Board, 1983, Academic Preparation for College. What Students Need to Know and Be Able to Do , New York: College Entrance Examination Board, ERIC document ED232517.
  • Commission on the Relation of School and College of the Progressive Education Association, 1943, Thirty Schools Tell Their Story , Volume V of Adventure in American Education , New York and London: Harper & Brothers.
  • Council for Aid to Education, 2017, CLA+ Student Guide . Available at http://cae.org/images/uploads/pdf/CLA_Student_Guide_Institution.pdf ; last accessed 2022 07 16.
  • Dalgleish, Adam, Patrick Girard, and Maree Davies, 2017, “Critical Thinking, Bias and Feminist Philosophy: Building a Better Framework through Collaboration”, Informal Logic , 37(4): 351–369. [ Dalgleish et al. available online ]
  • Dewey, John, 1910, How We Think , Boston: D.C. Heath. [ Dewey 1910 available online ]
  • –––, 1916, Democracy and Education: An Introduction to the Philosophy of Education , New York: Macmillan.
  • –––, 1933, How We Think: A Restatement of the Relation of Reflective Thinking to the Educative Process , Lexington, MA: D.C. Heath.
  • –––, 1936, “The Theory of the Chicago Experiment”, Appendix II of Mayhew & Edwards 1936: 463–477.
  • –––, 1938, Logic: The Theory of Inquiry , New York: Henry Holt and Company.
  • Dominguez, Caroline (coord.), 2018a, A European Collection of the Critical Thinking Skills and Dispositions Needed in Different Professional Fields for the 21st Century , Vila Real, Portugal: UTAD. Available at http://bit.ly/CRITHINKEDUO1 ; last accessed 2022 07 16.
  • ––– (coord.), 2018b, A European Review on Critical Thinking Educational Practices in Higher Education Institutions , Vila Real: UTAD. Available at http://bit.ly/CRITHINKEDUO2 ; last accessed 2022 07 16.
  • ––– (coord.), 2018c, The CRITHINKEDU European Course on Critical Thinking Education for University Teachers: From Conception to Delivery , Vila Real: UTAD. Available at http:/bit.ly/CRITHINKEDU03; last accessed 2022 07 16.
  • Dominguez, Caroline and Rita Payan-Carreira (eds.), 2019, Promoting Critical Thinking in European Higher Education Institutions: Towards an Educational Protocol , Vila Real: UTAD. Available at http:/bit.ly/CRITHINKEDU04; last accessed 2022 07 16.
  • Ennis, Robert H., 1958, “An Appraisal of the Watson-Glaser Critical Thinking Appraisal”, The Journal of Educational Research , 52(4): 155–158. doi:10.1080/00220671.1958.10882558
  • –––, 1962, “A Concept of Critical Thinking: A Proposed Basis for Research on the Teaching and Evaluation of Critical Thinking Ability”, Harvard Educational Review , 32(1): 81–111.
  • –––, 1981a, “A Conception of Deductive Logical Competence”, Teaching Philosophy , 4(3/4): 337–385. doi:10.5840/teachphil198143/429
  • –––, 1981b, “Eight Fallacies in Bloom’s Taxonomy”, in C. J. B. Macmillan (ed.), Philosophy of Education 1980: Proceedings of the Thirty-seventh Annual Meeting of the Philosophy of Education Society , Bloomington, IL: Philosophy of Education Society, pp. 269–273.
  • –––, 1984, “Problems in Testing Informal Logic, Critical Thinking, Reasoning Ability”, Informal Logic , 6(1): 3–9. [ Ennis 1984 available online ]
  • –––, 1987, “A Taxonomy of Critical Thinking Dispositions and Abilities”, in Joan Boykoff Baron and Robert J. Sternberg (eds.), Teaching Thinking Skills: Theory and Practice , New York: W. H. Freeman, pp. 9–26.
  • –––, 1989, “Critical Thinking and Subject Specificity: Clarification and Needed Research”, Educational Researcher , 18(3): 4–10. doi:10.3102/0013189X018003004
  • –––, 1991, “Critical Thinking: A Streamlined Conception”, Teaching Philosophy , 14(1): 5–24. doi:10.5840/teachphil19911412
  • –––, 1996, “Critical Thinking Dispositions: Their Nature and Assessability”, Informal Logic , 18(2–3): 165–182. [ Ennis 1996 available online ]
  • –––, 1998, “Is Critical Thinking Culturally Biased?”, Teaching Philosophy , 21(1): 15–33. doi:10.5840/teachphil19982113
  • –––, 2011, “Critical Thinking: Reflection and Perspective Part I”, Inquiry: Critical Thinking across the Disciplines , 26(1): 4–18. doi:10.5840/inquiryctnews20112613
  • –––, 2013, “Critical Thinking across the Curriculum: The Wisdom CTAC Program”, Inquiry: Critical Thinking across the Disciplines , 28(2): 25–45. doi:10.5840/inquiryct20132828
  • –––, 2016, “Definition: A Three-Dimensional Analysis with Bearing on Key Concepts”, in Patrick Bondy and Laura Benacquista (eds.), Argumentation, Objectivity, and Bias: Proceedings of the 11th International Conference of the Ontario Society for the Study of Argumentation (OSSA), 18–21 May 2016 , Windsor, ON: OSSA, pp. 1–19. Available at http://scholar.uwindsor.ca/ossaarchive/OSSA11/papersandcommentaries/105 ; last accessed 2022 07 16.
  • –––, 2018, “Critical Thinking Across the Curriculum: A Vision”, Topoi , 37(1): 165–184. doi:10.1007/s11245-016-9401-4
  • Ennis, Robert H., and Jason Millman, 1971, Manual for Cornell Critical Thinking Test, Level X, and Cornell Critical Thinking Test, Level Z , Urbana, IL: Critical Thinking Project, University of Illinois.
  • Ennis, Robert H., Jason Millman, and Thomas Norbert Tomko, 1985, Cornell Critical Thinking Tests Level X & Level Z: Manual , Pacific Grove, CA: Midwest Publication, 3rd edition.
  • –––, 2005, Cornell Critical Thinking Tests Level X & Level Z: Manual , Seaside, CA: Critical Thinking Company, 5th edition.
  • Ennis, Robert H. and Eric Weir, 1985, The Ennis-Weir Critical Thinking Essay Test: Test, Manual, Criteria, Scoring Sheet: An Instrument for Teaching and Testing , Pacific Grove, CA: Midwest Publications.
  • Facione, Peter A., 1990a, Critical Thinking: A Statement of Expert Consensus for Purposes of Educational Assessment and Instruction , Research Findings and Recommendations Prepared for the Committee on Pre-College Philosophy of the American Philosophical Association, ERIC Document ED315423.
  • –––, 1990b, California Critical Thinking Skills Test, CCTST – Form A , Millbrae, CA: The California Academic Press.
  • –––, 1990c, The California Critical Thinking Skills Test--College Level. Technical Report #3. Gender, Ethnicity, Major, CT Self-Esteem, and the CCTST , ERIC Document ED326584.
  • –––, 1992, California Critical Thinking Skills Test: CCTST – Form B, Millbrae, CA: The California Academic Press.
  • –––, 2000, “The Disposition Toward Critical Thinking: Its Character, Measurement, and Relationship to Critical Thinking Skill”, Informal Logic , 20(1): 61–84. [ Facione 2000 available online ]
  • Facione, Peter A. and Noreen C. Facione, 1992, CCTDI: A Disposition Inventory , Millbrae, CA: The California Academic Press.
  • Facione, Peter A., Noreen C. Facione, and Carol Ann F. Giancarlo, 2001, California Critical Thinking Disposition Inventory: CCTDI: Inventory Manual , Millbrae, CA: The California Academic Press.
  • Facione, Peter A., Carol A. Sánchez, and Noreen C. Facione, 1994, Are College Students Disposed to Think? , Millbrae, CA: The California Academic Press. ERIC Document ED368311.
  • Fisher, Alec, and Michael Scriven, 1997, Critical Thinking: Its Definition and Assessment , Norwich: Centre for Research in Critical Thinking, University of East Anglia.
  • Freire, Paulo, 1968 [1970], Pedagogia do Oprimido . Translated as Pedagogy of the Oppressed , Myra Bergman Ramos (trans.), New York: Continuum, 1970.
  • Gigerenzer, Gerd, 2001, “The Adaptive Toolbox”, in Gerd Gigerenzer and Reinhard Selten (eds.), Bounded Rationality: The Adaptive Toolbox , Cambridge, MA: MIT Press, pp. 37–50.
  • Glaser, Edward Maynard, 1941, An Experiment in the Development of Critical Thinking , New York: Bureau of Publications, Teachers College, Columbia University.
  • Groarke, Leo A. and Christopher W. Tindale, 2012, Good Reasoning Matters! A Constructive Approach to Critical Thinking , Don Mills, ON: Oxford University Press, 5th edition.
  • Halpern, Diane F., 1998, “Teaching Critical Thinking for Transfer Across Domains: Disposition, Skills, Structure Training, and Metacognitive Monitoring”, American Psychologist , 53(4): 449–455. doi:10.1037/0003-066X.53.4.449
  • –––, 2016, Manual: Halpern Critical Thinking Assessment , Mödling, Austria: Schuhfried. Available at https://pdfcoffee.com/hcta-test-manual-pdf-free.html; last accessed 2022 07 16.
  • Hamby, Benjamin, 2014, The Virtues of Critical Thinkers , Doctoral dissertation, Philosophy, McMaster University. [ Hamby 2014 available online ]
  • –––, 2015, “Willingness to Inquire: The Cardinal Critical Thinking Virtue”, in Martin Davies and Ronald Barnett (eds.), The Palgrave Handbook of Critical Thinking in Higher Education , New York: Palgrave Macmillan, pp. 77–87.
  • Haran, Uriel, Ilana Ritov, and Barbara A. Mellers, 2013, “The Role of Actively Open-minded Thinking in Information Acquisition, Accuracy, and Calibration”, Judgment and Decision Making , 8(3): 188–201.
  • Hatcher, Donald and Kevin Possin, 2021, “Commentary: Thinking Critically about Critical Thinking Assessment”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 298–322. doi: 10.1163/9789004444591_017
  • Haynes, Ada, Elizabeth Lisic, Kevin Harris, Katie Leming, Kyle Shanks, and Barry Stein, 2015, “Using the Critical Thinking Assessment Test (CAT) as a Model for Designing Within-Course Assessments: Changing How Faculty Assess Student Learning”, Inquiry: Critical Thinking Across the Disciplines , 30(3): 38–48. doi:10.5840/inquiryct201530316
  • Haynes, Ada and Barry Stein, 2021, “Observations from a Long-Term Effort to Assess and Improve Critical Thinking”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 231–254. doi: 10.1163/9789004444591_014
  • Hiner, Amanda L., 2021, “Equipping Students for Success in College and Beyond: Placing Critical Thinking Instruction at the Heart of a General Education Program”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 188–208. doi: 10.1163/9789004444591_012
  • Hitchcock, David, 2017, “Critical Thinking as an Educational Ideal”, in his On Reasoning and Argument: Essays in Informal Logic and on Critical Thinking , Dordrecht: Springer, pp. 477–497. doi:10.1007/978-3-319-53562-3_30
  • –––, 2021, “Seven Philosophical Implications of Critical Thinking: Themes, Variations, Implications”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 9–30. doi: 10.1163/9789004444591_002
  • hooks, bell, 1994, Teaching to Transgress: Education as the Practice of Freedom , New York and London: Routledge.
  • –––, 2010, Teaching Critical Thinking: Practical Wisdom , New York and London: Routledge.
  • Johnson, Ralph H., 1992, “The Problem of Defining Critical Thinking”, in Stephen P. Norris (ed.), The Generalizability of Critical Thinking , New York: Teachers College Press, pp. 38–53.
  • Kahane, Howard, 1971, Logic and Contemporary Rhetoric: The Use of Reason in Everyday Life , Belmont, CA: Wadsworth.
  • Kahneman, Daniel, 2011, Thinking, Fast and Slow , New York: Farrar, Straus and Giroux.
  • Kahneman, Daniel, Olivier Sibony, & Cass R. Sunstein, 2021, Noise: A Flaw in Human Judgment , New York: Little, Brown Spark.
  • Kenyon, Tim, and Guillaume Beaulac, 2014, “Critical Thinking Education and Debasing”, Informal Logic , 34(4): 341–363. [ Kenyon & Beaulac 2014 available online ]
  • Krathwohl, David R., Benjamin S. Bloom, and Bertram B. Masia, 1964, Taxonomy of Educational Objectives, Handbook II: Affective Domain , New York: David McKay.
  • Kuhn, Deanna, 1991, The Skills of Argument , New York: Cambridge University Press. doi:10.1017/CBO9780511571350
  • –––, 2019, “Critical Thinking as Discourse”, Human Development, 62 (3): 146–164. doi:10.1159/000500171
  • Lipman, Matthew, 1987, “Critical Thinking–What Can It Be?”, Analytic Teaching , 8(1): 5–12. [ Lipman 1987 available online ]
  • –––, 2003, Thinking in Education , Cambridge: Cambridge University Press, 2nd edition.
  • Loftus, Elizabeth F., 2017, “Eavesdropping on Memory”, Annual Review of Psychology , 68: 1–18. doi:10.1146/annurev-psych-010416-044138
  • Makaiau, Amber Strong, 2021, “The Good Thinker’s Tool Kit: How to Engage Critical Thinking and Reasoning in Secondary Education”, in Daniel Fasko, Jr. and Frank Fair (eds.), Critical Thinking and Reasoning: Theory, Development, Instruction, and Assessment , Leiden: Brill, pp. 168–187. doi: 10.1163/9789004444591_011
  • Martin, Jane Roland, 1992, “Critical Thinking for a Humane World”, in Stephen P. Norris (ed.), The Generalizability of Critical Thinking , New York: Teachers College Press, pp. 163–180.
  • Mayhew, Katherine Camp, and Anna Camp Edwards, 1936, The Dewey School: The Laboratory School of the University of Chicago, 1896–1903 , New York: Appleton-Century. [ Mayhew & Edwards 1936 available online ]
  • McPeck, John E., 1981, Critical Thinking and Education , New York: St. Martin’s Press.
  • Moore, Brooke Noel and Richard Parker, 2020, Critical Thinking , New York: McGraw-Hill, 13th edition.
  • Nickerson, Raymond S., 1998, “Confirmation Bias: A Ubiquitous Phenomenon in Many Guises”, Review of General Psychology , 2(2): 175–220. doi:10.1037/1089-2680.2.2.175
  • Nieto, Ana Maria, and Jorge Valenzuela, 2012, “A Study of the Internal Structure of Critical Thinking Dispositions”, Inquiry: Critical Thinking across the Disciplines , 27(1): 31–38. doi:10.5840/inquiryct20122713
  • Norris, Stephen P., 1985, “Controlling for Background Beliefs When Developing Multiple-choice Critical Thinking Tests”, Educational Measurement: Issues and Practice , 7(3): 5–11. doi:10.1111/j.1745-3992.1988.tb00437.x
  • Norris, Stephen P. and Robert H. Ennis, 1989, Evaluating Critical Thinking (The Practitioners’ Guide to Teaching Thinking Series), Pacific Grove, CA: Midwest Publications.
  • Norris, Stephen P. and Ruth Elizabeth King, 1983, Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland.
  • –––, 1984, The Design of a Critical Thinking Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland. ERIC Document ED260083.
  • –––, 1985, Test on Appraising Observations: Manual , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland.
  • –––, 1990a, Test on Appraising Observations , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland, 2nd edition.
  • –––, 1990b, Test on Appraising Observations: Manual , St. John’s, NL: Institute for Educational Research and Development, Memorial University of Newfoundland, 2nd edition.
  • OCR [Oxford, Cambridge and RSA Examinations], 2011, AS/A Level GCE: Critical Thinking – H052, H452 , Cambridge: OCR. Past papers available at https://pastpapers.co/ocr/?dir=A-Level/Critical-Thinking-H052-H452; last accessed 2022 07 16.
  • Ontario Ministry of Education, 2013, The Ontario Curriculum Grades 9 to 12: Social Sciences and Humanities . Available at http://www.edu.gov.on.ca/eng/curriculum/secondary/ssciences9to122013.pdf ; last accessed 2022 07 16.
  • Passmore, John Arthur, 1980, The Philosophy of Teaching , London: Duckworth.
  • Paul, Richard W., 1981, “Teaching Critical Thinking in the ‘Strong’ Sense: A Focus on Self-Deception, World Views, and a Dialectical Mode of Analysis”, Informal Logic , 4(2): 2–7. [ Paul 1981 available online ]
  • –––, 1984, “Critical Thinking: Fundamental to Education for a Free Society”, Educational Leadership , 42(1): 4–14.
  • –––, 1985, “McPeck’s Mistakes”, Informal Logic , 7(1): 35–43. [ Paul 1985 available online ]
  • Paul, Richard W. and Linda Elder, 2006, The Miniature Guide to Critical Thinking: Concepts and Tools , Dillon Beach, CA: Foundation for Critical Thinking, 4th edition.
  • Payette, Patricia, and Edna Ross, 2016, “Making a Campus-Wide Commitment to Critical Thinking: Insights and Promising Practices Utilizing the Paul-Elder Approach at the University of Louisville”, Inquiry: Critical Thinking Across the Disciplines , 31(1): 98–110. doi:10.5840/inquiryct20163118
  • Possin, Kevin, 2008, “A Field Guide to Critical-Thinking Assessment”, Teaching Philosophy , 31(3): 201–228. doi:10.5840/teachphil200831324
  • –––, 2013a, “Some Problems with the Halpern Critical Thinking Assessment (HCTA) Test”, Inquiry: Critical Thinking across the Disciplines , 28(3): 4–12. doi:10.5840/inquiryct201328313
  • –––, 2013b, “A Serious Flaw in the Collegiate Learning Assessment (CLA) Test”, Informal Logic , 33(3): 390–405. [ Possin 2013b available online ]
  • –––, 2013c, “A Fatal Flaw in the Collegiate Learning Assessment Test”, Assessment Update , 25 (1): 8–12.
  • –––, 2014, “Critique of the Watson-Glaser Critical Thinking Appraisal Test: The More You Know, the Lower Your Score”, Informal Logic , 34(4): 393–416. [ Possin 2014 available online ]
  • –––, 2020, “CAT Scan: A Critical Review of the Critical-Thinking Assessment Test”, Informal Logic , 40 (3): 489–508. [Available online at https://informallogic.ca/index.php/informal_logic/article/view/6243]
  • Rawls, John, 1971, A Theory of Justice , Cambridge, MA: Harvard University Press.
  • Rear, David, 2019, “One Size Fits All? The Limitations of Standardised Assessment in Critical Thinking”, Assessment & Evaluation in Higher Education , 44(5): 664–675. doi: 10.1080/02602938.2018.1526255
  • Rousseau, Jean-Jacques, 1762, Émile , Amsterdam: Jean Néaulme.
  • Scheffler, Israel, 1960, The Language of Education , Springfield, IL: Charles C. Thomas.
  • Scriven, Michael, and Richard W. Paul, 1987, Defining Critical Thinking , Draft statement written for the National Council for Excellence in Critical Thinking Instruction. Available at http://www.criticalthinking.org/pages/defining-critical-thinking/766 ; last accessed 2022 07 16.
  • Sheffield, Clarence Burton Jr., 2018, “Promoting Critical Thinking in Higher Education: My Experiences as the Inaugural Eugene H. Fram Chair in Applied Critical Thinking at Rochester Institute of Technology”, Topoi , 37(1): 155–163. doi:10.1007/s11245-016-9392-1
  • Siegel, Harvey, 1985, “McPeck, Informal Logic and the Nature of Critical Thinking”, in David Nyberg (ed.), Philosophy of Education 1985: Proceedings of the Forty-First Annual Meeting of the Philosophy of Education Society , Normal, IL: Philosophy of Education Society, pp. 61–72.
  • –––, 1988, Educating Reason: Rationality, Critical Thinking, and Education , New York: Routledge.
  • –––, 1999, “What (Good) Are Thinking Dispositions?”, Educational Theory , 49(2): 207–221. doi:10.1111/j.1741-5446.1999.00207.x
  • Simon, Herbert A., 1956, “Rational Choice and the Structure of the Environment”, Psychological Review , 63(2): 129–138. doi: 10.1037/h0042769
  • Simpson, Elizabeth, 1966–67, “The Classification of Educational Objectives: Psychomotor Domain”, Illinois Teacher of Home Economics , 10(4): 110–144, ERIC document ED0103613. [ Simpson 1966–67 available online ]
  • Skolverket, 2018, Curriculum for the Compulsory School, Preschool Class and School-age Educare , Stockholm: Skolverket, revised 2018. Available at https://www.skolverket.se/download/18.31c292d516e7445866a218f/1576654682907/pdf3984.pdf; last accessed 2022 07 15.
  • Smith, B. Othanel, 1953, “The Improvement of Critical Thinking”, Progressive Education , 30(5): 129–134.
  • Smith, Eugene Randolph, Ralph Winfred Tyler, and the Evaluation Staff, 1942, Appraising and Recording Student Progress , Volume III of Adventure in American Education , New York and London: Harper & Brothers.
  • Splitter, Laurance J., 1987, “Educational Reform through Philosophy for Children”, Thinking: The Journal of Philosophy for Children , 7(2): 32–39. doi:10.5840/thinking1987729
  • Stanovich Keith E., and Paula J. Stanovich, 2010, “A Framework for Critical Thinking, Rational Thinking, and Intelligence”, in David D. Preiss and Robert J. Sternberg (eds), Innovations in Educational Psychology: Perspectives on Learning, Teaching and Human Development , New York: Springer Publishing, pp 195–237.
  • Stanovich Keith E., Richard F. West, and Maggie E. Toplak, 2011, “Intelligence and Rationality”, in Robert J. Sternberg and Scott Barry Kaufman (eds.), Cambridge Handbook of Intelligence , Cambridge: Cambridge University Press, 3rd edition, pp. 784–826. doi:10.1017/CBO9780511977244.040
  • Tankersley, Karen, 2005, Literacy Strategies for Grades 4–12: Reinforcing the Threads of Reading , Alexandria, VA: Association for Supervision and Curriculum Development.
  • Thayer-Bacon, Barbara J., 1992, “Is Modern Critical Thinking Theory Sexist?”, Inquiry: Critical Thinking Across the Disciplines , 10(1): 3–7. doi:10.5840/inquiryctnews199210123
  • –––, 1993, “Caring and Its Relationship to Critical Thinking”, Educational Theory , 43(3): 323–340. doi:10.1111/j.1741-5446.1993.00323.x
  • –––, 1995a, “Constructive Thinking: Personal Voice”, Journal of Thought , 30(1): 55–70.
  • –––, 1995b, “Doubting and Believing: Both are Important for Critical Thinking”, Inquiry: Critical Thinking across the Disciplines , 15(2): 59–66. doi:10.5840/inquiryctnews199515226
  • –––, 2000, Transforming Critical Thinking: Thinking Constructively , New York: Teachers College Press.
  • Toulmin, Stephen Edelston, 1958, The Uses of Argument , Cambridge: Cambridge University Press.
  • Turri, John, Mark Alfano, and John Greco, 2017, “Virtue Epistemology”, in Edward N. Zalta (ed.), The Stanford Encyclopedia of Philosophy (Winter 2017 Edition). URL = < https://plato.stanford.edu/archives/win2017/entries/epistemology-virtue/ >
  • Vincent-Lancrin, Stéphan, Carlos González-Sancho, Mathias Bouckaert, Federico de Luca, Meritxell Fernández-Barrerra, Gwénaël Jacotin, Joaquin Urgel, and Quentin Vidal, 2019, Fostering Students’ Creativity and Critical Thinking: What It Means in School. Educational Research and Innovation , Paris: OECD Publishing.
  • Warren, Karen J. 1988. “Critical Thinking and Feminism”, Informal Logic , 10(1): 31–44. [ Warren 1988 available online ]
  • Watson, Goodwin, and Edward M. Glaser, 1980a, Watson-Glaser Critical Thinking Appraisal, Form A , San Antonio, TX: Psychological Corporation.
  • –––, 1980b, Watson-Glaser Critical Thinking Appraisal: Forms A and B; Manual , San Antonio, TX: Psychological Corporation.
  • –––, 1994, Watson-Glaser Critical Thinking Appraisal, Form B , San Antonio, TX: Psychological Corporation.
  • Weinstein, Mark, 1990, “Towards a Research Agenda for Informal Logic and Critical Thinking”, Informal Logic , 12(3): 121–143. [ Weinstein 1990 available online ]
  • –––, 2013, Logic, Truth and Inquiry , London: College Publications.
  • Willingham, Daniel T., 2019, “How to Teach Critical Thinking”, Education: Future Frontiers , 1: 1–17. [Available online at https://prod65.education.nsw.gov.au/content/dam/main-education/teaching-and-learning/education-for-a-changing-world/media/documents/How-to-teach-critical-thinking-Willingham.pdf.]
  • Zagzebski, Linda Trinkaus, 1996, Virtues of the Mind: An Inquiry into the Nature of Virtue and the Ethical Foundations of Knowledge , Cambridge: Cambridge University Press. doi:10.1017/CBO9781139174763

Other Internet Resources
  • Association for Informal Logic and Critical Thinking (AILACT)
  • Critical Thinking Across the European Higher Education Curricula (CRITHINKEDU)
  • Critical Thinking Definition, Instruction, and Assessment: A Rigorous Approach
  • Critical Thinking Research (RAIL)
  • Foundation for Critical Thinking
  • Insight Assessment
  • Partnership for 21st Century Learning (P21)
  • The Critical Thinking Consortium
  • The Nature of Critical Thinking: An Outline of Critical Thinking Dispositions and Abilities , by Robert H. Ennis


Copyright © 2022 by David Hitchcock <hitchckd@mcmaster.ca>


UCL Careers

Analytical and Critical Thinking Skills

The power to apply logical thinking, break down complex problems into manageable components, and make a reasoned judgement by objectively evaluating information.

Explore your understanding

Applying analytical and critical thinking to a task is about being able to look at a situation and examine it carefully. Paying attention to detail, remaining focused, and having determination are all key elements of this process.

During the recruitment process, industry case study scenarios are sometimes used to test your analytical skills. Employers are looking to see that you are able to look critically at data, evaluate the information you have, and produce proposals for suggested actions.

Find and develop your skill

How can you improve your analytical and critical thinking skills at UCL? 

Travel to hone your analytical and critical thinking skills

Travel via UCL Go Abroad programmes, which offer an enriching selection of worldwide opportunities designed to support UCL students in performing at their full potential and further developing their analytical and critical thinking skills by meeting a wide range of people. Attend an event to learn more about the global opportunities available, both short term and longer term, as part of your degree.

Learn how to apply analytical and critical thinking

Take part in a skills session delivered by employers to learn more about developing your analytical and critical thinking. You could also try coming to a Mock Assessment Centre where you can practice applying logic and evaluating information as part of a group task.

Employer-led skills sessions

Use LinkedIn Learning to grow your skillset

LinkedIn Learning has a huge range of video courses supporting learning in software, creative and business skills – all free to UCL staff and currently enrolled students. Access LinkedIn Learning content on how you can build your analytical and critical thinking skills.

Access LinkedIn Learning

Join a club or society

Join a club or society that challenges your critical thinking, such as the Consulting Society, Law for All, the International Relations Society, or even come up with your own proposal if you identify a gap.

Clubs and societies directory

Develop your analytical and critical thinking skills as a researcher

Access courses related to analytical and critical skills, such as introductions to science, philosophy, and key concepts, to develop your skills within this area. The UCL Doctoral Skills Development Programme is open to all postgraduate research students at UCL. You’ll find more information on all the courses available on our website.

UCL Doctoral Skills Development Programme

You will also be able to browse the scheduled events for researchers and those for doctoral students. Research students can also access courses mapped to the Researcher Development Framework (RDF) and one-to-one advice, practice interviews, and workshops tailored to researchers.

Prepare your examples

Ask yourself:

Do you have a group project that you are working on? Was there any type of evaluation involved? Think about a piece of research, a report, or a dissertation you completed. What analysis and evaluation processes did you use, and why?

Can you describe how you have developed your analytical and critical thinking skills whilst at UCL?

Next steps:

Course projects are also a great way to develop your skills. Need some support on how to apply analytical and critical thinking? Visit our 'Psychometric and aptitude tests' page to see the different examples of problem solving and situational judgement tests.

Practice psychometric and aptitude tests

Get support on how to structure answers on analytical and critical thinking as part of an interview. Visit our Interview Skills page. 

How to demonstrate your skills in an interview

Here you can find out more about how to structure your answer and demonstrate your skills along with many more resources that will help you prepare.

If you have written a draft application for any type of opportunity, our team can provide personalised practical tips and advice to help you better understand how recruiters will shortlist your application, and how you can best demonstrate your motivation and your most relevant skills and experience.

Get one-to-one advice

The employer perspective – Procter and Gamble:

“Analytical skills are essential for all employees at Procter and Gamble. We recruit a variety of roles, including Sales, Marketing, Finance, Engineering and HR, and across all of these, analytical skills are used in day-to-day life at P&G. As a data-driven company, we make a lot of decisions based on data from the market, our customers and our campaigns, which means it’s crucial all employees have an ability to analyse and manipulate data to create insights used for decision making.”

USC Libraries

Organizing Your Social Sciences Research Paper

Applying Critical Thinking

Critical thinking refers to deliberately scrutinizing and evaluating theories, concepts, or ideas using reasoned reflection and analysis. The act of thinking critically involves moving beyond simply understanding information to questioning its source, its production, and its presentation in order to expose potential bias or researcher subjectivity [i.e., being influenced by personal opinions and feelings rather than by external determinants]. Applying critical thinking to investigating a research problem involves actively challenging basic assumptions and questioning the choices and potential motives underpinning how the author designed the study, conducted the research, and arrived at particular conclusions or recommended courses of action.

Mintz, Steven. "How the Word "Critical" Came to Signify the Leading Edge of Cultural Analysis." Higher Ed Gamma Blog , Inside Higher Ed, February 13, 2024; Van Merriënboer, Jeroen JG and Paul A. Kirschner. Ten Steps to Complex Learning: A Systematic Approach to Four-component Instructional Design . New York: Routledge, 2017.

Thinking Critically

Applying Critical Thinking to Research and Writing

Professors like to use the term critical thinking; in fact, the idea of being a critical thinker permeates much of higher education writ large. In the classroom, the idea of thinking critically is often mentioned by professors when students ask how they should approach a research and writing assignment [other approaches your professor might mention include interdisciplinarity, comparative, gendered, global, etc.]. However, critical thinking is more than just an approach to research and writing. It is an acquired skill associated with becoming a complex learner capable of discerning important relationships among the elements of, as well as integrating multiple ways of understanding applied to, the research problem. Critical thinking is a lens through which you holistically interrogate a topic.

Given this, thinking critically encompasses a variety of inter-related connotations applied to writing a college-level research paper:

  • Integrated and Multi-Dimensional . Critical thinking is not focused on any one element of research, but instead, is applied holistically throughout the process of identifying the research problem, reviewing the literature, applying methods of analysis, describing the results, discussing their implications, and, if appropriate, offering recommendations for further research. It permeates the entire research endeavor from contemplating what to write to proofreading the final product.
  • Humanizes the Research . Thinking critically can help humanize what is being studied by extending the scope of your analysis beyond the traditional boundaries of prior research. This prior research could have involved, for example, sampling homogeneous populations, considering only certain factors related to the investigation of a phenomenon, or limiting the way authors framed or contextualized their study. Critical thinking supports opportunities to incorporate the experiences of others into the research process, leading to a more inclusive and representative examination of the topic.
  • Non-Linear. This refers to analyzing a research problem in ways that do not rely on sequential decision-making or rational forms of reasoning. Creative thinking relies on intuitive judgement, flexibility, and unconventional approaches to investigating complex phenomena in order to discover new insights, connections, and potential solutions. This involves going back and modifying your thinking as new evidence emerges, perhaps multiple times throughout the research process, and drawing conclusions from multiple perspectives.
  • Normative . This is the idea that critical thinking can be used to challenge prior assumptions in ways that advocate for social justice, equity, and inclusion and that can lead to research having a more transformative and expansive impact. In this respect, critical thinking can be viewed as a method for breaking away from dominant culture norms so as to produce research outcomes that illuminate previously hidden aspects of exploitation and injustice.
  • Power Dynamics. Research in the social sciences often includes examining aspects of power and influence that shape social relations, organizations, institutions, and the production and maintenance of knowledge. These studies focus on how power operates, how it can be acquired, and how power and influence can be maintained. Critical thinking can reveal how societal structures perpetuate power and influence in ways that marginalize and oppress certain groups or communities within the contexts of history, politics, economics, culture, and other factors.
  • Reflection . A key component of critical thinking is practicing reflexivity; the act of turning ideas and concepts back onto yourself in order to reveal and clarify your own beliefs, assumptions, and perspectives. Being critically reflexive is important because it can reveal hidden biases you may have that could unintentionally influence how you interpret and validate information. The more reflexive you are, the better able and more comfortable you are in opening yourself up to new modes of understanding.
  • Rigorous Questioning. Thinking critically is guided by asking questions that lead to addressing complex concepts, principles, theories, or problems more effectively and, in so doing, help distinguish what is known from what is not known [or that may be hidden]. Critical thinking involves deliberately framing inquiries not just as research questions, but as a way to apply systematic, disciplined, in-depth forms of questioning concerning the research problem and your positionality as a researcher.
  • Social Change. An overarching goal of critical thinking applied to research and writing is to seek to identify and challenge sources of inequality, exploitation, oppression, and marginalization that contribute to maintaining the status quo within institutions of society. This can include entities, such as schools, courts, businesses, government agencies, or religious organizations, that have been created and maintained through certain ways of thinking within the dominant culture.

Although critical thinking permeates the entire research and writing process, it applies in particular to the literature review and discussion sections of your paper. In reviewing the literature, it is important to reflect upon specific aspects of a study, such as determining if the research design effectively establishes cause and effect relationships or provides insight into explaining why certain phenomena do or do not occur, assessing whether the method of gathering data or information supports the objectives of the study, and evaluating if the assumptions used to arrive at a specific conclusion are evidence-based and relevant to addressing the research problem. This means assessing not only whether a source is helpful for supporting your arguments, but also critically analyzing how the research challenges conventional approaches that perpetuate inequalities or hide the voices of others.

Critical thinking applies to the discussion section of your paper because this is where you internalize the results of your study and explain its value and significance. This involves more than summarizing findings and describing outcomes. It includes reflecting on their importance and providing reasoned explanations why your paper helps fill a gap in the literature or expands knowledge and understanding in ways that inform practice. Critical reflection helps you think introspectively about your own beliefs concerning the significance of the findings, but in ways that avoid biased judgment and decision making.

Behar-Horenstein, Linda S., and Lian Niu. “Teaching Critical Thinking Skills in Higher Education: A Review of the Literature.” Journal of College Teaching and Learning 8 (February 2011): 25-41; Bayou, Yemeserach and Tamene Kitila. "Exploring Instructors’ Beliefs about and Practices in Promoting Students’ Critical Thinking Skills in Writing Classes." GIST–Education and Learning Research Journal 26 (2023): 123-154; Butcher, Charity. "Using In-class Writing to Promote Critical Thinking and Application of Course Concepts." Journal of Political Science Education 18 (2022): 3-21; Loseke, Donileen R. Methodological Thinking: Basic Principles of Social Research Design. Thousand Oaks, CA: Sage, 2012; Mintz, Steven. "How the Word "Critical" Came to Signify the Leading Edge of Cultural Analysis." Higher Ed Gamma Blog , Inside Higher Ed, February 13, 2024; Hart, Claire et al. “Exploring Higher Education Students’ Critical Thinking Skills through Content Analysis.” Thinking Skills and Creativity 41 (September 2021): 100877; Lewis, Arthur and David Smith. "Defining Higher Order Thinking." Theory into Practice 32 (Summer 1993): 131-137; Sabrina, R., Emilda Sulasmi, and Mandra Saragih. "Student Critical Thinking Skills and Student Writing Ability: The Role of Teachers' Intellectual Skills and Student Learning." Cypriot Journal of Educational Sciences 17 (2022): 2493-2510. Suter, W. Newton. Introduction to Educational Research: A Critical Thinking Approach. 2nd edition. Thousand Oaks, CA: SAGE Publications, 2012; Van Merriënboer, Jeroen JG and Paul A. Kirschner. Ten Steps to Complex Learning: A Systematic Approach to Four-component Instructional Design. New York: Routledge, 2017; Vance, Charles M., et al. "Understanding and Measuring Linear–Nonlinear Thinking Style for Enhanced Management Education and Professional Practice." Academy of Management Learning and Education 6 (2007): 167-185; Yeh, Hui-Chin, Shih-hsien Yang, Jo Shan Fu, and Yen-Chen Shih. "Developing College Students’ Critical Thinking through Reflective Writing." Higher Education Research & Development 42 (2023): 244-259.

Analysis Skills: Understanding Critical Thinking and Science Learning

This article provides an overview of the analysis skills necessary for critical thinking and science learning. It offers tips and strategies to help readers develop their analysis skills.

Having strong analysis skills is essential for success in any field, whether it's science, business, or something else entirely. Critical thinking and science learning are two key components of these skills. In order to become an expert in any field, one must be able to analyze information and make informed decisions. This article will explore the importance of analysis skills and how to develop them through critical thinking and science learning.

Critical thinking is the process of making rational judgments about a situation or problem. It involves gathering facts, weighing evidence, and forming conclusions based on what you learn. Science learning refers to the process of acquiring knowledge about scientific theories, experiments, and discoveries. Both of these skills are necessary for gaining a thorough understanding of any topic. This article will explain why it is important to develop both analysis skills and critical thinking.

It will also provide tips on how to use critical thinking and science learning to enhance your analysis skills. Finally, it will discuss the importance of using both skills together in order to achieve better results. Analysis skills are essential for critical thinking and science learning. They involve the ability to think logically, break down problems into smaller parts, identify patterns and relationships, and use data to draw conclusions. Analytical thinking is the process of examining information, breaking it down into smaller components, and understanding how the components are related.

Problem-solving skills involve using a systematic approach to identify solutions to complex problems. Data analysis involves gathering, organizing, and analyzing data in order to draw meaningful conclusions. Having strong analysis skills is important for making informed decisions, solving complex problems, and understanding complex topics. It can help you identify potential consequences of actions, develop alternative solutions, and think critically about different perspectives.

It can also be beneficial when learning new topics or researching different areas of study. There are several strategies that you can use to improve your analysis skills. To improve analytical thinking, try breaking down complex problems into smaller parts and using data to identify patterns or relationships. For problem-solving skills, try brainstorming potential solutions and using a systematic approach to evaluate the options.

To improve data analysis skills, practice organizing data into meaningful categories and interpreting the data to draw meaningful conclusions. Analysis skills can be applied in various contexts. In business, data analysis can help identify trends and make informed decisions. Analytical thinking can be used to solve complex problems or develop new products or services.
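As a concrete, if simplified, illustration of that advice, the short Python sketch below organizes a handful of records into categories and then computes a simple month-over-month trend. It assumes the pandas library is available, and the category names and sales figures are invented purely for illustration.

```python
# Hypothetical sketch: organize raw records into categories, then interpret
# them by computing a simple trend. All figures are invented for illustration.
import pandas as pd

records = [
    {"month": "Jan", "category": "online",   "sales": 120},
    {"month": "Jan", "category": "in-store", "sales": 200},
    {"month": "Feb", "category": "online",   "sales": 150},
    {"month": "Feb", "category": "in-store", "sales": 190},
    {"month": "Mar", "category": "online",   "sales": 180},
    {"month": "Mar", "category": "in-store", "sales": 185},
]
df = pd.DataFrame(records)

# Step 1: organize the data into meaningful categories.
totals_by_category = df.groupby("category")["sales"].sum()
print(totals_by_category)

# Step 2: interpret the data to draw a conclusion -- is each category growing?
month_order = ["Jan", "Feb", "Mar"]
avg_monthly_change = (
    df.pivot(index="month", columns="category", values="sales")
      .reindex(month_order)   # keep months in calendar order
      .diff()                 # month-over-month change
      .mean()                 # average change per category
)
print(avg_monthly_change)     # positive values suggest growth, negative decline
```

The specific library matters less than the habit it illustrates: put the raw information into meaningful categories first, then apply a small, repeatable calculation before drawing a conclusion.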

In science, problem-solving skills can be used to explore new ideas or research topics. There are a variety of resources available to help you further develop your analysis skills. Books such as The Art of Analyzing Data by John DeKok provide an introduction to data analysis and how to use data for decision-making. Online courses such as Analytical Thinking & Problem Solving by edX provide an in-depth exploration of how to develop problem-solving skills.

Analytical thinking

Data analysis requires the ability to collect, organize, and interpret data in order to draw meaningful conclusions. Having strong analysis skills is important for a variety of reasons. They can help us make better decisions by allowing us to evaluate different options and make informed choices. They can also help us understand complex topics by breaking them down into more manageable pieces.

Finally, they can help us solve complex problems by allowing us to identify patterns, draw connections between different elements, and develop creative solutions. To develop analysis skills, it is important to practice analytical thinking, problem-solving, and data analysis. To improve analytical thinking, it is helpful to break down problems into smaller parts and look for patterns or connections between different elements. To improve problem-solving skills, it is helpful to brainstorm potential solutions and use a systematic approach when evaluating them.

To improve data analysis skills, it is important to collect relevant data, organize it in a meaningful way, and interpret it accurately. Analysis skills can be applied in a variety of contexts. In business, for example, data analysis can be used to identify trends or insights about customer behavior . Analytical thinking can be used to evaluate complex problems and develop creative solutions.

Problem-solving skills can be used to develop new products or services or find ways to improve existing ones. In science learning, analysis skills are essential for understanding complex topics and developing hypotheses that can be tested through experimentation. There are many resources available for those interested in further developing their analysis skills. Books such as The Art of Analytical Thinking , Data Analysis for Beginners , and The Power of Problem-Solving provide useful tips and strategies for improving analytical thinking, problem-solving, and data analysis.

Examples of Analysis Skills


Analytical thinking involves breaking down a problem into its component parts, examining each part in detail, and then using evidence-based logic to solve the problem. Problem-solving is another type of analysis skill that involves identifying the root cause of a problem and developing an effective strategy to address it. Data analysis is another type of analysis skill that involves collecting, analyzing, and interpreting data in order to draw meaningful conclusions. Analysis skills are important because they enable a person to think critically and make informed decisions. These skills are also essential for science learning, as they enable a person to analyze data, identify patterns, and draw conclusions.

How to Develop Analysis Skills


Some books that may be of particular interest include “Analytical Thinking: A Guide to Critical Thinking and Problem Solving” by Paul C. Nuttal, “The Analytic Thinker: Mastering the Skills of Reasoning, Analysis, and Critique” by Michael Starbird, and “The Art of Thinking Clearly: Better Thinking, Better Decisions” by Rolf Dobelli. Online courses can also be a great resource for developing analysis skills. Coursera, edX, and Udemy offer a wide range of courses on analytical thinking, problem-solving, and critical thinking.

These courses can provide an in-depth look at the fundamentals of analytical thinking and how it can be applied in different contexts. Websites can also be a great source of information on analytical thinking and problem-solving. There are many sites dedicated to the subject, such as the Cognitive Science Lab at Stanford University, which offers resources on cognitive psychology, artificial intelligence, and more. The website Thinking Critically by Peter Facione offers a wealth of information on critical thinking, including tips and strategies for developing analysis skills.

Collecting and organizing relevant data allows for more informed decision-making and analysis. Additionally, analyzing data can help identify patterns and draw conclusions from the information. When looking for patterns, it is important to consider the context in which the data was gathered and how it relates to the overall problem. To improve problem-solving skills, brainstorming can be an effective technique. Brainstorming involves coming up with multiple ideas, without judging them, and then assessing their viability.
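One minimal way to make that last step, assessing the viability of brainstormed ideas, systematic is a weighted decision matrix, sketched below in Python. The options, criteria, weights, and scores are all hypothetical and only meant to show the shape of the approach.

```python
# Hypothetical weighted decision matrix for comparing brainstormed options.
# Each option gets a 1-5 score per criterion; weights reflect relative importance.
criteria_weights = {"cost": 0.4, "impact": 0.4, "effort": 0.2}

options = {
    "redesign the workflow": {"cost": 3, "impact": 5, "effort": 2},
    "hire a contractor":     {"cost": 2, "impact": 4, "effort": 4},
    "do nothing":            {"cost": 5, "impact": 1, "effort": 5},
}

def weighted_score(scores):
    """Combine per-criterion scores into a single weighted total."""
    return sum(criteria_weights[criterion] * value for criterion, value in scores.items())

# Rank the options from strongest to weakest overall score.
for name, scores in sorted(options.items(), key=lambda kv: weighted_score(kv[1]), reverse=True):
    print(f"{name}: {weighted_score(scores):.2f}")
```

The numbers themselves are less important than the discipline they enforce: every idea is judged against the same criteria, so the comparison is explicit rather than impressionistic.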

Books can be an excellent resource for developing analytical thinking skills. Popular titles include Thinking, Fast and Slow by Daniel Kahneman, The Art of Thinking Clearly by Rolf Dobelli, Thinking Critically by John Chaffee, and The Power of Thinking Differently by Edward de Bono. Online courses are also available to help readers better understand analytical thinking. Popular options include Coursera's Introduction to Logic and Critical Thinking , edX's Analytical Thinking: A Strategic Skill for Leaders , and Udemy's Data Analysis & Visualization in R: Analytical Thinking . Websites can also provide helpful resources for developing analytical thinking skills.

Sites like Khan Academy, which offers free tutorials on topics like logic and problem solving, as well as websites like Brilliant.org, which offers free online courses on a variety of topics, can be great resources for honing analytical skills. Finally, there are many other materials that readers can use to further develop their analytical thinking skills. These can include podcasts, videos, articles, and more. All of these resources can provide insight into different aspects of analytical thinking and help readers hone their skills. Analysis skills are essential for critical thinking and science learning. This article has provided an overview of what analysis skills are, why they are important, and tips and strategies for developing these skills.

Shahid Lakha

Shahid Lakha is a seasoned educational consultant with a rich history in the independent education sector and EdTech. With a solid background in Physics, Shahid has cultivated a career that spans tutoring, consulting, and entrepreneurship. As an Educational Consultant at Spires Online Tutoring since October 2016, he has been instrumental in fostering educational excellence in the online tutoring space. Shahid is also the founder and director of Specialist Science Tutors, a tutoring agency based in West London, where he has successfully managed various facets of the business, including marketing, web design, and client relationships. His dedication to education is further evidenced by his role as a self-employed tutor, where he has been teaching Maths, Physics, and Engineering to students up to university level since September 2011. Shahid holds a Master of Science in Photon Science from the University of Manchester and a Bachelor of Science in Physics from the University of Bath.


Situating Higher-Order, Critical, and Critical-Analytic Thinking in Problem- and Project-Based Learning Environments: A Systematic Review

  • REVIEW ARTICLE
  • Open access
  • Published: 21 March 2023
  • Volume 35, article number 39 (2023)


  • Sofie M. M. Loyens (ORCID: orcid.org/0000-0002-2419-1492)
  • Julianne E. van Meerten
  • Lydia Schaap
  • Lisette Wijnia


Critical thinking (CT) is widely regarded as an important competence to obtain in education. Students’ exposure to problems and collaboration have been proven helpful in promoting CT processes. These elements are present in student-centered instructional environments such as problem-based and project-based learning (P(j)BL). Besides CT, higher-order thinking (HOT) and critical-analytic thinking (CAT) also contain elements that are present in and fostered by P(j)BL. However, HOT, CT, and CAT definitions are often ill-defined and overlap. The present systematic review, therefore, investigated how HOT, CT, and CAT were conceptualized in P(j)BL environments. Another aim of this study was to review the evidence on the effectiveness of P(j)BL environments in fostering HOT, CT, or CAT. Results demonstrated an absence of CAT in P(j)BL research and a stronger focus on CT processes than on CT dispositions (i.e., a trait-like tendency or willingness to engage in CT). Further, while we found positive effects of P(j)BL on HOT and CT, there was a lack of clarity and consistency in how researchers conceptualized and measured these forms of thinking. Also, essential components of P(j)BL were often overlooked. Finally, we identified various design issues in effect studies, such as the lack of control groups, that bring the reported outcomes of those investigations into question.


Critical thinking (CT) is widely regarded as an important competence to learn, and its importance has only increased over time (Pellegrino & Hilton, 2012 ). Mastery of this ability is not only necessary for students but also for working professionals and informed citizens (Bezanilla et al., 2021 ). Therefore, thinking critically is a central aim of education (e.g., Butler & Halpern, 2020 ). A meta-analysis demonstrated that the opportunity for dialogue, exposing students to authentic or situated problems and examples, and mentoring them in these habits of mind positively affected CT (Abrami et al., 2015 ). All of these pedagogical elements are, to some extent, present in student-centered instructional environments. Thus, it would seem that a good way to teach CT is by using active, student-centered instructional methods (Bezanilla et al., 2021 ; Lombardi et al., 2021 ). Problem-based learning (PBL) and project-based learning (PjBL) are prototypical examples of active, student-centered instructional methods. Next to CT, higher-order thinking (HOT) and critical-analytic thinking (CAT) also contain elements that are present in and fostered by student-centered learning environments, such as PBL and PjBL. However, CT, HOT, and CAT definitions are often ill-defined and overlap.

For the aforementioned reasons, the present study investigated CT, HOT, and CAT in student-centered learning environments. We will focus on PBL and PjBL because these two formats are most frequently studied in the research literature and hence constitute the vast majority of student-centered instructional methods (Authors, 2022 ; Nagarajan & Overton, 2019 ). In addition, both PBL and PjBL have been included as acknowledged instructional formats in the Cambridge Handbook of Learning Sciences (Sawyer, 2014 ) and have specific criteria that need to be fulfilled to be labeled as PBL and PjBL. Despite their unique characteristics and origin, PBL and PjBL share common ground because of their joint roots in constructivist learning theory (Authors, 2022 ; Loyens & Rikers, 2017 ). PBL and PjBL as pedagogies can be seen as manifestations of constructivist learning. Constructivism is a theory or view on how learning happens, which holds that learners construct knowledge out of experiences. It has roots in philosophy and stresses the student’s active role in his or her knowledge-acquisition process (Loyens et al., 2012 ).

What is PBL?

There are many definitions of PBL in the research literature, all presenting different perspectives and ideas with regard to this educational pedagogy (Dolmans et al., 2016 ; Zabit, 2010 ). Notwithstanding the variety of definitions, different PBL implementations demonstrate some shared characteristics. Based on the original method developed at McMaster University (Spaulding, 1969 ), Barrows ( 1996 ) described six core characteristics of PBL. The first characteristic is that learning is student centered. Second, learning occurs in small student groups under the guidance of a tutor. The third characteristic refers to the tutor as a facilitator or guide. Fourth, students encounter so-called “authentic” (i.e., relating to real life) problems in the learning sequence before any preparation or study has occurred. Fifth, these problems function as triggers for students’ prior knowledge activations, which leads to the discovery of knowledge gaps. Finally, students overcome these knowledge gaps through self-directed learning, which requires sufficient time for self-study (Schmidt et al., 2009 ).

Besides the core elements, different PBL environments share similarities in terms of the process. For example, Wijnia et al. ( 2019 ) highlighted that PBL as a process consists of three separate stages: an initial discussion phase, a self-study phase, and a reporting stage. First, students are given a meaningful problem that describes an observable phenomenon or event. The instructional goal of the problem presented to the students can differ. For example, the problem could originate from professional practice or be related directly to distinctive events in a particular domain or field of study. An example of a problem related to a specific domain of study from an introductory psychology course reads as follows (Schmidt et al., 2007 ):

Coming home from work, tired and in need of a hot bath, Anita, an account manager, discovers two spiders in her tub. She shrinks back, screams, and runs away. Her heart pounds, a cold sweat is coming over her. A neighbor saves her from her difficult situation by killing the little animals using a newspaper.” Explain what has happened here. (p. 92)

During the first stage, prior knowledge is important as students come up with theories to explain the problem based on their life experiences. Because their knowledge is often limited and insufficient, students formulate learning issues (formulated as questions) to guide their research and further self-study. All this takes place in a class discussion, usually in classes with fewer than 12 students. During the second stage, students consult learning resources to gain knowledge relevant to the problem and to address the learning issue questions. These resources can be selected by the students, the tutor, or a combination of both (Wijnia et al., 2019 ). In tandem with these steps, students have to plan and monitor study activities that need to be carried out before the next class meeting (Loyens et al., 2008 ). In the final stage, students reconvene under the guidance of their tutor to share and evaluate their findings critically and elaborate on their newly acquired knowledge. Students apply this knowledge to the problem to identify plausible solutions or explanations (Loyens & Rikers, 2017 ; Wijnia et al., 2019 ).

What is PjBL?

In project-based or project-centered learning (PjBL), the learning process is organized around activities that drive students’ actions (Blumenfeld et al., 1991 ). Students learn central concepts and principles of a discipline through the projects. This “learning-by-doing” approach of PjBL could help motivate students to learn, as they play an active role in the process (Saad & Zainudin, 2022 ). Students have a significant degree of control over the project they will work on and what they will do in the project. The projects are hence student-driven and, similar to PBL, are intended to generate learner agency. Specific end products need to result from the work, although the processes to get to the end product can vary. The end products (e.g., a website, presentation, or report) serve as the basis for discussion, feedback, and revision (Blumenfeld et al., 1991 ; Helle et al., 2006 ; Tal et al., 2006 ). Also, even though different forms of PjBL exist, most start with a driving question or problem and typically incorporate the following features (Krajcik, 2015 ; Thomas, 2000 ):

Projects for which the students seek solutions or clarifications are relevant to their lives.

PjBL involves planning and performing investigations to answer questions.

Students collaborate with other students, teachers, and members of society.

PjBL is centered around producing artifacts.

Technology is used when appropriate.

In sum, students perform a series of collaborative inquiry activities that should help them acquire new, domain-specific knowledge and thinking processes to solve real-world problems. The PjBL end products must reflect learners’ knowledge of the project topic and their metacognitive knowledge (Grant & Branch, 2005 ). A project can be a problem to solve (e.g., How can we reduce the pollution in the schoolyard pond?), a phenomenon to investigate (e.g., Why do you stay on your skateboard?), a model to design (e.g., Create a scale model of an ideal high school), or a decision to make (e.g., Should the school board vote to build a new school?; Yetkiner et al., 2008 ).

Students work together and projects last for considerable periods (Helle et al., 2006 ). The role of the instructor consists of facilitating the project. That is, the instructor helps with framing and structuring the projects, monitors the development of the end product, and assesses what students have learned (Chiu, 2020 ; David, 2008 ; Helle et al., 2006 ).

What PBL and PjBL Foster

One aim of schools and colleges implementing student-centered approaches such as PBL and PjBL is to increase students’ competence in tackling complex problems common in an ever-changing world (Gijbels et al., 2005 ). To that end, several goals and desired outcomes have been put forward for PBL and PjBL. The primary goal is to educate the students to a level where they can comfortably use and retrieve information when needed and identify situations where specific knowledge and strategic processes are applicable. With these strategies and knowledge, students can start developing plausible explanations of phenomena that represent important disciplinary understandings (Loyens et al., 2012 ; McNeill & Krajcik, 2011 ). Several studies have investigated the effectiveness of P(j)BL on knowledge acquisition (e.g., Chen & Yang, 2019 ; Strobel & Van Barneveld, 2009 ). In addition, both PBL and PjBL consist of collaborative learning sessions that could foster effective interpersonal communication. Such abilities can enable learners to contribute to discussions in clear and appropriate ways, help to reach conclusions and answers more easily, and identify inconsistencies and unresolved issues (Loyens et al., 2008 , 2012 ).

Further, students could develop problem-solving strategies while working on problems or projects (Krajcik et al., 2008 ). Even when problems are highly complex and ill-structured, they can be effectively analyzed, and plausible responses can be identified (Loyens et al., 2012 ). Also, because the problems and projects are specific to the students’ domain of study, the knowledge and strategies they acquire are applicable to their future professional practice. Therefore, problems and projects are believed to be more engaging, motivating, and interesting for the students (Hmelo-Silver, 2004 ; Larmer et al., 2015 ; Saad & Zainudin, 2022 ). Finally, as noted, student-centered instructional methods such as PBL and PjBL imply a different, less directive role for the teacher. Consequently, students receive more responsibility during the learning process. The success of PBL and PjBL also rests on the “preparedness of a student to engage in learning activities defined by him- or herself, rather than by a teacher” (Schmidt, 2000 , p. 243), a process referred to as self-directed learning.

Even though the research literature on P(j)BL does not explicitly state that these instructional formats should foster HOT, CT, and CAT, their design and implementations do appear to require students’ engagement in these forms of thinking. Thus, we will seek to define these constructs within the context of student-centered learning environments.

HOT, CT, and CAT

Like PBL and PjBL, many definitions of HOT, CT, and CAT can be found in the literature. We acknowledge our inability to be complete regarding the different domains and traditions (i.e., philosophical, psychological, educational) of HOT, CT, and CAT, in which definitions have been put forward. Rather, we will focus on definitions that help describe the role of HOT, CT, and CAT in student-centered learning environments.

What Is Higher Order Thinking (HOT)?

First, HOT can be seen as an overarching concept defined as “skills that enhance the construction of deeper, conceptually-driven understanding” (Schraw & Robinson, 2011 , p. 2). Framed in more traditional terms, HOT corresponds with Bloom’s taxonomy, with remembering or recalling facts reflecting lower-order cognitive thinking (i.e., concerned with the acquisition of knowledge or information) and comprehending, applying, analyzing, synthesizing, and evaluating as higher-order thinking, referring to more intellectual abilities and skills (Lombardi, 2022 ; Miri et al., 2007 ). The focus on thinking skills does not imply that the essential importance of knowledge is abandoned. In fact, knowledge is needed and related to thinking processes, which comes to the fore in, for example, later revisions of Bloom’s taxonomy (Lombardi, 2022 ).

HOT has been put forward as having four components (Schraw et al., 2011 ): (a) reasoning (i.e., induction and deduction), (b) argumentation (i.e., generating and evaluating evidence and arguments), (c) metacognition (i.e., thinking about and regulating one’s thinking), and (d) problem solving and critical thinking (CT). Problem-solving involves several steps carried out consecutively: 1) identifying and representing the problem at hand, 2) selecting and applying a suitable solution strategy, and 3) evaluating the process and solution (Chakravorty et al., 2008 ). CT refers to the reflective thinking that leads to certain outcomes (i.e., decision-making) and actions (Ennis, 1987 ). CT is, in this view, considered a subcomponent of HOT (Schraw et al., 2011 ). Indeed, in its measurement, CT has also addressed HOT processes such as analysis and synthesis (Lombardi, 2022 ).

Yen and Halili ( 2015 ) also characterize HOT as an umbrella term for all manner of reflective thinking, including creative thinking, problem-solving, decision-making, and metacognitive processing. Also, in this definition, several components refer to the taxonomy of Schraw and colleagues ( 2011 ): reflection (component d) and metacognition (component c), problem-solving (component d), and decision-making (CT, component d). The only exception is creative thinking, which is not included as a subcomponent by Schraw and colleagues ( 2011 ); however, they acknowledge that this could be part of a broader taxonomy of HOT, together with, for example, moral reasoning.

What Is CT?

The literature on CT traces back to the Greek philosophers who sought to explain the origin and meaning of such thinking. As Van Peppen ( 2020 ) points out, “the word critical derives from the Greek words ‘kritikos’ (i.e., to judge/discern) and ‘kriterion’ (i.e., standards),” and hence “CT implies making judgments based on standards” (p. 11). A second important ancestor of CT was John Dewey, who spoke of “reflective thinking” when referring to CT. From thereon, many traditions and definitions of CT have been formulated. Ennis ( 1987 ), for example, defined CT as the thinking process focused on the decision of what to believe or what to do. He further expanded the idea of Glaser ( 1941 ), who acknowledged the role of dispositions in CT. Ennis ( 1962 ) distinguished two distinct CT components, dispositions and abilities, with the first one being more trait-like tendencies (e.g., dispositions toward inquisitiveness, open-mindedness, sensitivity to other points of view, cognitive flexibility) and the second referring to actual cognitive activities (e.g., focusing, analyzing arguments, asking questions, evaluating evidence, comparing potential outcomes; Schraw et al., 2011 ).

Scholars of the American Philosophical Association tried to come up with a consensus on the definition of CT, referred to as the Delphi Panel (Facione, 1990b ). The processes associated with CT in that report were interpretation (i.e., understanding and articulating meaning), analysis (i.e., identifying relationships between information, including argument analysis), evaluation (i.e., making judgments and assessments about the credibility of information, including assessing arguments), inference (i.e., identifying the necessary information for decision making, including coming up with hypotheses), explanation (i.e., articulating and presenting one’s position, arguments, and analysis used to determine that position), and self-regulation (i.e., self-analyzing and examining one’s inferences and correcting when necessary; Facione, 1990b ). The latter component (i.e., self-regulation) has a strong metacognitive character (Zimmerman & Moylan, 2009 ).

Finally, another conceptualization of CT that resulted in the development of a widely used measurement instrument for CT comes from Halpern. Halpern ( 2014 ) mainly focuses on abilities/cognitive activities and less on dispositions in her definition of CT. Specifically, she states that CT entails “cognitive skills or strategies that increase the probability of a desirable outcome” (p. 8). Halpern’s taxonomy consists of five main elements: verbal reasoning, argument analysis, hypothesis testing, likelihood and uncertainty, and decision-making/problem-solving.

What Is CAT?

CAT refers to the processes we use “when we question or at least do not simply passively accept the accuracy of claims as givens” (Byrnes & Dunbar, 2014 , p. 479). Its distinguishing feature compared to CT is its focus on justification and determining whether appropriate and credible evidence supports a claim or proposed response (Murphy et al., 2014 ). Although some frameworks of HOT and CT include the component of analysis and evaluation, the CAT research literature puts the processes of “weighing the evidence” at the forefront. Alexander ( 2014 ) describes the process of CAT in four consecutive steps. The process starts with a claim or task for which one collects data or evidence. Individuals then evaluate or judge these data or evidence and, as the last step, integrate it with their knowledge and beliefs. Dispositions are not considered in the CAT literature, although individual differences, such as prior knowledge and goals, can act as moderators.

In sum, unequivocal definitions of HOT, CT, and CAT are hard to find, although definitions do share important attributes. As Byrnes and Dunbar ( 2014 ) point out, “operational definitions follow from theoretical definitions” (p. 482). Indeed, the definitions introduced in this overview have led to several operationalizations and measurements of HOT, CT, and CAT. Those operationalizations are also important in the discussion on how HOT, CT, and CAT are framed within student-centered learning and whether P(j)BL might foster such valued forms of thinking.

The Link Between HOT, CT, and CAT and P(j)BL

To establish the link between HOT, CT, and CAT and the P(j)BL learning environments, we carefully examined two lines of research literature: literature on how to effectively teach HOT, CT, and CAT, and literature on the learning processes involved in P(j)BL (i.e., a synthesis between cognitive and instructional science). Most of the research directed toward teaching reflective forms of thinking has addressed CT as a form of HOT (e.g., Abrami et al., 2015; Miri et al., 2007; Schraw et al., 2011). For example, Miri and colleagues (2007) defined three teaching strategies that should encourage students to engage collaboratively in CT-aligned processes (e.g., asking appropriate questions and seeking plausible solutions). Those teaching strategies are (a) dealing with real cases in class, (b) encouraging class discussions, and (c) fostering inquiry-oriented experiments.

The link with P(j)BL is evident, as both are centered on dealing with real problems and cases in collaborative class discussions, and PjBL sometimes also requires the execution of experiments. The sharing of knowledge and collaboration has a place in both instructional formats: in PBL during the reporting phase, and in PjBL because students collaborate with other students, teachers, and members of society. In their meta-analysis, Abrami and colleagues (2015) reviewed possible instructional strategies that could foster CT. They concluded that two types of interventions helped develop CT processes: discussion and “authentic or situated problems and examples … particularly when applied problem solving … is used” (Abrami et al., 2015, p. 302). Again, there is a clear link because authentic problems and class discussions are central components of PBL and PjBL. Also, Torff (2011) labeled core P(j)BL activities as “high-CT activities” (p. 363): Socratic discussion, debate, problem-solving, problem finding, brainstorming, decision-making, and analysis.

With respect to CAT, Byrnes and Dunbar ( 2014 ) put forward some instructional approaches that should prove facilitative. In their view, “students should pose unanswered questions that require the collection of data or evidence” (p. 488). Subsequently, they need to “engage in appropriate methodologies to gather this evidence.” Finally, students need to “have opportunities to be surprised by unanticipated findings and discuss or debate how the anticipated, unanticipated, and missing evidence should be interpreted” (p. 488). These authors also stress the importance of working in teams, engaging in discussions, identifying sources of uncertainty and problems of interpretation, and presenting findings and conclusions for peer review. Also, regarding CAT, parallels can be drawn with PBL and PjBL in which a problem or question is the starting point, and students engage in several learning activities to develop an understanding and potentially a solution.

The second line of research that is useful for establishing the link between HOT, CT, and CAT and P(j)BL deals with learning processes. For example, Krajcik et al. (2008) argued that processes common to project-based approaches involve learners in “scientific practices such as argumentation, explanation, scientific modeling, and engineering design” (p. 3). Furthermore, Krajcik and colleagues mention that students learning in these environments use problem-solving, design, decision-making, argumentation, weighing of different pieces of evidence, explanation, investigation, and modeling. Some scholars mention the development of metacognitive knowledge as an outcome of PjBL (Grant & Branch, 2005).

A similar case can be made for PBL. W. Hung and colleagues (2008) indicated that students who experience PBL possess better hypothesis-testing abilities due to their more coherent explanations of hypotheses and hypothesis-driven reasoning. Further, the PBL process relies heavily on group discussions of real-life problems, discovering knowledge gaps, gathering information/evidence to answer the learning issues/questions, analyzing the evidence, resolving unclarities, and deciding on the outcome.

The Present Study

The present study aimed to situate HOT, CT, and CAT in PBL and PjBL environments. As we have set out to establish, even though fostering HOT, CT, and CAT may not be an explicit goal of these student-centered approaches, there are theoretical and empirical reasons to expect an association to exist. First, the research literature on how to effectively teach HOT, CT, and CAT has pointed to instructional formats that use discussion and problem-solving. Second, the research literature on the learning processes that take place in student-centered learning environments like P(j)BL mentions HOT, CT, and CAT processes such as decision making, argumentation, weighing of different pieces of evidence, explanation, and investigation (Abrami et al., 2015; Krajcik et al., 2008). Therefore, the first research question we posed was: How are HOT, CT, and CAT conceptualized in student-centered learning environments? In addition, parallels exist between the processes involved in HOT, CT, and CAT on the one hand and the learning activities/processes in P(j)BL on the other hand. Moreover, effective instructional activities to foster HOT, CT, and CAT (e.g., Abrami et al., 2015) are core activities in P(j)BL. Therefore, the second aim of this study was to review the evidence on the effectiveness of student-centered environments in fostering either HOT, CT, or CAT. To that end, we carried out a review study of studies investigating HOT, CT, and CAT in the context of PBL and PjBL.

Search Strategy

For this review, we systematically investigated six online databases: Web of Science (Core Collection) and five EBSCO databases (ERIC, Medline, PsycInfo, Psychology and Behavioral Sciences, and Teacher Reference Center). We included Medline in this list because problem-based learning originated in medical education (Spaulding, 1969; see also Servant-Miklos, 2019) and is often researched in the context of medical education (W. Hung et al., 2019; Koh et al., 2008; Smits et al., 2002; Strobel & Van Barneveld, 2009). For this systematic review, we used the following Boolean string of search terms (Oliver, 2012): “project based learning” or “project based instruction” or “project based approach” or “PjBL” OR “problem based learning” or “problem based approach” or “problem based instruction” or “PBL” AND “higher order thinking” or “critical thinking” or “critical analytic* thinking” (see Footnote 1). Because of its medical connotations (i.e., it is also an abbreviation for “peripheral blood lymphocytes”; e.g., Caldwell et al., 1998), “PBL” was not used for our Web of Science search. The search terms match our research questions in that they include both PBL and PjBL, as well as HOT, CT, and CAT, thus leading us to those studies that included those variables. In addition to using the aforementioned terms, we delimited our search to peer-reviewed records written in English.
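
For readers who want to adapt the search, the string can be assembled programmatically. The sketch below is only illustrative: it assumes that the learning-environment terms and the thinking terms each form an OR-group joined by a single AND (the grouping implied above), and that “PBL” is dropped for Web of Science; it does not reproduce the exact field codes or syntax of the individual databases.

```python
# Illustrative reconstruction of the Boolean search string (assumed grouping).
ENVIRONMENT_TERMS = [
    "project based learning", "project based instruction", "project based approach", "PjBL",
    "problem based learning", "problem based approach", "problem based instruction", "PBL",
]
THINKING_TERMS = ["higher order thinking", "critical thinking", "critical analytic* thinking"]

def build_query(environment_terms, thinking_terms, drop_terms=()):
    """Join quoted terms with OR within each group and AND between the groups."""
    env = " OR ".join(f'"{t}"' for t in environment_terms if t not in drop_terms)
    think = " OR ".join(f'"{t}"' for t in thinking_terms)
    return f"({env}) AND ({think})"

# EBSCO databases use all terms; for Web of Science, "PBL" is omitted because
# it is also an abbreviation for "peripheral blood lymphocytes."
ebsco_query = build_query(ENVIRONMENT_TERMS, THINKING_TERMS)
wos_query = build_query(ENVIRONMENT_TERMS, THINKING_TERMS, drop_terms=("PBL",))
print(wos_query)
```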

Selection Process

Inclusion Criteria

In addition to the search parameters established by our search terms, we used the following inclusion criteria to determine our final sample—studies had to: (a) use a quantitative measure of HOT, CT, or CAT, (b) take place in a PBL or PjBL environment, (c) be an empirical study that took place in a classroom context, (d) investigate K-12 or higher education students, and (e) be published as a peer-reviewed journal article. For example, to be included in our review, studies must have a dedicated measure of HOT, CT, or CAT that gives insight into the authors’ conceptualization of these constructs. The measure used could be a standardized instrument or one that was researcher-designed.

Also, in line with our research aims, we only included studies that focused on PBL or PjBL and met the basic criteria for these student-centered approaches. To judge the quality of these learning environments, the definition or description in the theoretical framework and the implementation of those environments were assessed against the defining characteristics established in the literature (Barrows, 1996; Hmelo-Silver, 2004; Schmidt et al., 2009). For PBL, those criteria included (a) student-centered, active learning, (b) the guiding role of teachers, (c) collaborative learning in small groups, (d) the use of realistic problems as the start of the learning process, and (e) ample time for (self-directed) self-study. Furthermore, PBL had to contain the three process phases (i.e., initial discussion, self-study, and reporting phases). For PjBL, those criteria included (a) the project starts with a driving question or problem, (b) the project is relevant and authentic, (c) collaborative, inquiry learning activities take place, (d) room for student autonomy and the guiding role of teachers, (e) the project is central to the curriculum, and (f) the creation of a tangible product (Authors, 2022). We only included studies in a classroom context in K-12 and higher education because we wanted to focus on the effectiveness of these learning environments in formal educational settings. We focused on peer-reviewed articles to better ensure the quality of the included studies.

Exclusion Criteria

We excluded studies relying on self-report items (often from course evaluations) such as “I have improved my ability to judge the value of new information or evidence presented to me” or “I have learned more about how to justify why certain procedures are undertaken in my subject area” (Castle, 2006). We also excluded studies using the critical thinking scale from the Motivated Strategies for Learning Questionnaire (MSLQ; Duncan & McKeachie, 2005), in which critical thinking is viewed as a learning strategy. In light of strong criticism of relying on the MSLQ to gauge these forms of reflective thinking (see Dinsmore & Fryer, 2022, this issue), we excluded these studies (e.g., Sungur & Tekkaya, 2006).

We also excluded studies that did not meet the defining criteria for PBL or PjBL, for example, a study that claimed to investigate PBL but in which the learning process started with a lecture instead of a problem. Furthermore, we excluded studies that did not adequately describe the process the researchers labeled as PBL or PjBL (e.g., Razali et al., 2017). We also excluded studies that combined P(j)BL with additional activities (e.g., concept maps with PBL; Si et al., 2019) or interventions (e.g., a CT or motivation intervention combined with PBL; Olivares et al., 2013).

Moreover, because of our focus on formal education, we excluded studies that took place in a laboratory setting, intervention studies in settings such as summer camps, tutoring, or afterschool programs, and studies with employee samples (e.g., health nurses; T.-M. Hung et al., 2015). We further excluded theoretical, conceptual, or “best practices” articles. Finally, we excluded peer-reviewed conference papers and abstracts (“wrong format”), as they were often hard to retrieve or provided too little information to code the outcome measures and learning environment.

Coding and Final Sample

Figure 1 outlines our entire search process, in line with the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines (Moher et al., 2015). As can be seen there, our initial searches in the Web of Science and EBSCO databases provided us with 2,968 results, which we uploaded to the Rayyan platform (Ouzzani et al., 2016). After removing the duplicates identified by the Rayyan platform, 2,545 papers remained, of which we screened the titles, abstracts, and, if needed, the full texts. During screening, we identified an additional 27 duplicates and excluded 2,405 papers because the studies did not meet our inclusion criteria.

Figure 1. Flowchart of the Search and Selection Process

We selected 113 studies for further inspection and coding. The specific codes were in line with our inclusion and exclusion criteria and were designed to streamline the final elimination round (i.e., excluding articles after in-depth reading). Table 1 provides a detailed overview of the codes used.

Of these 113 studies, 84 were excluded after in-depth reading (see Fig. 1 for exclusion reasons). In addition, we excluded one paper because it contained duplicate data (Yuan et al., 2008b) reported in another paper (Yuan et al., 2008a); we selected the Yuan et al. (2008a) paper instead of the Yuan et al. (2008b) paper because the former also reported the results of the control group (i.e., lecture-based learning), whereas the latter only included the data of the PBL group. The search process thus resulted in a final sample of 28 studies.
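
The counts in the flow above can be checked with simple bookkeeping. The sketch below only uses the numbers reported in the text and Figure 1; the 423 first-round duplicates are an assumption derived from the difference between 2,968 identified records and 2,545 remaining records.

```python
# Bookkeeping check of the PRISMA flow, using the counts reported in the text.
records_identified = 2968        # Web of Science + EBSCO searches
after_deduplication = 2545       # after Rayyan-identified duplicates were removed
first_round_duplicates = records_identified - after_deduplication   # 423 (derived, not reported)

additional_duplicates = 27
excluded_at_screening = 2405
selected_for_coding = after_deduplication - additional_duplicates - excluded_at_screening
assert selected_for_coding == 113

excluded_after_full_read = 84
duplicate_data_papers = 1        # Yuan et al. (2008b)
final_sample = selected_for_coding - excluded_after_full_read - duplicate_data_papers
assert final_sample == 28
print(first_round_duplicates, selected_for_coding, final_sample)
```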

Results and Discussion

Descriptives of the Final Sample

Before answering our research questions, we describe the characteristics of the 28 included studies (see Table 2). Of the included studies, 22 were conducted in a higher education setting and 6 in K-12. Twelve studies took place within the Health Sciences domain (e.g., nursing education, medical education), 8 within the Science, Technology, Engineering, Art, and Math (STEAM) domain, and 8 in other domains (e.g., financial management or psychology). The studies covered 12 different countries; most were conducted in the USA (n = 7), Turkey (n = 4), Indonesia (n = 4), and China (n = 3). Two studies investigated the effects of PjBL; all other studies examined a PBL setting.

Research Question 1: Conceptualization of HOT, CT, and CAT

The first research question of this review was “How are HOT, CT, and CAT conceptualized in student-centered learning environments (i.e., PBL and PjBL)?” To answer this research question, it is important to know that none of the identified studies investigated the effect of P(j)BL on CAT. HOT was only investigated in two studies. One of the included studies investigated the effect of PBL on HOT, and the other study investigated the effect of PjBL on HOT and CT. All other studies investigated the effect of P(j)BL on CT. Hence, to answer this research question, we will mainly focus on the conceptualization of CT and, to a smaller extent, on HOT. In answering this research question, we will discuss how CT and HOT were defined and measured.

Conceptualization of CT

CT conceptualizations in P(j)BL consisted of CT dispositions and CT processes. These processes were referred to as “skills” or “abilities” in the included studies (e.g., W.-C. W. Yu et al., 2015 ). Our review will use the term “processes” instead of “skills.” A CT disposition is “the constant internal motivation to engage problems and make decisions by using CT” (Facione et al., 2000 , p. 65). Facione et al. ( 2000 ) use the word disposition to refer to individuals' characterological attributes. An example of a disposition is being open-minded, analytical, or truth-seeking. In the included studies, dispositions were described as the “will” or “inclination” to evaluate situations critically (e.g., Temel, 2014 ; W.-C. W. Yu et al., 2015 ) and a necessary pre-condition for CT processes (Temel, 2014 ).

Although the correlation between CT dispositions and processes is not extremely high (e.g., Facione et al., 2000), both seem to be necessary for “reasonable reflective thinking focused on deciding what to believe or do” (e.g., Ennis, 2011, p. 10). The included studies defined the concept of CT 7 times in terms of both dispositions and processes, 13 times solely in terms of processes, and 4 times solely in terms of dispositions. Three studies did not define the concept or spoke only in general terms, such as “a way to find meaning in the world in which we live” (Burris & Garton, 2007, p. 106). Of the included studies, 7 measured CT dispositions, 18 measured CT processes, and one measured both dispositions and processes. One study stated that it had measured CT dispositions, but the results section only reported statistics for CT processes (Hassanpour Dekhordi & Heydarnejad, 2008). Regarding measurements, 16 studies were congruent in defining and measuring CT. This means that, for example, when authors defined CT in terms of dispositions, they also measured dispositions. It also means that 11 studies were incongruent in this respect (e.g., mentioning processes but measuring dispositions, or mentioning both but measuring only one component).

In sum, although CT consists of dispositions and processes, in the conceptualization of CT in P(j)BL research, we saw a majority of studies focusing only on the processes and, to a lesser extent, on (the combination with) dispositions. Also, 11 studies were incongruent in focus (dispositions and/or processes) of their description of CT and the focus of their measurement instrument.

CT Dispositions

In the studies included in this review, many different terms were used to describe CT disposition(s). The nine studies that reported measuring CT dispositions used the California Critical Thinking Disposition Inventory (CCTDI; Facione, 1990a ). The CCTDI assesses students’ willingness or inclination toward engaging in critical thinking. The CCTDI contains seven dispositions (Facione, 1990a ; Yeh, 2002 ). The scale Truth-Seeking refers to the mindset of being objective, honest, and seeking the truth even when findings do not support one’s opinions/interests. Open-Mindedness refers to tolerance, an open mind toward conflicting views, and sensitivity toward the possibility of one’s own bias. The subscale Analyticity concerns a disposition to anticipate possible consequences, results, and problematic situations. The fourth subscale, Systematicity, measures having an organized, orderly, and focused approach to problem-solving. The trust in one’s reasoning process is measured in the subscale CT Self-Confidence . Inquisitiveness concerns intellectual curiosity, whereas Cognitive Maturity refers to the expectation of making timely, well-considered judgments. A qualitative analysis of the descriptions showed that most of the terms used corresponded with CCTDI subscales (Facione, 1990a ; Yeh, 2002 ; see Appendix Table 5 ). Sometimes other terms were used that could be classified less easily according to the dispositions of the CCTDI but point toward a willingness to engage in critical thinking or the role of self-regulation and metacognition.

CT Processes

Not only were many dispositions mentioned by the studies included in this review, but the number of terms used to describe CT processes was even higher. When we look at the three most used instruments, many of these terms appear as specific components of those tests (see Appendix Table 6). The most commonly used instruments were the California Critical Thinking Skills Test (CCTST; 4 studies) and the Watson–Glaser Critical Thinking Appraisal (WGCTA; 2 studies). One study used the Cornell Critical Thinking Test (CCTT), but the theoretical framework related to this test was used in three other studies to measure CT processes. The CCTST, WGCTA, and CCTT are well-known commercial standardized measures of critical thinking.

The CCTST is a companion test of the CCTDI and measures five critical thinking processes: Analysis, Evaluation, Inference, Deductive Reasoning, and Inductive Reasoning (Facione, 1991 ). Analysis refers to accurately identifying problems and processes such as categorization, decoding significance, and clarifying meaning. Evaluation concerns the ability to assess statements’ credibility and arguments’ strength. The Inference subscale measures the ability to draw logical and justifiable conclusions from evidence and reasons. Deductive Reasoning relies on strict rules and logic, such as determining the consequences of a given set of rules, conditions, principles, or procedures (e.g., syllogisms, mathematical induction). Finally, Inductive Reasoning refers to reasoned judgment in uncertain, risky, or ambiguous contexts.

Another well-validated test used in the included studies is the WGCTA (Watson & Glaser, 1980). The WGCTA provides problems and situations requiring CT abilities. It measures CT as a composite of attitudes of inquiry (i.e., recognizing the existence of problems and accepting the need for evidence), knowledge (i.e., about valid inferences, abstractions, and generalizations), and skills in applying these attitudes and knowledge (Watson & Glaser, 1980, 1994, 2009).

The scale consists of five subscales: Inferences, Recognition of Assumptions, Deduction, Interpretation, and Evaluation of Arguments. The Inferences subscale measures to what extent participants can determine the truthfulness of inferences drawn from given data. Recognition of Assumptions concerns recognizing implicit presuppositions or assumptions in statements or assertions. The Deduction subscale measures the ability to determine if conclusions necessarily follow from the given information. Interpretation concerns weighing evidence and deciding if the generalizations based on the given data are justifiable. Finally, the subscale Evaluation of Arguments measures the ability to distinguish strong and relevant arguments from weak and irrelevant arguments. The long version of the scale consists of 80 items (parallel Forms A and B; Watson & Glaser, 1980), and the short version (Form S) contains 40 items (Watson & Glaser, 1994). Newer test versions are available (Watson & Glaser, 2009), but the included studies relied on the older, abbreviated version (Burris & Garton, 2007; Şendağ & Odabaşı, 2009).

The CCTT level Z, designed by Ennis, measures deduction, semantics, credibility, induction, definition and assumption identification, and assumption identification (Bataineh & Zghoul, 2006 ; Ennis, 1993 ). The CCTT is a commercial measure of critical thinking and has two versions. The CCTT level X is for students in Grades 4–14, whereas CCTT level Z is for advanced and gifted high school students, college students, graduate students, and other adults. Possibly due to copyright restrictions, many articles gave very brief descriptions of the test subscales. We also found some inconsistencies in the descriptions of the subscales in the literature. We could not retrieve the original manuals of the CCTT from 1985 or the revised version from 2005. Leach and others ( 2020 ) gave a detailed description of the CCTT level X subscales and provided a possible explanation for the inconsistencies found in the literature.

The test measures five latent dimensions: Induction, Deduction, Observation, Credibility, and Assumption (Leach et al., 2020). However, these five dimensions are reduced to four parts in the test manual: two dimensions are taken together (i.e., Observation and Credibility), and some items of one dimension are counted as an element of another part (Leach et al., 2020). Bataineh and Zghoul (2006) described the subscales of CCTT level Z in more detail. According to them, the CCTT level Z measures six dimensions: Deduction, Semantics, Credibility, Induction, Definition and Assumption Identification, and Assumption Identification. The subscale Deduction measures to what extent a person can detect valid reasoning. The subscale Semantics measures the ability to assess verbal and linguistic aspects of arguments. Credibility concerns the extent to which a participant can estimate the truthfulness of a statement. The subscale Induction refers to the ability to judge conclusions and the best possible predictions. Definition and Assumption Identification measures the extent to which a person can identify the best definition of a given situation. Finally, Assumption Identification asks participants to choose the most probable unstated assumption in the text.

In summary, as can be seen from this descriptive analysis (see Appendix Tables 5 and 6), many terms are employed to characterize CT dispositions and processes. The conceptualization of CT differs per measurement, as evidenced by the three most commonly used instruments (i.e., CCTDI, CCTST, WGCTA) and the instruments based on Ennis’s conceptualization of CT (e.g., CCTT). We also observed a tendency to create new instruments, often based on or inspired by existing measurement instruments, that introduce new terms to describe CT.

HOT Processes

In some studies measuring CT, the authors mentioned that CT was a component of HOT (Cortázar et al., 2021; Dakabesi & Louise, 2019; Sasson et al., 2018). Only two studies measured HOT (Sasson et al., 2018; Sugeng & Suryani, 2020). In both instances, the authors solely defined HOT in reference to the more complex thinking levels in Bloom et al.’s (1956) taxonomy for the cognitive domain (see also Krathwohl, 2002): application, analysis, synthesis, and evaluation. In the Sasson et al. (2018) study, comprehension was also included as one of the higher cognitive processes. In contrast, in the Sugeng and Suryani (2020) study, comprehension was treated as a form of lower-order thinking. Both studies used Bloom’s taxonomy as a framework for coding student work. Therefore, we mainly saw the same terms used to conceptualize HOT, as well as congruency between characterizing and measuring HOT processes, based on Bloom’s taxonomy.

Research Question 2: Can PBL and PjBL Foster HOT, CT, and CAT?

To answer Research Question 2, we summarized the main findings regarding the effectiveness of PBL and PjBL on CT and HOT (see Tables 3 and 4 ). When possible, we calculated the standardized mean difference (Cohen’s d ) in Comprehensive Meta-Analysis statistical software (version 3; Borenstein et al., 2009 ). First, we will discuss the effects on CT and then the two studies that examined HOT.

Effects of P(j)BL on Critical Thinking

Description of Studies

Table 3 reports the main findings of the 27 studies that investigated the effects of P(j)BL on CT. Most studies investigated the effects of PBL, and only two examined the effects of PjBL (Cortázar et al., 2021 ; Sasson et al., 2018 ). Most studies compared P(j)BL with a control group with pre-post measures of CT ( n  = 16). Five studies compared PBL with a control group and only reported posttest scores (see Table 3 ). We used the pre-post data of a single P(j)BL group (without a control group) for five other studies. One study had another design (comparing year groups; Pardamean, 2012 ).

The most commonly reported control group was (traditional) lecture-based learning (e.g., Carriger, 2016; Choi et al., 2014; Gholami et al., 2016; Lyons, 2008; Rehmat & Hartley, 2020; Tiwari et al., 2006; W.-C. W. Yu et al., 2015; Yuan et al., 2008a), traditional or conventional learning (e.g., Dilek Eren & Akinoglu, 2013; Fitriani et al., 2020; Saputro et al., 2020; Sasson et al., 2018; Temel, 2014), or instructor-led instruction (Şendağ & Odabaşı, 2009). In Burris and Garton (2007), the control group was a supervised study group, which the authors indicated corresponded with Missouri’s recommended curriculum for (secondary) agriculture classes. The control group in Siew and Mapeala (2016) focused on conventional problem-solving.

Some studies included additional experimental groups (Carriger, 2016; Cortázar et al., 2021; da Costa Carbogim et al., 2018; Fitriani et al., 2020; Siew & Mapeala, 2016; W.-C. W. Yu et al., 2015). In these experimental groups, PBL was combined with another intervention (Cortázar et al., 2021; da Costa Carbogim et al., 2018; Fitriani et al., 2020; Siew & Mapeala, 2016; W.-C. W. Yu et al., 2015), such as a critical thinking intervention (da Costa Carbogim et al., 2018) or a socially shared regulation intervention (Cortázar et al., 2021), or with lectures (Carriger, 2016). To describe the main findings and calculate effect sizes, we only report the data for the “regular” PBL and the control groups (if reported) of these studies. Overall, results showed positive effects of P(j)BL on critical thinking. When we only look at the statistical tests the authors performed, 19 studies reported positive effects on CT, indicating that students’ CT disposition or process scores increased from pretest to posttest or that the P(j)BL group obtained higher scores than the control group. Seven studies reported non-significant findings, and only one study reported a negative effect.

Meta-Analysis

In the meta-analysis section of this review, we only included studies with a pre-post and/or independent groups design. Independent groups designs and pre-post designs both give insight into the question of whether PBL and PjBL affect students’ CT. Independent group designs check if the instructional method is more effective than “traditional” education, whereas pre-post designs check for differences before and after the implementation.

We were able to calculate effect sizes for 23 studies. For the studies with independent groups pre-post designs, we used the pretest and posttest means and standard deviations (SDs) and the sample size per group to compute effect sizes. The posttest SD was used to standardize the effect size. Because most studies did not report the correlation between the pretest and posttest scores, we assumed a conservative correlation of 0.70 whenever it was not reported. Studies suggest strong test–retest correlations for standardized critical thinking measures (Gholami et al., 2016; Macpherson & Owen, 2010). For example, Macpherson and Owen (2010) reported a strong positive correlation (r = 0.71) between two test moments among medical students for the WGCTA. For the studies with independent groups posttest-only designs, we used both groups’ means, SDs, and sample sizes to calculate the effect sizes. For the single-group pre-post studies, we used either the mean difference, t, and sample size (da Costa Carbogim et al., 2018; Iwaoka et al., 2010) or the pretest and posttest means and SDs, sample size, and pre-post correlation to calculate the effect size. Again, we assumed a correlation of 0.70 if it was not reported.
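
The effect sizes reported in this review were computed in Comprehensive Meta-Analysis. The sketch below is only a minimal illustration of one common way to obtain comparable estimates, assuming standardization on the posttest SD and an imputed pre-post correlation of 0.70 when none is reported; the standard-error formulas are large-sample approximations and may differ in detail from the software’s internal computations, and the example numbers are hypothetical.

```python
from math import sqrt

def prepost_d(mean_pre, mean_post, sd_post, n, r=0.70):
    """Single-group pre-post Cohen's d, standardized on the posttest SD.
    Uses an assumed pre-post correlation r when none is reported."""
    d = (mean_post - mean_pre) / sd_post
    se = sqrt(2 * (1 - r) / n + d ** 2 / (2 * n))   # approximate SE
    return d, se

def independent_groups_prepost_d(exp, ctl, r=0.70):
    """Difference in pre-to-post gains between two groups, standardized on the
    pooled posttest SD. `exp` and `ctl` are (mean_pre, mean_post, sd_post, n)."""
    (m1_pre, m1_post, sd1, n1), (m2_pre, m2_post, sd2, n2) = exp, ctl
    sd_pooled = sqrt(((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / (n1 + n2 - 2))
    d = ((m1_post - m1_pre) - (m2_post - m2_pre)) / sd_pooled
    se = sqrt(2 * (1 - r) * (1 / n1 + 1 / n2) + d ** 2 / (2 * (n1 + n2)))  # approximate SE
    return d, se

# Hypothetical values, for illustration only.
print(prepost_d(mean_pre=40.0, mean_post=45.0, sd_post=8.0, n=30))
print(independent_groups_prepost_d((40.0, 45.0, 8.0, 30), (41.0, 42.0, 7.5, 32)))
```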

When we included all 23 studies in one analysis (random-effects model), this resulted in a medium effect size of 0.644 (SE = 0.10, 95% CI [0.45, 0.83]). These results suggest that P(j)BL could positively affect students’ CT dispositions and processes. However, the effect was heterogeneous, Q(22) = 237.46, p < 0.001, I² = 90.74, T² = 0.17 (SE = 0.11), which implies that the variability in effect sizes has sources other than sampling error. Because only one of the studies investigated PjBL, we repeated the analysis for only the PBL studies, resulting in an effect size of 0.635 (SE = 0.10, 95% CI [0.44, 0.83]), indicating that the results remained similar. In our analysis, we included three types of research designs. Overall, the studies that compared P(j)BL with a control group reported larger effect sizes (independent groups: n = 4, d = 0.707, SE = 0.22; independent groups pre-posttest: n = 14, d = 0.831, SE = 0.13) than studies with a single-group pre-post design (n = 5, d = 0.213, SE = 0.18).
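
For readers who wish to reproduce this kind of pooled estimate outside commercial software, the sketch below shows a standard DerSimonian–Laird random-effects model with Q, I², and τ² computed from per-study effect sizes and standard errors. It is not the exact procedure used by Comprehensive Meta-Analysis, and the input values are hypothetical.

```python
import numpy as np

def random_effects_pool(d, se):
    """DerSimonian-Laird random-effects pooling of standardized mean differences.
    Returns the pooled effect, its SE, Cochran's Q, I^2 (%), and tau^2."""
    d, se = np.asarray(d, float), np.asarray(se, float)
    v = se ** 2
    w = 1.0 / v                                   # fixed-effect (inverse-variance) weights
    fixed = np.sum(w * d) / np.sum(w)
    q = np.sum(w * (d - fixed) ** 2)              # Cochran's Q
    df = len(d) - 1
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                 # between-study variance
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    w_star = 1.0 / (v + tau2)                     # random-effects weights
    pooled = np.sum(w_star * d) / np.sum(w_star)
    se_pooled = np.sqrt(1.0 / np.sum(w_star))
    return pooled, se_pooled, q, i2, tau2

# Hypothetical per-study values, for illustration only.
d = [0.2, 0.5, 0.9, 1.4, 0.1]
se = [0.20, 0.15, 0.25, 0.30, 0.18]
pooled, se_p, q, i2, tau2 = random_effects_pool(d, se)
print(f"d = {pooled:.3f} (SE = {se_p:.2f}), Q = {q:.2f}, I2 = {i2:.1f}%, tau2 = {tau2:.2f}")
# A leave-one-out analysis simply repeats this pooling k times, omitting one study each time.
```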

As Table 3 reveals, some studies had extreme effect sizes (e.g., Saputro et al., 2020). We therefore conducted leave-one-out analyses, which revealed that the pooled effect sizes ranged from 0.533 to 0.686 with an SE of approximately 0.10, and that the 95% confidence intervals ranged from [0.37, 0.70] to [0.48, 0.89]. Overall, the conclusion about the positive effect of P(j)BL on critical thinking would not change if we left out the study by Saputro et al. (2020).

We further examined publication bias by inspecting the funnel plot and Egger’s regression intercept (Egger et al., 1997). We applied Duval and Tweedie’s (2000) trim-and-fill technique and conducted a classic fail-safe N analysis. Figure 2 presents the funnel plot of all included studies, plotting each study’s effect size against the standard error of its estimate. The funnel plot indicated publication bias. Egger’s linear regression test for asymmetry further supported this observation, t(21) = 2.78, p = 0.006. Duval and Tweedie’s trim-and-fill technique (7 studies trimmed at the left side) adjusted the effect size from a medium effect of 0.644 to a small effect of 0.298 (95% CI [0.09, 0.51]). The fail-safe N suggested that 1,376 missing studies would be needed for the result of this meta-analysis to become nonsignificant (p > 0.05). Overall, the results suggested that P(j)BL can have a small-to-medium positive effect on students’ CT processes and dispositions.
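
Two of these publication-bias checks are easy to illustrate. The sketch below implements Egger’s regression test (regressing the standardized effect on precision and testing the intercept) and Rosenthal’s classic fail-safe N. It is only a minimal, assumed reconstruction: the authors used Comprehensive Meta-Analysis, the p-value here is two-sided, and the input values are hypothetical.

```python
import numpy as np
from scipy import stats

def egger_test(d, se):
    """Egger's regression test: regress d/se on 1/se and test whether the
    intercept differs from zero (an indication of funnel-plot asymmetry)."""
    d, se = np.asarray(d, float), np.asarray(se, float)
    y, x = d / se, 1.0 / se
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    df = len(d) - 2
    cov = (resid @ resid / df) * np.linalg.inv(X.T @ X)
    t_intercept = beta[0] / np.sqrt(cov[0, 0])
    p_two_sided = 2 * stats.t.sf(abs(t_intercept), df)
    return beta[0], t_intercept, p_two_sided

def failsafe_n(z_values, alpha_z=1.645):
    """Rosenthal's classic fail-safe N from per-study z values (one-tailed alpha = .05)."""
    z = np.asarray(z_values, float)
    return (z.sum() ** 2) / alpha_z ** 2 - len(z)

# Hypothetical per-study values, for illustration only.
d = [0.2, 0.5, 0.9, 1.4, 0.1]
se = [0.20, 0.15, 0.25, 0.30, 0.18]
print(egger_test(d, se))
print(failsafe_n(np.array(d) / np.array(se)))
```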

Figure 2. Funnel Plot. Note. Funnel plot with observed and imputed studies. The white dots represent the observed study samples included in the meta-analysis; the black dots represent the seven studies trimmed at the left side using Duval and Tweedie’s trim-and-fill technique.

As mentioned, the effect was heterogeneous. Due to the limited number of studies, we could not statistically investigate moderating factors that might explain this heterogeneity. The variation in effect sizes is likely caused in part by variation in how PBL was implemented, as implementation often differs per institute even when the defining characteristics have been met (Maudsley, 1999; Norman & Schmidt, 2000). To mitigate this issue, however, we only included studies that met the defining criteria of PBL and PjBL, and we further excluded studies that contained additional activities (e.g., concept mapping) that could affect the results.

Also, differences in the exact operationalization of CT could affect the results. To explore this, we calculated the effect size separately for studies reporting outcomes on CT processes ( n  = 17) and dispositions ( n  = 7). Analyses suggested a higher effect size for the studies reporting results for CT processes ( d  = 0.720, SE  = 0.14, 95% CI [0.46, 0.99]) than studies reporting on the effects of CT disposition ( d  = 0.411, SE = 0.13, 95% CI [0.16, 0.66]). However, these results must be interpreted with caution due to the limited number of studies and the extreme effect sizes in the CT processes group.

Other variables that could potentially explain heterogeneity are sample level (e.g., K-12 or higher education) and the duration of the intervention or exposure to PBL. For example, a meta-analysis of the effects of student-centered learning on students’ motivation showed that the effect of student-centered learning on motivation was lower for K-12 samples and curriculum implementation compared to studies conducted in a higher education setting and course implementations (Authors, 2022 ). Possibly, similar factors could affect the effect of P(j)BL on CT, but more research is needed to investigate this in more detail.

Additional Findings

As noted, the meta-analysis only included studies with a pre-post and/or independent groups design; two studies deviated from this design and are discussed here. Tiwari et al. (2006) not only compared the effects of PBL vs. lecture-based learning immediately after the PBL course but also included two follow-ups one and two years later. As seen in Table 3, the PBL group showed significant gain scores in CT immediately after the course, and the gain score remained positive at the first follow-up. However, the gain score became non-significant at the second follow-up (two years later). At the subscale level, results revealed significant gains in favor of PBL for truth-seeking, analyticity, and CT self-confidence. The gain score for analyticity remained positive at the first follow-up, whereas the truth-seeking gain score remained significant at the first and second follow-ups (two years later). The results of this study suggest that PBL can have long-term effects on CT dispositions.

Pardamean (2010) did not have a control group but examined CT processes in first- through third-year students. The study revealed no differences among the three year groups on overall CT. There was one statistically significant difference, on the subscale Inductive Reasoning, on which the second-year students obtained the highest score and the third-year students the lowest. This study therefore does not support the claim that CT increases across year groups in a PBL curriculum; however, we have no information on the baseline CT of each group.

Not all studies reported the results of the test subscales, even when the original scale consisted of subscales. Seven of the nine studies using the CCTDI reported subscale results. Of these studies, three reported positive results for Open-Mindedness, Inquisitiveness, Truth-Seeking, or Systematicity, and two for the Analyticity subscale or the CT Self-Confidence subscale. Of the five studies that examined Cognitive Maturity, only one reported a positive effect of P(j)BL (see Table 3). Five studies reported the subscale scores on the CCTST. Of these studies, two found positive effects of P(j)BL on Analysis or Evaluation, and one study on Inference. No studies reported positive effects on the Deduction subscale. For Induction, one study found a positive effect, and another reported a negative effect. Overall, mixed results were found on the subscale level (see Table 3).

Effects of P(j)BL on Higher-Order Thinking

Only two studies investigated the effects of P(j)BL on HOT (see Table 4): one in a PBL setting (Sugeng & Suryani, 2020) and the other in a PjBL setting (Sasson et al., 2018). We could not calculate effect sizes based on the data provided in the papers. Sugeng and Suryani (2020) compared a PBL group with a lecture-based group on HOT and lower-order thinking. The PBL group scored significantly higher on HOT, whereas the lecture-based group scored higher on lower-order thinking. Sasson et al. (2018) reported a positive effect for a 2-year PjBL program: HOT increased for the PjBL group, but not for the control group, from Measurement 1 (beginning of 9th grade) to Measurement 3 (end of 10th grade).

Conclusions and Implications

This systematic review focused on two questions: “How are HOT, CT, and CAT conceptualized in student-centered learning environments?” and “Can PBL and PjBL foster HOT, CT, and CAT?” We presented and discussed findings related to those questions in the preceding section. Here we offer a more global examination of the trends that emerged from our analysis of two popular forms of student-centered approaches to instruction, PBL and PjBL, and share lingering issues that should be explored in future research. However, we first address certain limitations of this systematic review that warrant consideration.

Limitations

As stated, several limitations emerged in this systematic review that have a bearing on the conclusions we proffer. For one, when we set out to conduct this research, our intention was to understand how higher-order, critical, and critical-analytic thinking were conceptualized in PBL and PjBL. However, we found it impossible to analyze the role of CAT because CAT was not part of any investigation in the context of P(j)BL. Further, the attention given to HOT was quite limited, with only two studies investigating it. It should also be mentioned that the included studies were unequally distributed across both the thinking measures and the learning environments: CAT appeared not to be embedded in the P(j)BL literature at all, and HOT appeared to a far smaller degree than CT.

Similarly, most studies in this review reported findings of a PBL environment, with a PjBL environment only investigated in two studies. Consequently, the conclusions we draw from our analysis rest primarily on empirical research on studies applying PBL and not PjBL approaches.

As mentioned, the CT measures used in this work often consisted of several subscales. For example, the WGCTA consists of the subscales inference, recognition of assumptions, deduction, interpretation, and evaluation of arguments. The presence of multiple indicators was a limitation because the effects can differ for these subcomponents, making global interpretation of effectiveness more difficult. However, we did not use those components as search terms in the literature search, and most studies did not report these subscales or define them. Future studies could use finer-grained search terms, including the thinking measures’ subscales or processes.

While the present study demonstrated positive effects of P(j)BL on HOT and CT, it remains unknown what exactly led to these positive effects, given that multiple links between P(j)BL and HOT and CT could be identified. Also, associations of HOT, CT, and CAT with performance were not investigated in the present study. More controlled experimental studies could shed light on these issues and help overcome the design issues associated with effect studies.

Finally, future research could relate HOT, CT, and CAT in P(j)BL environments to other learning processes, such as self-regulated learning (SRL) and self-directed learning (SDL). Components such as metacognition also play a prominent role in SRL and SDL processes. Future research could shed light on the relationships between thinking and regulating processes in the context of P(j)BL.

Research Question 1: CAT Is Not Embedded and HOT Not Frequently Studied in the P(j)BL Literature

Concerning conceptualizations (RQ1), we must first acknowledge the skewed distribution of studies over the three types of thinking (i.e., HOT, CT, and CAT). To start with, CAT was not part of any investigation in the context of P(j)BL, and HOT featured in only two studies, with the vast majority focusing on CT. When looking at the definition of CAT, its distinguishing feature compared to CT has been put forward as the focus on determining how appropriate and credible evidence is (Byrnes & Dunbar, 2014). Remarkably, this component is undoubtedly present in P(j)BL. After all, in PBL, when students work on the problem (which they first analyze based on their prior knowledge) or, more specifically, on the learning questions/issues for further self-directed study formulated during the PBL group discussion, they look for and study different literature resources (e.g., Loyens et al., 2012). During this knowledge acquisition process, they need to check whether different literature resources are in accordance with each other or whether dissimilarities can be detected. In case of dissimilarities, it is up to the student to decide, and later on, during the reporting phase, to discuss with the group how to deal with these dissimilarities. Why do different sources provide different answers to the learning questions/issues, and what does that say about the credibility of the sources themselves? In PjBL, students undergo the same process when dealing with conflicting information while working on their projects. It is important to note that these conflicting pieces of information are resolved through group discussion in P(j)BL; initially, however, they might cause some uncertainty regarding the learning process. Indeed, several scholars acknowledge learning uncertainty as a potential consequence of the open set-up of student-centered learning environments (Dahlgren & Dahlgren, 2002; Kivela & Kivela, 2005; Lloyd-Jones & Hak, 2004). Nevertheless, the four steps described by Alexander (2014) are present in P(j)BL. The absence of CAT studies therefore most likely reflects that the concept is not yet well known and embedded in the P(j)BL literature; given the presence of these steps, CAT would arguably even be a more accurate term than CT in this context.

Similarly, only two studies examined HOT, which could be explained by the fact that HOT is an umbrella term that includes CT (Schraw et al., 2011). That means that when researchers investigate HOT, they are also investigating CT. For the sake of conceptual clarity, however, we recommend examining concepts at the most detailed level.

More Focus on CT Processes Than CT Dispositions

Another finding was that in the P(j)BL literature, more focus lies on processes, referred to as skills or abilities, compared to dispositions of CT or combinations of CT processes and dispositions. This is not surprising as P(j)BL has been more focused on and related to several interpersonal and self-directed learning skills (Loyens et al., 2008 , 2012 ; Schmidt, 2000 ). What is more problematic is the incongruence between the definitions (CT dispositions and/or processes) and measurement instruments. From an educational point of view, processes seem to be the most natural to be fostered in education, although research has also demonstrated that learning environments can foster CT dispositions (Mathews & Lowe, 2011 ). Applied to P(j)BL environments, dispositions such as “inquisitiveness,” “open-mindedness,” “analyticity,” and “self-regulatory judgment” are certainly helpful. However, more empirical research is necessary to see whether and how P(j)BL environments can foster CT dispositions. The exact meaning of dispositions, which are frequently measured in this literature, is unclear. The measures, subscales, or items labeled as dispositions range from rather stable personality traits such as open-mindedness to more malleable individual differences factors such as prior knowledge. Some components included under dispositions also carry a strong cognitive character, such as analyticity.

Lack of Conceptual Clarity Troubles Measurements and Findings

Regarding the conceptualizations of HOT, CT, and CAT, we definitely ended up in muddy waters. The largest percentage of excluded articles was due to flawed conceptualizations (also for P(j)BL, which we explain below). These flawed conceptualizations produce a domino effect because conceptualizations (i.e., theoretical definitions) are determinative for measurements (Byrnes & Dunbar, 2014). In addition, we observed that conceptualizations were often, at best, operationalizations in which authors named specific (sub)processes without mentioning any theoretical grounding. The measurement tool used was often determinative for the inclusion of specific processes. However, we often observed a mismatch in that processes were mentioned that were not part of the measurement instrument, or we observed incongruence between definitions (e.g., processes) and measurement instruments (e.g., measuring dispositions).

Of course, attempts have been made to reach a consensus regarding conceptualizations. For example, the Delphi Report (Facione, 1990b) and the special issue on CAT (e.g., Alexander, 2014; Byrnes & Dunbar, 2014) tried to create conceptual clarity on CT and CAT, respectively. Consensus should, however, not be a goal in itself; HOT, CT, and CAT are such broad concepts that consensus is far from easy. It is more important to reflect on what the learning objectives of the learning environment (P(j)BL or other) are, determine whether fostering HOT, CT, and CAT is one of them, and then create an educational practice that is in line with these objectives (i.e., constructive alignment; Biggs, 1996). Depending on the learning objectives of a learning environment (i.e., the construction of flexible knowledge bases, the development of inquiry skills, or a tool for “learning how to learn”; Schmidt et al., 2009), one could emphasize specific (sub)CT processes and use different measurement instruments.

A consequence of these flawed conceptualizations is flawed measurements. Measurements were often problematic in terms of their psychometric properties. Another issue in this respect is that many CT measurements are commercial and not readily available.

In addition to issues with the conceptualizations of HOT, CT, and CAT, serious problems were also seen in the conceptualization of P(j)BL. Descriptions were absent, unclear, too broad and general (e.g., “active learning”), or indicative of a learning environment other than P(j)BL. This is not a new finding; PBL, for example, was identified as troublesome in terms of its definitions a long time ago (e.g., Lloyd-Jones et al., 1998). It is troubling, however, that this is still the case and that, as a result, we had to exclude many studies. On the other hand, we also noticed that more recent studies pay more attention to this issue (e.g., Lombardi et al., 2022).

Design Issues in Effect Studies of P(j)BL on HOT, CT, and CAT

While investigating the second research question of this study on the effects of P(j)BL on HOT, CT, and CAT, we made several observations. First, there is still a lack of controlled studies in this domain. The great majority of studies did not use a control group (note: those included in the meta-analysis did), making it impossible to determine the effects of interventions. Like the unclear conceptualizations of P(j)BL, this is not new and has been indicated as an issue before (Loyens et al., 2012 ).

Another observation was that in many studies, it was seen as an assumption/given that P(j)BL fosters CT, usually without any explanation. As explained in the introduction, fostering HOT, CT, or CAT skills is not mentioned as one of the goals of P(j)BL, despite the links that can be made. A priori stating that P(j)BL fosters CT is hence premature.

Finally, we noticed that in several studies, PBL was combined with other interventions (e.g., concept mapping). In those cases, the PBL group served as a control group and the PBL plus extra group as the experimental condition, making it impossible to establish the effects of PBL on HOT, CT, and CAT. Most studies used a pre-posttest design.

Research Question 2: Positive Effects of P(j)BL on HOT and CT

Overall, results showed positive effects of P(j)BL on CT and HOT (note that no studies on CAT were found to be included in this review), with scores increasing from pre- to posttest or the P(j)BL group obtaining higher scores than the control group. These findings imply that P(j)BL does carry elements that can foster CT and HOT. As mentioned above, the positive effects can be interpreted through both the cognitive and the instructional science literature. The literature on how to effectively teach HOT, CT, and CAT has identified several techniques that help foster these skills. Not surprisingly, these techniques, such as dealing with real cases/problems in class, encouraging (Socratic) class discussions/debate, fostering inquiry-oriented experiments, problem-solving, problem finding, brainstorming, decision making, and analysis (Abrami et al., 2015; Miri et al., 2007; Torff, 2011), are all linked to P(j)BL. Similarly, the literature on the learning processes involved in P(j)BL also mentions processes linked to thinking skills. For example, students working on their projects in PjBL use problem-solving, design, decision-making, argumentation, using and weighing different pieces of knowledge, explanation, investigation, and modeling (Krajcik et al., 2008). Similarly, students working on problems in PBL have group discussions about authentic problems, engage in evidence-seeking behavior, analyze the evidence, resolve unclarities, and decide on the outcome.

While the majority of findings revealed positive effects of P(j)BL on CT and HOT, several studies also reported negative or no effects. Given the wide variety of P(j)BL formats, these mixed findings might be due to implementation issues, but they can also be ascribed to design issues, as mentioned above. Further research needs to shed light on the null and negative findings.

As a final note, it should be mentioned that effects were found for CT, but the studies usually do not make claims about the finer-grained subprocesses. For example, four components of HOT have been identified (Schraw et al., 2011 ), while outcome measures are usually calculated at the “general” and not the subcomponent level. Nevertheless, the HOT component of metacognition is quite different from the HOT component of reasoning or problem-solving, which demonstrates the importance of constructive alignment (Biggs, 1996 ). Constructive alignment implies clearly defining the learning objectives of the learning environment (P(j)BL or other). When fostering HOT, CT, and CAT is one of the learning objectives, an educational practice should be developed that aligns with these objectives. To make claims about whether learning environments are effective in fostering HOT, CT, or CAT processes, one must first discover whether these processes are or can be part of the learning objectives of these learning environments.

Implications

Several implications for theory and practice follow from this review study. The first implication is that there is much room for improvement in terms of conceptual clarity. CAT has not been investigated, and HOT only sporadically, in the context of P(j)BL. Nevertheless, when looking at the respective definitions of these thinking processes, the “analytical” part of CAT is certainly present in P(j)BL environments when students weigh the evidence during the analysis of specific problems or projects (Alexander, 2014). In addition, since CT is considered a component of HOT, we propose investigating these constructs at the most detailed level. For practitioners looking for ways to foster HOT, CT, and CAT, it is important to know, and hence take into account, that definitions (and, subsequently, measurements) are ambiguous. Given that the results seem positive in terms of the capability of P(j)BL to foster HOT and CT, it is important to guard against throwing the baby out with the bathwater: exposure to problems and collaboration with fellow students seem beneficial for fostering thinking skills.

Secondly, the lack of studies investigating CAT in P(j)BL means that this form of thinking is not yet embedded in the P(j)BL research literature. Given the characteristics of the P(j)BL process, CAT processes should be an object of investigation in these learning environments to further advance the theoretical understanding of CAT in P(j)BL.

In sum, the present review study led to several conclusions regarding conceptualizations of HOT, CT, and CAT in the P(j)BL literature. First, CAT is not embedded, and HOT is not frequently studied in the P(j)BL literature. Second, more focus lies on CT skills compared to CT dispositions in the research literature. Next, a lack of clear conceptualizations of HOT, CT, and CAT complicates the measurements and findings. This lack of conceptual clarity carries into instruments and tools of assessment that are limited in number (and sometimes availability) and of questionable validity. The lack of conceptual clarity also extends to P(j)BL environments, where essential components of PBL and PjBL were not always articulated or addressed in studies claiming to implement these approaches. Design issues in effect studies add to these complications. Further, the reference to HOT or CT skills conflicts with the literature on what differentiates skills from more intentional and purposefully implemented processes (i.e., cognitive and metacognitive strategies). Finally, mainly positive effects were found of P(j)BL on HOT and CT.

Data Availability

The data that support the findings of this study are available from the corresponding author, S.L., upon reasonable request.

Footnote 1: Note that in the search terms for the student-centered learning environments, we did not use a hyphen, that is, “problem based learning” instead of “problem-based learning”.

References

References preceded by an * were included in the review.

Abrami, P. C., Bernard, R. M., Borokhovski, E., Waddington, D. I., Wade, C. A., & Persson, T. (2015). Strategies for teaching students to think critically: A meta-analysis. Review of Educational Research, 85 (2), 275–314. https://doi.org/10.3102/0034654314551063


Alexander, P. A. (2014). Thinking critically and analytically about critical-analytic thinking. Educational Psychology Review, 26 , 469–476. https://doi.org/10.1007/s10648-014-9283-1

Authors. (2022). [Blinded for review].

Barrows, H. S. (1996). Problem-based learning in medicine and beyond: A brief overview. In L. Wilkerson & W. H. Gijselaers (Eds.), New Directions in Teaching and Learning: Issue 68. Bringing problem-based learning to higher education: Theory and practice (pp. 3–12). Jossey-Bass. https://doi.org/10.1002/tl.37219966804

Bataineh, R. F., & Zghoul, L. H. (2006). Jordanian TEFL graduate students’ use of critical thinking skills (as measured by the Cornell Critical Thinking Test, Level Z). International Journal of Bilingual Education and Bilingualism, 9 (1), 33–50. https://doi.org/10.1080/13670050608668629

Bezanilla, M. J., Galindo-Domínguez, H., & Poblete, M. (2021). Importance of teaching critical thinking in higher education and existing difficulties according to teacher’s views. REMIE-Multidisciplinary Journal of Educational Research, 11(1), 20–48. https://doi.org/10.4471/remie.2021.6159

Biggs, J. B. (1996). Enhancing teaching through constructive alignment. Higher Education, 32 , 347–364. https://doi.org/10.1007/BF00138871

Bloom, B. S., Englehart, M. D., Furst, E. J., Hill, W. H., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: The classification of educational goals. Handbook I: Cognitive domain. David McKay Co.

Blumenfeld, P. C., Soloway, E., Marx, R. W., Krajcik, J. S., Guzdial, M., & Palincsar, A. (1991). Motivating project-based learning: Sustaining the doing, supporting the learning. Educational Psychologist, 26 (3–4), 369–398. https://doi.org/10.1080/00461520.1991.9653139

Borenstein, M., Hedges, L. V., Higgins, J. P. T., & Rothstein, H. R. (2009). Introduction to meta-analysis . Wiley.


*Burris, S., & Garton, B. L. (2007). Effect of instructional strategy on critical thinking and content knowledge: Using problem-based learning in the secondary classroom. Journal of Agricultural Education, 48(1), 106–116. https://files.eric.ed.gov/fulltext/EJ840072.pdf

Butler, H. A., & Halpern, D. F. (2020). Critical thinking impacts our everyday lives. In R. J. Sternberg & D. F. Halpern (Eds.), Critical thinking in psychology (2nd ed., pp. 152–172). Cambridge University Press. https://doi.org/10.1017/9781108684354.008

Byrnes, J. P., & Dunbar, K. N. (2014). The nature and development of critical analytic thinking. Educational Psychology Review, 26 , 477–493. https://doi.org/10.1007/s10648-014-9284-0

Caldwell, D. J., Caldwell, D. Y., Mcelroy, A. P., Manning, J. G., & Hargis, B. M. (1998). BASP-induced suppression of mitogenesis in chicken, rat and human PBL. Developmental & Comparative Immunology, 22 (5–6), 613–620. https://doi.org/10.1016/S0145-305X(98)00037-8

*Carriger, M. S. (2016). What is the best way to develop new managers? Problem-based learning vs. lecture-based instruction. The International Journal of Management Education, 14(2), 92–101. https://doi.org/10.1016/j.ijme.2016.02.003

Castle, A. (2006). Assessment of the critical thinking skills of student radiographers. Radiography, 12 (2), 88–95. https://doi.org/10.1016/j.radi.2005.03.004

Chakravorty, S. S., Hales, D. N., & Herbert, J. I. (2008). How problem-solving really works. International Journal of Data Analysis Techniques and Strategies, 1 (1), 44–59. https://doi.org/10.1504/IJDATS.2008.020022

Chen, C., & Yang, Y. (2019). Revisiting the effects of project-based learning on students’ academic achievement: A meta-analysis investigating moderators. Educational Research Review, 26 , 71–81. https://doi.org/10.1016/j.edurev.2018.11.001

Chiu, C.-F. (2020). Facilitating K-12 teachers in creating apps by visual programming and project-based learning. International Journal of Engineering Technology in Learning, 15(1), 103–118. https://www.learntechlib.org/p/217066/

*Choi, E., Lindquist, R., & Song, Y. (2014). Effects of problem-based learning vs. traditional lecture on Korean nursing students’ critical thinking, problem-solving, and self-directed learning. Nurse Education Today, 34(1), 52–56. https://doi.org/10.1016/j.nedt.2013.02.012

*Cortázar, C., Nussbaum, M., Harcah, J., Alvares, J., López, F., Goñi, J., & Cabezas, V. (2021). Promoting critical thinking in an online, project-based course. Computers in Human Behavior, 119, Article 106705. https://doi.org/10.1016/j.chb.2021.106705

*da Costa Carbogim, F., Barbosa, A. C. S., de Oliviera, L. B., De Sá Diaz, F. B. B., Toledo, L. V., Alves, K. R., de Castro Friedrich, D. B., Luiz, F. S., & de Araújo Püschel, V. A. (2018). Educational intervention to improve critical thinking for undergraduate nursing students: A randomized clinical trial. Nurse Education in Practice, 33 , 121–126. https://doi.org/10.1016/j.nepr.2018.10.001

Dahlgren, M. A., & Dahlgren, L. O. (2002). Portraits of PBL: Students’ experiences of the characteristics of problem-based learning in physiotherapy, computer engineering and psychology. Instructional Science, 30 , 111–127. https://doi.org/10.1023/A:1014819418051

*Dakabesi, D., & Luoise, I. S. Y. (2019). The effectiveness of problem-based learning model to increase the students’ critical thinking skills. Journal of Education and Learning, 13(4), 543–549. https://doi.org/10.11591/edulearn.v13i4.12940

David, J. L. (2008). Project-based learning. Educational Leadership: Journal of the Department of Supervision and Curriculum Development, 65 (5), 80–82.


*Dilek Eren, C., & Akinoglu, O. (2013). Effect of problem-based learning (PBL) on critical thinking disposition in science education. Journal of Environmental Protection and Ecology, 14 (3A), 1353–1361.

*Ding, X.-W. (2016). The effect of WeChat-assisted problem-based learning on the critical thinking disposition of EFL learners. International Journal of Emerging Technology in Learning, 11 (12), 23–29. https://doi.org/10.3991/ijet.v11i12.5927

Dinsmore, D. L., & Fryer, L. (2022). The dynamic interface between cognitive and metacognitive processing and higher-order, critical, or critical-analytic thinking [Manuscript Submitted for Publication] . University of North Florida.

Dolmans, D., Loyens, S. M. M., Marcq, H., & Gijbels, D. (2016). Deep and surface learning in problem-based learning: A review of the literature. Advances in Health Sciences Education, 21 (5), 1087–1112. https://doi.org/10.1007/s10459-015-9645-6

Duncan, T. G., & McKeachie, W. J. (2005). The making of the Motivated Strategies for Learning Questionnaire. Educational Psychologist, 40 (2), 117–128. https://doi.org/10.1207/s15326985ep4002_6

Duval, S. J., & Tweedie, R. L. (2000). A nonparametric “trim and fill” method of accounting for publication bias in meta-analysis. Journal of the American Statistical Association, 95 (449), 89–98. https://doi.org/10.2307/2669529

Egger, M., Smith, G. D., Schneider, M., & Minder, C. (1997). Bias in meta-analysis detected by a simple, graphical test. British Medical Journal, 315 (7109), 629–634. https://doi.org/10.1136/bmj.315.7109.629

Ennis, R. H. (1993). Critical thinking assessment. Theory Into Practice, 32(3), 179–186. https://www.jstor.org/stable/1476699

Ennis, R. H. (1962). A concept of critical thinking. Harvard Educational Review, 32 , 81–111.

Ennis, R. H. (1987). A taxonomy of critical thinking dispositions and abilities. In J. B. Baron & R. J. Sternberg (Eds.), Teaching thinking skills: Theory and practice (pp. 9–26). W H Freeman/Times Books/ Henry Holt & Co.

Ennis, R. (2011). Critical thinking: Reflection and perspective part I. Inquiry: Critical Thinking Across the Disciplines, 26 (1), 4–18. https://doi.org/10.5840/inquiryctnews20112613

Facione, P. A. (1990a). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction. California State University. https://eric.ed.gov/?id=ED315423

Facione, P. A. (1991). Using the California Critical Thinking Skills Test in research, evaluation, and assessment. California Academic Press. https://eric.ed.gov/?id=ED337498

Facione, P. A., Facione, N. C., & Giancarlo, C. A. (2000). The disposition toward critical thinking: Its character, measurement, and relationship to critical thinking skill. Informal Logic, 20(1), 61–84. https://doi.org/10.22329/il.v20i1.2254

Facione, P. A. (1990b). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction . The Delphi Report. California Academic Press.

*Fitriani, A., Zubaidah, S., Susilo, H., & Al Muhdhar, M. H. I. (2020). PBLPOE: A learning model to enhance students’ critical thinking skills and scientific attitudes. International Journal of Instruction, 13(2), 89–106. https://doi.org/10.29333/iji.2020.1327a

*Gholami, M., Moghadam, P. K., Mohammadipoor, F., Tarahi, M. J., Sak, M., Toulabi, T., & Pour, A. H. H. (2016). Comparing the effects of problem-based learning and the traditional lecture method on critical thinking skills and metacognitive awareness in nursing students in a critical care nursing course. Nurse Education Today, 45 , 16–21. https://doi.org/10.1016/j.nedt.2016.06.007

Gijbels, D., Dochy, F., Van den Bossche, P., & Segers, M. (2005). Effects of problem-based learning: A meta-analysis from the angle of assessment. Review of Educational Research, 75 (1), 27–61. https://doi.org/10.3102/00346543075001027

Glaser, E. M. (1941). An experiment in the development of critical thinking. Teachers College Record, 43 (5), 1–18. https://doi.org/10.1177/016146814204300507

Grant, M. M., & Branch, R. M. (2005). Project-based learning in a middle school: Tracing abilities through the artifacts of learning. Journal of Research on Technology in Education, 38 (1), 65–98. https://doi.org/10.1080/15391523.2005.10782450

Halpern, D. F. (2014). Thought and knowledge: An introduction to critical thinking (5th ed.). Psychology Press.

*Hassanpour Dehkordi, A., & Heydarnejad, M. S. (2008). The effects of problem-based learning and lecturing on the development of Iranian nursing students’ critical thinking. Pakistan Journal of Medical Sciences, 24(5), 740–743. http://pjms.com.pk/issues/octdec108/article/article19.html

Helle, L., Tynjälä, P., & Olkinuora, E. (2006). Project-based learning in secondary education – theory, practice and rubber sling slots. Higher Education, 51 , 287–314. https://doi.org/10.1007/s10734-004-6386-5

Hmelo-Silver, C. E. (2004). Problem-based learning: What and how do students learn? Educational Psychology Review, 16 , 235–266. https://doi.org/10.1023/B:EDPR.0000034022.16470.f3

Hung, W., Jonassen, D. H., & Liu, R. (2008). Problem-based learning. In J. M. Spector, M. D. Merrill, J. Van Merriënboer, & M. P. Driscoll (Eds.), Handbook of research on educational communications and technology (pp. 485–506). https://doi.org/10.4324/9780203880869.ch38

Hung, T.-M., Tang, L.-C., & Ko, C.-J. (2015). How mental health nurses improve their critical thinking through problem-based learning. Journal for Nurses in Professional Development, 31 (3), 170–175. https://doi.org/10.1097/NND.0000000000000167

Hung, W., Dolmans, D. H., & Van Merriënboer, J. J. (2019). A review to identify key perspectives in PBL meta-analyses and reviews: Trends, gaps and future research directions. Advances in Health Sciences Education, 24 , 943–957. https://doi.org/10.1007/s10459-019-09945-x

*Iwaoka, W. T., Li, Y., & Rhee, W. Y. (2010). Measuring gains in critical thinking in food science and human nutrition courses: The Cornell Critical Thinking Test, problem-based learning activities, and student journal entries. Journal of Food Science Education, 9 (3), 68–75. https://doi.org/10.1111/j.1541-4329.2010.00100.x

Kivela, J., & Kivela, R. J. (2005). Student perceptions of an embedded problem-based learning instructional approach in a hospitality undergraduate programme. International Journal of Hospitality Management, 24 , 437–464. https://doi.org/10.1016/j.ijhm.2004.09.007

Koh, G. C. H., Khoo, H. E., Wong, M. L., & Koh, D. (2008). The effects of problem-based learning during medical school on physician competency: A systematic review. Canadian Medical Association Journal, 178 (1), 34–41. https://doi.org/10.1503/cmaj.070565

Krajcik, J. (2015). Project-based science. Science Teacher, 82(1), 25–27. https://doi.org/10.2505/4/tst15_082_01_25

Krajcik, J., McNeill, K. L., & Reiser, B. J. (2008). Learning-goals-driven design model: Developing curriculum materials that align with national standards and incorporate project-based pedagogy. Science Education, 92 (1), 1–32. https://doi.org/10.1002/sce.20240

Krathwohl, D. R. (2002). A revision of Bloom’s taxonomy: An overview. Theory Into Practice, 41 (4), 212–264. https://doi.org/10.1207/s15430421tip4104_2

Larmer, J., Mergendoller, J., & Boss, S. (2015). Gold standard PBL: Essential project design elements. https://www.pblworks.org/blog/gold-standard-pbl-essential-project-design-elements

Leach, S. M., Immekus, J. C., French, B. F., & Hand, B. (2020). The factorial validity of the Cornell Critical Thinking Tests: A multi-analytical approach. Thinking Skills and Creativity, 37 , 100676. https://doi.org/10.1016/j.tsc.2020.100676

Lloyd-Jones, G., & Hak, T. (2004). Self-directed learning and student pragmatism. Advances in Health Sciences Education, 9 , 61–73. https://doi.org/10.1023/B:AHSE.0000012228.72071.1e

Lloyd-Jones, G., Margetson, D., & Bligh, J. G. (1998). Problem-based learning: A coat of many colours. Medical Education, 32 , 492–494. https://doi.org/10.1046/j.1365-2923.1998.00248.x

Lombardi, D., Shipley, T. F., Astronomy Team, Biology Team, Chemistry Team, Engineering Team, Geography Team, Geoscience Team, & Physics Team. (2021). The curious construct of active learning. Psychological Science in the Public Interest, 22(1), 8–43. https://doi.org/10.1177/1529100620973974

Lombardi, D. (2022). On the horizon: The promise and power of higher-order, critical, and critical-analytical thinking [Manuscript Submitted for Publication]. University of Maryland.

Lombardi, D., Matewos, M. M., Jaffe, J., Zohery, V., Mohan, S., Bock, K., & Jamani, S. (2022). Discourse and agency during scaffolded middle school science instruction . Advance online publication. https://doi.org/10.1080/0163853X.2022.2068317

Loyens, S. M. M., Kirschner, P. A., & Paas, F. (2012). Problem-based learning. In K. R. Harris, S. Graham, T. Urdan, A. G. Bus, S. Major, & H. L. Swanson (Eds.), APA educational psychology handbook, Vol. 3. Application to learning and teaching (pp. 403–425). American Psychological Association. https://doi.org/10.1037/13275-016

Loyens, S. M. M., & Rikers, R. M. J. P. (2017). Instruction based on inquiry. In R. E. Mayer & P. A. Alexander (Eds.), Handbook of research on learning and instruction (2nd ed., pp. 405–431). Routledge.

Loyens, S. M. M., Magda, J., & Rikers, R. M. J. P. (2008). Self-directed learning in problem-based learning and its relationships with self-regulated learning. Educational Psychology Review, 20 , 411–427. https://doi.org/10.1007/s10648-008-9082-7

*Lyons, E. M. (2008). Examining the effects of problem-based learning and NCLEX-RN scores on the critical thinking skills of associate degree nursing students in a Southeastern community college. International Journal of Nursing Education Scholarship, 5 (1), 21. https://doi.org/10.2202/1548-923X.1524

Macpherson, K., & Owen, C. (2010). Assessment of critical thinking ability in medical students. Assessment of Evaluation in Higher Education, 35 (1), 45–54. https://doi.org/10.1080/02602930802475471

Mathews, S. R., & Lowe, K. (2011). Classroom environments that foster a disposition for critical thinking. Learning Environments Research, 14 (1), 59–73. https://doi.org/10.1007/s10984-011-9082-2

Maudsley, G. (1999). Do we all mean the same thing by “problem-based learning”? A review of the concepts and a formulation of the ground rules. Academic Medicine, 74 (2), 178–185. https://doi.org/10.1097/00001888-199902000-00016

McNeill, K. L., & Krajcik, J. S. (2011). Supporting Grade 5–8 students in constructing explanations in science: The claim, evidence, and reasoning framework for talk and writing. Pearson.

Miri, B., David, B. C., & Uri, Z. (2007). Purposely teaching for the promotion of higher-order thinking skills: A case of critical thinking. Research in Science Education, 37 , 353–369. https://doi.org/10.1007/s11165-006-9029-2

Moher, D., Shamseer, L., Clarke, M., Ghersi, D., Liberati, A., Petticrew, M., Shekelle, P., Stewart, L. A., & PRISMA-P Group. (2015). Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015 statement. Systematic Reviews, 4, 1. https://doi.org/10.1186/2046-4053-4-1

*Muehlenkamp, J. J., Weiss, N., & Hansen, M. (2015). Problem-based learning for introductory psychology: Preliminary supporting evidence. Scholarship of Teaching and Learning in Psychology, 1 (2), 125–136. https://doi.org/10.1037/stl0000027

Murphy, P. K., Rowe, M. L., Ramani, G., & Silverman, R. (2014). Promoting critical-analytic thinking in children and adolescents at home and in school. Educational Psychology Review, 26 , 561–578. https://doi.org/10.1007/s10648-014-9281-3

Nagarajan, S., & Overton, T. (2019). Promoting systems thinking using project- and problem-based learning. Journal of Chemical Education, 96 (12), 2901–2909. https://doi.org/10.1021/acs.jchemed.9b00358

Norman, G. R., & Schmidt, H. G. (2000). Effectiveness of problem-based learning curricula: Theory, practice and paper darts. Medical Education, 34 (9), 721–728. https://doi.org/10.1046/j.1365-2923.2000.00749.x

Olivares, S., Saiz, C., & Rivas, S. F. (2013). Encouragement for thinking critically. Electronic Journal of Research in Education, 11 (2), 367–394. https://doi.org/10.14204/ejrep.30.12168

Oliver, P. (2012). Succeeding with your literature review: A handbook for students. McGraw Hill.

Ouzzani, M., Hammady, H., Fedorowicz, Z., & Elmagarmid, A. (2016). Rayyan: A web and mobile app for systematic reviews. Systematic Reviews, 5 , 210. https://doi.org/10.1186/s13643-016-0384-4

*Ozturk, C., Muslu, G. K., & Dicle, A. (2008). A comparison of problem-based and traditional education on nursing students’ critical thinking disposition. Nurse Education Today, 28 (5), 627–632. https://doi.org/10.1016/j.nedt.2007.10.001

*Pardamean, B. (2012). Measuring change in critical thinking skills of dental students educated in a PBL curriculum. Journal of Dental Education, 76 (4), 443–453. https://doi.org/10.1002/j.0022-0337.2012.76.4.tb05276.x

Pellegrino, J. W., & Hilton, M. L. (Eds.) (2012). Education for life and work: Developing transferable knowledge and skills in the 21st Century. National Academies Press. http://www.nap.edu/catalog.php?record_id=13398

Razali, S. N., Noor, H. A. M., Ahmad, M. H., & Shahbodin, F. (2017). Enhanced student soft skills through integrated online project based collaborative learning. International Journal of Advanced and Applied Sciences, 4 (3), 59–67. https://doi.org/10.21833/ijaas.2017.03.010

*Rehmat, A. P., & Hartley, K. (2020). Building engineering awareness: Problem-based learning approach for STEM integration. Interdisciplinary Journal of Problem-Based Learning, 14(1). https://doi.org/10.14434/ijpbl.v14i1.28636

Saad, A., & Zainudin, S. (2022). A review of project-based learning (PBL) and computational thinking (CT) in teaching and learning. Learning and Motivation, 78 , 101802. https://doi.org/10.1016/j.lmot.2022.101802

*Santiprastikul, S., Sithivong, K., & Polnueangma, O. (2013). The first year nursing students’ achievement and critical thinking in local wisdom course using problem based learning process. Wireless Personal Communications, 69 , 1077–1085. https://doi.org/10.1007/s11277-013-1067-2

*Saputro, A. D., Atun, S., Wilujeng, I., Ariyanto, A., & Arifin, S. (2020). Enhancing pre-service elementary teachers’ self-efficacy and critical thinking using problem-based learning. European Journal of Educational Research, 9(2), 765–773. https://doi.org/10.12973/eu-jer.9.2.765

*Sasson, I., Yehuda, I., & Malkinson, N. (2018). Fostering the skills of critical thinking and question-posing in a project-based learning environment. Thinking Skills and Creativity, 29 , 203–212. https://doi.org/10.1016/j.tsc.2018.08.001

Sawyer, R. K. (Ed.). (2014). The Cambridge Handbook of the Learning Sciences. Cambridge University Press.

Schmidt, H. G. (2000). Assumptions underlying self-directed learning may be false. Medical Education, 34 (4), 243–245. https://doi.org/10.1046/j.1365-2923.2000.0656a.x

Schmidt, H. G., Loyens, S. M., Van Gog, T., & Paas, F. (2007). Problem-based learning is compatible with human cognitive architecture: Commentary on Kirschner, Sweller, and Clark (2006). Educational Psychologist, 42(2), 91–97. https://doi.org/10.1080/00461520701263350

Schmidt, H. G., Van der Molen, H. T., Te Winkel, W. W. R., & Wijnen, W. H. F. W. (2009). Constructivist, problem-based learning does work: A meta-analysis of curricular comparisons involving a single medical school. Educational Psychologist, 44 (4), 227–249. https://doi.org/10.1080/00461520903213592

Schraw, G., McCrudden, M. T., Lehman, S., & Hoffman, B. (2011). An overview of thinking skills. In G. Schraw & D. H. Robinson (Eds.), Assessment of higher order thinking skills (pp. 19–45). Information Age Publishing.

Schraw, G., & Robinson, D. H. (2011). Conceptualizing and assessing higher order thinking skills. In G. Schraw & D. H. Robinson (Eds.), Assessment of higher order thinking skills (pp. 1–15). Information Age Publishing.

*Şendağ, S., & Odabaşı, H. F. (2009). Effects of an online problem based learning course on content knowledge acquisition and critical thinking skills. Computers & Education, 53 (1), 132–141. https://doi.org/10.1016/j.compedu.2009.01.008

Servant-Miklos, V. F. C. (2019). Fifty years on: A retrospective on the world’s first problem-based learning programme at McMaster University Medical School. Health Professions Education, 5 (1), 3–12. https://doi.org/10.1016/j.hpe.2018.04.002

Si, J., Kong, H., & Lee, S. (2019). Developing clinical reasoning skills through argumentation with the concept map method in medical problem-based learning. Interdisciplinary Journal of Problem-Based Learning, 13 (1), 5. https://doi.org/10.7771/1541-5015.1776

*Siew, N. M., & Mapeala, R. (2016). The effects of problem-based learning with thinking maps on fifth graders’ science critical thinking. Journal of Baltic Science Education, 15(5), 602–616. https://www.scientiasocialis.lt/jbse/?q=node/527

Smits, P. B. A., Verbeek, J. H. A. M., & de Buisonjé, C. D. (2002). Problem based learning in continuing medical education: A review of controlled evaluation studies. British Medical Journal, 324 , 153–156. https://doi.org/10.1136/bmj.324.7330.153

Spaulding, W. B. (1969). The undergraduate medical curriculum (1969 model): McMaster University. Canadian Medical Association Journal, 100 , 659–664.

Strobel, J., & Van Barneveld, A. (2009). When is PBL more effective? A meta-synthesis of meta-analyses comparing PBL to conventional classrooms. Interdisciplinary Journal of Problem-Based Learning, 3 , 44–58. https://doi.org/10.7771/1541-5015.1046

*Sugeng, B., & Suryani, A. W. (2020). Enhancing the learning performance of passive learners in a financial management class using problem-based learning. Journal of University Teaching & Learning Practice, 17(1), Article 5. https://doi.org/10.53761/1.17.1.5

Sungur, S., & Tekkaya, C. (2006). Effects of problem-based learning and traditional instruction on self-regulated learning. The Journal of Educational Research, 99 (5), 307–317. https://doi.org/10.3200/JOER.99.5.307-320

Tal, T., Krajcik, J. S., & Blumenfeld, P. C. (2006). Urban schools’ teachers enacting project-based science. Journal of Research in Science Teaching, 43 (7), 722–745. https://doi.org/10.1002/tea.20102

*Temel, S. (2014). The effects of problem-based learning on pre-service teachers’ critical thinking dispositions and perceptions of problem-solving ability. South African Journal of Education, 34(1), Article 769. https://hdl.handle.net/10520/EJC148686

Thomas, J. W. (2000). A review of research on project-based learning . http://www.bie.org/files/researchreviewPBL.pdf .

*Tiwari, A., Lai, P., So, M., & Yuen, K. (2006). A comparison of the effects of problem-based learning and lecturing on the development of students’ critical thinking. Medical Education, 40 (6), 547–554. https://doi.org/10.1111/j.1365-2929.2006.02481.x

Torff, B. (2011). Critical thinking in the classroom: Teachers’ beliefs and practices in instruction and assessment. In G. Schraw & D. H. Robinson (Eds.), Assessment of higher order thinking skills (pp. 361–394). Information Age Publishing.

Van Peppen, L. M. (2020). Fostering critical thinking: Generative processing strategies to avoid bias in reasoning [Doctoral dissertation, Erasmus University Rotterdam]. RePub. hdl.handle.net/1765/130461

Watson, G., & Glaser, E. M. (1980). Watson-Glaser Critical Thinking Appraisal: Forms A and B manual. The Psychological Corporation.

Watson, G., & Glaser, E. M. (2009). Watson–Glaser TM II Critical Thinking Appraisal: Technical manual and user’s guide. Pearson.

Watson, G., & Glaser, E. M. (1994). Watson-Glaser Critical Thinking Appraisal . The Psychological Corporation.

Wijnia, L., Loyens, S. M. M., & Rikers, R. M. J. P. (2019). The problem-based learning process: An overview of different models. In M. Moallem, W. Hung, and N. Dabbagh (Eds.), The Wiley Handbook of problem-based learning (pp. 273–295). John Wiley & Sons. https://doi.org/10.1002/9781119173243.ch12

Yeh, M.-L. (2002). Assessing the reliability and validity of the Chinese version of the California Critical Thinking Disposition Inventory. International Journal of Nursing Studies, 39 (2), 123–132. https://doi.org/10.1016/S0020-7489(01)00019-0

Yen, T. S., & Halili, S. H. (2015). Effective teaching of higher-order thinking (HOT) in education. The Online Journal of Distance Education and e-Learning, 3(2), 41–47. https://tojdel.net/journals/tojdel/articles/v03i02/v03i02-04.pdf

Yetkiner, Z. E., Anderoglu, H., & Capraro, R. M. (2008). Research summary: Project-based learning in middle grades mathematics . http://www.nmsa.org/Research/ResearchSummaries/ProjectBasedLearninginMath/tabid/1570/Default.aspx .

*Yu, W.-C. W., Lin, C. C., Ho, M.-H., & Wang, J. (2015). Technology facilitated by PBL pedagogy and its impact on nursing students’ academic achievement and critical thinking dispositions. Turkish Online Journal of Educational Technology, 14(1), 97–107. https://eric.ed.gov/?id=EJ1057343

*Yu, D., Zhang, Y., Xu, Y., Wu, J., & Wang, C. (2013). Improvement in critical thinking dispositions of undergraduate nursing students through problem-based learning: A crossover-experimental study. Journal of Nursing Education, 52 (10), 574–581. https://doi.org/10.3928/01484834-20130924-02

*Yuan, H., Kunaviktikul, W., Klunklin, A., & Williams, B. A. (2008b). Promoting critical thinking skills through problem-based learning. Chiang Mai University Journal of Social Science and Humanities, 2(2), 85–99. https://www.thaiscience.info/journals/Article/CMUS/10613690.pdf

Yuan, H., Kunaviktikul, W., Klunklin, A., & Williams, B. A. (2008a). Improvement of nursing students’ critical thinking skills through problem-based learning in the People’s Republic of China: A quasi-experimental study. Nursing and Health Sciences, 10 , 70–76. https://doi.org/10.1111/j.1442-2018.2007.00373.x

Zabit, M. N. M. (2010). Problem-based learning on students critical thinking skills in teaching business education in Malaysia: A literature review. American Journal of Business Education, 3 (6), 19–32. https://doi.org/10.19030/ajbe.v3i6.436

Zimmerman, B. J., & Moylan, A. R. (2009). Self-regulation: Where metacognition and motivation intersect. In Handbook of Metacognition in Education (pp. 299–315). Routledge.


Author information

Authors and Affiliations

University College Roosevelt, Utrecht University, Utrecht, The Netherlands

Sofie M. M. Loyens

Department of Human Development & Quantitative Methodology, College of Education, University of Maryland, College Park, MD, USA

Julianne E. van Meerten

HU University of Applied Sciences, Utrecht, The Netherlands

Lydia Schaap

Faculty of Educational Sciences, Open Universiteit, Heerlen, The Netherlands

Lisette Wijnia


Corresponding author

Correspondence to Sofie M. M. Loyens.

Ethics declarations

Conflict of Interest

All authors declare that they have no conflicts of interest to disclose.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .


About this article

Loyens, S.M.M., van Meerten, J.E., Schaap, L. et al. Situating Higher-Order, Critical, and Critical-Analytic Thinking in Problem- and Project-Based Learning Environments: A Systematic Review. Educ Psychol Rev 35, 39 (2023). https://doi.org/10.1007/s10648-023-09757-x


Accepted: 23 February 2023

Published: 21 March 2023

DOI: https://doi.org/10.1007/s10648-023-09757-x


Keywords

  • Problem-based learning
  • Project-based learning
  • Higher-order thinking
  • Critical thinking
  • Critical-analytic thinking

The Peak Performance Center


The pursuit of performance excellence

Thinking Skills

Thinking skills are the mental activities you use to process information, make connections, make decisions, and create new ideas. You use your thinking skills when you try to make sense of experiences, solve problems, make decisions, ask questions, make plans, or organize information.

Everybody has thinking skills, but not everyone uses them effectively. Effective thinking skills are developed over a period of time. Good thinkers see possibilities where others see only obstacles or roadblocks. Good thinkers are able to make connections between various factors and tie them together. They are also able to develop new and unique solutions to problems.

Thinking refers to the process of creating a logical series of connective facets between items of information. Oftentimes, thinking just happens automatically. However, there are times when you think consciously, such as when working out how to solve a problem or make a decision. Thinking enables you to connect and integrate new experiences into your existing understanding and perception of how things are.

The simplest thinking skills are learning facts and recall, while higher-order skills include analysis, synthesis, problem solving, and evaluation.

Levels of Thinking

Core Thinking Skills

Thinking skills are cognitive operations or processes that are the building blocks of thinking. There are several core thinking skills including focusing, organizing, analyzing, evaluating and generating.

Focusing  – attending to selected pieces of information while ignoring other stimuli.

Remembering  – storing and then retrieving information.  

Gathering – bringing to the conscious mind the relevant information needed for cognitive processing.

Organizing  – arranging information so it can be used more effectively.

Analyzing  – breaking down information by examining parts and relationships so that its organizational structure may be understood.  

Connecting – making connections between related items or pieces of information.

Integrating  – connecting and combining information to better understand the relationship between the information.

Compiling – putting parts together to form a whole or building a structure or pattern from diverse elements.

Evaluating – assessing the reasonableness and quality of ideas or materials in order to present and defend opinions.

Generating  – producing new information, ideas, products, or ways of viewing things.


Classifications and Types of Thinking

Convergent or Analytical Thinking: Bringing facts and data together from various sources and then applying logic and knowledge to solve problems or to make informed decisions.

Divergent Thinking: Breaking a topic apart to explore its various components and then generating new ideas and solutions.

Critical Thinking: Analysis and evaluation of information, beliefs, or knowledge.

Creative Thinking: Generation of new ideas breaking from established thoughts, theories, rules, and procedures.

Metacognition

Thinking about thinking is called metacognition. It is a higher-order thinking skill that enables understanding, analysis, and control of your cognitive processes. It can involve planning, monitoring, assessing, and evaluating your use of your cognitive skills.

In its simplest form, convergent thinking or deductive reasoning looks inward to find a solution, while divergent or creative thinking looks outward for a solution.

Both thinking skills are essential for school and life.  Both require critical thinking skills to be effective.  Both are used for solving problems, doing projects and achieving objectives.  However, much of the thinking in formal education focuses on the convergent analytical thinking skills such as following or making a logical argument, eliminating the incorrect paths and then figuring out the single correct answer. 

Standardized tests such as IQ tests measure only convergent thinking. Pattern recognition, logical thought flow, and the ability to solve problems with a single answer can all be tested and graded. Divergent or creative thinking, although an extremely valuable skill, cannot be measured accurately by existing tests.


Critical Thinking Skills

Divergent and Convergent thinking skills are both “critical thinking” skills. 

Critical thinking refers to the process of actively analyzing, synthesizing, and/or evaluating and reflecting on information gathered from observation, experience, or communication, and is focused on deciding what to believe or do. Critical thinking is considered a higher-order thinking skill and draws on processes such as analysis, synthesis, problem solving, inference, and evaluation.

The concept of higher order thinking skills became well known with the publication of Bloom’s taxonomy of educational objectives.  Bloom’s Taxonomy was primarily created for academic education; however, it is relevant to all types of learning. 

Oftentimes when people are solving problems or making decisions, they flip back and forth between convergent and divergent thinking. When first looking at a problem, people often analyze the facts and circumstances to determine the root cause. After that, they explore new and innovative options through divergent thinking, then switch back to convergent thinking to narrow those options down to one practical choice.

Author:  James Kelly, September 2011


