Critical Thinking for Engineers

Engineers are specialists in technical information. As problems grow more complex, engineers increasingly need to apply critical thinking to problem solving. This article demonstrates the value and use of developing abstract thought in engineering, especially for students.

Introduction

In school, the most widely used, or at least the most reputable, method for solving problems is “Critical Thinking.” From understanding the works of a long-dead philosopher to solving differential equations, “Critical Thinking” is treated as a kind of intellectual panacea. Although everyone can agree that “Critical Thinking” is usually a good thing, it is difficult to explain exactly what it is and even more difficult to teach.

For most engineers, problem solving is essentially their profession. Critical thinking and abstract thought, then, are invaluable tools that complement an engineer’s technical expertise. In this paper, our first goal is to define what exactly critical thinking is. From there, we will discuss examples that highlight the importance of abstract thought, as well as efforts to teach it in the classroom. Finally, we will look at how this can be applied to our Senior Project and perhaps to future work in general.

To begin, we will look at two definitions of critical thinking. In her 2002 article, Jessop argues that critical thinking comprises three major skills: analysis, synthesis, and evaluation. She goes on to quote Scriven (n.d.) to define the term more explicitly:

Critical Thinking is the intellectually disciplined process of actively and skillfully conceptualizing, applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action. (as quoted in Jessop, 2002, p. 141)

Analysis is breaking down the problem into parts and finding the relationships between them. Synthesis is thinking about other ways to solve the problem either by incorporating new information or combining the parts in a different way. Finally, evaluation is making a judgment about the results using the evidence at hand.

According to Scriven (n.d.), then, critical thinking is the combined process of analysis, synthesis, and evaluation. Since we are trying to use critical thinking as “a guide to belief and action,” synthesis, or the generation of new ideas or solutions, is a necessary component. However, creating these new solutions is difficult, if not impossible, without understanding the problem, which leads to analysis. The process of critical thinking, though, does not stop at synthesis. Some of the results from the synthesis stage may be better than others, and it is possible that none of them actually solve the problem. Because of this, it is necessary to evaluate the results in order to find the best answer. To better understand this definition, we will apply it to an example.

Let’s assume we want an egg for breakfast. For analysis, the parts of this process might be putting butter in a pan, breaking the egg, and then cooking it. For synthesis, there are many different ways to prepare eggs. For example, we could whisk the egg to make scrambled eggs, or maybe we want hard-boiled eggs instead. Finally, we need to evaluate our result. There are many possible criteria, such as which preparation takes the least time, which is the most delicious, or which is the healthiest. To apply critical thinking to this problem, the goals are to understand the problem, find possible solutions, and evaluate the result.

For comparison, we now look at another definition of critical thinking. Qiao (2009) writes, “When one used the methods and principles of scientific thinking in everyday life, then he was practicing critical thinking. So scientific and critical thinking are the same thing…” The first thing that comes to mind when thinking about “scientific thinking” is the scientific method, so at first, this comparison seems a little odd. For reference, the steps of the scientific method are presented as follows (Wikipedia, n.d.):

  1. Define a question
  2. Gather information and resources (observe)
  3. Form an explanatory hypothesis
  4. Test the hypothesis by performing an experiment and collecting data in a reproducible manner
  5. Analyze the data
  6. Interpret the data and draw conclusions that serve as a starting point for new hypotheses
  7. Publish results
  8. Retest (frequently done by other scientists)

In the steps above, we see some similarities with the earlier definition of critical thinking. Earlier, we stated that critical thinking was composed of analysis, synthesis, and evaluation. While engineers typically begin with problems instead of questions, the gathering of information and resources is definitely a part of analysis. In both cases, understanding the problem or question is a priority. In critical thinking, the next step would be synthesis. A scientist may be trying to answer a question by forming a hypothesis, but the need to imagine different possibilities and find an answer that fits is the same in engineering. Lastly, steps 4-6 could be considered one way to evaluate the results from synthesis. While a scientist may test his or her hypothesis with experiments, an engineer may run simulations or create prototypes. The point in either case is to ensure the ideas from earlier actually work.

Although we defined critical thinking from an engineer’s perspective, it should not be surprising that we can apply it loosely in other disciplines such as science. After all, the capacity for critical thinking is not limited to, or only useful for, engineers. Writers, philosophers, mathematicians, and practitioners of many other disciplines make use of critical thinking as well. Even if the process is slightly different for each, at the very least, analysis, synthesis, and evaluation lie at the heart of critical thinking.

As a technical example of critical thinking, let us examine a problem a Tufts University student encountered while doing research over the summer. The student was writing the image-processing code for a robot that had a camera mounted on it.

The code to retrieve the video and display it was already written, so the student only had to focus on the image-processing part. As a simple test, the student wrote a piece of code to count the number of black pixels in a video frame. The code was easy to test, since all the pixels could be made black by covering up the camera. The problem appeared during exactly this test: with the camera covered, every pixel should have been black, but the code counted only a fraction of the expected number.

So how did the student use critical thinking to solve the problem? First, he took into account all of the available information and tried to find possible sources of the problem. The input was a video frame with an apparent size of 480 x 640 pixels, which matched the displayed output. Repeating the test for black pixels consistently returned the same fraction. When the student modified his code to check for pixels of any color, it found the expected number of pixels, so at first the problem appeared to be related to detecting the black pixels. The student, however, had tested that part of the code thoroughly and was fairly confident that it was not the source of the problem.

Continuing with his analysis, the student decided to save the video frame directly and display it. Upon seeing the result, the student at once saw the problem and found a solution. While the given video frame had room for 480 x 640 pixels, the actual image was stored in the upper-left corner as a 240 x 320 image. Thus, the student’s code was correct, as he originally surmised, and it was actually returning the correct number. The code to display the video, it turns out, expected this input and resized the image to the 480 x 640 video feed that the student originally saw.

From there, the rest of the problem was straightforward. For synthesis, the student decided to use the upper left corner of the given images and ignore the rest of the pixels. The result was more efficient than the original code, since it only had to process a 240 x 320 image and it ignored the pixels that were skewing the results. This example demonstrates the importance of analysis in critical thinking. Without an understanding of the problem, it is unlikely that the student would have found a solution by starting with the synthesis step. In this case, the solution and the tests to make sure it worked were relatively simple, so the synthesis and evaluation steps were not as important. Nevertheless, applying all of these steps in tandem allowed the problem to be successfully solved.
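To make the bug concrete, here is a minimal sketch in Python (our own reconstruction, not the student’s actual code; NumPy and single-channel grayscale frames are assumed) that reproduces the miscount and the cropping fix:

```python
import numpy as np

FRAME_H, FRAME_W = 480, 640   # apparent size of the frame buffer
IMG_H, IMG_W = 240, 320       # actual image, stored in the upper-left corner

# Simulate a covered camera: the real image region is black (0), while the
# unused remainder of the buffer holds non-black garbage values.
frame = np.full((FRAME_H, FRAME_W), 255, dtype=np.uint8)
frame[:IMG_H, :IMG_W] = 0

# Naive count over the whole buffer: only a quarter of the "expected" total.
naive_black = np.count_nonzero(frame == 0)
print(naive_black, "black pixels out of", FRAME_H * FRAME_W)   # 76800 / 307200

# The fix from the synthesis step: crop to the upper-left corner first.
image = frame[:IMG_H, :IMG_W]
cropped_black = np.count_nonzero(image == 0)
print(cropped_black, "black pixels out of", IMG_H * IMG_W)     # 76800 / 76800
```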

Engineering Curriculum

Critical thinking has typically been reserved for the liberal arts, especially English and philosophy. Even on standardized tests like the SATs, there is a critical reading section. However, as we discussed earlier, critical thinking is not limited to the liberal arts; it is also an integral part of the sciences and engineering.

Recently, the Accreditation Board for Engineering and Technology (ABET) has been pushing for more emphasis in the engineering curriculum on communication skills and on understanding the global context of today’s problems. Previously, and even now, the ABET accreditation process has acknowledged schools that trained students not only to apply their technical knowledge but also to lead and work well in teams. ABET believes that its new objectives can be achieved through the inclusion of more writing and critical thinking in the engineering classroom (Gunnink & Bernhardt, 2002).

Although most people agree that critical thinking should be a focus in school, a variety of methods have been proposed, and no single class or solution stands out. Even though we have been treating critical thinking as an individual effort, a few papers have suggested the use of group discussions and forums to encourage critical thinking (Radzi et al., 2009; Jacob et al., 2009). After defining critical thinking in her article, Jessop (2002) suggests a course based on brainstorming and critical reading. For the brainstorming section, students are given a problem and then, over the course of a few weeks, must engineer a solution. For the critical reading section, students are given a number of journal articles to read and evaluate. Naturally, the brainstorming half is mainly concerned with the synthesis aspect of critical thinking, while the critical reading half focuses on the analysis aspect (Jessop, 2002). The hope, of course, is that by practicing these steps, students will become better at critical thinking in the future.

As mentioned earlier, Qiao (2009) was writing on critical thinking in schools in China. Qiao goes on to state, “The nature of authority has two forms: textbook authority and teacher authority. Laws and rules in textbook are golden and precious, beyond any manner of doubt. Science teacher is the prolocutor of truth.” (2009, p. 115). In order to promote critical thinking and a sense of skepticism, Qiao suggests a History, Philosophy, and Science (HPS) Education approach. In addition to the usual Science that students learn about, Qiao (2009) believes it is valuable to learn about both the History and Philosophy behind these advancements. While Jessop’s (2002) strategy is purely from an engineer’s perspective, Qiao’s approach relies on the idea that critical thinking is not restricted to engineers. Instead, the capacity for critical thought is developed through studies in history and philosophy.

Despite the differences in each method, the goal is the same. In order to tackle increasingly difficult problems, engineers will require more than just technical knowledge. To this end, there is a need for teachers and experts, whose job is to train these engineers, to bring critical thinking into the classroom.

Application to Senior Project

In this paper, we have attempted to answer questions like, “What is critical thinking?” and “Why is it important?” As we stated before, critical thinking can be thought of as similar to the scientific method, but its main points are problem definition and understanding, the search for solutions, evaluation, and iteration. Since critical thinking is a powerful tool in problem solving, we have seen recent efforts to include it in the engineering curriculum. The final question we want to answer is, “How does this apply to our senior project?”

The answer to this last question is relatively simple. Each of our senior projects, if properly scoped and planned, should aim to solve a problem. In light of this, we should strive to solve these problems intelligently, which is to say, using critical thinking. This means fully researching and understanding the problem, creating new solutions and finding old ones, and evaluating the result. When our result is a failure, we go back, look for other solutions, and try again until we have solved the problem. Critical thinking, then, is an important, if not essential, part of our senior project.

Cited References

  • Gunnink, B., & Bernhardt, K. L. S. (2002). Writing, critical thinking, and engineering curricula. In Proceedings of the 32nd Annual Frontiers in Education (FIE 2002) (Vol. 2, pp. F3H-2–F3H-7). DOI: 10.1109/FIE.2002.1158211
  • Jacob, S. M., Lee, B., & Lueckenhausen, G. R. (2009). Measuring critical thinking skills in engineering mathematics using online forums. In 2009 International Conference on Engineering Education (ICEED) (pp. 225–229). DOI: 10.1109/ICEED.2009.5490577
  • Jessop, J. L. P. (2002). Expanding our students’ brainpower: Idea generation and critical thinking skills. IEEE Antennas and Propagation Magazine, 44(6), 140–144. DOI: 10.1109/MAP.2002.1167273
  • Qiao, C. (2009). Science education and fostering of critical thinking in China. In Second International Conference on Education Technology and Training (ETT ’09) (pp. 114–117). DOI: 10.1109/ETT.2009.25
  • Radzi, N. M., Abu, M. S., & Mohamad, S. (2009). Math-oriented critical thinking skills in engineering. In 2009 International Conference on Engineering Education (ICEED) (pp. 212–218). DOI: 10.1109/ICEED.2009.5490579
  • Scientific method. (n.d.). In Wikipedia. Retrieved December 18, 2012, from http://en.wikipedia.org/wiki/Scientific_method
  • Scriven, M., & Paul, R. (n.d.). Defining critical thinking. National Council for Excellence in Critical Thinking Instruction. Retrieved from http://www.criticalthinking.org/university/univclass/Defining.html

Additional Resource

  • Accreditation Board for Engineering and Technology (ABET). (n.d.) Retrieved from http://www.abet.org/

Critical Thinking Assessment in Engineering Education: A Scopus-Based Literature Review

Contributed by the Design Education Committee of ASME for publication in the Journal of Mechanical Design.

Deo, S., and Hölttä-Otto, K. (January 30, 2024). "Critical Thinking Assessment in Engineering Education: A Scopus-Based Literature Review." ASME. J. Mech. Des. July 2024; 146(7): 072301. https://doi.org/10.1115/1.4064275


Abstract

Critical Thinking (CT) skills are highly valued by employers, leading to their integration into engineering education through various design- and problem-based approaches. Despite their recognized importance, the varying perceptions of CT present challenges in achieving a unified approach to its development and assessment. This paper reviews CT assessment in engineering education, particularly mapping Facione's CT skills with assessment approaches to discern how CT is evaluated. We conducted a systematic keyword search in the SCOPUS database and identified 462 articles from 2010 to March 2023. These were reviewed and distilled down to 80 articles included in this study. We find that CT has been recognized as an essential skill set, but there are no consistent definitions or means to assess it. Further, while CT is a multifaceted skill, we find that very few assessment methods assess CT holistically. We identify three goals for CT assessment: (1) understand and recognize CT, (2) demonstrate CT, and (3) identify if CT has changed due to intervention. We discuss how different assessment approaches, including rubrics, surveys, standardized tests, and customized assessments, have been used and propose recommendations to support reaching a better understanding of CT assessment in engineering education. Further research is needed to better understand how these skills can be taught and assessed as part of engineering education to meet the needs of employers.

1 Introduction

Critical thinking (CT) skills have recently gained increased attention in engineering education, partly due to the growing adoption of problem-based learning and design in engineering education worldwide [1]. Design and, more generally, the process of addressing open-ended problems require students to define and comprehend the issue, identify its components, explore multiple solutions, communicate effectively, and employ logical reasoning to resolve the problem. Schiavone [2] argues that students enter university education without a strong foundation in the essential skills needed to tackle open-ended problems. Although the roots of the problem can be traced back to gaps in earlier education, it is during engineering education that such gaps become more apparent. Skills like analysis, evaluation, and logical reasoning are needed regardless of educational background or field. Therefore, it is crucial for higher education institutions to acknowledge this educational shortfall and engage in assessing and facilitating the development of CT skills. In the process, engineering education not only remedies previous education gaps but could also enhance overall education quality.

John Dewey [3] coined “critical thinking” over a century ago. Dewey [3] refers to this concept as “reflective thinking,” but the debate persists over whether it is a discipline-neutral or disciplinary skill [4–6]. With no universally accepted definition [7], each field interprets CT differently: psychology emphasizes metacognitive (one’s awareness about their own thought process) and metalinguistic (awareness about how language is structured and consciously used) aspects [8]; medical education focuses on logic, data analysis, and argument evaluation for evidence-based medicine [9]; management highlights creativity, questioning assumptions, and alternative perspectives [10]; and engineering education underlines problem-solving, decision-making, and evidence evaluation as CT processes [11], to which Halpern [12] adds the ability to reason logically, make inferences, and draw conclusions based on evidence. Even with differences in perception of CT, essential components such as logical reasoning, evidence-based analysis, and informed decision-making are consistent.

Danczak et al. [13] investigated how 470 chemistry students from an Australian university, 106 chemistry teaching staff, and 43 employers of chemistry graduates define CT. They found a wide range of responses. For example, students’ understanding of CT ranged from vague notions such as “Thinking Deeply” or “Thinking in a complex manner” to more precise thoughts like “critique” and “objectivity.” Overall, students struggled to define CT. Teachers primarily viewed CT as “critiquing” or “evaluating,” with their responses being largely outcome-focused. On the other hand, employers associated CT with four skills: “creativity,” “problem-solving,” “systematic approach,” and “identifying opportunities.” In this context, it is worth highlighting that there is a clear focus on the fundamental attributes of critical thinking, including objectivity, analytical thinking, and systematic problem-solving. These attributes are highly relevant to individuals across all levels of education and professional development, from students to teachers and professionals. The emphasis on these key skills underscores their importance as foundational elements in developing effective critical thinking abilities.

In research, Facione [14] defined CT as a “purposeful, self-regulatory judgment which results in interpretation, analysis, evaluation, and inference, as well as an explanation of the evidential, conceptual, methodological, criteriological, or contextual considerations upon which judgment is based.” Dewey [3], however, defines it as “active, persistent and careful consideration of any belief or supposed form of knowledge in the light of the grounds that support it, and the further conclusions to which it tends.” Both definitions, irrespective of their different foci, share core CT aspects like purposeful analysis, interpretation, and evaluative judgment based on evidence, reflecting the core characteristics identified across disciplines.

In assessing CT within engineering education, it is crucial to consider critical thinking as a set of skills and dispositions [15]. CT skills involve an individual’s capacity to analyze, evaluate, and interpret information [16] and to monitor and self-correct one’s thought process [17]. CT dispositions, on the other hand, pertain to a person’s personality traits, such as curiosity, open-mindedness, and careful decision-making. Paul and Elder [18] encapsulate CT as “a systematic method of shaping one’s thinking that is purposeful, precise, disciplined, and comprehensive, based on intellectual and behavioral standards, resulting in well-reasoned conclusions.” Even though one might consider dispositions and skills to be different, for CT, analytical prowess, reasoned judgment, and evidence-based decision-making are essential components.

In summary, there is considerable diversity in definitions and perceptions of critical thinking. Nevertheless, core elements like the ability to analyze and evaluate information, develop well-founded judgments, and make decisions based on solid reasoning remain consistent across disciplines. These definitions also highlight the multidimensional nature of CT, raising the question, “How can one determine if students are thinking critically enough?” Addressing this question necessitates the assessment of CT in higher education, making it an essential task.

1.1 The Delphi Study as a Foundation for Literature Review.

For this study, we adopted the highly cited and comprehensive Delphi study by Facione [14], which identifies the skills associated with CT. In that Delphi study, 46 experts from various fields reached a consensus on the skills associated with CT, and the resulting Delphi report outlines the agreed-upon skills to be taught or assessed as part of CT. They identified six cognitive CT skills: (1) Interpretation, (2) Analysis, (3) Evaluation, (4) Inference, (5) Explanation, and (6) Self-regulation. Each cognitive skill encompasses subskills, sixteen in total, as shown in Fig. 1. For example, the Evaluation skill has two subskills: (1) assessing claims and (2) assessing arguments. Facione’s [14] study provides detailed descriptions of all the cognitive skills and subskills.

Fig. 1 CT Skills Identified in Facione’s Delphi Study [14]

There are multiple reasons for adopting Facione’s work. First, the CT definition and the various subskills cover the most fundamental skills associated with CT and the design process, as accepted by multiple authors. Varying definitions often result in different perspectives; however, several authors have reached a consensus on the inclusion of specific key skills, such as decision-making and problem-solving [19–21], making inferences and using logical reasoning [19,22–24], and examining claims and arguments [14,20,25]. Facione’s [14] study encompasses all these skills. Moreover, the Accreditation Board for Engineering and Technology (ABET) [26] accreditation, which offers an extensive list of suggested skills for graduates, also features several skills that are part of Facione’s CT skills.

In addition, the significance of CT is increasingly recognized within engineering education, particularly as it intersects with the growing emphasis on design-based learning approaches, as explored in Sec. 1.2. Second, the authors aimed to ensure that the study adhered to well-defined aspects of this construct; Facione’s work [14] anchors our literature analysis. We matched (mapped) Facione’s identified cognitive skills and subskills with CT assessment approaches in engineering, revealing which assessment approach measures which CT skills. The Methodology section discusses this mapping in detail. Before that, it is crucial to comprehend why CT garners significant attention and what role engineering education plays in critical thinking.

1.2 The Link Between Critical Thinking and the Design Process.

The skills that Facione [ 14 ] described as CT span a broad range and are inextricably tied to the design process. To ensure original and successful solutions, the design process, frequently characterized by problem definition, ideation, prototyping, and evaluation, depends on successfully applying CT skills [ 27 , 28 ]. Although this study is not aimed at identifying the link between CT and the design process, the authors believe it could be valuable for educators to recognize this link. For instance, interpretation and analysis are crucial during the first phase of challenge formulation because designers must comprehend the underlying needs, constraints, and requirements [ 29 ]. Designers use assessment and inference to produce innovative concepts and ideas throughout the ideation process while evaluating their viability and possible impact [ 30 , 31 ].

CT skills such as explanation and self-regulation are essential during prototyping because they allow designers to effectively communicate their ideas and make necessary modifications based on feedback [ 32 , 33 ]. Finally, during the evaluation phase, designers employ analysis and evaluation skills to evaluate the performance of their prototypes against predefined criteria and constraints, allowing them to refine and optimize their designs [ 34 , 35 ]. Therefore, Facione’s CT skills [ 14 ] are applicable and integral to the various stages of the design process, emphasizing the usefulness of CT in a design process, which is often an integral part of engineering education. Engineering education plays a crucial role in imparting this skill, as discussed in the following section.

1.3 Importance of Critical Thinking and the Role of Engineering Education.

In 2018, Accenture [ 36 ] emphasized that higher-order cognitive abilities, such as complex reasoning (including CT), would be crucial for successful technological innovation and growth in the future. Similarly, McKinsey [ 37 ] found that, due to increased automation, activities requiring basic cognitive skills would decline substantially by 2030 in the United States and Europe. However, the demand for higher-order cognitive skills like CT would rise by 19% in the United States and 14% in Europe. Industries will focus on training employees in strategically important skills, including CT- and IT-related abilities. The World Economic Forum [ 38 ] predicted that by 2025, 85 million jobs would be replaced by machines, yet the need for individuals with strong CT and problem-solving skills would increase. Since 2016, these two skills have consistently ranked among the top and are expected to maintain their positions for the next five years.

For several decades, engineering education has actively evaluated whether students exhibit CT skills. For instance, during presentations, debates, or in-class activities, students may engage in communication by asking or answering questions for further clarification [ 19 ], attempt to comprehend and define various terms (interpretation) [ 19 ], recognize underlying assumptions (analysis) [ 25 ], and explain and interpret relevant and non-relevant information to draw conclusions (inference) [ 14 ]. Additionally, they may demonstrate reasoning abilities to justify their stance [ 20 ] or examine both positive and negative aspects of a problem or situation (self-awareness) [ 21 ]. These indicators of CT skills also represent essential attributes that university students need to develop.

Assessing CT in engineering fulfills various objectives, such as evaluating students’ multidimensional higher-order thinking skills. In the American education system, CT assessment helps achieve ABET student learning outcomes, including identifying and formulating complex problems, communicating effectively, conducting experiments, and analyzing and interpreting data [39]. In 2020, India introduced a new education policy prioritizing the development of higher-order thinking skills like creativity and CT [40]. Additionally, the Organization for Economic Cooperation and Development [41] launched a project in 2018 focusing on teaching and assessing CT, involving 14 countries and 26 higher education institutions. Previous literature thus implies that CT assessment can benefit higher education institutes and industries in several areas.

CT assessment within engineering education is often not clearly defined or emphasized. For example, Revised Bloom’s taxonomy, a framework frequently referenced for teaching or assessing higher-order thinking skills [ 42 ], identifies the top three skills—analyzing, evaluating, and creating—as CT skills. However, these elements are not always directly assessed as part of students’ CT abilities. This gap signifies that the actual pedagogical implementation of teaching CT does not always align with dedicated evaluation methods, an area that requires further investigation [ 43 ]. As experts from various disciplines perceive CT differently, their assessment approaches may also vary. This diversity presents a significant challenge for educators tasked with measuring students’ proficiency in CT [ 44 ], and adding to this problem is the absence of a universally agreed method for CT assessment, making it harder to set consistent standards or practices [ 45 ].

Awareness of available CT assessment approaches is crucial to fostering this skill. Selecting an appropriate CT assessment can be challenging, especially for an individual new to CT. A simple search for “Critical thinking assessment” on Google Scholar between 2010 and 2023 yields 37,800 records (as of March 2023), an overwhelming amount of information to process, even for an expert. As a result, this study systematically explores the Scopus database to better understand CT assessment in engineering education. The authors limited the database investigation to the most recent decade to keep the study manageable. The systematic approach adopted for this research is discussed in the following section.

1.4 Purpose and Research Questions.

The purpose of this study is to conduct a comprehensive review of critical thinking assessment practices in engineering education to understand the various assessment approaches employed. By examining how CT is operationalized through assessment, we seek to identify common and uncommon practices and potential areas for improvement. Based on the research gaps uncovered during the review process, we will provide recommendations for advancing CT assessment in the future. This study targets a wide audience, including engineering and engineering design educators, CT practitioners, engineering education researchers, engineering design researchers, and organizations interested in CT-related research.

This study addresses two research questions:

  • What are the most frequently used CT assessment approaches in engineering education?
  • What are the underexplored aspects of assessing CT in engineering education that warrant further research?

The systematic review process is detailed in the following section and illustrated in Fig. 2.

2 Methodology

2.1 Research Database and Search Protocol.

A systematic literature search was conducted on CT “assessment” in “engineering education.” One of the authors initially searched for Scopus-indexed articles from 1980 to 2023. The first relevant article was published in 1994, with no records found before that. The search yielded 462 records.

To maintain feasibility, the database was refined, and a histogram of publication records per year was plotted using Scopus’ inbuilt result analysis function. The histogram showed an average of four records published per year between 1994 and 2009, but from 2010 onward, the average increased to over 23 records per year until March 2023. The review focused on records from 2010 to 2023, and additional checks ensured that no highly cited papers before 2010 were overlooked. It is important to note that the exclusion of earlier works was primarily due to their lower citation counts, indicating less impact within the academic community. While our criteria centered on more recent, influential works, we acknowledge the foundation laid by earlier research. Detailed specifics regarding the inclusion and exclusion criteria, ensuring a balanced and thorough review, are discussed in the following section. The specific search protocol implemented for the literature review was:

[TITLE-ABS-KEY (critical AND thinking) AND TITLE-ABS-KEY (measurement) OR TITLE-ABS-KEY (assessment) AND TITLE-ABS-KEY (engineering AND education)] AND PUBYEAR > 2009 AND PUBYEAR < 2023 AND (LIMIT-TO (SUBJAREA, “ENGI”))
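For readers who prefer to reproduce such a search programmatically rather than through the Scopus web interface, a rough equivalent is sketched below. It assumes the third-party pybliometrics package and a configured Scopus API key, and it expresses the web interface’s LIMIT-TO refinement as a SUBJAREA field in the API query syntax; this is an illustration, not the procedure the authors used.

```python
from pybliometrics.scopus import ScopusSearch

# Approximate API-syntax equivalent of the protocol above; SUBJAREA(ENGI)
# stands in for the web interface's LIMIT-TO(SUBJAREA, "ENGI") refinement.
QUERY = (
    "TITLE-ABS-KEY(critical AND thinking) "
    "AND (TITLE-ABS-KEY(measurement) OR TITLE-ABS-KEY(assessment)) "
    "AND TITLE-ABS-KEY(engineering AND education) "
    "AND PUBYEAR > 2009 AND PUBYEAR < 2023 AND SUBJAREA(ENGI)"
)

# Requires a one-time API-key configuration per the pybliometrics docs.
# download=False retrieves only the number of matching records.
search = ScopusSearch(QUERY, download=False)
print(f"Matching records: {search.get_results_size()}")
```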

2.2 Inclusion and Exclusion Criteria.

This section describes the inclusion and exclusion criteria used to refine the research paper selection, focusing on engineering education and CT assessment.

Inclusion criteria:

  • The research paper is from any stream or subdiscipline within engineering education.
  • The paper contains an “Action → Effect” intervention, specifying the author(s)’ action to influence students’ critical thinking and the resulting change (increase, decrease, or no change) in critical thinking.
  • The paper includes clear information about the CT assessment approach used (e.g., sample questions for surveys, rubrics provided for rubric-based assessments) to understand the aspects of CT being assessed.
  • The paper presents conclusive claims about changes in CT resulting from an intervention, in addition to the previous criteria (e.g., increased, declined, or no change in CT due to a specific pedagogical approach or activity/intervention).
  • Papers providing a dedicated and comprehensive CT assessment approach are included as an exception.

Exclusion criteria:

  • The paper does not provide details about the CT assessment approach used, or claims to enhance CT without assessing it in the study.
  • The paper uses non-evidence-based approaches, where the author(s) claim a change in students’ CT without explicitly assessing it or providing details about the assessment approach used.
  • Papers with participants from K-12 schools are excluded, as this study focuses on higher education in engineering colleges or universities.
  • The paper is not published in scientific conferences or journals.

The authors considered a paper evidence-based if it clearly described interventions and their effects on CT, along with well-defined assessment approaches. Non-evidence-based papers, however, may claim to influence CT without explicit assessments or lack sufficient details about the assessment methods used. By recognizing these distinctions, the systematic literature review focuses on research papers offering reliable, evidence-based insights into CT assessment practices.

Types of interventions are beyond the scope of this paper and will not be explored in detail. Because of the confidential or proprietary nature of some assessment approaches, some authors could not reveal complete details of their assessment tools but provided samples; such articles are considered exceptions and included in this study. Applying the engineering-only filter removed an initial 150 articles. The remaining articles were distilled as shown in Fig. 2 [46,47]. At the end of the process, we included 80 published articles fitting all inclusion criteria; they are listed in the Appendix at the end of the manuscript.

Fig. 2 Systematic review PRISMA flowchart [46,47]

2.3 Analysis and Mapping.

Initially, two contributing authors collaboratively established an Excel database specifically designed for the detailed scrutiny and systematic categorization of literature relevant to CT skills. The first author assumed primary responsibility for populating the database and undertaking a thorough and detailed review of each paper. Throughout the process, there was an ongoing dialogue with the second author, who provided critical evaluations of the relevance and suitability of each paper. This collaborative approach ensured a robust selection process, leading to the identification of various emerging trends, patterns, and themes, which were carefully documented. To further refine the analysis, the team developed comprehensive mapping guidelines, leading to the integration of additional data columns for enhanced detail and clarity. Following the preparatory stages, the first author proceeded with the actual mapping, adhering to the established guidelines to ensure a consistent analysis.

Next, the CT assessment approaches identified in the literature review were aligned with Facione’s six distinct cognitive skills and 16 subskills of critical thinking. An assessment approach can map to any of the six cognitive CT skills without necessarily specifying subskills. In this section, we only explain the relevance of the mapping, the related challenges, and the solutions adopted. Tables 3–6 in the results section detail the mapping along with key findings.
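To make the tallying behind such a mapping concrete, the short Python sketch below counts how many approaches assess each of Facione’s six cognitive skills. The rows are hypothetical simplifications for illustration; the actual mapping was maintained in an Excel database and is reported in Tables 3–6.

```python
from collections import Counter

FACIONE_SKILLS = [
    "Interpretation", "Analysis", "Evaluation",
    "Inference", "Explanation", "Self-regulation",
]

# Hypothetical example rows, not the study's full dataset.
mapping = {
    "Holistic rubrics": set(FACIONE_SKILLS),  # covered all six skills
    "VALUE rubrics": {"Analysis", "Evaluation", "Inference", "Explanation"},
    "CriTT survey": {"Interpretation", "Analysis", "Evaluation",
                     "Inference", "Explanation"},  # no Self-regulation
}

# Tally how many approaches assess each cognitive skill.
counts = Counter(skill for skills in mapping.values() for skill in skills)
for skill in FACIONE_SKILLS:
    print(f"{skill}: assessed by {counts[skill]} of {len(mapping)} approaches")
```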

First, the review revealed that there are direct and indirect CT assessment approaches. They can be differentiated based on how explicitly they evaluate students’ CT skills. For example, direct assessment methods specifically target the measurement of students’ CT by presenting them with tasks or questions that directly engage their CT abilities. These tasks or questions are often evaluated using rubrics, surveys, or standardized tests designed to measure CT. For example, Chang and Wang [ 48 ] employed a direct assessment approach by asking students, “Which aspect/s of critical thinking skills will you implement now that you have learned about the four levels of moral judgment?” This question explicitly requires students to demonstrate their understanding of CT skills.

Indirect assessment methods, conversely, evaluate CT skills implicitly through tasks that do not explicitly focus on CT but still reflect students’ CT abilities. These tasks may assess other skills or knowledge areas, and the assessment of CT is derived from the student’s performance on those tasks. For instance, Ahmad et al. [49] assessed students’ CT through their performance on a computer programming assignment. The grades on this assignment were subsequently analyzed using quantitative regression and correlation analysis to infer students’ CT skills. This approach does not explicitly ask students to demonstrate their CT abilities but still allows these skills to be assessed based on performance in other domains. Table 3 shows the mapping of three assessment approaches (rubrics, surveys, and standardized assessment tests) that directly mapped to Facione’s CT skills [14].
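As an illustration of this indirect style of analysis, the sketch below (with entirely hypothetical numbers, not data from Ahmad et al. [49]) correlates assignment grades with scores from a separate CT measure; a strong positive correlation would then be read, cautiously, as assignment performance reflecting CT-related abilities.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical data: programming-assignment grades and scores from a
# separate CT measure for the same eight students.
grades = np.array([62.0, 75.0, 81.0, 68.0, 90.0, 73.0, 85.0, 58.0])
ct_scores = np.array([55.0, 70.0, 78.0, 60.0, 88.0, 69.0, 80.0, 52.0])

# Pearson correlation between the two measures.
r, p = pearsonr(grades, ct_scores)
print(f"Pearson r = {r:.2f}, p = {p:.4f}")
```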

Customized assessment approaches do not appear in the mapping table for two main reasons: their inherent flexibility, and the fact that these studies often adopt indirect assessment approaches or use content-specific terminology that does not align with Facione’s [14] CT skills. More on customized assessment approaches, with several examples, appears in the subsequent sections.

3 Results

3.1 Attributes of the Included Articles.

The CT assessment approaches identified in the reviewed articles fall into four categories:

  • Rubrics: These are scoring guides utilized to evaluate students’ work or performance, comprising a set of criteria and levels of achievement for each criterion.
  • Surveys: This is a broad term that includes various types of questionnaires, such as single-question surveys, validated scales or measures of CT, and scenario-based questions. We acknowledge that the term “survey” may encompass subcategories; they are further subcategorized in the following section.
  • Standardized CT tests: These are commercially available, pre-developed tests intended to measure CT skills consistently across different contexts.
  • Customized assessment tests: These are unique assessment approaches devised by individual instructors or researchers, tailored to their specific course or content. They can involve course-specific assignments, discussions, presentations, or combinations of various tests.

Our findings indicate that surveys were the most commonly used CT assessment measure (28 instances), followed by customized assessment tests (26 instances) and rubrics (24 instances). Commercially available standardized tests were the least utilized (nine instances). No specific trend in the frequency of using these approaches was observed throughout the study period. Table 1 presents the year-wise attributes of the articles included in this study. Additionally, we found that most papers focused on summative or formative assessment, with none targeting diagnostic assessment. Our analysis did not reveal any growth or decline patterns in studies related to CT assessment.

Table 1 Attributes of the reviewed articles

3.2 Goals of Critical Thinking Assessment.

We identified three hierarchical goals of CT assessment in the reviewed literature:

Goal 1: Assess students’ understanding and recognition of CT. This goal represents a more surface-level assessment, focusing on whether students can comprehend CT abilities and identify them within themselves or in the presented work. This foundational understanding is essential as a preliminary step in CT assessment.

Goal 2: Assess students’ demonstration of CT. This intermediate goal delves deeper into the assessment process by examining students’ abilities to exhibit their CT through various forms of communication, such as visual, verbal, written, or behavioral. It evaluates whether students can apply their understanding of CT in different contexts and situations, showcasing a higher level of engagement with the concept.

Goal 3: Assess the change in CT due to an intervention. This most comprehensive goal provides a thorough assessment by determining if students’ CT has changed (improved, decreased, or remained constant) as a result of a specific intervention. By comparing the results to a control group or students’ previous CT, this goal highlights the effectiveness of interventions in enhancing CT skills, allowing for more targeted and impactful educational strategies.

Table 2 The number of articles assessing each identified goal

The hierarchical nature of these goals enables a progressive deepening of CT assessment, starting from foundational understanding to practical application and, finally, to measuring the effectiveness of interventions designed to improve CT skills. This structure provides valuable insights for educators to design, implement, and assess CT interventions in a more informed and engaging manner.

Various categories of CT assessment tools have been used to address each of the three hierarchical goals. Surveys were mostly used to assess Goal 1: understanding and recognizing CT. For example, Kisselburgh et al. [ 50 ] asked, “Which aspects of the course helped you with critical thinking about ethics?”. For Goal 2: demonstrating CT, standardized commercial tests were the least used, while other assessment approaches were more frequently employed. For example, through a survey, Thompson et al. [ 51 ] asked students how much they agreed with the statement, “The opportunity to apply the Paul-Elders critical thinking framework has improved my critical thinking.” Bayles [ 51 ] assessed students’ reflective writing assignments using critical thinking rubrics, which evaluated students’ CT at four distinct levels, as shown in Table 3 . Regarding Goal 3: measuring changes in CT due to an intervention, all four categories were used with similar frequency. At this level of Goal 3, CT assessment provides a comprehensive understanding of the effectiveness of various interventions and their impact on students’ CT. Some studies employ multiple assessment approaches like rubrics + surveys or customized assessments + surveys [ 52 – 56 ]. As a result, the combined assessment frequency does not equal the number of articles included in this study.

Table 3 Critical thinking rubrics sample [51]

A brief examination of mapping Tables 4–6 reveals that rubrics and standardized tests address numerous CT skills, yet surveys were the most prevalent assessment approach. Overall, the mapping indicates that three cognitive skills—Interpretation, Analysis, and Inference—are assessed more frequently, regardless of the assessment approach employed. The cognitive skill Evaluation was the least assessed. The subskills Examining ideas, Querying evidence, and Self-examination (under the cognitive skills Analysis, Interpretation, and Self-regulation, respectively) were the most assessed. These frequently assessed cognitive skills and subskills were primarily assessed by rubrics or standardized tests. In contrast, surveys or customized assessments often treated CT as a single skill rather than a set of skills, as identified by Facione [14].

Table 4 Rubrics mapped against Facione’s CT skills

Table 5 Surveys mapped against Facione’s CT skills

Table 6 Standardized tests mapped against Facione’s CT skills

Table 7 Survey subcategories

When authors refer to CT in their assessments without specifying particular aspects, it is not possible to conclusively determine which aspect of CT they are addressing. A separate column was added to the mapping table to display all assessment approaches that treat CT as a single skill. Among all assessment approaches, Holistic rubrics assessed all six cognitive skills. However, mapping indicates that these rubrics assessed the “querying evidence” subskill twice (not displayed in the table). This approach facilitates the assessment of all six cognitive skills, covering nine out of the sixteen subskills. We did not identify any other assessment approach that evaluates all six skills and directly aligns with all 16 subskills.

Numerous assessment approaches assessed at least five of the six distinct cognitive CT skills [57–60]. For instance, the Critical and Integrative Thinking rubrics [58] and Washington State University’s rubrics [57] did not correspond to the Interpretation skill. Vila-Parrish’s common rubrics [60] and the CriTT survey [59,61] did not match Self-regulation skills. A more comprehensive mapping of all identified approaches with each cognitive skill and subskill can be found in Tables 4–6. The Appendix lists all the papers reviewed for mapping.

3.3 Critical Thinking Skills Assessment Approaches.

3.3.1 Surveys.

We found diverse survey-based assessment approaches that capture different aspects of critical thinking. Table 7, shared above, outlines the various categories of survey-based assessments found in the literature.

In a number of studies, critical thinking was approached as a single skill rather than a collection of skills, often making it one of several learning outcomes rather than the sole focus (refer to Tables 4 and 5 ). For instance, students were asked whether interactive video conferencing improved intellectual abilities, such as analytical reasoning and knowledge integration, in addition to CT or whether writing assignments enhanced CT [ 62 , 63 ]. Assessing CT as a single skill in this manner was limited in scope, typically posing simple questions to students about whether the interventions in the respective studies allowed them to engage in CT, think critically to address problems, or enhance their CT abilities [ 50 , 62 , 64 – 66 ]. Some surveys utilized a pre-post format to evaluate whether students’ CT had changed or if they had gained an understanding of CT and its applications [ 70 – 72 ] (see Table 7 ).

Several studies employed surveys based on the Paul–Elder CT framework [ 18 , 56 , 70 , 71 , 74 ]. This framework aims to educate students on the characteristics of critical thinkers, incorporating specific terms such as Depth, Precision, Breadth, Accuracy, and others. These terms can be applied to assess various CT skills across different contexts, such as crafting precise arguments, presenting results accurately, or precisely evaluating flaws in the evidence (Table 8 ). As a result, assessment approaches using these keywords offer flexibility in their application, making it difficult to map them to a specific CT skill. In the studies by Thompson and colleagues [ 56 , 74 ], students were questioned about their understanding and application of the Paul–Elder CT framework that shows the intellectual standards are applied to the elements of thought to develop intellectual traits. However, these studies also noted that students often faced difficulties in comprehending or implementing this framework, leading to misunderstandings or an inability to demonstrate their CT skills effectively.

Table 8 Paul–Elder Critical Thinking Framework [18]

Table 9 Holistic rubrics [79]

Among all the surveys, the Motivated Strategies for Learning Questionnaire (MSLQ) and the Critical Thinking Toolkit (CriTT) stand out as two of the more comprehensive, validated assessment measures [61,77]. The MSLQ consists of 81 questions, with five specifically targeting CT assessment. It includes opinion-based questions like, “I frequently question what I hear or read in this course to determine its credibility” and “When a theory, interpretation, or conclusion is introduced in class or the readings, I try to establish if there is sufficient supporting evidence” [69]. The CriTT, on the other hand, is a 27-item Likert-scale survey that primarily focuses on three aspects: students’ confidence in CT, whether they value CT, and their misconceptions of CT [61]. While these surveys have been employed in previous research, their usage has not been widespread [59,67,68].

3.3.2 Rubrics.

The use of rubrics in CT assessment was quite widespread, and studies that utilized rubrics were assessing CT as a learning outcome. Among the various rubrics identified, Holistic rubrics were associated with all six cognitive skills and nine subskills of CT, as shown in Table 4 [ 79 ]. These Holistic rubrics were deployed in five studies [ 55 , 56 , 71 , 74 , 78 ]. In most of these studies, students were initially taught the Paul-Elders critical thinking framework, followed by assessments using Holistic rubrics for various activities, including case studies, reflective journals, and assignments to enhance students’ CT abilities [ 55 , 56 , 71 , 74 , 78 ].

Although Holistic rubrics (see Table 9 ) are not as thorough as the VALUE rubrics [ 80 ], as shown in Table 10 , they were utilized in multiple studies. For instance, in one case study, students were required to compose an opinion-based paragraph addressing the questions, “How did this failure occur?” and “Who was most responsible for this failure?” [ 71 ]. In another study, students were asked to write a reflective report on how they would tackle an engineering grand challenge if given a budget and research team [ 74 ]. Faculty members later assessed student work using these rubrics and found them beneficial for evaluating CT.

Table 10 VALUE rubrics sample [80]

Our analysis revealed that VALUE rubrics [ 80 ] were used five times [ 52 , 53 , 59 , 60 , 81 ]. VALUE rubrics [ 80 ] assess four cognitive skills and six subskills, but unlike Holistic rubrics, the distinction between different CT levels is much more apparent, as illustrated in Table 10 . VALUE rubrics [ 80 ] were used in a serious storytelling exercise to evaluate its impact on students’ CT [ 81 ] and in a multi-year longitudinal study [ 52 ]. The Critical and Integrative Thinking (CIT) [ 58 ] rubrics constituted another assessment approach covering five cognitive and seven subskills, although these rubrics were only used in one study.

There is remarkable variety in the contexts and activities that are assessed using rubrics. For instance, Frank et al. [52] utilized VALUE rubrics [80] alongside the Collegiate Learning Assessment (CLA+) instrument. Their study determined that VALUE rubrics [80] might be appropriate for short-term use, while the CLA+ is a better option for longitudinal studies. In another study, the same author discovered that these rubrics were more cost-effective per student than the California Critical Thinking Skills Test (CCTST) or the Cornell Level Z test of CT [53]. Vila-Parrish et al. [60] identified students’ CT levels from emerging to capstone using common rubrics after carefully designing assignments and activities to enhance CT. Rubrics were also employed to assess CT in discussions [82], reflection journals [83], and even mind-mapping activities (MALAR rubrics) [84].

Moreover, the structures of the rubrics exhibited a wide range of variations. Some were detailed and comprehensive, such as CIT and Holistic rubrics [ 58 , 78 ]. In contrast, Alemayehu et al.’s [ 85 ] rubrics included CT as one of the skills among other learning outcomes, or in another study, straightforward CT scoring rubrics with scores ranging from “1” to “4” were employed for evaluating students’ reflection journals [ 83 ]. Rubrics demonstrated versatility in their structure and the contexts in which they were applied.

3.3.3 Standardized Commercial Tests.

We identified three commercially available CT assessment approaches in our review: the Collegiate Learning Assessment (CLA+), the Critical Thinking Assessment Test (CAT), and the Evaluate UR/CURE Surveys [ 52 – 54 , 86 – 89 ].

The CLA+ assesses multiple skills, similar to other standardized commercial tests, and it also maps well onto Facione’s [14] CT skills. The test takes two hours and consists of Performance Task (PT) and Selected Response Question (SRQ) categories [90]. Although it covers all cognitive CT skills, its use in the literature is limited. The CAT is a 60-minute instrument with essay-style answers to 15 real-life questions; unlike the automated CLA+ scoring, instructors score the responses themselves. The CAT has seen broader use, both standalone and alongside VALUE rubrics [53, 54, 88, 91].

Evaluate UR and its modified version, Evaluate UR CURE, are commercially available survey-based measures. These surveys assess student learning outcomes, including CT, using a five-point Likert scale. Students and faculty both complete the survey and then discuss discrepancies between their ratings. These measures capture several CT skills, as shown in Table 6, but their use has been limited to two recent studies [86, 87].
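
To illustrate the discrepancy-discussion step, a rough sketch follows; the outcome items, ratings, and the two-point threshold are invented for illustration and are not the actual EvaluateUR instrument.

    # Hypothetical sketch: flag learning-outcome items where student
    # self-ratings and faculty ratings diverge on a five-point Likert scale.
    student = {"interpretation": 5, "analysis": 4, "self-regulation": 5}
    faculty = {"interpretation": 4, "analysis": 4, "self-regulation": 2}

    GAP_THRESHOLD = 2  # invented cutoff: flag gaps of 2+ points for discussion

    for item, s_rating in student.items():
        f_rating = faculty[item]
        if abs(s_rating - f_rating) >= GAP_THRESHOLD:
            print(f"Discuss '{item}': student={s_rating}, faculty={f_rating}")
    # -> Discuss 'self-regulation': student=5, faculty=2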

3.3.4 Customized Assessment Approach.

The customized assessment approach covers a broad spectrum of approaches, most of them content-, course-, or intervention-specific, as shown in Table 11. These studies assess CT through various means, such as open-ended questions, formative assessment, quantitative and correlation analysis, electroencephalography (EEG), and expert evaluation with custom rubrics, as illustrated in Table 11.

Table 11. Customized assessment approach subcategories

Several assessment approaches in the mapping Tables 4–6 demonstrate inadequate alignment with specific CT skills. This is primarily due to their reliance on keywords for CT assessment, such as “Understand,” “Assumptions,” and “Concepts.” These keywords can be employed to evaluate a range of CT skills. For instance, “Understand” may apply to various situations, including understanding arguments, presenting results effectively, and recognizing personal biases. Consequently, keyword-based assessment approaches offer flexibility in their application and do not correspond to any single CT skill. Depending on the instructor’s approach, they can assess one CT skill in depth or all of them at once. While such flexibility can be advantageous, it may also lead to confusion: without prior knowledge of CT, keywords alone might be insufficient for comprehending or evaluating CT effectively. Assessment approaches with adaptable keywords are listed in a separate column in Tables 4–6.
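
The ambiguity of keyword-based assessment is easy to see in a toy example. The sketch below (keywords and reflection text invented for illustration) counts CT keyword hits in a student reflection; note that a hit on “understand” by itself says nothing about which CT skill was exercised.

    import re

    # Invented keyword list of the kind described above; a real instrument
    # would define its own keywords per course or intervention.
    CT_KEYWORDS = ["understand", "assumption", "concept", "evidence", "bias"]

    reflection = ("I tried to understand the failure data, questioned my own "
                  "assumptions, and checked the evidence against the spec.")

    # Count stem matches (e.g., "assumption" also matches "assumptions").
    counts = {kw: len(re.findall(rf"\b{kw}\w*", reflection.lower()))
              for kw in CT_KEYWORDS}
    print(counts)
    # -> {'understand': 1, 'assumption': 1, 'concept': 0, 'evidence': 1, 'bias': 0}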

Our analysis revealed that only a few assessment approaches from the past decade measured all six cognitive CT skills. Despite the extensive variety of CT assessments, no measure evaluated all 16 subskills. It is worth noting that several keyword-based assessment approaches could potentially assess all 16 subskills and thus provide a comprehensive CT evaluation; however, no such readily available assessment approach was identified in the literature during this study. This section listed the key results of the literature review; the next section offers more insight into these findings.

4 Discussion.

This study aimed to investigate various assessment approaches for tracking the development of students’ critical thinking skills. An exploratory systematic literature review of articles indexed in the Scopus database from 2010 onward was conducted, identifying 462 articles. These were narrowed down to 80 using the inclusion and exclusion criteria outlined previously. The review then scrutinized the literature to uncover which aspects of CT are most frequently assessed and how CT skills are evaluated (explicitly or implicitly) within those aspects, making it highly relevant to the development of design-related education. Many of the studies under review were related to design-based education, or used design-oriented tasks to judge students’ CT skills.

4.1 Common Practices in Critical Thinking Assessment.

We noticed three goals of CT assessment, stated explicitly or implicitly: to test whether (1) students can understand or recognize CT, (2) students can demonstrate CT, or (3) students’ CT has improved due to an intervention (refer to Table 2). These goals were often assessed using project-based learning (PBL), design-based learning (DBL), or similar open-ended design-based approaches, and they can be tied to the Revised Bloom’s Taxonomy (RBT) [42]. For instance, Goal 1 is associated with the cognitive processes of “remembering” and “understanding,” as students are expected to identify and understand CT concepts. Goal 2 coincides with the “applying” and “analyzing” processes, as students are required to illustrate CT through various forms of communication. Goal 3 relates to the “evaluating” and “creating” processes, in which students’ CT changes are assessed following an intervention. However, such a connection was made in only a few studies assessing CT [91, 98].

These goals can also be attributed to various stages of the design process [27, 28]. Goal 1 is applicable during the problem definition stage, while Goal 2 is relevant throughout the ideation, prototyping, and evaluation stages, as students apply critical thinking to generate, develop, and assess design solutions. Goal 3, assessing CT changes due to interventions, highlights the role of continuous reflection and improvement in the design process, emphasizing the role of CT skills in driving effective design outcomes. Table 2 shows that engineering education focuses on assessing the intermediate and later stages of the design process (Goals 2 and 3). These findings support the view that CT assessment and design-process assessment can be carried out in parallel, without necessarily distinguishing between the two as separate learning outcomes, which in turn supports the growing adoption of design-based education and the increasing interest in CT.

Interestingly, rubrics, surveys, and customized approaches combined show that three of the six cognitive skills (Interpretation, Inference, and Analysis) were frequently assessed, while Evaluation was assessed least. This pattern suggests several hypotheses. It may indicate an implicit expectation in the engineering field that professionals inherently possess strong capabilities in analyzing situations, understanding and interpreting information, or making decisions [11]. Alternatively, these findings could imply a valuation hierarchy within CT skills from the educators’ standpoint or within the engineering education framework, where some aspects are prioritized over others. There is also the possibility of a constrained comprehension of the full spectrum of CT skills, reflecting Ahern’s suspicions [6]. Each of these potential explanations warrants further exploration to understand the discrepancies in emphasizing specific CT skills over others.

We identified four primary CT assessment approaches: surveys, rubrics, customized assessments, and standardized tests. Surveys and customized assessment approaches were the most widely used, followed by rubrics, as shown in Table 1, and they varied surprisingly widely in structure. We could subcategorize them further based on their attributes, as shown in Tables 6 and 11. Each structure brings unique opportunities for educators and researchers. For example, the effect of deliberate changes to the problem-solving process on the outcome can be better detected using the pre-post format, as done in some studies [70–72]. But to improve the problem-solving process itself, more specific information can be retrieved using more structured or validated framework-based assessments [56, 61, 70], because they cover many of Facione’s CT skills [14], as shown in the mapping (Tables 4–6).

Similarly, customized assessment approaches offer an excellent opportunity to integrate CT assessment into design curricula or existing pedagogical practices. Several studies demonstrate that this assessment approach can be employed for various design and non-design tasks to assess students’ CT. For instance, one study assessed students’ CT skills by examining how well they demonstrated levels of Bloom’s taxonomy, which also represent CT skills [99]. In other research, a more mathematical approach was adopted to quantify learning [49, 92]. Additionally, some case studies focused on understanding the reasons behind component design failures [100]. Each of these methods provides a unique lens through which CT can be evaluated in different contexts. However, there are caveats to using customized assessment approaches. In a course assignment, for example, CT may be assessed as a single skill: some students might focus on analysis and interpretation, while others might demonstrate CT through self-regulation and the removal of personal biases, and the latter may not be captured. This can lead to the inconsistencies and mixed results that were visible during the review [74, 101]. By contrast, studies using dedicated CT measures such as CriTT [61], CLA+ [90], and CAT [53] are well informed about which aspects of CT they assess. Being aware of which assessment approach measures which aspects of CT could support holistic CT assessment, by enabling interventions or teaching approaches that target specific CT skills, such as Evaluation, which is currently assessed less, as shown in the mapping tables.

Another key finding is that the significant adoption of the customized assessment approach implies that educators lean toward the discipline-specific nature of this construct, since the approach must be tailored to course- and topic-specific questions. Yet the contradictory evidence of treating CT as a single skill in numerous studies, as shown in Table 5, indicates that CT is also perceived as a generic skill. These observations support Barrier and Jones’s [4, 5] earlier argument that whether CT is a discipline-specific or a generic skill is not universally agreed upon. We suspect the answer is subjective and likely to remain so, given the differing requirements of educators and researchers.

Throughout the literature review, various assessment approaches were explored, with rubrics offering a more in-depth evaluation of CT skills, with less variation than other methods, ranging from simple 1–4 scale Holistic rubrics [79] to the more comprehensive VALUE rubrics [80]. However, the accuracy of rubric assessments can be influenced by the subjective perceptions of the rubric user, potentially affecting the reliability of the results [102]. To mitigate this, we recommend employing double-masked scoring [103, 104], crowdsourced evaluation [105], and peer evaluation [106]. Additionally, a few studies utilized standardized commercial tests such as the CLA+, the Critical Thinking Assessment Test (CAT), and the EvaluateUR/CURE Survey [53, 88–90]. These tests map well onto Facione’s CT skills [14] but face challenges such as high costs, resource requirements, and their proprietary nature. As the influence of AI and ML continues to grow in various fields, this raises the question of whether these technologies could offer alternatives to commercial standardized tests that are equally effective, scalable, and reliable.
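
Because rubric reliability ultimately rests on inter-rater agreement, one standard check (our suggestion as an illustration, not a method prescribed by the reviewed studies) is Cohen’s kappa, kappa = (p_o − p_e) / (1 − p_e), computed here from scratch for two raters scoring the same reports on a 1–4 rubric.

    from collections import Counter

    def cohens_kappa(rater_a, rater_b):
        """Chance-corrected agreement between two raters on categorical scores."""
        n = len(rater_a)
        assert n == len(rater_b) and n > 0
        # Observed agreement: fraction of items both raters scored identically.
        observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
        # Expected agreement by chance, from each rater's marginal frequencies.
        freq_a, freq_b = Counter(rater_a), Counter(rater_b)
        expected = sum((freq_a[c] / n) * (freq_b[c] / n)
                       for c in set(rater_a) | set(rater_b))
        return (observed - expected) / (1 - expected)

    # Toy data: two faculty scoring the same ten reports on a 1-4 CT rubric.
    a = [3, 4, 2, 3, 3, 1, 4, 2, 3, 4]
    b = [3, 4, 2, 2, 3, 1, 4, 3, 3, 4]
    print(round(cohens_kappa(a, b), 2))  # -> 0.71, substantial agreement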

Overall, these observations (the assessment goals, their usefulness in assessing different phases of the design process, and the identified primary categories and subcategories of assessment) help answer RQ1, which sought to identify and understand how CT is assessed in engineering education and design.

4.2 Underexplored Areas of Critical Thinking Assessment.

The review showed that commercial tests like the CLA+ [90] and CAT [53] are validated, scalable, and reliable, yet their adoption is surprisingly low. On the other hand, most surveys were non-validated despite surveys being the most used assessment measure, which raises two questions: first, how repeatable and reliable are existing surveys? Second, is the prioritization of other technical skills over CT the reason for the “let’s see if it works” approach in CT assessment? This area requires further development and can benefit from reusing existing measures to enhance their validity and reliability, as done by Frank and colleagues [52, 53]. Behar-Horenstein and Niu [17] recommended using multiple assessment approaches, because each measure might capture only part of CT, yet the review found only a few studies using multiple approaches [52–56, 74]. More frequent use of such a combined approach would enhance understanding of CT assessment, since the drawbacks of one assessment approach can be nullified or minimized by a supporting measure.

The identified assessment categories show the complex nature of CT and how individuals’ perceptions change how they assess it. For example, a study might be interested in students’ confidence in CT, their ability to value CT, or their misconceptions about it [59]; in qualitatively analyzing student lab reports to trace whether students can address the problem, provide an in-depth analysis of the lab, and interpret the data using outside sources [107]; or in students’ ability to apply the existing Paul–Elder CT framework to demonstrate their understanding of CT [56, 73, 74]. Every study has a different take on CT, and the outcomes of its assessment change accordingly, making it hard to know whether one approach will work in another context. Additional longitudinal studies building on past understanding of CT assessment, as done by Refs. [53, 55, 108], are necessary. Comparisons between two different assessment approaches also help establish the suitability of each, e.g., rubrics in the short term and commercial tests in longitudinal studies [52, 53]. Such comparative studies would further strengthen the credibility of the identified assessment approaches.

Although widely adopted and capable of providing excellent in-depth assessment, rubrics suffer from one major scalability limitation: rubric-based assessments are difficult to implement at the university level or in studies involving a significant number of students. This challenge was also shared in Ref. [109], where the author agrees on the scalability issue of rubrics. How to create rubrics that scale easily and minimize user-induced biases is an area worth exploring.

In summary, as noted at the beginning, the adoption of design-based and problem-based education is growing globally, and several CT skills identified by Facione [14] map onto the design process, as discussed in the introduction. Our observations present an opportunity for educators and researchers to improve CT assessment along dimensions such as reliability, scalability, and the refinement of existing measures, alongside the development of new assessment approaches. Addressing the concerns and limitations raised here provides directions for future research on CT, which answers our RQ2.

4.3 Limitations.

One limitation of this research was the use of the Scopus database alone, without additional databases. Although gathering articles from 2010 onward provided a wide range of assessment approaches used in different contexts, our sample size may be limited by the single database. However, the categories and subcategories of assessment approaches we identified provide the structure needed to accommodate future assessment approaches. Even if additional databases were included, CT assessment approaches would still fit into the four categories, increasing the number of results without significantly changing the overall findings.

Another limitation was that several articles used surveys or assessment approaches that are highly context-specific, making it hard to establish their mapping to CT skills or their effectiveness in assessing CT. We also found only a limited number of multi-year CT assessment studies; this lack of longitudinal research limits our understanding of how CT assessment approaches evolve.

5 Conclusion.

In summary, this study sought to investigate different assessment methods for monitoring students’ CT skill growth within the context of engineering education. An exploratory systematic literature review identified 80 pertinent articles, many of which dealt with design education in some way, underscoring the close relationship between CT abilities and the design process. Three main objectives of CT assessment came to light throughout the review, either explicitly or implicitly, and could be connected both to the Revised Bloom’s Taxonomy and to different phases of the design process. These connections support the idea that CT assessment, design-process assessment, and ABET accreditation-related learning-outcome assessment can be carried out concurrently, and they help to explain why design-based education is becoming more widely used and why interest in CT is growing.

Several primary CT assessment approaches were identified, each with unique opportunities and challenges. Customized assessment approaches offer great potential for integrating CT assessment into design curricula, while rubrics provide a more in-depth evaluation of CT skills. However, concerns with scalability and potential biases in rubric assessments warrant further exploration. Additionally, the limited adoption of standardized commercial tests raises questions about the future of CT assessment and the potential role of AI and ML in developing alternative, scalable, and reliable assessment tools.

The review also revealed underexplored areas of CT assessment, such as the reliability of existing surveys and the limited use of multiple assessment approaches. These findings present opportunities for further development and refinement of CT assessment measures, addressing concerns related to scalability and user-induced biases. Longitudinal studies and the exploration of scalable rubrics are among the potential directions for future research, which would help to advance our understanding of CT assessment in engineering education and design. By addressing these concerns and limitations, this study provides valuable insights for educators and researchers seeking to enhance critical thinking assessment within the growing field of design education.

Conflicts of interest have been declared to the Editor and will be included in a Conflict of Interest Declaration section of the final paper.

No data, models, or code were generated or used for this paper.

The list of articles included in our study is tabulated below. Because not all reviewed articles are cited in the text, the full list is provided here as an appendix. This table can help relevant stakeholders choose the assessment approach appropriate for their context and needs.


A Study of Critical Thinking and Cross-Disciplinary Teamwork in Engineering Education


Hulya Julie Yazici, Lisa A. Zidek, and Halcyon St. Hill

Part of the book series: Women in Engineering and Science (WES)


Preparing engineering students for a career reaches beyond typical coursework, memorizing equations, and solving end-of-chapter problems. Students must learn to work and solve problems in a cross-disciplinary environment. As engineering curricula are reassessed to focus more on skills, capabilities, and techniques, as well as on cultivating ethical values and attitudes, more research is needed to understand what contributes to critical thinking skills and to overall higher academic achievement. The study of critical thinking in engineering education and of engineers’ cross-disciplinary collaboration deserves further exploration. An underlying theme is that critical thinking is not taught; rather, it is developed through experiential learning and systematic approaches to problem solving. This chapter describes the critical thinking performance of engineering students in association with their thinking styles and in relation to cross-disciplinary team settings.

Source: Yazici, H.J., Zidek, L.A., St. Hill, H. (2020). A Study of Critical Thinking and Cross-Disciplinary Teamwork in Engineering Education. In: Smith, A. (ed) Women in Industrial and Systems Engineering. Women in Engineering and Science. Springer, Cham. https://doi.org/10.1007/978-3-030-11866-2_8


Applying A Critical Thinking Model for Engineering Education


ESSENTIAL INTELLECTUAL TRAITS

These virtues are not radically distinct from those sought by any maturing thinker. They determine the extent to which we think with insight and integrity, regardless of the subject. The engineering enterprise does, however, pose distinct questions for the engineer in pursuit of such virtue.

FUNDAMENTAL ELEMENTS OF THINKING

APPLYING INTELLECTUAL STANDARDS

THE MODEL IN ENGINEERING EDUCATION

Fostering Intellectual Traits

Engineering students are likely puzzled at first by the suggestion that personal virtues relate to their success as engineers. The criticality of these traits becomes prominent in their interactions as members of teams. Consequently, introducing the traits and using them to foster development is most effectively done in the context of students’ efforts to make their teams succeed.

At the conclusion of team projects, or coincident with major milestones (for long-duration projects), team members can be assigned to write a paragraph identifying a vignette in which one of the intellectual traits was exhibited in a way that benefited the team, and a second vignette in which an individual or team deficit in the intellectual traits hampered team performance. The faculty member or team manager should then collate the vignettes, stripping contributors’ names (recognizing that the team manager may be the subject of either positive or negative vignettes). A group discussion of the results should be included as part of the technical debrief.

Employing the Elements of Reasoning

The real power of this taxonomy of thinking is its scalability. A topic as large as an entire course or as small as a newspaper editorial or a single lecture can be decomposed using the elements. The student can be asked to decompose a journal article, course topic, textbook chapter, or technical report using this framework. Opportunities abound for using the eight elements both in class and outside coursework.

The eight elements can be introduced to the students in several ways. The guide includes a number of templates and examples. The most effective way for the students to become comfortable working with the elements is to review an example and then immediately apply the template to some subject area.

On the opening day of a class, the entire class can be asked to identify the eight elements associated with the prerequisite course, e.g., “Identify the eight elements associated with the class you finished last semester in Aerodynamics. What was the purpose of Aerodynamics? What question was it trying to answer? What was the point of view? What assumptions were commonly made? What information was brought to bear? What concepts were key? What conclusions were formed? What were the implications of the material you learned?” After students are given 6-8 minutes to do this individually, they can share their answers either in small groups or as a class. They can then be assigned to skim their new text’s table of contents and decompose the new course according to the same template; a minimal version of the template is sketched below.
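
For instance, the template can be handed to students as a simple checklist. The sketch below is our illustration, using Python as pseudocode for such a handout; it pairs each of the eight elements with the prompts used above.

    # The eight elements as decomposition prompts, paraphrasing the
    # in-class questions above; apply to any course, article, or chapter.
    ELEMENTS = {
        "purpose":       "What was the purpose of the material?",
        "question":      "What question was it trying to answer?",
        "point of view": "What was the point of view?",
        "assumptions":   "What assumptions were commonly made?",
        "information":   "What information was brought to bear?",
        "concepts":      "What concepts were key?",
        "conclusions":   "What conclusions were formed?",
        "implications":  "What are the implications of the material?",
    }

    def decompose(subject):
        """Print the eight prompts for a given subject as a handout."""
        print(f"Decompose: {subject}")
        for element, prompt in ELEMENTS.items():
            print(f"  {element}: {prompt}")

    decompose("Aerodynamics (prerequisite course)")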

The faculty member is indispensable in keeping the elements close to the surface of the students’ thinking. This is best done through Socratic interaction in which the questions posed by the professor map onto one of the eight elements: “What assumptions constrained this approach?” “What implications follow from this development?” “When we started to derive this relationship, what question were we trying to answer?” “What is the source of this insight? Was it theoretical or experimental? What empirical support do we have for this theoretical result?”

At the end of any course segment, or at the end of the entire course, students can be tasked to decompose that chapter’s content or the whole course using the eight elements. At any point during the course, the content of a relevant article can likewise be decomposed.

The value of this practice lies in helping students build a context for the segment or course. It provides a framework for recalling the importance of assumptions, keeping the big-picture question in view, and moving beyond the direct content to wrestling with its implications.

Teaching the Intellectual Standards

An effective means of introducing the intellectual standards is reciprocal teaching. Using the Engineering Reasoning guide, students should be assigned in pairs to read the descriptions and example questions associated with Clarity and Accuracy (one student assigned to each). They should be given 3-4 minutes to prepare to explain their assigned standard to their partner, including both representative questions from the guide and an example they have created themselves.

In class, the standards provide a template for developing good questions to pose in Socratic fashion. In doing so, the professor models the thinking of mature engineers through the questions posed.

Ancillary material

The vignettes in the back of the guide are intended to illustrate both successes and failures in engineering using our critical thinking vocabulary. They are included to foster discussion of the results of both excellent and deficient engineering reasoning. Students can be encouraged to research other historical examples and specifically evaluate how the success or failure of a technical enterprise turned on the quality of thought. While we commonly dissect accidents for their technical and organizational flaws, it is also illuminating to evaluate the thinking present in these episodes.



The real deal

The Allen Angel Capital Education Program puts students in charge of selecting and investing in high-growth startup companies.


May 8, 2024. Photo by Abbie Lankitus

The stakes are high for students involved in the Allen Angel Capital Education (AACE) Program at the Robert J. Trulaske, Sr. College of Business.

This group of mid-Missouri investors is composed of University of Missouri students dealing with real money and real entrepreneurs. Their decisions to invest in high-growth startup companies are based on long hours of cultivating deal flow, performing pre-screening duties, completing due diligence and structuring investment contracts.

AACE is a hands-on, learning-by-doing course with a twist — it’s real life.  

This semester, 17 students spanning seven Mizzou majors have been managing approximately $700,000 in assets, most of it deployed to 15 portfolio companies.

For some students like Kyle Klostermann, a junior majoring in accounting, AACE has been an opportunity to learn more about the other side of his entrepreneurial dream — the side of the investor tasked with determining whether an entrepreneurial vision is worth the investment. For others like Yuriy Snyder, a doctoral student in biological engineering, AACE has been an education in how engineers like himself can best pitch their ideas to investors and get them into the marketplace.

Both agree that working alongside students from other disciplines has broadened their insights and forced them to think more critically about angel and venture capital investment strategies.

“Everyone is bringing a different perspective to the table,” said Klostermann, who serves as a managing director for the group along with Snyder and two other students. “Being able to have all these different perspectives to consider before moving on to the next step of the process is an amazing opportunity.”

Investing in students

That’s exactly what W. D Allen was hoping for when he and his sister, Pinney, launched the AACE program 12 years ago with donations of $25,000 and $500,000, respectively. Allen, an adjunct professor, coordinates the AACE program and oversees the course. He is quick to point out, however, that the students are in charge: they serve as investment analysts for the fund, cultivating deal flow, performing pre-screening duties, completing due diligence, structuring and negotiating investment contracts, and monitoring portfolio holdings to ultimately drive financial returns for the fund.

“I sit back and marvel at what they are able to do,” Allen said.

Other advisors for the group are Kate Holland, an assistant professor of finance, and Gene Gerke and Steve Guthrie, co-presidents of Centennial Investors Angel Network.

Putting critical thinking to work

Before each semester, the managing directors of AACE run a three- to four-session bootcamp for members new to the investment world. Then, throughout the course, AACE advisors work with the students, who actively learn about angel and venture capital investment strategies through a balance of case studies, books and deal flow from actual companies seeking funding.

Grace Demetrician, another AACE managing director and a junior pursuing a business degree with an emphasis in finance and banking, said that while much of her education has stressed the importance of critical thinking, AACE has challenged her to balance her emotions with thinking critically, an interesting juxtaposition she hasn’t faced in many other classes.

“You walk the line between being excited for a startup company and wanting them to do well and thinking critically about that company and considering whether they’re a good investment,” she said. “It’s a balancing act that is made more urgent because we are dealing with real money.”

Another AACE managing director, Becca May, a graduate student in geology, said she has come to appreciate how each student has an equal stake in the course, an aspect that has leveled the playing field in a way unique to her academic experience.

“We each have one vote, and we have to come to an agreement, knowing that you might win some and you might lose some,” she said. “It’s been humbling and eye-opening for me.”

Because of the responsibility entrusted to them during AACE, the managing directors said they’ve grown more confident in their abilities to lead.

“When I got into AACE, I had no business background,” Snyder said. “Just going from science to business, I experienced a terminology gap that I had to overcome. It was difficult to learn a new style of communication, but it’s been so cool to step into a whole other world. I’ve developed a confidence that only comes from learning something completely new.”

Story written by Sara Diedrich

