
Definitions of Computational Thinking, Algorithmic Thinking & Design Thinking

by Lcom Team | Jun 7, 2022 | Blogs



While there are differences between them, these methods all blend critical thinking and creativity, follow iterative processes to formulate effective solutions, and help students embrace ambiguous, open-ended questions. So, without further ado…

Definition of Computational Thinking

Computational thinking is a set of skills and processes that enable students to navigate complex problems. It relies on a four-step process that can be applied to nearly any problem: decomposition, pattern recognition, abstraction and algorithmic thinking.

The computational thinking process starts with data as the input and seeks to derive meaning and answers from it. The output is not only an answer but a process for arriving at it: computational thinking also plots the journey to the solution, so that the process can be replicated and others can learn from it and use it.

The computational thinking process includes four key concepts: 

  • Decomposition: Break the problem down into smaller, more manageable parts.
  • Pattern Recognition: Analyze data and identify similarities and connections among its different parts.
  • Abstraction: Identify the most relevant information needed to solve the problem and eliminate the extraneous details.
  • Algorithmic Thinking: Develop a step-by-step process to solve the problem so that the work is replicable by humans or computers.

Examples of Computational Thinking

Computational thinking is a multi-disciplinary tool that can be broadly applied in both plugged and unplugged ways. These are some examples of computational thinking in a variety of contexts.

1. Computational Thinking for Collaborative Classroom Projects

To navigate the different concepts of computational thinking – decomposition, pattern recognition, abstraction and algorithmic thinking – guided practice is essential for students.

In these classroom-ready lesson plans, students cultivate understanding of computational thinking with hands-on, collaborative activities that guide them through the problem and deliver a clearly articulated and replicable process – an algorithm 😉 – that groups present to the class.

  • Computational Thinking Lesson Plan, Grades K-2
  • Computational Thinking Lesson Plan, Grades 3-5
  • Computational Thinking Lesson Plan, Grades 6-8

2. Computational Thinking for Data-Driven Instruction

In this example, the New Mexico School for the Arts sought a more defined process for using data to inform decision-making across the school. They had developed interim assessments that generate actionable data, but the process of mining that data for relevant information was incredibly cumbersome.

To expedite and improve data analysis, they designed a coherent process for quickly pulling the most important information out of the data. This process can now be applied time and time again, and it has enabled them to tailor instructional planning to the needs of students.

3. Computational Thinking for Journalism

To measure gender stereotypes in films, Julia Silge, data scientist and author of Text Mining with R, gathered data from 2,000 movie scripts. Decomposing the problem, she specified that she would look at the verbs associated with male and female pronouns in screen directions.

By identifying patterns in sentence structure, Silge was able to extract and measure the relevant data at scale, which made the research possible. Her analysis resulted in the article She Giggles, He Gallops.
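
The core counting step behind this kind of analysis can be sketched in a few lines of Python. This is a minimal illustration of the general technique, not Silge's actual pipeline (her analysis was done in R and filtered the following words to verbs with part-of-speech tagging); the sample screen directions below are invented.

```python
from collections import Counter
import re

# Count the word that immediately follows a gendered pronoun in each
# line of screen direction. A real analysis would POS-tag the text and
# keep only verbs; here we simply tally raw bigrams.
screen_directions = [
    "She giggles and looks away.",
    "He gallops across the field.",
    "She snuggles into the chair while he strides out.",
]

counts = {"she": Counter(), "he": Counter()}
for line in screen_directions:
    tokens = re.findall(r"[a-z']+", line.lower())
    for pronoun, following in zip(tokens, tokens[1:]):
        if pronoun in counts:
            counts[pronoun][following] += 1

for pronoun, counter in counts.items():
    print(pronoun, counter.most_common(3))
```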

Definition of Algorithmic Thinking

Algorithmic thinking is not solving for a specific answer; instead, it works out how to build a replicable process: an algorithm, a formula for calculating answers, processing data, or automating tasks.

Algorithmic thinking is a derivative of computer science and coding. It seeks to automate the problem-solving process by creating a series of systematic, logical steps that process a defined set of inputs and produce a defined set of outputs.

Examples of Algorithmic Thinking

Here are three examples that cover algorithms in basic arithmetic, standardized testing and our good ol’ friend, Google.

1. Algorithmic Thinking in Long Division

Even without diving into technology, there are algorithms we teach students, whether or not we realize it. For example, long division follows the standard division algorithm for dividing multi-digit integers to calculate the quotient.

The division algorithm enables both people and computers to solve division problems with a systematic set of logical steps, as this video shows. Rather than having to analyze and parse through these problems, we are able to automate solving for quotients because of the algorithm.
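
As a sketch of how a computer might carry out those same systematic steps, here is the digit-by-digit procedure written in Python. This is a minimal illustration of the schoolbook algorithm, not how computers divide at the hardware level:

```python
def long_division(dividend: int, divisor: int) -> tuple[int, int]:
    """Standard long-division algorithm: bring down one digit at a time,
    record how many times the divisor fits, and carry the remainder."""
    if divisor == 0:
        raise ValueError("cannot divide by zero")
    quotient_digits = []
    remainder = 0
    for digit in str(dividend):                   # left to right
        remainder = remainder * 10 + int(digit)   # "bring down" the digit
        quotient_digits.append(remainder // divisor)
        remainder %= divisor
    quotient = int("".join(map(str, quotient_digits)))
    return quotient, remainder

print(long_division(7438, 6))  # (1239, 4), since 6 * 1239 + 4 = 7438
```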

2. Algorithmic Thinking in Standardized Testing

A relatively recent development in standardized testing is the advent of computer adaptive assessments, which select questions based on student ability as determined by the correct and incorrect answers given.

If a student selects the correct answer to a question, the next question is moderately more difficult. But if they answer wrong, the assessment offers a moderately easier question. This occurs through an iterative algorithm that starts with a pool of questions; after each answer, the pool is adjusted accordingly, and the process repeats.

The purpose of this algorithmic approach to assessment is to measure student performance in a more targeted way. This iterative algorithm isn’t just limited to standardized tests; personalized and adaptive learning programs use this same algorithm, too.
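
A toy version of such an adaptive loop can be written in a few lines. This sketch is illustrative only: real adaptive tests use psychometric models such as item response theory to estimate ability, and the question pool, the 1-10 difficulty scale, and the 60% success rate below are all invented.

```python
import random

# Placeholder pool: one stub question per difficulty level 1 (easy)..10 (hard).
question_pool = {d: [f"question at difficulty {d}"] for d in range(1, 11)}

def run_adaptive_test(num_items, answered_correctly):
    """Start mid-difficulty; step up after a correct answer, down after
    an incorrect one. Returns the sequence of difficulties served."""
    difficulty, path = 5, []
    for _ in range(num_items):
        path.append(difficulty)
        question = random.choice(question_pool[difficulty])
        if answered_correctly(question):
            difficulty = min(10, difficulty + 1)   # moderately harder
        else:
            difficulty = max(1, difficulty - 1)    # moderately easier
    return path

# Simulate a student who answers correctly about 60% of the time.
print(run_adaptive_test(8, lambda q: random.random() < 0.6))
```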

3. Algorithmic Thinking in Google

Have you ever wondered why those particular results appear for a query, as opposed to the ones on the second, third, fourth, or tenth page of a Google search?

You guessed it! Google’s search results are determined (in part) by the PageRank algorithm, which assigns a webpage’s importance based on the number of sites linking to it. In other words, the algorithm looks at hyperlinks to a webpage as an upvote.

So, if we google ‘what is an algorithm,’ we can bet that the chosen pages have the most links to them for the topic ‘what is an algorithm.’ It’s still more complicated than this, of course. PageRank also looks at the score for the site that is linking to the webpage to rank the authority of the link. And there is still much more; if you are interested, this article goes into the intricacies of the PageRank algorithm.
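
The core idea can be sketched in a few lines of Python: every page's score is repeatedly recomputed as a weighted sum of the scores of the pages linking to it. The four-page link graph below is invented for illustration, and d = 0.85 is the commonly cited damping factor; production search ranking is vastly more elaborate.

```python
# page -> pages it links out to (a tiny made-up web)
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
N = len(links)
d = 0.85                        # damping factor from the original paper
rank = {page: 1 / N for page in links}

for _ in range(50):             # iterate until the scores settle
    new_rank = {}
    for page in links:
        # each inbound link contributes its page's rank, split evenly
        # across that page's outbound links
        inbound = sum(rank[q] / len(links[q])
                      for q in links if page in links[q])
        new_rank[page] = (1 - d) / N + d * inbound
    rank = new_rank

print({page: round(rank[page], 3) for page in sorted(rank)})
# "C" ends up highest: three pages link to it.
```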

What can we take away from this? There are over 1.5 billion websites with billions more pages to count, but thanks to algorithmic thinking we can type just about anything into Google and expect to be delivered a curated list of resources in under a second. This right here is the power of algorithmic thinking .

Definition of Design Thinking

Design thinking is a method for solving problems that are vague, open-ended and don’t have a defined output. It starts with asking, “Why is this a problem?” and uses empathy, definition, ideation, prototypes, testing and improvements to design a unique output.

Design thinking is a user-centered approach to problem solving. The process ends with a deliverable of sorts, whether technological or constructed with tape and paper. Rather than being a replicable approach like computational thinking or algorithmic thinking, design thinking is conceptual and its outputs are unique.

The design thinking process contains the following steps: empathize, define, ideate, prototype and test (plus improve).

  • Empathize: Research the needs of the user to understand why they have the problem and identify their pain points.
  • (re)Define: Specify and articulate the problem based on feedback from the empathize phase.
  • Ideate: Strategize different ways to solve the problem that fit the user’s needs.
  • Prototype: Build models of sample solutions.
  • Test: Try the prototypes, experiment with them and seek feedback.
  • Improve: Consider what worked and what did not from testing the prototypes, return to the ideate phase to develop enhanced prototypes and test again.

Design thinking is a non-linear process, meaning that we return to steps and restart in certain areas. Design thinking is deliverable-focused, making sure what we create best serves and represents the end user’s needs.

Examples of Design Thinking

Design thinking is widely applied. Here are a few examples of innovative and disruptive ways teachers, schools and organizations are using design thinking.

1. Design Thinking Student Projects

In this article, Kristen Magyar, fifth-grade teacher and STREAM enthusiast, shares how she was inspired to create a toy invention unit based on the popular show, Toy Box. What makes this project so excellent is that Magyar tailored it to the students’ interests, knowing that learning is far more likely to resonate when instruction is relevant to their personal experiences and interests.

The Toy Box unit was project-based and centered on the design thinking process. Students invented entirely new toys and pitched them to a panel of judges. Learn more about this collaborative project here!

2. Design Thinking for School Improvement

This interview features Sam Seidel, Director of K12 Strategy + Research at the Stanford d.school. He is passionate about using design thinking to reimagine education. He focuses in part on school initiatives like project-based learning and state programs like standardized testing.

Seidel’s message is that as schools seek to innovate their processes and programs, they need to bring teachers into the conversations. Initiatives will not be as effective without buy-in from teachers. He encourages school and district leaders to empathize with problems teachers may have, develop solutions that match their needs and their students’ needs, and embrace an iterative process for honing the efficacy of those solutions.

3. Design Thinking for Business Growth

Now we get to talk about my second favorite topic (education being the first): food. As one of many food delivery applications, UberEats uses design thinking to improve on a city-by-city basis. UberEats affirms that its work must be relevant to its users, and as a multinational company, that means tailoring the program to each city in which it operates.

To do so, UberEats immerses their employees in different cities by exploring and eating their way through the various cuisines (Um… can I sign up for this?), talking with restaurants, and meeting with platform users.

UberEats then translates the findings into prototyped solutions. They iterate quickly and are not afraid of making improvements on the fly to uphold their belief that a user-centered product will grow its market and outperform its competition.



Computational Thinking is Critical Thinking—and Belongs in Every Subject

Identifying patterns and groupings is a useful way of thinking not just for computer scientists but for students in all fields.


Computational thinking, a problem-solving process often used by computer scientists, is not that different from critical thinking and can be used in any discipline, writes Stephen Noonoo in “Computational Thinking Is Critical Thinking. And It Works in Any Subject” for EdSurge.

Elements of computational thinking, like pattern recognition, are easily transferred to unexpected areas of study like social studies or English, says Tom Hammond, a former teacher who is now an education professor at Lehigh University. Hammond says that students like the computational thinking approach because it’s engaging: “Ask yourself, would you rather get to play with a data set or would you rather listen to the teacher tell you about the data set?” 

For example, in history classes students make use of data-rich, often open-source geographic information systems, or GIS, to plot election results from the colonial era to reimagine the way politics unfolded in the 1700s. These kinds of data visualization exercises offer a way for students to actively manipulate real-world information for deeper engagement and understanding.

There are three steps to bring computational thinking into your classroom, regardless of your subject area. First, consider the dataset. Hammond offers an example of incorporating computational thinking into a social studies class: a student is asked to give five state names, which Hammond writes on the board. Then a different student lists five more states.

Once all the information is on the table, students execute the second step: identifying patterns. “Typically, this involves shifting to greater levels of abstraction—or conversely, getting more granular,” Noonoo writes. For students looking for commonalities or trends, this kind of critical thinking “cues them into the subtleties.” In the states example, students try to identify why Hammond grouped the states in the way he did. Is it by geography? Is it by what date they became part of the United States? Slowly, students begin to identify patterns—something the brain is already hardwired to do, according to Hammond. 
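
The grouping step of this exercise translates naturally into code. Here is a hypothetical sketch: given a few states and their years of admission to the Union (the years are accurate, but the sample and the candidate rule are arbitrary), test whether a rule reveals a pattern.

```python
# A code rendering of the states exercise: propose a grouping rule and
# see whether the groups it produces suggest a pattern.
admission_year = {
    "Delaware": 1787, "Georgia": 1788, "Vermont": 1791,
    "Texas": 1845, "California": 1850, "Hawaii": 1959,
}

def group_by(data, rule):
    """Group dictionary keys by applying `rule` to each value."""
    groups = {}
    for key, value in data.items():
        groups.setdefault(rule(value), []).append(key)
    return groups

# Candidate rule: century of admission to the Union.
print(group_by(admission_year, lambda year: f"{year // 100 + 1}th century"))
```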

In the final stage, decomposition, students break down information into digestible parts and then decide “What’s a trend versus what’s an outlier to the trend? Where do things correlate, and where can you find causal inference?” From there, students establish a rule from the data, a process that requires them to make fine distinctions about how complex datasets can be reliably interpreted, Hammond says.

“It definitely took some practice to help them understand the difference between just finding a relationship and then a cause-and-effect relationship,” says Shannon Salter, a social studies teacher in Allentown, Pennsylvania, who collaborates with Hammond. 

An entire curriculum can be dedicated to incorporating computational thinking, but that kind of “major overhaul” isn’t required, Hammond says. “It can be inserted on a lesson-by-lesson basis and only where it makes sense.” 

Computational thinking is not that far afield from critical thinking. The processes mirror each other: “look at the provided information, narrow it down to the most valuable data, find patterns and identify themes,” Noonoo writes. Students become more agile thinkers when they exercise these transferable skills in subjects not often associated with computer science, like history or literature.


Lesson 1: What is computational thinking? 

Being able to think is one of the hallmarks of being a human being - especially meta thinking, which involves thinking about your own thoughts. We often overlook the complexity of thinking because it comes so naturally to us. It seems so easy because we do it every day with little to no thought about how the process works. It only becomes apparent that it is complex when something or someone interrupts our thought process, e.g. try explaining a specific idea you have to someone in one or two sentences, or try to follow a lecture on a topic you don’t know anything about. These tasks become difficult because in our daily life we strip away what we can safely assume in order to make communication easier and more effective. This stripping away of “unnecessary” information becomes a problem when we need it. This is most prominent when working with computer programming, because a computer cannot assume any information, so you have to provide commands without exception. This means that you have to adjust your thinking slightly to work through problems in the same way a computer program would. This isn’t as daunting as it sounds because it is quite straightforward, and it can also help with the way in which you solve problems outside of programming too.

The main purpose of computational thinking is to identify problems and solve them. This seems simple, but problems themselves can be quite complex and made up of different parts, each of which forms its own separate problem. The complex nature of problems can make them seem overwhelming, but when you follow a few simple steps, you are able to break them down into manageable pieces. This is very effective because it works not only for computational problems but also for most problems you will face in life. There are four key steps in this process:

Decomposition

Pattern recognition

Abstraction

Algorithmic thinking

Using these four components, you will be able to successfully solve problems by “dividing and conquering” them. There are also some additional factors to consider that are not directly related to the core way of solving the problem: constraints. Constraints may be unforeseeable, imposed upon your project from outside, or simply factors that weren’t considered at the start. In this course, you will learn to use these four components as well as how to deal with constraints and mitigate their effects.

Another point that is important to understand involves the following three words: what, why and how. It is always important to understand what you are doing, because if you don’t understand that, then you cannot reorientate yourself when you face problems in your project. In the same way, you need to understand why you are doing something. Keeping a goal in mind when you break down a problem into smaller pieces will keep you focused on the big picture even when you are working on the smaller issues. This is vital for not losing sight of what your final goal is. Lastly, having a clear understanding of how you are going to do something will lend an element of realism to a project and reveal any potentially unrealistic expectations right at the beginning. These are three important components to keep in mind when you are working on any step of the computational thinking process. They are supplementary: even though they help keep you on track, you still need the plan and the action to achieve the goal.

This course will teach you how to effectively use these concepts to become a self-sufficient learner and problem-solver, not only in the computational world but also in the real world, where you are faced with problems that are much less logical and solutions that are suboptimal at best. The principles behind both real-world and computational problems are essentially the same, so these lessons are applicable in most - if not all - domains of study and life. The main point to keep in mind is that the purpose of the exercises isn’t necessarily to achieve that specific goal; it’s rather to go through the process so that you learn the formula to achieve any goal.

The four components of computational thinking 

1. Decomposition

In simple terms, decomposition is the process of breaking down a large problem into smaller problems. There are a few reasons why this is helpful in the bigger picture. It gives you insight into the practicalities associated with solving the problem. You can view the smaller tasks with more understanding of what needs to be done because the goal is clearer. This can help you develop actionable steps and get started on solving them. Imagine for a moment that you have moved to a new house and there is nothing in it yet, but you want to make a cup of coffee after a day of moving. This seems like a simple enough problem to solve - except that it isn’t. You are actually dealing with a complex problem, because it’s made up of several problems disguised as one, which is why it can seem impossible at first. You cannot make coffee because you don’t have any of the ingredients in your house. So there are two main tasks to solve: buy ingredients and make the coffee.

These two tasks can be further broken down into even simpler tasks. When you buy ingredients, you need to do three things:

Make a list of the ingredients you need.

Decide where you are going to buy them.

Decide how you are going to get there.

Instructor note

This can further be divided into separate tasks for each item that you need to buy (e.g. coffee powder).

You can buy all of those items at the supermarket, so you decide to go there.

The fastest way to the supermarket is by bus, so you decide to take the bus.

You also need to make the same trip home after you are done at the supermarket.

2. Pattern recognition 

When you have broken the problem down into smaller tasks, you can look for patterns. The first step (the list of ingredients) is made up of several smaller steps even though it looks like one step. A naive plan might handle each item separately:

Take the bus to the supermarket

Purchase the milk

Take the bus back home

Repeat the process for all the items

This isn’t an efficient way of doing things, so we can look for patterns in the tasks. The glaringly obvious pattern is that if we buy all the items at once, then we only have to make one trip to the supermarket and one trip back home. This is the process of pattern recognition, which is very useful for applying previous knowledge to new problems. For example, perhaps you are going to a new supermarket because your regular supermarket is closed for the day. You don’t need to go through the entire process of planning everything out, because you can use the same pattern as usual, adjusting a few key points. You would have to take a different bus and walk an extra few meters to get to the new supermarket, but buying the ticket, purchasing the items inside the supermarket and returning home is still the same process. You have recognized a pattern that you can use for other problems which have similar characteristics.

3. Abstraction 

The process of abstraction is to discard unnecessary details that are not relevant to solving the problem. You cannot take everything into account when making a decision, so you filter out any unnecessary details and focus on what is relevant to the problem you are solving. In the above example, you take the bus to get to the supermarket. Is it important that you know every stop on the way to the supermarket? No. Is it important that you know the model of the bus you are taking? No. Is it important that you know the bus driver’s name? No. These are all factors that could be relevant to someone else if they have a task that involves those details. For example, if you are a bus driver and you need to change shifts with a bus driver named John, then it’s important to know the name of the bus driver. So, it’s not necessarily the case that the details are not important, but rather that there are details that are not important to your own task.

4. Algorithmic thinking 

When you have decomposed the problem, identified any patterns and filtered out the unnecessary details, you are ready to create a step-by-step guide on how to solve the actual problems. At this point you need to make detailed plans for each step. You have to specify actions in the right order and with sufficient detail, so you can’t just say “take the bus to the supermarket and come back when you’re done”. You need to specify the smaller details, such as the time you need to catch the bus, where you need to catch the bus and which number bus you need to catch. Then you need to specify where to get off, which direction to take towards the supermarket and how long to walk from the bus stop. Once you’re in the supermarket, you need to find all the items, collect them in a basket and pay for them. Then you repeat the bus process in reverse order, making sure to take the bus from the opposite side of the street.
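
Written out, such a plan starts to look exactly like a program. In this sketch the coffee-powder item comes from the lesson, but the other list items, the bus number, the street names and the distances are invented placeholders for the kind of concrete detail this step demands:

```python
# The finished errand as an explicit algorithm: each instruction is
# concrete enough for a computer (or a distracted human) to follow
# without guessing. All specifics below are illustrative.
shopping_list = ["coffee powder", "milk", "sugar"]

errand = [
    "walk to the bus stop on Main Street",
    "board bus number 12 toward the city centre",
    "get off at the Supermarket stop and walk 200 m north",
    *[f"find and collect: {item}" for item in shopping_list],
    "pay for the basket at the till",
    "board bus number 12 from the opposite side of the street",
    "get off at Main Street and walk home",
]

for step_number, step in enumerate(errand, start=1):
    print(f"Step {step_number}: {step}")
```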

The relevance of the four components 

The importance of the four components is to focus your thinking on the details of the problem, remove any unexamined assumptions you might have and realistically show what kind of problem you are dealing with. This may seem a bit strange with the first example about coffee. What is important, however, isn’t the example itself, but rather the way in which it was broken down and solved. This forms a blueprint for solving problems, and you can use this blueprint to solve other problems. After doing a simple example, you can scale up the complexity of the problems until you are able to do this for any problem you face. However, there are other factors to take into account because, after all, the world we live in isn’t a static place, so things often change.

A note on the difference between decomposition and algorithmic thinking 

While creating a step-by-step plan can be seen as a form of decomposition, it’s important to note that decomposition is a broader concept that encompasses the identification of major components or subproblems. Algorithmic thinking is a more detailed and specific step that involves designing the precise instructions or actions to solve those subproblems. Both steps are crucial in computational thinking as they contribute to breaking down complex problems and devising effective solutions.

You can think of this using the following analogy: if you were to organize a trip overseas, then you would break the problem down into a few smaller parts, i.e. travel to destination, book into accommodation, organize a few external trips, finish trip and book out, travel back home. This is a broad overview of the solution, which is how the decomposition step works. When you are on the algorithmic thinking step, you take each of those smaller parts and create a plan to solve them. Take for example the first part, “travel to destination”: decide which mode of transport (boat, car, train or air travel), then decide on the dates and time of departure, etc. At this stage, you should be breaking each part down into very specific and actionable solutions.

Constraints 

There are often things that change along the way, so it’s important to understand that most of the time you will have to work within some constraints, because you hardly ever have the ideal conditions for carrying out your plan. For example, if the supermarket doesn’t have any coffee in stock, then what is the solution for that? You could buy tea instead or buy some takeout coffee from the restaurant next door. These aren’t optimal solutions, but they are alternatives due to the constraints that you may face in the real world. What if you find out that the buses have changed their payment systems and now you need to pay with a transit card? The only problem is that you’ve never used a transit card before, so you need to figure out how that works. In this case, the decomposition of your plan is still valid, but you need to adjust the algorithmic thinking portion of the four components. You would need to prioritize getting a bus transit card and loading it with money before going to the bus stop. This forms a new task which takes higher priority than the other tasks, since you cannot complete any of the other tasks without first getting the bus transit card.

Something to think about

In your own life think back to a time when you had a problem that seemed overwhelming. How did you manage to solve it at the end of the day?

Using the four components of computational thinking described above, could you have created a better plan to solve that problem that seemed so overwhelming before?


The one about algorithmic thinking in computational thinking.


An algorithm is a process or formula for calculating answers, sorting data, and automating tasks; and algorithmic thinking is the process for developing an algorithm.

“Effective algorithms make assumptions, show a bias toward simple solutions, trade off the costs of error against the cost of delay, and take chances.” Brian Christian, Tom Griffiths

With algorithmic thinking, students endeavor to construct a step-by-step process for solving a problem and like problems so that the work is replicable by humans or computers.

Algorithmic thinking is a derivative of computer science and the process used to develop code and program applications. This approach automates the problem-solving process by creating a series of systematic, logical steps that take in a defined set of inputs and produce a defined set of outputs based on them.

In other words, algorithmic thinking is not solving for a specific answer; instead, it works out how to build a sequential, complete, and replicable process that has an end point – an algorithm. Designing an algorithm helps students to both communicate and interpret clear instructions for a predictable, reliable output. This is the crux of computational thinking.

Examples of Algorithms in Everyday Life

And like computational thinking and its other elements we’ve discussed, algorithms are something we experience regularly in our lives.

If you’re an amateur chef or a frozen meal aficionado, you follow recipes and directions for preparing food, and that’s an algorithm.

When you’re feeling groovy and bust out in a dance routine – maybe the Cha Cha Slide, the Macarena, or Flossing – you are also following a routine that emulates an algorithm and simultaneously being really cool.

Outlining a process for checking out books in a school library or instructions for cleaning up at the end of the day is developing an algorithm and letting your inner computer scientist shine.

Examples of Algorithms in Curriculum

Beginning to develop students’ algorithmic prowess, however, does not require formal practice with coding or even access to technology. Have students map directions for a peer to navigate a maze, create visual flowcharts for tasks, or develop a coded language.

To get started, here are ideas for incorporating algorithmic thinking in different subjects .

English Language Arts: Students map a flow chart that details directions for determining whether to use a colon or dash in a sentence (see the sketch after this list).

Mathematics: In a word problem, students develop a step-by-step process for how they answered a question that can then be applied to similar problems.

Science: Students articulate how to classify elements in the periodic table.

Social Studies: Students describe a sequence of smaller events in history that precipitated a much larger event.

Languages: Students apply new vocabulary and practice speaking skills to direct another student to perform a task, whether it’s ordering coffee at a café or navigating from one point in a classroom to another.

Arts: Students create instructions for drawing a picture that another student then has to use to recreate the image.
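
To make the English Language Arts idea above concrete, such a flowchart could be rendered as a small decision function. The rules below are deliberately simplified usage heuristics for illustration, not a complete account of colon and dash usage:

```python
# A flowchart as code: each branch corresponds to one decision diamond
# a student might draw. The heuristics are simplified on purpose.
def colon_or_dash(introduces_list: bool, formal_register: bool,
                  abrupt_break: bool) -> str:
    if abrupt_break:
        return "dash"          # an interruption or sharp change in thought
    if introduces_list:
        # Both marks can introduce a list; register often decides.
        return "colon" if formal_register else "dash"
    return "neither: the sentence may not need extra punctuation"

print(colon_or_dash(introduces_list=True, formal_register=True,
                    abrupt_break=False))   # -> colon
```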


Examples of Algorithms in Computer Science

These are obviously more elementary examples; algorithms – especially those used in coding – are often far more intricate and complex. To contextualize algorithms in computer science and programming, below are two examples.

Standardized Testing and Algorithms: Coding enables the adaptive technology often leveraged in classrooms today.

For example, the shift to computer-based standardized tests has led to the advent of adaptive assessments that pick questions based on student ability as determined by correct and incorrect answers given.

If students select the correct answer to a question, then the next question is moderately more difficult. But if they answer wrong, then the assessment offers a moderately easier question. This occurs through an iterative algorithm that starts with a pool of questions. After an answer, the pool is adjusted accordingly. This repeats continuously.

The Omnipotent Google and Algorithms: Google’s search results are determined (in part) by the PageRank algorithm, which assigns a webpage’s importance based on the number of sites linking to it.

So, if we google ‘what is an algorithm,’ we can bet that the chosen pages have some of the most links to them for the topic ‘what is an algorithm.’ It’s still more complicated than this, of course; if you are interested, this article goes into the intricacies of the PageRank algorithm.

There are over 1.5 billion websites, with billions more individual pages, but thanks to algorithmic thinking we can type just about anything into Google and expect to be delivered a curated list of resources in under a second. This right here is the power of algorithmic thinking.

“The Google algorithm was a significant development. I've had thank-you emails from people whose lives have been saved by information on a medical website or who have found the love of their life on a dating website.” Tim Berners-Lee

In whatever way it’s approached in the classroom, algorithmic thinking encourages students to communicate clearly and logically. Students learn to persevere throughout its multiple iterations, challenges, and solutions.

To arrive at an algorithm (especially as algorithms advance in complexity), they must apply computational thinking and practice metacognition as they do so. In this process, students become more adept critical thinkers, eloquent communicators, and curious problem solvers who ask bold questions and flourish in ambiguity and uncertainty.

What's Next? Check out our articles on decomposition, pattern recognition, and abstraction.

What is Computational Thinking?


Computational thinking is an interrelated set of skills and practices for solving complex problems, a way to learn topics in many disciplines, and a necessity for fully participating in a computational world.

Many different terms are used when talking about computing, computer science, computational thinking, and programming. Computing encompasses the skills and practices in both computer science and computational thinking. While computer science is an individual academic discipline, computational thinking is a problem-solving approach that integrates across activities, and programming is the practice of developing a set of instructions that a computer can understand and execute, as well as debugging, organizing, and applying that code to appropriate problem-solving contexts. The skills and practices involved in computational thinking are broader, leveraging concepts and skills from computer science and applying them to other contexts, such as core academic disciplines (e.g. arts, English language arts, math, science, social studies) and everyday problem solving. For educators integrating computational thinking into their classrooms, we believe computational thinking is best understood as a series of interrelated skills and competencies.


Figure 1. The relationship between computer science (CS), computational thinking (CT), programming and computing.

In order to integrate computational thinking into K-12 teaching and learning, educators must define what students need to know and be able to do to be successful computational thinkers. Our recommended framework has three concentric circles.

  • Computational thinking skills, in the outermost circle, are the cognitive processes necessary to engage with computational tools to solve problems. These skills are the foundation to engage in any computational problem solving and should be integrated into early learning opportunities in K-3.
  • Computational thinking practices, in the middle circle, combine multiple computational skills to solve an applied problem. Students in the older grades (4-12) may use these practices to develop artifacts such as a computer program, data visualization, or computational model.
  • Inclusive pedagogies, in the innermost circle, are strategies for engaging all learners in computing, connecting applications to students’ interests and experiences, and providing opportunities to acknowledge and combat biases and stereotypes within the computing field.


Figure 2. A framework for computational thinking integration.

What does inclusive computational thinking look like in a classroom? In the image below, we provide examples of inclusive computing pedagogies in the classroom. The pedagogies are divided into three categories to emphasize different pedagogical approaches to inclusivity. Designing Accessible Instruction refers to strategies teachers should use to engage all learners in computing. Connecting to Students’ Interests, Homes, and Communities refers to drawing on the experiences of students to design learning experiences that are connected with their homes, communities, interests and experiences to highlight the relevance of computing in their lives. Acknowledging and Combating Inequity refers to a teacher supporting students to recognize and take a stand against the oppression of marginalized groups in society broadly and specifically in computing. Together these pedagogical approaches promote a more inclusive computational thinking classroom environment, life-relevant learning, and opportunities to critique and counter inequalities. Educators should attend to each of the three approaches as they plan and teach lessons, especially related to computing.


Figure 3. Examples of inclusive pedagogies for teaching computing in the classroom adapted from Israel et al., 2017; Kapor Center, 2021; Madkins et al., 2020; National Center for Women & Information Technology, 2021b; Paris & Alim, 2017; Ryoo, 2019; CSTeachingTips, 2021

Micro-credentials for computational thinking

A micro-credential is a digital certificate that verifies an individual’s competence in a specific skill or set of skills. To earn a micro-credential, teachers submit evidence of student work from classroom activities, as well as documentation of lesson planning and reflection.

Because the integration of computational thinking is new to most teachers, micro-credentials can be a useful tool for professional learning and/or credentialing pathways. Digital Promise has created micro-credentials for Computational Thinking Practices . These micro-credentials are framed around practices because the degree to which students have built foundational skills cannot be assessed until they are manifested through the applied practices.



What is Algorithmic Thinking? A Beginner’s Guide



Algorithmic thinking is a way of approaching problems that involves breaking them down into smaller, more manageable parts. It is a process that involves identifying the steps needed to solve a problem and then implementing those steps in a logical and efficient manner. Algorithmic thinking is a key component of computational thinking, which is the ability to think like a computer and approach problems in a way that is both systematic and creative.

At its core, algorithmic thinking is about problem-solving. It is a way of thinking that involves breaking down complex problems into smaller, more manageable parts and then solving those parts one at a time. This approach can be applied to a wide range of problems, from simple math equations to complex programming challenges. By breaking problems down into smaller parts, algorithmic thinking allows us to approach problems in a way that is both logical and efficient.

Algorithmic thinking is closely related to critical thinking and logic. It involves the ability to analyze problems, identify patterns, and develop solutions that are both effective and efficient. By developing these skills, individuals can become better problem solvers and more effective communicators. Whether you are a student, a professional, or simply someone who wants to improve your problem-solving skills, algorithmic thinking is a valuable tool that can help you achieve your goals.

What is Algorithmic Thinking?

Algorithmic thinking is a problem-solving approach that involves breaking down complex problems into smaller, more manageable steps. It is a process of logically analyzing and organizing procedures to create a set of instructions that can be executed by a computer or human.

Algorithmic thinking involves using logic and critical thinking skills to develop algorithms, which are sets of instructions that can be used to solve problems. These algorithms can be used by computers or humans to efficiently and effectively solve problems. Algorithmic thinking, like computational thinking, involves exploring, decomposing, pattern recognition, and testing to develop efficient solutions to complex problems.

Algorithmic thinking is an essential skill in the fields of computer science, programming, and STEM. It is a valuable tool for analyzing and solving complex problems, making it a key skill for success in today’s technology-driven world. Algorithmic thinking also helps to develop logic and critical thinking skills, which are essential for success in any field.

Applications

Algorithmic thinking has numerous applications in various fields, including education, data analysis, machine learning, robotics, and operating systems. In education, it is used to develop data-driven instruction and instructional planning. In data analysis, it is used to develop algorithms for sorting and analyzing data. In machine learning, it is used to develop algorithms for recognizing patterns and making predictions. In robotics, it is used to develop algorithms for controlling robots. In operating systems, it is used to develop algorithms for managing resources and scheduling tasks.

Algorithmic Thinking Unplugged

Here at Teach Your Kids Code, we have designed a variety of algorithmic thinking activities for kids that don’t need a computer.

Check out a few of our activities here:


Learn Algorithms With A Deck of Cards

‘Program’ your robot to navigate a maze of cards and obstacles. Students will learn the basics of designing an algorithm with this activity.


Learn Algorithms with Snakes and Ladders

This twist on the classic board game Snakes and Ladders will teach your students all about Algorithms.

Algorithmic Thinking Process

Algorithmic thinking is a systematic approach to problem-solving that involves breaking down complex problems into smaller, more manageable parts. The algorithmic thinking process involves several steps that help in solving complex problems.

Exploring the Problem

The first step in the algorithmic thinking process is to explore the problem. This involves understanding the problem, identifying the constraints, and defining the goals. It is important to ask questions and gather information about the problem to gain a deeper understanding of it.

Decomposition

The next step is decomposition, which involves breaking down the problem into smaller, more manageable parts. This involves identifying the sub-problems and organizing them in a logical way. Decomposition helps in simplifying the problem and making it easier to solve.

Pattern Recognition

After decomposition, the next step is pattern recognition. This involves identifying patterns in the data and finding similarities between the sub-problems. Pattern recognition helps in identifying the relationships between the sub-problems and finding common solutions.

Abstraction

The next step is abstraction, which involves identifying the essential elements of the problem and ignoring the non-essential details. Abstraction helps in simplifying the problem and making it easier to understand.

Algorithm Design

The next step is algorithm design, which involves designing a solution to the problem. This involves creating a step-by-step plan for solving the problem. The plan should be clear, concise, and easy to follow.

Testing and Iteration

The final step is testing and iteration. This involves testing the solution and making any necessary changes. Iteration helps in refining the solution and making it more efficient.
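
Tying the steps together, here is a compact, hypothetical walk-through of the process on a toy problem: finding the most common word in a piece of text. The step comments map onto the stages above; the function and sample inputs are invented for illustration.

```python
from collections import Counter
import re

# 1. Explore: the input is free text; the goal is a single word;
#    ties may be broken arbitrarily.
# 2. Decompose: (a) split the text into words, (b) count them,
#    (c) pick the most frequent.
# 3. Pattern recognition: counting occurrences is a standard tallying task.
# 4. Abstraction: punctuation and letter case are irrelevant, so drop them.

def most_common_word(text: str) -> str:
    # 5. Algorithm design: one line per sub-problem, in order.
    words = re.findall(r"[a-z']+", text.lower())   # (a) split, abstracted
    tallies = Counter(words)                        # (b) count
    return tallies.most_common(1)[0][0]             # (c) pick the max

# 6. Testing and iteration: check the solution, refine if a case fails.
assert most_common_word("The cat saw the dog; the dog ran.") == "the"
print(most_common_word("to be or not to be"))       # -> "to"
```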

Free Computational Thinking Worksheets

We’ve designed a set of worksheets to teach algorithmic and computational thinking concepts in the classroom. Worksheets will introduce the basic concepts of computational thinking. Answer guides are included.


Final Thoughts

Algorithmic thinking is a fundamental skill that is becoming increasingly important in today’s digital age. It is the process of breaking down complex problems into smaller, more manageable parts and developing a step-by-step approach to solving them.

Algorithmic thinking is not just limited to computer science and programming but can be applied to a wide range of fields, including mathematics, engineering, and business. By developing this skill, individuals can become more efficient problem-solvers, making them more valuable in the workforce.

In addition, algorithmic thinking can be a valuable tool for students of all ages. By introducing this concept early on in education, students can develop effective habits in processing tasks and problem-solving.

Overall, algorithmic thinking is a valuable skill that can benefit individuals in both their personal and professional lives. By breaking down complex problems into smaller, more manageable parts, individuals can become more efficient problem-solvers and better equipped to tackle the challenges of the digital age.




Computational/Algorithmic Thinking


Introduction

In many countries, the curricular relationship with digital technologies is moving very rapidly (Stephens 2018). These technologies are not only seen as learning and teaching tools for existing disciplines such as mathematics but are also associated with new forms of literacy to be developed for scientific, societal, and economic reasons (Bocconi et al. 2016). Computational thinking, a term coined by Papert (1980) and a key element of the new digital literacy, has been described by Wing (2011) as a fundamental personal ability, like reading, writing, and arithmetic, that enables a person to recognize aspects of computation in various problem situations and to deal appropriately with those aspects by applying tools and techniques from computer science (The Royal Society 2011).

To support an appropriate integration of digital technology in mathematics education, research must focus on the way in which the use of this technology can mediate the learning of mathematics (Drijvers 2018...


Abramovich S (2015) Mathematical problem posing as a link between algorithmic thinking and conceptual knowledge. Teach Math 18(2):45–60. http://elib.mi.sanu.ac.rs/files/journals/tm/35/tmn35p45-60.pdf


Artigue M (2010) The future of teaching and learning mathematics with digital technologies. In: Hoyles C, Lagrange JB (eds) Mathematics education and technology – rethinking the terrain. The 17th ICMI study. Springer, New York, pp 463–476. https://doi.org/10.1007/978-1-4419-0146-0_23


Australian Curriculum, Assessment and Reporting Authority (ACARA) (2016) Digital technologies. Retrieved from http://docs.acara.edu.au/resources/Digital_Technologies_-_Sequence_of_content.pdf

Bocconi S, Chioccariello A, Dettori G, Ferrari A, Engelhardt K (2016) Developing computational thinking in compulsory education. European Union, European Commission, Joint Research Centre, Luxemburg

Brennan K, Resnick M (2012) New frameworks for studying and assessing the development of computational thinking. In: Proceedings of the 2012 annual meeting of the American Educational Research Association, Vancouver. https://web.media.mit.edu/~kbrennan/files/Brennan_Resnick_AERA2012_CT.pdf

Department of Education (UK) (2013) National Curriculum in England: computing programmes of study . https://www.gov.uk/government/publications/national-curriculum-in-england-computing-programmes-of-study/national-curriculum-in-england-computing-programmes-of-study

Drijvers P (2018) Tools and taxonomies: a response to Hoyles. Res Math Edu 20(3):229–235. https://doi.org/10.1080/14794802.2018.1522269


Drijvers P, Kodde-Buitenhuis H, Doorman M (2019) Assessing mathematical thinking as part of curriculum reform in the Netherlands. Educ Stud Math. https://doi.org/10.1007/s10649-019-09905-7

Hickmott D, Prieto-Rodriguez E, Holmes K (2018) A scoping review of studies on computational thinking in K–12 mathematics classrooms. Digit Exp Math Edu 4(1):48–69. https://doi.org/10.1007/s40751-017-0038-8

Hoyles C, Noss R (2015) Revisiting programming to enhance mathematics learning. In: Paper presented at Math + coding symposium. Western University, London

Kadijevich DM (2018) A cycle of computational thinking. In: Trebinjac B, Jovanović S (eds) Proceedings of the 9th international conference on e-learning. Metropolitan University, Belgrade, pp 75–77. https://econference.metropolitan.ac.rs/wp-content/uploads/2019/05/e-learning-2018-final.pdf

Kanemune S, Shirai S, Tani S (2017) Informatics and programming education at primary and secondary schools in Japan. Olympiads Inf 11:143–150. https://ioinformatics.org/journal/v11_2017_143_150.pdf

Kenderov PS (2018) Powering knowledge versus pouring facts. In: Kaiser G, Forgasz H, Graven M, Kuzniak A, Simmt E, Xu B (eds) Invited lectures from the 13th international congress on mathematical education. ICME-13 monographs. Springer, Cham. https://doi.org/10.1007/978-3-319-72170-5_17

Kotsopoulos D, Floyd L, Khan S, Namukasa IK, Somanath S, Weber J, Yiu C (2017) A pedagogical framework for computational thinking. Digit Exp Math Edu 3(2):154–171

Lee I, Martin F, Denner J, Coulter B, Allan W, Erickson J, Malyn-Smith J, Werner L (2011) Computational thinking for youth in practice. ACM Inroads 2(1):33–37. https://users.soe.ucsc.edu/~linda/pubs/ACMInroads.pdf

Lockwood EE, DeJarnette A, Asay A, Thomas M (2016) Algorithmic thinking: an initial characterization of computational thinking in mathematics. In: Wood MB, Turner EE, Civil M, Eli JA (eds) Proceedings of the 38th annual meeting of the north American chapter of the International Group for the Psychology of mathematics education. The University of Arizona, Tucson, pp 1588–1595. https://files.eric.ed.gov/fulltext/ED583797.pdf

Ministere de l’Education Nationale (2016) Algorithmique et programmation. Author: Paris. http://cache.media.eduscol.education.fr/file/Algorithmique_et_programmation/67/9/RA16_C4_MATH_algorithmique_et_programmation_N.D_551679.pdf

Modeste S (2016) Impact of informatics on mathematics and its teaching. In: Gadducci F, Tavosanis M (eds) History and philosophy of computing. HaPoC 2015. IFIP advances in information and communication technology, vol 487. Springer, Cham, pp 243–255

Mouza C, Yang H, Pan Y-C, Ozden SY, Pollock L (2017) Resetting educational technology coursework for pre-service teachers: a computational thinking approach to the development of technological pedagogical content knowledge (TPACK). Australas J Educ Technol 33(3):61–76. https://doi.org/10.14742/ajet.3521

Papert S (1980) Mindstorms: children, computers, and powerful ideas. Basic Books, New York

Prime Minister's Office (2016) Comprehensive schools in the digital age. Author, Helsinki, Finland. https://valtioneuvosto.fi/en/article/-/asset_publisher/10616/selvitys-perusopetuksen-digitalisaatiosta-valmistunut

Scantamburlo T (2013) Philosophical aspects in pattern recognition research. PhD dissertation, Department of Informatics, Ca' Foscari University of Venice, Venice. https://pdfs.semanticscholar.org/c36d/b973c9ed1fd666b3d14cdf464e4a74bdceb7.pdf

Shute VJ, Sun C, Asbell-Clarke J (2017) Demystifying computational thinking. Educ Res Rev 22:142–158. https://doi.org/10.1016/j.edurev.2017.09.003

Stephens M (2018) Embedding algorithmic thinking more clearly in the mathematics curriculum. In: Shimizu Y, Vithal R (eds) Proceedings of ICMI Study 24: school mathematics curriculum reforms: challenges, changes and opportunities. University of Tsukuba, Tsukuba, pp 483–490

The Royal Society (2011) Shut down or restart? The way forward for computing in UK schools. The Author, London. https://royalsociety.org/~/media/education/computing-in-schools/2012-01-12-computing-in-schools.pdf

Victorian Curriculum and Assessment Authority (2017) Victorian certificate of education – algorithmics (a higher education scored subject) – study design (2017–2021). https://www.vcaa.vic.edu.au/Documents/vce/algorithmics/AlgorithmicsSD-2017.pdf

Weintrop D, Beheshti E, Horn M, Orton K, Jona K, Trouille L, Wilensky U (2016) Defining computational thinking for mathematics and science classrooms. J Sci Educ Technol 25(1):127–141. https://doi.org/10.1007/s10956-015-9581-5

Webb M, Davis N, Bell T, Katz YJ, Reynolds N, Chambers DP, Sysło MM (2017) Computer science in K-12 school curricula of the 21st century: why, what and when? Educ Inf Technol 22(2):445–468. https://doi.org/10.1007/s10639-016-9493-x

Wing JM (2011) Research notebook: computational thinking—what and why? Link Newslett 6:1–32. https://www.cs.cmu.edu/~CompThink/resources/TheLinkWing.pdf

Acknowledgments

The authors are grateful to Michèle Artigue for her generous suggestions about the structure of this entry and the content of its sections, as well as to John G Moala for specific comments regarding curricular issues.

Author information

Authors and Affiliations

MGSE, The University of Melbourne, Melbourne, VIC, Australia

Max Stephens

Institute for Educational Research, Belgrade, Serbia

Djordje M. Kadijevich

Corresponding author

Correspondence to Max Stephens .

Editor information

Editors and Affiliations

Department of Education, Centre for Mathematics Education, London South Bank University, London, UK

Stephen Lerman

Section Editor information

Laboratoire de Didactique André Revuz (EA4434), Université Paris-Diderot, Paris, France

Michèle Artigue

Copyright information

© 2020 Springer Nature Switzerland AG

About this entry

Cite this entry

Stephens, M., Kadijevich, D.M. (2020). Computational/Algorithmic Thinking. In: Lerman, S. (eds) Encyclopedia of Mathematics Education. Springer, Cham. https://doi.org/10.1007/978-3-030-15789-0_100044

DOI: https://doi.org/10.1007/978-3-030-15789-0_100044

Published: 23 February 2020

Publisher Name: Springer, Cham

Print ISBN: 978-3-030-15788-3

Online ISBN: 978-3-030-15789-0

DOI: 10.1007/s40692-017-0090-9

Algorithmic thinking, cooperativity, creativity, critical thinking, and problem solving: exploring the relationship between computational thinking skills and academic performance

Tenzin Doleck, Paul Bazelais, +2 authors, Ram B. Basnet. Published in the Journal of Computers in Education, 11 August 2017.

  • Open access
  • Published: 09 September 2024

The transfer effect of computational thinking (CT)-STEM: a systematic literature review and meta-analysis

  • Zuokun Li & Pey Tee Oon (ORCID: orcid.org/0000-0002-1732-7953)

International Journal of STEM Education, volume 11, Article number: 44 (2024)

Background

Integrating computational thinking (CT) into STEM education has recently drawn significant attention, strengthened by the premise that CT and STEM are mutually reinforcing. Previous CT-STEM studies have examined theoretical interpretations, instructional strategies, and assessment targets. However, few have endeavored to delineate the transfer effects of CT-STEM on the development of cognitive and noncognitive benefits. Given this research gap, we conducted a systematic literature review and meta-analysis to provide deeper insights.

Results

We analyzed results from 37 studies involving 7,832 students with 96 effect sizes. Our key findings include: (i) identification of 36 benefits; (ii) a moderate overall transfer effect, with moderate effects also observed for both near and far transfers; (iii) a stronger effect on cognitive benefits compared to noncognitive benefits, regardless of the transfer type; (iv) significant moderation by educational level, sample size, instructional strategies, and intervention duration on overall and near-transfer effects, with only educational level and sample size being significant moderators for far-transfer effects.

Conclusions

This study analyzes the cognitive and noncognitive benefits arising from CT-STEM’s transfer effects, providing new insights to foster more effective STEM classroom teaching.

Introduction

In recent years, computational thinking (CT) has emerged as one of the driving forces behind the resurgence of computer science in school curriculums, spanning from pre-school to higher education (Bers et al., 2014 ; Polat et al., 2021 ; Tikva & Tambouris, 2021a ). CT is complex, with many different definitions (Shute et al., 2017 ). Wing ( 2006 , p. 33) defines CT as a process that involves solving problems, designing systems, and understanding human behavior by drawing on the concepts fundamental to computer science (CS). Contrary to a common perception that CT belongs solely to CS, gradually, it has come to represent a universally applicable attitude and skill set (Tekdal, 2021 ) involving cross-disciplinary literacy (Ye et al., 2022 ), which can be applied to solving a wide range of problems within CS and other disciplines (Lai & Wong, 2022 ). Simply put, CT involves thinking like a computer scientist when solving problems, and it is a universal competence that everyone, not just computer scientists, should acquire (Hsu et al., 2018 ). Developing CT competency not only helps one acquire domain-specific knowledge but enhances one’s general ability to solve problems across various academic fields (Lu et al., 2022 ; Wing, 2008 ; Woo & Falloon, 2022 ; Xu et al., 2022 ), including STEM (science, technology, engineering, and mathematics) (Chen et al., 2023a ; Lee & Malyn-Smith, 2020 ; Wang et al., 2022a ; Waterman et al., 2020 ; Weintrop et al., 2016 ), the social sciences, and liberal arts (Knochel & Patton, 2015 ).

Given the importance of CT competency, integrating it into STEM education (CT-STEM) has emerged as a trend in recent years (Lee et al., 2020 ; Li & Anderson, 2020 ; Merino-Armero et al., 2022 ). CT-STEM represents the integration of CT practices with STEM learning content or context, grounded in the premise that a reciprocal relationship between STEM content learning and CT can enrich student learning (Cheng et al., 2023 ). Existing research supports that CT-STEM enhances student learning in two ways (Li et al., 2020b ). First, CT, viewed as a set of practices for bridging disciplinary teaching, shifts traditional subject forms towards computational-based STEM content learning (Wiebe et al., 2020 ). Engaging students in discipline-specific CT practices like modeling and simulation has been shown to improve their content understanding (Grover & Pea, 2013 ; Hurt et al., 2023 ) and enhance learning (Aksit & Wiebe, 2020 ; Rodríguez-Martínez et al., 2019 ; Yin et al., 2020 ). Another way is to take CT as a transdisciplinary thinking process and practice, providing a structured problem-solving framework that can reduce subject fixation (Ng et al., 2023 ). Aligning with integrated STEM (iSTEM) teaching, this approach equips students with critical skills such as analytical thinking, data manipulation, algorithmic thinking, collaboration, and creative solution development in authentic contexts (Tikva & Tambouris, 2021b ). Such skills are increasingly vital for addressing complex problems in a rapidly evolving digital and artificial intelligence-driven world.

Despite the growing interest in CT-STEM (Li et al., 2020b; Tekdal, 2021), recent reviews indicate a focus on theoretical interpretations (Lee & Malyn-Smith, 2020; Weintrop et al., 2016), instructional strategies (Hutchins et al., 2020a; Ma et al., 2021; Rachmatullah & Wiebe, 2022), and assessment targets (Bortz et al., 2020; Román-González et al., 2017). Although previous meta-analyses have shown CT-STEM's positive impact on students' learning outcomes (Cheng et al., 2023), there is a gap in systematically analyzing its benefits, particularly in differentiating student learning via transfer effects (Popat & Starkey, 2019; Ye et al., 2022). Transfer, a key educational concept categorized as near and far transfer based on the theory of "common elements" (Perkins & Salomon, 1992), is crucial for understanding and evaluating CT-STEM's utility and developing effective pedagogies. Previous studies have concentrated on cognitive learning outcomes (Cheng et al., 2023; Zhang & Wong, 2023) but offer limited insight into CT-STEM's transfer effects on noncognitive outcomes like affective and social skills (Lai et al., 2023; Tang et al., 2020; Zhang et al., 2023). Given that CT-STEM effects extend beyond the cognitive domain (Ezeamuzie & Leung, 2021; Lu et al., 2022), it is equally important to recognize and nurture noncognitive benefits like self-efficacy, cooperativity, and communication in CT-STEM practices (Yun & Cho, 2022).

To better understand and evaluate CT-STEM transfer effects on students’ cognitive and noncognitive benefits acquisition, we systematically review published CT-STEM effects using PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines (Moher et al., 2010 ). We employ meta-analysis to quantify these effects and identify moderating variables. The following research questions guide our study:

RQ1: What cognitive and noncognitive benefits are acquired from CT-STEM’s near and far transfer effects?

RQ2: (a) What are the overall transfer effects of CT-STEM on the cognitive and noncognitive benefits mentioned in RQ1? and (b) What are the moderators of this effect?

RQ3: (a) What are the near and far transfer effects of CT-STEM on the cognitive and noncognitive benefits mentioned in RQ1? and (b) What are the moderators of these effects?

Literature review

Computational thinking (CT)

The concept of procedural thinking was first introduced by Papert ( 1980 ), who connected programming to procedural thinking and laid a foundation for CT (Merino-Armero et al., 2022 ). Although Papert was the first to describe CT, Wing ( 2006 , 2008 , 2011 ) brought considerable attention back to the term, a focus that continues to date (Brennan & Resnick, 2012 ; Chen et al., 2023a ). Various other definitions have emerged in the literature, and there is no consensus definition of CT (Barr & Stephenson, 2011 ; Grover & Pea, 2013 ; Shute et al., 2017 ). The definitions of CT often incorporate programming and computing concepts (e.g., Israel-Fishelson & Hershkovitz, 2022 ) or consider CT to be a set of elements associated with both computing concepts and problem-solving skills (e.g., Kalelioglu et al., 2016 ; Piatti et al., 2022 ). From the former perspective, many researchers defined CT based on programming and computing concepts. For example, Denner et al. ( 2012 ) defined CT as a united competence composed of three key dimensions of CT: programming, documenting and understanding software, and designing for usability. An alternative defining framework (Brennan & Resnick, 2012 ), originating from a programming context (i.e., Scratch), focuses on CT concepts and practices, including computational terms of sequences, loops, conditionals, debugging, and reusing.

Viewed from the latter perspective, CT deviates from the competencies typically associated with simple computing or programming activities. Instead, it is characterized as a set of competencies encompassing domain-specific knowledge/skills in programming and problem-solving skills for non-programming scenarios (Lai & Ellefson, 2023 ; Li et al., 2020a ; Tsai et al., 2021 , 2022 ). Using this broad viewpoint, CT can be defined as a universally applicable skill set involved in problem-solving processes. For instance, ISTE and CSTA (2011) developed an operational definition of CT, which refers to a problem-solving process covering core skills, such as abstraction , problem reformulation , data practices , algorithmic thinking , automation & modeling & simulation, and generalization . Selby and Woollard ( 2013 ) proposed a process-oriented definition of CT based on its five essential practices: abstraction , decomposition , algorithmic thinking , evaluation , and generalization . Shute et al. ( 2017 ) provided a cross-disciplinary definition centered on solving problems effectively and efficiently, categorizing CT into six practices: decomposition , abstraction , algorithm design , debugging , iteration , and generalization . In all these cases, the essence of CT lies in a computer scientist’s approach to problems, which is a skill applicable to everyone’s daily life and across all learning domains.

The above classification of definitions mainly focuses on the cognitive aspect of CT. Other researchers have suggested that CT contains not only a cognitive component (Román-González et al., 2017 ) but also a noncognitive component, highlighting important dispositions and attitudes, including confidence in dealing with complexity, persistence in working with difficult problems, tolerance for ambiguity, the ability to deal with open-ended problems, and the ability to communicate and work with others to achieve a common goal or solution (Barr & Stephenson, 2011 ; CSTA & ISTE, 2011 ).

In short, while computational thinking (CT) is frequently associated with programming, its scope has significantly expanded over the years (Hurt et al., 2023 ; Kafai & Proctor, 2022 ). Building on these prior efforts, we define CT as a problem-solving/thought process that involves selecting and applying the appropriate tools and practices for solving problems effectively and efficiently. As a multifaceted set of skills and attitudes, CT includes both cognitive aspects, highlighting students’ interdisciplinary practices/skills, and noncognitive aspects like communication and collaboration.

Integrating CT in STEM education (CT-STEM)

There is an urgent need to bring CT into disciplinary classrooms to prepare students for new integrated fields (e.g., computational biology, computational physics, etc.) as practiced in the realistic professional world. To address this, a growing body of research and practice has focused on integrating CT into specific and iSTEM lessons (Jocius et al., 2021 ). This integration, i.e., CT-STEM, refers to the infusion of CT practices with STEM content/context, with the aim of enhancing students’ CT skills and STEM knowledge (Cheng et al., 2023 ). Accordingly, CT-STEM serves a dual purpose: one, it has the potential to foster the development of student CT practices and skills; and another, it simultaneously deepens students’ disciplinary understanding and improves learning performance within and across disciplines (Waterman et al., 2020 ). Current research reveals two potential ways this integration facilitates students’ STEM learning. First, integrating CT into STEM provides students with an essential, structured framework by characterizing CT as a thought process and general competency, with disciplinary classrooms offering “a meaningful context (and set of problems) within which CT can be applied” (Weintrop et al., 2016 , p. 128). Key processes of this problem-solving approach include: formulating problems computationally, data processing for solving problems, automating/simulating/modeling solutions, evaluating solutions, and generalizing solutions (Lyon & Magana, 2021 ; Wang et al., 2022a ). Engaging in these practices aids students in applying STEM content to complex problem-solving and develops their potential as future scientists and innovators, aligning with iSTEM teaching.

In addition, introducing CT within disciplinary classroom instruction transforms traditional STEM subject formats into an integrated computational-based approach. This way takes a specific set of CT practices naturally integrated into different STEM disciplines to facilitate students’ content learning (Li et al., 2020b ; Weller et al., 2022 ). Weintrop et al. ( 2016 ) identified four categories of CT practices in math and science education: data practices , modeling and simulation practices , computational problem-solving practices , and systems thinking practices . Engaging students in systems thinking practices can simplify the understanding of systems and phenomena within the STEM disciplines (Grover & Pea, 2013 ). Integrating CT involves students in data practices , modeling , simulation and/or using computational tools such as programming to generate representations, rules, and reasoning structures (Phillips et al., 2023 ). This aids in formulating predictions and explanations, visualizing systems, testing hypotheses, and enhancing students’ understanding of scientific phenomena and mechanisms (Eidin et al., 2024 ). When comparing the previously mentioned two integrated ways, the first places specific attention on developing discipline-general CT, while the second emphasizes improving students’ learning of disciplinary content and developing discipline-specific CT (Li et al., 2020b ).

Practical aspects of CT-STEM have also been explored in the literature, including instructional strategies and assessment targets. Scholars have attempted different instructional strategies for CT-STEM implementation to achieve the designated educational purpose. These strategies can be categorized as instructional models (e.g., problem-driven strategies and project-based strategies), topic contexts (e.g., game-based strategies, and modeling- and simulation-based strategies), scaffolding strategies, and collaborative strategies (Wang et al., 2022a; see Table 1). Typically, in instructional models, CT is viewed as an essential competency, guiding students to create interdisciplinary artifacts and solve specific real-world problems. Li et al. (2023) integrated CT as a core thought model into a project-based learning process, focusing on student-designed products for practical problems. Compatible with instructional models, a variety of instructional strategies based on topic contexts have been used, such as game design, computational modeling and simulation, and robotics. These, also called plugged-in activities, typically involve computer programming for performing STEM tasks (Adanır et al., 2024). In contrast, unplugged activities operate independently of computers, involving physical movements or using certain objects to illustrate abstract STEM concepts or principles (Barth-Cohen et al., 2019; Chen et al., 2023b). In combination with the above strategies, scaffolding strategies have been designed and utilized in CT-STEM to reduce students' cognitive load and support their self-regulated learning, such as guidance and adaptive, peer-, and resource-scaffolding. In addition, educators have employed various collaborative strategies (e.g., Think-Pair-Share practice) to enhance students' cooperative and communicative skills in CT-STEM learning (Tikva & Tambouris, 2021a). In short, the type of instructional strategy used is a significant factor influencing the effectiveness of CT-STEM.

Prior research has focused on assessment targets within the cognitive and noncognitive domains (Tang et al., 2020 ; Wang et al., 2022a ). The former includes direct cognitive manifestations such as knowledge and skills related to CT constructs and STEM constructs, as well as domain-general mental abilities such as creativity and critical thinking (Tang et al., 2020 ). Wang et al. ( 2022a ) reported CT-STEM studies targeted cognitive domain assessments, which included assessments of students’ CT concepts and skills, programming knowledge and skills, and STEM achievements. These constructs were mainly measured through tests, including validated and self-developed tests. Other researchers characterize CT as a general thinking skill and employ performance scales for measurement (e.g., Korkmaz et al., 2017 ; Tsai et al., 2019 , 2021 ). The assessment of the noncognitive domain focused on students’ dispositions and attitudes towards CT-STEM (Lai & Wong, 2022 ), including self-efficacy, interest, and cooperativity, mainly measured by surveys/scales.

In summary, CT-STEM has garnered considerable attention from researchers, primarily exploring theoretical interpretations of how a reciprocal relationship between STEM and CT can enrich student learning. CT-STEM is implemented through the development and application of varied instructional strategies, with assessments aimed at understanding its effects on students’ cognitive and noncognitive domains. While these are important contributions, there is a notable lack of systematic and empirical evidence concerning the differentiated benefits of CT-STEM integration. We aim to address this deficit by differentiating benefits via transfer effects and systematically synthesizing pertinent research in this field.

Transfer effect of learning

Transference or transfer effect refers to the ability to apply what one has known or learned in one situation to another (Singley & Anderson, 1989 ), standing at the heart of education as it highlights the flexible application of acquired knowledge (OECD, 2018 ). Perkins and Salomon ( 1992 ) defined transfer as the process of transferring learning and performance from one context to another, possibly even in a dissimilar context. From a cognitivist perspective, knowledge, seen as a stable mental entity, can traditionally be summoned and adapted to new situations under the right circumstances (Day & Goldstone, 2012 ). Nevertheless, this traditional approach has been subject to extensive criticism, particularly from those who hold a constructivist perspective. From their view, the transfer of learning is not a static application of knowledge to a new context but rather the “byproduct of participation in particular situations” (Day & Goldstone, 2012 )—a standpoint widely acknowledged and endorsed by most researchers. Despite the broad consensus on this view (Scherer et al., 2019 ), some questions remain: How can a successful transfer occur? What factors define “other” or “new” contexts?

One prominent explanation for the successful transfer of knowledge is the theory of “common elements” (Singley & Anderson, 1989 ), which hypothesizes that successful transfer depends upon the elements that two different contexts or problem situations share (Scherer et al., 2019 ). Thus, based on this theory, the transfer effect can be divided into near transfer and far transfer (Perkins & Salomon, 1992 ). Near transfer occurs when successful skills and strategies are transferred between contexts that are similar, i.e., contexts that are closely related and require similar skills and strategies to be performed; conversely, far transfer occurs when successful skills or strategies are transferred between contexts that are inherently different (Perkins & Salomon, 1992 ). Essentially, the transfer effect is determined by the similarity or overlap between the contexts and problems in which the skills were acquired and new different problems that are encountered in the future (Baldwin & Ford, 1988 ). Simply put, there is a greater chance of transference between related contexts or problem situations (near-transfer) than between divergent situations (far-transfer). Since transfer effects are inherently situation-specific, they depend highly on the circumstances under which the skills/knowledge were acquired and the overlap with the new situation (Lobato, 2006 ).

While far-transfer effects are less likely to occur, numerous studies have reported far-transfer effects, albeit to varying extents (Bransford & Schwartz, 1999 ). Scherer et al. ( 2019 ) reported a moderate effect ( g  = 0.47) indicative of far transfer effects in learning computer programming, while Sala and Gobet ( 2016 ) found relatively limited evidence of far transfer effects within the domains of chess instruction and music education: successful transfer was only observed in situations that required skills similar to those acquired in the interventions. The extent of far-transfer can fluctuate across different contexts, indicating a need for further exploration within different disciplines and learning contexts.

The transfer effects of CT-STEM

The transfer effects of learning computer programming have been explored (Bernardo & Morris, 1994 ; Pirolli & Recker, 1994 ; Scherer et al., 2019 , 2020 ). For instance, students learning BASIC programming demonstrated that acquiring programming knowledge significantly enhanced the students’ abilities to solve verbal and mathematical problems; however, no significant differences were found in mathematical modeling and procedural comprehension (Bernardo & Morris, 1994 ). Scherer et al. ( 2019 ) conducted a meta-analysis exploring the effects of transferring computer programming knowledge on students’ cognitive benefits. They identified positive skill transfers from learning programming to areas such as creative thinking, mathematical abilities, and spatial skills. Beyond cognitive benefits, Popat and Starkey ( 2019 ) and Melro et al. ( 2023 ) indicate that learning programming also contributes to noncognitive benefits like collaboration and communication.

Programming can be a conduit for teaching, learning, and assessing CT and a mechanism to expose students to CT by creating computational artifacts. Although programming skills and CT share a close relationship and overlap in several aspects (e.g., application of algorithms, abstraction, and automation), they are not identical (Ezeamuzie & Leung, 2022)—the latter (i.e., CT) also involves incorporating computational perspectives and computational participation (i.e., students' understanding of themselves and their interactions with others and with technology; Shute et al., 2017). CT can also be taught without programming through so-called unplugged activities. Hence, research on the transfer of programming addresses only a limited aspect of CT transference.

Research on CT transfer effects has recently surged (Liu & Jeong, 2022; Ye et al., 2022). In a meta-analysis, Ye et al. (2022) reported a positive transfer effect beyond computer programming in understanding science, engineering, mathematics, and the humanities. Using in-game CT supports, Liu and Jeong (2022) reported a significant improvement in student CT skills at the near transfer level but not at the far transfer level. Correlation analyses by Román-González et al. (2017) demonstrated a significant relationship between CT and other cognitive abilities, which is corroborated by Xu et al.'s (2022) study showing that CT relates to numerous cognitive and learning abilities in other domains, such as reasoning, creative thinking, and arithmetic fluency. Other studies attribute cognitive benefits to CT, such as executive functions (Arfé et al., 2019). Although the results from correlation analyses cannot provide definitive causal evidence, they offer valuable insights and directions for future investigations, including potential meta-analysis studies.

While several systematic reviews and meta-analyses have been conducted on programming and CT transfer effects, there is a scarcity of meta-analyses that investigate the transfer effects of CT-STEM and the variables that moderate these effects. Cheng et al. (2023) explored the overall effect of CT-STEM on students' STEM learning performance within a K-12 education context and reported a large effect size (g = 0.85) between pretest and posttest scores on STEM learning outcomes. They investigated moderating variables in the models, including student grade levels, STEM disciplines, intervention durations, and types of interventions. Of these, only the intervention durations had a significant moderating effect. While their work offers evidence supporting the effectiveness of CT-STEM on students' learning outcomes, evidenced by a large effect size, we identified three notable shortcomings. First, their meta-analysis lacked a focus on potential benefits that can be derived from CT-STEM integration, particularly in terms of differentiating learning outcomes from the perspective of transfer effects. Existing meta-analyses have found that effect sizes vary considerably across various types of learning outcomes (Sala & Gobet, 2017; Scherer et al., 2019). This variation indicates that CT-STEM may not benefit different categories of learning outcomes equally. Second, the study focused only on cognitive learning outcomes, omitting noncognitive effects that may be fostered by CT-STEM. As noted earlier, although CT is primarily a cognitive psychological construct associated with cognitive benefits, it also has a complementary noncognitive aspect (Román-González et al., 2018). The synergy between CT and STEM holds promise for delivering cognitive and noncognitive benefits to students. Third, their inclusion of only studies that employed one-group pretest–posttest designs may contribute to biased outcomes, limiting the potential representativeness and robustness of the research findings (Cuijpers et al., 2017). Morris and DeShon (2002) posited that combining effect sizes from different study designs, both rationally and empirically, would lead to more reliable and comprehensive conclusions.

While various studies have validated the transfer effect of programming and CT, a systematic examination of CT-STEM’s transfer effects remains an area for further exploration. Our review identified key gaps, including a lack of differentiation in learning outcomes, insufficient focus on noncognitive benefits, and limitations in research robustness. Additionally, practical challenges, such as identifying effective activities and methods for CT integration into STEM, as well as determining optimal intervention durations, need to be addressed. We address these issues by investigating the transfer effects of CT-STEM, combining effect sizes from diverse studies, and considering both cognitive and noncognitive domains. We also identify practical factors that could influence these effects through moderator analysis. Our goal is to enhance instructional design in CT-STEM and provide new insights and guidance for both practitioners and researchers in the field.

Conceptual framework for the present study

Drawing from Mayer's (2011, 2015) framework, we synthesized evidence on the CT-STEM transfer effects and the contextual conditions that enhance instructional effectiveness. This framework, widely used to evaluate technology-based interventions like computer programming and educational robotics (Chen et al., 2018; Sun & Zhou, 2022; Tsai & Tsai, 2018), offers a multifaceted perspective on instructional methods. It allows for the exploration of three types of research questions: (a) learning consequences, by examining the benefits of specific instructional methods; (b) media comparison, by assessing the effectiveness of instructional methods; and (c) value-added teaching, by investigating how changes in teaching conditions affect student performance. Chen et al. (2018) highlight this framework's aptitude for systematically organizing and mapping domains and study contexts, accommodating diverse research foci.

Transferring this framework to the context of CT-STEM instruction (see Fig.  1 ), we systematically summarize the learning sequences through CT-STEM’s transfer effect. Based on our literature review section, we have categorized these sequences into four types: (a) Cognitive benefits through near transfer effect (CNT); (b) Noncognitive benefits through near transfer effect (NCNT); (c) Cognitive benefits through far transfer effect (CFT); and (d) Noncognitive benefits through far transfer effect (NCFT). This study synthesizes evidence on CT-STEM’s effectiveness per transfer type and examines various moderators affecting these effects. We considered sample features (e.g., educational level and sample size) and study features (e.g., study design, subject, instructional strategy, and intervention duration) as potential moderators affecting the transferability of CT-STEM. Previous CT-related studies indicated that these moderators contribute to variance in the effect sizes (Lai & Wong, 2022 ; Scherer et al., 2020 ; Sun & Zhou, 2022 ; Ye et al., 2022 ).

Figure 1. Conceptual framework of the present meta-analysis

Methodology

We collected and analyzed literature on the transfer effects of CT-STEM using a rigorous systematic review process (Jesson et al., 2011 ), adhering to the PRISMA guidelines (Moher et al., 2010 ).

Database and keywords

We initially searched for key works on CT and STEM in seven databases: Web of Science, Science Direct, Springer, Wiley, IEEE Xplore Digital Library, Sage, and Taylor & Francis. In the search, CT was explicitly confined to "computational thinking." The major intervention approaches were included, such as programming, plugged activities, and unplugged activities. For STEM, we used the following terms: STEM, science, technology, engineering, and mathematics, and further supplemented "science" with discipline-specific terms like "physics," "chemistry," and "biology." Additionally, we added "game design" and "robotics" to complement "technology," as these are significant technical contexts for CT. As a final step, we searched for full peer-reviewed articles in the databases using keyword groupings, focusing exclusively on educational and educational research fields: ("Computational thinking" OR "programming" OR "plugged activity" OR "unplugged activity") AND ("STEM" OR "technology" OR "engineering" OR "mathematics" OR "physics" OR "chemistry" OR "biology" OR "game design" OR "robotics"). The initial search included articles published between January 1, 2011, and March 1, 2023, as professional CT-STEM fields formed and gained popularity after 2011 (Lee & Malyn-Smith, 2020; Malyn-Smith & Ippolito, 2011). This initial search yielded 12,358 publications, which were then subjected to further screening.
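
For concreteness, the keyword groupings above can be expressed as a single Boolean query string. The short sketch below reconstructs the quoted grouping programmatically; database search syntax varies, so treat it as illustrative rather than a ready-to-paste query.

```python
# Illustrative reconstruction of the Boolean keyword groupings quoted above.
ct_terms = ["Computational thinking", "programming",
            "plugged activity", "unplugged activity"]
stem_terms = ["STEM", "technology", "engineering", "mathematics",
              "physics", "chemistry", "biology", "game design", "robotics"]

def or_group(terms):
    """Join quoted terms into a parenthesized OR group."""
    return "(" + " OR ".join(f'"{t}"' for t in terms) + ")"

query = f"{or_group(ct_terms)} AND {or_group(stem_terms)}"
print(query)
```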

Inclusion and exclusion criteria

The inclusion and exclusion criteria for articles are detailed in Table 2. This study examined the transfer effects of CT-STEM, exploring both near and far transfer effects on cognitive and noncognitive benefits acquisition. Eligible studies included those with experimental or quasi-experimental designs, such as independent-groups pretest–posttest (IGPP), independent-groups posttest (IGP), and single-group pretest–posttest (SGPP), reporting pretest and posttest or solely posttest performance. Articles where CT was not integrated with STEM content or context, or where the authors did not conceptualize or assert their studies as integrating CT with STEM learning, were excluded. Studies focusing on programming tools like Scratch or robotics without involving other STEM content or contexts were also excluded. Since STEM education often emphasizes situated learning, with contexts from social studies, culture, language, and arts (Kelley & Knowles, 2016), articles in other disciplines (e.g., social sciences, literacy, and culture) that involve CT activities, such as designing digital stories and games (Zha et al., 2021), were included. We did not limit the educational context (e.g., K-12 or higher education) since the effects of CT-STEM differ at various educational levels, and including both enables a more comprehensive understanding. The methods of assessment after the CT-STEM interventions were unrestricted. Inclusion required reporting at least one cognitive (e.g., critical thinking or school achievement) or noncognitive (e.g., communication or collaboration) benefit using performance-based outcome measures. Studies reporting only behavioral measures (e.g., response times and the number and sequence of actions) were excluded. Eligibility also depended on providing adequate statistical data for effect size calculation, requiring details like sample sizes, standard deviations, means, t-values, F-values, or z-scores.

Study selection

Figure 2 shows the three selection stages: identification, screening, and eligibility evaluation. After the initial search, automatic and manual searches were used to eliminate duplicates. Two independent researchers used the inclusion and exclusion criteria to screen the article titles and abstracts, eliminating those that did not fit the criteria. Following this, the texts of the remaining articles were scrutinized and assessed against the criteria for inclusion in the final sample. The interrater agreement was high (Cohen's Kappa coefficient = 0.92). All disagreements were resolved through discussion and review. This selection process yielded 32 studies that met the eligibility criteria. Lastly, a "snowball" search method (Petersen & Valdez, 2005) was used to find additional articles that met the criteria. Both backward and forward snowballing using the identified papers resulted in an additional five papers. Overall, the search and evaluation process yielded 37 articles for analysis (a complete list of references for these included studies can be found in Supplementary Material A1).
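
The agreement statistic reported here is Cohen's kappa. As a minimal illustration of how such a value is computed, the sketch below applies the standard kappa formula to hypothetical include/exclude decisions from two screeners; this is our illustration, not the authors' code.

```python
# Cohen's kappa for two raters' screening decisions (hypothetical data).
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two equal-length label sequences."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement under independence, from each rater's marginals.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    labels = set(rater_a) | set(rater_b)
    expected = sum(freq_a[l] * freq_b[l] for l in labels) / n**2
    return (observed - expected) / (1 - expected)

# Hypothetical screening decisions for six abstracts:
a = ["include", "exclude", "include", "include", "exclude", "exclude"]
b = ["include", "exclude", "include", "exclude", "exclude", "exclude"]
print(round(cohens_kappa(a, b), 2))  # 0.67 for this toy data
```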

Figure 2. The selection flowchart, based on the PRISMA approach

Data extraction and analysis

Coding of studies

We modified the systematic review coding scheme spreadsheet (Scherer et al., 2019 ; Ye et al., 2022 ), which was used to document and extract information. It includes basic study details (reference, publication year, and journal), four types of outcome variables, sample features (educational level and sample size), study characteristics (study design, subject, instructional strategy, and intervention duration), and statistical data for effect size calculation. To ensure the reliability of the coding, each study was coded by two researchers using the coding scheme. The interrater reliability was 0.93 using the Kappa coefficient, and discrepancies were settled in discussion sessions until mutual agreement was reached.

Outcome variables

To ascertain which cognitive and noncognitive benefits can be derived through CT-STEM transference, we constructed a hierarchical structure and classified these benefits into four categories: CNT, NCNT, CFT, and NCFT (see Table 3). CNT (i.e., domain-specific cognitive skills/knowledge) occurs when skills or knowledge acquired in CT-STEM are applied to a domain that is closely related, such as CT knowledge/concepts and CT practices/skills (Scherer et al., 2019; Sun & Zhou, 2022). In the included studies, CNT was measured using (a) validated tests, such as the Computational Thinking test (CTt), and (b) self-developed tests/tasks for evaluating students' comprehension of subject-specific concepts and knowledge. NCNT pertains to shifts in students' attitudes, motivations, self-efficacy, or perceptions concerning the related domain (e.g., CT-STEM, iSTEM, STEM, or programming) following their engagement with CT-STEM (Bloom & Krathwohl, 1956). Measures for NCNT in the selected studies primarily utilized standardized scales, with some employing self-developed scales.

CFT (i.e., domain-general cognitive skills) manifests when the skills attained from the CT-STEM are applied to different domains (Doleck et al., 2017 ; Xu et al., 2022 ). These skills, such as reasoning skills, creativity, and critical thinking, were mostly assessed by standardized scales and various tests like the Bebras test, TOPS test, Computational Thinking Scale (CTS) (e.g., Korkmaz et al., 2017 ; Tsai et al., 2019 , 2021 ), and Cornell Critical Thinking test (CCTT). NCFT involves the transfer of skills from CT-STEM to higher-order noncognitive learning outcomes such as cooperativity and communication (OECD, 2018 ). Measurement techniques for this category included validated scales along with specific self-developed tasks. Then, we calculated the measured frequency of each benefit in the selected papers (N = 37) and used bar charts for visualization to answer RQ1.

Moderator variables

Based on the framework presented in Fig. 1 and previous meta-analyses in CT-STEM and related fields (e.g., educational robotics, programming, and CT), we examined two types of moderators for their potential role in enhancing transferability within CT-STEM (see Table 4). The variables included: (1) Sample features. Sample features comprised the educational level targeted by the intervention—kindergarten, primary school, secondary school, and university/college—and the sample size, with the latter equating to class size in educational contexts and exhibiting variability across studies. (2) Study features. The design of the primary studies was coded as either an IGPP, an IGP, or an SGPP. Considering the possibility of multiple designs occurring within one study, we elected to code them independently (Scherer et al., 2020). For the subject, the coding of categories was primarily predicated on the intervention transfer area (Ye et al., 2022); when CT was integrated into several subjects, we coded such studies as "Multiple STEM subjects." Based on Wang et al.'s (2022a) review, we assigned instructional strategy as an additional possible moderating variable, coded as "instructional models," "topic contexts," "scaffolding strategies," and "collaborative strategies." Table 1 provides an account of these instructional strategies and contains sample references; Supplementary Material A2 contains more detailed descriptions of these strategies for each included study. Finally, the length of the intervention was extracted and coded as < 1 week, 1 week–1 month, 1 month–1 semester, > 1 semester, and not mentioned.
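
To make the coding scheme concrete, here is one possible representation of a coded record; the field names mirror the variables described in this section, and the values are invented for illustration, not drawn from an actual included study.

```python
# A hypothetical row of the coding-scheme spreadsheet described above.
from dataclasses import dataclass

@dataclass
class CodedStudy:
    reference: str     # citation of the primary study
    year: int
    outcome_type: str  # "CNT", "NCNT", "CFT", or "NCFT"
    benefit: str       # e.g., "CT skills", "self-efficacy"
    edu_level: str     # kindergarten / primary / secondary / university
    sample_size: int
    design: str        # "IGPP", "IGP", or "SGPP"
    subject: str       # e.g., "mathematics", "Multiple STEM subjects"
    strategy: str      # instructional models / topic contexts / ...
    duration: str      # "<1 week", "1 week-1 month", ...

row = CodedStudy("Author (2020)", 2020, "CNT", "CT skills",
                 "primary school", 48, "IGPP", "mathematics",
                 "topic contexts", "1 week-1 month")
print(row)
```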

Calculating effect sizes

We computed effect sizes using the Comprehensive Meta-Analysis (CMA) Software 3.0 (Borenstein et al., 2013 ). To increase the number of articles in our meta-analysis, we included three types of study designs (Morris & DeShon, 2002 ). Despite potential time bias and selection bias, our study used the same metric (i.e., raw-score metric) for calculating effect sizes. This metric is insensitive to variations in ρ and is recommended when homogeneity of ρ cannot be assumed or tested empirically (Morris & DeShon, 2002 ). These calculations were based on the means and standard deviations of the student learning outcome data. If these values were not reported in the studies, we used other statistics to calculate the standardized mean difference, such as t -values, z -scores, F -values, Cohen’s d , SE , and Confidence intervals (95% CI) (Borenstein et al., 2009 ). All reported p -values are two-tailed unless otherwise reported.
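
As a worked illustration of a raw-score-metric standardized mean difference, the sketch below computes Hedges' g for an independent-groups design from group means, standard deviations, and sample sizes. It is a minimal reimplementation of the standard formula under our own assumptions, not CMA's internals, and the input numbers are hypothetical; single-group pretest–posttest designs require a different variant.

```python
# Hedges' g for an independent-groups comparison (hypothetical inputs).
import math

def hedges_g(m_treat, sd_treat, n_treat, m_ctrl, sd_ctrl, n_ctrl):
    # Pooled standard deviation across the two groups.
    sp = math.sqrt(((n_treat - 1) * sd_treat**2 + (n_ctrl - 1) * sd_ctrl**2)
                   / (n_treat + n_ctrl - 2))
    d = (m_treat - m_ctrl) / sp                  # Cohen's d
    j = 1 - 3 / (4 * (n_treat + n_ctrl) - 9)     # small-sample correction
    return j * d

# Invented group statistics: treatment mean 78 (SD 10, n 30)
# versus control mean 72 (SD 11, n 30).
print(round(hedges_g(78.0, 10.0, 30, 72.0, 11.0, 30), 3))
```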

We calculated the effect sizes using the metric of Hedges' g, which allows the integration of results from varied research designs with minimal bias and provides a global measure of CT-STEM effectiveness (Sun et al., 2021). Hedges' g was interpreted following Hedges and Olkin (2014), in which 0.20–0.49 indicates a low effect, 0.50–0.79 a medium effect, and 0.80 and above a high effect. CMA 3.0 empirically supports the amalgamation of multiple study designs in a single analysis (Borenstein et al., 2013). Leveraging this feature, we used experimental design as a moderator to mitigate potential bias (Morris & DeShon, 2002). The statistically nonsignificant p-value of the Q test (p = 0.343) failed to reject the null hypothesis of no difference between mean effect sizes calculated from alternate designs. Therefore, effect sizes from different designs can be meaningfully combined (Delen & Sen, 2023; Morris & DeShon, 2002). Due to substantial variations in outcome measures and environments across studies, we employed the random-effects model to address RQ2 (a) and RQ3 (a) by calculating overall and subgroup effect sizes (Borenstein et al., 2021; Xu et al., 2019).

Non-independence

We calculated one effect size per study to ensure the independence of the effect sizes; however, if a study reported multiple benefits that did not overlap, the effect size for each benefit was included in the analysis. Additionally, when a study reported effect sizes for separate groups of students (e.g., students in grades 1, 2, and 3) where the participants did not overlap, the effect sizes for each group were considered independent samples (Lipsey & Wilson, 2001 ). When a study reported multiple assessments (e.g., midterm and final exams) in one subject area, we selected the most comprehensive assessment (Bai et al., 2020 ).

Analyses of heterogeneity

Heterogeneity was assessed using the I² statistic (i.e., the degree of inconsistency in the studies' results), which was calculated to show the ratio of between-study variance to the total variation across effect sizes, revealing how much of the variation in effect sizes stems from differences among studies (Shamseer et al., 2015). We then conducted a moderator analysis to pinpoint potential sources of variance in transfer effect sizes, examining the overall, near, and far transfer effects, to address RQ2 (b) and RQ3 (b).
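
For readers who want to see the mechanics, the sketch below computes Q, I², the DerSimonian–Laird between-study variance τ², and a random-effects pooled estimate from a hypothetical set of effect sizes; it is an illustrative reimplementation of the standard formulas, not the CMA software used in this study.

```python
# DerSimonian-Laird random-effects pooling with Q and I² (hypothetical data).
import math

g  = [0.35, 0.80, 0.55, 1.10, 0.20]   # per-study Hedges' g (invented)
se = [0.15, 0.20, 0.10, 0.25, 0.18]   # per-study standard errors (invented)

w = [1 / s**2 for s in se]                          # fixed-effect weights
g_fixed = sum(wi * gi for wi, gi in zip(w, g)) / sum(w)
q = sum(wi * (gi - g_fixed)**2 for wi, gi in zip(w, g))
df = len(g) - 1
i2 = max(0.0, (q - df) / q) * 100                   # I² in percent
c = sum(w) - sum(wi**2 for wi in w) / sum(w)
tau2 = max(0.0, (q - df) / c)                       # between-study variance
w_re = [1 / (s**2 + tau2) for s in se]              # random-effects weights
g_re = sum(wi * gi for wi, gi in zip(w_re, g)) / sum(w_re)
se_re = math.sqrt(1 / sum(w_re))
print(f"Q={q:.2f}, I²={i2:.1f}%, pooled g={g_re:.3f} ± {1.96*se_re:.3f}")
```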

Publication bias

We conducted three additional analyses to determine if publication bias affected the review results: a funnel plot, Egger's test, and the classic fail-safe N. The funnel plot is a graphical tool that compares effect sizes to standard errors to check if publication bias distorted treatment effects (Egger et al., 1997). We used the Egger test to examine symmetry and quantify the amount of bias captured by the funnel plot (Bai et al., 2020; Borenstein, 2005). The classic fail-safe N was calculated to address the issue of publication bias affecting the effect size. Specifically, when the meta-analysis results are significant, it is essential to calculate the number of lost and unpublished studies that would have to be included to make the compound effect insignificant (Rosenthal, 1979). According to Rosenberg (2005), the fail-safe N (X) should reach 5k + 10 to ensure that X is large relative to k (the number of independent effect sizes). The greater the fail-safe N value, the smaller the publication bias.
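
The sketch below illustrates two of these checks, Egger's regression intercept and Rosenthal's classic fail-safe N, on hypothetical inputs; it is a simplified reimplementation for illustration, not the CMA routines used here.

```python
# Two publication-bias checks on hypothetical effect sizes.
import numpy as np

g  = np.array([0.35, 0.80, 0.55, 1.10, 0.20])   # invented Hedges' g values
se = np.array([0.15, 0.20, 0.10, 0.25, 0.18])   # invented standard errors

# Egger's test: regress the standardized effect on precision; an intercept
# far from zero suggests funnel-plot asymmetry.
precision = 1 / se
snd = g / se
slope, intercept = np.polyfit(precision, snd, 1)
print(f"Egger intercept: {intercept:.2f}")

# Classic fail-safe N: number of mean-zero studies needed to push the
# combined one-tailed p above .05 (z_alpha = 1.645).
z = g / se
k = len(z)
fail_safe_n = (z.sum() ** 2) / 1.645**2 - k
print(f"Fail-safe N: {fail_safe_n:.0f}")
```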

Cognitive and noncognitive benefits through CT-STEM’s transfer effect (RQ1)

Our investigation of CT-STEM transference revealed 36 benefits, detailed in Fig.  3 . This includes benefits from both near and far transfer: seventeen cognitive and eight noncognitive benefits were attributed to near transfer (CNT and NCNT, respectively), while nine cognitive and two noncognitive benefits resulted from far transfer (CFT and NCFT, respectively).

Figure 3. The measured frequency of documented benefits in selected publications

The top five benefits most frequently documented in empirical CT-STEM research were mathematics achievement (f = 9), CT knowledge/concepts (f = 7), CT (f = 5), physics achievement (f = 5), and self-efficacy (f = 5). The notable medium frequency of certain NCNT benefits, such as self-efficacy and motivation, highlights a dual focus in research: enhancing both cognitive skills and noncognitive gains in students involved in CT-STEM. There has been greater integration of CT into mathematics and science; however, other disciplines (e.g., biology, chemistry, social science, and culture) have received less attention. The limited observation of NCFT (only two identified) underscores the potential for broader research exploration.
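
A minimal plotting sketch of the kind of bar chart referenced above, restricted to the five most frequently documented benefits listed in this paragraph (the full figure covers all 36 benefits):

```python
# Bar chart of the top-five benefit frequencies reported in the text.
import matplotlib.pyplot as plt

benefits = ["Mathematics achievement", "CT knowledge/concepts",
            "CT", "Physics achievement", "Self-efficacy"]
freq = [9, 7, 5, 5, 5]

fig, ax = plt.subplots(figsize=(7, 3))
ax.barh(benefits[::-1], freq[::-1])  # largest frequency at the top
ax.set_xlabel("Measured frequency (f) in the 37 included studies")
ax.set_title("Top five benefits documented in CT-STEM research")
fig.tight_layout()
plt.show()
```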

CT-STEM’s overall transfer effects and moderator analysis (RQ2)

Overall transfer effects of CT-STEM (RQ2a)

In total, 37 primary studies involving 7,832 students were included in the sample, yielding 96 effect sizes. Among these studies, 62% (23 studies) utilized an IGPP design, 35% (13 studies) adopted an SGPP design, and 3% (1 study) employed an IGP design. In this meta-analysis, we first analyzed the 37 empirical studies using a random-effects model. Our findings show a significant overall effect size favoring the transfer effect of CT-STEM on both cognitive and noncognitive benefits for students (g = 0.601, 95% CI [0.510, 0.691], Z = 12.976, p < 0.001) (see Fig. 4). The heterogeneity test showed a significant Q value (Q = 853.052, I² = 88.864%, p < 0.001), suggesting substantial heterogeneity in the study effect sizes. Thus, a moderator analysis of different contextual variables was required in subsequent analyses.

Figure 4. Forest plot of effect sizes (Hedges' g) in the random-effects model

To assess potential publication bias in our meta-analysis, we generated a funnel plot and performed the classic fail-safe N and Egger tests. As depicted in Fig. 5, the studies were distributed largely evenly on both sides of the funnel plot and located in the middle to upper effective areas (Egger et al., 1997). The classic fail-safe N value was 4,702, significantly exceeding the conservative threshold of 5k + 10 = 490. Moreover, Egger's intercept was 1.01 (95% CI [−0.03, 2.05]) with a p-value of 0.06, which indicates no publication bias in our data set.

Figure 5. Funnel plot (Hedges' g) of overall transfer effects

Moderator analysis of overall transfer effects (RQ2b)

We examined six variables as potential moderators, including educational level, sample size, study design, subject, instructional strategy, and intervention duration, using the random-effects model to identify the origins of heterogeneity (see Table 5). The moderator analysis indicated no significant differences in effect size among the various study designs (QB = 2.142, df = 2, p = 0.343). This suggests that the different designs estimate a similar treatment effect, allowing for a combined analysis of effect sizes across designs (Morris & DeShon, 2002). Further, the analysis showed that subject did not significantly moderate the CT-STEM benefits (QB = 13.374, df = 9, p = 0.146), indicating effective CT integration across various STEM disciplines (g = 0.567, p < 0.001). However, we observed a notable exception in social science (g = 0.727, p = 0.185), where the integration effect was not significant, in contrast to significant effects in subjects like engineering (g = 0.883, p < 0.001) and science (g = 0.875, p < 0.001).

Significant moderator effects were found in educational level ( QB  = 13.679, df  = 3, p  = 0.003), sample size ( QB  = 48.032, df  = 3, p  < 0.001), instructional strategy ( QB  = 7.387, df  = 2, p  = 0.025), and intervention duration ( QB  = 22.950, df  = 3, p  < 0.001). Specifically, educational levels showed different effects: medium for kindergarten ( g  = 0.777, p  < 0.001), elementary ( g  = 0.613, p  < 0.001), and secondary students ( g  = 0.690, p  < 0.001), but lower for university students ( g  = 0.366, p  < 0.001). This indicates a stronger CT-STEM impact in the lower grades. Smaller sample size groups (less than 50 students) exhibited the highest effect size ( g  = 0.826, p  < 0.001), while larger groups (over 150 students) showed the lowest ( g  = 0.233, p  < 0.001), suggesting a decrease in effect with increasing class size. Instructional strategy was a significant moderator, indicating that the intervention strategy type significantly impacts CT-STEM’s transfer effects. Strategies involving topic contexts (e.g., modeling, simulation, robotics, programming) had the largest effect ( g  = 0.647, p  < 0.001), followed by scaffolding methods (e.g., (meta)cognitive scaffolding) ( g  = 0.492, p  < 0.001), with the instructional model strategy showing the smallest effect ( g  = 0.394, p  < 0.001). In addition, intervention duration was a critical moderator. The most significant effect was observed in interventions lasting between one week and one month ( g  = 0.736, p  < 0.001), with longer durations showing diminishing effects.
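
To clarify how such moderator (subgroup) effects are tested, the sketch below computes a between-group heterogeneity statistic Q_B under a fixed-effect simplification; the study itself used CMA's random-effects machinery, and the (g, se, level) triples here are hypothetical.

```python
# Subgroup (moderator) test: pool each subgroup with inverse-variance
# weights, then compare subgroup means via Q_B against chi-square, df = groups - 1.
from scipy.stats import chi2

studies = [(0.61, 0.12, "primary"), (0.77, 0.20, "primary"),
           (0.69, 0.10, "secondary"), (0.58, 0.15, "secondary"),
           (0.37, 0.09, "university"), (0.33, 0.14, "university")]

groups = {}
for g, se, level in studies:
    groups.setdefault(level, []).append((g, 1 / se**2))

# Pooled effect and total weight per subgroup.
pooled = {lvl: (sum(g * w for g, w in xs) / sum(w for _, w in xs),
                sum(w for _, w in xs)) for lvl, xs in groups.items()}

grand = (sum(g * w for g, w in pooled.values())
         / sum(w for _, w in pooled.values()))
q_b = sum(w * (g - grand)**2 for g, w in pooled.values())
df = len(pooled) - 1
print(f"Q_B = {q_b:.3f}, df = {df}, p = {chi2.sf(q_b, df):.3f}")
```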

CT-STEM’s near and far transfer effects and moderator analysis (RQ3)

Near transfer effect by cognitive and noncognitive benefits (RQ3a)

To further analyze the effect size of CT-STEM near-transfer, we focused on a subgroup encompassing both cognitive and noncognitive benefits, as detailed in Table  6 . We observed that the effect size for CT-STEM near-transfer is 0.645 (95% CI [0.536–0.753], Z  = 11.609, p  < 0.001), indicating a moderate impact on near-transfer benefits, with cognitive benefits demonstrating a larger effect size ( g  = 0.672, 95% CI [0.540–0.804], Z  = 9.978, p  < 0.001) compared to noncognitive benefits ( g  = 0.547, 95% CI [0.388–0.706], Z  = 6.735, p  < 0.001). This suggests that CT-STEM interventions are more impactful on cognitive aspects, e.g., CT skills, programming abilities, and algorithmic thinking, than noncognitive aspects, such as self-efficacy, learning motivation, and attitudes.

We utilized a funnel plot to assess and illustrate the publication bias of the study (see Fig.  6 ). The majority of the studies cluster in the effective area of the plot. The symmetric distribution of studies on the funnel plot’s left and right sides suggests a minimal publication bias. Furthermore, Egger’s test yielded a result of t (70) = 0.85 with a p -value of 0.40, reinforcing this indication. The Classic Fail-safe N was calculated to be 6539, substantially exceeding the estimated number of unpublished studies (5 k + 10 = 370). Therefore, these results collectively suggest that publication bias has a negligible impact on the CT-STEM’s near-transfer effects.

figure 6

Funnel plot (Hedges’ g ) of near-transfer effect

Far transfer effect by cognitive and noncognitive benefits (RQ3a)

In examining CT-STEM far transfer as a specific subgroup (see Table 6), we found a moderate effect size (g = 0.444, 95% CI [0.312–0.576], Z = 6.596, p < 0.001), indicating a significant positive impact of CT-STEM on students' generic skills, including creativity, critical thinking, and problem-solving. A comparison of effect sizes revealed that cognitive benefits (g = 0.466, 95% CI [0.321–0.611], Z = 6.289, p < 0.001) were more pronounced than noncognitive benefits (g = 0.393, 95% CI [0.011–0.775], Z = 1.833, p = 0.044). Thus, CT-STEM effectively enhances both cognitive and noncognitive skills through far transfer, with the effect being stronger for cognitive abilities such as general thinking and problem-solving skills than for noncognitive skills.

The funnel plot for far-transfer effects (see Fig. 7) shows some degree of asymmetry, which was substantiated by Egger's test, yielding t(24) = 3.90 with a p-value of less than 0.001. Although the calculated Fail-safe N (N = 794) is considerably larger than the tolerance threshold of 5k + 10 (130), the observed asymmetry suggests the possibility of some publication bias in the far-transfer effects of our study.

figure 7

Funnel plot (Hedges’ g ) of far-transfer effect

Heterogeneity and moderator analysis of near and far transfer effects (RQ3b)

We conducted heterogeneity assessments for each subgroup, focusing on near-transfer and far-transfer effects. The significant Q statistics indicated high heterogeneity in both groups (Qnear = 671.379, I² = 89.425%, p < 0.001; Qfar = 93.552, I² = 75.415%, p < 0.001). We then explored moderating effects based on educational level, sample size, subject, instructional strategy, and intervention duration. The results showed that the near-transfer effect of CT-STEM is moderated by educational level, sample size, instructional strategy, and intervention duration (see Table 7). In contrast, the far-transfer effect is moderated only by educational level and sample size (see Table 8). These findings suggest that the near-transfer effect is more susceptible to variations in contextual factors than the far-transfer effect.
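
The Q and I² figures above follow the standard definitions. As a quick illustration (our own sketch, not the authors' code), Cochran's Q, Higgins' I², and the DerSimonian-Laird estimate of the between-study variance tau² that feeds random-effects weights can be computed as follows:

```python
import numpy as np

def heterogeneity(effects, variances):
    """Cochran's Q, Higgins' I2 (%), and the DerSimonian-Laird tau2."""
    g = np.asarray(effects, dtype=float)
    w = 1.0 / np.asarray(variances, dtype=float)  # fixed-effect weights
    mean = np.sum(w * g) / np.sum(w)              # pooled mean effect
    q = np.sum(w * (g - mean) ** 2)               # Cochran's Q
    df = len(g) - 1
    i2 = max(0.0, (q - df) / q) * 100.0           # % of variation from heterogeneity
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                 # between-study variance (DL)
    return q, i2, tau2
```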

Discussion and implications

This study examined the transfer effects of CT-STEM on students' cognitive and noncognitive skills through a systematic literature review combined with a meta-analysis. The main findings and implications are discussed in the following sections.

Cognitive and noncognitive benefits through CT-STEM transfer effects

RQ1 asks what cognitive and noncognitive benefits derive from the transfer effects of CT-STEM. From 37 empirical studies, we identified 36 benefits, categorized into four types: CNT, CFT, NCNT, and NCFT. These benefits are consistent with findings in prior studies (e.g., Melro et al., 2023; Román-González et al., 2018; Scherer et al., 2019; Tsarava et al., 2022; Ye et al., 2022), indicating that CT-STEM not only provides cognitive and noncognitive benefits but also fosters the development of domain-specific and domain-general skills. Most prior research has focused on CT-STEM's impact on students' mathematics achievement, CT skills/concepts, self-efficacy, and cooperativity. Our results further suggest that CT-STEM enhances cognitive skills while significantly contributing to affective and social learning outcomes. This finding supports the view that while CT is primarily cognitive, akin to problem-solving abilities, it has a significant noncognitive aspect (Román-González et al., 2018). An illustrative example is the study by Wang et al. (2022b), which developed a non-programming plugged CT program in mathematics that effectively improved students' CT skills, cooperation tendencies, and perceptions of CT.

Most transfer studies to date have primarily focused on students' mathematics and science achievement, with less emphasis on other subjects like physics, biology, and chemistry. One reason is the overlap in thinking practices among these disciplines and CT (Rich et al., 2019; Ye et al., 2023). For example, modeling and simulating complex phenomena in these subjects foster problem decomposition skills, crucial in mathematics, science, and CS. Additionally, CT offers an analytical and systematic framework for problem-solving, a key aspect of tackling complex mathematical and scientific problems (Berland & Wilensky, 2015). Despite this, CT's potential in a wider range of subjects remains underexplored (Ye et al., 2022). Previous studies have identified potential challenges in integrating CT into diverse STEM disciplines (Kite & Park, 2023; Li et al., 2020a), and finding suitable curriculum topics that effectively leverage CT's benefits can be difficult. Beyond mathematics, CT-STEM transfer studies have examined topics such as ecology (Christensen & Lombardi, 2023; Rachmatullah & Wiebe, 2022), force and motion (Aksit & Wiebe, 2020; Hutchins et al., 2020a, 2020b), and chemical reactions (Chongo et al., 2021). This situation indicates a need to explore a broader range of STEM topics to fully leverage the synergy between CT and STEM.

Our review identified only two far-noncognitive benefits of CT-STEM, suggesting these benefits may be harder to measure. Gutman and Schoon ( 2013 ) noted that far-noncognitive skills like perseverance and persistence have variable measurement robustness and are context-dependent. Mirroring the research methods of Israel-Fishelson and Hershkovitz ( 2021 ) and Falloon ( 2016 ), we recommend further capturing and analyzing students’ behaviors through recordings or log files from learning platforms. Additionally, few studies have focused on these competencies in CT-STEM, highlighting a promising direction for future CT-STEM integration efforts.

CT-STEM’s transfer effects

For RQ2 (a) and RQ3 (a), our meta-analysis indicates positive impacts on both cognitive ( g  = 0.628) and noncognitive benefits ( g  = 0.510), each showing moderate effect sizes. This finding supports the use of CT-STEM in enhancing students’ cognitive and noncognitive skills, as suggested by Lee et al. ( 2020 ), who argue that integrating CT in STEM encourages deeper engagement in authentic STEM practices, thereby developing a broad spectrum of skills, including cognitive and noncognitive aspects.

Our finding that cognitive benefits exhibit greater effect sizes than noncognitive benefits across both near and far transfer contrasts with previous research by Kautz et al. (2014), which suggested noncognitive skills are more malleable. Two factors that might explain this disparity are gender and age. Gender may be significant because CT-STEM requires students to utilize computational concepts, practices, and perspectives to solve complex, real-world problems, contexts that can carry inherent gender biases. For example, Czocher et al. (2019) found that female students often experience more frustration and lower engagement in CT-STEM, and similar studies report that they have lower interest, confidence, and self-efficacy than males (Wang et al., 2022b). Jiang and Wong (2022) found no significant gender differences in cognitive skills like CT, indicating that the differences might lie in the affective domains and suggesting that students' noncognitive skills might be less malleable than their cognitive skills in CT-STEM programs. As such, increasing students' motivation, especially among girls, is a crucial issue for future studies (Tikva & Tambouris, 2021b). Age may also be a contributing factor. Lechner et al. (2021) demonstrated that age influences skill adaptability, with younger individuals showing greater exploratory behavior and neural plasticity. Both characteristics are pivotal for cognitive development (e.g., reasoning skills and literacy) (Gualtieri & Finn, 2022), making cognitive skills more plastic than noncognitive skills. This aligns with our findings, where a significant proportion of studies (49%) focused on primary school settings, reinforcing the importance of early CT integration.

In comparing the near- and far-transfer effects, our analysis shows that the effect size for near transfer is higher than that for far transfer in both the cognitive and noncognitive domains, aligning with previous findings that identified a strong effect of programming through near transfer (g = 0.75, 95% CI [0.39, 1.11]) and a moderate effect through far transfer (g = 0.47, 95% CI [0.35, 0.59]) (Scherer et al., 2019). One explanation comes from the theory of "common elements" (Singley & Anderson, 1989), which suggests that skills developed through CT-STEM transfer more readily to similar contexts because of shared conceptual commonalities and elements (Nouri et al., 2020; Scherer et al., 2019). Essentially, students proficient in a skill often find it easier to apply this proficiency to a related skill that shares foundational principles and strategies (Baldwin & Ford, 1988). Even so, far-transfer effects in CT-STEM do occur and are significant. We stress the importance of developing effective strategies that foster these far-transfer effects within the CT-STEM curriculum. One approach is identifying "common elements" and conceptual similarities between different disciplinary contexts and skills, thus promoting transfer.

Contextual variables explaining variation in the CT-STEM’s transfer effects

In our meta-analysis (RQ2(b) and RQ3(b)), we examined the heterogeneity of CT-STEM's overall, near-transfer, and far-transfer effects using six moderators: educational level, sample size, study design, subject, instructional strategy, and intervention duration. For the overall transfer effects, we found significant variations in effect size, with notably higher efficacy among grade school students than university students. This finding further advocates for the early integration of CT in STEM education (Nouri et al., 2020). The difference in CT-STEM's impact can be attributed to two factors: (1) it correlates with students' cognitive and noncognitive development, with the early grades being crucial for acquiring these benefits (Jiang & Wong, 2022); and (2) the hands-on, experiential nature of CT-STEM, utilizing tangible materials and interactive simulations, is particularly suited to the development and learning needs of young children (Thomas & Larwin, 2023). Class size also emerged as a strong moderator (Li et al., 2022; Sun & Zhou, 2022; Sun et al., 2021), with smaller classes (under 50 students) showing more pronounced transfer effects. As class size increases, the impact of CT-STEM on skills development decreases, possibly due to logistical constraints (e.g., space, equipment, and resources) (Cheng et al., 2023). We also found significant differences due to instructional strategies: learning activities involving computational modeling, simulation, and embodied learning yielded larger effect sizes. This supports constructivist educational methods, such as computational modeling, for simulating complex phenomena and facilitating content learning (Basu et al., 2015; Sengupta et al., 2013). For intervention duration, we found that CT-STEM interventions of one week to one month are most effective in enhancing students' learning outcomes, after which the effect size diminishes, in agreement with Sun et al. (2021). This time window may reflect the need to balance learning time against students' ongoing interest and motivation, with extended durations leading to a decrease in motivation and interest as students adjust to the new learning method (Appleton et al., 2008; Cheng et al., 2023). Importantly, our analysis revealed that subject matter had little impact on CT-STEM benefits, suggesting broad applicability across STEM subjects.

Our analysis of near- and far-transfer effects in CT-STEM shows that educational level, sample size, instructional strategy, and intervention duration significantly moderate near-transfer effects, while far-transfer effects are mainly moderated by educational level and sample size. One explanation is that near-transfer effects are linked to domain-specific skills, which respond to particular instructional elements like strategies and duration (van der Graaf et al., 2019). In contrast, far-transfer effects on domain-general skills like critical thinking are moderated primarily by educational level and sample size rather than by instructional design. This may be due to a predominant focus on domain-specific skills in current instructional designs (Geary et al., 2017). One attractive alternative is to treat CT as a transdisciplinary thinking practice and integrate it across various STEM subjects to enhance students' development of domain-general skills (Li et al., 2020b).

Far-transfer effects are linked to cognitive development and social contexts, and are thus influenced by educational level, which aligns with cognitive maturation and skill readiness (Jiang & Wong, 2022; Zhan et al., 2022). In addition, sample size affects social skills and classroom dynamics (Sung et al., 2017; Yılmaz & Yılmaz, 2023). Therefore, in designing CT-STEM activities, it is crucial to consider age-appropriate objectives and learning content, as well as class size, for optimal development of cognitive and social skills. Future research should continue to explore these factors, particularly in developing social skills.

Theoretical and practical implications

This study provides new knowledge for CT-STEM research and informs CT-STEM instructional design and practice. This work extends the current understanding of CT-STEM’s transfer effects on students’ cognitive and noncognitive domains. Our findings support the premise that CT-STEM can significantly enhance the development of students’ cognitive and noncognitive skills through near and far transfer. In addition, we provide a simple hierarchical structure that integrates cognitive and noncognitive domains through a transfer perspective (see Table  3 ). This structure can guide researchers in systematically classifying and identifying measurable constructs, leading to a more comprehensive understanding of student learning in CT-STEM.

Analysis of moderators provides actionable guidance for CT-STEM instructional design to capitalize on positive transfer effects. For overall and near-transfer effects, we encourage the early integration of CT into individual and iSTEM disciplines through well-designed activities. We show that smaller class sizes (under 50 students), interventions lasting one week to one month, and strategic selection of instructional methods like computational modeling promote more effective transfer (see Tables 5 and 7). Consequently, we recommend that educators and instructional designers prioritize creating collaborative learning environments using in-person, hybrid, and online collaborative platforms, reducing logistical issues and allowing closer monitoring of group interactions and timely feedback. Flexible curriculum design, with durations ranging from intensive one-week modules to month-long projects, is key to maximizing transfer effects. Given computational modeling's central role in STEM (NGSS Lead States, 2013), we encourage educators looking to integrate CT into classroom teaching to consider it as a primary entry point. To support far transfer, educators need to develop age-appropriate content and activities that align with students' cognitive development (Zhang & Nouri, 2019), alongside fostering a collaborative culture that nurtures social skills. For the instructional models that have shown the greatest effect sizes (see Table 8), we strongly encourage teachers, especially those with prior experience in CT integration, to develop instructional models based on engineering design processes (Wiebe et al., 2020) that engage students in problem-solving and the creation of creative artifacts to foster their higher-order thinking skills.

Conclusion

This systematic literature review and meta-analysis examined the cognitive and noncognitive benefits of CT-STEM's transfer effects. Analyzing 96 effect sizes from 37 qualifying studies, we found: (a) 36 distinct CT-STEM benefits across four categories, namely CNT, CFT, NCNT, and NCFT; (b) CT-STEM had an overall medium and significant impact across the four categories of benefits (g = 0.601); (c) the effect size for near transfer (g = 0.645) was greater than that for far transfer (g = 0.444), and cognitive benefits (g = 0.628) consistently showed a larger effect size than noncognitive benefits (g = 0.510); and (d) educational level, sample size, instructional strategy, and intervention duration significantly moderated both overall and near-transfer effects, while far-transfer effects were significantly moderated only by educational level and sample size. Our findings provide a roadmap for curriculum designers and teachers to integrate CT into STEM education more effectively and efficiently at all grade levels, enhancing students' development of both cognitive and noncognitive skills.

This study has several limitations. Although it draws on a comprehensive review of the literature across seven databases, some specialized sources might have been overlooked, highlighting the need for future research to include more specialized or professional databases for a fuller understanding of CT-STEM's transfer effects. While the standardization of effect sizes and the moderator analysis helped mitigate potential biases from diverse study designs, further methodological enhancements are warranted in future studies. The findings on noncognitive benefits through far transfer (NCFT), such as social competencies, are constrained by the nature of the research dataset and the small body of available research (Lai & Wong, 2022; Lai et al., 2023), indicating a need for the rigorous development of measurement tools and instructional designs in this area. Finally, we investigated six moderators within CT-STEM but did not examine aspects like curriculum characteristics and teachers' experience. These areas, due to their qualitative nature and infrequent reporting in our sample studies, were not included but are significant avenues for future research. Despite these limitations, the study's contributions are significant: it systematically elucidates the cognitive and noncognitive benefits of CT-STEM transfer effects and provides robust evidence, and the identified moderators can aid educators in facilitating transfer within classroom teaching.

Availability of data and materials

The datasets used and analyzed during the current study are available from the corresponding author on reasonable request.

Adanır, G. A., Delen, I., & Gulbahar, Y. (2024). Research trends in K-5 computational thinking education: A bibliometric analysis and ideas to move forward. Education and Information Technologies, 29 , 3589–3614. https://doi.org/10.1007/s10639-023-11974-4

Aksit, O., & Wiebe, E. N. (2020). Exploring force and motion concepts in middle grades using computational modeling: A classroom intervention study. Journal of Science Education and Technology, 29 , 65–82. https://doi.org/10.1007/s10956-019-09800-z

Angeli, C. (2022). The effects of scaffolded programming scripts on pre-service teachers’ computational thinking: Developing algorithmic thinking through programming robots. International Journal of Child-Computer Interaction, 31 , 100329. https://doi.org/10.1016/j.ijcci.2021.100329

Appleton, J. J., Christenson, S. L., & Furlong, M. J. (2008). Student engagement with school: Critical conceptual and methodological issues of the construct. Psychology in the Schools, 45 (5), 369–386. https://doi.org/10.1002/pits.20303

Arfé, B., Vardanega, T., Montuori, C., & Lavanga, M. (2019). Coding in primary grades boosts children’s executive functions. Frontiers in Psychology, 10 , 2713. https://doi.org/10.3389/fpsyg.2019.02713

Bai, S., Hew, K. F., & Huang, B. (2020). Does gamification improve student learning outcome? Evidence from a meta-analysis and synthesis of qualitative data in educational contexts. Educational Research Review, 30 , 100322. https://doi.org/10.1016/j.edurev.2020.100322

Baldwin, T. T., & Ford, J. K. (1988). Transfer of training: A review and directions for future research. Personnel Psychology, 41 (1), 63–105. https://doi.org/10.1111/j.1744-6570.1988.tb00632.x

Barr, V., & Stephenson, C. (2011). Bringing computational thinking to K-12: What is involved and what is the role of the computer science education community? ACM Inroads, 2 (1), 48–54. https://doi.org/10.1145/1929887.1929905

Barth-Cohen, L., Montoya, B., & Shen, J. (2019). Walk like a robot: A no-tech coding activity to teach computational thinking. Science Scope , 42 (9), 12–17. https://www.jstor.org/stable/26899024

Basu, S., Sengupta, P., & Biswas, G. (2015). A scaffolding framework to support learning of emergent phenomena using multi-agent-based simulation environments. Research in Science Education, 45 , 293–324. https://doi.org/10.1007/s11165-014-9424-z

Berland, M., & Wilensky, U. (2015). Comparing virtual and physical robotics environments for supporting complex systems and computational thinking. Journal of Science Education and Technology, 24 , 628–647. https://doi.org/10.1007/s10956-015-9552-x

Bernardo, M. A., & Morris, J. D. (1994). Transfer effects of a high school computer programming course on mathematical modeling, procedural comprehension, and verbal problem solution. Journal of Research on Computing in Education, 26 (4), 523–536. https://doi.org/10.1080/08886504.1994.10782108

Bers, M. U., Flannery, L., Kazakoff, E. R., & Sullivan, A. (2014). Computational thinking and tinkering: Exploration of an early childhood robotics curriculum. Computers & Education, 72 , 145–157. https://doi.org/10.1016/j.compedu.2013.10.020

Bloom, B. S., & Krathwohl, D. R. (1956). Taxonomy of educational objectives: The classification of educational goals by a committee of college and university examiners . Handbook I: Cognitive domain . Longmans, Green.

Borenstein, M. (2005). Software for publication bias. In H. R. Rothstein, A. J. Sutton, & M. Borenstein (Eds.), Publication bias in meta-analysis: Prevention, assessment and adjustments (pp. 193–220). John Wiley & Sons. https://doi.org/10.1002/0470870168

Borenstein, M., Hedges, L. V., Higgins, J. P. T., & Rothstein, H. R. (2009). Random-effects model. In Introduction to meta-analysis (pp. 69–75). John Wiley & Sons. https://doi.org/10.1002/9780470743386

Borenstein, M., Hedges, L., Higgins, J., & Rothstein, H. (2013). Comprehensive Meta Analysis (Version 3) [Computer software]. Biostat.

Borenstein, M., Hedges, L. V., Higgins, J. P. T., & Rothstein, H. R. (2021). Subgroup analyses. In Introduction to meta-analysis (2nd ed., pp. 161–195). John Wiley & Sons.

Bortz, W. W., Gautam, A., Tatar, D., & Lipscomb, K. (2020). Missing in measurement: Why identifying learning in integrated domains is so hard. Journal of Science Education and Technology, 29 , 121–136. https://doi.org/10.1007/s10956-019-09805-8

Bransford, J. D., & Schwartz, D. L. (1999). Chapter 3: Rethinking transfer: A simple proposal with multiple implications. Review of Research in Education, 24 (1), 61–100. https://doi.org/10.3102/0091732X024001061

Brennan, K., & Resnick, M. (2012). New frameworks for studying and assessing the development of computational thinking. In Proceedings of the 2012 Annual Meeting of the American Educational Research Association (pp. 1–25). Vancouver, BC. http://scratched.gse.harvard.edu/ct/files/AERA2012.pdf

Chen, H. E., Sun, D., Hsu, T. C., Yang, Y., & Sun, J. (2023a). Visualising trends in computational thinking research from 2012 to 2021: A bibliometric analysis. Thinking Skills and Creativity, 47 , 101224. https://doi.org/10.1016/j.tsc.2022.101224

Chen, J., Wang, M., Kirschner, P. A., & Tsai, C.-C. (2018). The role of collaboration, computer use, learning environments, and supporting strategies in CSCL: A meta-analysis. Review of Educational Research, 88 (6), 799–843. https://doi.org/10.3102/0034654318791584

Chen, P., Yang, D., Metwally, A. H. S., Lavonen, J., & Wang, X. (2023b). Fostering computational thinking through unplugged activities: A systematic literature review and meta-analysis. International Journal of STEM Education, 10 , 47. https://doi.org/10.1186/s40594-023-00434-7

Cheng, L., Wang, X., & Ritzhaupt, A. D. (2023). The effects of computational thinking integration in STEM on students’ learning performance in K-12 Education: A Meta-analysis. Journal of Educational Computing Research, 61 (2), 416–443. https://doi.org/10.1177/07356331221114183

Chongo, S., Osman, K., & Nayan, N. A. (2021). Impact of the plugged-in and unplugged chemistry computational thinking modules on achievement in chemistry. EURASIA Journal of Mathematics, Science and Technology Education, 17 (4), em1953. https://doi.org/10.29333/ejmste/10789

Christensen, D., & Lombardi, D. (2023). Biological evolution learning and computational thinking: Enhancing understanding through integration of disciplinary core knowledge and scientific practice. International Journal of Science Education, 45 (4), 293–313. https://doi.org/10.1080/09500693.2022.2160221

CSTA & ISTE. (2011). Operational definition of computational thinking for K–12 education . Retrieved from http://csta.acm.org/Curriculum/sub/CurrFiles/CompThinkingFlyer.pdf

Cuijpers, P., Weitz, E., Cristea, I. A., & Twisk, J. (2017). Pre-post effect sizes should be avoided in meta-analyses. Epidemiology and Psychiatric Sciences, 26 (4), 364–368. https://doi.org/10.1017/S2045796016000809

Czocher, J. A., Melhuish, K., & Kandasamy, S. S. (2019). Building mathematics self-efficacy of STEM undergraduates through mathematical modelling. International Journal of Mathematical Education in Science and Technology, 51 (6), 807–834. https://doi.org/10.1080/0020739X.2019.1634223

Day, S. B., & Goldstone, R. L. (2012). The import of knowledge export: Connecting findings and theories of transfer of learning. Educational Psychologist, 47 (3), 153–176. https://doi.org/10.1080/00461520.2012.696438

Delen, I., & Sen, S. (2023). Effect of design-based learning on achievement in K-12 education: A meta-analysis. Journal of Research in Science Teaching, 60 (2), 330–356. https://doi.org/10.1002/tea.21800

Denner, J., Werner, L., & Ortiz, E. (2012). Computer games created by middle school girls: Can they be used to measure understanding of computer science concepts? Computers & Education, 58 (1), 240–249. https://doi.org/10.1016/j.compedu.2011.08.006

Doleck, T., Bazelais, P., Lemay, D. J., Saxena, A., & Basnet, R. B. (2017). Algorithmic thinking, cooperativity, creativity, critical thinking, and problem solving: Exploring the relationship between computational thinking skills and academic performance. Journal of Computers in Education, 4 , 355–369. https://doi.org/10.1007/s40692-017-0090-9

Egger, M., Smith, G. D., Schneider, M., & Minder, C. (1997). Bias in meta-analysis detected by a simple, graphical test. British Medical Journal, 315 (7109), 629–634. https://doi.org/10.1136/bmj.315.7109.629

Eidin, E., Bielik, T., Touitou, I., Bowers, J., McIntyre, C., Damelin, D., & Krajcik, J. (2024). Thinking in terms of change over time: Opportunities and challenges of using system dynamics models. Journal of Science Education and Technology, 33 , 1–28. https://doi.org/10.1007/s10956-023-10047-y

Ezeamuzie, N. O., & Leung, J. S. C. (2022). Computational thinking through an empirical lens: A systematic review of literature. Journal of Educational Computing Research, 60 (2), 481–511. https://doi.org/10.1177/07356331211033158

Falloon, G. (2016). An analysis of young students’ thinking when completing basic coding tasks using Scratch Jnr. on the iPad. Journal of Computer Assisted Learning, 32 (6), 576–593. https://doi.org/10.1111/jcal.12155

Fanchamps, N. L. J. A., Slangen, L., Hennissen, P., & Specht, M. (2021). The influence of SRA programming on algorithmic thinking and self-efficacy using Lego robotics in two types of instruction. International Journal of Technology and Design Education, 31 , 203–222. https://doi.org/10.1007/s10798-019-09559-9

Geary, D. C., Nicholas, A., Li, Y., & Sun, J. (2017). Developmental change in the influence of domain-general abilities and domain-specific knowledge on mathematics achievement: An eight-year longitudinal study. Journal of Educational Psychology, 109 (5), 680–693. https://doi.org/10.1037/edu0000159

Grover, S., & Pea, R. (2013). Computational thinking in K–12: A review of the state of the field. Educational Researcher, 42 (1), 38–43. https://doi.org/10.3102/0013189X12463051

Gualtieri, S., & Finn, A. S. (2022). The sweet spot: When children’s developing abilities, brains, and knowledge make them better learners than adults. Perspectives on Psychological Science, 17 (5), 1322–1338. https://doi.org/10.1177/17456916211045971

Gutman, L. M., & Schoon, I. (2013). The impact of non-cognitive skills on outcomes for young people . University of London, Institute of Education.

Guven, G., Kozcu Cakir, N., Sulun, Y., Cetin, G., & Guven, E. (2022). Arduino-assisted robotics coding applications integrated into the 5E learning model in science teaching. Journal of Research on Technology in Education, 54 (1), 108–126. https://doi.org/10.1080/15391523.2020.1812136

Hedges, L. V., & Olkin, I. (2014). Statistical methods for meta-analysis . Academic Press.

Hsu, T.-C., Abelson, H., Lao, N., & Chen, S.-C. (2021). Is it possible for young students to learn the AI-STEAM application with experiential learning? Sustainability, 13 (19), 11114. https://doi.org/10.3390/su131911114

Hsu, T.-C., Chang, S.-C., & Hung, Y.-T. (2018). How to learn and how to teach computational thinking: Suggestions based on a review of the literature. Computers & Education, 126 , 296–310. https://doi.org/10.1016/j.compedu.2018.07.004

Hurt, T., Greenwald, E., Allan, S., Cannady, M. A., Krakowski, A., Brodsky, L., Collins, M. A., Montgomery, R., & Dorph, R. (2023). The computational thinking for science (CT-S) framework: Operationalizing CT-S for K–12 science education researchers and educators. International Journal of STEM Education, 10 , 1. https://doi.org/10.1186/s40594-022-00391-7

Hutchins, N. M., Biswas, G., Maróti, M., Lédeczi, Á., Grover, S., Wolf, R., Blair, K. P., Chin, D., Conlin, L., Basu, S., & McElhaney, K. (2020a). C2STEM: A system for synergistic learning of physics and computational thinking. Journal of Science Education and Technology, 29 , 83–100. https://doi.org/10.1007/s10956-019-09804-9

Hutchins, N. M., Biswas, G., Zhang, N., Snyder, C., Lédeczi, Á., & Maróti, M. (2020b). Domain-specific modeling languages in computer-based learning environments: A systematic approach to support science learning through computational modeling. International Journal of Artificial Intelligence in Education, 30 , 537–580. https://doi.org/10.1007/s40593-020-00209-z

Israel-Fishelson, R., & Hershkovitz, A. (2021). Micro-persistence and difficulty in a game-based learning environment for computational thinking acquisition. Journal of Computer Assisted Learning, 37 (3), 839–850. https://doi.org/10.1111/jcal.12527

Israel-Fishelson, R., & Hershkovitz, A. (2022). Studying interrelations of computational thinking and creativity: A scoping review (2011–2020). Computers & Education, 176 , 104353. https://doi.org/10.1016/j.compedu.2021.104353

Jesson, J., Matheson, L., & Lacey, F. M. (2011). Doing your literature review: Traditional and systematic techniques (1st ed.). SAGE Publications.

Jiang, S., & Wong, G. K. W. (2022). Exploring age and gender differences of computational thinkers in primary school: A developmental perspective. Journal of Computer Assisted Learning, 38 (1), 60–75. https://doi.org/10.1111/jcal.12591

Jocius, R., O’Byrne, W. I., Albert, J., Joshi, D., Robinson, R., & Andrews, A. (2021). Infusing computational thinking into STEM teaching: From professional development to classroom practice. Educational Technology & Society, 24 (4), 166–179.

Kafai, Y. B., & Proctor, C. (2022). A revaluation of computational thinking in K–12 education: Moving toward computational literacies. Educational Researcher, 51 (2), 146–151. https://doi.org/10.3102/0013189X211057904

Kalelioglu, F., Gulbahar, Y., & Kukul, V. (2016). A framework for computational thinking based on a systematic research review. Baltic Journal of Modern Computing, 4 (3), 583–596.

Kautz, T., Heckman, J. J., Diris, R., ter Weel, B., & Borghans, L. (2014). Fostering and measuring skills: Improving cognitive and non-cognitive skills to promote lifetime success (OECD Education Working Papers No. 110). OECD Publishing. https://doi.org/10.1787/5jxsr7vr78f7-en

Kelley, T. R., & Knowles, J. G. (2016). A conceptual framework for integrated STEM education. International Journal of STEM Education, 3 , 11. https://doi.org/10.1186/s40594-016-0046-z

Kite, V., & Park, S. (2023). What’s computational thinking? Secondary science teachers’ conceptualizations of computational thinking (CT) and perceived barriers to CT integration. Journal of Science Teacher Education, 34 (4), 391–414. https://doi.org/10.1080/1046560X.2022.2110068

Knochel, A. D., & Patton, R. M. (2015). If art education then critical digital making: Computational thinking and creative code. Studies in Art Education, 57 (1), 21–38.

Korkmaz, Ö., Çakir, R., & Özden, M. Y. (2017). A validity and reliability study of the computational thinking scales (CTS). Computers in Human Behavior , 72 , 558–569. https://doi.org/10.1016/j.chb.2017.01.005

Lai, R. P., & Ellefson, M. R. (2023). How multidimensional is computational thinking competency? A bi-factor model of the computational thinking challenge. Journal of Educational Computing Research, 61 (2), 259–282. https://doi.org/10.1177/07356331221121052

Lai, X., & Wong, G. K. W. (2022). Collaborative versus individual problem solving in computational thinking through programming: A meta-analysis. British Journal of Educational Technology, 53 (1), 150–170. https://doi.org/10.1111/bjet.13157

Lai, X., Ye, J., & Wong, G. K. W. (2023). Effectiveness of collaboration in developing computational thinking skills: A systematic review of social cognitive factors. Journal of Computer Assisted Learning, 39 (5), 1418–1435. https://doi.org/10.1111/jcal.12845

Lechner, C. M., Gauly, B., Miyamoto, A., & Wicht, A. (2021). Stability and change in adults’ literacy and numeracy skills: Evidence from two large-scale panel studies. Personality and Individual Differences, 180 , 110990. https://doi.org/10.1016/j.paid.2021.110990

Lee, I., Grover, S., Martin, F., Pillai, S., & Malyn-Smith, J. (2020). Computational thinking from a disciplinary perspective: Integrating computational thinking in K-12 science, technology, engineering, and mathematics education. Journal of Science Education and Technology, 29 , 1–8. https://doi.org/10.1007/s10956-019-09803-w

Lee, I., & Malyn-Smith, J. (2020). Computational thinking integration patterns along the framework defining computational thinking from a disciplinary perspective. Journal of Science Education and Technology, 29 , 9–18. https://doi.org/10.1007/s10956-019-09802-x

Leonard, J., Buss, A., Gamboa, R., Mitchell, M., Fashola, O. S., Hubert, T., & Almughyirah, S. (2016). Using robotics and game design to enhance children’s self-efficacy, STEM attitudes, and computational thinking skills. Journal of Science Education and Technology, 25 , 860–876. https://doi.org/10.1007/s10956-016-9628-2

Li, F., Wang, X., He, X., Cheng, L., & Wang, Y. (2022). The effectiveness of unplugged activities and programming exercises in computational thinking education: A meta-analysis. Education and Information Technologies, 27 , 7993–8013. https://doi.org/10.1007/s10639-022-10915-x

Li, X., Xie, K., Vongkulluksn, V., Stein, D., & Zhang, Y. (2023). Developing and testing a design-based learning approach to enhance elementary students’ self-perceived computational thinking. Journal of Research on Technology in Education, 55 (2), 344–368. https://doi.org/10.1080/15391523.2021.1962453

Li, Y., & Anderson, J. (2020). STEM integration: Diverse approaches to meet diverse needs. In J. Anderson & Y. Li (Eds.), Integrated approaches to STEM education: An international perspective (pp. 15–20). Springer. https://doi.org/10.1007/978-3-030-52229-2_2

Li, Y., Schoenfeld, A. H., diSessa, A. A., Graesser, A. C., Benson, L. C., English, L. D., & Duschl, R. A. (2020a). Computational thinking is more about thinking than computing. Journal for STEM Education Research, 3 , 1–18. https://doi.org/10.1007/s41979-020-00030-2

Li, Y., Schoenfeld, A. H., diSessa, A. A., Graesser, A. C., Benson, L. C., English, L. D., & Duschl, R. A. (2020b). On computational thinking and STEM education. Journal for STEM Education Research, 3 , 147–166. https://doi.org/10.1007/s41979-020-00044-w

Lipsey, M. W., & Wilson, D. B. (2001). Practical meta-analysis . SAGE Publications Inc.

Liu, Z., & Jeong, A. C. (2022). Connecting learning and playing: The effects of in-game cognitive supports on the development and transfer of computational thinking skills. Educational Technology Research and Development, 70 , 1867–1891. https://doi.org/10.1007/s11423-022-10145-5

Lobato, J. (2006). Alternative perspectives on the transfer of learning: History, issues, and challenges for future research. The Journal of the Learning Sciences, 15 (4), 431–449. https://doi.org/10.1207/s15327809jls1504_1

Lu, C., Macdonald, R., Odell, B., Kokhan, V., Demmans Epp, C., & Cutumisu, M. (2022). A scoping review of computational thinking assessments in higher education. Journal of Computing in Higher Education, 34 , 416–461. https://doi.org/10.1007/s12528-021-09305-y

Lyon, J. A., & Magana, A. J. (2021). The use of engineering model-building activities to elicit computational thinking: A design-based research study. Journal of Engineering Education, 110 (1), 184–206. https://doi.org/10.1002/jee.20372

Ma, H., Zhao, M., Wang, H., Wan, X., Cavanaugh, T. W., & Liu, J. (2021). Promoting pupils’ computational thinking skills and self-efficacy: A problem-solving instructional approach. Educational Technology Research and Development, 69 , 1599–1616. https://doi.org/10.1007/s11423-021-10016-5

Malyn-Smith, J., & Ippolito, J. (2011). Profile of a computational thinking enabled STEM professional in America’s workplaces: Research Scientist (Unpublished manuscript) . Education Development Center, Inc.

Mayer, R. E. (2011). Multimedia learning and games. In S. Tobias & J. D. Fletcher (Eds.), Computer Games and Instruction (pp. 281–305). Information Age Publishing.

Mayer, R. E. (2015). On the need for research evidence to guide the design of computer games for learning. Educational Psychologist, 50 (4), 349–353. https://doi.org/10.1080/00461520.2015.1133307

Melro, A., Tarling, G., Fujita, T., & Kleine Staarman, J. (2023). What else can be learned when coding? A configurative literature review of learning opportunities through computational thinking. Journal of Educational Computing Research, 61 (4), 901–924. https://doi.org/10.1177/07356331221133822

Merino-Armero, J. M., González-Calero, J. A., & Cozar-Gutierrez, R. (2022). Computational thinking in K-12 education. An insight through meta-analysis. Journal of Research on Technology in Education, 54 (3), 410–437. https://doi.org/10.1080/15391523.2020.1870250

Moher, D., Liberati, A., Tetzlaff, J., & Altman, D. G. (2010). Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. International Journal of Surgery, 8 (5), 336–341. https://doi.org/10.1016/j.ijsu.2010.02.007

Morris, S. B., & DeShon, R. P. (2002). Combining effect size estimates in meta-analysis with repeated measures and independent-groups designs. Psychological Methods, 7 (1), 105. https://doi.org/10.1037/1082-989X.7.1.105

Ng, O. L., Leung, A., & Ye, H. (2023). Exploring computational thinking as a boundary object between mathematics and computer programming for STEM teaching and learning. ZDM Mathematics Education, 55 , 1315–1329. https://doi.org/10.1007/s11858-023-01509-z

NGSS Lead States. (2013). Next generation science standards: For states, by states . The National Academy Press.

Nouri, J., Zhang, L., Mannila, L., & Norén, E. (2020). Development of computational thinking, digital competence and 21st century skills when learning programming in K-9. Education Inquiry, 11 (1), 1–17. https://doi.org/10.1080/20004508.2019.1627844

OECD. (2018). Future of education and skills 2030: Conceptual learning framework . A literature summary for research on the transfer of learning (8th Informal Working Group Meeting, pp. 1–29). OECD Conference Centre, Paris, France.

Papert, S. A. (1980). Mindstorms: Children, computers, and powerful ideas . Basic Books.

Perkins, D. N., & Salomon, G. (1992). Transfer of learning. In T. N. Postlethwaite & T. Husen (Eds.), International Encyclopedia of Education (2nd ed., pp. 6452–6457). Pergamon Press.

Petersen, R. D., & Valdez, A. (2005). Using snowball-based methods in hidden populations to generate a randomized community sample of gang-affiliated adolescents. Youth Violence and Juvenile Justice, 3 (2), 151–167. https://doi.org/10.1177/1541204004273316

Phillips, A. M., Gouvea, E. J., Gravel, B. E., Beachemin, P. H., & Atherton, T. J. (2023). Physicality, modeling, and agency in a computational physics class. Physical Review Physics Education Research, 19 (1), 010121. https://doi.org/10.1103/PhysRevPhysEducRes.19.010121

Piatti, A., Adorni, G., El-Hamamsy, L., Negrini, L., Assaf, D., Gambardella, L., & Mondada, F. (2022). The CT-cube: A framework for the design and the assessment of computational thinking activities. Computers in Human Behavior Reports, 5 , 100166. https://doi.org/10.1016/j.chbr.2021.100166

Pirolli, P., & Recker, M. (1994). Learning strategies and transfer in the domain of programming. Cognition and Instruction, 12 (3), 235–275. https://doi.org/10.1207/s1532690xci1203_2

Polat, E., Hopcan, S., Kucuk, S., & Sisman, B. (2021). A comprehensive assessment of secondary school students’ computational thinking skills. British Journal of Educational Technology, 52 (5), 1965–1980. https://doi.org/10.1111/bjet.13092

Popat, S., & Starkey, L. (2019). Learning to code or coding to learn? A systematic review. Computers & Education, 128 , 365–376. https://doi.org/10.1016/j.compedu.2018.10.005

Rachmatullah, A., & Wiebe, E. N. (2022). Building a computational model of food webs: Impacts on middle school students’ computational and systems thinking skills. Journal of Research in Science Teaching, 59 (4), 585–618. https://doi.org/10.1002/tea.21738

Rich, K. M., Spaepen, E., Strickland, C., & Moran, C. (2019). Synergies and differences in mathematical and computational thinking: Implications for integrated instruction. Interactive Learning Environments, 28 (3), 272–283. https://doi.org/10.1080/10494820.2019.1612445

Rodríguez-Martínez, J. A., González-Calero, J. A., & Sáez-López, J. M. (2019). Computational thinking and mathematics using Scratch: An experiment with sixth-grade students. Interactive Learning Environments, 28 (3), 316–327. https://doi.org/10.1080/10494820.2019.1612448

Román-González, M., Pérez-González, J. C., & Jiménez-Fernández, C. (2017). Which cognitive abilities underlie computational thinking? Criterion validity of the computational thinking test. Computers in Human Behavior, 72 , 678–691. https://doi.org/10.1016/j.chb.2016.08.047

Román-González, M., Pérez-González, J. C., Moreno-León, J., & Robles, G. (2018). Extending the nomological network of computational thinking with noncognitive factors. Computers in Human Behavior, 80 , 441–459. https://doi.org/10.1016/j.chb.2017.09.030

Rosenberg, M. S. (2005). The file-drawer problem revisited: A general weighted method for calculating fail-safe numbers in meta-analysis. Evolution, 59 (2), 464–468. https://doi.org/10.1111/j.0014-3820.2005.tb01004.x

Rosenthal, R. (1979). The file drawer problem and tolerance for null results. Psychological Bulletin, 86 , 638–641.

Sala, G., & Gobet, F. (2016). Do the benefits of chess instruction transfer to academic and cognitive skills? A meta-analysis. Educational Research Review, 18 , 46–57. https://doi.org/10.1016/j.edurev.2016.02.002

Sala, G., & Gobet, F. (2017). Does far transfer exist? Negative evidence from chess, music, and working memory training. Current Directions in Psychological Science, 26 (6), 515–520. https://doi.org/10.1177/0963721417712760

Scherer, R., Siddiq, F., & Sánchez Viveros, B. (2019). The cognitive benefits of learning computer programming: A meta-analysis of transfer effects. Journal of Educational Psychology, 111 (5), 764–792. https://doi.org/10.1037/edu0000314

Scherer, R., Siddiq, F., & Sánchez Viveros, B. (2020). A meta-analysis of teaching and learning computer programming: Effective instructional approaches and conditions. Computers in Human Behavior, 109 , 106349. https://doi.org/10.1016/j.chb.2020.106349

Selby, C. C., & Woollard, J. (2013). Computational thinking: The developing definition. In Paper presented at the 18th annual conference on innovation and technology in computer science education , Canterbury.

Sengupta, P., Kinnebrew, J. S., Basu, S., Biswas, G., & Clark, D. (2013). Integrating computational thinking with K-12 science education using agent-based computation: A theoretical framework. Education and Information Technologies, 18 , 351–380. https://doi.org/10.1007/s10639-012-9240-x

Shamseer, L., Moher, D., Clarke, M., Ghersi, D., Liberati, A., Petticrew, M., Shekelle, P., & Stewart, L. A. (2015). Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015: Elaboration and explanation. BMJ, 349 , g7647. https://doi.org/10.1136/bmj.g7647

Shute, V. J., Sun, C., & Asbell-Clarke, J. (2017). Demystifying computational thinking. Educational Research Review, 22 , 142–158. https://doi.org/10.1016/j.edurev.2017.09.003

Singley, M. K., & Anderson, J. R. (1989). The transfer of cognitive skill . Harvard University Press.

Sun, L., Hu, L., & Zhou, D. (2021). Which way of design programming activities is more effective to promote K-12 students’ computational thinking skills? A meta-analysis. Journal of Computer Assisted Learning, 37 (4), 1048–1062. https://doi.org/10.1111/jcal.12545

Sun, L., & Zhou, D. (2022). Effective instruction conditions for educational robotics to develop programming ability of K-12 students: A meta-analysis. Journal of Computer Assisted Learning, 39 (2), 380–398. https://doi.org/10.1111/jcal.12750

Sung, W., Ahn, J., & Black, J. B. (2017). Introducing computational thinking to young learners: Practicing computational perspectives through embodiment in mathematics education. Technology, Knowledge and Learning, 22 , 443–463. https://doi.org/10.1007/s10758-017-9328-x

Sung, W., & Black, J. B. (2021). Factors to consider when designing effective learning: Infusing computational thinking in mathematics to support thinking-doing. Journal of Research on Technology in Education, 53 (4), 404–426. https://doi.org/10.1080/15391523.2020.1784066

Tang, X., Yin, Y., Lin, Q., Hadad, R., & Zhai, X. (2020). Assessing computational thinking: A systematic review of empirical studies. Computers & Education, 148 , 103798. https://doi.org/10.1016/j.compedu.2019.103798

Tekdal, M. (2021). Trends and development in research on computational thinking. Education and Information Technologies, 26 , 6499–6529. https://doi.org/10.1007/s10639-021-10617-w

Thomas, D. R., & Larwin, K. H. (2023). A meta-analytic investigation of the impact of middle school STEM education: Where are all the students of color? International Journal of STEM Education, 10 , 43. https://doi.org/10.1186/s40594-023-00425-8

Tikva, C., & Tambouris, E. (2021a). A systematic mapping study on teaching and learning computational thinking through programming in higher education. Thinking Skills and Creativity, 41 , 100849. https://doi.org/10.1016/j.tsc.2021.100849

Tikva, C., & Tambouris, E. (2021b). Mapping computational thinking through programming in K-12 education: A conceptual model based on a systematic literature review. Computers & Education, 162 , 104083. https://doi.org/10.1016/j.compedu.2020.104083

Tsai, M.-J., Liang, J.-C., & Hsu, C.-Y. (2021). The computational thinking scale for computer literacy education. Journal of Educational Computing Research, 59 (4), 579–602. https://doi.org/10.1177/0735633120972356

Tsai, M.-J., Liang, J.-C., Lee, S.W.-Y., & Hsu, C.-Y. (2022). Structural validation for the developmental model of computational thinking. Journal of Educational Computing Research, 60 (1), 56–73. https://doi.org/10.1177/07356331211017794

Tsai, M.-J., Wang, C.-Y., & Hsu, P.-F. (2019). Developing the computer programming self-efficacy scale for computer literacy education. Journal of Educational Computing Research, 56 (8), 1345–1360. https://doi.org/10.1177/0735633117746747

Tsai, Y.-L., & Tsai, C.-C. (2018). Digital game-based second-language vocabulary learning and conditions of research designs: A meta-analysis study. Computers & Education, 125 , 345–357. https://doi.org/10.1016/j.compedu.2018.06.020

Tsarava, K., Moeller, K., Román-González, M., Golle, J., Leifheit, L., Butz, M. V., & Ninaus, M. (2022). A cognitive definition of computational thinking in primary education. Computers & Education, 179 , 104425. https://doi.org/10.1016/j.compedu.2021.104425

van der Graaf, J., van de Sande, E., Gijsel, M., & Segers, E. (2019). A combined approach to strengthen children’s scientific thinking: Direct instruction on scientific reasoning and training of teacher’s verbal support. International Journal of Science Education, 41 (9), 1119–1138. https://doi.org/10.1080/09500693.2019.1594442

Wang, C., Shen, J., & Chao, J. (2022a). Integrating computational thinking in STEM education: A literature review. International Journal of Science and Mathematics Education, 20 , 1949–1972. https://doi.org/10.1007/s10763-021-10227-5

Wang, J., Zhang, Y., Hung, C. Y., Wang, Q., & Zheng, Y. (2022b). Exploring the characteristics of an optimal design of non-programming plugged learning for developing primary school students’ computational thinking in mathematics. Educational Technology Research and Development, 70 , 849–880. https://doi.org/10.1007/s11423-022-10093-0

Waterman, K. P., Goldsmith, L., & Pasquale, M. (2020). Integrating computational thinking into elementary science curriculum: An examination of activities that support students’ computational thinking in the service of disciplinary learning. Journal of Science Education and Technology, 29 , 53–64. https://doi.org/10.1007/s10956-019-09801-y

Weintrop, D., Beheshti, E., Horn, M., Orton, K., Jona, K., Trouille, L., & Wilensky, U. (2016). Defining computational thinking for mathematics and science classrooms. Journal of Science Education and Technology, 25 , 127–147. https://doi.org/10.1007/s10956-015-9581-5

Weller, D. P., Bott, T. E., Caballero, M. D., & Irving, P. W. (2022). Development and illustration of a framework for computational thinking practices in introductory physics. Physical Review Physics Education Research, 18 (2), 020106. https://doi.org/10.1103/PhysRevPhysEducRes.18.020106

Wiebe, E., Kite, V., & Park, S. (2020). Integrating computational thinking in STEM. In C. C. Johnson, M. J. Mohr-Schroeder, T. J. Moore, & L. D. English (Eds.), Handbook of Research on STEM Education (pp. 196–209). Taylor & Francis Group.

Wing, J. M. (2006). Computational thinking. Communications of the ACM, 49 (3), 33–35. https://doi.org/10.1145/1118178.1118215

Wing, J. M. (2008). Computational thinking and thinking about computing. Philosophical Transactions of the Royal Society a: Mathematical, Physical and Engineering Sciences, 366 (1881), 3717–3725. https://doi.org/10.1098/rsta.2008.0118

Wing, J. M. (2011). Research notebook: Computational thinking—What and why. The Link Magazine, 6 , 20–23.

Woo, K., & Falloon, G. (2022). Problem solved, but how? An exploratory study into students’ problem solving processes in creative coding tasks. Thinking Skills and Creativity, 46 , 101193. https://doi.org/10.1016/j.tsc.2022.101193

Xia, L., & Zhong, B. (2018). A systematic review on teaching and learning robotics content knowledge in K-12. Computers & Education, 127 , 267–282. https://doi.org/10.1016/j.compedu.2018.09.007

Xu, W., Geng, F., & Wang, L. (2022). Relations of computational thinking to reasoning ability and creative thinking in young children: Mediating role of arithmetic fluency. Thinking Skills and Creativity, 44 , 101041. https://doi.org/10.1016/j.tsc.2022.101041

Xu, Z., Ritzhaupt, A. D., Tian, F., & Umapathy, K. (2019). Block-based versus text-based programming environments on novice student learning outcomes: A meta-analysis study. Computer Science Education, 29 (2–3), 177–204. https://doi.org/10.1080/08993408.2019.1565233

Ye, H., Liang, B., Ng, O.-L., & Chai, C. S. (2023). Integration of computational thinking in K-12 mathematics education: A systematic review on CT-based mathematics instruction and student learning. International Journal of STEM Education, 10 , 3. https://doi.org/10.1186/s40594-023-00396-w

Ye, J., Lai, X., & Wong, G. K. W. (2022). The transfer effects of computational thinking: A systematic review with meta-analysis and qualitative synthesis. Journal of Computer Assisted Learning, 38 (6), 1620–1638. https://doi.org/10.1111/jcal.12723

Yılmaz, F. G. K., & Yılmaz, R. (2023). Exploring the role of sociability, sense of community and course satisfaction on students’ engagement in flipped classroom supported by facebook groups. Journal of Computers in Education, 10 , 135–162. https://doi.org/10.1007/s40692-022-00226-y

Yin, Y., Hadad, R., Tang, X., & Lin, Q. (2020). Improving and assessing computational thinking in maker activities: The integration with physics and engineering learning. Journal of Science Education and Technology, 29 , 189–214. https://doi.org/10.1007/s10956-019-09794-8

Yun, H. J., & Cho, J. (2022). Affective domain studies of K-12 computing education: A systematic review from a perspective on affective objectives. Journal of Computers in Education, 9 , 477–514. https://doi.org/10.1007/s40692-021-00211-x

Zha, S., Morrow, D. A., Curtis, J., & Mitchell, S. (2021). Learning culture and computational thinking in a Spanish course: A development model. Journal of Educational Computing Research, 59 (5), 844–869. https://doi.org/10.1177/0735633120978530

Zhan, Z., He, W., Yi, X., & Ma, S. (2022). Effect of unplugged programming teaching aids on children’s computational thinking and classroom interaction: With respect to Piaget’s four stages theory. Journal of Educational Computing Research, 60 (5), 1277–1300. https://doi.org/10.1177/07356331211057143

Zhang, L., & Nouri, J. (2019). A systematic review of learning computational thinking through Scratch in K-9. Computers & Education, 141 , 103607. https://doi.org/10.1016/j.compedu.2019.103607

Zhang, S., & Wong, G. K. W. (2023). Exploring the underlying cognitive process of computational thinking in primary education. Thinking Skills and Creativity, 48 , 101314. https://doi.org/10.1016/j.tsc.2023.101314

Zhang, Y., Ng, O.-L., & Leung, S. (2023). Researching computational thinking in early childhood STE (A) M education context: A descriptive review on the state of research and future directions. Journal for STEM Education Research, 6 , 427–455. https://doi.org/10.1007/s41979-023-00097-7

Zhao, L., Liu, X., Wang, C., & Su, Y.-S. (2022). Effect of different mind mapping approaches on primary school students’ computational thinking skills during visual programming learning. Computers & Education, 181 , 104445. https://doi.org/10.1016/j.compedu.2022.104445

Zhong, H.-X., Lai, C.-F., Chang, J.-H., & Chiu, P.-S. (2023). Developing creative material in STEM courses using integrated engineering design based on APOS theory. International Journal of Technology and Design Education, 33 , 1627–1651. https://doi.org/10.1007/s10798-022-09788-5

Acknowledgements

The authors are indebted to the editor and reviewers who greatly helped strengthen this paper.

Funding

This study was not supported by any funding source.

Author information

Authors and affiliations

Faculty of Education, University of Macau, Taipa, Macau, China

Zuokun Li & Pey Tee Oon

Contributions

Both authors contributed to the writing of this manuscript. Zuokun Li designed the study, collected and analyzed the data, interpreted the results, and wrote the initial draft. Pey Tee Oon contributed to conceptualization, writing, review, and editing, and supervised the project.

Corresponding author

Correspondence to Pey Tee Oon.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

All individuals identifiable in this manuscript have given their consent for publication.

Competing interests

The authors declare no potential conflicts of interest in this work.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

Additional file 2.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

About this article

Cite this article

Li, Z., Oon, P.T. The transfer effect of computational thinking (CT)-STEM: a systematic literature review and meta-analysis. IJ STEM Ed 11, 44 (2024). https://doi.org/10.1186/s40594-024-00498-z

Received: 12 December 2023

Accepted: 02 August 2024

Published: 09 September 2024

DOI: https://doi.org/10.1186/s40594-024-00498-z

Keywords

  • Transfer effect
  • Systematic literature review
  • Meta-analysis

Computational Thinking by Karl Beecher

2. LOGICAL AND ALGORITHMIC THINKING

OBJECTIVES

• Learn the importance of logic to computational thinking.

• Appreciate the difference between deductive and inductive reasoning.

• Understand Boolean logic and its importance to computation.

• See the importance of using logical and mathematical notation instead of natural language.

• Learn the properties of algorithms: sequence, iteration, selection.

• Understand the importance of state in algorithms.

• See common mistakes made in logical and algorithmic thinking and learn how to avoid them.

Logic and algorithms are essential to CT. They underpin the subject and rear their heads repeatedly throughout its application. The good news is: humans already have an innate, intuitive understanding ...
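
To make the listed algorithm properties concrete, here is a minimal sketch in Python (our illustration, not code from Beecher's book; the function name and example values are hypothetical). It combines sequence, selection via a Boolean test, and iteration, while updating a single piece of state:

def sum_of_evens(numbers):
    # Sum the even values in `numbers`.
    total = 0                    # state: updated as the algorithm runs
    for n in numbers:            # iteration: repeat a step for each item
        if n % 2 == 0:           # selection: a Boolean test picks a branch
            total += n           # sequence: steps execute in a fixed order
    return total

print(sum_of_evens([1, 2, 3, 4, 5, 6]))   # prints 12

Tracing how total changes on each pass through the loop is a quick way to see why state matters: identical steps have different effects depending on the values held at each moment.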
