Unlike the IT industry, education is relatively light on fads and buzzwords. Still, computational thinking has been one of the persistent ones – most likely precisely because of its direct tie to the IT industry. Anyway, it’s about time I wrote about this trend since, like most of the subjects I take an interest in on this blog, it’s ambiguous and not exactly what it looks like on the surface.
As I mentioned, education is a steady field. The upside is that most passing fads die sooner than they make their way into the classroom. The downside is that education is somewhat slow to catch up with helpful new developments as well.
Which one is computational thinking in this dichotomy, then? Let’s find out.
What is computational thinking?
The term dates back to the 1950s and the dawn of the computing era. Since then, it has been oscillating in and out of vogue in education, making comebacks in 1980, 1996, 2006, 2016, and now.
Despite what the name might suggest, computational thinking is not thinking like a computer. Instead, it’s being able to present information in the form of a problem that a computer will be able to solve – something only a human can do. The stages of solving a problem with computational thinking are:
- Decomposition – breaking a complex problem into several minor, more manageable problems.
- Pattern recognition – looking for and analyzing any repeating elements and sequences.
- Abstraction – identifying the information crucial for the problem’s solution and disregarding the irrelevant, random aberrations – or discriminating between the trends and the outliers.
- Algorithm design – coming up with the precise order of steps required to solve the problem.
- Iteration – implementation of the solution, evaluation of the result, and amendment of the algorithm for better efficiency.
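To make the five stages concrete, here is a toy sketch of my own (not from any CT curriculum): summarizing noisy sensor readings, with each stage marked in the comments. The function name, the data, and the crude outlier rule are all assumptions for illustration only.

```python
def summarize(readings):
    """Return the mean of readings after discarding outliers."""
    # Decomposition: split "summarize" into two smaller subproblems:
    # (1) filter out the outliers, (2) average what remains.

    # Abstraction: treat anything more than 10 units from the median
    # as irrelevant noise (a deliberately crude, illustrative rule).
    ordered = sorted(readings)
    median = ordered[len(ordered) // 2]
    kept = [r for r in readings if abs(r - median) <= 10]

    # Pattern recognition: every reading is handled the same way,
    # so one pass (the list comprehension above) covers them all.

    # Algorithm design: the precise order of steps – sort, pick the
    # median, filter, then average.
    return sum(kept) / len(kept)

# Iteration: run it, evaluate the result, and tune the threshold.
print(summarize([20, 21, 19, 22, 95, 20]))  # prints 20.4 – the 95 is discarded
```

Nothing here requires a computer scientist: the point is that the stages are a way of organizing ordinary problem-solving, not a special machine-like mode of thought.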
Computational thinking is distinct from computer science and coding. However, all three fields overlap, especially in the algorithmic processes.
What do fans say?
Computational thinking is a transferable skill that can be applied in almost any other subject area. It hones the ability to articulate problems and think logically. Although it seems to be most at home in mathematics and science, it’s widely used in linguistics and the social sciences.
Another reason for its growing popularity is engagement. To illustrate, Tom Hammond from Lehigh University in Pennsylvania asks, “Would you rather get to play with a data set, or would you rather listen to the teacher tell you about the data set?” For most of us, the answer to that is obvious.
Computational thinking requires students to work with the data actively and in a self-directed way – asking questions, coming up with hypotheses, and testing them. It allows students to do most of the learning independently, which typically improves both efficiency and knowledge retention.
Others claim enthusiastically that computational thinking is crucial for preparing students for a digital future. Moreover, it helps them build problem-solving skills they can apply to any situation in life – not only in the classroom.
What do critics say?
Critics of the approach exist on both sides – among computer scientists and humanities scholars alike. The latter reverse the CT acronym into “computational tyranny.” They claim that formal logic doesn’t always benefit fields where ethics and empathy should be part of the equation. Computer scientists, on the other hand, fear that CT – merely a tiny part of the larger computer science curriculum – will eventually substitute for the whole discipline in education.
However, the main issue with computational thinking as an innovative approach is that it’s hardly new at all. It sounds very much like an algorithm for scientific research:
- Observing the phenomenon
- Formulating the hypothesis
- Making a prediction based on the hypothesis
- Testing the hypothesis via experimentation
- Modifying the hypothesis until it’s experimentally verified
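The final step – modifying the hypothesis until it passes the test – is exactly the “iteration” stage under another name. A tiny sketch of my own makes the parallel explicit: guess a value, predict its consequence, test the prediction, and refine the guess until it is verified. The function and data are hypothetical, for illustration only.

```python
def find_threshold(observations, target_count):
    """Find a cutoff so that exactly target_count observations exceed it."""
    hypothesis = 0  # initial hypothesis: a cutoff of 0
    while True:
        # Prediction: how many observations exceed the hypothesized cutoff
        predicted = sum(1 for x in observations if x > hypothesis)
        # Test the prediction via "experiment" (counting)
        if predicted == target_count:
            return hypothesis  # hypothesis verified
        hypothesis += 1        # modify the hypothesis and try again

print(find_threshold([3, 8, 12, 5, 20], 2))  # prints 8
```

Call it computational thinking or call it the scientific method – the loop is the same.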
Yes, computational thinking labels these steps using computer language, but that hardly changes the essence. In the same vein, you can refer to packing a bag as “prefetching and caching” or call finding the shortest line in the store “performance modeling.” That’s what computer scientist and computational thinking enthusiast Jeannette Wing does in her 2006 paper for Communications of the ACM. However, it doesn’t provide you with a newer, more efficient way to perform the tasks. It only gives you different optics for looking at them.
After all, the most challenging step in computational thinking is recognizing relevant data and telling it apart from the irrelevant. That task is still left to the human mind and its own rather old-fashioned devices.
So, is it any good then?
Absolutely! These recommended steps work great. Computational thinking is a solid and helpful thing, but nothing new, really. The ideas involved in the method (for example, abstraction and logical organization of the data) can be found in all kinds of thinking: scientific, design, engineering, etc.
When you look at it closely enough, computational thinking is no different from critical thinking or evidence-based reasoning. One way of integrating this method into the curriculum is pointing students in the right direction and letting them come to their own conclusions – in other words, giving them problems to solve instead of serving them knowledge on a plate. Teachers have been following this guideline for ages anyway.
Even pattern recognition as such is a natural feature hardwired into the human brain. It helps us make sense of the world, predict what might happen in the future, and solve problems. That is why people enjoy crosswords, riddles, escape rooms, and puzzle video games – they cater to our pattern-recognition urge fostered by eons of evolution.
Computational thinking is just one way of describing this ability in the digital age, not a revolutionary technique. It doesn’t really bother me – a rose by any other name… As long as it’s practical, helps you learn, and sounds cool enough to grip your attention – I’m all for it.