A NEW LOOK AT WHAT IT MEANS TO LEARN

 José Frutos Joral looks at how learning needs have changed from the 20th into the 21st Century. How should teaching change as a result?
Has education changed?

Education, in essence, has changed far less than we might think. For over a century, we have continued to teach in much the same way, even if the tools have changed. We have replaced books with tablets, chalk with interactive whiteboards, and notebooks with digital platforms, but the focus, the methodology, and even the role of the teacher, I would argue, often remain anchored to a logic of the past: that of transmitting information for the student to memorize.

This failure to innovate is not just a perception; it is well-documented. Education theorists Rupert Wegerif and Louis Major call this a failure of “framing.” In their work The Theory of Educational Technology, they contrast schools that simply hand students individual tablets to facilitate an instructional “monologue,” which fails to foster collaboration, with projects that use technology in a “dialogic” way (as a tool for building and debating in a group), which succeed in boosting creativity and critical thinking.

A failure of theory in a changing world

The problem, therefore, is not the technology, but the theory of use that underpins it. Meanwhile, the world our students inhabit has changed radically. Today, they live surrounded by immediate stimuli in a culture where anything that isn’t interesting can be dismissed with a simple click. The experience of searching for information in dozens of library books no longer exists, nor does the weekly wait for a new episode of a series on TV. What is valued is immediacy, instant reward, the outcome without the process. And yet, schools, which should prepare young people to understand and inhabit this world, seem to resist transforming at the same pace.

Not a new problem

This tension is not new. Every technological revolution has challenged a part of what we considered essential to learn. In the industrial-era school of the 20th century, memorization was the cornerstone of the system. To know was to remember. And to remember was to obey. In a world that needed precise operators, discipline and repetition were virtues. Over time, creativity, independent thought, or imagination were relegated to the background. Education served the factory model: everyone at the same time, with the same content, under the same voice.

In the second half of the last century, the expansion of mass media brought a change: it was no longer enough to repeat; one had to understand. Education began to value the ability to apply knowledge to new situations. However, something was also lost: patience. The immediacy of television and new media began to shape a faster, more fragmented, less contemplative way of learning. This is the thesis that essayist Nicholas Carr explored in depth in his influential book The Shallows, where he argued that the web is, literally, “reprogramming” our brains to prefer efficiency and immediacy over contemplation and deep thought.

Changing learning needs

With the arrival of the Internet in the nineties, the need for encyclopedic memory gave way to the ability to search, compare, and discern information—a new type of intelligence measured not by what one knows, but by how one evaluates what one finds. Two decades later, learning and curiosity have ceased to be individual acts and have become a shared and instant experience. However, this openness has brought new challenges for formal learning: difficulty concentrating, the superficiality of multitasking, and the loss of space for individual reflection.

And so, we arrive at our era, that of artificial intelligence. Many teachers, as happened before with the Internet, are skeptical about its use in education, or are openly opposed to it. They fear that students will stop thinking, that they will delegate their reasoning to a machine, that they will lose essential skills. But this fear misses the real objective. As Erik Brynjolfsson, a Stanford economist and co-author of The Second Machine Age, points out, the true goal of good AI usage is not to replicate human intelligence, but to “augment” it.

Time for a pedagogical shift . . .

However, as AI becomes part of a teacher’s daily workflow (whether for planning, drafting materials, or managing administrative tasks), attitudes are starting to change. What has not yet fully occurred, however, is the pedagogical shift inside the classroom. That transition requires us to design learning experiences that go beyond mere content creation (something AI can easily automate) and instead cultivate critical thinking, reflective judgment, and meaningful feedback.

Developing AI usage, however, brings a gain in the form of time, which can be used to dedicate more energy to formulating good questions and to asking students to analyze, interpret, and decide, instead of repeating information. This is not a simple pedagogical option; it is the explicit demand of the labor market. The World Economic Forum’s Future of Jobs Report 2023 ranks “analytical thinking” and “creative thinking” as the two most crucial skills for workers in the next five years. “Memorization” does not even appear among the top ten rising skills; in fact, it is considered a declining one.

. . . to meet the needs of a changing world

The fact is that AI is becoming part of the real world. For example, in Spain, hospitals of the Catalan Health Institute (ICS), including the Vall d’Hebron University Hospital, use AI to improve breast cancer diagnosis through digital analysis of histological samples and advanced algorithms. This is just one of many AI applications being developed to improve a community’s health provision.

And there is much more underway. According to the IBM Global AI Adoption Index 2025, 51% of large organizations worldwide are now actively integrating AI into their daily operations—an increase of over 10% since 2023. This data, consistent with similar findings from PwC’s 2025 Global AI Business Outlook, confirms that AI has already moved from experimentation to full-scale deployment across industries. This is the real working world our students will face: one where AI is not a “cheating” tool, but an essential cognitive collaborator—just as the calculator once was for an engineer thirty years ago.

The question we must ask ourselves is stark: are we preparing students for that world, or are we still preparing them for a world that no longer exists? Every technological advance forces us to redefine what “to know” means. The essential thing is not to resist change, but to ask ourselves what kind of humanity we want to preserve. Education cannot be limited to teaching how to use new tools: it must help us understand how each tool transforms who we are. In the age of augmented intelligence, the true mission of school will not be to compete with technology, but to teach us how to be human in an intelligent world.

Postscript: on authorship in the age of AI

As an educator, I firmly believe in transparency. In writing this article, I have practiced the “augmented intelligence” I advocate for. I have used AI tools as collaborators: to accelerate research, verify historical and statistical data, and refine the precision and clarity of the English language. The critical thinking, the thesis, and the voice remain human. Technology, used correctly, simply allows us to amplify them.

José Frutos Joral is the High School and Logistics Coordinator, AI Project Lead and Spanish Teacher at the Bilingual European School, Milan.

FEATURE IMAGE: by Mahmud Shoeb from Pixabay

Support Images: by Gerd Altmann from Pixabay, Pablo Merchán Montes for Unsplash+ & Andrej Lišakov for Unsplash+