The neural cost of convenience: teaching with, and not by, AI – Dr. Erika Galea on the Sunday Times of Malta (originally published by Oxford University Press)

AI should be introduced in the classroom without harming the cognitive abilities education aims to develop, says Erika Galea

Artificial intelligence is not going away. But in the rush to integrate it into classrooms, we risk framing AI as the ultimate solution for teaching and learning, rather than what it truly is: a tool; a powerful one, yes, but one that, if misused or overused, may undermine the very cognitive abilities education aims to develop.

Neuroscience (Gazerani, 2025; Zhan et al., 2018) tells us the brain is influenced by what it repeatedly does. When we outsource thinking to algorithms, we train our brains to disengage. Over time, skills like deep focus, critical thinking and metacognition – skills students must practise to retain – begin to deteriorate.

In a world of instant answers, what becomes of the student’s ability to pause, question and think through ambiguity?

It is tempting to believe that AI will make learning faster and more efficient. And in some ways, it does. Large language models can scaffold understanding, rephrase complex texts or generate test questions in seconds.

But efficiency should never be confused with effectiveness. The act of struggling with a problem, of sitting in cognitive discomfort, is not a bug in the learning process; it is the feature that makes it work.

AI cannot, and should not, replace the neural work that builds meaningful understanding. Nor can it replicate the emotional resonance of human connection in the classroom: the subtle reassurance of a teacher’s encouragement, the instinctive support offered when a student is struggling and the shared joy when they succeed.

This is where educators must step in, not to resist AI, but to guide students in using it wisely. That means teaching them not just how to ‘prompt’ AI, but how to question it, challenge it and sometimes even ignore it.

We need to reframe digital literacy for the AI age. It is not just about knowing how to use the tool; it is about knowing when not to.

This demands a renewed focus on emotional intelligence, self-regulated learning and metacognition – qualities machines do not possess and cannot develop for us. These are what allow a student to pause before accepting an AI output at face value, and to ask: Is this accurate? Is it biased? Is this helping me learn or just helping me get it done?

Developing this kind of discernment takes intentional teaching. It is not enough to throw AI into the classroom and assume students will use it responsibly.

We need to show them how. And we must continue to model and nurture the emotional and interpersonal dimensions of learning – compassion, trust, encouragement and belonging – that no algorithm can provide. These human experiences are often what keep students motivated when the work is hard and the answers do not come easily.

Crucially, we must also help students build a ‘truth filter’. With AI models confidently generating falsehoods, the ability to distinguish between real and fake information has never been more urgent. This calls not only for critical thinking but also for emotional intelligence: the capacity to pause, reflect and manage one’s reactions before accepting or sharing information.

Young people need to be trained in ‘epistemic vigilance’ (Watson & Morgan, 2025): recognising not only what is true but ‘why’ it is true. This means cross-checking sources, noticing when something does not make sense and asking questions, while still keeping an open and balanced mindset.

Teachers play a pivotal role here, not as gatekeepers of knowledge but as coaches of cognition. They are the ones who can create learning environments where students reflect on their thinking, question what they read and test what they believe.

They are also mentors and motivators, who know when a student needs challenge and when they need care. No AI can pick up on a student’s quiet frustration or their sudden spark of curiosity. Teachers do!

AI cannot do this kind of work. But it can support it, when used intentionally.

There is a neuroscience principle called ‘use it or lose it’: if a neural pathway is not activated regularly, the brain reallocates resources elsewhere. If students use AI to bypass writing, reading or problem-solving, those circuits may weaken. What we gain in speed, we lose in strength.

The goal, then, is not to ban AI but to blend it into a pedagogy of thinking. Used well, AI can help students reach further and faster. But without careful guidance, it will make their minds lazier, not sharper.

No matter how advanced technology becomes, the essence of teaching remains profoundly human, and while technology can support learning, it can never replace the unique expertise teachers bring to the classroom. They are not just content-deliverers but cognitive architects, influencing how students think, not just what they know.

That responsibility cannot be automated. And neither can the genuine human bonds that nurture students’ self-belief, resilience and joy in learning. These are forged not through data but through relationships.

We need to build classrooms where AI is treated the way calculators once were, not as a substitute for understanding but as a support for it. Students must still learn how to do the mental maths, even if they later use a tool to speed it up. The same is true of writing, research and reasoning.

AI is here to stay: powerful, pervasive and transforming the way we live and learn. But the human brain still does the heavy lifting of learning, when we let it.

Erika Galea is co-author of Generation Alpha in the Classroom (Oxford University Press English Language Teaching) and founder and director of the Educational Neuroscience Hub Europe (Malta).

This article was first published on December 1, 2025, by Oxford University Press on its AI and Education platform. It is republished here with permission, with full acknowledgement of Oxford University Press as the original publisher.

Read it online: https://timesofmalta.com/article/neural-cost-convenience-teaching-ai.1122987