Is AI helping students learn faster, or is it making them learn less?
Experts weigh in on how over-reliance on AI tools is undermining the classroom experience

The essay looks polished and convincing, and so does the code. The student, however, can’t explain either. This increasingly common classroom scenario is raising alarm among educators as generative AI tools like ChatGPT and Gemini become default academic co-pilots.
So, what happens when AI doesn’t just assist learning but replaces it?
“Students are missing the most essential parts of the learning process,” says Ali Muzaffar, Assistant Professor at Heriot-Watt University Dubai. “They’re handing over not just the work, but the thinking, and that’s a problem we’re already seeing ripple through into their projects and even industry placements.”
It’s a growing crisis.
THE UNSEEN COST: COGNITIVE DE-SKILLING
The promise of AI in education is speed, accuracy, and even personalization. But the trade-offs are becoming clearer.
When students delegate their assignments to AI, they skip over key phases of intellectual development: understanding problems, applying critical thinking, conducting research, and formulating ideas in their own voice.
Mohammed Ghaly, Professor of Ethics at Hamad Bin Khalifa University (HBKU), sees this as a fundamental disruption of education. “When I assign a paper, the paper isn’t the point. It’s the process: reading, questioning, drafting, and editing that teaches you. If you skip that, you haven’t really learned.”
Ghaly warns that AI tools now enable students to bypass that process, leading to “de-skilling”, a gradual loss of essential cognitive and academic abilities. “Yes, calculators changed how we calculate. But we didn’t stop teaching math. If AI leads us to stop thinking, researching, or writing altogether, we’re entering a very different and dangerous educational paradigm.”
MISSING THE POINT AND THE PROBLEM
Muzaffar highlights how students’ over-reliance on LLMs harms fields like computer science. “We’re seeing students use AI to generate code solutions. But they’re not learning how to solve problems. They’re not debugging. They’re not ideating. These are not optional skills; they are the foundation of our discipline.”
He also notes a surprising casualty: the ability to read. “Students are now copying assignment questions directly into ChatGPT without attempting to understand them. We’re raising a generation struggling to interpret, evaluate, or question, and these skills go far beyond university.”
Writing skills are also deteriorating, Muzaffar warns, because students are no longer practicing the art of expressing original thought. And these cracks are beginning to show—particularly in final-year projects, internships, and interviews.
UNIVERSAL CONCERN
For Ghaly, these concerns go beyond the university. They tap into a broader ethical and cultural conversation about how humanity coexists with intelligent machines.
“AI presents global challenges, so we need a global conversation,” he says. “We often see a narrow perspective—Western, secular, and largely corporate—dominating the AI ethics discourse. But moral traditions differ. Contexts differ. We need diverse voices—Asia, Africa, Latin America, the Middle East—to help shape a future that works for everyone.”
Ghaly is vocal about avoiding what he calls the “repeat cycle,” in which Western institutions define AI policy and ethical frameworks unilaterally and expect global adoption. “This time, the Arab world—and the broader Global South—must be at the table from the beginning.”
At HBKU in Doha, Ghaly is actively fostering that diversity by including AI in the curriculum, but with nuance. He doesn’t ban AI in student assignments. He requires it. “I ask my students to use generative AI as part of the assignment,” he explains. “But then they must critically analyze what AI produced versus what human reviewers or scholars have said. The question isn’t whether to use AI. It’s how—and with what awareness.”
THE HUMAN FACTOR STILL MATTERS
Both scholars agree that AI can’t—and shouldn’t—replace educators. “The one thing AI still cannot do,” says Muzaffar, “is walk into a classroom and feel what’s happening. That atmosphere—the eye contact, silence, confusion, and energy—tells a good teacher whether students are actually learning. AI can’t replicate that.”
Ghaly takes it further. “We are decentralizing humans,” he says. “AI is already reducing the number of editors, group leaders, and even some academic roles. Tech companies talk about ‘human-centered AI,’ but we need to ask, centered for whom? Because right now, many are losing jobs, and the ones gaining new ones are not the same people.”
He warns that this transition will be socially disruptive and will unfold faster than we expect. “We’re told new jobs will emerge, which may be true. But the telephone operator isn’t going to become a data scientist. So we can’t dismiss the human cost.”
If education becomes about efficiency over experience, product over process, and automation over understanding, what are we really teaching?
In an AI-powered classroom, the risk isn’t just cheating, but forgetting how to think. As Ghaly reminds us, the goal of learning isn’t to complete a task, but to engage with diverse perspectives, challenge assumptions, and grow intellectually—something no prompt can automate.
“Humans will remain central,” he says, “only if we insist on it. Otherwise, the tools we build will slowly redefine what we think it means to be educated or even human.”