Will ChatGPT end up doing all the homework? Or will it reboot education in the Middle East?
As concerns over AI cheating continue to mount, here’s how universities in the Middle East are grappling with generative AI
When an early demo of ChatGPT, a large language model trained by OpenAI, was released last November, its capabilities sent internet users into a tizzy. Within five days, more than one million users had tested it out.
Beyond tech enthusiasts, educators and students embraced the technology wholeheartedly, impressed by how fast the chatbot rendered information into fluid prose. But soon, existential anxiety took over as concerns about AI-enabled cheating cropped up, stoking fears of an educational apocalypse. Or could it be the reboot that education sorely needs?
Across the region, university professors and administrators are starting to reassess education in response to ChatGPT, prompting a potentially massive shift in teaching and learning.
Some universities are redesigning their courses, while others are educating staff and students on how ChatGPT works and how to use it ethically, and are checking assignments with greater scrutiny.
“We’ve informed students that the university is aware of ChatGPT and other forms of generative AI. We are taking steps to prevent and detect inappropriate usage to ensure that any such instance will be dealt with as misconduct in line with our regulations,” says Raj Nambiar, Director of the University of Bolton Academic Centre.
UPDATE CURRICULUMS, OVERHAUL CLASSROOMS
The university is experimenting with ChatGPT by prompting the system to generate an essay or other piece of coursework from a title or brief that professors have given, or will give, to students.
“This assessment will give us a general sense of what ChatGPT can do and will help us consider how to amend coursework titles or briefs to make it difficult for students to use ChatGPT in a way that would allow them to generate outputs they could pass off as their work in the modules.”
Meanwhile, Dr. Karim Seghir, Chancellor of Ajman University, says that all its faculties are working on strategies to update the curriculum and overhaul the classrooms to incorporate the latest AI tools and technologies.
“We will assess our current technology infrastructure to determine what upgrades are necessary to support the integration of AI tools into the classroom and also explore possibilities to have partnerships to bring the latest tools and technologies into our classrooms,” adds Dr. Seghir.
The university’s faculty will also stay up to date with the latest advancements in AI. “It is important for universities to address this issue head-on by redesigning assessments and increasing faculty awareness,” Dr. Seghir says.
He believes creating comprehensive and challenging exams, implementing multi-faceted assessments incorporating various forms of evaluation, and regularly updating assessments will deter any potential cheating methods.
DEVELOP CRITICAL THINKING
Dr. Krishnadas Nanath, Campus Program Coordinator and an Associate Professor of Data Science at Middlesex University, says the university is adopting a thorough approach to assessments, utilizing various methods such as vivas, research projects, group presentations, quizzes, and more.
Dr. Nanath contends that this approach allows the university to evaluate students based on consistency and ongoing learning rather than relying solely on one assessment form. “Universities have long struggled with maintaining academic integrity, and with the advent of tools such as ChatGPT, this challenge has become even more pronounced.”
By educating students on how to use ChatGPT, universities can empower them to harness the technology’s full potential. “It is essential to encourage students to use ChatGPT in their studies and research. By providing students with access to this technology and encouraging them to use it, we can help them develop their critical thinking and problem-solving skills and create a culture of innovation and creativity within our institution, which can lead to discoveries and breakthroughs,” says Dr. Seghir.
Heriot-Watt University is supporting staff and students to address the challenges and opportunities of these new technologies, including those posed by inappropriate use of ChatGPT and other AI content creation tools.
“We cannot isolate students from such tools as AI, if used appropriately, can support education. So our approach is to continue supporting students to understand how to avoid the misuse of these tools. Where it is identified as deliberate, we have a robust disciplinary process in place,” says Vanessa Northway, Deputy Vice-Principal For Learning & Teaching At Heriot-Watt University.
GPT-4, scheduled for launch in the summer, is touted to be faster, with more human-like and detailed responses. When it arrives, universities will be forced to embrace the disruption that will likely follow.
Nambiar says consultations and webinars on the use and abuse of Generative AI will be run centrally by the concerned university offices and in collaboration with colleagues. “We will work to create a robust mechanism to counter the potential collusion and malpractice that the landscape could bring with it.”
Another area being revisited is assessment. Professors are increasingly advised to read text submissions bearing in mind that they could have been generated by an AI tool. Dr. Nanath believes that universities should consider redesigning assessments in response to ChatGPT. “The university also aims at not wholly relying on plagiarism tools to evaluate the submissions.”
He notes, however, that ChatGPT may not be capable of generating complete projects on its own; it requires many prompts, thoughtful questions, and critical thinking to produce high-quality output. “If a student dedicates a significant amount of time to using ChatGPT, it can be considered secondary research, similar to using search engines such as Google, to gather information and understand concepts.”
Dr. Nanath notes that schools and universities had similar concerns when search engines were introduced, worrying that students would use them to cheat on assessments. However, today, search engines are widely accepted as valuable tools for obtaining information and data. “ChatGPT will likely be viewed in the same way, and educators must be aware of its existence and the potential for students to use it for assessments.”
At a time when teachers are being pitted against tech, thanks to generative AI, some are taking the disruption as an opportunity to reshape the future of classrooms with resilience.
A new future of learning might just be underway as universities gear up to embrace non-text-based assessments that could include data, visualizations, and action-oriented outputs.
Nambiar says the situation has opened eyes to assessments that focus on the knowledge-management process rather than the outputs of knowledge. If the education sector bans AI tools, he warns, it might fail to prepare students for real-world problems.
“Universities must be proactive in preparing students for the future workforce, where AI and automation will play a significant role, and that means not only providing students with the technical skills they need to work with these technologies, but also equipping them with the critical thinking and problem-solving skills that will be essential in a world where machines can perform many tasks previously done by humans.”
Clearly, the task ahead for the next generation of students and educators is a huge and ambitious one: to stay ahead of the very technology powering us.