AIED 2025: Rethinking AI’s Role – From Answer Engine to Socratic Tutor

As someone who has spent years using NVivo’s AI-powered features for qualitative data analysis, I thought I understood artificial intelligence in educational contexts. But when generative AI tools like ChatGPT became widely available, they changed education faster than policies could catch up. As a lecturer in education at the University of Manchester, I found myself asking: is this plagiarism, or a tool students should learn to use responsibly? 

That uncertainty—and my desire to learn from educators and researchers worldwide—is what brought me to the Artificial Intelligence in Education (AIED) 2025 conference in Palermo, Italy this summer. The experience combined academic rigour with cultural richness, bringing together more than 700 participants from around the globe. From lively bus stop conversations to late-night discussions over gelato, it was a week of learning, connection, and reflection on the transformative potential of artificial intelligence in education. 

First Impressions: Palermo and Conference Culture 

two people at a bus-stop sheltering from the sun under an umbrella

Bus-stop conversations: where the best discussions started before the conference even began.

Palermo welcomed us with vibrant streets, historic architecture, and intense summer heat. I stayed in a hotel in the town centre—affordable, comfortable, and close to everything. Getting around proved challenging; the buses were rarely on time. At first, this felt inconvenient, but I soon realised it was a blessing. Each morning at the bus stop, I met fellow conference attendees heading to the same venue. These unplanned encounters sparked deep conversations about AI in classrooms, the future of teaching, and our shared hopes for education.

On arrival at the conference, we received a bag with the programme and a water bottle—a simple but invaluable gift in the Sicilian heat. Evenings were filled with Italian dishes (I must have eaten more aubergine and octopus than ever before!) and a welcome variety of vegetarian options. The gala dinner by the seashore was unforgettable: live music, dancing, conversations under the stars, and the sparkling view of ships in the distance. Palermo itself buzzed with life—street orchestras, flower-filled squares, and spontaneous meetups with colleagues who quickly became friends.

the post writer at the AIED reception desk with conference staff

AIED 2025: registration day — and the most valuable gift in Palermo’s heat, a water bottle.

From Answer Engine to Socratic Tutor: The Core Shift 

The central question running through the conference was not whether AI belongs in education, but how it should function. A recurring theme emerged with remarkable consistency: AI should not simply provide answers—instead, it should guide students in finding answers, encouraging reasoning, exploration, and deeper engagement. 

This represents a fundamental shift in how we conceptualise AI’s role. Rather than functioning as an answer engine that short-circuits the learning process, AI can act as a Socratic tutor—one that poses questions, prompts reflection, and scaffolds thinking without doing the cognitive work for students. This philosophy resonated throughout keynotes, paper presentations, and informal conversations alike. 
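The contrast between the two modes can be made concrete. As a purely illustrative sketch—my own toy code, not Khanmigo or any system presented at the conference—a Socratic tutor responds to a question not with the answer but with scaffolding prompts that push the student back into the reasoning:

```python
# Toy illustration of "Socratic tutor" vs "answer engine" behaviour.
# A real system would use an LLM with a tutoring system prompt; here the
# point is only the shape of the interaction: prompts, not answers.

def socratic_reply(student_question: str) -> list[str]:
    """Return guiding prompts instead of a direct answer."""
    # Crude topic extraction, just for the illustration.
    topic = student_question.rstrip("?.! ").removeprefix("What is ")
    return [
        f"What do you already know about {topic}?",
        f"Can you think of an example of {topic} from your own experience?",
        "What would you need to find out to answer this yourself?",
    ]

for prompt in socratic_reply("What is photosynthesis?"):
    print("Tutor:", prompt)
```

The design choice is the important part: the tutor's output is deliberately incomplete, so the cognitive work stays with the student.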

One particularly memorable keynote came from Dr Kristen DiCerbo, Chief Learning Officer at Khan Academy, who spoke about moving education from the transfer of information to transforming learner outcomes. She traced the arc from television to the internet to AI, highlighting how each wave of technology reshaped learning. Crucially, she emphasised that AI should guide students to find answers—not simply hand them over. She referenced Khanmigo, Khan Academy’s AI tutor, not as an answer machine but as a guide for problem-solving. The message was clear: the future of AI in education lies in collaboration, not substitution.

Pedagogical Frameworks: Making AI Work for Learning 

Several sessions explored practical frameworks for implementing AI in educationally sound ways. The ICAP framework (Interactive, Constructive, Active, Passive), developed by cognitive science researcher Michelene Chi and her colleague Ruth Wylie, featured prominently in discussions about how AI can move students up the engagement hierarchy, from Passive through Active and Constructive to Interactive. Rather than allowing students to passively consume answers, well-designed AI tools can prompt them to interact, construct arguments, and take cognitive ownership of their learning.

Discussions also examined how teachers might use AI to generate questions—selecting the best ones to challenge students—and to provide formative feedback on essays and assignments. This approach isn’t about outsourcing the teacher’s role but about saving time and scaffolding feedback, allowing educators to focus on the deeper pedagogical strategies that matter most. 

A particularly striking framework came from discussions of activity theory, which envisions AI as a tool embedded within the triangle of students, teachers, and outcomes, alongside rules and responsibilities. This helped me see AI not as an external add-on but as part of the educational ecosystem itself—a perspective that resonates with my own experience using AI-powered features in NVivo, where the technology augments rather than replaces analytical judgement. 

Voices from the Field: What Educators Are Doing 

One of the richest aspects of AIED was hearing directly from educators and researchers about their practical implementations:  

Smiling conference delegates standing next to plates of food

Lunch breaks: where research discussions continued with laughter and new connections.

Christine Fox shared her curiosity about how intelligent tutoring systems (ITS) are being used in real courses, highlighting the gap between research prototypes and classroom reality. 

Bruno Poellhuber described developing an intelligent biology tutor through user-centred design with teachers and students—an approach that ensures AI tools meet actual pedagogical needs rather than imposing technological solutions. 

Ute Fiedler offered a particularly compelling example from Nova Scotia Community College, where she has trained GPTs to act as teaching assistants. These AI assistants can answer syllabus questions, explain content, and even apply rubrics to student work—demonstrating how AI can handle routine tasks while freeing instructors for more complex pedagogical interactions. 
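Fiedler's assistants themselves aren't public, but the rubric-applying idea is easy to sketch. The following is a hypothetical toy (my own code, not the NSCC system): a rubric as a list of named criteria, each with a check that in a real assistant would be an LLM judgement rather than a keyword heuristic, with the final call left to the instructor:

```python
# Hypothetical sketch of "applying a rubric" to student work.
# The lambda checks are crude stand-ins for LLM-based judgements; the
# output is a per-criterion report, not a grade.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Criterion:
    name: str
    check: Callable[[str], bool]  # stand-in for an LLM judgement

RUBRIC = [
    Criterion("Has a thesis statement",
              lambda t: "argue" in t.lower() or "claim" in t.lower()),
    Criterion("Cites evidence",
              lambda t: "because" in t.lower() or "study" in t.lower()),
    Criterion("Meets minimum length",
              lambda t: len(t.split()) >= 10),
]

def apply_rubric(work: str) -> dict[str, bool]:
    """Report which rubric criteria a piece of work appears to satisfy."""
    return {c.name: c.check(work) for c in RUBRIC}
```

Keeping the output as a criterion-by-criterion report rather than a mark is what makes this a time-saver instead of an outsourcing of judgement.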

Informal polls revealed the tools people are using most: ChatGPT, Claude, Gemini, DeepSeek, and open-source models like LLaMA. Opinions varied on their relative strengths, but what struck me was the spirit of experimentation and adaptation. Benjamin Nye offered a useful comparative assessment: GPT-4.5 has proven reliable but suffers from slowness, high costs, and ambiguous practical benefits, while Google’s new LearnLM (based on Gemini 2.5 Flash) shows particular promise for creating educational content. Claude’s performance remains acceptable, with evaluation ongoing.

The European Digital Education Hub (EDEH) presented a thought-provoking report on explainable AI (XAI) in education, covering detection tools, automated grading, and legal and ethical dimensions. Their work on building actionable guidelines for teachers and developers addresses a critical need: as AI becomes more prevalent in education, transparency and accountability become essential. 

Posters, Research, and Critical Perspectives 

The poster sessions were particularly energising. One study explored students’ emotions while using AI—a crucial reminder that learning is affective as well as cognitive. Another examined how AI can support emotional awareness in students during mathematics learning by creating visual representations. Several posters focused on using AI to reduce teachers’ cognitive load, a practical concern that could make the difference between AI adoption and rejection. 

Attendees recommended Failure to Disrupt: Why Technology Alone Can’t Transform Education by Justin Reich—a sobering counterpoint to the conference’s enthusiasm. Reich argues that while AI brings excitement, technology alone won’t address deep-rooted educational challenges without thoughtful pedagogy, equity, and systemic change. This critical perspective is essential: AI holds enormous promise, but we must remain thoughtful and critical in its adoption, not swept up in technological determinism. 

Key Themes: What Emerged Across Sessions 

Several themes echoed throughout the conference: 

Productive Struggle: AI should create friction that matters, pushing students to wrestle with ideas rather than making learning too easy. This connects to decades of research on desirable difficulties and the importance of effortful learning. 

Curiosity as Currency: In the age of LLMs, curiosity and judgement are more valuable than memorising facts. Education must shift toward cultivating dispositions and capabilities that AI cannot replicate. 

AI Literacy for Teachers: Professional development needs to give teachers hands-on experience with AI tools, helping them build comfort and confidence. Teachers cannot thoughtfully integrate tools they don’t understand. 

Human–AI Collaboration: The best future is not AI replacing humans but humans and AI learning together, each contributing their unique strengths to the educational process. 

Equity and Access: Multiple sessions raised concerns about who has access to sophisticated AI tools and whether AI might widen existing educational inequalities rather than closing them. 

Looking Ahead: Five Trends Shaping AI in Education 

Reflecting on the conference, I see five major trends likely to shape the future of AI in education: 

  1. AI as Co-Teacher: From lesson planning to rubrics, AI will increasingly support—not replace—teachers, amplifying their ability to personalise learning and respond to individual student needs.
  2. Explainable and Ethical AI: Students, teachers, and policymakers will demand transparency and responsibility in AI systems. Black box algorithms will face growing scrutiny.
  3. Personalised Learning at Scale: Large language models will help create adaptive pathways where every student can progress at their own pace, while still being guided by strong pedagogical principles.
  4. Integration Beyond the Classroom: AI won’t stop at assignments—it will support wellbeing, emotional intelligence, and lifelong learning skills, recognising that education encompasses more than academic content. 
  5. Open Source and Democratisation: Open models like LLaMA and DeepSeek will give educators more flexibility and reduce dependency on a few large technology companies, potentially addressing equity concerns. 

What This Means for My Practice 

Returning to Manchester, I find myself thinking differently about the questions that brought me to Palermo. Is students’ use of AI plagiarism, or a tool they should learn to use responsibly? The answer, I now believe, lies in how we frame it pedagogically. 

Just as NVivo’s AI features enhance qualitative analysis without replacing analytical thinking, generative AI can enhance learning without replacing the cognitive work that makes education meaningful. The key is designing learning experiences where AI serves as a Socratic tutor—prompting questions, scaffolding exploration, and supporting metacognition—rather than as an answer engine that short-circuits intellectual development. 

This means I need to: 

  • Design assignments where AI use is transparent and pedagogically purposeful
  • Teach students to critically evaluate AI outputs rather than accepting them uncritically
  • Model my own AI use, showing how I employ these tools while maintaining intellectual responsibility 
  • Advocate for institutional policies that support thoughtful AI integration rather than blanket prohibition or uncritical adoption

Final Reflections 

AIED 2025 was more than a conference—it was a reminder of why we do this work. The late buses that led to new friendships, the posters that sparked big questions, the gala dinner that turned into a dance floor—all of it captured the spirit of a community determined to reimagine learning for the future.

Wide shot of dozens of people at an outdoor event

Gala dinner by the sea — a reminder that conferences are built on community.

As I flew back from Palermo, I carried not just a conference bag and memories of endless Italian dishes, but a conviction: AI in education is not about machines giving answers. It’s about helping students find their own.  

And that, perhaps, is the most human lesson of all. 

— 

The author, Dr Haleema Sadia, is a Lecturer in Education at the University of Manchester, specialising in international students’ transitions to UK postgraduate studies, visual research methods, and qualitative data analysis using NVivo. 

