Earlier this year I had the privilege of interviewing some of the leading educators, researchers and builders working at the intersection of AI and education. These blogs are my attempt to do justice to those conversations, to pull out the ideas that matter most and make them useful for everyone working in schools right now.
We have spent the last two years almost entirely focused on one side of the AI in education conversation.
What can AI do for teachers? How can it save time? How can it help with lesson planning, feedback, differentiation, marking? The conversation has been relentlessly teacher-facing, and understandably so. Teachers are under enormous pressure, and anything that genuinely reduces workload deserves serious attention.
But somewhere in that conversation, we forgot to ask the people who are most affected by all of it.
We forgot to ask the students.
What students actually think
At AIDUCATION26 in Bucharest, I sat down with Emma Darcy and James Garnett, two educators working at the sharp end of AI governance and classroom implementation. What Emma Darcy shared was not the conversation I had expected to be having.
She described a process at her school where an external organisation came in and spoke directly to students about how they felt about the way AI was being used around them. Not by them. Around them.
What came back challenged the assumptions most schools are working from.
Students said they felt it was disrespectful when teachers used AI to plan their lessons or generate feedback on their work without telling them. They were not opposed to teachers using AI. That is an important distinction. They were opposed to not being told. They were opposed to the secrecy of it.
They said that the reason they come to school is the relationship with their teacher. That is the thing school gives them that nothing else can. And if that relationship is being quietly outsourced to a machine, without anyone saying so, then what exactly are they turning up for?
The one-way mirror problem
Here is what strikes me about this. We have spent considerable time and energy worrying about students using AI without telling their teachers. Entire policies have been built around detection, disclosure, and academic integrity. The conversation has been almost entirely framed as a student conduct issue.
But the students in Emma Darcy's school flipped that framing completely. They were not talking about what they were doing. They were talking about what was being done to them. And they had a point.
If we expect students to be transparent about their use of AI, it is worth asking whether we are modelling the same transparency ourselves. If the answer is no, we should not be surprised when the relationship starts to erode.
A 2025 survey by the Center for Democracy and Technology found that half of all students agreed that a teacher's use of AI in class makes them feel less connected to that teacher. Half. That is not a fringe concern. It is a pattern that deserves far more attention than it is currently getting in most schools.
And yet how many school CPD sessions on AI have centred this finding? How many leadership teams have sat with that number and asked what it means for how they are rolling out AI tools?
What school is actually for
Al Kingsley, one of the most thoughtful voices I spoke to across the whole conference, put it plainly. AI, he argued, should be freeing up time for more human interaction, not replacing it. The purpose of deploying AI in a school is to give teachers more of themselves to give to students, not to create a layer of automated distance between them.
That argument makes complete sense. But it only holds if the human relationship remains visible, intentional, and honest.
When a teacher generates feedback using an AI tool and sends it to a student without saying so, they are not freeing up time for human connection. They are substituting the connection itself. And the student, increasingly, knows. They can feel the difference between feedback that has been thought about and feedback that has been processed. Young people are far more perceptive about this than we tend to assume.
A global survey of over 3,800 students conducted by the Digital Education Council found that students are particularly wary of teachers using AI in assessments and evaluations. Over half believed that over-reliance on AI decreases the value they receive from their education. Students are not anti-AI. They are anti-invisibility. There is a meaningful difference.
The transparency gap
James Garnett, speaking from a governance perspective, made a point that I keep coming back to. The student voice work at his school was not designed to be a tick-box exercise. It was not a school council meeting where AI got mentioned once and then never again. It was an ongoing, structured conversation that gave students real agency in shaping how AI was being used around them.
And what that conversation kept surfacing, over and over, was not hostility. It was a desire to be included. Students wanted to understand. They wanted to know when AI was being used, why it was being used, and what it meant for them. They were not asking for AI to be removed from the classroom. They were asking to not be kept in the dark about it.
This is, when you think about it, not a complicated ask. It is the same ask we make of students themselves. Be honest about your use of AI. Tell us when you used it and how. Acknowledge it rather than hide it.
The question is whether we are willing to hold ourselves to the same standard.
What good practice actually looks like
Emma Darcy's school did not arrive at this insight by accident. They built it deliberately. They created structures for student voice that were meaningful rather than performative. They brought in an external organisation specifically so that students could speak honestly, without the social complexity of saying difficult things directly to their teachers. And they listened to what came back, even when it was uncomfortable.
It is easy to run a student panel. It is harder to let what students say actually change what you do.
What the students in that school asked for was not complicated. They asked for honesty. They asked for teachers to say, "I used AI to help plan this lesson," or, "I used an AI tool to give you a first pass at feedback and then I reviewed it." They asked to be treated as partners in the process rather than recipients of it.
When that transparency was given, the relationship did not weaken. It deepened. Because now there was something honest to build on.
The question worth sitting with
I am not arguing that teachers should stop using AI. I use it constantly in my own work, and I think the profession would be poorer without it. The time savings are real. The creative possibilities are real. The ability to personalise and differentiate at a scale that was previously impossible is real.
But I think we have built the adoption conversation almost entirely around what AI can do, and not enough around what it does to the relationships that schools run on.
Students come to school for a reason. The curriculum is part of it. The qualifications are part of it. But underneath all of that is something simpler: the experience of being known, taught, and cared for by another human being who chose to show up for them. That is not replaceable. It should not be quietly eroded either.
The good news is that transparency does not cost anything. Telling a student that you used AI to help structure a lesson plan, or that you used a tool to generate initial feedback that you then reviewed and personalised, does not diminish you as a teacher. If anything, it models exactly the kind of honest, reflective relationship with AI that we want students to develop themselves.
It also sends a message that matters more than any AI policy ever could. It tells students that in this classroom, we are honest about the tools we use. And that honesty, more than anything else, is what keeps the human relationship at the centre of what we do.
This blog is part of a series drawing on conversations from AIDUCATION26, a conference dedicated to AI in education held in Bucharest. If you want to understand where your school stands on AI readiness, the DEEP Education Network AI Literacy Audit is a good place to start: audit.deepeducationnetwork.com