Editorial

Students Speak Out: AI Is Changing School, and No One's in Charge

By Nick Jackson
Two students. One global tech shift. Their stories reveal how AI is upending school — with zero consistency and even less guidance.

While education systems worldwide continue to struggle with the effects of generative AI on teaching and learning, little attention has been given to students and the realities they are facing. Teenagers from across the world find themselves navigating a school experience suddenly transformed by technology that very few foresaw.

Education has always moved at its own pace, resistant to disruption and protective of time-tested methods. But the rapid emergence of AI tools has created a moment of profound disorientation in schools, and students stand at the epicenter of this change, not as subjects of policy but as navigators of uncharted academic territory.

This article features honest and forthright insights from two students in different countries about what they have experienced since the emergence of generative AI. Both reflect on several aspects of their AI experiences: their first encounters with generative AI in education, how it has shifted classroom dynamics, the evolving and often inconsistent rules around its use, their personal experimentation with these tools, the ethical questions they've faced, and how their particular cultural and educational contexts might shape their perspectives. Rather than being asked for policy recommendations or solutions, the students were encouraged simply to document what it feels like to navigate this technological disruption day-to-day.

What follows are their unfiltered accounts — two windows into the lived realities of education in transition. These students aren't offering neat conclusions about what AI means for education's future. Instead, their personal accounts provide something more valuable: a portrait of what it feels like to attend school during a period of genuine technological disruption, when the fundamental rules and relationships that have defined academic life for generations have suddenly been called into question.

Amy Wallace, School Captain and Digital Ambassador Living in Adelaide, South Australia

It just appeared, and we had to figure it out as we went.

For me, it was never about cutting corners. I started using AI when I’d exhausted the usual resources — I wanted more revision material, clearer explanations, new angles to think from. In my more creative subjects, it’s especially helpful in the early stages of ideation. Then, I refine things myself. I use it to tutor, to draft my work, to explore questions. It’s efficient, yes — but it still demands critical thinking. Especially when it gets things wrong, which it still does.

What’s more complicated than the tech itself is the way people treat it. Some teachers banned it outright at first. Others encouraged it, within reason. Now, there’s a more consistent school approach — but individual attitudes still vary. Some educators see its value; others believe it lowers the quality of thinking. That tension filters down to students too — and it’s not just about how AI is used, but how it’s talked about. There’s this strange pressure to justify using it, or to mention it with a bit of mockery — like it’s not something you can openly rely on without needing to explain yourself or make a joke about it. Even when students use it responsibly, they’ll often say things like, “I only used it for ideas — not the actual writing!” as if admitting otherwise would cheapen the work.

I’ve spoken with students from across Australia, and the inconsistencies are everywhere. Some schools are strict, some are incredibly relaxed and most are somewhere in between. The lack of understanding of AI is glaringly obvious, and that patchiness shapes how students approach their learning and how they think about integrity. When some people use AI to write entire assignments and still get high grades, while others carefully limit their use, it naturally causes friction. It's not just about permission — it's about fairness, and about what academic success actually means now.


William Liang, High School Journalist Living in San Jose, California

At my school, AI tools like ChatGPT didn’t enter the classroom with a big announcement. They slipped in quietly. One student used it to rephrase a few clunky sentences. Another used it to finish a take-home essay they’d been putting off. Then more students caught on. Now, a year later, AI is part of everyday schoolwork — and no one really knows what to do about it.

Teachers give vague warnings. Some assign work assuming students will use AI, others ban it entirely and some don’t mention it at all. There’s no consistency, no enforcement and no clear policy. What we’re left with is a kind of gray zone where everyone is figuring it out on their own.

That might sound manageable, but it has serious consequences.

Students who use AI to cheat aren’t rare. I’ve seen full essays written by ChatGPT turned in without changes. I’ve seen teachers grade them without question. And I’ve seen those same students get praised for their “insightful” work.

The result? Grades are going up, but actual thinking is going down.

For students, AI has made it easier to get by without doing the hard parts of learning — without outlining, revising or even forming your own argument. For teachers, it’s made it harder to tell who understands the material and who just knows how to prompt a chatbot.

And it’s not just happening in one school. Surveys point to similar experiences elsewhere.

We keep hearing about “responsible AI use,” but no one really defines what that means. Teachers are left to enforce vague standards. Students figure out their own boundaries. And the system, meanwhile, pretends that everything is still working the way it used to.

It’s not.

I’ve seen students who used to try — even if their writing wasn’t great — start turning in flawless assignments written entirely by AI. I’ve seen teachers privately admit they don’t know how to respond. I’ve even seen students start mixing their writing with AI output to avoid detection. It’s no longer about copy-pasting. It’s about blending.

All of this raises a bigger issue: AI is changing the culture of learning.

Take-home essays used to measure how well students could think, reason and communicate on their own. Now, more often than not, they measure how good students are at hiding the role of AI. This isn’t just about academic dishonesty. It’s about the erosion of what school is supposed to do.

I’m not against AI. It can be helpful in the right context — drafting resumes, translating text, generating practice quizzes. But in core academic tasks, especially writing, it’s replacing the work, not supporting it.

If education is supposed to build skills — thinking, arguing, problem-solving — then letting AI do that work is counterproductive. And when both students and teachers are using it to save time, learning gets lost.


The Missing Voice in the AI-in-Education Debate

The accounts from Amy and William reveal a striking contrast in perspectives: one student sees AI as a complementary tool that enhances learning when used thoughtfully, while the other views it as fundamentally undermining academic integrity and skill development. These divergent viewpoints reflect the broader tensions in education as institutions struggle to adapt to this technological shift.


A common thread runs through both accounts: the experience of navigating inconsistent policies and unclear boundaries. Both students describe being forced to develop their own ethical frameworks in real time, often with minimal guidance. As education continues to grapple with AI's implications, these firsthand experiences from those living through the transformation should inform more coherent, intentional approaches — ones that preserve the core purpose of education while acknowledging the reality of these powerful new tools.

To close, I want to draw attention to how vital it is to gather feedback from the people who actually use a product or technology. That is hardly a new idea. Yet somehow, in education, the concept does not seem to extend to students. Perhaps this is a chance to do something different.


About the Author
Nick Jackson

Nick Jackson is the leader of digital technologies at Scotch College in Adelaide, Australia, and founder of Now Future Learning, which helps educational institutions and businesses integrate and use generative AI. Jackson is also the co-author of the book “The Next Word: AI & Teachers.” He holds a Ph.D. and two master's-level degrees.

Main image: everettovrk on Adobe Stock