For What It’s Worth: Here’s What I Think We Should Do About AI in Education

I believe in AI. I’ve seen what it can do.

But like many of my colleagues, I hate the way it's being used in education.

Most GPTs don’t coach students through problems; they jump right in and do the work unless the prompt says otherwise.

I built a custom GPT inside ChatGPT, designed specifically to coach students instead of doing the work for them.

One built to teach, not ghostwrite.

It’s been… miserable.

No matter how many instructions I give it—“Don’t write the speech, just give feedback like a coach would”—it still tries to take over and produce content for the user. It’s easier to get it to write a full persuasive speech than it is to beat my four-month-old daughter at Mario Kart.
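Under the hood, a custom GPT's instructions work like a system prompt that travels with every conversation. For anyone who wants to tinker outside ChatGPT's builder, here's a minimal sketch of how coach-not-ghostwriter rules might be pinned to the system role; the prompt wording and model name are illustrative placeholders, not my GPT's actual instructions.

```python
# Hypothetical sketch: pinning coaching rules in the system role of a chat request.
# The prompt text and model name are placeholders, not the real GPT's instructions.

COACH_PROMPT = (
    "You are a speech coach. Never write or rewrite the student's speech. "
    "Ask guiding questions and give specific feedback on structure, "
    "evidence, and delivery instead."
)

def build_request(student_message: str) -> dict:
    """Assemble a chat-style request that carries the coaching rules on every turn."""
    return {
        "model": "gpt-4o",  # placeholder model name
        "messages": [
            {"role": "system", "content": COACH_PROMPT},
            {"role": "user", "content": student_message},
        ],
    }
```

Even with the rules stated this plainly, models frequently drift back toward producing content, which is exactly the struggle described above.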

Frustrating.

But it also reveals how hard it currently is to guide AI toward responsible use.

Because if I struggle to set boundaries with an AI chatbot, and I teach communication for a living, what do we expect from students who are just trying to get through assignments and survive?

There are two practical starting points: embrace AI as a collaborator, and have open, honest conversations about it in class.

1. Embrace AI as a collaborator.

Most students are already using AI. They know it. We know it. They know that we know it—and vice versa.

I see ChatGPT open in a tab on nearly every student’s laptop. Honestly, it’s open on mine most of the time too.

The solution can’t be to shut it down. If you don’t believe me, refer to any moment in history where we tried outlawing the latest shortcut. It never works. Instead, let’s teach students how to use AI intentionally—to think faster, write more clearly, and structure their thoughts under pressure.

I’ve read a lot of articles about professors “AI-proofing” their classrooms. I get it, but I think that strategy misses the point. We’re not preparing students for the real world when we do that.

I love the idea of blue book-style tests to gauge learning. But in the weeks leading up to those tests, let’s show students how to train with AI. For example, they might:

  • Use it to clarify a messy idea before writing it cold

  • Ask it to explain something they’re too afraid to ask in class (and then fact-check it)

  • Run a paragraph through it to catch awkward phrasing

  • Practice timed writing with AI before doing it without

If we train students to see AI as a collaborator, they’ll carry that relationship with them into the real world.

Of course, that only works if we help them develop good instincts. Without some structure, students risk becoming too reliant on AI, letting it shape their ideas before they’ve wrestled with them. And let’s not pretend the tools are perfect: they hallucinate facts, reinforce bias, and often miss nuance. That’s why coaching, not outsourcing, is the key.

2. Have honest (and awkward) conversations about it.

We’ve got to stop acting like students are doing something shocking. They’re not.

Students are busy, overwhelmed, and looking for shortcuts—just like we were.

In my day? Everyone used Quizlet to ace tests.
In my dad’s day? They wrote formulas on their arms.
In his dad’s day? They dropped out at 15 and ran moonshine.

The tools change, but the survival instincts don’t.

The difference now? Students have a well-made hammer, but no one's shown them how to swing it. So they miss. They hit their fingers. And we all act surprised that it’s happening.

I’ve seen students dump a half-baked prompt into ChatGPT and call it a day because they think that’s what “using AI” means. And AI, being helpful, doesn’t exactly stop them.

Instead of cracking down, let’s carve out space in the classroom to be real about it.

Start by asking them how they are using it—then show them a better way.

Reporting a student for misconduct isn’t always the best solution. Maybe I’m just a softie, but when I can, I’d rather have a conversation than file a report.

AI isn’t the enemy of education.
Lack of adaptability is.

Demystifying AI means we stop pretending it’s either a shortcut or a threat. It’s a tool for working efficiently and for amplifying rhetorical power. And like any tool, it takes practice to use well.

The most valuable thinkers in the future won’t be the ones who avoided AI entirely.
They’ll be the ones who know when to use it, how to shape it, and when to set it aside.

If you want to try the GPT I’ve been working on, here it is.

It’s designed to coach—not ghostwrite—but I’ll be honest with you: it still needs reminders about its purpose.
Remind it to coach you, and it usually stays on task.

Teachers, change is scary, and you won’t know everything about how to use AI at first. You might embarrass yourself or say the wrong thing about it all. That’s okay. The best educators in the next ten years won’t be the ones who feared AI—they’ll be the ones who figured out how to model it honestly.
