“When I’ve used ChatGPT to do math, it’s mostly been wrong”: This high school teacher is bringing AI into the classroom
Tech-savvy educator Jamie Mitchell talks about the chatbot’s algebra skills, using it to prank his colleagues and whether AI is coming for his job
Since the meteoric rise of ChatGPT—the free, widely available AI chatbot—educators have been divided on its place in the classroom. With its ability to create tailored content on basically any subject imaginable, some fear that it’s the perfect tool for cheating, prompting certain school boards to block the app on their Wi-Fi. But educators like Jamie Mitchell, who teaches high school math at Aldershot School, in Burlington, think ChatGPT has pedagogical promise. Rather than steering students away from the chatbot, he’s been testing its ability to solve complex math problems in the classroom and encouraging his students to critique the results. We spoke with Mitchell about his decision to embrace artificial intelligence—and its limitations—at school.
When did you discover ChatGPT?
The first time I used it was during a lunchtime math tutorial. My students asked if it could write a valedictorian speech, so I asked it to write one for graduates of a STEM program that included two math jokes. They weren’t particularly funny, but its attempt made the students laugh.
How did you start incorporating it into your teaching?
I used it to make a worksheet one day because I was short on time. When I first looked at the equations ChatGPT had made, they seemed perfect, but as I started to check them, I noticed that they were wrong—and I realized that it wasn’t very good at math. At some point in the process, it went off the rails, and the solutions were nonsense. That’s when I thought of getting students to critique it, because that would show me how well they understood what was going on.
Most people wouldn’t expect an all-knowing chatbot to struggle with math.
ChatGPT is trained on a huge amount of text, and it assembles answers by predicting what words should come next. But it's a language AI, not a math AI. It doesn't understand the information it produces. It can make sentences that we perceive as having actual meaning, but when I've used it to do math, it's mostly been wrong.
Where have you seen it go awry?
I asked it a bunch of optimization problems—basically, finding the best way to do something that involves math. One question was: “You have a piece of wire, and you cut it into two pieces. Bend one piece into a square and the other into a circle. How do you cut the wire so that the total enclosed area is as large as possible?” ChatGPT went on and on about how to do this, and the math was good. Then it said, “Oh, by the way, the best way to do this is to cut the wire in half.” Which is wrong! The students got a big laugh, and they learned the risks of using an AI tool without thinking critically about its output.
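For readers who want to see why “cut it in half” fails, here is a quick check of my own (not part of the interview) in Python. For a wire of length L cut at point x, the square gets perimeter x and the circle gets circumference L − x; the total area turns out to be smallest near the middle and largest at an endpoint, when the entire wire goes to the circle:

```python
import math

def total_area(x, L=1.0):
    """Total enclosed area when a wire of length L is cut at x:
    the piece of length x is bent into a square, the rest into a circle."""
    square = (x / 4) ** 2                   # square side = x/4
    circle = (L - x) ** 2 / (4 * math.pi)   # circle radius = (L-x)/(2*pi)
    return square + circle

# Compare "cut in half" against the two endpoints.
half = total_area(0.5)         # roughly 0.036 -- near the worst case
all_square = total_area(1.0)   # 1/16 = 0.0625
all_circle = total_area(0.0)   # 1/(4*pi), about 0.080 -- the maximum
```

The area function is convex, so its maximum sits at an endpoint: devoting the whole wire to the circle wins, and an even split is close to the worst possible choice—exactly the opposite of the chatbot's conclusion.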
How are other teachers at your school reacting to ChatGPT?
I recently pulled a prank on the rest of the staff. When ChatGPT came out, I sent an email listing some of the ways they could use it. The punchline at the end was: “By the way, this whole email was written by ChatGPT.” That piqued some of my colleagues’ interest, and I was approached by a few of them to discuss it further. Someone had been using it for students with individual education plans. For example, you could ask it for a list of accommodations for a student with ADHD. Another teacher used it for a works cited page—they could just throw in links to the articles they had read.
Are any of them concerned about students cheating or generally anti-AI?
I haven’t experienced any resistance to using it in class. Maybe they’re not telling me. And, of course, some teachers are worried about students using it to cheat. People were worried about calculators, too, when they first came out. But, if a teacher knows about ChatGPT, they can frame their tests and assignments so that it’s impossible to use it.
I know that students can use ChatGPT to solve an algebra problem. So, instead, I might ask them to create an equation where the solution is seven. That way they’re working backward. This isn’t a new problem for educators. Since the advent of search engines, it’s been a necessity to write tests and assign homework that go beyond searchable answers. I’m not asking questions that students can easily fool me on. For example, I shared a video of a ball bouncing, moving toward a bucket. I cut out the end of the video, and the question was “Will the ball go into the bucket?” The students had to work backward, collect data, create an equation. It was their best guess backed up by a lot of analysis. My instinct says that ChatGPT couldn’t create the type of work my students did for that assignment.
It seems like AI could automate a lot of students’ grunt work, like writing bibliographies. Would they miss anything important by delegating that?
I don’t think so. I’m sure there are teachers who would disagree with me, but there are already websites that can make a works cited page. I think about the skills we really want kids to have: teamwork, communication, critical thinking. Tools like ChatGPT can help augment those. For instance, a student could do some writing and have a partner critique it. But they could also throw it to ChatGPT to critique. You could compare what the AI says with what your partner says, and as a group you improve.
What’s the biggest drawback of having chatbots in the classroom?
That students might use them to do the least possible amount of work. If they have ChatGPT do work for them, that’s dishonest, right? We’d have to figure out what’s driving that, because it’s a behaviour that sometimes stems from panic. But, if the conversation becomes, “Come talk to me as the teacher when you’re feeling this way, and we can work around it,” that will help students learn when to use that tool versus when they should seek out advice from a trusted adult.
Do you have any holdouts among your students?
No. They’re using it for literally everything, including to apply to scholarships and universities. If teachers ignore it, students will use it anyway. If school boards start to ban it, well, students have their own devices. Rather than putting our heads in the sand and pretending it doesn’t exist, I think the better idea is to teach kids when it’s an appropriate tool.
Do you think ChatGPT or other AI tools are coming for teaching jobs any time soon?
I don’t think so. Teaching is such a human activity. If we just wanted students to know things? Sure, AI can do that. But good educators are interested in creating lifelong learners, critical thinkers, problem solvers. That needs a human touch.
This interview has been edited for length and clarity.