Tag Archives: cheating

AI and academic integrity

I’ve been using AI to add images to my blog since June of 2022, when I discovered AI-generated art with DALL•E. I don’t credit it, I just use it, and I find it much easier to generate images than to find royalty-free alternatives. I haven’t yet used AI as a writing or editing tool on my blog. While I’m sure it would make my writing better, I am writing to write, and I usually do so early in the morning with limited time.

I already have to limit the time I spend creating an image; if I also used AI to edit and revise my work, I’d probably only have 15-20 minutes to write… and I write to write, not to have an AI write or edit for me. That said, I’m not disparaging anyone who uses AI to edit. I think it’s useful and I will sometimes use it on emails; I simply don’t want that to be how I spend my (limited) writing time.

I really like the way Chris Kennedy both uses AI and also credits it on his blog. For example, in his recent post, ‘Could AI Reduce Student Technology Use?’ Chris ends with a disclosure: “For this post, I used several AI tools (Chat GPT, Claude, Magic School) as feedback helpers to refine my thinking and assist in the editing process.”

On a related side note, I commented on that post:

The magic sauce lies in this part of your post:
“AI won’t automatically shift the focus to human connection—we have to intentionally design learning environments that prioritize it. This involves rethinking instruction, supporting teachers, and ensuring that we use AI as a tool to enhance, not replace, the human elements of education.”

A simple example: I think about the time my teachers spend making students think about formatting their PowerPoint slides, think about colour palettes, themes, aesthetics, and of course messaging… and I wonder what students lose in presentation preparation when AI just pumps out a slide, or even a whole presentation, for them?

“Enhance, not replace”: this is the key, and yet this post really strikes a chord with me because the focus is not just the learning but the human connection. If that is the focus, it doesn’t matter whether the use of technology is more, less, or the same; what matters is that the activities we do enrich how we engage with each other in the learning.

Take the time to read Chris’ post. He is really thinking deeply about how to use AI effectively in classrooms.

However, I’m also thinking about the reality that it is a lot harder today to know when a student is using AI to avoid thinking and working. Actually, it’s not just about work avoidance; it’s also about chasing marks. Admission to university has gotten significantly more competitive, and students care a lot about getting an extra 2-5% in their courses because that difference could mean getting into their university of choice or not. So the incentives are high… and detecting AI use is getting a lot harder.

Yes, there are AI detectors we can use, but I could write a complex sentence in three different ways, put each version into an AI detector, and one could say ‘Not AI’, one could say there was a 50% chance it was written by AI, and the third might say an 80% chance… all written by me. Twenty years ago, I’d read a complex sentence written in my Grade 8 English class and think, ‘That’s not this kid’s work.’ So I’d put the sentence in quotes in the Google search bar and out would pop the source. When AI is generating the text, detection is not nearly as simple.

Case in point: ‘The Backlash Against AI Accusations’, and shared in that post, ‘She lost her scholarship over an AI allegation — and it impacted her mental health’. And while I can remember the craze about making assignments ‘Google proof’ by asking questions that can’t easily be answered with Google searches, it is getting significantly harder to create an ‘AI proof’ assessment… and I’d argue that this is getting even harder on a daily basis with AI advances.

Essentially, students are facing a simple set of questions: Do you want to learn this? Do you want to formulate your ideas and improve your thinking? Or do you just want AI to do it for you? The challenge is, if a kid doesn’t care, or cares more about their mark than their learning, it’s going to be hard to prove they used AI even if you believe they did.

Are there ways to catch students? Yes. But for every example I can think of, I can also think of ways to avoid detection. Here is one example: Microsoft Word documents have version tracking. As a teacher, I can look at the versions and see large swaths of cut-and-pasted writing to ‘prove’ the student is cheating. However, a student could say, “I wrote that part on my phone and sent it to myself to add to the essay”. Or a savvy student could use AI but type the work in rather than pasting it in. All this to say that if a kid really wants to use AI, in many cases they can get away with it.

So what’s the best way to battle this? I’m not sure. What I do know is that taking a policing-and-detecting approach is a losing battle. Here are my ‘simple to say’ but ‘not so simple to execute’ ideas:

  1. The final product matters less than the process. Have ideation, drafts, and discussions count towards the final grade.
  2. Foster collaboration: have components of the work depend on input from other students. Examples include interviews, or reflections on work presented in class, where context matters.
  3. Inject appropriate use of AI into an assignment, so that students learn to use it appropriately and effectively.

Will this prevent inappropriate AI use? No, but it will make the effort of using AI almost as hard as just doing the work. In the end, if a kid wants to use it, it will be harder and harder to detect, so the best strategy is to create assignments that are engaging and fun to do, and that also meet the required learning objectives… Again, easier said than done.

What are you outsourcing?

Alec Couros recently came to Coquitlam and gave a presentation on “The Promise and Challenges of Generative AI”. In this presentation he shared a quote: “Outsource tasks, but not your thinking.”

I just googled it and found this LinkedIn post by Aodan Enright. (Worth reading but not directly connected to the use of AI.)

It’s incredible what is possible with AI… and it’s just getting better. People are starting businesses, writing books, creating new recipes, and in the case of students, writing essays and doing homework. I just saw a TikTok of a student who goes to their lecture and records it, runs the recording through AI to extract the salient points, then has the AI tool create cue cards and test questions to help them study for upcoming tests. That’s pretty clever.

What’s also clever, but perhaps not wise, is having an AI tool write an essay for you, then running the essay through a paraphraser that breaks up the AI structure of the essay so that it isn’t detectable by AI detectors. If you have the AI use the vocabulary of a high school student, and throw in a couple of run-on sentences, then you’ve got an essay that AI detectors, and teachers too, would be hard pressed to flag as cheating. However, what have you learned?

This is a worthy point to think about, and to discuss with students: How do you use AI to make your tasks easier, but not to do the thinking for you?

Because if you are using AI to do your thinking, you are essentially learning how to make yourself redundant in a world of ever-smarter AI. Don’t outsource your thinking… Keep your thinking cap on!

The easy way out

I love the ingenuity of students when it comes to avoiding work. I remember a student showing me how playing three French YouTube videos in different tabs simultaneously somehow fooled the Rosetta Stone language-learning software into thinking he was responding to oral tests correctly. How on earth did he figure that out?

Here’s a video of a kid who, while doing an online math quiz for homework, figured out that the web browser’s developer ‘inspect element’ tool reveals the correct answer. Hover over the code for a multiple-choice question and it highlights each choice, and the code tells you whether that choice is marked true or false.
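For readers unfamiliar with the trick: it only works because some quiz pages grade answers in the browser, which means the answer key has to be embedded in the page itself. Here is a hypothetical sketch of what such markup might look like (this is invented for illustration; it is not the actual quiz site’s code, and attribute names like `data-correct` are my assumption):

```html
<!-- Hypothetical quiz markup: grading happens in the browser,
     so the correct answer must be present in the page source. -->
<div class="question" data-question-id="7">
  <p>What is 12 × 8?</p>
  <label><input type="radio" name="q7" value="a" data-correct="false"> 86</label>
  <label><input type="radio" name="q7" value="b" data-correct="true"> 96</label>
  <label><input type="radio" name="q7" value="c" data-correct="false"> 106</label>
</div>
<!-- Anyone who right-clicks and chooses 'Inspect' can read the
     data-correct attribute. Only server-side grading, where the
     answer key never leaves the server, avoids leaking it. -->
```

The design lesson is the same one the student discovered: anything sent to the browser is visible to the browser’s user.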

(Video shared by @imemezy on TikTok.)

If there is an easy way to solve things, students will figure it out.

There isn’t an AI detector that can determine with full certainty that someone cheated using a tool like Chat GPT. And even if you found one, it probably would not detect AI use if the student also ran the final product through an AI paraphrasing tool. Detection would be harder still if their prompt said something like, ‘Use grammar, sentence structure, and word choice that a Grade 10 student would use’.

So AI will be used for assignments. Students will go into a web page’s code through the inspector and find the right answers, and it’s probably already the case that shy students have trained an AI tool to speak in their voice so that they can submit oral (and even video) work without actually having to read anything aloud.

These tools are getting better and better, and thus much harder to detect.

I think tricks and tools like this invite educators to be more creative about what they do in class. We are seeing some of this already, but we are also seeing a lot of backwards sliding: School districts blocking AI tools, teachers giving tests on computers that are blocked from accessing the internet, and even teachers making students, who are used to working with computers, write paper tests.

Meanwhile other teachers are embracing the changes. Wes Fryer created AI Guidelines for students to tell them how to use these tools appropriately for school work. That seems far more enabling than locking tools down and blocking them. Besides, I think that if students are going to use these tools outside of school anyway, we should focus on teaching them appropriate use rather than creating a learning environment that is nothing like the real world.

All that said, if you send home online math quizzes, some students will find an easy way to avoid doing the work. If you have students write essays at home and aren’t actively having them revise that work in class, some will use AI. Basically, some students will cheat the system, and cheat themselves out of the learning experience, if they are given the opportunity to do so.

The difference is that innovative, creative teachers will use these tools to enhance learning, and they will be positioned to learn alongside their students how to embrace these tools openly, rather than having kids sneakily use them to avoid work or to lessen the work they need to do. Either way, kids are going to use these tools.