More young Canadians are tapping AI for learning, even when concerned about it

More Canadian students are turning to AI, eager for additional opportunities to learn how to use it. Yet worries about cheating and the impact on critical thinking persist, with some experts saying more clarity and support is needed.

Educators must guide students to supplement, not circumvent, their learning, says prof

Jessica Wong · CBC News · Posted: Oct 09, 2025 6:00 AM EDT

More Canadian students are turning to AI for schoolwork, but they're also worried about whether it's cheating and hurting their critical thinking. (Craig Chivers/CBC)

Studying for a recent computer science midterm, Elaine Xiao just couldn't wrap her head around certain concepts. So she popped open ChatGPT for a quick explanation.

It was a fast, accessible solution to that mid-study-session hurdle, Xiao says, instead of searching multiple articles for an answer.

"It really talks to you in a way that is easy to understand," said the Canadian student, who's in her first year at Duke University in Durham, N.C.

Generative AI tools "easily get information to help you study better, work better," she said. "AI can really be a tool that's there for us."

More Canadian post-secondary students are turning to AI for school, and they're eager for more opportunities to learn how to use it. Nearly three-quarters of young adults who responded to a recent KPMG Canada survey said they use gen AI for their work, up from 59 per cent last year. Yet students are still worried about cheating and the impact on their critical thinking, with some experts saying more clarity and support from instructors and institutions is needed.

LISTEN | The Current (11:52): How should AI be used in schools?

Help to 'overcome the barriers'

With her school and profs explicit about when and how students can use AI, Xiao feels comfortable turning to it as "a springboard" for thinking.
Recently tasked with creating engaging classroom activities based on assigned readings, she used AI to brainstorm. It generated generic suggestions, Xiao acknowledged, but gave her a skeleton to build on, and she crafted something directly related and specific to her class.

First-year Duke University student Elaine Xiao uses AI tools as "a springboard" for generating ideas and as a faster, more efficient way to overcome barriers when studying. (Hannah Shin)

"It can be really helpful … by helping me overcome the barriers along the way," Xiao said. "Not having it do the work for me, so that I still learn, [but] just at a faster rate than before."

Most of the young adults who responded to the KPMG Canada survey, now in its third year, reported that the technology has led to better grades and improved their work. Still, 57 per cent of them worry they're cheating when using AI, while 66 per cent believe they're learning less.

That disconnect suggests institutions need to give students more clarity on how to use AI ethically, says Rob Clayton, KPMG's national education lead in Canada.

Students are already using AI tools, and that use is expected to grow, says Rob Clayton, KPMG's education lead for Canada, so educators and institutions should create clear guidelines on how to use them. (KPMG)

"Let's think about how we embed this in our day-to-day activities because … the students are using it on a very frequent basis," he said from Ottawa.

A new section of the survey looked at young people's beliefs and behaviour around AI. For example, most respondents worried about finding work if AI eliminates entry-level jobs, while more than half said they trusted AI over humans at times.
That indicates there's much more to explore in supporting students, Clayton said.

"We need to dig deeper and to really understand how [young people are] interacting with it, why they're using it and what they think they're getting from the AI tool that they're not necessarily getting … from a human-to-human connection."

WATCH | People are turning to AI chatbots for emotional support (warning: mention of suicide and self-harm). Millions of people, especially teens, are finding companionship and emotional support in AI chatbots, according to a kids' digital safety non-profit. But health and technology experts say artificial intelligence isn't properly designed for these scenarios and could do more harm than good.

Calls to rethink assessment

Having encountered professors who are very strict about AI (banning grammar correctors or spell-checkers, for instance), Jazmine Kennedy largely avoids it. "It's not worth the risk," said the fourth-year English major at Simon Fraser University.

Yet she knows students who, for instance, have no qualms about pasting an essay assignment's instructions and marking rubric straight into ChatGPT and submitting its response. There are even other AI tools that make chatbot-generated text read more naturally.

With the tech so easy to use and so much pressure on students today to get good grades, Kennedy thinks educators should rethink their assessments and reimagine assignments: perhaps less regurgitating of information, and more learning to analyze online information and determine its credibility.

"A lot of teachers at the moment are trying to get better at catching students when maybe … a better way of approaching the situation is taking it from a different angle," said Kennedy.

"Based on my research, kids aren't cheating any more today than they did five years ago or 10 years ago," says Calgary professor Sarah Elaine Eaton. AI technology "is challenging us in different ways, but kids aren't any less ethical … than they've been in generations past." (Mike Symington/CBC)

University of Calgary professor Sarah Elaine Eaton agrees, pointing out that today's students entering post-secondary have already been around ChatGPT for years during high school.

So "doing the same thing we've done [before] won't work anymore," said Eaton, whose research focuses on academic ethics in higher education. Students are indeed using, and misusing, AI, she said, but it's up to educators to understand the tech and guide students to supplement, not circumvent, their learning.

From a practical standpoint, that might mean setting clear rules on when AI is allowed, emphasizing progress over perfection, or regularly meeting with students one-on-one to gauge their learning, she said.

WATCH | Why this instructor values struggle and 'friction' in his students' learning: Joel Heng Hartse, who oversees a Simon Fraser University program teaching academic literacy to new students, explains why he's not looking for perfect papers.

Teachers of large classes with hundreds of students might even create personalized exams, which would prevent students from "cribbing" from a past one, since each would be unique, she said. That would, of course, require instructors to be well versed in using AI themselves, she acknowledged, but it could offer much more customized assessment and identify specific areas where students need work.

Eaton doesn't believe today's students are less ethical or cheat more. "Every generation of students has some cheaters, but it also has lots of people that want to learn, want to do a good job, and want to be successful."

Engineering student Katie Yu occasionally turns to AI but wants to focus more on developing problem-solving skills and work habits without it, since it won't always be accessible. (Submitted by Katie Yu)

Learning, and sometimes struggling, with content in a more traditional way is OK for University of Waterloo student Katie Yu.

She has sometimes used AI to summarize readings, take notes or help determine why an answer is "way out of the ballpark," something she's also seen others do when stuck. But the second-year chemical engineering student is wary of over-relying on it. She vividly recalls a lab where a chatbot gave wrong answers, convincingly and repeatedly, about the density of a potato.

Though Yu considers AI a resource for students, she's focused on building problem-solving skills and a work ethic without it, she says, because AI tools won't always be accessible, such as during an exam or when her co-op supervisor suddenly pops by to ask about her work.

"I'm learning hard things and I want to understand them and be able to think for myself."

With files from Nazima Walji
