Implementing
Weekly AI Reflection
Students were required to produce videos every week in small groups, a process that included brainstorming, scriptwriting, recording, and editing. To encourage critical reflection on the role of AI in their work, we introduced a simple three-question template. Uploaded to Moodle, it asked students to reflect on the following:
- Which stages they used AI for and why
- The most valuable outcomes of using AI
- Any misleading outputs and how they critically addressed them
This was an initial stage of learning how students use AI in this type of assignment, so that we could make more informed decisions about where they needed assistance most. We did not introduce any specific AI tools at this stage.
Debating the Role of AI in Academia
To extend the discussion, we organized a debate in which students were divided into two groups: one supporting the use of AI in academic work, the other opposing it. Each side had five minutes to prepare their arguments, followed by rounds of questions and rebuttals. This led to a rich conversation, with students reflecting on bias in large language models (especially those trained primarily on English-language content) and on the ethical and practical implications of AI in learning.
Transcription Tools for Group Discussions
As students began working more intensively in group discussions, we introduced them to Microsoft Teams’ transcription feature. With so many AI tools available for meeting summaries, our goal was to help students streamline collaboration and peer feedback without worrying about manual note-taking. This allowed them to focus more on discussion quality and less on capturing every word.
Introducing the SAMR Model
We discovered the SAMR model, a four-level taxonomy for integrating technology into a course, through a resource from the University of Calgary. The model outlines four levels of technology integration: Substitution, Augmentation, Modification, and Redefinition. Each level serves a distinct purpose, and the key point is that not every task needs to reach the highest level.
Inspired by this framework and an online reflection template, we designed our own visually appealing version tailored to our course. We added a bonus question: “Ask AI what it knows about you.”
Text Analysis
In this activity, we gave students two texts and asked them to identify which one was AI-generated and which was written by a human. What they didn’t know was that both texts were AI-generated; one was simply crafted with a more carefully designed prompt to make it sound more “human.” The pedagogical rationale for this deliberately deceptive setup is that AI can produce human-like content that is hard to distinguish from human writing. The aim was to foster critical thinking: it is almost impossible to tell the two apart and give a definite answer on whether a machine or a human wrote a given text. The exercise challenged students to analyze the differences between the texts, identify words that AI tends to use frequently, and once again reflect on how difficult it is to distinguish between human and AI-generated content.