Welcome to the first of many blog posts dedicated to exploring innovative assessment methods that integrate seamlessly with the capabilities of Generative AI (GenAI). In this series, we will dive into practical ways to evolve traditional assessments, creating a future where AI and education work hand in hand. Today, we focus on reimagining one of the most common forms of academic presentation: moving from essays and PowerPoint slides to dynamic video presentations.
Why Video?
Switching to video presentations is not a revolutionary idea; I've been using this approach for over a decade. However, its relevance and potential have never been more pronounced than in today's GenAI-driven world. Students have more ways than ever to access information, from Google searches to GenAI prompts. The real challenge now is not how students find information but how we, as educators, assess their depth of understanding and ensure they retain what they learn. Experience suggests that one of the best ways to retain knowledge is to teach it to others, and video presentations provide the perfect platform for this. I have created many videos myself and have definitely learned more in the process, and I have used YouTube extensively to learn new skills from others. The idea is to move away from a written assessment (easily produced by a contract cheater or GenAI) or a PowerPoint presentation (script and slides generated by GenAI) as the showcase of what a student may have learned. This raises the bar, expecting students to know how to do more and to express themselves differently!
In my most recent implementation, I introduced video-based assessments to postgraduate students studying the ethical implications of GenAI. This activity served as scaffolding before they went on to learn and use GenAI in a structured way in the classroom. The assignment was designed with specific learning objectives in mind:
Learning Objectives and Assessment Criteria
Technical Quality (10%) - Students were encouraged to use any technology they saw fit (within policy) to produce a quality video. No technology was off-limits, and the goal was to empower students to explore, learn, and apply new digital tools and skills to enhance their presentations. The hope was that they would also discover that doing well in the assignment takes more than just being able to use technology.
Engagement and Audience Impact (10%) - The target audience for these videos was their peers, engaged through content styled for the platforms where they already spend much of their time, such as TikTok, YouTube, and YouTube Shorts. The challenge was to captivate viewers and communicate the key message effectively in a very short time.
Content Relevance and Clarity (40%) - This critical component required students to distill complex information into concise, engaging content. They needed to demonstrate evaluative judgment in selecting the most relevant information and communicating it clearly in a short time frame. Even if GenAI was used, the right prompts were essential to extract and express the core ideas effectively.
Originality (40%) - Originality is the ultimate test of critical and creative thinking. Students were tasked with combining their knowledge and experiences to create something unique: work that would stand out from their peers' and gain maximum attention and impact. While GenAI can be a powerful creative tool, it can also lead to repetitive ideas. The challenge was to offer a genuinely fresh perspective.
Brief Assessment Overview
The students needed to create a digital video of 100–150 seconds that explains, teaches, or provides key insights into an ethical implication of GenAI (covering both benefits and concerns) for a non-technical audience of 15–30-year-olds. Knowing what to cover was important, as was knowing how best to convey it. They could use any technology, and the task was unsupervised. The design shifted the focus from policing cheating to the validity of the learning objectives.
Risk Assessment
No assessment can prevent students from trying to cheat. While a moving target, video currently offers much in the way of ensuring academic integrity. Full details are available via the paper in this link, with Table 10 providing a general guide for most assessments and the supplementary materials guiding you through undertaking such a risk assessment yourself. For this assessment, to ensure that the student actually played some role in its creation (i.e., that a contract cheater didn't do 100% of the work), minimum amounts of student voice and on-screen appearance had to be met. Yes, there are workarounds, but for now, it's safe enough.
Scaffolding for Success
To support students, lectures highlighted the importance of non-technical communication and the art of innovation in content creation. Tutorials offered structured activities, showcasing examples of images and videos with varying levels of creativity. Discussions centred on what worked, what didn't, and why. These sessions aimed to help engineering students think about how to sell their message with originality, free of the usual technical focus. For those who don't know, communication in an engineering degree generally concentrates on technical methods (technical reports, technical presentations, etc.).
The Outcome: Successes and Challenges
With a cohort of 350 students, the results followed a familiar pattern: a few students (about 5) fully met the brief, producing standout, original content, and around 10 more were highly commendable. However, many gravitated towards AI-generated scripts, often read by AI voices, with minimal human input beyond the required elements. While this produced a bell curve distribution typical of traditional assessments, it reinforced a vital lesson: even with access to cutting-edge technology, true impact requires evaluative judgment, critical thinking, and a touch of creativity. Access to technology, including GenAI, did not hand every student a High Distinction through easy shortcuts.
One of the best videos can be found via this link (permission granted by students to share).
Beyond the Learning Objectives
This exercise went beyond meeting the learning objectives: it demonstrated to students that technology alone isn't enough. To truly excel and differentiate themselves, they must apply critical and creative thinking skills that GenAI cannot replace. The experience also affirmed that even when students have access to the latest tech, originality and impact come from their ability to think deeply and uniquely.
Final Thoughts
By rethinking traditional assessments and embracing innovative formats like video, we can help students develop the skills they need to thrive in a GenAI world. As an engineer, I know that we can focus so much on the technical aspects that we sometimes forget the importance of letting our students express their knowledge in non-technical ways and learn how to communicate their ideas to a non-technical community. This example shows how it can be done. We don't need to fear that every unsupervised assessment is an academic integrity risk; as this example shows, there are options. This approach might not stay secure forever, but applying a structured risk assessment will tell us when it is time to pivot again. We must also remember that cheating is not just an assessment problem; it requires a holistic institutional solution.
Stay tuned for more insights and practical strategies in future posts as we continue to explore the evolving landscape of education and assessment.
Sasha Nikolic