
How do students perceive the use of genAI for providing feedback?

Aims

Immediate formative feedback is valuable for correcting and reinforcing learning but is often limited by the availability of academic staff. Particularly for lower-level learning, immediate feedback can be partially automated in the form of interactive online exercises. However, creating high-quality interactive exercises which anticipate areas of student misunderstanding and provide suitably targeted feedback is also very time-consuming. This project will test the use of generative AI (genAI) to provide immediate feedback to students, with a particular focus on how students perceive its use.

The context for this study will be a second-year undergraduate introductory computer coding module run by the Department of Biochemistry (5BBB0238 – An introduction to computer programming for bioscientists). The teaching on this module is delivered through interactive online coding exercises, short videos explaining theoretical concepts, and weekly small-group workshops. We have also introduced some gamification elements into the module, including daily coding challenges, XP (experience points) and a leaderboard of the groups with the highest combined XP. While these were successful and improved student motivation, they were too time-consuming to provide across the full semester.

This project has two objectives:

  1. Test the use of genAI tools to create interactive coding questions for Moodle quizzes, and investigate how students view the quality and validity of these questions. 
  2. Test the use of genAI tools to provide immediate feedback to students on coding exercises, and understand how students view the use of these tools in this context. 

Aim 1 – Creation of interactive coding questions

For this aim, we will employ two to four GTAs who are assisting with this module to test a range of genAI tools (ChatGPT, GitHub Copilot) to generate short interactive coding exercises for use in Moodle quizzes. We already use the CodeRunner Moodle plugin to create this type of exercise, but these exercises are time-consuming to write, particularly the test cases needed to determine whether a student submission is correct (a sketch of this kind of exercise follows the list below). The GTAs will use AI to generate a large question bank containing exercises covering the syllabus for each week of the module. These questions will primarily be used to provide daily coding challenges for students to complete as an additional form of coding practice beyond the core module materials. The effectiveness of this approach will be assessed by:

  • written reports from the GTAs describing the usability of these tools and the time taken 
  • anonymous questionnaires of students assessing the quality of the questions and their views on genAI in this context. 
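To illustrate the kind of exercise involved, below is a minimal sketch of what a GTA might ask a genAI tool to produce for a CodeRunner-style question: a short task statement, a reference solution, and test cases used to check student submissions automatically. The task (GC content of a DNA sequence), the function name gc_content and the test values are purely illustrative assumptions, not materials from the project.

# Hypothetical auto-generated exercise (illustrative only).
# Student-facing task: "Write a function gc_content(seq) that returns the
# percentage of G and C bases in a DNA sequence, rounded to 1 decimal place."

# Reference solution the question author (or the genAI tool) would supply:
def gc_content(seq):
    seq = seq.upper()
    gc = sum(1 for base in seq if base in "GC")
    return round(100 * gc / len(seq), 1)

# Test cases of the kind CodeRunner runs against each student submission;
# each pair is (input sequence, expected output).
tests = [
    ("ATGC", 50.0),
    ("GGCC", 100.0),
    ("ATAT", 0.0),
    ("atgcgc", 66.7),
]

for seq, expected in tests:
    assert gc_content(seq) == expected, f"gc_content({seq!r}) should be {expected}"
print("All test cases passed.")

Writing several such exercises per week by hand, each with a reliable set of test cases, is the time-consuming step this aim seeks to reduce.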

Aim 2 – Using genAI for immediate feedback

For this aim, we will ask 24 students taking the module to volunteer for a 2-hour session. The first hour will be spent asking students to use a genAI tool (Microsoft Bing with AI, ChatGPT, or GitHub Copilot) as an assistant when answering traditional coding exercises. They will be encouraged to use a specific tool to seek help when they get stuck, to judge the correctness of their answers, and to obtain feedback. The second hour will be spent in a focus group exploring the students' experience of using the tools.

Project status: Ongoing

Principal Investigator