Self-Regulated Personal Contracts as a Harm Reduction Approach to Generative AI in Undergraduate Programming Education
A new study shows a simple, non-binding contract can change how students think about using tools like ChatGPT for programming, though that awareness doesn't always translate into behavior.
A team of researchers from the University of Michigan has proposed a novel harm-reduction strategy for managing generative AI in computer science education. Their paper, "Self-Regulated Personal Contracts as a Harm Reduction Approach to Generative AI in Undergraduate Programming Education," details an intervention tested with 217 students in an intermediate Python course. Instead of banning tools like ChatGPT, the researchers introduced a non-binding "GenAI Contract" grounded in self-regulated learning theory. Students were asked to explicitly articulate their personal learning goals, create their own guidelines for AI use, and reflect on their alignment with those guidelines at strategic points over an 11-week semester. The contract was graded only for completion, emphasizing self-awareness over enforcement.
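To make the structure of the intervention concrete, here is a minimal Python sketch of how a contract of this kind might be modeled; the class name, fields, and three-checkpoint schedule are illustrative assumptions, not the instrument the researchers actually used.

```python
from dataclasses import dataclass, field

@dataclass
class GenAIContract:
    """Hypothetical model of a self-regulated AI-use contract.

    The field names and reflection schedule are illustrative
    assumptions, not the study's actual instrument.
    """
    learning_goals: list[str]      # what the student wants to learn this term
    usage_guidelines: list[str]    # self-imposed rules for AI use
    reflections: dict[int, str] = field(default_factory=dict)  # week -> reflection

    def record_reflection(self, week: int, text: str) -> None:
        """Log a reflection at one of the strategic checkpoints."""
        self.reflections[week] = text

    def is_complete(self, checkpoint_weeks: list[int]) -> bool:
        """Completion-only grading: every checkpoint has a reflection,
        regardless of whether the guidelines were actually followed."""
        return all(week in self.reflections for week in checkpoint_weeks)

# Example: an assumed schedule of three checkpoints across an 11-week term.
contract = GenAIContract(
    learning_goals=["Write recursive functions without AI-generated solutions"],
    usage_guidelines=["Use ChatGPT only to explain error messages, not to write code"],
)
contract.record_reflection(4, "Kept my guideline except during the project crunch.")
print(contract.is_complete(checkpoint_weeks=[4, 8, 11]))  # False: weeks 8 and 11 missing
```

Note that nothing in `is_complete` inspects the content of a reflection: that mirrors the completion-only grading described above, where the contract rewards self-awareness rather than enforcing adherence.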
While the intervention was successful in raising conscious awareness, it revealed the significant challenge of translating that awareness into consistent behavior. For 58% of students, the act of drafting the contract changed how they thought about AI and gave them a helpful accountability structure. However, the study found that many students who valued their self-imposed guidelines still abandoned them under deadline pressure or in response to peers' behavior. The researchers concluded that maintaining intentionality requires constant self-control across hundreds of micro-decisions, a burden that proved unsustainable for many, even with heightened awareness. The work highlights the tension between the ease of using AI and the cognitive effort required to use it strategically for learning, suggesting educators need to support student agency rather than simply restrict tool access.
- A non-binding "GenAI Contract" changed how 58% of the 217 students in an intermediate Python course thought about their AI use.
- The contract asked students to set learning goals and personal AI-use guidelines but was graded only for completion, not adherence.
- The study found that awareness alone often failed to prevent students from abandoning their guidelines under academic and social pressure.
Why It Matters
Offers educators a scalable, student-centered framework for integrating AI tools that fosters intentional learning rather than relying on outright restriction.