Research & Papers

Beyond One-Size-Fits-All Exercises: Personalizing Computer Science Worksheets with Large Language Models

Personalized exercises cut drop-off rates for low-knowledge learners by 25 percentage points

Deep Dive

A new study from the University of Toronto presents the FACET system (Framework for Adaptive Content using Educational Technology), which uses large language models to generate personalized worksheets for first-year computer science students. In a mixed-methods experiment with 409 students learning regular expressions (RegEx), the researchers classified learners into four profiles based on knowledge and motivation. The LLM then tailored the difficulty pacing, structural scaffolding, and motivational tone of each exercise—while keeping the core content aligned with Bloom’s Taxonomy and Self-Determination Theory.
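The paper's pipeline can be imagined roughly as follows. This is a hypothetical sketch, not the authors' code: the profile names, thresholds, adaptation rules, and prompt wording are all illustrative assumptions about how a knowledge × motivation classification might drive an instructor-facing LLM prompt.

```python
# Illustrative sketch of profile-based worksheet personalization.
# All profile labels, thresholds, and prompt text are assumptions,
# not taken from the FACET paper.

# Adaptation rules per (knowledge, motivation) profile: pacing,
# structural scaffolding, and motivational tone vary; core content does not.
PROFILES = {
    ("low", "low"):   {"pacing": "gradual", "scaffolding": "worked examples plus step-by-step hints", "tone": "encouraging"},
    ("low", "high"):  {"pacing": "gradual", "scaffolding": "step-by-step hints", "tone": "neutral"},
    ("high", "low"):  {"pacing": "moderate", "scaffolding": "light hints", "tone": "encouraging"},
    ("high", "high"): {"pacing": "steep", "scaffolding": "minimal", "tone": "neutral"},
}

def classify(knowledge: float, motivation: float, threshold: float = 0.5) -> tuple:
    """Bucket a learner into one of four profiles (hypothetical 0-1 scores)."""
    k = "high" if knowledge >= threshold else "low"
    m = "high" if motivation >= threshold else "low"
    return (k, m)

def build_prompt(topic: str, profile: tuple) -> str:
    """Assemble an instructor-facing LLM prompt that adapts difficulty
    pacing, scaffolding, and tone while keeping learning objectives fixed."""
    p = PROFILES[profile]
    return (
        f"Generate a {topic} worksheet covering the standard learning objectives. "
        f"Use {p['pacing']} difficulty pacing and {p['scaffolding']} as scaffolding; "
        f"keep the tone {p['tone']}. Do not remove or simplify core content."
    )

# Example: a low-knowledge/low-motivation learner gets the most scaffolded variant.
profile = classify(knowledge=0.3, motivation=0.2)
prompt = build_prompt("RegEx", profile)
print(prompt)
```

The design point this illustrates is the paper's key constraint: only the presentation dimensions vary by profile, while the prompt pins the core content, which is how "desirable difficulty" is preserved.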

The results were striking. Standard non-adaptive exercises left 25–30% of low-knowledge students unable to complete the task. With LLM-personalized worksheets, completion rates jumped to over 99% across all learner profiles. Among low-knowledge/low-motivation students, correctness improved by 18.2 percentage points. Survey data showed that students valued structural scaffolding (logical sequence, difficulty pacing) far more than motivational tone, and they perceived the adaptive tasks as equally challenging as standard ones. The authors conclude that instructor-facing LLM personalization primarily prevents task abandonment among at-risk students without diluting “desirable difficulty,” effectively closing engagement gaps in introductory CS courses.

Key Points
  • 409 CS1 students tested on RegEx; standard worksheets left 25–30% of low-knowledge students unable to finish, while LLM personalization raised completion above 99% across all profiles.
  • Low-knowledge/low-motivation students scored 18.2 percentage points higher on correctness with personalized materials.
  • Students prioritized structural scaffolding over motivational tone; adaptive tasks maintained perceived challenge.

Why It Matters

LLMs can close retention gaps in CS1 by tailoring exercises to student profiles without sacrificing rigor.