Research & Papers

Designing Transformational Games to Support Socio-ethical Reasoning about Generative AI

New study uses social deduction and constraint-based gameplay to foster critical AI literacy in young people.

Deep Dive

A team of researchers from Carnegie Mellon University and the University of Colorado Boulder has published a novel study exploring how to make AI ethics education engaging for young people. The paper, titled 'Designing Transformational Games to Support Socio-ethical Reasoning about Generative AI,' introduces two purpose-built games: 'Diversity Duel' and 'Secret Agent.' These games integrate actual generative AI tools into their mechanics, moving beyond abstract lectures to create hands-on, interactive learning experiences about the societal impacts of AI.

The study investigated three core gameplay elements: peer evaluation, constraint-based creativity, and social deduction. Researchers found that these mechanics successfully prompted participants to recognize and debate bias in AI outputs, such as stereotyped image generations. More importantly, players began connecting these algorithmic patterns to broader real-world inequities and developed a nuanced understanding of how user prompts directly influence AI behavior. The work demonstrates that well-designed, group-based games can be a powerful tool for building critical AI literacy—a skill encompassing not just technical knowledge but also an understanding of AI's limitations and ethical dimensions.

Key Points
  • Researchers designed two games, 'Diversity Duel' and 'Secret Agent,' that integrate real GenAI tools into gameplay to teach ethics.
  • The study identified three key effective elements: peer evaluation, constraint-based creativity, and social deduction mechanics.
  • Participants successfully learned to identify AI bias, link it to societal inequities, and understand the role of prompt engineering.

Why It Matters

Provides a scalable, engaging model for teaching the next generation to critically evaluate AI's societal impact, not merely use the technology.