The First Generation of AI-Assisted Programming Learners: Gendered Patterns in Critical Thinking and AI Ethics of German Secondary School Students
A new study of GenAI in secondary schools finds that 16- to 19-year-olds use AI-generated code without understanding it.
A new study by researcher Isabella Graßl examines how the first generation of students learning to program alongside Generative AI (GenAI) tools like GitHub Copilot and ChatGPT actually engages with them. The exploratory research involved 84 German secondary school students aged 16-19 in software development workshops, focusing on their critical thinking practices, perceptions of AI ethics, and gender-related differences. This represents a notable shift from prior research, which focused on university students or professional developers, by capturing the next cohort of software engineers at a formative stage.
The findings reveal what researchers term an 'AI paradox.' While students demonstrated strong ethical reasoning and awareness about AI's societal impacts, many reported integrating AI-generated code into their projects without thoroughly understanding it. This disconnect between ethical awareness and practical application highlights a significant gap in current educational approaches. The study also found distinct gendered patterns: boys reported more frequent and experimental use of AI-assisted programming tools, while girls expressed greater skepticism toward AI assistance and emphasized peer collaboration as their preferred learning method.
Additionally, the research uncovered culturally specific attitudes toward responsibility. The majority of German students attributed primary responsibility for AI practices to politics and corporations, potentially reflecting Germany's strong regulatory culture and ongoing public discourse around data privacy (e.g., the GDPR). This contrasts with findings from other cultural contexts where individual responsibility may be emphasized more heavily. The study concludes that software engineering education needs to become more culturally responsive and explicitly link ethical concepts to concrete code artifacts, preparing young learners for an AI-driven development landscape where critical AI literacy is essential.
- 84 German students aged 16-19 showed an 'AI paradox': strong ethical awareness but frequent use of AI-generated code without understanding it.
- Clear gendered patterns emerged: boys used AI tools more experimentally, while girls were more skeptical and preferred peer collaboration.
- Most students placed responsibility for AI practices on politics and corporations, reflecting Germany's strong regulatory and data privacy culture.
Why It Matters
The study reveals how the next generation of developers is being shaped by AI tools, highlighting critical gaps between ethics education and practical coding habits.