Usability Evaluation and Improvement of a Tool for Self-Service Learning Analytics
A no-code SSLA tool's usability was rigorously tested, leading to concrete design improvements for educators.
A research team led by Shoeb Joarder, Mohamed Amine Chatti, and Louis Born has published a detailed study on improving the usability of a specialized tool for education. Their paper, accepted at CSEDU 2026, focuses on the 'Indicator Editor,' a no-code platform designed to let educators and administrators create custom learning analytics metrics without programming skills. The core challenge they address is that while such Self-Service Learning Analytics (SSLA) tools promise accessibility, their real-world adoption hinges entirely on being intuitive for non-experts.
The researchers employed a multi-stage, iterative evaluation process. This included an initial qualitative user study, expert inspections of high-fidelity prototypes, and a crucial workshop-based evaluation in an authentic educational setting with 46 students. They used industry-standard metrics—the System Usability Scale (SUS), User Experience Questionnaire (UEQ), and Net Promoter Score (NPS)—to gather quantitative and qualitative data on the tool's strengths and weaknesses.
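The paper's score formulas are not reproduced here, but the metrics it names are standardized and their scoring rules are well documented. As a minimal sketch (not the authors' analysis code), SUS and NPS are typically computed like this:

```python
def sus_score(responses):
    """System Usability Scale score (0-100) from ten 1-5 Likert responses.

    Odd-numbered items are positively worded and contribute (response - 1);
    even-numbered items are negatively worded and contribute (5 - response).
    The sum (0-40) is scaled by 2.5 to yield a 0-100 score.
    """
    assert len(responses) == 10, "SUS has exactly ten items"
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

def nps(ratings):
    """Net Promoter Score from 0-10 likelihood-to-recommend ratings.

    Promoters rate 9-10, detractors 0-6; passives (7-8) count only in
    the denominator. NPS = % promoters minus % detractors (-100 to 100).
    """
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# An ideal SUS response pattern (5 on positive items, 1 on negative) scores 100:
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```

The UEQ is scored differently (26 semantic-differential items averaged into six scales against a benchmark dataset), so it is omitted from this sketch.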
Based on this testing, the team derived concrete, actionable design implications, targeting three areas: stronger workflow guidance, clearer user feedback, and better information presentation within the Indicator Editor. The study moves beyond theory, offering a practical roadmap for developers. It translates usability principles into specific features that can make complex data analysis accessible, ultimately aiming to empower educators with data-driven insights they can generate themselves.
- The team's 'Indicator Editor' is a no-code tool for creating custom learning analytics indicators, targeting non-technical users like teachers and admins.
- Usability was tested with 46 students in a real-world workshop using standardized scores (SUS, UEQ, NPS), providing robust, quantitative feedback.
- Concrete design improvements were identified, focusing on three key areas: better workflow guidance, enhanced user feedback, and improved information presentation.
Why It Matters
The study provides a tested framework for building genuinely usable no-code analytics tools, empowering educators to leverage data without relying on IT.