Designing Social Robots with Ethical, User-Adaptive Explainability in the Era of Foundation Models
New paper tackles the 'black box' problem as LLMs and foundation models drive increasingly adaptive robot behavior.
A new research paper from Fethiye Irmak Doğan, Alva Markelius, and Hatice Gunes tackles the critical challenge of explainability in social robots powered by foundation models. As large language models (LLMs) and other foundation models become embedded in robots, they mediate not just actions but also long-term user adaptation, creating a 'black box' problem. The authors argue that traditional, one-size-fits-all explanation strategies are now dangerously inadequate, because they wrap generic justifications around behavior generated from vast, opaque training data. They position ethical, user-adapted explainability as a core design objective that must be addressed from the start.
The paper identifies key challenges where both adaptation and explanation are delegated to foundation models, leading to potential ethical pitfalls. To address this, the researchers propose four concrete recommendations: moving towards user-adapted explanations, making them modality-aware, involving users in co-design, and grounding strategies in smaller, fairer datasets. An illustrative use case involving an LLM-driven socially assistive robot demonstrates how these principles can be applied in sensitive real-world domains like healthcare or education. This work, presented at the ACM/IEEE HRI conference, provides a crucial roadmap for ensuring transparency and trust as robots become more intelligent and personalized.
- Identifies the failure of generic explanations for robots using adaptive foundation models like LLMs.
- Proposes four design recommendations for ethical, user-adapted, and co-designed explainability strategies.
- Illustrates the framework with a use case for LLM-driven socially assistive robots in sensitive domains.
Why It Matters
Ensures trust and safety as adaptive AI becomes central to robots interacting with people in healthcare, education, and homes.