Designing with Medical Mistrust: Perspectives from Black Older Adults in Publicly Subsidized Housing
New research from CHI 2026 shows how to build trustworthy health technology with Black older adults whose medical mistrust is grounded in historical and structural inequity.
A team from Georgia Tech and Emory University has published research at the ACM CHI 2026 conference that tackles a critical blind spot in health technology design: medical mistrust. Their paper, "Designing with Medical Mistrust: Perspectives from Black Older Adults in Publicly Subsidized Housing," argues that mistrust is not a problem to be solved but a rational, protective response to historical and structural inequities. Through in-depth interviews in the Southern U.S., the researchers center the lived experiences of a community often excluded from technology design, moving beyond superficial "cultural sensitivity" to address foundational barriers to trust.
The study identifies three core themes from the community: deep skepticism of healthcare's financial motivations, critical questions about the true intentions behind health AI systems, and the need for accreditation and embodiment in care. Using Black Feminist Thought as a framework, the authors translate these findings into actionable design principles for health self-management technologies. Crucially, they also provide a reflective exercise that asks researchers and designers to examine their own positionality, challenging the field of Human-Computer Interaction (HCI) to move from extraction to partnership. The work sets a precedent for ethically engaging communities with historically grounded mistrust, with direct implications for how future AI-driven health apps, wearables, and diagnostic tools are built and introduced.
- Study based on interviews with Black older adults in Southern U.S. subsidized housing, a group often excluded from tech design
- Identifies 3 key community concerns: skepticism of financial motives, questions about AI intentions, and need for accredited, embodied care
- Proposes design principles using Black Feminist Thought and includes a researcher positionality exercise for ethical community engagement
Why It Matters
Provides a framework for building health AI that communities will actually trust and use, moving beyond technical fixes to confront historical inequity.