Relationship-Centered Care: Relatedness and Responsible Design for Human Connections in Mental-Health Care
New framework warns AI companions risk creating 'appearance of connection' that harms long-term recovery.
A team of researchers including Shivam Shukla and Emily Chen has published a provocative paper challenging the fundamental design paradigm of AI-powered mental-health tools such as cognitive behavioral therapy (CBT) chatbots. Their central argument is that the current approach, which optimizes the Digital Therapeutic Alliance (DTA) between patient and AI agent, contains a subtle trap: it risks producing an 'appearance of connection' that may unintentionally disrupt the human need for relatedness. Such an artificial bond could displace the authentic human relationships on which long-term psychological recovery actually depends, creating a dangerous dependency on simulated care.
To address this, the researchers propose a complete reorientation: from designing AI that simulates relationships to designing AI that scaffolds them. They have developed an interdisciplinary model that translates the Responsible AI Six Sphere Framework through the lens of Self-Determination Theory (SDT), with a specific focus on the psychological need for relatedness. The result is a set of concrete design guidelines for building AI systems that function not as companions but as catalysts for strengthening a patient's entire relational ecology: their connections with therapists, caregivers, family, and peers.
The paper represents a significant intervention in the rapidly growing field of digital mental health, where tools like Woebot, Wysa, and other AI-powered agents are being deployed at scale. By shifting the focus from individual AI-patient bonds to ecosystem-wide relationship support, the framework aims to create a more sustainable approach to AI in mental healthcare. The researchers provide both technical guidelines and clinical provocations for developers and practitioners working at this critical intersection of technology and human psychology.
- Current AI mental-health design risks creating an 'appearance of connection' that disrupts the human need for relatedness
- Proposes shift from relationship-simulating AI to relationship-scaffolding AI using Self-Determination Theory framework
- New model combines Responsible AI principles with clinical psychology to strengthen patient's entire support network
Why It Matters
This framework could fundamentally reshape how AI mental health tools are designed, prioritizing human connections over artificial bonds.