We Need Granular Sharing of De-Identified Data, but Will Patients Engage? Investigating Health System Leaders' and Patients' Perspectives on a Patient-Controlled Data-Sharing Platform
A study of 523 patients and 16 health system leaders exposes a fundamental tension in patient-controlled data sharing.
A multi-university research team led by Xi Lu and Di Hu has published a study investigating the viability of patient-controlled data-sharing platforms for medical research. The team, which includes researchers from UC Irvine and UC San Diego, developed a high-fidelity web prototype and conducted a two-phase, mixed-methods investigation: semi-structured interviews with 16 health system leaders and a survey of 523 patient participants, probing their perspectives on sharing granular, de-identified health data.
The study, accepted for presentation at the ACM CSCW 2026 conference, revealed a meaningful divergence in stakeholder views. Both groups appreciated the platform's potential to enhance patient autonomy and transparency. However, health system leaders primarily interpreted transparency and granular control through the lens of informed consent and institutional ethics (a top-down, compliance-focused perspective), whereas patients viewed these same features as bottom-up safeguards against potential risks and uncertainties about how their data might be used.
These findings underscore critical tensions at the heart of health data ecosystems, particularly between individual control and the integrity of large-scale research. The researchers conclude that successful system design must move beyond a one-size-fits-all approach. They propose building context-aware platforms that support flexible data-sharing granularity, provide ongoing 'benefit-centered' transparency (clearly explaining how data use helps research), and adapt to diverse user literacy and privacy needs to foster genuine trust and engagement.
- The study involved a survey of 523 patients and semi-structured interviews with 16 health system leaders.
- A core finding was the divergent view of 'transparency': leaders saw it as an ethical duty, while patients saw it as a personal risk mitigation tool.
- The research offers concrete design implications for building trustworthy systems, emphasizing flexible control and benefit-centered communication.
Why It Matters
This research identifies the core trust barriers that must be solved for patient-centric health AI and data-driven research to succeed at scale.