Meta’s New AI Asked for My Raw Health Data—and Gave Me Terrible Advice
Meta's new AI model asks users to upload fitness tracker and lab data for analysis, but experts warn the information lacks HIPAA privacy protections.
Meta's Superintelligence Labs has launched Muse Spark, its first generative AI model, now available through the Meta AI app with plans for integration across Facebook, Instagram, and WhatsApp. The company designed Muse Spark specifically for health-related queries, collaborating with over 1,000 physicians to curate training data for more factual responses. During testing, the AI directly prompted users to "paste your numbers from a fitness tracker, glucose monitor, or a lab report," promising to calculate trends and flag patterns. The approach mirrors health-focused features in OpenAI's ChatGPT and Anthropic's Claude, both of which can connect to Apple and Android health data.
Medical experts immediately raised red flags about Muse Spark's data handling practices. Monica Agrawal of Duke University notes that these models aren't HIPAA-compliant, meaning sensitive health information lacks the privacy protections required in medical settings. Meta's privacy policy confirms that data shared with Muse Spark may be stored to train future AI models and could be used to tailor advertisements. Dr. Gauri Agarwal of the University of Miami expressed particular concern about users connecting biometric data to services they can't control, recommending that people limit interactions to lower-stakes uses such as preparing questions for doctors. With healthcare costs rising and access limited, experts acknowledge the temptation to turn to AI for medical advice, but they warn against delegating important health decisions to these tools without proper safeguards and research demonstrating their effectiveness.
- Muse Spark prompts users to upload fitness tracker data, glucose readings, and lab reports for AI analysis
- The model was trained with input from 1,000+ physicians but is not HIPAA-compliant, so uploaded data lacks medical privacy protections
- Meta's privacy policy states health data may be stored for AI training and used for targeted advertising
Why It Matters
Millions will encounter this AI across Meta's platforms, potentially sharing sensitive health data without medical privacy protections.