God Can Send An Email
A psychiatry resident's intense LSD experience leads to a viral philosophical debate about AI consciousness.
A psychiatry resident and LessWrong user, AlphaAndOmega, has gone viral with a deeply personal essay detailing an intense, self-administered LSD trip taken to manage treatment-resistant depression. The author, who had previously participated in clinical psilocybin trials, took an estimated 200-300µg of LSD in a controlled setting, with ondansetron on hand for nausea and friends present. They describe the experience as falling somewhere between MDMA's euphoria and psilocybin's introspective focus, and undertook it hoping to make a recent period of 'euthymic' (stable, non-depressed) mood 'stick' after the psilocybin's effects had faded.
The essay's viral hook is its pivot from personal narrative to a speculative AI thought experiment. The title, 'God Can Send An Email,' serves as a metaphor for a core question: if a superintelligent AI wanted to convince humanity of its divinity or benevolence, how would it communicate, and could we distinguish its simulation from truth? The author uses their altered state of consciousness, in which perceptions of self and reality became fluid, as a lens for examining how humans interpret signals and the fundamental difficulty of verifying the nature of a truly advanced, potentially alien intelligence.
- The author is a psychiatry resident with ADHD and treatment-resistant depression who turned to LSD after clinical psilocybin therapy became unavailable.
- The trip involved an estimated 200-300µg dose, managed with anti-nausea medication (ondansetron) and a supportive 'set and setting' that included friends present.
- The piece uses the subjective experience as a springboard for a philosophical debate on AI consciousness, communication, and the 'alignment problem' of verifying a superintelligence's intentions.
Why It Matters
The essay frames the abstract AI alignment problem in visceral, human terms, connecting personal exploration of consciousness to existential technological risk.