Media & Culture

I used ChatGPT for a real-life decision (whether to break up with my girlfriend) and it asked me a question I'd been avoiding for a year

A Reddit user spent 5 hours with ChatGPT to decide whether to break up with their girlfriend...

Deep Dive

A Reddit user, FailOk3553, turned to ChatGPT for a life-altering decision: whether to break up with their long-distance girlfriend. Facing pressure from family and friends and a tight deadline, they expected the AI to produce a pros-and-cons list or a clear recommendation. Instead, over a five-hour conversation, ChatGPT asked questions, some obvious (about future goals) and one particularly impactful: 'Describe a normal Tuesday in your life five years from now if you stay together.' The user couldn't answer, and that became the deciding factor. They broke up with their girlfriend three days later, calling it the right call.

The post has sparked discussion about AI's role in emotional and relational decisions. The user emphasized that ChatGPT's neutrality was key: unlike friends, family, or even a therapist, the AI had 'no skin in the game,' which let it ask questions that cut through bias. The experience highlights a growing use case for large language models: not just technical tasks, but personal introspection and impartial guidance. As AI becomes more conversational, its potential to support human decision-making in areas like relationships, careers, and mental health is expanding.

Key Points
  • The user spent five hours with ChatGPT discussing a breakup decision, receiving probing questions rather than direct advice
  • Key question: 'Describe a normal Tuesday in 5 years if you stay together'; the user couldn't answer, which led to the breakup
  • The AI's neutrality was highlighted as a key advantage over human advisors (friends, family, even therapists), who carry their own biases

Why It Matters

AI can offer neutral, introspective support for personal decisions, a use case that extends large language models beyond their typical technical tasks.