Media & Culture

Anybody else noticed that ChatGPT never uses memories, about me, or instructions anymore?

Users say ChatGPT ignores saved memories and custom instructions, asking questions already answered in user profiles.

Deep Dive

A growing number of ChatGPT users are reporting a critical breakdown in the AI's personalization capabilities. According to viral reports on platforms like Reddit, OpenAI's chatbot has stopped referencing user-provided 'memories', custom instructions, and 'About Me' profile details. Users have documented instances where ChatGPT asks questions that are directly answered in their saved memories, such as the backstory for a Pokémon nickname, indicating the system is not accessing this stored contextual data.

This failure appears systemic, affecting all personalization settings within the user's account. Key features like 'Reference saved memories' and 'Reference chat history' are reportedly enabled but non-functional. The problem seems to have emerged around the beginning of the year, suggesting a possible silent update or backend change by OpenAI that inadvertently disabled these functions. The issue undermines a fundamental value proposition of ChatGPT Plus: a persistent, personalized AI assistant that remembers user preferences and history across conversations.

Key Points
  • ChatGPT is reportedly ignoring all 'Personalization' settings, including Memories, Custom Instructions, and 'About Me'.
  • The AI asks questions already answered in user profiles, such as the backstory behind a Pokémon nickname.
  • The issue appears to have begun around the start of 2024 and persists even with the relevant toggles enabled.

Why It Matters

This bug erodes trust in paid AI features and degrades the core experience of a personalized assistant.