Media & Culture

PSA: Anyone with a link can view your Granola notes by default

The AI-powered meeting notepad makes notes viewable to anyone with a link and uses data for AI training unless users opt out.

Deep Dive

The AI-powered meeting assistant app Granola has come under scrutiny for default privacy settings that could expose sensitive user data. According to a report by The Verge, the app, which integrates with calendars to record, transcribe, and summarize meetings, sets all user-generated notes to be 'viewable to anyone with the link' by default. As a result, any accidentally shared note link can be opened in a private browser window without logging in, exposing the note's content, owner, and creation date. While full meeting transcripts require app access, bullet-point summaries and key quotes are visible via these public links. The company's support documentation confirms this setting and advises users to manually change their 'Default link sharing' option to 'Private' or 'Only my company' in the settings menu.
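The "open it in a private browser" check described above can be approximated with a short script: request the link with no cookies or Authorization headers, exactly as a logged-out visitor would, and see whether content comes back. This is a generic HTTP sketch, not anything Granola-specific; the URL in the demo is hypothetical.

```python
# Hedged sketch: test whether a shared link is served to an anonymous
# visitor. Uses only the standard library; the example URL is made up.
from urllib import request, error

def status_means_public(status: int) -> bool:
    """Any 2xx response to an unauthenticated request means the page
    is publicly viewable; 401/403 indicate a login or auth wall."""
    return 200 <= status < 300

def is_publicly_viewable(url: str, timeout: float = 10.0) -> bool:
    """Fetch the URL with no cookies or auth headers attached."""
    req = request.Request(url, headers={"User-Agent": "link-check/1.0"})
    try:
        with request.urlopen(req, timeout=timeout) as resp:
            return status_means_public(resp.status)
    except error.HTTPError as exc:
        return status_means_public(exc.code)   # e.g. 401/403 -> False
    except error.URLError:
        return False                            # unreachable host

if __name__ == "__main__":
    # Hypothetical note link; substitute a link you own before running.
    print(is_publicly_viewable("https://example.com/notes/abc123"))
```

If the function returns True for a note link while you are logged out, that note is effectively public.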

In addition to the link-sharing issue, Granola's privacy policy states that it 'may use anonymized data' from non-enterprise users to train its internal AI models by default. Only customers on enterprise plans are automatically opted out of this data usage; individual users must proactively disable it by toggling off the 'Use my data to improve models for everyone' option in settings. The company states it does not share this data with third-party AI firms like OpenAI or Anthropic. These defaults have raised significant security concerns, with at least one major company reportedly barring a senior executive from using the tool. Granola stores notes and transcripts encrypted in an AWS private cloud but does not retain the original meeting audio.

Key Points
  • Default 'public link' setting allows anyone with a URL to view meeting notes without logging in, a major risk for sensitive business discussions.
  • Non-enterprise user data is used by default for internal AI model training; only enterprise clients are auto-opted out.
  • Users must manually change two settings to secure their data: set 'Default link sharing' to 'Private' and toggle off 'Use my data to improve models for everyone'.

Why It Matters

Professionals using AI meeting tools risk exposing confidential discussions and having their data used for training without explicit consent.