
PSA: Anyone with a Link Can View Your Granola Notes by Default
Why It Matters
The default public link setting creates a hidden data‑leak risk for businesses, while unrestricted AI training raises privacy concerns for individual users. Adjusting these defaults is essential to protect confidential information and comply with corporate security policies.
Key Takeaways
- Granola notes are publicly accessible via link by default
- Users must opt out to keep notes private
- AI training uses all non‑enterprise data unless disabled
- Enterprise accounts default to private and no AI training
- Notes are stored encrypted on US‑hosted AWS; audio is not retained
Pulse Analysis
Granola’s default link‑sharing model is a privacy blind spot that many users overlook. Although the app markets itself as a secure AI notepad, any shared note URL can be opened in a private browser window without authentication, exposing meeting summaries and partial transcripts to anyone who obtains the link, whether intentionally or accidentally. For enterprises handling confidential negotiations or proprietary strategies, this inadvertent exposure could lead to competitive intelligence leaks or regulatory breaches, prompting IT departments to scrutinize the app’s configuration before adoption.
Beyond link visibility, Granola uses user‑generated notes to fine‑tune its generative AI, a practice common among SaaS productivity tools. Non‑enterprise users are automatically enrolled in this training loop, so every captured bullet point may contribute to model improvements unless the “Use my data to improve models for everyone” toggle is disabled. Compared with rivals such as Notion or Evernote, which often require explicit consent for model training, Granola’s opt‑out approach places the onus on users to protect their data, raising questions about informed consent and compliance with data‑privacy regulations such as the CCPA and GDPR.
Granola does employ robust security infrastructure: notes are encrypted at rest and in transit within a US‑hosted Amazon Web Services private cloud, and audio recordings are never stored. However, encryption alone does not mitigate the risk of public link exposure. Users should proactively adjust the default sharing setting to “Private” or “Only my company,” disable AI‑training participation, and regularly audit shared links. As AI‑driven productivity tools proliferate, organizations must balance convenience with stringent governance to safeguard intellectual property and maintain regulatory compliance.