Granola AI Note-Taking App Exposes Private Notes to Anyone With a Link by Default
Granola, an AI-powered note-taking app, ships with a significant privacy flaw: notes are not private by default. Despite marketing claims of privacy, the app's default settings make any note viewable by anyone who possesses its shareable link, exposing potentially sensitive meeting summaries and personal memos to anyone who obtains that link. The exposure is compounded by a second default setting that feeds user notes into the company's internal AI training datasets unless the user manually opts out, a practice at odds with the app's stated commitment to user confidentiality.
The app, which positions itself as an 'AI notepad for people in back-to-back meetings,' automatically records audio from calendar-linked meetings and uses AI to generate summarized notes. This core functionality processes highly sensitive corporate and personal discussions. The discovery reveals a stark contradiction between Granola's branding and its operational reality, placing the burden of security entirely on users, who must find and change the relevant privacy settings themselves after signup.
This configuration poses a direct data-leakage risk for professionals in legal, corporate, and entrepreneurial roles who may discuss proprietary strategy, financial details, or confidential personnel matters. The incident invites fresh scrutiny of default settings in productivity AI tools and raises broader questions about data stewardship and informed consent in applications that record ambient workplace conversations. Users are advised to audit their Granola sharing permissions and AI-training consent settings immediately.