
You already knew you were the product. But did you know you're also the teacher? Companies are quietly feeding your emails, your work decisions, your customer interactions, and your daily patterns into AI systems designed to automate exactly what you do. And most people have no idea it's happening. In this episode of Privacy Please, we break down how it works, who's doing it, why your right to delete your own data is functionally broken in the AI era, and what you can actually do about it.

What we cover:
- How "function creep" turns your data into AI training fuel without new consent
- The GitHub policy change happening right now, and how to opt out
- Why employees at Amazon, Google, and JPMorgan described training AI as "building your own coffin"
- The deletion problem: why you can't remove yourself from a trained model
- Practical steps to audit your tools and protect yourself today

Links:
- GitHub opt-out: github.com/settings/copilot/features
- Khan v. Figma lawsuit: rainintelligence.com
- FTC on AI data practices: ftc.gov
- Check your state privacy rights: iapp.org/resources/article/us-state-privacy-legislation-tracker
- Delete old posts: redact.dev

Privacy Please is part of The Problem Lounge network.
🌐 theproblemlounge.com
🎙️ Subscribe on Apple Podcasts, Spotify, or wherever you listen
