Why Pasting Client Data into ChatGPT is a GDPR Liability (and the Fix)
Source: DEV Community
Published as: Ilya, Founder of PrivacyScrubber — privacyscrubber.com

Every week, I watch legal teams, HR professionals, and developers do something that makes compliance officers lose sleep: they paste client files — contracts, resumes, medical records, support tickets — straight into ChatGPT to summarize, draft, or analyze them.

I get it. ChatGPT is genuinely useful. The problem isn't the AI. The problem is the data that rides along with your prompt.

The Legal Reality Nobody Talks About

Let me be specific. Under GDPR Article 28, if you use an AI assistant to process personal data on behalf of clients or employees, you need a Data Processing Agreement (DPA) with that AI provider. OpenAI offers a DPA — but only on their API (not ChatGPT Free), and you still bear the burden of proving lawful processing.

More critically: Article 5(1)(f) requires that personal data be processed with "appropriate security... and protection against unauthorised or unlawful processing." Pasting an unredacted