Research & Papers

PII Shield: A Browser-Level Overlay for User-Controlled Personally Identifiable Information (PII) Management in AI Interactions

Researchers' open-source tool brings enterprise-grade privacy controls to consumer AI interactions.

Deep Dive

Researchers Max Holschneider and Saetbyeol Lee have introduced PII Shield, a first-of-its-kind browser extension designed to give users control over their personal data during AI interactions. As AI chatbots increasingly serve as confidants and therapists, users are sharing vast amounts of sensitive personally identifiable information (PII) with opaque corporate platforms. PII Shield addresses this by applying enterprise-grade redaction techniques locally in the browser, anonymizing details like names, locations, and financial data before they are sent to services like ChatGPT or Claude.

The tool operates through two key mechanisms. First, it performs local entity anonymization, scanning and redacting PII directly on the user's device so that sensitive details never leave the browser in the clear. Second, it employs 'smokescreens': autonomous agents that generate decoy queries and interactions to disrupt third-party profiling and advertising tracking. This dual approach lets users maintain privacy without sacrificing the utility of AI assistants for personal or therapeutic conversations. The system is available as a free, open-source implementation on GitHub, making sophisticated privacy technology accessible to non-technical users for the first time.
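The local redaction step can be pictured as a pattern-based pass over the prompt before it leaves the browser. The sketch below is a minimal illustration using regex detectors for emails and phone numbers; the detector set, placeholder format, and function names are assumptions for illustration, not PII Shield's actual implementation (which the article does not detail).

```javascript
// Illustrative PII patterns; a real tool would use broader entity
// recognition (names, locations, account numbers, etc.).
const PATTERNS = [
  { label: "EMAIL", re: /\b[\w.+-]+@[\w-]+\.[\w.]+\b/g },
  { label: "PHONE", re: /\b\d{3}[-.]\d{3}[-.]\d{4}\b/g },
];

// Replace each detected entity with a typed placeholder, entirely
// on-device, before the prompt is forwarded to the AI service.
function redactPII(text) {
  let out = text;
  for (const { label, re } of PATTERNS) {
    out = out.replace(re, `[${label}]`);
  }
  return out;
}

// redactPII("Email me at jane@example.com or call 555-123-4567")
//   → "Email me at [EMAIL] or call [PHONE]"
```

The key property is that only the redacted string ever crosses the network boundary; the mapping from placeholders back to real values, if kept at all, stays on the user's device.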

Key Points
  • Uses local entity anonymization to redact PII on-device before data reaches AI services
  • Deploys 'smokescreen' autonomous agents to generate decoy activity and disrupt third-party profiling
  • Free, open-source browser extension brings enterprise-grade privacy controls to consumer AI chats
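The smokescreen mechanism described above can be sketched as a background agent that emits decoy traffic at randomized intervals, so trackers observe a noisy blend of real and fabricated interests. Everything here is a hypothetical illustration under that description: the topic pool, `pickDecoy`, and `startSmokescreen` are invented names, not PII Shield's API.

```javascript
// Illustrative pool of innocuous decoy queries.
const DECOY_TOPICS = [
  "best hiking trails near Denver",
  "how to repot a fiddle-leaf fig",
  "beginner chess openings",
];

// Uniformly pick one decoy query; rng is injectable for testing.
function pickDecoy(rng = Math.random) {
  return DECOY_TOPICS[Math.floor(rng() * DECOY_TOPICS.length)];
}

// Recursively schedule decoys with jittered delays so the timing
// pattern itself does not fingerprint the agent as automated.
function startSmokescreen(send, baseMs = 60_000) {
  const tick = () => {
    send(pickDecoy());
    setTimeout(tick, baseMs * (0.5 + Math.random()));
  };
  setTimeout(tick, baseMs * Math.random());
}
```

In use, `send` would be whatever function issues a query through the extension (e.g., to a search engine or chat service), so the decoys travel the same path trackers observe for real activity.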

Why It Matters

Empowers individuals to use AI for sensitive conversations without surrendering data control to tech companies.