Your LLM Doesn't Need to Know Your Customers' Names
Every enterprise wants to use AI. Most can't - because the data they need to send is the data they're required to protect.
Patient records. Account numbers. Social security numbers. The moment that information hits a third-party LLM, you've got a compliance problem. HIPAA. GDPR. SOX. PCI-DSS. Pick your acronym - the answer is the same: sensitive data can't leave your environment.
Until now, the workarounds have been painful. Build custom middleware. Manually redact documents. Or just avoid AI entirely and watch your competitors move faster.
Today, we're introducing NOI.
One Line of Code. Zero PII Exposure.
NOI is a PII-tokenizing reverse proxy that sits between your application and any LLM provider. It automatically detects sensitive data, replaces it with secure tokens, and forwards a sanitized request to the LLM. When the response comes back, NOI swaps the tokens for real values before your application receives them.
The LLM never sees real data. Your application never knows the difference.
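The round trip is simple to picture. NOI's actual detection engine and token format aren't shown here, so the token shape and the sanitize/restore helpers below are illustrative assumptions, not the shipped implementation:

```python
def sanitize(text, detections):
    """Replace each detected value with a token; keep a per-request map."""
    mapping = {}
    for i, (value, entity_type) in enumerate(detections):
        token = f"<{entity_type}_{i}>"  # hypothetical token shape
        mapping[token] = value
        text = text.replace(value, token)
    return text, mapping

def restore(text, mapping):
    """Swap tokens back to real values before the app sees the response."""
    for token, value in mapping.items():
        text = text.replace(token, value)
    return text

# The LLM would receive: "Email <PERSON_0> at <EMAIL_1>"
out, m = sanitize("Email Jane Doe at jane@x.com",
                  [("Jane Doe", "PERSON"), ("jane@x.com", "EMAIL")])
```

The token map lives only for the lifetime of the request, which is why the application never has to know tokenization happened at all.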
Getting started looks like this:
from openai import OpenAI

# Before - PII goes straight to OpenAI
client = OpenAI(api_key="sk-...")

# After - PII is intercepted and tokenized
client = OpenAI(api_key="sk-...", base_url="https://api.nopii.co/v1")

That's it. No SDK. No middleware rewrite. No changes to your application logic. Change the base_url, and every request is protected.
Smart Tokenization, Not Dumb Redaction
Most approaches to PII protection strip data and leave gaps. NOI does something different.
Its tokenization is deterministic - the same value always produces the same token. That means the LLM can reason consistently across a multi-turn conversation. It can recognize that "Customer A" in message one is the same "Customer A" in message five, without ever seeing a real name.
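A standard way to get deterministic tokens is a keyed hash: the same value and key always yield the same token, but the token reveals nothing about the value. This is an illustration of the technique, not NOI's published internals, and the key-management comment is an assumption:

```python
import hashlib
import hmac

SECRET_KEY = b"replace-with-a-vault-managed-key"  # hypothetical key handling

def tokenize(value: str, entity_type: str) -> str:
    # HMAC-SHA256 keyed hash: deterministic per (key, value), so
    # "Jane Doe" maps to the same PERSON_xxxxxxxx token in every turn.
    digest = hmac.new(SECRET_KEY, value.encode(), hashlib.sha256).hexdigest()
    return f"{entity_type}_{digest[:8]}"
```

Determinism is what lets the model say "as Customer PERSON_ab12cd34 mentioned earlier" and have it mean something across the whole conversation.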
NOI detects 10+ entity types out of the box: names, emails, phone numbers, SSNs, passport numbers, credit cards, bank accounts, IBANs, IP addresses, and more. Admins can toggle entity types on or off and fine-tune detection confidence thresholds to match their risk profile.
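An admin policy like that boils down to a per-entity table of toggles and thresholds. The entity names and threshold values below are hypothetical, sketched to show the shape of the decision, not NOI's configuration schema:

```python
# Hypothetical policy: which entities to tokenize, and how confident
# the detector must be before a match counts.
POLICY = {
    "EMAIL":       {"enabled": True,  "min_confidence": 0.80},
    "CREDIT_CARD": {"enabled": True,  "min_confidence": 0.95},
    "IP_ADDRESS":  {"enabled": False, "min_confidence": 0.90},
}

def should_tokenize(entity_type: str, confidence: float) -> bool:
    rule = POLICY.get(entity_type)
    return bool(rule and rule["enabled"] and confidence >= rule["min_confidence"])
```

Raising a threshold trades fewer false positives for a slightly higher miss risk, which is why it's tuned per entity type rather than globally.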
Fail-Safe, Not Fail-Open
If tokenization fails for any reason, NOI blocks the request. Period. There is no scenario where PII leaks through to an LLM provider. This isn't a best-effort filter - it's a hard gate.
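Fail-closed behavior is easy to state in code: the forward call is only reachable after tokenization succeeds. A minimal sketch, with hypothetical helper names:

```python
class TokenizationError(Exception):
    """Raised when a request is blocked instead of forwarded."""

def guarded_forward(payload, tokenize, forward):
    # Fail closed: any tokenization failure blocks the request.
    # There is no code path that forwards the raw payload.
    try:
        sanitized = tokenize(payload)
    except Exception as exc:
        raise TokenizationError("request blocked: tokenization failed") from exc
    return forward(sanitized)
```

Contrast this with a fail-open filter, where an exception in the detector would let the raw payload fall through to the provider.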
One Proxy. Nine Providers.
NOI works with OpenAI, Anthropic, Google Gemini, xAI, DeepSeek, Mistral, Groq, Together, and Fireworks. One configuration protects your traffic across every provider, with full streaming support for both OpenAI and Anthropic APIs.
Switch models. Switch providers. Your PII protection stays exactly the same.
Full Visibility for Compliance Teams
Every PII detection is logged with the entity type, confidence score, session, provider, model, and timestamp. The admin console gives you real-time analytics on API call volume, entities detected, and token usage - plus a searchable audit log that exports to CSV.
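Because the export is plain CSV, compliance teams can slice it with standard tooling. The column names in this sample are assumed for illustration and may differ from the actual export:

```python
import csv
import io

# Hypothetical export: column names and rows are illustrative.
EXPORT = """timestamp,entity_type,confidence,provider,model,session
2025-01-15T10:02:11Z,SSN,0.99,openai,gpt-4o,sess_01
2025-01-15T10:02:11Z,EMAIL,0.91,openai,gpt-4o,sess_01
2025-01-15T10:05:43Z,CREDIT_CARD,0.97,anthropic,claude-sonnet,sess_02
"""

rows = list(csv.DictReader(io.StringIO(EXPORT)))
ssn_hits = [r for r in rows if r["entity_type"] == "SSN"]
print(len(rows), len(ssn_hits))  # → 3 1
```

The same few lines answer the questions auditors actually ask: which entities were detected, when, and on which provider and model.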
When the auditors come knocking, you'll have the receipts.
Built for Regulated Industries
NOI was designed for environments where data protection isn't optional:
- Healthcare - Protect patient names, SSNs, and medical identifiers in AI-assisted clinical workflows. Supports HIPAA compliance requirements without slowing down care teams.
- Financial services - Mask account numbers, credit cards, and routing numbers in AI-powered reporting. Built for SOX and PCI-DSS regulated environments.
- Legal - Redact client names and case details before AI-assisted document review.
- Customer support - Use AI for ticket routing, summarization, and response drafting without exposing customer data.
Get Started Free
NOI offers a free tier with 1M protected tokens per month - enough to test, prototype, and validate the integration in your environment. When you're ready to scale, the Pro tier starts at $50/month with 50M protected tokens included.
Your customers trust you with their data. NOI makes sure that trust extends to every AI interaction.
Try it yourself
See how NOI detects, tokenizes, and restores personally identifiable information in real time.