Zero-Backend Architecture: Building AI Workbenches That Respect Privacy
Why does every AI tool want your API keys? The default architecture is: user → proxy server → provider API. The proxy sees everything: your keys, your prompts, your business logic. We built AIWorkbench.dev to prove there's a better way.
The Proxy Problem
When you route API calls through a backend, you introduce three failure modes:
- Key Exposure: The server must store or transmit your API key. Even temporarily, it's in their logs, their database backups, and potentially their error tracking (Sentry, Datadog).
- Data Retention: Your prompts pass through the proxy. The provider's privacy policy might promise deletion, but what about the proxy's?
- Single Point of Failure: The proxy goes down, you can't use AI. The proxy gets rate-limited, you get throttled. The proxy changes pricing, you're locked in.
The BYOK (Bring Your Own Key) Model
AIWorkbench.dev is a static Next.js application. It compiles to HTML, CSS, and JavaScript. There is no server component handling your requests.
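Concretely, "static" here is Next.js's static export mode. A minimal config sketch (not necessarily AIWorkbench's exact file):

```typescript
// next.config.ts -- output: "export" tells `next build` to emit plain
// HTML, CSS, and JavaScript with no Node server, which is what
// "no server component handling your requests" means operationally.
const nextConfig = {
  output: "export", // static export: no API routes, no SSR runtime
};

export default nextConfig;
```

With this setting, the build output can be hosted on any static file server or CDN; there is no process anywhere that could observe a request.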
How it works:
- You paste your API key into a browser input field.
- The key is stored in sessionStorage (cleared when the tab closes).
- When you click "Send," the browser initiates a direct HTTPS fetch to api.anthropic.com, api.openai.com, or generativelanguage.googleapis.com.
- The response streams directly back to your browser. No middleman touches the data.
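The flow above can be sketched in a few lines of browser-side TypeScript. This is an illustration, not AIWorkbench's actual code; the storage key name and model id are made up, and the Anthropic endpoint is used as the example provider:

```typescript
// Browser-only global, declared here so the sketch also type-checks
// outside the DOM.
declare const sessionStorage: {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
};

// Pure helper: build the direct-to-provider request. The URL is the
// provider's own endpoint; nothing here points at an app server.
function buildRequest(key: string, prompt: string) {
  return {
    url: "https://api.anthropic.com/v1/messages",
    init: {
      method: "POST",
      headers: {
        "x-api-key": key, // the key travels browser -> provider, nowhere else
        "anthropic-version": "2023-06-01",
        // Anthropic requires this explicit opt-in for browser-origin calls:
        "anthropic-dangerous-direct-browser-access": "true",
        "content-type": "application/json",
      },
      body: JSON.stringify({
        model: "claude-sonnet-4-20250514", // illustrative model id
        max_tokens: 256,
        messages: [{ role: "user", content: prompt }],
      }),
    },
  };
}

// In the browser: the key lives in sessionStorage (cleared on tab
// close), and fetch goes straight to the provider.
async function send(prompt: string): Promise<Response> {
  const apiKey = sessionStorage.getItem("api_key"); // illustrative name
  if (!apiKey) throw new Error("No API key set for this tab");
  const { url, init } = buildRequest(apiKey, prompt);
  return fetch(url, init);
}
```

Because buildRequest is pure, it is also easy to audit: every byte the app sends is assembled in code you can read in your own devtools.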
Security Verification
You don't have to trust us. Open your browser's Network tab. Every request's Host header points to the provider's domain, not ours. Your keys never appear in our Google Analytics, our error logs, or any third-party service.
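You can also spot-check this programmatically from the devtools console using the standard Resource Timing API, which records the URL of every resource the page has fetched:

```typescript
// Collect the distinct hosts this page has made requests to.
// Run in the browser console: every entry should be a provider
// domain or a static-asset host, never an application server.
function requestedHosts(): string[] {
  const entries = performance.getEntriesByType("resource");
  const hosts = entries.map((e) => new URL(e.name).hostname);
  return Array.from(new Set(hosts)).sort();
}
```

It is a coarser check than reading the Network tab request by request, but it gives you a one-line summary of where your data went.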
Trade-Offs
This architecture is not without compromises:
- CORS: Some providers (looking at you, early OpenAI) block browser requests via CORS. The preflight is sent by the browser and approved by the provider's server, so when an endpoint refuses browser origins, there is nothing a client-only app can do: it simply won't work from the browser.
- No Server-Side Features: We can't do server-side caching, rate-limit smoothing, or request batching. Each request is exactly one browser fetch.
- Key Management: Users must manage their own keys. We can't offer "one key for all providers" because that would require... a proxy.
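The CORS limitation also shows up at runtime in an unhelpful way: browsers report a blocked cross-origin fetch as a bare TypeError, deliberately indistinguishable from a network failure so that cross-origin details don't leak. The best a zero-backend app can do is explain both possibilities, sketched here (the wording and function names are ours):

```typescript
// A CORS block and a network drop both surface as TypeError from
// fetch(), so the app can only report the two candidate causes.
function explainFetchError(err: unknown, url: string): string {
  if (err instanceof TypeError) {
    return `Request to ${url} failed before reaching the server: likely ` +
      `a CORS block or a network error. This endpoint may not support ` +
      `direct browser calls.`;
  }
  return err instanceof Error ? err.message : String(err);
}

// Usage in the send path (sketch):
async function directFetch(url: string): Promise<Response> {
  try {
    return await fetch(url);
  } catch (err) {
    throw new Error(explainFetchError(err, url));
  }
}
```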
Why We Chose This Anyway
For developers testing prompts, comparing models, and iterating on AI features, the privacy guarantee outweighs the convenience loss. You are not our product. Your data is not our inventory.
Key Takeaway
The next time an AI tool asks for your API key, ask where it goes. If the answer involves "our servers," you are paying for the privilege of being a data source. Build — or use — tools that keep the pipeline direct.