
Arlopass

AI wallet that lets web apps use your models, not your keys

Chrome Extensions
Open Source
Artificial Intelligence
GitHub

Hunted by David Szakacs

Arlopass is an open-source browser extension and developer SDK that lets any web app use your AI providers — Ollama, Claude, GPT, Bedrock — without ever touching your API keys. You approve each request. You pick the model. Your credentials never leave your device.

Top comment

Hey Product Hunt! 👋 Arlopass is your personal AI pass — a browser extension that lets any web app use your AI providers without ever seeing your API keys.

Here's the problem: every web app with AI features asks you to paste an API key and trust that they won't log it, leak it, or lose it in a breach. Developers burn days building server-side proxies just to keep keys safe. Users get locked into whatever model the app chose. It's a bad architecture that's become the default.

So we built Arlopass. Install the extension, connect your providers (Ollama, Claude, GPT, Bedrock — whatever you use), and now every supported web app can request AI through your pass. You see the request, pick the model, approve it, and the request routes through a local bridge on your machine. Keys stay in your OS keychain. The app gets intelligence. You keep sovereignty.

For developers — it's 10 lines to add AI to your app. connect(), chat.send(), chat.stream(). No API key management. No server proxy. No .env files. TypeScript-first, async iterators, Stripe-level DX.

For privacy folks — it works with fully local models (Ollama, LM Studio). Zero cloud. Zero exposure. Everything stays on your machine.

What's next: We're building React hooks (useChat, useProvider), more community adapters, and working toward a v1.0 protocol spec. We'd love contributors — the adapter contract is a single TypeScript file.

Fully open source. MIT licensed. No accounts. No telemetry. Would love to hear how you're handling AI credentials in your apps today — and what providers you'd want us to support next. Happy to answer any questions!
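To make the developer flow above concrete, here's a minimal sketch of what the "10 lines" could look like. Only connect(), chat.send(), and chat.stream() are named in the post — the package name "@arlopass/sdk", the option names, and the return shapes are assumptions for illustration, not the SDK's actual API:

```typescript
// Hypothetical sketch of the Arlopass flow — package name and
// signatures are assumed; only the three call names come from the post.
import { connect } from "@arlopass/sdk";

async function main() {
  // Triggers the extension's approval prompt; the user picks the
  // provider and model, and the request routes through the local bridge.
  const session = await connect({ app: "my-demo-app" });

  // One-shot completion — no API key ever touches this code.
  const reply = await session.chat.send({
    messages: [{ role: "user", content: "Summarize this page in one line." }],
  });
  console.log(reply.content);

  // Streaming via async iterators, as the post mentions.
  for await (const chunk of session.chat.stream({
    messages: [{ role: "user", content: "Write a haiku about browsers." }],
  })) {
    process.stdout.write(chunk.delta);
  }
}

main();
```

Note that nothing here handles credentials — the extension owns approval and key storage, which is the whole point of the architecture.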

Comment highlights

How does the approval flow work when a web app requests access to an expensive model like Claude Opus? The concept of an AI wallet is really creative, congrats on launching!

This is a massive relief. I’ve spent way too many hours building custom backend proxies just to keep my keys safe, or worse—wasting time manually swapping .env files every time I want to test a new model. It’s such a repetitive time sink that usually kills the momentum of a project.

The local bridge approach is definitely the right architecture. Keeping the keys in the OS keychain instead of the browser's local storage is a huge win for peace of mind. Truly feels like the 'missing link' for privacy-first AI apps. Congrats on the launch!

This is so helpful! I was paying for both the OpenAI API and a ChatGPT subscription, so this should help cut down the cost. Thanks!

I use a bunch of different AI-powered web tools and every single one wants my API key, which means I'm trusting random apps with my credentials and have zero control over which model they actually use. Having a browser extension that acts as a wallet where I approve each request and pick the model myself makes a lot more sense from a security standpoint. Does it work with local Ollama models too, or mainly cloud providers?