1. What is Paywalls.ai?
Paywalls.ai is a programmable paywall and usage-based billing proxy for OpenAI-compatible APIs. It lets you monetize any LLM without changing your frontend code: simply route your API requests through Paywalls.ai instead of calling the model provider directly.
OpenAPI spec: https://api.paywalls.ai/v1/openapi.yml
2. Create Your Account & API Key
- Sign up at Paywalls.ai Dashboard
- Create a Paywall in the dashboard
- Copy your API Key (you'll use this in the `Authorization` header)
3. Point Your App to the Proxy
Replace your OpenAI endpoint and key with the Paywalls.ai endpoint and your API key.

**Example (Node.js with the `openai` npm package):**
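A minimal sketch, assuming the proxy's base URL is `https://api.paywalls.ai/v1` (taken from the OpenAPI spec URL above), that your Paywalls.ai API key is stored in the `PAYWALLS_API_KEY` environment variable, and with an illustrative model name:

```js
import OpenAI from "openai";

// Point the standard OpenAI client at the Paywalls.ai proxy instead of
// api.openai.com. The base URL and env var name are assumptions for this sketch.
const client = new OpenAI({
  baseURL: "https://api.paywalls.ai/v1",
  apiKey: process.env.PAYWALLS_API_KEY,
});

const completion = await client.chat.completions.create({
  model: "gpt-4o-mini", // illustrative model name
  messages: [{ role: "user", content: "Hello!" }],
  // Identifies the end user being billed; can also be sent as an
  // X-Paywall-User header (see the note below).
  user: "user-123",
});

console.log(completion.choices[0].message.content);
```

The rest of your application code stays the same; only the client configuration changes.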
Note: You can pass the user ID in the request body (`user` property) or in an `X-Paywall-User` header. Learn more about User Identification.

4. How It Works (In Short)
- Proxy receives your request
- Identifies the user and checks their paywall authorization and balance
- If not authorized → returns an authorization link (see the handling sketch after this list)
- If balance too low → returns a topup link
- If authorized → forwards the request to the LLM provider
- Bills in real time based on actual usage (tokens, per-request fees, or both)
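The exact response format for unauthorized or underfunded users isn't detailed here. Purely as an illustrative sketch, assuming the proxy rejects the request with an error whose message carries the authorization or topup link, a client could surface it to the end user like this:

```js
import OpenAI from "openai";

const client = new OpenAI({
  baseURL: "https://api.paywalls.ai/v1", // assumed proxy base URL
  apiKey: process.env.PAYWALLS_API_KEY,
});

try {
  const completion = await client.chat.completions.create({
    model: "gpt-4o-mini", // illustrative model name
    messages: [{ role: "user", content: "Hello!" }],
    user: "user-123",
  });
  console.log(completion.choices[0].message.content);
} catch (err) {
  if (err instanceof OpenAI.APIError) {
    // Assumption: the authorization or topup link is included in the error
    // returned by the proxy. Show it to the user so they can pay and retry.
    console.log("Payment action required:", err.message);
  } else {
    throw err;
  }
}
```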
5. Billing Options
You can configure each paywall or model to bill by:
- Per request — fixed price per API call
- Per token — based on measured prompt & completion tokens
- Manual charges — via `/user/charge` (see the sketch after this list)
- Subscriptions — handled externally or via custom logic
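For manual charges, here is a hypothetical sketch of calling the `/user/charge` endpoint directly. The base URL, the body fields (`user`, `amount`), and the unit of the amount are assumptions, not the documented schema; check the OpenAPI spec linked above for the actual request format.

```js
// Charge a specific user outside the per-request/per-token flow.
const res = await fetch("https://api.paywalls.ai/v1/user/charge", {
  method: "POST",
  headers: {
    Authorization: `Bearer ${process.env.PAYWALLS_API_KEY}`,
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    user: "user-123", // end user to bill (assumed field name)
    amount: 0.05,     // charge amount (assumed field name and unit)
  }),
});

console.log(res.status, await res.json());
```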
6. Benefits
- Drop-in compatible with the OpenAI API
- No custom billing or metering code required
- Works with any frontend/backend stack
- Supports pay-per-message, microtransactions, and token quotas
- Use with code or no-code tools (Zapier, n8n, Flowise, etc.)