Overview
The Paywall Proxy sits between your backend and the actual LLM API provider (e.g., OpenAI, Anthropic, or OpenRouter). It enforces billing, usage limits, and authentication while preserving OpenAI API compatibility.

Endpoint
Replace OpenAI's endpoint with `https://api.paywalls.ai/v1`.

Authentication
Use the `Authorization: Bearer YOUR_API_KEY` header to authenticate your app (not the user).
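For example, with the official OpenAI Python SDK you can point the client at the proxy and supply your Paywall API key (a minimal sketch; `YOUR_API_KEY` is a placeholder):

```python
from openai import OpenAI

# Point the standard OpenAI client at the Paywall Proxy instead of
# api.openai.com; the API key identifies your app, not the end user.
client = OpenAI(
    base_url="https://api.paywalls.ai/v1",
    api_key="YOUR_API_KEY",  # your app's Paywall API key
)
```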

User Identification
Each request must specify the user, using one of the options below (the first two are sketched after this list):
- Option 1: `user` field in the request body (recommended for OpenAI-compatible clients)
- Option 2: `X-Paywall-User` header (recommended for middleware injection)
- Option 3: `user` URL path parameter, e.g. `https://api.paywalls.ai/v1/{user}` (when no other option works)
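A minimal sketch of Options 1 and 2 with the OpenAI Python SDK; the model name and the `user_123` ID are placeholders:

```python
from openai import OpenAI

client = OpenAI(base_url="https://api.paywalls.ai/v1", api_key="YOUR_API_KEY")

# Option 1: the OpenAI-compatible `user` field in the request body.
response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; use any model your provider exposes
    messages=[{"role": "user", "content": "Hello!"}],
    user="user_123",      # your stable ID for the end user
)

# Option 2: the X-Paywall-User header, e.g. injected by a middleware layer.
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
    extra_headers={"X-Paywall-User": "user_123"},
)
```
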
Behavior
- If the user is not authorized → the proxy returns an authorization link.
- If the user has no balance → the proxy returns a top-up link.
- If both checks pass → the fee is deducted and the request is forwarded to the model provider (see the sketch after this list).
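This section does not pin down the exact response shape for the link cases, so the sketch below assumes the authorization or top-up link comes back as ordinary assistant message content, which your app can simply relay to the end user:

```python
from openai import OpenAI

client = OpenAI(base_url="https://api.paywalls.ai/v1", api_key="YOUR_API_KEY")

# Assumption: when a check fails, the proxy's reply is still a normal
# chat completion whose content is the authorization / top-up link, so
# relaying the content to the user covers all three behaviors above.
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello!"}],
    user="user_123",
)
print(response.choices[0].message.content)  # model output, or a link to follow
```
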
Compatibility
The Paywall Proxy supports most OpenAI-compatible endpoints and features. However, some limitations apply depending on the chosen model provider.

Endpoints
- Chat Completions: Supported, including streaming with `stream: true` (see the sketch after this list)
- Embeddings: Not supported yet (coming soon)
- Images: Not supported yet (coming soon)
- Audio: Not supported yet (coming soon)
- Files / Assistants: Not supported yet (coming soon)
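Since streaming is supported on Chat Completions, the usual `stream: true` flow should work unchanged through the proxy. A sketch; the model name and user ID are placeholders:

```python
from openai import OpenAI

client = OpenAI(base_url="https://api.paywalls.ai/v1", api_key="YOUR_API_KEY")

# Streaming through the proxy works like streaming against OpenAI directly:
# stream=True yields chunks carrying incremental content deltas.
stream = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Tell me a short story."}],
    user="user_123",
    stream=True,
)
for chunk in stream:
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
```
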
Features
- Tools / function calling: Supported; billed per prompt and output tokens (see the sketch after this list)
- JSON mode / structured output: Supported; billed per prompt and output tokens
- Vision inputs: Supported on compatible models
- Logprobs / reasoning tokens: Not supported yet (coming soon)
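As an illustration of function calling through the proxy, requests use the standard OpenAI `tools` format; the `get_weather` tool below is hypothetical:

```python
from openai import OpenAI

client = OpenAI(base_url="https://api.paywalls.ai/v1", api_key="YOUR_API_KEY")

# Function calling uses the standard OpenAI tools format; prompt and
# output tokens are billed as usual. `get_weather` is hypothetical.
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "What's the weather in Paris?"}],
    user="user_123",
    tools=[{
        "type": "function",
        "function": {
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }],
)

# Print any tool calls the model requested (tool_calls may be None).
for call in response.choices[0].message.tool_calls or []:
    print(call.function.name, call.function.arguments)
```
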
Model providers
- OpenAI: Fully supported
- Anthropic: Fully supported
- OpenRouter: Fully supported
- Other OpenAI-compatible providers: Fully supported
- Custom non-OpenAI providers: Not supported yet (coming soon)