The model provider is the underlying AI service that runs the model (for example, OpenAI, Azure OpenAI, Anthropic). The default provider and model are configured in the Dashboard, which controls the available models, credentials, quotas, and routing behavior.
Provider modes:
- Default mode — Use the provider and model configured in the Dashboard, with no additional setup.
- BYOK (bring your own key) — Connect any OpenAI-compatible AI provider you already use. Simply add your API key in the Dashboard and you're ready to go.

Supported providers include:
- OpenAI
- Anthropic
- OpenRouter
- Together AI
- Any OpenAI-compatible endpoint
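Because every supported provider speaks the OpenAI-compatible protocol, a request looks the same no matter which upstream your key belongs to. A minimal Python sketch using only the standard library — the gateway base URL and API key below are placeholders, not real values:

```python
import json
import urllib.request

# Placeholder values — substitute the endpoint and key from your Dashboard.
BASE_URL = "https://api.example-gateway.ai/v1"
API_KEY = "sk-..."

def chat_completion_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a standard OpenAI-style chat-completions request.

    Because the endpoint is OpenAI-compatible, the same payload shape
    works regardless of which provider ultimately serves the model.
    """
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = chat_completion_request("openai/gpt-4o-mini", "Hello!")
# Send with urllib.request.urlopen(req) once BASE_URL and API_KEY are set.
```

Switching providers later only changes the model id in the payload; the request structure and your integration code stay the same.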
Model selection & routing
- You choose a model id (e.g., openai/gpt-4o-mini, openrouter/llama-3.1-70b). We normalize requests and route to the right upstream.
- You can switch providers or models at any time without changing your integration.
- You can also set up custom routing rules to use different providers/models for different users, teams, or use cases (coming soon).
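To make the provider-prefixed model-id convention concrete, here is an illustrative sketch of how such routing could work. The split rule and the upstream table are assumptions for illustration, not the service's actual implementation:

```python
# Hypothetical upstream table — illustrative only.
UPSTREAMS = {
    "openai": "https://api.openai.com/v1",
    "openrouter": "https://openrouter.ai/api/v1",
}

def route(model_id: str) -> tuple[str, str]:
    """Split a 'provider/model' id and look up the upstream base URL.

    Only the first '/' separates provider from model, so model names
    containing slashes would still resolve correctly.
    """
    provider, _, model = model_id.partition("/")
    if provider not in UPSTREAMS:
        raise ValueError(f"unknown provider: {provider}")
    return UPSTREAMS[provider], model

base_url, model = route("openai/gpt-4o-mini")
# base_url → "https://api.openai.com/v1", model → "gpt-4o-mini"
```

Under this scheme, switching a request from one provider to another is just a change of prefix in the model id, which is why the integration code never needs to change.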