The model provider is the underlying AI service that runs the model (for example, OpenAI, Azure OpenAI, Anthropic). The default provider and model are configured in the Dashboard, which controls available models, credentials, quotas, and routing behavior.
The Dashboard supports the following provider modes:
  • Default mode
  • Shared mode
BYOK (bring your own key): Connect any OpenAI-compatible AI provider you already use. Add your API key in the Dashboard and you’re ready to go. Supported providers include:
  • OpenAI
  • Anthropic
  • OpenRouter
  • Together AI
  • Any OpenAI-compatible endpoint
How it works: You pay your AI provider directly for usage. Paywalls bills your end users and passes requests through to your provider.
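As a sketch of the pass-through model, a BYOK request is an ordinary OpenAI-compatible chat completion sent to your provider's endpoint with your own API key. The `PROVIDER_BASE_URLS` table and `build_chat_request` helper below are illustrative, not part of the Paywalls API:

```python
import json

# Illustrative base URLs for some OpenAI-compatible providers
# (assumption for this sketch, not an official list).
PROVIDER_BASE_URLS = {
    "openai": "https://api.openai.com/v1",
    "openrouter": "https://openrouter.ai/api/v1",
    "together": "https://api.together.xyz/v1",
}

def build_chat_request(provider: str, model: str, api_key: str, messages: list[dict]) -> dict:
    """Build an OpenAI-compatible chat completion request for a BYOK provider."""
    base = PROVIDER_BASE_URLS[provider]
    return {
        "url": f"{base}/chat/completions",
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "body": json.dumps({"model": model, "messages": messages}),
    }

req = build_chat_request(
    "openai", "gpt-4o-mini", "sk-YOUR-KEY",
    [{"role": "user", "content": "Hello"}],
)
print(req["url"])  # → https://api.openai.com/v1/chat/completions
```

Because every supported provider speaks the same request shape, only the base URL and key change between providers; the payload your app builds stays identical.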

Model selection & routing

  • You choose a model id (e.g., openai/gpt-4o-mini, openrouter/llama-3.1-70b). We normalize requests and route them to the correct upstream provider.
  • You can switch providers or models at any time without changing your integration.
  • You can also set up custom routing rules to use different providers/models for different users, teams, or use cases (coming soon).