Client-agnostic by design. If your client can call an OpenAI-style HTTP API, it works with Paywalls. In most cases you only change the baseURL and the API key; no new SDK integration or code changes are required.

  • Works with: the official OpenAI SDK, Vercel AI SDK, OpenRouter SDK, community clients, and plain HTTP requests (see the sketch below).
  • Runtimes: edge and serverless runtimes are supported.
  • Compatibility: Chat Completions (including streaming), tools/function calling, and JSON/structured outputs.

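Plain HTTP

If your client has no SDK at all, a plain HTTP request works the same way. Below is a minimal fetch sketch; it assumes the usual OpenAI-style Authorization: Bearer header carries your Paywalls API key, alongside the X-Paywall-User header used throughout these examples.

// Minimal sketch of a raw Chat Completions call (no SDK).
// Assumes the standard "Authorization: Bearer <key>" scheme for the Paywalls key.
const response = await fetch("https://api.paywalls.ai/v1/chat/completions", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    "Authorization": `Bearer ${process.env.PAYWALLS_API_KEY}`,
    "X-Paywall-User": "user_123", // Stable, pseudonymous user ID
  },
  body: JSON.stringify({
    model: "gpt-5",
    messages: [
      { role: "user", content: "Explain usage-based pricing in 2 lines." },
    ],
  }),
});

const data = await response.json();
console.log(data.choices?.[0]?.message?.content);
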
OpenAI SDK

import OpenAI from "openai";

const user = "user_123"; // Stable, pseudonymous user ID

const client = new OpenAI({
  apiKey: process.env.PAYWALLS_API_KEY, // Set your Paywalls API key here
  baseURL: "https://api.paywalls.ai/v1", // Change baseURL to Paywalls
  defaultHeaders: {
    "X-Paywall-User": user, // Identify the end user
  },
});

const stream = await client.chat.completions.create({
  model: "gpt-5",
  user, // Reuse the same stable user ID
  stream: true,
  messages: [
    { role: "user", content: "Explain usage-based pricing in 2 lines." },
  ],
});

for await (const chunk of stream) {
  const delta = chunk.choices?.[0]?.delta?.content;
  if (delta) process.stdout.write(delta);
}

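Tools/function calling and JSON/structured outputs use the standard Chat Completions request shape; nothing Paywalls-specific is added. A minimal sketch, reusing the client and user from the example above (the get_weather tool is a hypothetical illustration, not part of any SDK):

// Tool/function calling through the same Paywalls-configured client.
// "get_weather" is a hypothetical tool used only for illustration.
const completion = await client.chat.completions.create({
  model: "gpt-5",
  user,
  messages: [{ role: "user", content: "What's the weather in Paris today?" }],
  tools: [
    {
      type: "function",
      function: {
        name: "get_weather",
        description: "Get the current weather for a city",
        parameters: {
          type: "object",
          properties: { city: { type: "string" } },
          required: ["city"],
        },
      },
    },
  ],
});

console.log(completion.choices[0]?.message?.tool_calls);
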
Vercel AI SDK with Next.js

import { streamText } from 'ai'
import { createOpenAI } from '@ai-sdk/openai'

export async function POST(request: Request) {
  const { messages } = await request.json()

  const user = "user_123" // Stable, pseudonymous user ID

  const monetizedAI = createOpenAI({
    apiKey: process.env.PAYWALLS_API_KEY, // Your Paywalls API key
    baseURL: 'https://api.paywalls.ai/v1', // Change baseURL to Paywalls
    headers: {
      "X-Paywall-User": user // Identify the end user
    }
  })

  const result = await streamText({
    model: monetizedAI('gpt-5'),
    messages
  })

  return result.toAIStreamResponse()
}

OpenRouter

import { createOpenRouter } from "@openrouter/ai-sdk-provider";
import { streamText } from "ai";

const openrouter = createOpenRouter({
  apiKey: process.env.PAYWALLS_API_KEY, // Your Paywalls API key
  baseURL: "https://api.paywalls.ai/v1", // Change baseURL to Paywalls
  headers: {
    "X-Paywall-User": "user_123", // Stable, pseudonymous user ID
  },
});
const model = openrouter("anthropic/claude-3.7-sonnet:thinking");
const result = await streamText({
  model,
  messages: [{ role: "user", content: "Hello" }],
});

// Consume the stream as it arrives
for await (const text of result.textStream) {
  process.stdout.write(text);
}

Together AI

import Together from "together-ai";

const client = new Together({
  apiKey: process.env.PAYWALLS_API_KEY, // Your Paywalls API key
  baseURL: "https://api.paywalls.ai/v1", // Change baseURL to Paywalls
  defaultHeaders: {
    "X-Paywall-User": "user_123", // Stable, pseudonymous user ID
  },
});

const chatCompletion = await client.chat.completions.create({
  messages: [
    { role: "user", content: "Explain usage-based pricing in 2 lines." },
  ],
  model: "mistralai/Mixtral-8x7B-Instruct-v0.1",
});

console.log(chatCompletion.choices);

Environment Setup

Ensure you have your Paywalls API key set in your environment variables:
PAYWALLS_API_KEY="your_paywalls_api_key_here"
Always send a stable, pseudonymous user ID so usage is attributed to the right end user. For browser-only apps, never expose your Paywalls API key in client code; route requests through a server or edge function instead.
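
One way to keep the key server-side is a small proxy route that injects the Paywalls key and user header before forwarding the browser's request. A minimal sketch follows; the route path, file location, and user lookup are assumptions to adapt to your app, not part of the Paywalls API.

// app/api/chat/route.ts (hypothetical location): forwards chat requests to Paywalls.
export async function POST(request: Request) {
  const body = await request.json();

  // Derive the end-user ID from your own auth/session, never from client input.
  const user = "user_123";

  const upstream = await fetch("https://api.paywalls.ai/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "Authorization": `Bearer ${process.env.PAYWALLS_API_KEY}`, // Key never reaches the browser
      "X-Paywall-User": user,
    },
    body: JSON.stringify(body),
  });

  // Pass the (possibly streaming) response body straight back to the client.
  return new Response(upstream.body, {
    status: upstream.status,
    headers: {
      "Content-Type": upstream.headers.get("Content-Type") ?? "application/json",
    },
  });
}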