OpenAI-Compatible API
AI Gateway provides OpenAI-compatible API endpoints, letting you use multiple AI providers through a familiar interface. You can use existing OpenAI client libraries, switch to the AI Gateway with a URL change, and keep your current tools and workflows without code rewrites.
The OpenAI-compatible API implements the same specification as the OpenAI API.
The OpenAI-compatible API is available at the following base URL:
https://ai-gateway.vercel.sh/v1
The OpenAI-compatible API supports the same authentication methods as the main AI Gateway:
- API key: Use your AI Gateway API key with the `Authorization: Bearer <token>` header
- OIDC token: Use your Vercel OIDC token with the `Authorization: Bearer <token>` header
You only need one of these forms of authentication. If an API key is specified, it takes precedence over any OIDC token, even if the API key is invalid.
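For example, in a direct HTTP request either credential is sent as a bearer token. Here is a minimal sketch using fetch, assuming the AI_GATEWAY_API_KEY environment variable holds your key; an OIDC token would be passed the same way:

// List models with a bearer token; any other endpoint authenticates identically.
const res = await fetch('https://ai-gateway.vercel.sh/v1/models', {
  headers: {
    Authorization: `Bearer ${process.env.AI_GATEWAY_API_KEY}`,
  },
});
console.log(await res.json());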
The AI Gateway supports the following OpenAI-compatible endpoints:
- `GET /models` - List available models
- `GET /models/{model}` - Retrieve a specific model
- `POST /chat/completions` - Create chat completions with support for streaming, attachments, tool calls, and structured outputs
- `POST /embeddings` - Generate vector embeddings
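The model-listing endpoints are shown in detail below. As a sketch of the other two, the standard OpenAI client calls work once the client points at the gateway's base URL; the streaming loop follows the usual OpenAI SDK pattern, and the embedding model slug here is an illustrative assumption (pick a real one from `GET /models`):

import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: process.env.AI_GATEWAY_API_KEY,
  baseURL: 'https://ai-gateway.vercel.sh/v1',
});

// POST /chat/completions with streaming enabled: iterate the streamed chunks.
const stream = await openai.chat.completions.create({
  model: 'anthropic/claude-sonnet-4.5',
  messages: [{ role: 'user', content: 'Tell me a short joke.' }],
  stream: true,
});
for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content ?? '');
}

// POST /embeddings: generate a vector for a piece of text.
const embedding = await openai.embeddings.create({
  model: 'openai/text-embedding-3-small', // illustrative model slug
  input: 'Hello, world!',
});
console.log(embedding.data[0].embedding.length);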
For advanced features, see:
- Advanced configuration - Reasoning, provider options, model fallbacks, BYOK, prompt caching, and extended context
- Image generation - Generate images using multimodal models
- Direct REST API usage - Use the API without client libraries
You can use the AI Gateway's OpenAI-compatible API with existing tools and libraries like the OpenAI client libraries and AI SDK. Point your existing client to the AI Gateway's base URL and use your AI Gateway API key or OIDC token for authentication.
TypeScript:

import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: process.env.AI_GATEWAY_API_KEY,
  baseURL: 'https://ai-gateway.vercel.sh/v1',
});

const response = await openai.chat.completions.create({
  model: 'anthropic/claude-sonnet-4.5',
  messages: [{ role: 'user', content: 'Hello, world!' }],
});

Python:

import os
from openai import OpenAI

client = OpenAI(
    api_key=os.getenv('AI_GATEWAY_API_KEY'),
    base_url='https://ai-gateway.vercel.sh/v1'
)

response = client.chat.completions.create(
    model='anthropic/claude-sonnet-4.5',
    messages=[
        {'role': 'user', 'content': 'Hello, world!'}
    ]
)

To use the AI SDK with the AI Gateway's OpenAI-compatible API, install the @ai-sdk/openai-compatible package.
import { createOpenAICompatible } from '@ai-sdk/openai-compatible';
import { generateText } from 'ai';

const gateway = createOpenAICompatible({
  name: 'openai',
  apiKey: process.env.AI_GATEWAY_API_KEY,
  baseURL: 'https://ai-gateway.vercel.sh/v1',
});

const response = await generateText({
  model: gateway('anthropic/claude-sonnet-4.5'),
  prompt: 'Hello, world!',
});

Retrieve a list of all available models that can be used with the AI Gateway.
GET /models
TypeScript:

import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: process.env.AI_GATEWAY_API_KEY,
  baseURL: 'https://ai-gateway.vercel.sh/v1',
});

const models = await openai.models.list();
console.log(models);

Python:

import os
from openai import OpenAI

client = OpenAI(
    api_key=os.getenv('AI_GATEWAY_API_KEY'),
    base_url='https://ai-gateway.vercel.sh/v1'
)

models = client.models.list()
print(models)

The response follows the OpenAI API format:
{
"object": "list",
"data": [
{
"id": "anthropic/claude-sonnet-4.5",
"object": "model",
"created": 1677610602,
"owned_by": "anthropic"
},
{
"id": "openai/gpt-5.2",
"object": "model",
"created": 1677610602,
"owned_by": "openai"
}
]
}

Retrieve details about a specific model.
GET /models/{model}
`model` (required): The model ID to retrieve (e.g., `anthropic/claude-sonnet-4.5`)
TypeScript:

import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: process.env.AI_GATEWAY_API_KEY,
  baseURL: 'https://ai-gateway.vercel.sh/v1',
});

const model = await openai.models.retrieve('anthropic/claude-sonnet-4.5');
console.log(model);

Python:

import os
from openai import OpenAI

client = OpenAI(
    api_key=os.getenv('AI_GATEWAY_API_KEY'),
    base_url='https://ai-gateway.vercel.sh/v1'
)

model = client.models.retrieve('anthropic/claude-sonnet-4.5')
print(model)

The response follows the OpenAI API format:

{
"id": "anthropic/claude-sonnet-4.5",
"object": "model",
"created": 1677610602,
"owned_by": "anthropic"
}

The API returns standard HTTP status codes and error responses:
- 400 Bad Request: Invalid request parameters
- 401 Unauthorized: Invalid or missing authentication
- 403 Forbidden: Insufficient permissions
- 404 Not Found: Model or endpoint not found
- 429 Too Many Requests: Rate limit exceeded
- 500 Internal Server Error: Server error
{
"error": {
"message": "Invalid request: missing required parameter 'model'",
"type": "invalid_request_error",
"param": "model",
"code": "missing_parameter"
}
}
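When you call the API through the OpenAI client libraries, these responses surface as API error exceptions. A minimal sketch in TypeScript, assuming the openai package's standard error class:

import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: process.env.AI_GATEWAY_API_KEY,
  baseURL: 'https://ai-gateway.vercel.sh/v1',
});

try {
  await openai.chat.completions.create({
    model: 'anthropic/claude-sonnet-4.5',
    messages: [{ role: 'user', content: 'Hello, world!' }],
  });
} catch (err) {
  // Non-2xx responses are wrapped in APIError; the HTTP status and the
  // parsed error body (message, type, param, code) are available on the error.
  if (err instanceof OpenAI.APIError) {
    console.error(err.status, err.message);
  } else {
    throw err;
  }
}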