OpenAI Codex

Last updated January 16, 2026

OpenAI Codex is OpenAI's agentic coding tool. You can configure it to use Vercel AI Gateway, enabling you to:

  • Route requests through multiple AI providers
  • Monitor traffic and spend in your AI Gateway Overview
  • View detailed traces in Vercel Observability under AI
  • Use any model available through the gateway

You can configure Codex to use AI Gateway through its configuration file or command-line arguments. The configuration file approach is recommended for persistent settings.
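In addition to the persistent configuration shown below, the Codex CLI accepts inline config overrides for one-off runs. The `-c key=value` flag here is a sketch based on recent CLI versions; confirm the exact syntax with `codex --help`. Once the provider is configured in step 3 below, an override like this switches models for a single session:

    codex -c model="zai/glm-4.7"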

  1. Follow the installation instructions on the OpenAI Codex site to install the Codex CLI tool.

    OpenAI Codex also offers a VS Code extension if you prefer an IDE-integrated experience.

  2. Set your AI Gateway API key in your shell configuration file, for example in ~/.zshrc or ~/.bashrc:

    export AI_GATEWAY_API_KEY="your-ai-gateway-api-key"

    After adding this, reload your shell configuration:

    source ~/.zshrc  # or source ~/.bashrc
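
    To confirm the variable is visible to new shells without printing the key itself, you can check for its presence:

    echo "${AI_GATEWAY_API_KEY:+AI_GATEWAY_API_KEY is set}"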
  3. Create or edit the Codex configuration file at ~/.codex/config.toml:

    ~/.codex/config.toml
    profile = "default"
     
    [model_providers.vercel]
    name = "Vercel AI Gateway"
    base_url = "https://ai-gateway.vercel.sh/v1"
    env_key = "AI_GATEWAY_API_KEY"
    wire_api = "chat"
     
    [profiles.default]
    model_provider = "vercel"
    model = "openai/gpt-5.2-codex"

    The configuration above:

    • Sets up a model provider named vercel that points to the AI Gateway
    • References your AI_GATEWAY_API_KEY environment variable
    • Creates a default profile that uses the Vercel provider
    • Specifies openai/gpt-5.2-codex as the default model
  4. Start Codex with your new configuration:

    codex

    Your requests will now be routed through Vercel AI Gateway. You can verify this by checking your AI Gateway Overview in the Vercel dashboard.
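
    As an additional sanity check, you can call the gateway directly from the terminal. This sketch assumes the OpenAI-compatible /v1/chat/completions path, which is what the wire_api = "chat" setting above points Codex at:

    curl https://ai-gateway.vercel.sh/v1/chat/completions \
      -H "Authorization: Bearer $AI_GATEWAY_API_KEY" \
      -H "Content-Type: application/json" \
      -d '{"model": "openai/gpt-5.2-codex", "messages": [{"role": "user", "content": "Say hello"}]}'

    A JSON chat completion in the response confirms that your key and base URL work before you rely on Codex itself.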

  5. You can use any model available through the AI Gateway by updating the model field in your profile. Here are some examples:

    ~/.codex/config.toml
    [profiles.default]
    model_provider = "vercel"
    model = "zai/glm-4.7"
    # Or try other models:
    # model = "kwaipilot/kat-coder-pro-v1"
    # model = "minimax/minimax-m2.1"
    # model = "anthropic/claude-sonnet-4.5"

    Models vary widely in their support for tools, extended thinking, and other features that Codex relies on. Performance may differ significantly depending on the model and provider you select.
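
    To see which model IDs your gateway currently offers, you can query the model listing endpoint. The /v1/models path is an assumption based on the OpenAI-compatible base URL, so treat this as a sketch:

    curl https://ai-gateway.vercel.sh/v1/models \
      -H "Authorization: Bearer $AI_GATEWAY_API_KEY"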

  6. You can define multiple profiles for different use cases:

    ~/.codex/config.toml
    profile = "default"
     
    [model_providers.vercel]
    name = "Vercel AI Gateway"
    base_url = "https://ai-gateway.vercel.sh/v1"
    env_key = "AI_GATEWAY_API_KEY"
    wire_api = "chat"
     
    [profiles.default]
    model_provider = "vercel"
    model = "openai/gpt-5.2-codex"
     
    [profiles.fast]
    model_provider = "vercel"
    model = "openai/gpt-4o-mini"
     
    [profiles.reasoning]
    model_provider = "vercel"
    model = "openai/o1"

    Switch between profiles using the --profile flag:

    codex --profile fast
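
    Profiles also compose with an initial prompt, and you can wrap common combinations in shell aliases (the alias names below are purely illustrative):

    codex --profile reasoning "review the changes in this repo"

    # Optional shortcuts in ~/.zshrc or ~/.bashrc
    alias codex-fast='codex --profile fast'
    alias codex-deep='codex --profile reasoning'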
