Migrate from OpenAI

Change the base URL. That's the migration. Your messages, parameters, streaming, and error-handling stay identical.

Before

from openai import OpenAI

client = OpenAI(api_key="sk-...")

resp = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": "Hello"}],
)

After

from openai import OpenAI

client = OpenAI(
    api_key="kr-your-key",
    base_url="https://api.kairosroute.com/v1",  # ← one line
)

resp = client.chat.completions.create(
    model="auto",  # ← optional: let the router pick
    messages=[{"role": "user", "content": "Hello"}],
)

Want to pin a specific model?

Pass its ID (e.g., claude-opus-4-6, gpt-4o) instead of auto. Full list at GET /v1/models.

A few details

Can I run both in parallel while I cut over?

Yes. Keep your OpenAI client for some traffic, route the rest through us, and compare receipts in the dashboard before cutting over completely.
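One way to run the split is a deterministic percentage rollout keyed on user ID, so each user always hits the same side and comparisons stay apples-to-apples. A sketch under our own assumptions: `use_kairosroute` and the hashing scheme are illustrative, not part of any API.

```python
import hashlib

def bucket(user_id: str) -> int:
    # Map a user id to a stable bucket in [0, 100) via a hash,
    # so the same user always lands in the same bucket.
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    return int(digest, 16) % 100

def use_kairosroute(user_id: str, rollout_pct: int) -> bool:
    # True for roughly rollout_pct percent of users; pick the
    # KairosRoute client when this returns True, your existing
    # OpenAI client otherwise.
    return bucket(user_id) < rollout_pct
```

Ramp `rollout_pct` up as the dashboard numbers check out, then delete the branch.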

Will my error handling still work?

Same status codes, same error shapes. Two additions: 402 when a cost-cap guardrail blocks a request, and 429 when a hard-cap is enabled and the allotment is exhausted.
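Because the shapes match, an existing handler keeps working; the two new codes just need their own branches. A minimal sketch — the `classify` helper and its action strings are ours for illustration, not SDK features:

```python
def classify(status: int) -> str:
    # Hypothetical dispatch on the HTTP status code of a failed request.
    if status == 402:
        # A cost-cap guardrail blocked the request: retrying won't
        # help until the cap is raised or resets.
        return "halt: cost cap hit"
    if status == 429:
        # Hard-cap allotment exhausted, or ordinary rate limiting:
        # back off and retry later.
        return "retry: back off"
    if 500 <= status < 600:
        # Upstream provider trouble: safe to retry.
        return "retry: provider error"
    # Anything else (400, 401, 404, ...) is a caller bug: surface it.
    return "raise"
```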

Tool calls, JSON mode, vision?

All passthrough — we proxy the feature flags straight to the provider. If a model supports it, you get it.
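Since the flags pass straight through, the standard OpenAI function-calling payload works unchanged; `get_weather` and its schema below are invented for illustration.

```python
# Standard OpenAI tools payload; nothing KairosRoute-specific.
# "get_weather" is a made-up example tool.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

# Passed as-is in the request, e.g.:
#   client.chat.completions.create(model="auto", messages=..., tools=tools)
```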

Ready when you are.

Grab a key and paste in the new base URL.

Get Your API Key →