Same OpenAI SDK you already use. Point base_url at api.tokenroute.io and swap the API key — done. Python, JavaScript, Go, or any other client that speaks the OpenAI API.
Set model: "tokenroute-auto" and we pick frontier models for hard queries, small fast models for trivial ones. Typical mixed traffic lands 40–80% cheaper than always-GPT-4.
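A back-of-envelope sketch of where savings like that come from, using hypothetical per-1M-token prices (illustrative only, not TokenRoute's actual rates or routing ratios):

```python
# Hypothetical prices, $ per 1M tokens -- illustrative only.
FRONTIER_PRICE = 10.00  # hypothetical frontier model
SMALL_PRICE = 0.50      # hypothetical small, fast model

def blended_cost(tokens_millions, frontier_share):
    """Cost when frontier_share of tokens go to the frontier model
    and the rest are routed to the small model."""
    return tokens_millions * (frontier_share * FRONTIER_PRICE
                              + (1 - frontier_share) * SMALL_PRICE)

always_frontier = blended_cost(100, 1.0)  # 100M tokens, everything frontier
routed = blended_cost(100, 0.3)           # only 30% of queries are "hard"
savings = 1 - routed / always_frontier    # roughly two-thirds cheaper here
print(f"always-frontier: ${always_frontier:.2f}, routed: ${routed:.2f}")
```

The exact savings depend entirely on your traffic mix; the 40–80% range above corresponds to different shares of hard queries.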
No subscriptions, no minimums. Top up with a card, watch per-request cost in your dashboard in real time, withdraw unused balance any time.
from openai import OpenAI

client = OpenAI(
    api_key="sk-tr-…",  # from /dashboard/keys
    base_url="https://api.tokenroute.io/v1",
)

response = client.chat.completions.create(
    model="tokenroute-auto",  # smart routing
    messages=[{"role": "user", "content": "Summarize this commit log."}],
)
print(response.choices[0].message.content)

Any OpenAI client works — Node.js, Go, Ruby, curl. Streaming, function calling, JSON mode, vision — all passed through unchanged.
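Streamed responses come back as the same chunk iterator the OpenAI SDK uses. A sketch of consuming one, where join_stream is a hypothetical helper and the demo iterates stand-in chunk objects instead of making a live request (with a real client you would pass stream=True to create() and iterate the returned stream the same way):

```python
from types import SimpleNamespace

def join_stream(chunks):
    """Accumulate the text deltas from a streamed chat completion."""
    parts = []
    for chunk in chunks:
        delta = chunk.choices[0].delta.content
        if delta is not None:  # the final chunk's delta is typically empty
            parts.append(delta)
    return "".join(parts)

# Stand-in chunks shaped like the SDK's stream objects, for illustration:
demo = [SimpleNamespace(choices=[SimpleNamespace(delta=SimpleNamespace(content=t))])
        for t in ("Hel", "lo", None)]
print(join_stream(demo))  # Hello
```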
You pay the published list price from each upstream provider (OpenAI, Anthropic, and more), billed per token as you use it. Credits you top up go directly against that cost — no minimums, no monthly fee. Balance you haven't spent stays available indefinitely.
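Per-token billing is just token counts times the list price. A sketch with two hypothetical models and made-up prices (real billing uses each upstream provider's published rate card, and token counts come back in the response's usage field):

```python
# Illustrative per-1M-token list prices for two hypothetical models --
# not real rates.
PRICES = {
    "frontier-model": (5.00, 15.00),  # ($ input, $ output) per 1M tokens
    "small-model": (0.15, 0.60),
}

def request_cost(model, prompt_tokens, completion_tokens):
    """Cost of one request: token counts times the per-token list price."""
    input_price, output_price = PRICES[model]
    return (prompt_tokens * input_price
            + completion_tokens * output_price) / 1_000_000

# e.g. 1,200 prompt tokens in, 300 completion tokens out:
print(f"${request_cost('frontier-model', 1_200, 300):.6f}")  # $0.010500
```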
Create an account