Use GPT-5.4 Mini and Nano on AI Gateway
GPT-5.4 Mini and GPT-5.4 Nano from OpenAI are now available on Vercel AI Gateway. Both models deliver state-of-the-art performance for their size class in coding and computer use, and are built for sub-agent workflows where multiple smaller models coordinate on parts of a larger task.
The models also support the verbosity and reasoning level parameters, giving you control over response detail and how much the model reasons before answering.
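Through the gateway's OpenAI-compatible API, those two controls surface as request-body fields. A minimal sketch of such a request body; the parameter names (`verbosity`, `reasoning_effort`) and accepted values are assumptions based on OpenAI's GPT-5-era API, so verify them against the current Gateway and OpenAI docs:

```typescript
// Sketch of a chat-completions request body with the two tuning knobs.
// Field names `verbosity` and `reasoning_effort` are assumptions here,
// modeled on OpenAI's GPT-5 API; check the docs for exact names/values.
const body = {
  model: "openai/gpt-5.4-mini",
  messages: [{ role: "user", content: "Summarize this diff." }],
  verbosity: "low",           // assumed: "low" | "medium" | "high" — response detail
  reasoning_effort: "medium", // assumed: how much the model reasons before answering
};
```

Lower verbosity and reasoning effort generally trade answer depth for latency and token cost, which matters most in the high-volume workflows described below.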
GPT-5.4 Mini
GPT-5.4 Mini handles code generation, tool orchestration, and multi-step browser interactions more reliably than previous mini-tier models. It's a strong default for agentic tasks that need to balance capability and cost. To use this model, set model to openai/gpt-5.4-mini in the AI SDK.
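As a concrete sketch, here is one way to call the model through the gateway's OpenAI-compatible chat endpoint using plain fetch. The base URL and environment-variable name are assumptions, not taken from this post; with the AI SDK itself you would simply pass the model string shown above:

```typescript
// Minimal sketch of calling GPT-5.4 Mini via AI Gateway's
// OpenAI-compatible endpoint. The base URL and AI_GATEWAY_API_KEY
// env-var name are assumptions; adapt to your Gateway configuration.
const MODEL = "openai/gpt-5.4-mini";

async function complete(prompt: string): Promise<string> {
  const res = await fetch("https://ai-gateway.vercel.sh/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.AI_GATEWAY_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: MODEL,
      messages: [{ role: "user", content: prompt }],
    }),
  });
  const data = await res.json();
  // OpenAI-compatible responses put the text at choices[0].message.content.
  return data.choices[0].message.content;
}
```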
GPT-5.4 Nano
GPT-5.4 Nano performs close to GPT-5.4 Mini in evaluations at a lower price point. The model is well-suited for high-volume use cases like sub-agent workflows where cost scales with the number of parallel calls. To use this model, set model to openai/gpt-5.4-nano in the AI SDK.
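The sub-agent pattern described above, where cost scales with the number of parallel calls, can be sketched as a simple fan-out. The `callModel` helper here is hypothetical, standing in for whichever gateway client you use:

```typescript
// Sketch of a sub-agent fan-out: split a task into subtasks and run each
// on GPT-5.4 Nano in parallel, so total cost scales with the call count.
// `CallModel` / `callModel` are hypothetical stand-ins for a real client.
type CallModel = (model: string, prompt: string) => Promise<string>;

async function fanOut(
  callModel: CallModel,
  subtasks: string[],
): Promise<string[]> {
  return Promise.all(
    subtasks.map((task) => callModel("openai/gpt-5.4-nano", task)),
  );
}
```

Because the subtasks run concurrently, wall-clock latency stays close to that of a single call while the per-call price of Nano keeps the total spend low.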
AI Gateway provides a unified API for calling models, tracking usage and cost, and configuring retries, failover, and performance optimizations for higher-than-provider uptime. It includes built-in observability, Bring Your Own Key support, and intelligent provider routing with automatic retries.
Learn more about AI Gateway, view the AI Gateway model leaderboard, or try it in our model playground.