Resolving 'Model openai-codex/gpt-5.4 is not allowed' in OpenClaw
TL;DR: Quick Fix
The error occurs because your cached session credentials lack the updated model scope. Running a re-authentication command forces OpenClaw to fetch the correct permissions.
Fix now, then reduce repeat incidents
If this issue keeps coming back, validate your setup in Doctor first, then harden your config.
Error Signal
Model "openai-codex/gpt-5.4" is not allowed.
What's Happening
You are seeing the "Model 'openai-codex/gpt-5.4' is not allowed" error because your local OpenClaw environment is still referencing old session data that doesn't include the new model permissions. Even if you have updated to the latest OpenClaw source or stable release, your existing credentials haven't refreshed to recognize that you are authorized for the GPT-5.4 model via the OpenAI gateway.
The Fix
Before you dive into logs, try a simple re-authentication. This forces OpenClaw to ping the upstream provider and update your model capability list.
Run this command in your terminal:
openclaw models auth login --provider openai-codex --method oauth
After you finish the login flow, check your model list again to confirm it shows up:
openclaw models list
You should see openai-codex/gpt-5.4 in the output. If it still does not appear, make sure you are on the latest release by running openclaw update, then attempt the re-auth again.
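If you want to verify the result in a script rather than by eye, a small check like the following can help. This is a sketch, not part of OpenClaw: the listing format (one model id per line) is an assumption, and in practice you would capture the real output of openclaw models list instead of the sample string used here.

```python
# Hypothetical helper: confirm a model id appears in `openclaw models list`
# output. The one-id-per-line format below is an assumed example, not
# documented OpenClaw behavior.
def model_available(model_id: str, listing: str) -> bool:
    """Return True if model_id appears as a whole line in the listing."""
    return any(line.strip() == model_id for line in listing.splitlines())

# In a real script you would capture the output, e.g.:
#   import subprocess
#   listing = subprocess.run(["openclaw", "models", "list"],
#                            capture_output=True, text=True).stdout
sample = "openai-codex/gpt-5.3-codex\nopenai-codex/gpt-5.4\n"
print(model_available("openai-codex/gpt-5.4", sample))  # True
```

Exact-line matching avoids false positives from similar ids (for example, a substring match would wrongly accept "gpt-5.4" against "gpt-5.4-mini").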
Why This Occurs
OpenClaw maintains a local cache of allowed models fetched during your initial OAuth handshake. When new models like GPT-5.4 roll out upstream, your local client doesn't automatically purge the old model list. The software defaults to the last known 'safe' version (usually GPT-5.3-codex) and strictly denies any model string not found in that outdated cached list to prevent invalid API calls.
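The deny-by-default behavior described above can be illustrated with a short sketch. This is not OpenClaw's actual code; the class, field names, and default model are assumptions used to show the logic: any model string absent from the stale cached allowlist is rejected until a refresh replaces the list.

```python
# Illustrative sketch of deny-by-default allowlisting (not OpenClaw source).
from dataclasses import dataclass, field

@dataclass
class ModelCache:
    # Stale cache from the initial OAuth handshake: only the old model.
    allowed: set[str] = field(
        default_factory=lambda: {"openai-codex/gpt-5.3-codex"}
    )

    def resolve(self, requested: str) -> str:
        # Strictly deny anything not in the cached list.
        if requested not in self.allowed:
            raise PermissionError(f'Model "{requested}" is not allowed.')
        return requested

    def refresh(self, fresh_allowlist: set[str]) -> None:
        # Re-authentication replaces the stale cache with the upstream list.
        self.allowed = fresh_allowlist

cache = ModelCache()
try:
    cache.resolve("openai-codex/gpt-5.4")  # stale cache -> denied
except PermissionError as exc:
    print(exc)

cache.refresh({"openai-codex/gpt-5.3-codex", "openai-codex/gpt-5.4"})
print(cache.resolve("openai-codex/gpt-5.4"))  # allowed after refresh
```

The re-auth command in the fix above plays the role of refresh() here: it does not change your request, it changes the list your request is checked against.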
Prevention
OpenClaw's model routing is evolving rapidly. While the team works on better automated cache invalidation, you can proactively avoid this by re-authenticating whenever you notice major model releases upstream or after performing a major openclaw update. If you are using a custom CLI backend like openai-cli-anka, ensure your backend is updated separately, as those interfaces often handle their own model allowlisting independently of the OpenClaw core configuration.
Last Updated: March 2026