Resolving Telegram Word-by-Word Response Lag in OpenClaw
TL;DR: Quick Fix
The sluggish output is caused by a deliberate change in version 2026.2.15 that raised the default streaming throttle from 300ms to 1000ms; the only reliable fix at present is to roll back to 2026.2.14.
Next Step
Fix now, then reduce repeat incidents
If this issue keeps coming back, validate your setup in Doctor first, then harden your config.
Error Signal
dmPolicy: pairing
What's Happening
After updating to OpenClaw 2026.2.15, your Telegram bot responses feel broken. The first word appears instantly, but the rest of the message trickles in word-by-word, creating severe latency that makes the bot feel unresponsive.
The Fix
Because this is an intentional change in the core code rather than a configuration bug, no settings file can "fix" it. Your best option is to roll back to version 2026.2.14 until the maintainers ship a custom throttle configuration option.
To roll back, pull the previous stable tag:
docker stop openclaw
docker rm openclaw
docker pull openclaw/openclaw:2026.2.14
docker run -d --name openclaw openclaw/openclaw:2026.2.14

Note: the old container must be removed (docker rm) before re-running with the same --name, or Docker will refuse to start the new one due to a name conflict.
Why This Occurs
This behavior stems from commit a69e82765. To avoid hitting Telegram API rate limits during heavy usage, the developers increased the default streaming throttle:
- Version <= 2026.2.14: 300ms default / 50ms min interval
- Version >= 2026.2.15: 1000ms default / 250ms min interval
While this prevents API rate-limit errors, it drastically slows the perceived responsiveness. The 'pairing' dmPolicy is unaffected and remains active; the output speed is simply capped by the higher interval constants.
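The impact of those constants is easy to quantify. Assuming the throttle simply coalesces partial-message edits into at most one Telegram edit per interval (a simplification of OpenClaw's internals; the function below is illustrative, not its actual code), the edit cadence changes like this:

```python
import math

def count_edits(stream_ms: int, throttle_ms: int) -> int:
    """How many message edits a streamed reply produces when edits
    are coalesced to at most one per throttle interval."""
    return math.ceil(stream_ms / throttle_ms)

# A reply that takes 6 seconds to generate:
old = count_edits(6000, 300)   # <= 2026.2.14 default: 20 edits, fresh text every 0.3 s
new = count_edits(6000, 1000)  # >= 2026.2.15 default: 6 edits, fresh text only every 1 s
print(old, new)
```

The same generation now produces roughly a third as many visible updates, each gap over three times longer, which is why the reply appears to trickle in.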
Prevention
Track the GitHub issue #18269 for updates. The community is currently pushing for a user-definable throttle setting in the channels.telegram config block. Avoid upgrading to new beta releases if you rely on high-speed Telegram interactions until this setting is merged.
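If the proposed setting is merged, it would presumably sit alongside the existing options in the channels.telegram block. The key name below is purely hypothetical, a sketch of what the community is asking for in issue #18269, not a setting that exists today:

```json
{
  "channels": {
    "telegram": {
      "dmPolicy": "pairing",
      "streamThrottleMs": 300
    }
  }
}
```

Check the issue thread for the actual key name and accepted values before adding anything like this to your config.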
Last Updated: March 2026