OpenClaw 3.7 is the biggest release since the rebrand. Models, localization, stability, plugin architecture—everything got touched.
## Model Ecosystem: GPT-5.4 and Gemini 3.1 Flash-Lite on Day One
OpenAI shipped GPT-5.4 on March 5, 2026. OpenClaw merged support the next day. Google dropped the Gemini 3.1 Flash-Lite Preview on March 3. OpenClaw had it running within 48 hours.
That speed isn't people refreshing Twitter—it's the Model Providers layer doing what it was designed to do. Adding a new model is a well-scoped contribution: clean interfaces, clear docs. The moment a provider publishes an API spec, someone in the community opens a PR. It looks like vendor coordination from the outside, but it's really just a contributor pipeline that works.
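OpenClaw's actual provider interfaces aren't shown here, but the shape of the contribution pipeline can be sketched. The sketch below is a minimal, hypothetical version of a provider registry: the `ModelProvider` protocol, `EchoProvider`, and `register` names are illustrative, not the project's real API. The point is that a new model lands as one small class plus one registration call.

```python
from dataclasses import dataclass
from typing import Protocol


@dataclass
class ChatResponse:
    text: str
    model: str


class ModelProvider(Protocol):
    """Minimal provider contract: one method, no framework internals leaked."""
    name: str

    def complete(self, prompt: str, model: str) -> ChatResponse: ...


class EchoProvider:
    """Toy provider standing in for a real API client."""
    name = "echo"

    def complete(self, prompt: str, model: str) -> ChatResponse:
        return ChatResponse(text=f"[{model}] {prompt}", model=model)


REGISTRY: dict[str, ModelProvider] = {}


def register(provider: ModelProvider) -> None:
    """Adding a new model provider is the whole 'day-one support' contribution."""
    REGISTRY[provider.name] = provider


register(EchoProvider())
print(REGISTRY["echo"].complete("hi", "gpt-5.4").text)  # [gpt-5.4] hi
```

With a contract this narrow, a PR adding a new lab's models touches one file and nothing in the core, which is why the turnaround can be measured in days.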
## Why It Matters
If you're picking an agent framework, few things hurt more than a hot new model dropping and your framework not supporting it for weeks. OpenClaw has quietly built a track record of day-one availability, and that matters:
| Model | Provider Release | OpenClaw Support | Delta |
|---|---|---|---|
| GPT-5.4 | March 5 | March 6 | 1 day |
| Gemini 3.1 Flash-Lite | March 3 | March 5 | 2 days |
Clean provider abstractions plus a community large enough to cover every major lab. That's the whole trick.
## Multilingual UI: Spanish, German, and Better Search
The Control UI now speaks Spanish and German, contributed by @DaoPromociones via PR #35038. Locale detection, lazy-loaded translation bundles, language picker labels—the full package.
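The mechanics of locale detection and lazy bundle loading can be sketched in a few lines. This is an illustrative model only, not the Control UI's actual implementation: the `detect_locale`, `t`, and `BUNDLE_FILES` names are hypothetical, and a real UI would fetch translation files over the network rather than hold them in a dict.

```python
import json

# Hypothetical in-memory stand-in for per-locale translation files.
BUNDLE_FILES = {
    "en": '{"greeting": "Hello"}',
    "es": '{"greeting": "Hola"}',
    "de": '{"greeting": "Hallo"}',
}

_loaded: dict[str, dict] = {}  # bundles parsed so far


def detect_locale(accept_language: str, supported=("en", "es", "de")) -> str:
    """Pick the first supported language from an Accept-Language-style string."""
    for part in accept_language.split(","):
        code = part.split(";")[0].strip().split("-")[0].lower()
        if code in supported:
            return code
    return "en"


def t(key: str, locale: str) -> str:
    """Translate `key`, parsing the locale's bundle only on first use."""
    if locale not in _loaded:
        _loaded[locale] = json.loads(BUNDLE_FILES[locale])
    return _loaded[locale].get(key, key)


print(t("greeting", detect_locale("es-MX,en;q=0.8")))  # Hola
```

Lazy loading matters here because a user in one locale never pays the parse cost for the other thirteen.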
The search engine got smarter too. Spanish, Portuguese, Japanese, Korean, and Arabic stop-words and tokenization are now handled correctly, so multilingual users actually get relevant results back.
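Stop-word handling is easy to get wrong across languages: filtering Spanish text against an English stop-word list leaves noise words in the index. A minimal sketch of per-language tokenization, with deliberately tiny illustrative stop-word sets (real engines ship full lists per language):

```python
import re

# Tiny illustrative stop-word lists; real search engines use much larger sets.
STOP_WORDS = {
    "es": {"el", "la", "de", "y", "en", "los", "las"},
    "en": {"the", "a", "of", "and", "in"},
}


def tokenize(text: str, lang: str) -> list[str]:
    """Lowercase, split on non-word characters, drop that language's stop-words."""
    tokens = re.findall(r"\w+", text.lower())
    stops = STOP_WORDS.get(lang, set())
    return [tok for tok in tokens if tok not in stops]


print(tokenize("La configuración de los agentes", "es"))
# ['configuración', 'agentes']
```

Indexing and querying with the same language-aware tokenizer is what makes results relevant: only the content words survive on both sides of the match.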
## Why Now
282,000+ stars. 1,100+ contributors from every continent. The user base went global a while ago; the tooling was just late to catch up. Adding locales isn't a bet on international growth—it's paying off a debt.
## 200+ Bug Fixes: What Production Traffic Does to Software
Two hundred bug fixes. You can read that as "the software was broken" or "enough people are running it in production to find every edge case." Look at what actually got fixed and judge for yourself:
- **Plugin command validation:** Malformed command specs crashed startup. Now validated at registration.
- **Telegram gateway:** Missing account tokens triggered `token.trim()` errors. Now null-guarded.
- **TLS pairing:** Local self-connections were forced through device pairing, breaking Docker and LAN setups. Local paths now skip it.
- **Config substitution:** Unresolved `${VAR}` placeholders caused hard failures. Now degrades gracefully with warnings, but still won't let unresolved placeholders pass as credentials.
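The config substitution fix is worth a closer look, because "degrade gracefully, except for credentials" is a two-tier policy. The sketch below is a hypothetical reconstruction of that behavior, not OpenClaw's code; the `substitute` function and the credential-key heuristic are assumptions made for illustration.

```python
import re
import warnings

_PLACEHOLDER = re.compile(r"\$\{(\w+)\}")
# Hypothetical heuristic: keys that look like secrets must resolve fully.
_CREDENTIAL_KEYS = ("token", "secret", "password", "api_key")


def substitute(key: str, value: str, env: dict) -> str:
    """Expand ${VAR} placeholders; warn on misses, hard-fail only for credentials."""
    def repl(match: re.Match) -> str:
        name = match.group(1)
        if name in env:
            return env[name]
        warnings.warn(f"unresolved placeholder ${{{name}}} in '{key}'")
        return match.group(0)  # leave the placeholder in place

    result = _PLACEHOLDER.sub(repl, value)
    if _PLACEHOLDER.search(result) and any(k in key.lower() for k in _CREDENTIAL_KEYS):
        raise ValueError(f"credential '{key}' contains unresolved placeholders")
    return result


print(substitute("log_dir", "/var/${APP}/logs", {}))  # warns, keeps placeholder
# substitute("api_token", "${MISSING}", {})           # would raise ValueError
```

The asymmetry is the point: a missing variable in a log path is an annoyance, but a literal `${MISSING}` string silently sent as an API token is a security bug.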
These aren't polish. This is what happens when a project goes from "cool weekend thing" to "thing people depend on."
## ContextEngine: The Real Story of 3.7
The biggest thing in this release is the ContextEngine plugin interface—a slot-based system that lets third-party plugins fully own how session context gets ingested, assembled, and compacted.
We wrote a separate deep-dive covering the architecture, lifecycle hooks, and early ecosystem. The short version: the built-in sliding-window compaction can now be cleanly replaced, and people are already shipping alternatives—Lossless-Claw for DAG-based summarization, MemOS Cloud Plugin for persistent cross-session memory.
If 3.7 is remembered for one thing, this is it.
## By the Numbers
| Metric | Before 3.7 | After 3.7 |
|---|---|---|
| GitHub Stars | 266,500 | 282,300+ |
| Contributors | 1,108 | 1,164 |
| Total Commits | 16,992 | 17,781 |
| Supported Model Providers | 30+ | 30+ (with GPT-5.4, Gemini 3.1) |
| Control UI Languages | 12 | 14 (+es, +de) |
## What's Next
ContextEngine is the foundation. What gets built on it is the interesting part. RAG-aware context assembly, multi-agent shared memory, token-budget-optimized compaction—people are already working on all of these.
3.7 isn't the kind of release that makes headlines with a single flashy feature. It's the kind that clears the path for everything after it. Faster model onboarding, broader language support, a more stable core, and an extensible architecture that the community can actually build on. Not glamorous. Usually the important ones aren't.