OpenClaw Docker CN-IM: Deploy a Chinese IM AI Gateway
If you’re building an AI assistant that talks across China’s dominant messaging services—Feishu, DingTalk, QQ, WeCom—then you need a robust, modular backend. OpenClaw‑Docker‑CN‑IM provides exactly that: a one‑click Docker image that bundles an AI gateway, plug‑ins for every major platform, and optional AI‑code assistance out of the box.
Why OpenClaw‑CN‑IM?
- Zero‑config, Docker‑powered: Pull the image, run `docker compose up -d`, and you're almost ready.
- All‑in‑one plug‑ins: Feishu, DingTalk, QQ, and WeCom are pre‑installed and auto‑enabled.
- AI‑first: Swap any OpenAI/Claude compatible model with one environment variable.
- Persistent storage: Configs and workspace live in Docker volumes; no data loss on reboot.
- Open source: GPL‑3.0 licensed, community contributions welcome.
Prerequisites
- Docker and Docker Compose (the v2 `docker compose` plugin or the standalone `docker-compose` binary).
- An AI provider API key (OpenAI, Gemini, Claude, etc.).
- Optional: credentials for any target IM platform you plan to use.
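A quick pre-flight check for these prerequisites can be scripted. This is just a sketch, assuming a POSIX shell; the flag variables are for illustration only:

```shell
# Pre-flight check: record whether Docker and a Compose implementation
# are on PATH before attempting the install steps.
docker_ok=0
compose_ok=0
command -v docker >/dev/null 2>&1 && docker_ok=1
# Compose v2 ships as a docker plugin; the legacy v1 is a standalone binary.
if docker compose version >/dev/null 2>&1 || command -v docker-compose >/dev/null 2>&1; then
  compose_ok=1
fi
echo "docker=${docker_ok} compose=${compose_ok}"
```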
Step‑by‑Step Installation
1. Clone the repo (you can also use `wget` to grab `docker-compose.yml` directly):

   ```bash
   git clone https://github.com/justlovemaki/OpenClaw-Docker-CN-IM.git
   cd OpenClaw-Docker-CN-IM
   ```

2. Copy the environment template and edit it for your environment:

   ```bash
   cp .env.example .env
   nano .env
   ```

   At minimum, set your AI model:

   ```
   MODEL_ID=gemini-3-flash-preview
   BASE_URL=https://api.gemini.google/v1
   API_KEY=sk-…
   API_PROTOCOL=openai-completions
   CONTEXT_WINDOW=1000000
   MAX_TOKENS=8192
   ```

3. Add IM credentials (optional, but recommended if you want multi-platform support). For instance, Feishu:

   ```
   FEISHU_APP_ID=your-app-id
   FEISHU_APP_SECRET=your-app-secret
   ```

   Repeat for DingTalk, QQ, and WeCom as needed.

4. Start the stack:

   ```bash
   docker compose up -d
   ```

   The first run generates `~/.openclaw/openclaw.json` from your `.env`.

5. Verify the gateway:

   ```bash
   docker compose logs -f
   ```

   You should see OpenClaw launching and listening on the gateway port (default 18789). Open `http://<your-host>:18789` in your browser to test connectivity.
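Before step 4, it can be worth sanity-checking that the `.env` carries the minimum AI settings. The snippet below is not part of the repo; it writes a sample file to `/tmp` so it is self-contained, but you would point it at your real `.env`:

```shell
# Sample .env (values mirror the example in the steps above).
cat > /tmp/openclaw-sample.env <<'EOF'
MODEL_ID=gemini-3-flash-preview
BASE_URL=https://api.gemini.google/v1
API_KEY=sk-placeholder
API_PROTOCOL=openai-completions
EOF

# Fail loudly if any required AI variable is missing.
missing=0
for var in MODEL_ID BASE_URL API_KEY API_PROTOCOL; do
  grep -q "^${var}=" /tmp/openclaw-sample.env || { echo "missing: ${var}"; missing=1; }
done
[ "$missing" -eq 0 ] && echo "env looks complete"
```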
Configuring AI Clients
OpenClaw supports OpenAI- and Claude-style protocols. If you're using Gemini, pick the OpenAI protocol:

```
API_PROTOCOL=openai-completions
BASE_URL=https://api.gemini.google/v1
```

For Claude:

```
API_PROTOCOL=anthropic-messages
BASE_URL=http://localhost:3000
```

The `API_KEY` value is simply forwarded to the AI vendor; you can also set up a local AIClient-2-API service if you want zero external calls.
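The protocol setting determines which REST path ends up being called. As a rough illustration only: the mapping below follows the public OpenAI and Anthropic APIs, and OpenClaw's internal routing may differ from this helper:

```shell
# Illustrative mapping from API_PROTOCOL to the request path used by the
# corresponding public API (BASE_URL conventions vary between vendors).
endpoint_for() {
  case "$1" in
    openai-completions) echo "$2/chat/completions" ;;
    anthropic-messages) echo "$2/v1/messages" ;;
    *)                  echo "unknown protocol: $1" >&2; return 1 ;;
  esac
}

endpoint_for openai-completions "https://api.gemini.google/v1"
# prints https://api.gemini.google/v1/chat/completions
```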
Persisting Data
Docker volumes expose two key folders:
- `/home/node/.openclaw` – configuration and plugin data.
- `/home/node/.openclaw/workspace` – workspace for OpenCode AI and other services.
If you ever need to reset everything, delete the volumes and restart:

```bash
docker compose down -v
docker compose up -d
```
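Since `down -v` is destructive, consider archiving the config first. A sketch follows; it uses a stand-in directory so it runs anywhere, but you would substitute the real host path backing `/home/node/.openclaw`:

```shell
# Stand-in for the mounted config dir; replace with your actual host path.
mkdir -p /tmp/openclaw-demo
echo '{"demo": true}' > /tmp/openclaw-demo/openclaw.json

# Archive the directory before wiping volumes with `docker compose down -v`.
tar czf /tmp/openclaw-backup.tar.gz -C /tmp/openclaw-demo .
tar tzf /tmp/openclaw-backup.tar.gz | grep openclaw.json
# prints ./openclaw.json
```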
Troubleshooting Tips
| Issue | Likely Cause | Fix |
|---|---|---|
| `docker compose logs` shows "permission denied" | Volume permissions mismatch | Ensure host folders are owned by the `node` user inside the container, or use the `--user` flag |
| No messages received in Feishu | IM credentials missing, or event subscription not set | Double-check `FEISHU_APP_ID`/`FEISHU_APP_SECRET` and enable `im.message.receive_v1` in Feishu's app dashboard |
| 401 error when calling the AI API | Wrong `API_KEY` or missing scopes | Verify the key, and check that your provider allows the chosen model |
| Gateway port conflict | Port 18789/18790 already in use | Change the `OPENCLAW_GATEWAY_PORT` / `OPENCLAW_BRIDGE_PORT` environment variables |
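For the port-conflict case, you can check whether the default gateway port is already taken before editing `OPENCLAW_GATEWAY_PORT`. A minimal sketch, assuming `ss` from iproute2 is available (`lsof -i :18789` works too):

```shell
# Look for an existing listener on the default gateway port.
PORT=18789
if ss -ltn 2>/dev/null | grep -q ":${PORT} "; then
  echo "port ${PORT} in use"
else
  echo "port ${PORT} looks free"
fi
```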
Going Beyond the Basics
- AIClient‑2‑API: Run a local API aggregator to avoid exposing your AI key to the internet.
- Playwright integration: Use the bundled Playwright tools to automate browser actions and add richer AI capabilities.
- TTS: Enable Chinese text‑to‑speech for voice‑ready bots.
- Custom Plug‑ins: OpenClaw lets you add new channels by placing a Node.js package under `/plugins` and updating `openclaw.json`.
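As a rough sketch of what a plug-in scaffold might look like (the directory name and `package.json` fields are illustrative, not the real contract; consult the repo docs before building one):

```shell
# Hypothetical layout for a custom channel plug-in.
mkdir -p /tmp/plugins/my-channel
cat > /tmp/plugins/my-channel/package.json <<'EOF'
{
  "name": "openclaw-plugin-my-channel",
  "version": "0.1.0",
  "main": "index.js"
}
EOF
ls /tmp/plugins/my-channel
# prints package.json
```

You would then register the new channel in `openclaw.json`, as the article notes.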
Wrap‑Up
OpenClaw‑Docker‑CN‑IM reduces deploying a complex, multi‑platform AI gateway to a single `docker compose` command. Whether you're a hobbyist adding AI to your Feishu workspace or a small business deploying a cross‑platform assistant, it dramatically lowers the barrier to entry.
Happy deploying—and feel free to submit PRs or issues on the GitHub repo to help the community grow!