OpenClaw Docker CN-IM: Deploy a Chinese IM AI Gateway

If you’re building an AI assistant that talks across China’s dominant messaging services—Feishu, DingTalk, QQ, WeCom—then you need a robust, modular backend. OpenClaw‑Docker‑CN‑IM provides exactly that: a one‑click Docker image that bundles an AI gateway, plug‑ins for every major platform, and optional AI‑code assistance out of the box.

Why OpenClaw‑CN‑IM?

  • Zero‑config, Docker‑powered: pull the image, run docker compose up -d, and you’re almost ready.
  • All‑in‑one plug‑ins: Feishu, DingTalk, QQ, WeCom are pre‑installed and auto‑enabled.
  • AI‑first: Swap any OpenAI/Claude compatible model with one environment variable.
  • Persistent storage: Configs and workspace live in Docker volumes; no data loss on reboot.
  • Open source: GPL‑3.0 licensed, community contributions welcome.

Prerequisites

  1. Docker and Docker Compose (v2+, since the commands below use the docker compose syntax).
  2. An AI provider API key (OpenAI, Gemini, Claude, etc.).
  3. Optional: credentials for any target IM platform you plan to use.

Step‑by‑Step Installation

  1. Clone the repo (you can also grab docker‑compose.yml directly with wget; see the example below):

    git clone https://github.com/justlovemaki/OpenClaw-Docker-CN-IM.git
    cd OpenClaw-Docker-CN-IM
    
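    If you’d rather not clone the whole repository, you can fetch just the compose file. The raw URL below assumes the default branch is named main; adjust it if the repo uses a different branch:
    wget https://raw.githubusercontent.com/justlovemaki/OpenClaw-Docker-CN-IM/main/docker-compose.yml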

  2. Copy the environment template and edit it for your environment:

    cp .env.example .env
    nano .env
    
    At minimum, set your AI model:
    MODEL_ID=gemini-3-flash-preview
    BASE_URL=https://api.gemini.google/v1
    API_KEY=sk-…
    API_PROTOCOL=openai-completions
    CONTEXT_WINDOW=1000000
    MAX_TOKENS=8192
    

  3. Add IM credentials (optional, but recommended if you want multi‑platform support). For instance, Feishu:

    FEISHU_APP_ID=your-app-id
    FEISHU_APP_SECRET=your-app-secret
    
    Repeat for DingTalk, QQ, WeCom as needed.
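    The variable names below are hypothetical placeholders meant only to illustrate the pattern; the authoritative names for each platform are the ones listed in .env.example:
    # Hypothetical placeholders; check .env.example for the exact variable names
    DINGTALK_APP_KEY=your-app-key
    DINGTALK_APP_SECRET=your-app-secret
    QQ_APP_ID=your-app-id
    QQ_APP_SECRET=your-app-secret
    WECOM_CORP_ID=your-corp-id
    WECOM_CORP_SECRET=your-corp-secret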

  4. Start the stack:

    docker compose up -d
    
    The first run will generate ~/.openclaw/openclaw.json from your .env.
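    To confirm what was generated, you can print the file from inside the container (the service name openclaw is an assumption; use whatever your compose file calls it):
    docker compose exec openclaw cat /home/node/.openclaw/openclaw.json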

  5. Verify the gateway:

    docker compose logs -f
    
    You should see OpenClaw launching and listening on the gateway port (default 18789). Open your browser to http://<your-host>:18789 to test connectivity.
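    From a shell, a quick connectivity check against the default port (assuming the gateway answers plain HTTP on its root path) looks like:
    curl -i http://localhost:18789/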

Configuring AI Clients

OpenClaw supports OpenAI and Claude protocol styles. If you’re using Gemini, pick the OpenAI protocol:

API_PROTOCOL=openai-completions
BASE_URL=https://api.gemini.google/v1

For Claude:

API_PROTOCOL=anthropic-messages
BASE_URL=http://localhost:3000

The API_KEY field is simply forwarded to the AI vendor. You can also set up a local AIClient-2-API service if you want zero external calls from the gateway itself; a sketch follows below.
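
As a sketch, pointing the gateway at a locally running AIClient‑2‑API instance only means swapping the endpoint values; the port and path below are assumptions, so use whatever your local service actually exposes:

API_PROTOCOL=openai-completions
BASE_URL=http://localhost:3000/v1
API_KEY=any-local-placeholder-key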

Persisting Data

Docker volumes persist two key folders inside the container:

  • /home/node/.openclaw – configuration and plugin data.
  • /home/node/.openclaw/workspace – workspace for OpenCode AI and other services.
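
In docker-compose.yml these folders are typically bind-mounted or mapped to named volumes. The stanza below is an illustrative sketch (the service name and host paths are assumptions), not a copy of the shipped file:

services:
  openclaw:
    volumes:
      - ./data/openclaw:/home/node/.openclaw
      - ./data/workspace:/home/node/.openclaw/workspace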

If you ever need to reset everything, tear the stack down along with its volumes and start fresh:

docker compose down -v
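
If you want a copy of the configuration before a destructive reset, docker compose cp can pull it out of the running container first (again, the service name openclaw is an assumption):

docker compose cp openclaw:/home/node/.openclaw ./openclaw-backup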

Troubleshooting Tips

  • docker compose logs shows “permission denied”. Likely cause: volume permissions mismatch. Fix: make sure the host folders are owned by the node user inside the container, or run the container with a matching --user flag.
  • No messages received in Feishu. Likely cause: missing IM credentials, or the event subscription is not set up. Fix: double‑check FEISHU_APP_ID / FEISHU_APP_SECRET and enable im.message.receive_v1 in Feishu's app dashboard.
  • 401 error when calling the AI API. Likely cause: wrong API_KEY or missing scopes. Fix: verify the key and check that your provider allows the chosen model.
  • Gateway port conflict. Likely cause: port 18789 or 18790 is already in use. Fix: change the OPENCLAW_GATEWAY_PORT / OPENCLAW_BRIDGE_PORT environment variables (see the example below).
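
For the port‑conflict case, for example, overriding the two port variables in .env and recreating the stack is usually enough (the values below are arbitrary free ports):

OPENCLAW_GATEWAY_PORT=28789
OPENCLAW_BRIDGE_PORT=28790
docker compose up -d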

Going Beyond the Basics

  • AIClient‑2‑API: Run a local API aggregator to avoid exposing your AI key to the internet.
  • Playwright integration: Use the bundled Playwright tools to automate browser actions and add richer AI capabilities.
  • TTS: Enable Chinese text‑to‑speech for voice‑ready bots.
  • Custom Plug‑ins: OpenClaw lets you add new channels by placing a Node.js package under /plugins and updating openclaw.json (sketched below).
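
As a purely hypothetical illustration of that last point (the real openclaw.json schema is defined by OpenClaw and may look quite different), a custom channel could be registered along these lines:

{
  "plugins": [
    {
      "name": "my-custom-channel",
      "path": "/plugins/my-custom-channel",
      "enabled": true
    }
  ]
}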

Wrap‑Up

OpenClaw‑Docker‑CN‑IM turns a complex, multi‑platform AI gateway deployment into a single docker compose command. Whether you’re a hobbyist adding AI to your Feishu workspace or a small business deploying a cross‑platform assistant, this solution dramatically lowers the barrier to entry.

Happy deploying—and feel free to submit PRs or issues on the GitHub repo to help the community grow!
