How to Deploy OpenClaw: One‑Click AI Assistant Setup
OpenAI, Anthropic, Google Gemini, Groq, Mistral, Ollama – all the major LLMs can now talk to you through the same lightweight gateway. OpenClaw, an open‑source project released under the MIT license, bundles the gateway, a rich command‑line interface, and an optional desktop manager powered by Tauri. The OpenClawInstaller repo on GitHub provides a ready‑to‑run script that detects your OS, installs dependencies, pulls the latest OpenClaw binaries, wires up your preferred model and chat channels, and launches the service.
Why OpenClaw?

- 1‑click CLI install on macOS, Linux, and even Windows via WSL
- Plug‑and‑play support for Telegram, Discord, WhatsApp, WeChat, Slack, Feishu, and iMessage – no server‑side code required
- Run almost any major LLM, locally with Ollama or via API, complete with custom proxy support
- Persistent long‑term memory, scheduled tasks, remote command execution, and skill‑based extensions written in Markdown
- Docker‑friendly, so you can containerise the gateway for isolation and scalability
Below you’ll find a step‑by‑step walkthrough, from prerequisites to advanced configuration.
1. Prerequisites
| Requirement | Minimum | Recommendation |
|---|---|---|
| OS | macOS 12+, Ubuntu 20.04, Debian 11, CentOS 8 | — |
| Node.js | v22+ | — |
| RAM | 2 GB | 4 GB+ |
| Disk space | 1 GB | — |
| Network | Outbound HTTP/HTTPS access | — |
Tip: If you use a managed cloud instance, consider a 2‑CPU, 4 GB RAM VM. Docker adds ~500 MB overhead.
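Before running the installer, you can sanity‑check the basics with a short shell snippet. This is just a convenience sketch based on the table above; adapt the commands to your OS.
# Pre-flight check: Node 22+, Docker (optional), and free disk space
node -v 2>/dev/null | grep -Eq '^v(2[2-9]|[3-9][0-9])\.' \
  || echo "Node.js 22+ required (found: $(node -v 2>/dev/null || echo 'none'))"
command -v docker >/dev/null || echo "Docker not found (only needed for the container deployment)"
df -h "$HOME" | awk 'NR==2 {print "Free disk space in home directory:", $4}'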
2. One‑Click Command Line Installation
curl -fsSL https://raw.githubusercontent.com/miaoxworld/OpenClawInstaller/main/install.sh | bash
The script will:

1. Detect your OS and install required packages (Node, and Docker if needed).
2. Download the pre‑built binaries for the latest OpenClaw version.
3. Run the config‑menu.sh wizard to guide you through:
   - Selecting the LLM provider and model
   - Providing API keys or choosing a local model (Ollama)
   - Configuring your message channels
4. Optionally launch the gateway as a background service.
Security note: If you encounter permission errors on macOS, try running the installer with sudo or granting the terminal Full Disk Access.
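Once the script finishes, a quick sanity check confirms the CLI landed on your PATH and the gateway responds; the status command is the same one used in the quick‑start section below.
# Verify the installation
command -v openclaw || echo "openclaw not on PATH - re-run the installer or open a new shell"
openclaw gateway status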
3. Manual Clone & Setup
If you prefer to review the source or use a custom Docker Compose, follow these commands:
# Clone the repo
git clone https://github.com/miaoxworld/OpenClawInstaller.git
cd OpenClawInstaller
# Make scripts executable
chmod +x install.sh config-menu.sh
# Run the installer
./install.sh
Optional Docker deployment:
docker compose up -d
The Docker image pulls the openclaw binary and exposes port 8000 for the gateway API.
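If you want to persist configuration or cap resources without editing the repo's compose file, a Compose override works well. The sketch below assumes the service is named openclaw and keeps its state in /root/.openclaw inside the container; both names are assumptions, so check the shipped docker-compose.yml before using it.
# docker-compose.override.yml is merged automatically by `docker compose up`
cat > docker-compose.override.yml <<'EOF'
services:
  openclaw:
    ports:
      - "8000:8000"
    volumes:
      - ~/.openclaw:/root/.openclaw
    mem_limit: 1g
EOF
docker compose up -d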
4. Quick-start Configuration
After installation, the installer usually asks whether you want to start the gateway immediately. If you chose yes, you can verify it is running at any time with openclaw gateway status. If not, start it manually with:
# In the foreground (for debugging)
openclaw gateway
# Or as a daemon
openclaw gateway start
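On Linux you can also hand the daemon over to systemd so the gateway survives reboots. This is a minimal unit sketch, assuming a global install at /usr/local/bin/openclaw; adjust the path and User to your environment.
# Create a minimal systemd unit (sketch - adjust paths and user)
sudo tee /etc/systemd/system/openclaw.service >/dev/null <<'EOF'
[Unit]
Description=OpenClaw gateway
After=network-online.target

[Service]
# Run as your normal user so the gateway finds ~/.openclaw
User=youruser
ExecStart=/usr/local/bin/openclaw gateway
Restart=on-failure

[Install]
WantedBy=multi-user.target
EOF
sudo systemctl daemon-reload
sudo systemctl enable --now openclaw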
Configuring Messaging Channels
OpenClaw’s wizard (config‑menu.sh) exposes a menu under Message Channels:
| Channel | Setup Steps |
|---|---|
| Telegram | Create bot via @BotFather, copy token, fetch user ID with @userinfobot; enter token & ID in OpenClaw. |
| Discord | Create bot, enable Message Content Intent, invite bot to server; copy token & channel ID. |
| WhatsApp | Scan the QR code in the terminal – no Business API required. |
| WeChat / iMessage | Use the macOS Tauri desktop app or the terminal. |
| Slack / Feishu | Provide app tokens or secret keys. |
After configuration, restart the gateway:
openclaw gateway restart
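If a Telegram bot stays silent after the restart, it is worth confirming the token itself before digging into OpenClaw: Telegram's own getMe endpoint returns the bot's identity for a valid token.
# Replace <YOUR_BOT_TOKEN> with the token from @BotFather
curl -s "https://api.telegram.org/bot<YOUR_BOT_TOKEN>/getMe"
# A valid token returns {"ok":true,...}; an invalid one returns {"ok":false,"error_code":401,...}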
5. Advanced Settings
OpenClaw’s global configuration lives in ~/.openclaw/openclaw.json and ~/.openclaw/env. For most users, the wizard handles this automatically. However, if you want to tweak behavior:
- Custom LLM endpoints – For Anthropic or OpenAI proxies, set ANTHROPIC_BASE_URL and OPENAI_BASE_URL. The script will add a custom provider to the JSON file.
- Memory & file access – By default, enable_shell_commands and enable_file_access are false. You can enable them in the security section to allow the assistant to run shell commands or read/write files.
- Sandbox mode – Setting sandbox_mode: true confines all external calls to a safe environment.
Use the command‑line tool to adjust:
openclaw config set security.enable_shell_commands true
openclaw config set sandbox_mode true
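For orientation, the relevant parts of ~/.openclaw/openclaw.json might look roughly like the sketch below. Only the security keys and sandbox_mode come from this guide; the surrounding structure is an assumption, so treat it as illustrative rather than the exact schema.
# Inspect the generated config (the JSON below is an illustrative sketch, not the exact schema)
cat ~/.openclaw/openclaw.json
# {
#   "provider": "openai",
#   "security": {
#     "enable_shell_commands": true,
#     "enable_file_access": false
#   },
#   "sandbox_mode": true
# }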
6. Data Management & Backup
# Export chat history to JSON
openclaw export --format json > conversations.json
# Clear long‑term memory
openclaw memory clear
# Backup whole config and logs
openclaw backup
Backups are stored in ~/.openclaw/backups/. You can also version‑control the backup directory if you want an audit trail.
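For the audit trail mentioned above, putting the backup directory under git and scheduling a nightly openclaw backup with cron is usually enough; the 03:00 schedule below is just an example.
# One-time: version-control the backup directory (run after your first backup)
cd ~/.openclaw/backups && git init && git add -A && git commit -m "initial backup snapshot"
# Run a backup every night at 03:00
( crontab -l 2>/dev/null; echo '0 3 * * * openclaw backup' ) | crontab -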
7. Uninstall
If you ever need to clean up:
# Stop the service
openclaw gateway stop
# Remove global npm package
npm uninstall -g openclaw
# Delete configuration files
rm -rf ~/.openclaw
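If you used the optional Docker deployment, also tear down the container, network, volumes, and image from the cloned OpenClawInstaller directory:
# Remove the Docker deployment as well (run inside the OpenClawInstaller directory)
docker compose down --rmi all --volumes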
8. Troubleshooting Checklist
| Symptom | Likely Cause | Fix |
|---|---|---|
| Node.js too old | node -v reports a version below 22 | Install Node 22+ via nvm or your OS package manager |
| Gateway cannot connect to API | Incorrect API key or custom base URL | Verify keys and URL; run openclaw doctor |
| Messaging bot silent | Bot token mis‑typed or not authorised | Re‑run channel config, ensure bot is invited to server |
| Docker container crashes | Resource limits too low | Allocate more RAM or limit with -m |
| “Permission denied” during install | macOS Gatekeeper | Grant Terminal full disk access or use sudo |
9. Recap
OpenClawInstaller turns a bare metal machine or a Docker‑enabled VM into a full‑featured AI assistant in just a couple of minutes. With one‑click installation, a wizard‑driven configuration, and support for almost every LLM and channel, you can replace external bot platforms and retain full control over data and performance.
Next steps – Explore the OpenClaw Manager desktop app for a GUI view, or dive into the OpenClaw skill system by adding custom Markdown files in ~/.openclaw/skills/.
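A skill is just a Markdown file dropped into that directory. The example below is a hypothetical sketch (the file name and wording are illustrative, not taken from OpenClaw's documentation) to show the general idea.
# Hypothetical example skill - verify the expected format against OpenClaw's skill docs
cat > ~/.openclaw/skills/daily-standup.md <<'EOF'
# Daily standup summary

When I say "standup", summarise yesterday's notes from ~/notes/
and list today's scheduled tasks as three short bullet points.
EOF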
Happy hacking, and enjoy your personal AI assistant!