OpenClaw Installation Guide: Mac, Windows & Linux (Step by Step)
A complete, platform-specific guide to installing OpenClaw on macOS, Windows, and Linux. Covers system requirements, every installation method (npm, Homebrew, curl, Docker), first agent setup, local models with Ollama, troubleshooting common errors, and clean uninstallation. Follow the section for your operating system and you will have a running agent in under 10 minutes.
1. System Requirements
Before installing OpenClaw, make sure your machine meets these minimum requirements. OpenClaw itself is lightweight, but if you plan to run local models through Ollama, you will need more RAM.
Node.js
Version 18 or later (Node.js 22 recommended). Check with node --version. If you do not have Node.js, the macOS and Linux sections below include install commands. Docker users can skip this.
Operating System
macOS 12 (Monterey) or later. Windows 10/11 with WSL 2. Ubuntu 20.04+, Debian 11+, Fedora 36+, or any Linux distro with glibc 2.31 or later. ARM64 (Apple Silicon, Raspberry Pi) is fully supported.
RAM
512 MB free for OpenClaw gateway. 8 GB total if running Ollama with 7B models. 16 GB total if running 13B models. Cloud API users (Anthropic, OpenAI) need only the base 512 MB.
Disk Space
200 MB for OpenClaw core. 4-8 GB additional if using Ollama models (varies by model size). Session logs and knowledge files grow over time but rarely exceed 100 MB.
API Key (Optional)
An API key from Anthropic (Claude), OpenAI (GPT-4o), or Google (Gemini) if you want to use cloud models. Not needed if you use Ollama for free local inference.
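Before choosing an install path, the Node.js floor above can be verified with a short preflight snippet (a sketch; it only inspects whatever node binary happens to be on your PATH):

```shell
# Preflight: confirm Node.js 18+ is available before installing OpenClaw.
if command -v node >/dev/null 2>&1; then
  NODE_MAJOR=$(node --version | sed 's/^v//' | cut -d. -f1)
else
  NODE_MAJOR=0   # node not installed; the install sections below cover this
fi

if [ "${NODE_MAJOR:-0}" -ge 18 ]; then
  NODE_OK=yes
else
  NODE_OK=no
fi
echo "node major version: $NODE_MAJOR (meets 18+ floor: $NODE_OK)"
```

If this prints `no`, any of the nvm or NodeSource commands in the platform sections below will get you to a supported version.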
2. Installing on macOS
macOS is the most straightforward platform for OpenClaw. You have three options: Homebrew (recommended), npm global install, or the curl one-liner. All three work on both Intel and Apple Silicon Macs.
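Both architectures are supported, but if you want to confirm which Mac you have (useful when a third-party tool ships separate builds), uname reports the machine type:

```shell
# Print the CPU architecture: "arm64" on Apple Silicon, "x86_64" on Intel Macs.
ARCH=$(uname -m)
echo "architecture: $ARCH"
```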
Option A: Homebrew (Recommended)
If you already use Homebrew, this is the fastest path. It handles Node.js as a dependency automatically.
# Install Homebrew (skip if already installed)
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
# Install OpenClaw
brew install openclaw
# Verify installation
openclaw --version

Option B: npm Global Install
Requires Node.js 18+ already installed. Use nvm if you need to manage multiple Node versions.
# Install Node.js 22 via nvm (skip if you have Node 18+)
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.7/install.sh | bash
source ~/.zshrc
nvm install 22
# Install OpenClaw globally
npm install -g openclaw
# Verify
openclaw --version

Option C: curl One-Liner
Downloads and runs the install script directly. Installs Node.js if missing.
curl -fsSL https://get.openclaw.dev | sh

After installation, verify everything is working:
# Check OpenClaw version
openclaw --version
# Expected: openclaw v2.x.x
# Check Node.js version
node --version
# Expected: v18.x.x or higher (v22.x.x recommended)

3. Installing on Windows (WSL 2)
OpenClaw on Windows requires WSL 2 (Windows Subsystem for Linux). Native Windows support is experimental and not recommended for production use. WSL 2 gives you a real Linux environment running inside Windows, and it takes about 5 minutes to set up.
# Open PowerShell as Administrator and run:
wsl --install
# This installs WSL 2 with Ubuntu by default
# Restart your computer when prompted
# After restart, open "Ubuntu" from the Start menu
# Set your Linux username and password when asked

# Inside the Ubuntu/WSL terminal:
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.7/install.sh | bash
source ~/.bashrc
nvm install 22
node --version

npm install -g openclaw
openclaw --version

Important: Always run OpenClaw commands from inside the WSL terminal, not from PowerShell or Command Prompt. Your project files should live inside the WSL filesystem (e.g., ~/projects/) for best performance. Accessing files on the Windows filesystem (/mnt/c/) from WSL is significantly slower.
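If you are ever unsure whether a given terminal is a WSL session or a plain Linux box, one common heuristic (a convention, not an OpenClaw feature) is to look for "microsoft" in the kernel banner:

```shell
# WSL kernels report "microsoft" in /proc/version; native Linux does not.
if grep -qi microsoft /proc/version 2>/dev/null; then
  ENV_KIND=wsl
else
  ENV_KIND=native
fi
echo "environment: $ENV_KIND"
```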
4. Installing on Linux (Ubuntu, Debian, Fedora)
Linux is the native environment for OpenClaw. The installation is the same across most distributions. You just need Node.js 18+ and npm.
Ubuntu / Debian
# Install Node.js 22 via NodeSource
curl -fsSL https://deb.nodesource.com/setup_22.x | sudo -E bash -
sudo apt-get install -y nodejs
# Verify Node.js
node --version
# Install OpenClaw
npm install -g openclaw
openclaw --version

Fedora / RHEL / CentOS
# Install Node.js 22 via NodeSource
curl -fsSL https://rpm.nodesource.com/setup_22.x | sudo bash -
sudo dnf install -y nodejs
# Install OpenClaw
npm install -g openclaw
openclaw --version

Any Linux (using nvm)
If you prefer not to use NodeSource or want to manage multiple Node.js versions, use nvm:
curl -o- https://raw.githubusercontent.com/nvm-sh/nvm/v0.39.7/install.sh | bash
source ~/.bashrc
nvm install 22
npm install -g openclaw
openclaw --version

Permission note: If you see EACCES errors during npm install -g, do not use sudo with npm. Instead, fix npm permissions by running mkdir ~/.npm-global && npm config set prefix '~/.npm-global' and adding export PATH=~/.npm-global/bin:$PATH to your shell profile. Or use nvm, which avoids this issue entirely.
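The permission fix in that note can be scripted; this sketch creates the user-level prefix and prepends its bin directory to PATH for the current session (the npm config line is commented out so the snippet also runs on machines where npm is not installed yet):

```shell
# User-level npm prefix: avoids EACCES without resorting to sudo.
NPM_GLOBAL="$HOME/.npm-global"
mkdir -p "$NPM_GLOBAL/bin"

# Point npm at the new prefix (uncomment once npm is installed):
# npm config set prefix "$NPM_GLOBAL"

# Prepend to PATH for this session; add the same line to ~/.bashrc or ~/.zshrc
# to make it permanent.
case ":$PATH:" in
  *":$NPM_GLOBAL/bin:"*) ;;                    # already on PATH, nothing to do
  *) PATH="$NPM_GLOBAL/bin:$PATH"; export PATH ;;
esac
echo "npm global prefix: $NPM_GLOBAL"
```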
5. First Agent Setup
With OpenClaw installed, you are ready to create and run your first agent. This takes three steps: run the onboarding wizard, configure your SOUL.md, and start the gateway.
# Run the onboarding wizard
openclaw onboard
# This will ask you to:
# 1. Choose a project name
# 2. Select a model provider (Anthropic, OpenAI, Google, or Ollama)
# 3. Paste your API key (skipped for Ollama)
# 4. Pick a default model
# 5. Create your first agent with a starter SOUL.md

After onboarding, edit your SOUL.md to define the agent's identity. Here is a minimal example:
# MyFirstAgent
## Role
You are a helpful assistant that answers questions
clearly and concisely.
## Rules
- ALWAYS respond in English
- Keep responses under 200 words unless asked for more
- NEVER make up information you are not sure about
## Personality
- Tone: Friendly and professional
- Use short paragraphs
- Include examples when helpful

Now start the gateway and talk to your agent:
# Start the gateway
openclaw gateway start
# Expected output:
# [OpenClaw] Loading SOUL.md...
# [OpenClaw] Agent: MyFirstAgent
# [OpenClaw] Gateway running on port 18789
# [OpenClaw] Type a message to start chatting...
# Send a test message from another terminal
openclaw agent --agent my-first-agent --message "Hello, what can you do?"

To connect your agent to Telegram, create a bot via @BotFather and configure the token:
openclaw config set channels.telegram.bottoken YOUR_BOT_TOKEN
openclaw gateway restart

6. Docker Installation
Docker is the best choice if you want an isolated, reproducible environment or if you do not want to install Node.js on your host machine. It works on all platforms where Docker Desktop or Docker Engine is available.
# Pull and run the latest OpenClaw image
docker run -d \
  --name openclaw \
  -p 18789:18789 \
  -v "$(pwd)/agents":/app/agents \
  -e ANTHROPIC_API_KEY=your-api-key-here \
  openclaw/openclaw:latest

For a persistent setup, you can use Docker Compose instead. Save the following as docker-compose.yml:

version: "3.8"
services:
  openclaw:
    image: openclaw/openclaw:latest
    container_name: openclaw-agent
    restart: unless-stopped
    ports:
      - "18789:18789"
    volumes:
      - ./agents:/app/agents
      - ./config:/app/config
      - ./sessions:/app/sessions
    environment:
      - ANTHROPIC_API_KEY=your-api-key-here
      - OPENCLAW_MODEL=claude-sonnet-4-5
      - OPENCLAW_PORT=18789

# Start the container
docker compose up -d
# Check logs
docker compose logs -f openclaw
# Stop the container
docker compose down

To build a custom image with your agent configuration baked in, use a Dockerfile like this:

FROM node:22-slim
# Install OpenClaw
RUN npm install -g openclaw
# Create working directory
WORKDIR /app
# Copy your agent configuration
COPY SOUL.md /app/SOUL.md
COPY config/ /app/config/
# Expose the gateway port
EXPOSE 18789
# Start the gateway
CMD ["openclaw", "gateway", "start"]

Tip: Mount your SOUL.md and config directory as volumes so you can edit them without rebuilding the container. Changes take effect after running docker compose restart.
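To keep the API key out of docker-compose.yml entirely, Compose's standard .env substitution works here (a sketch; the variable names match the compose example above):

```ini
# .env — placed next to docker-compose.yml; Compose loads it automatically
ANTHROPIC_API_KEY=your-api-key-here
OPENCLAW_MODEL=claude-sonnet-4-5
```

In docker-compose.yml, reference the values as `- ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY}`, and add .env to your .gitignore so the key never lands in version control.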
7. Installing with Ollama (Free Local Models)
Ollama lets you run large language models locally on your own hardware. This means zero API costs, complete data privacy, and no internet requirement after the initial model download. OpenClaw has built-in support for Ollama as a model provider.
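Before pulling a model, it is worth confirming you clear the RAM bar from the requirements section (8 GB total for 7B models). On Linux and WSL a quick check reads /proc/meminfo (macOS users can check About This Mac instead):

```shell
# Rough total-RAM check before running 7B models locally (Linux/WSL only).
MEM_KB=$(awk '/^MemTotal:/ {print $2}' /proc/meminfo 2>/dev/null)
MEM_GB=$(( ${MEM_KB:-0} / 1024 / 1024 ))
if [ "$MEM_GB" -ge 8 ]; then
  RAM_OK=yes
else
  RAM_OK=no
fi
echo "total RAM: ${MEM_GB} GB (7B-model ready: $RAM_OK)"
```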
# macOS (Homebrew)
brew install ollama
# Linux
curl -fsSL https://ollama.com/install.sh | sh
# Windows: Download from https://ollama.com/download

# Start the Ollama server
ollama serve
# In another terminal, pull a model
ollama pull llama3 # 4.7 GB, good general purpose
ollama pull mistral # 4.1 GB, fast and capable
ollama pull codellama # 3.8 GB, optimized for code tasks
ollama pull qwen2 # 4.4 GB, strong multilingual support

# Set Ollama as the model provider
openclaw models auth ollama
# Set your preferred model
openclaw config set models.default llama3
# Start the gateway
openclaw gateway start
# Your agent now runs 100% locally with no API key

8. Troubleshooting Common Issues
Most installation problems fall into a few predictable categories. Here are the most common issues and their fixes.
Port 18789 already in use
Another process is occupying the default gateway port. On macOS/Linux, find it with 'lsof -i :18789' and stop it with 'kill [PID]'. Alternatively, change the port: 'openclaw config set gateway.port 18790'.
Node.js version too old
OpenClaw requires Node.js 18 or later. Check your version with 'node --version'. If it is below 18, upgrade using nvm: 'nvm install 22 && nvm use 22'. On Ubuntu, the system apt package is often outdated. Use NodeSource or nvm instead.
'command not found: openclaw'
The npm global bin directory is not in your PATH. Run 'npm config get prefix' to find it, then add its bin subdirectory to your PATH. With nvm, this is usually '~/.nvm/versions/node/v22.x.x/bin'. Or use 'npx openclaw' as a workaround.
EACCES permission error on npm install -g
Never use 'sudo npm install -g'. Instead, fix npm global directory permissions: 'mkdir ~/.npm-global && npm config set prefix ~/.npm-global' then add 'export PATH=~/.npm-global/bin:$PATH' to your ~/.bashrc or ~/.zshrc. Using nvm avoids this problem entirely.
Ollama model not found or connection refused
Make sure the Ollama server is running with 'ollama serve' before starting the OpenClaw gateway. If you get 'model not found', pull it first: 'ollama pull llama3'. Ollama listens on port 11434 by default. Check it is accessible with 'curl http://localhost:11434'.
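That last check can be wrapped into a tiny status script (a sketch; 11434 is Ollama's documented default port, and the URL is overridable for remote setups):

```shell
# Report whether an Ollama server answers on the expected port.
OLLAMA_URL="${OLLAMA_URL:-http://localhost:11434}"
if command -v curl >/dev/null 2>&1 && curl -fsS --max-time 2 "$OLLAMA_URL" >/dev/null 2>&1; then
  OLLAMA_STATUS=up
else
  OLLAMA_STATUS=down    # server not running, or curl unavailable
fi
echo "ollama at $OLLAMA_URL: $OLLAMA_STATUS"
```

If this prints `down`, start the server with `ollama serve` and run the check again before starting the OpenClaw gateway.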
WSL 2 networking issues on Windows
If you cannot access localhost:18789 from your Windows browser, try the WSL IP instead. Run 'hostname -I' inside WSL to get the IP. Also check that Windows Firewall is not blocking the port. Restarting WSL with 'wsl --shutdown' from PowerShell and reopening Ubuntu often resolves networking glitches.
Docker container exits immediately
Check the logs with 'docker logs openclaw'. Common causes: missing API key environment variable, invalid SOUL.md syntax, or the port is already taken on the host. Make sure your docker-compose.yml has the correct environment variables set.
9. Uninstalling OpenClaw
If you need to remove OpenClaw, the process depends on how you installed it. Here are the commands for each method.
# Remove the global package
npm uninstall -g openclaw
# Remove configuration and session data (optional)
rm -rf ~/.openclaw

# Homebrew (macOS)
brew uninstall openclaw
rm -rf ~/.openclaw

# Stop and remove the container
docker compose down
# Remove the image
docker rmi openclaw/openclaw:latest
# Remove volumes (deletes all agent data)
docker volume prune

Your project files (SOUL.md, config, knowledge) live in whatever directory you created during onboarding. These are not deleted by the uninstall commands above. Delete them manually if you no longer need them.
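If you want a safety net before running any of the removal commands, archiving the data directory first is cheap (a sketch; ~/.openclaw is the config location referenced above, and the backup filename is arbitrary):

```shell
# Archive OpenClaw data before deleting it, so the removal is reversible.
BACKUP="$HOME/openclaw-backup-$(date +%Y%m%d).tar.gz"
if [ -d "$HOME/.openclaw" ]; then
  tar -czf "$BACKUP" -C "$HOME" .openclaw
  echo "backed up to $BACKUP"
else
  echo "nothing to back up: ~/.openclaw not found"
fi
```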
Frequently Asked Questions
What are the minimum system requirements for OpenClaw?
OpenClaw requires Node.js 18 or later, 512 MB of free RAM (8 GB total if using Ollama with 7B local models, 16 GB for 13B models), and about 200 MB of disk space. It runs on macOS 12 or later, Windows 10/11 via WSL 2, Ubuntu 20.04 or later, Debian 11 or later, and most other Linux distributions with glibc 2.31 or later.
Can I install OpenClaw without Node.js?
Yes. You can use the Docker installation method, which bundles Node.js inside the container. You only need Docker and Docker Compose installed on your system. The Docker image includes everything OpenClaw needs to run, so you do not need to install Node.js separately.
Does OpenClaw work on Windows without WSL?
Native Windows support is experimental and may have issues with file paths, permission handling, and certain skills. WSL 2 is the recommended approach for Windows users. It takes about 5 minutes to set up and gives you a full Linux environment that runs OpenClaw without issues.
How do I update OpenClaw to a newer version?
If you installed with npm, run 'npm update -g openclaw'. If you use Homebrew on macOS, run 'brew upgrade openclaw'. For Docker, pull the latest image with 'docker pull openclaw/openclaw:latest' and recreate the container. After updating, restart your gateway with 'openclaw gateway restart'.
Can I run OpenClaw with free local models instead of paid APIs?
Yes. Install Ollama from ollama.com, pull a model like llama3 or mistral, and configure OpenClaw to use the Ollama provider. This runs the model entirely on your machine with no API key and no per-message cost. You need at least 8 GB of RAM for 7B parameter models and 16 GB for 13B models.
Want to skip the installation?
CrewClaw gives you ready-to-deploy agent packages. Pick a template, download your configured agent, and run it in 60 seconds. No installation headaches, no dependency issues, no configuration guesswork.
Deploy a Ready-Made AI Agent
Skip the setup. Pick a template and deploy in 60 seconds.