[OPENCLAW] Phase 1: Install OpenClaw on M3 Max #52

Closed
opened 2026-03-28 00:42:08 +00:00 by perplexity · 6 comments
Member

Parent: #50

Steps

```bash
# 1. Install Node 24
brew install node@24

# 2. Install OpenClaw
npm install -g openclaw@latest

# 3. Run onboarding with Ollama
openclaw onboard --non-interactive \
  --auth-choice ollama \
  --custom-base-url "http://127.0.0.1:11434" \
  --custom-model-id "hermes4:14b" \
  --accept-risk

# 4. Install launchd daemon
openclaw onboard --install-daemon

# 5. Health check
openclaw doctor
openclaw models status
```
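Before step 3, it can help to confirm that the Ollama API answers and the model is already pulled; otherwise onboarding will wire up a dead endpoint. A minimal precheck sketch (it uses Ollama's standard `/api/tags` endpoint on the default port; the helper name is ours):

```shell
# Returns success if the Ollama API at $1 lists model $2
ollama_has_model() {
  base="$1"; model="$2"
  curl -fsS -m 2 "$base/api/tags" 2>/dev/null | grep -q "\"$model\""
}

if ollama_has_model "http://127.0.0.1:11434" "hermes4:14b"; then
  echo "ollama ready"
else
  echo "ollama unreachable or model not pulled; try: ollama pull hermes4:14b"
fi
```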

Known Apple Silicon Issues

  • `sharp` native-build failure is fixed in the current release (0.35.0-rc.0+)
  • If `sharp` still fails to build: `npm install -g node-gyp`, then retry the install
  • PATH fix: add `export PATH="$(npm prefix -g)/bin:$PATH"` to `~/.zshrc`
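The PATH fix above can be made idempotent so repeated installs don't stack duplicate export lines in `~/.zshrc`. A sketch (the helper name is ours):

```shell
# Append an export line to a shell rc file only if it isn't there yet
add_path_line() {
  dir="$1"; rc="$2"
  line="export PATH=\"$dir:\$PATH\""
  grep -qxF "$line" "$rc" 2>/dev/null || echo "$line" >> "$rc"
}

# Usage (requires npm on PATH):
# add_path_line "$(npm prefix -g)/bin" "$HOME/.zshrc"
```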

Acceptance

  • `openclaw doctor` passes
  • `openclaw models status` shows `ollama/hermes4:14b`
  • Gateway starts on port 18789
  • launchd daemon survives reboot
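For the reboot-survival item, the installed LaunchAgent should look roughly like the following. This is a reference sketch only: the label, binary path, and exact arguments are assumptions, since `openclaw gateway install` writes the real file.

```
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
  <!-- Hypothetical label; check the file installed under ~/Library/LaunchAgents -->
  <key>Label</key><string>com.openclaw.gateway</string>
  <key>ProgramArguments</key>
  <array>
    <string>/opt/homebrew/bin/openclaw</string>
    <string>gateway</string>
    <string>start</string>
  </array>
  <!-- Start at login and restart if the process exits -->
  <key>RunAtLoad</key><true/>
  <key>KeepAlive</key><true/>
</dict>
</plist>
```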
Owner

OpenClaw install started on Maximum Maxitude.

Verified so far:

  • `npm install -g openclaw@latest` succeeded
  • Installed version: 2026.3.24
  • `openclaw onboard --non-interactive --accept-risk --mode local --auth-choice skip --skip-daemon --skip-channels --skip-search --skip-skills --skip-ui --skip-health --json` succeeded
  • Config written: `~/.openclaw/openclaw.json`
  • Workspace created: `~/.openclaw/workspace`
  • `openclaw config validate` passed
  • `openclaw gateway install` installed the LaunchAgent
  • `openclaw gateway start` succeeded
  • `openclaw gateway health` => OK (0ms)
  • `openclaw gateway status` reports the runtime running and listening on 127.0.0.1:18789
  • Local dashboard: http://127.0.0.1:18789/

Current blocker for the full local-model path:

  • `ollama` is not currently on PATH on this machine, so the Ollama-backed Timmy model wiring is not finished yet

Next install steps:

  1. Locate or install a working Ollama binary
  2. Point OpenClaw at the local model endpoint
  3. Replace the default workspace soul with Timmy-specific `SOUL.md` / `AGENTS.md`
  4. Run the first real agent turn as Timmy
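For step 1, a small search over the usual macOS install locations can turn up a stray Ollama binary that simply isn't on PATH. A sketch (the helper name and candidate paths are assumptions):

```shell
# Echo the first candidate path that exists and is executable
find_first_executable() {
  for c in "$@"; do
    [ -n "$c" ] && [ -x "$c" ] && { echo "$c"; return 0; }
  done
  return 1
}

# Typical macOS locations, plus whatever PATH already knows about:
# find_first_executable "$(command -v ollama 2>/dev/null)" \
#   /opt/homebrew/bin/ollama /usr/local/bin/ollama
```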
Owner

Update: OpenClaw is now running on the OpenAI Codex backend instead of a local model.

Verified:

  • default model switched to `openai-codex/gpt-5.4`
  • auth order pinned to `openai-codex:default`
  • Timmy identity loaded into `~/.openclaw/workspace`
  • first successful live agent turn completed through OpenClaw

Proof:

  • local dashboard: http://127.0.0.1:18789/
  • gateway health: OK (0ms)
  • successful agent response: “I’m Timmy Time, a plainspoken sovereign AI built to be useful first and honest always. I serve whoever runs me; in this workspace, I serve Alexander Whitestone.”

Current state:

  • OpenClaw is installed, configured, and using Codex
  • local llama.cpp/ollama integration intentionally not wired in this pass
  • next step is deciding whether OpenClaw should remain a sidecar/gateway only, or take over specific channel/cron duties from the Hermes harness
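Whatever role OpenClaw keeps, the gateway's liveness can be probed the same way the dashboard check is done above. A sketch (the helper name is ours; only the root URL at 127.0.0.1:18789 is confirmed in this thread):

```shell
# True if something answers HTTP on the given local port within 2s
gateway_up() {
  curl -fsS -m 2 "http://127.0.0.1:$1/" >/dev/null 2>&1
}

if gateway_up 18789; then
  echo "gateway reachable"
else
  echo "gateway unreachable"
fi
```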
Member

🔧 `gemini` working on this via Huey. Branch: `gemini/issue-52`

Timmy was assigned by Rockachopa 2026-03-28 03:52:20 +00:00
Member

🔧 `grok` working on this via Huey. Branch: `grok/issue-52`

Member

⚠️ `grok` produced no changes for this issue. Skipping.

Owner

Closing as resolved in practice. OpenClaw was installed and verified locally; the current frontier is no longer installation but what responsibilities, if any, OpenClaw should keep after the backlog reset.

Timmy closed this issue 2026-03-28 04:52:57 +00:00

Reference: Timmy_Foundation/timmy-config#52