[DATA] Claw Code Production Report - Local Ollama Backend #227

Open
opened 2026-04-01 10:45:16 +00:00 by ezra · 0 comments
Member

Claw Code Data Generation Report

Runtime: Claw Code + Ollama Bridge
Backend: qwen2.5:1.5b
Generated: 15 tasks
Timestamp: 2026-04-01 10:45 UTC


System Status

✓ Bridge: Running on localhost:9998
✓ Ollama: qwen2.5:1.5b (986MB)
✓ Claw Binary: 11MB, native Rust
✓ Translation Layer: Python (no proprietary code)


Performance Metrics

  • Cold start: ~5ms (Claw) + Ollama load time
  • Avg response: 7-10s per task (CPU-bound on qwen2.5:1.5b)
  • Throughput: ~0.1 tasks/sec (limited by CPU inference)
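The throughput figure follows directly from the reported per-task latency; a one-line sanity check using the midpoint of the 7-10s range:

```python
# Sanity check on the reported metrics: 7-10s per task on a single
# CPU-bound worker implies roughly 0.1 tasks/sec.
avg_seconds_per_task = (7 + 10) / 2     # midpoint of the reported range
throughput = 1 / avg_seconds_per_task
print(round(throughput, 2))  # prints 0.12, consistent with ~0.1 tasks/sec
```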

Sample Outputs

Task 003: Generator Function

def read_large_file(file_path):
    with open(file_path, 'r', encoding='utf-8') as file:
        for line in file:
            yield line
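A quick, self-contained sketch of how such a line generator is consumed; the helper name and temp-file contents here are illustrative, not part of the generated dataset:

```python
import os
import tempfile

# Self-contained sketch of the generator pattern sampled above: lines are
# yielded lazily, so only one line is held in memory at a time.
def read_lines(path):
    with open(path, "r", encoding="utf-8") as f:
        for line in f:
            yield line

# Write a tiny temp file and count its lines through the generator.
with tempfile.NamedTemporaryFile("w", delete=False, suffix=".txt") as tmp:
    tmp.write("alpha\nbeta\n")
    path = tmp.name

print(sum(1 for _ in read_lines(path)))  # prints 2
os.remove(path)
```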

Architecture

┌─────────────┐     ┌──────────────┐     ┌──────────┐     ┌─────────────┐
│  Claw Code  │────▶│  Bridge:9998 │────▶│  Ollama  │────▶│  qwen2.5    │
│  (Rust)     │     │  (Python)    │     │  (:11434)│     │  (1.5B)     │
└─────────────┘     └──────────────┘     └──────────┘     └─────────────┘
     Anthropic fmt        HTTP               OpenAI fmt       Local LLM
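The bridge's job in the diagram above is to rewrite Anthropic-format requests into OpenAI-format ones for Ollama. A minimal sketch of that translation, assuming the public Anthropic Messages and OpenAI chat-completions schemas; the actual harness/claw_ollama_bridge.py may differ:

```python
# Minimal sketch of the request translation the bridge performs.
# Field names follow the public Anthropic Messages / OpenAI chat
# schemas; the real harness/claw_ollama_bridge.py may differ.

def anthropic_to_openai(body: dict) -> dict:
    """Translate an Anthropic-style request body into an OpenAI-style one."""
    messages = []
    # Anthropic carries the system prompt as a top-level field;
    # OpenAI expects it as the first message.
    if body.get("system"):
        messages.append({"role": "system", "content": body["system"]})
    for msg in body.get("messages", []):
        content = msg["content"]
        # Anthropic content may be a list of typed blocks; flatten to text.
        if isinstance(content, list):
            content = "".join(
                b.get("text", "") for b in content if b.get("type") == "text"
            )
        messages.append({"role": msg["role"], "content": content})
    return {
        "model": "qwen2.5:1.5b",  # Ollama model name from this report
        "messages": messages,
        "max_tokens": body.get("max_tokens", 1024),
    }
```

The response path reverses the mapping (OpenAI `choices[0].message` back into an Anthropic-style content block); that direction is omitted here.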

Files

  • claw-ollama - Wrapper script
  • harness/claw_ollama_bridge.py - Translation proxy
  • claw_data_20260401.json - Generated dataset (15 records)

Constraints Honored

  • No proprietary harnesses (Claude Code removed)
  • Fully local execution
  • Open source toolchain (Rust + Python)
  • Self-hosted LLM (Ollama)

Report generated by Substratum Runtime


Reference: Timmy_Foundation/timmy-home#227