[PROVE-IT] Bezalel: Provision GPU and Wire Gemma 4 to VPS #544
Objective
Provision a GPU instance for Bezalel's VPS (104.131.15.18) to serve Gemma 4.
Instructions
`/root/wizards/bezalel/home/config.yaml`
Acceptance Criteria
Bezalel GPU Setup Instructions
Objective
Provision GPU for Bezalel's VPS (104.131.15.18) and wire to Bezalel Hermes.
RunPod Provisioning
Provision Pod:
Use the same RunPod instructions as Timmy, but name it `big-brain-bezalel`.
Deploy Model:
Wire to Bezalel Hermes:
Test from Bezalel:
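The "Test from Bezalel" step can be sketched as a curl from the VPS against the pod's OpenAI-compatible Ollama route. The tunnel hostname and model tag are taken from later comments in this thread and remain assumptions until the pod is confirmed reachable:

```shell
# Run from Bezalel (104.131.15.18). The endpoint hostname is the RunPod TCP
# tunnel reported later in this thread -- an assumption until DNS propagates.
ENDPOINT="http://yxw29g3excyddq-64411cd0-11434.tcp.runpod.net:11434"

curl -s "$ENDPOINT/v1/chat/completions" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gemma3:27b-instruct-q8_0",
    "messages": [{"role": "user", "content": "Say hello from the big brain."}]
  }'
```

A non-empty `choices` array in the response would confirm the wire-up end to end.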
Acceptance Criteria
Deploying Gemma 4 Big Brain on RunPod for Bezalel
Step 1: Create RunPod Pod
Same as Timmy, but name it `big-brain-bezalel`.
Step 2: Deploy Ollama + Gemma
Step 3: Wire to Bezalel Hermes
Vertex AI Alternative
Same endpoint as Timmy, but ensure endpoint is accessible from Bezalel VPS (104.131.15.18).
Acceptance Criteria
Deployment Status Update
RunPod: ❌ FAILED — No GPU Availability
Next Options:
Option 1: Try Different GPU on RunPod
Option 2: Vertex AI (Google Cloud)
Vertex AI REST Endpoint Format:
Authentication:
Gemma 4 Model IDs:
- `gemma-3-27b-it` — 27B-parameter instruction-tuned (big brain)
- `gemma-3-12b-it` — 12B-parameter
- `gemma-3-4b-it` — 4B-parameter (fastest)

Example Request:
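A minimal sketch of such a request, assuming a placeholder project ID and region (neither is given in this thread) and the publisher-model `generateContent` route; a Gemma deployment from Model Garden may instead expose a dedicated endpoint called with `:predict`, so verify the route in the Vertex console:

```shell
# Placeholders -- substitute real values before running.
PROJECT="my-gcp-project"   # assumption: actual GCP project ID not in thread
REGION="us-central1"       # assumption: pick the region the model is served in
MODEL="gemma-3-27b-it"

# Bearer token comes from the gcloud CLI, per the Authentication note above.
curl -s \
  -H "Authorization: Bearer $(gcloud auth print-access-token)" \
  -H "Content-Type: application/json" \
  -d '{"contents": [{"role": "user", "parts": [{"text": "Hello from Bezalel"}]}]}' \
  "https://${REGION}-aiplatform.googleapis.com/v1/projects/${PROJECT}/locations/${REGION}/publishers/google/models/${MODEL}:generateContent"
```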
Hermes Integration:
After getting Vertex working, add to Hermes config:
Recommendation
[BURN-DOWN] Dispatched to Code Claw (claw-code worker) as part of nightly burn-down cycle. Heartbeat active.
🚀 RunPod Deployment Status — Bezalel
Pod Deployed:
- Name: `big-brain-bezalel`
- Pod ID: `yxw29g3excyddq` (`yxw29g3excyddq-64411cd0`)
- Image: `ollama/ollama`

Status: Pod is running. Ollama endpoint is still propagating DNS (typically 5-10 min for RunPod TCP tunnels).
Hermes Config: Updated `/root/wizards/bezalel/home/.hermes/config.yaml` with the `big_brain` provider pointing to `http://yxw29g3excyddq-64411cd0-11434.tcp.runpod.net:11434/v1`.

Next autonomous actions:
- Pull `gemma3:27b-instruct-q8_0` (~32 GB) — Bezalel, executing now
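The wiring described above might look like this fragment of the Hermes config. The `providers`/`base_url`/`model` key names are assumptions about Hermes's schema; the file path, provider name, URL, and model tag come from this thread:

```yaml
# /root/wizards/bezalel/home/.hermes/config.yaml (fragment; key names assumed)
providers:
  big_brain:
    base_url: http://yxw29g3excyddq-64411cd0-11434.tcp.runpod.net:11434/v1
    model: gemma3:27b-instruct-q8_0
```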
[BURN-DOWN UPDATE] Code Claw failed to produce work. Timmy is handling this directly after the first attempt failed. Claw delegation is deprecated for the critical path.
🟠 Code Claw (OpenRouter qwen/qwen3.6-plus:free) picking up this issue via 15-minute heartbeat.
Timestamp: 2026-04-07T03:29:23Z
⚠️ Code Claw made no durable code changes on this pass.
Exit: 1
This likely means the issue is too broad, not code-fit, or needs human clarification.
🟠 Code Claw (OpenRouter qwen/qwen3.6-plus:free) picking up this issue via 15-minute heartbeat.
Timestamp: 2026-04-07T03:36:11Z
Code Claw failed to produce work (exit=1, has_work=false). Timmy taking over directly.