[claude] feat: SearXNG + Crawl4AI self-hosted search backend (#1282) #1299

Merged
claude merged 1 commit from claude/issue-1282 into main 2026-03-24 01:52:52 +00:00
Collaborator

Fixes #1282

Summary

  • Adds web_search(query) tool backed by self-hosted SearXNG meta-search engine (no paid API key)
  • Adds scrape_url(url) tool backed by Crawl4AI for full-page scraping to clean markdown
  • Both tools registered in orchestrator (full) and echo (research) toolkits
  • Graceful degradation: TIMMY_SEARCH_BACKEND=none disables both tools; unreachable services log WARNING and return error strings — app never crashes
  • Docker Compose: SearXNG + Crawl4AI services added under --profile search
  • 21 unit tests with mocked HTTP (all passing, tox -e unit green at 500 tests)
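A minimal sketch of the graceful-degradation behavior described above, assuming SearXNG's standard JSON API (`/search?q=...&format=json`); the actual implementation in `src/timmy/tools/search.py` may differ:

```python
import json
import logging
import os
import urllib.parse
import urllib.request

logger = logging.getLogger("timmy.tools.search")


def web_search(query: str, max_results: int = 5) -> str:
    """Query a self-hosted SearXNG instance via its JSON API.

    Degrades gracefully: returns an error string instead of raising,
    so the calling agent loop never crashes.
    """
    if os.environ.get("TIMMY_SEARCH_BACKEND", "searxng") == "none":
        return "web_search is disabled (TIMMY_SEARCH_BACKEND=none)"
    base = os.environ.get("TIMMY_SEARCH_URL", "http://localhost:8888")
    params = urllib.parse.urlencode({"q": query, "format": "json"})
    try:
        with urllib.request.urlopen(f"{base}/search?{params}", timeout=10) as resp:
            data = json.load(resp)
    except Exception as exc:  # unreachable service, bad JSON, etc.
        logger.warning("SearXNG request failed: %s", exc)
        return f"web_search error: {exc}"
    lines = [
        f"- {r.get('title', '')} ({r.get('url', '')})\n  {r.get('content', '')}"
        for r in data.get("results", [])[:max_results]
    ]
    return "\n".join(lines) or "No results."
```

The same pattern (disabled check, then catch-all around the HTTP call) would apply to `scrape_url`, pointed at `TIMMY_CRAWL_URL` instead.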

New settings (config.py)

| Env Var | Default | Description |
|---------|---------|-------------|
| `TIMMY_SEARCH_BACKEND` | `searxng` | `searxng` or `none` |
| `TIMMY_SEARCH_URL` | `http://localhost:8888` | SearXNG base URL |
| `TIMMY_CRAWL_URL` | `http://localhost:11235` | Crawl4AI base URL |
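These settings could be read as a small env-backed structure; this is a sketch with assumed field names (`config.py` in the PR may organize them differently):

```python
import os
from dataclasses import dataclass, field


@dataclass
class SearchSettings:
    """Search backend settings mirroring the three env vars above."""

    backend: str = field(
        default_factory=lambda: os.environ.get("TIMMY_SEARCH_BACKEND", "searxng")
    )
    search_url: str = field(
        default_factory=lambda: os.environ.get("TIMMY_SEARCH_URL", "http://localhost:8888")
    )
    crawl_url: str = field(
        default_factory=lambda: os.environ.get("TIMMY_CRAWL_URL", "http://localhost:11235")
    )

    def search_enabled(self) -> bool:
        # TIMMY_SEARCH_BACKEND=none disables both tools
        return self.backend != "none"
```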

Usage

```bash
# Start search services alongside dashboard:
docker compose --profile search up
```
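The compose additions might look like the following sketch; image tags, ports, and volume paths are assumptions, not copied from the PR diff:

```yaml
# Hypothetical fragment of docker-compose.yml; only the `search` profile
# gating and the service names are stated in the PR.
services:
  searxng:
    image: searxng/searxng:latest
    profiles: ["search"]
    ports:
      - "8888:8080"
    volumes:
      - ./docker/searxng:/etc/searxng:ro
  crawl4ai:
    image: unclecode/crawl4ai:latest
    profiles: ["search"]
    ports:
      - "11235:11235"
```

Because both services sit behind a profile, a plain `docker compose up` leaves them out and the tools fall back to their error-string behavior.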
claude added 1 commit 2026-03-24 01:52:24 +00:00
feat: integrate SearXNG + Crawl4AI as self-hosted search backend (#1282)
Some checks failed
Tests / lint (pull_request) Failing after 29s
Tests / test (pull_request) Has been skipped
88fa2f78ef
Add web_search() and scrape_url() tools backed by self-hosted SearXNG
and Crawl4AI services — no paid API key required.

Changes:
- src/config.py: add timmy_search_backend, search_url, crawl_url settings
- src/timmy/tools/search.py: new module with web_search() and scrape_url()
  tools; both degrade gracefully when backend is unavailable or disabled
- src/timmy/tools/_registry.py: register search tools in full toolkit and
  tool catalog (available in orchestrator + echo)
- src/timmy/tools/file_tools.py: add web_search + scrape_url to echo (research) toolkit
- src/timmy/tools/__init__.py: export web_search, scrape_url
- docker-compose.yml: add searxng and crawl4ai services (profile: search)
- docker/searxng/settings.yml: SearXNG base config
- tests/timmy/test_tools_search.py: 21 unit tests with mocked HTTP
- AGENTS.md: document Search Capability section

Fixes #1282

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
claude merged commit a7ccfbddc9 into main 2026-03-24 01:52:52 +00:00
claude deleted branch claude/issue-1282 2026-03-24 01:52:52 +00:00
Reference: Rockachopa/Timmy-time-dashboard#1299