# refactor: extract atomic_json_write helper, add 24 checkpoint tests
#
# Extract the duplicated temp-file + fsync + os.replace pattern from
# batch_runner.py (1 instance) and process_registry.py (2 instances) into
# a shared utils.atomic_json_write() function.
#
# Add 12 tests for atomic_json_write covering: valid JSON, parent dir
# creation, overwrite, crash safety (original preserved on error), no temp
# file leaks, string paths, unicode, custom indent, concurrent writes.
#
# Add 12 tests for batch_runner checkpoint behavior covering:
# _save_checkpoint (valid JSON, last_updated, overwrite, lock/no-lock,
# parent dirs, no temp leaks), _load_checkpoint (missing file, existing
# data, corrupt JSON), and resume logic (preserves prior progress,
# different run_name starts fresh).
#
# 2026-03-06 05:50:12 -08:00
#
# refactor: extract shared helpers to deduplicate repeated code patterns (#7917)
#
# * refactor: add shared helper modules for code deduplication
#   New modules:
#   - gateway/platforms/helpers.py: MessageDeduplicator, TextBatchAggregator,
#     strip_markdown, ThreadParticipationTracker, redact_phone
#   - hermes_cli/cli_output.py: print_info/success/warning/error, prompt helpers
#   - tools/path_security.py: validate_within_dir, has_traversal_component
#   - utils.py additions: safe_json_loads, read_json_file, read_jsonl,
#     append_jsonl, env_str/lower/int/bool helpers
#   - hermes_constants.py additions: get_config_path, get_skills_dir,
#     get_logs_dir, get_env_path
# * refactor: migrate gateway adapters to shared helpers
#   - MessageDeduplicator: discord, slack, dingtalk, wecom, weixin, mattermost
#   - strip_markdown: bluebubbles, feishu, sms
#   - redact_phone: sms, signal
#   - ThreadParticipationTracker: discord, matrix
#   - _acquire/_release_platform_lock: telegram, discord, slack, whatsapp,
#     signal, weixin
#   Net -316 lines across 19 files.
# * refactor: migrate CLI modules to shared helpers
#   - tools_config.py: use cli_output print/prompt + curses_radiolist (-117 lines)
#   - setup.py: use cli_output print helpers + curses_radiolist (-101 lines)
#   - mcp_config.py: use cli_output prompt (-15 lines)
#   - memory_setup.py: use curses_radiolist (-86 lines)
#   Net -263 lines across 5 files.
# * refactor: migrate to shared utility helpers
#   - safe_json_loads: agent/display.py (4 sites)
#   - get_config_path: skill_utils.py, hermes_logging.py, hermes_time.py
#   - get_skills_dir: skill_utils.py, prompt_builder.py
#   - Token estimation dedup: skills_tool.py imports from model_metadata
#   - Path security: skills_tool, cronjob_tools, skill_manager_tool, credential_files
#   - Non-atomic YAML writes: doctor.py, config.py now use atomic_yaml_write
#   - Platform dict: new platforms.py, skills_config + tools_config derive from it
#   - Anthropic key: new get_anthropic_key() in auth.py, used by doctor/status/config/main
# * test: update tests for shared helper migrations
#   - test_dingtalk: use _dedup.is_duplicate() instead of _is_duplicate()
#   - test_mattermost: use _dedup instead of _seen_posts/_prune_seen
#   - test_signal: import redact_phone from helpers instead of signal
#   - test_discord_connect: _platform_lock_identity instead of _token_lock_identity
#   - test_telegram_conflict: updated lock error message format
#   - test_skill_manager_tool: 'escapes' instead of 'boundary' in error msgs
#
# 2026-04-11 13:59:52 -07:00

"""Shared utility functions for hermes-agent."""

import json
import logging
import os
import tempfile
from pathlib import Path
from typing import Any, Union

import yaml
logger = logging.getLogger(__name__)
TRUTHY_STRINGS = frozenset({"1", "true", "yes", "on"})


def is_truthy_value(value: Any, default: bool = False) -> bool:
    """Coerce bool-ish values using the project's shared truthy string set."""
    if value is None:
        return default
    if isinstance(value, bool):
        return value
    if isinstance(value, str):
        return value.strip().lower() in TRUTHY_STRINGS
    return bool(value)


def env_var_enabled(name: str, default: str = "") -> bool:
    """Return True when an environment variable is set to a truthy value."""
    return is_truthy_value(os.getenv(name, default), default=False)
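
# Illustrative behavior of the truthy helpers above, shown doctest-style
# (HERMES_DEBUG is an arbitrary example name, not a documented variable):
#
#     >>> is_truthy_value(" Yes ")             # stripped and lowercased
#     True
#     >>> is_truthy_value("off")               # not in TRUTHY_STRINGS
#     False
#     >>> is_truthy_value(None, default=True)  # None falls back to default
#     True
#     >>> os.environ["HERMES_DEBUG"] = "1"
#     >>> env_var_enabled("HERMES_DEBUG")
#     True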


def atomic_json_write(
    path: Union[str, Path],
    data: Any,
    *,
    indent: int = 2,
    **dump_kwargs: Any,
) -> None:
    """Write JSON data to a file atomically.

    Uses temp file + fsync + os.replace to ensure the target file is never
    left in a partially-written state. If the process crashes mid-write,
    the previous version of the file remains intact.

    Args:
        path: Target file path (will be created or overwritten).
        data: JSON-serializable data to write.
        indent: JSON indentation (default 2).
        **dump_kwargs: Additional keyword args forwarded to json.dump(), such
            as default=str for non-native types.
    """
    path = Path(path)
    path.parent.mkdir(parents=True, exist_ok=True)

    fd, tmp_path = tempfile.mkstemp(
        dir=str(path.parent),
        prefix=f".{path.stem}_",
        suffix=".tmp",
    )
    try:
        with os.fdopen(fd, "w", encoding="utf-8") as f:
            json.dump(
                data,
                f,
                indent=indent,
                ensure_ascii=False,
                **dump_kwargs,
            )
            f.flush()
            os.fsync(f.fileno())
        os.replace(tmp_path, path)
    except BaseException:
        # Intentionally catch BaseException so temp-file cleanup still runs for
        # KeyboardInterrupt/SystemExit before re-raising the original signal.
        try:
            os.unlink(tmp_path)
        except OSError:
            pass
        raise
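
# Usage sketch (illustrative only; the checkpoint path and payload are made-up
# examples). Because os.replace() swaps the temp file in atomically on POSIX,
# a concurrent reader never observes a partially written checkpoint:
#
#     >>> atomic_json_write("runs/demo/checkpoint.json",
#     ...                   {"completed": 3, "total": 10},
#     ...                   default=str)  # default=str forwarded to json.dump
#     >>> json.loads(Path("runs/demo/checkpoint.json").read_text())["completed"]
#     3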


def atomic_yaml_write(
    path: Union[str, Path],
    data: Any,
    *,
    default_flow_style: bool = False,
    sort_keys: bool = False,
    extra_content: Union[str, None] = None,
) -> None:
    """Write YAML data to a file atomically.

    Uses temp file + fsync + os.replace to ensure the target file is never
    left in a partially-written state. If the process crashes mid-write,
    the previous version of the file remains intact.

    Args:
        path: Target file path (will be created or overwritten).
        data: YAML-serializable data to write.
        default_flow_style: YAML flow style (default False).
        sort_keys: Whether to sort dict keys (default False).
        extra_content: Optional string to append after the YAML dump
            (e.g. commented-out sections for user reference).
    """
    path = Path(path)
    path.parent.mkdir(parents=True, exist_ok=True)

    fd, tmp_path = tempfile.mkstemp(
        dir=str(path.parent),
        prefix=f".{path.stem}_",
        suffix=".tmp",
    )
    try:
        with os.fdopen(fd, "w", encoding="utf-8") as f:
            yaml.dump(data, f, default_flow_style=default_flow_style, sort_keys=sort_keys)
            if extra_content:
                f.write(extra_content)
            f.flush()
            os.fsync(f.fileno())
        os.replace(tmp_path, path)
    except BaseException:
        # Match atomic_json_write: cleanup must also happen for process-level
        # interruptions before we re-raise them.
        try:
            os.unlink(tmp_path)
        except OSError:
            pass
        raise
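
# Usage sketch (illustrative; the file name and keys are made-up examples).
# The extra_content string is written verbatim after the YAML dump, which is
# handy for appending commented-out defaults:
#
#     >>> atomic_yaml_write("config.yaml", {"model": "default"},
#     ...                   extra_content="# optional:\n# timeout: 30\n")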


# ─── JSON Helpers ─────────────────────────────────────────────────────────────


def safe_json_loads(text: str, default: Any = None) -> Any:
    """Parse JSON, returning *default* on any parse error.

    Replaces the ``try: json.loads(x) except (JSONDecodeError, TypeError)``
    pattern duplicated across display.py, anthropic_adapter.py,
    auxiliary_client.py, and others.
    """
    try:
        return json.loads(text)
    except (json.JSONDecodeError, TypeError, ValueError):
        return default
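
# Illustrative behavior (doctest-style):
#
#     >>> safe_json_loads('{"a": 1}')
#     {'a': 1}
#     >>> safe_json_loads("not json", default={})
#     {}
#     >>> safe_json_loads(None) is None        # TypeError is swallowed too
#     True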


# ─── Environment Variable Helpers ─────────────────────────────────────────────


def env_int(key: str, default: int = 0) -> int:
    """Read an environment variable as an integer, with fallback."""
    raw = os.getenv(key, "").strip()
    if not raw:
        return default
    try:
        return int(raw)
    except (ValueError, TypeError):
        return default


def env_bool(key: str, default: bool = False) -> bool:
    """Read an environment variable as a boolean, honoring ``default`` when unset."""
    # os.getenv(key) returns None when unset, so is_truthy_value falls back to
    # ``default``; passing a "" fallback here would silently discard it.
    return is_truthy_value(os.getenv(key), default=default)
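
# Illustrative behavior of the env helpers (HERMES_RETRIES is an arbitrary
# example name):
#
#     >>> os.environ["HERMES_RETRIES"] = "5"
#     >>> env_int("HERMES_RETRIES", default=3)
#     5
#     >>> os.environ["HERMES_RETRIES"] = "five"   # unparsable -> fallback
#     >>> env_int("HERMES_RETRIES", default=3)
#     3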