Compare commits

...

15 Commits

Author SHA1 Message Date
9a7e554568 fix: add missing MemoryPulse import, init, and update calls
Some checks failed
CI / test (pull_request) Failing after 10s
CI / validate (pull_request) Failing after 14s
Review Approval Gate / verify-review (pull_request) Failing after 2s
2026-04-11 20:47:45 +00:00
d12bd7a806 feat(mnemosyne): wire MemoryPulse into app.js
- Import MemoryPulse component
- Initialize with scene + SpatialMemory
- Call update() in animation loop
- Trigger pulse on crystal click via raycasting
2026-04-11 20:46:34 +00:00
9355c02417 feat(mnemosyne): Memory Pulse — holographic ripple propagation
When a memory crystal is accessed, a visual pulse wave radiates
outward through the connection graph, illuminating linked memories
by BFS hop distance.

Features:
- Expanding ring effect at each crystal (color-matched to region)
- Connection line flash between pulsed memories
- Travel time based on spatial distance
- Intensity decay per hop (0.65^hop)
- Depth-limited to 5 hops to prevent runaway
- Fully self-contained component, integrates via SpatialMemory API
2026-04-11 20:45:16 +00:00
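The intensity and timing model described above can be sketched standalone. Constants mirror the values listed in the commit; the helper names here are illustrative, not the component's public API:

```javascript
// Pulse intensity model: geometric decay per BFS hop, depth-limited.
const BASE_INTENSITY = 3.0; // emissive boost at the pulse origin
const HOP_DECAY = 0.65;     // intensity multiplier per hop
const PULSE_MAX_HOPS = 5;   // hops beyond this are not traversed
const PULSE_SPEED = 8;      // world units per second

function hopIntensity(hop) {
  if (hop > PULSE_MAX_HOPS) return 0;
  return BASE_INTENSITY * Math.pow(HOP_DECAY, hop);
}

// Travel time between two crystals, from straight-line distance,
// so the ripple visibly takes longer to reach far-away memories.
function travelTime(a, b) {
  const dx = a[0] - b[0], dy = a[1] - b[1], dz = a[2] - b[2];
  return Math.sqrt(dx * dx + dy * dy + dz * dz) / PULSE_SPEED;
}
```

At hop 5 the intensity has decayed to roughly 12% of the origin value, which is why the depth limit costs little visually while bounding the BFS.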
3c81c64f04 Merge pull request '[Mnemosyne] Memory Birth Animation System' (#1222) from feat/mnemosyne-memory-birth into main
2026-04-11 20:23:24 +00:00
909a61702e [claude] Mnemosyne: semantic search via holographic linker similarity (#1223) (#1225)
2026-04-11 20:19:52 +00:00
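Per the archive.py diff further down, "linker similarity" here is Jaccard overlap between token sets plus an inbound-link boost. A minimal JavaScript sketch of that scoring idea (the tokenizer is a stand-in; the real one is `HolographicLinker._tokenize`):

```javascript
// Jaccard token similarity with a link-degree boost (illustrative only).
function tokenize(text) {
  return new Set(text.toLowerCase().split(/\W+/).filter(t => t.length > 2));
}

function jaccard(a, b) {
  if (a.size === 0 || b.size === 0) return 0;
  let inter = 0;
  for (const t of a) if (b.has(t)) inter++;
  return inter / (a.size + b.size - inter); // |A∩B| / |A∪B|
}

// Final score: similarity plus up to a 20% boost for heavily linked
// ("more holographic") entries, as in the Python implementation.
function score(queryTokens, entryTokens, inbound, maxInbound) {
  return jaccard(queryTokens, entryTokens) + (inbound / Math.max(1, maxInbound)) * 0.2;
}
```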
12a5a75748 feat: integrate MemoryBirth into app.js
- Import MemoryBirth module
- Initialize alongside SpatialMemory
- Wrap placeMemory() for automatic birth animations
- Call MemoryBirth.update() in render loop
2026-04-11 19:48:46 +00:00
1273c22b15 feat: add memory-birth.js — crystal materialization animation system
- Elastic scale-in from 0 to full size
- Bloom flash at materialization peak
- Neighbor pulse: nearby memories brighten on birth
- Connection line progressive draw-in
- Auto-wraps SpatialMemory.placeMemory() for zero-config use
2026-04-11 19:47:48 +00:00
038346b8a9 [claude] Mnemosyne: export, deletion, and richer stats (#1218) (#1220)
2026-04-11 18:50:29 +00:00
b9f1602067 merge: Mnemosyne Phase 1 — Living Holographic Archive
Co-authored-by: Alexander Whitestone <alexander@alexanderwhitestone.com>
Co-committed-by: Alexander Whitestone <alexander@alexanderwhitestone.com>
2026-04-11 12:10:14 +00:00
c6f6f83a7c Merge pull request '[Mnemosyne] Memory filter panel — toggle categories by region' (#1213) from feat/mnemosyne-memory-filter into main
Merged PR #1213: [Mnemosyne] Memory filter panel — toggle categories by region
2026-04-11 05:31:44 +00:00
026e4a8cae Merge pull request '[Mnemosyne] Fix entity resolution lines wiring (#1167)' (#1214) from fix/entity-resolution-lines-wiring into main
Merged PR #1214
2026-04-11 05:31:26 +00:00
45724e8421 feat(mnemosyne): wire memory filter panel in app.js
- G key toggles filter panel
- Escape closes filter panel
- toggleMemoryFilter() bridge function
2026-04-11 04:10:49 +00:00
04a61132c9 feat(mnemosyne): add memory filter panel CSS
- Frosted glass panel matching Mnemosyne theme
- Category toggle switches with color dots
- Slide-in animation from right
2026-04-11 04:09:30 +00:00
c82d60d7f1 feat(mnemosyne): add memory filter panel with category toggles
- Filter panel with toggle switches per memory region
- Show All / Hide All bulk controls
- Memory count per category
- Frosted glass UI matching Mnemosyne design
2026-04-11 04:09:03 +00:00
6529af293f feat(mnemosyne): add region filter visibility methods to SpatialMemory
- setRegionVisibility(category, visible) — toggle single region
- setAllRegionsVisible(visible) — bulk toggle
- getMemoryCountByRegion() — count memories per category
- isRegionVisible(category) — query visibility state
2026-04-11 04:08:28 +00:00
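The visibility semantics listed above can be sketched with a minimal stand-in: an unset category counts as visible, which is why `isRegionVisible` checks `!== false`. (Illustrative only; the real methods also toggle `mesh.visible` on each crystal, and the real `setAllRegionsVisible` derives categories from `REGIONS` rather than taking them as a parameter.)

```javascript
// Stand-in region-visibility store mirroring the SpatialMemory API above.
const regionVisibility = {}; // category -> boolean; undefined means visible

function setRegionVisibility(category, visible) {
  regionVisibility[category] = visible;
}

function isRegionVisible(category) {
  return regionVisibility[category] !== false; // unset defaults to visible
}

function setAllRegionsVisible(categories, visible) {
  for (const cat of categories) regionVisibility[cat] = visible;
}
```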
14 changed files with 1729 additions and 1 deletions

app.js

@@ -4,7 +4,9 @@ import { RenderPass } from 'three/addons/postprocessing/RenderPass.js';
import { UnrealBloomPass } from 'three/addons/postprocessing/UnrealBloomPass.js';
import { SMAAPass } from 'three/addons/postprocessing/SMAAPass.js';
import { SpatialMemory } from './nexus/components/spatial-memory.js';
import { MemoryBirth } from './nexus/components/memory-birth.js';
import { MemoryOptimizer } from './nexus/components/memory-optimizer.js';
import { MemoryPulse } from './nexus/components/memory-pulse.js';
// ═══════════════════════════════════════════
// NEXUS v1.1 — Portal System Update
@@ -46,6 +48,7 @@ let debugOverlay;
let frameCount = 0, lastFPSTime = 0, fps = 0;
let chatOpen = true;
let memoryFeedEntries = []; // Mnemosyne: recent memory events for feed panel
let _memoryFilterOpen = false; // Mnemosyne: filter panel state
let loadProgress = 0;
let performanceTier = 'high';
@@ -707,6 +710,9 @@ async function init() {
createWorkshopTerminal();
createAshStorm();
SpatialMemory.init(scene);
MemoryBirth.init(scene);
MemoryPulse.init(scene, SpatialMemory);
MemoryBirth.wrapSpatialMemory(SpatialMemory);
SpatialMemory.setCamera(camera);
updateLoad(90);
@@ -1871,6 +1877,7 @@ function setupControls() {
if (visionOverlayActive) closeVisionOverlay();
if (atlasOverlayActive) closePortalAtlas();
if (_archiveDashboardOpen) toggleArchiveHealthDashboard();
if (_memoryFilterOpen) closeMemoryFilter();
}
if (e.key.toLowerCase() === 'v' && document.activeElement !== document.getElementById('chat-input')) {
cycleNavMode();
@@ -1884,6 +1891,9 @@ function setupControls() {
if (e.key.toLowerCase() === 'h' && document.activeElement !== document.getElementById('chat-input')) {
toggleArchiveHealthDashboard();
}
if (e.key.toLowerCase() === 'g' && document.activeElement !== document.getElementById('chat-input')) {
toggleMemoryFilter();
}
});
document.addEventListener('keyup', (e) => {
keys[e.key.toLowerCase()] = false;
@@ -1910,6 +1920,19 @@ function setupControls() {
const portal = portals.find(p => p.ring === clickedRing);
if (portal) activatePortal(portal);
}
// Raycasting for memory crystals — trigger pulse
const crystalMeshes = SpatialMemory.getCrystalMeshes();
if (crystalMeshes.length > 0) {
const crystalHits = raycaster.intersectObjects(crystalMeshes);
if (crystalHits.length > 0) {
const hitMesh = crystalHits[0].object;
const memData = SpatialMemory.getMemoryFromMesh(hitMesh);
if (memData) {
MemoryPulse.triggerPulse(memData.data.id);
}
}
}
}
}
});
@@ -2254,6 +2277,15 @@ function toggleArchiveHealthDashboard() {
* Render current archive statistics into the dashboard panel.
* Reads live from SpatialMemory.getAllMemories() — no backend needed.
*/
function toggleMemoryFilter() {
_memoryFilterOpen = !_memoryFilterOpen;
if (_memoryFilterOpen) {
openMemoryFilter();
} else {
closeMemoryFilter();
}
}
function updateArchiveHealthDashboard() {
const container = document.getElementById('archive-health-content');
if (!container) return;
@@ -2854,6 +2886,8 @@ function gameLoop() {
// Project Mnemosyne - Memory Orb Animation
if (typeof animateMemoryOrbs === 'function') {
SpatialMemory.update(delta);
MemoryBirth.update(delta);
MemoryPulse.update(delta);
animateMemoryOrbs(delta);
}

index.html

@@ -457,7 +457,67 @@
<div class="memory-feed-actions"><button class="memory-feed-clear" onclick="clearMemoryFeed()">Clear</button><button class="memory-feed-toggle" onclick="document.getElementById('memory-feed').style.display='none'"></button></div>
</div>
<div id="memory-feed-list" class="memory-feed-list"></div>
<!-- ═══ MNEMOSYNE MEMORY FILTER ═══ -->
<div id="memory-filter" class="memory-filter" style="display:none;">
<div class="filter-header">
<span class="filter-title">⬡ Memory Filter</span>
<button class="filter-close" onclick="closeMemoryFilter()"></button>
</div>
<div class="filter-controls">
<button class="filter-btn" onclick="setAllFilters(true)">Show All</button>
<button class="filter-btn" onclick="setAllFilters(false)">Hide All</button>
</div>
<div class="filter-list" id="filter-list"></div>
</div>
</div>
<script>
// ─── MNEMOSYNE: Memory Filter Panel ───────────────────
function openMemoryFilter() {
renderFilterList();
document.getElementById('memory-filter').style.display = 'flex';
}
function closeMemoryFilter() {
document.getElementById('memory-filter').style.display = 'none';
}
function renderFilterList() {
const counts = SpatialMemory.getMemoryCountByRegion();
const regions = SpatialMemory.REGIONS;
const list = document.getElementById('filter-list');
list.innerHTML = '';
for (const [key, region] of Object.entries(regions)) {
const count = counts[key] || 0;
const visible = SpatialMemory.isRegionVisible(key);
const colorHex = '#' + region.color.toString(16).padStart(6, '0');
const item = document.createElement('div');
item.className = 'filter-item';
item.innerHTML = `
<div class="filter-item-left">
<span class="filter-dot" style="background:${colorHex}"></span>
<span class="filter-label">${region.glyph} ${region.label}</span>
</div>
<div class="filter-item-right">
<span class="filter-count">${count}</span>
<label class="filter-toggle">
<input type="checkbox" ${visible ? 'checked' : ''}
onchange="toggleRegion('${key}', this.checked)">
<span class="filter-slider"></span>
</label>
</div>
`;
list.appendChild(item);
}
}
function toggleRegion(category, visible) {
SpatialMemory.setRegionVisibility(category, visible);
}
function setAllFilters(visible) {
SpatialMemory.setAllRegionsVisible(visible);
renderFilterList();
}
</script>
</body>
</html>

nexus/components/memory-birth.js

@@ -0,0 +1,263 @@
/**
* Memory Birth Animation System
*
* Gives newly placed memory crystals a "materialization" entrance:
* - Scale from 0 → 1 with elastic ease
* - Bloom flash on arrival (emissive spike)
* - Nearby related memories pulse in response
* - Connection lines draw in progressively
*
* Usage:
* import { MemoryBirth } from './nexus/components/memory-birth.js';
* MemoryBirth.init(scene);
* // After placing a crystal via SpatialMemory.placeMemory():
* MemoryBirth.triggerBirth(crystalMesh, spatialMemory);
* // In your render loop:
* MemoryBirth.update(delta);
*/
const MemoryBirth = (() => {
// ─── CONFIG ────────────────────────────────────────
const BIRTH_DURATION = 1.8; // seconds for full materialization
const BLOOM_PEAK = 0.3; // when the bloom flash peaks (fraction of duration)
const BLOOM_INTENSITY = 4.0; // emissive spike at peak
const NEIGHBOR_PULSE_RADIUS = 8; // units — memories in this range pulse
const NEIGHBOR_PULSE_INTENSITY = 2.5;
const NEIGHBOR_PULSE_DURATION = 0.8;
const LINE_DRAW_DURATION = 1.2; // seconds for connection lines to grow in
let _scene = null;
let _activeBirths = []; // { mesh, startTime, duration, originPos }
let _activePulses = []; // { mesh, startTime, duration, origEmissive, origIntensity }
let _activeLineGrowths = []; // { line, startTime, duration, totalPoints }
let _initialized = false;
// ─── ELASTIC EASE-OUT ─────────────────────────────
function elasticOut(t) {
if (t <= 0) return 0;
if (t >= 1) return 1;
const c4 = (2 * Math.PI) / 3;
return Math.pow(2, -10 * t) * Math.sin((t * 10 - 0.75) * c4) + 1;
}
// ─── SMOOTH STEP ──────────────────────────────────
function smoothstep(edge0, edge1, x) {
const t = Math.max(0, Math.min(1, (x - edge0) / (edge1 - edge0)));
return t * t * (3 - 2 * t);
}
// ─── INIT ─────────────────────────────────────────
function init(scene) {
_scene = scene;
_initialized = true;
console.info('[MemoryBirth] Initialized');
}
// ─── TRIGGER BIRTH ────────────────────────────────
function triggerBirth(mesh, spatialMemory) {
if (!_initialized || !mesh) return;
// Start at zero scale
mesh.scale.setScalar(0.001);
// Store original material values for bloom
if (mesh.material) {
mesh.userData._birthOrigEmissive = mesh.material.emissiveIntensity;
mesh.userData._birthOrigOpacity = mesh.material.opacity;
}
_activeBirths.push({
mesh,
startTime: Date.now() / 1000,
duration: BIRTH_DURATION,
spatialMemory,
originPos: mesh.position.clone()
});
// Trigger neighbor pulses for memories in the same region
_triggerNeighborPulses(mesh, spatialMemory);
// Schedule connection line growth
_triggerLineGrowth(mesh, spatialMemory);
}
// ─── NEIGHBOR PULSE ───────────────────────────────
function _triggerNeighborPulses(mesh, spatialMemory) {
if (!spatialMemory || !mesh.position) return;
const allMems = spatialMemory.getAllMemories ? spatialMemory.getAllMemories() : [];
const pos = mesh.position;
const sourceId = mesh.userData.memId;
allMems.forEach(mem => {
if (mem.id === sourceId) return;
if (!mem.position) return;
const dx = mem.position[0] - pos.x;
const dy = (mem.position[1] + 1.5) - pos.y;
const dz = mem.position[2] - pos.z;
const dist = Math.sqrt(dx * dx + dy * dy + dz * dz);
if (dist < NEIGHBOR_PULSE_RADIUS) {
// Find the mesh for this memory
const neighborMesh = _findMeshById(mem.id, spatialMemory);
if (neighborMesh && neighborMesh.material) {
_activePulses.push({
mesh: neighborMesh,
startTime: Date.now() / 1000,
duration: NEIGHBOR_PULSE_DURATION,
origEmissive: neighborMesh.material.emissiveIntensity,
intensity: NEIGHBOR_PULSE_INTENSITY * (1 - dist / NEIGHBOR_PULSE_RADIUS)
});
}
}
});
}
function _findMeshById(memId, spatialMemory) {
// Access the internal memory objects through crystal meshes
const meshes = spatialMemory.getCrystalMeshes ? spatialMemory.getCrystalMeshes() : [];
return meshes.find(m => m.userData && m.userData.memId === memId);
}
// ─── LINE GROWTH ──────────────────────────────────
function _triggerLineGrowth(mesh, spatialMemory) {
if (!_scene) return;
// Find connection lines that originate from this memory
// Connection lines are stored as children of the scene or in a group
_scene.children.forEach(child => {
if (child.isLine && child.userData) {
// Check if this line connects to our new memory
if (child.userData.fromId === mesh.userData.memId ||
child.userData.toId === mesh.userData.memId) {
_activeLineGrowths.push({
line: child,
startTime: Date.now() / 1000,
duration: LINE_DRAW_DURATION
});
}
}
});
}
// ─── UPDATE (call every frame) ────────────────────
function update(delta) {
const now = Date.now() / 1000;
// ── Process births ──
for (let i = _activeBirths.length - 1; i >= 0; i--) {
const birth = _activeBirths[i];
const elapsed = now - birth.startTime;
const t = Math.min(1, elapsed / birth.duration);
if (t >= 1) {
// Birth complete — ensure final state
birth.mesh.scale.setScalar(1);
if (birth.mesh.material) {
birth.mesh.material.emissiveIntensity = birth.mesh.userData._birthOrigEmissive || 1.5;
birth.mesh.material.opacity = birth.mesh.userData._birthOrigOpacity || 0.9;
}
_activeBirths.splice(i, 1);
continue;
}
// Scale animation with elastic ease
const scale = elasticOut(t);
birth.mesh.scale.setScalar(Math.max(0.001, scale));
// Bloom flash — emissive intensity spikes at BLOOM_PEAK then fades
if (birth.mesh.material) {
const origEI = birth.mesh.userData._birthOrigEmissive || 1.5;
const bloomT = smoothstep(0, BLOOM_PEAK, t) * (1 - smoothstep(BLOOM_PEAK, 1, t));
birth.mesh.material.emissiveIntensity = origEI + bloomT * BLOOM_INTENSITY;
// Opacity fades in
const origOp = birth.mesh.userData._birthOrigOpacity || 0.9;
birth.mesh.material.opacity = origOp * smoothstep(0, 0.3, t);
}
// Gentle upward float during birth (crystals are placed 1.5 above ground)
birth.mesh.position.y = birth.originPos.y + (1 - scale) * 0.5;
}
// ── Process neighbor pulses ──
for (let i = _activePulses.length - 1; i >= 0; i--) {
const pulse = _activePulses[i];
const elapsed = now - pulse.startTime;
const t = Math.min(1, elapsed / pulse.duration);
if (t >= 1) {
// Restore original
if (pulse.mesh.material) {
pulse.mesh.material.emissiveIntensity = pulse.origEmissive;
}
_activePulses.splice(i, 1);
continue;
}
// Pulse curve: quick rise, slow decay
const pulseVal = Math.sin(t * Math.PI) * pulse.intensity;
if (pulse.mesh.material) {
pulse.mesh.material.emissiveIntensity = pulse.origEmissive + pulseVal;
}
}
// ── Process line growths ──
for (let i = _activeLineGrowths.length - 1; i >= 0; i--) {
const lg = _activeLineGrowths[i];
const elapsed = now - lg.startTime;
const t = Math.min(1, elapsed / lg.duration);
if (t >= 1) {
// Ensure full visibility
if (lg.line.material) {
lg.line.material.opacity = lg.line.material.userData?._origOpacity || 0.6;
}
_activeLineGrowths.splice(i, 1);
continue;
}
// Fade in the line
if (lg.line.material) {
const origOp = lg.line.material.userData?._origOpacity || 0.6;
lg.line.material.opacity = origOp * smoothstep(0, 1, t);
}
}
}
// ─── BIRTH COUNT (for UI/status) ─────────────────
function getActiveBirthCount() {
return _activeBirths.length;
}
// ─── WRAP SPATIAL MEMORY ──────────────────────────
/**
* Wraps SpatialMemory.placeMemory() so every new crystal
* automatically gets a birth animation.
* Returns a proxy object that intercepts placeMemory calls.
*/
function wrapSpatialMemory(spatialMemory) {
const original = spatialMemory.placeMemory.bind(spatialMemory);
spatialMemory.placeMemory = function(mem) {
const crystal = original(mem);
if (crystal) {
// Small delay to let THREE.js settle the object
requestAnimationFrame(() => triggerBirth(crystal, spatialMemory));
}
return crystal;
};
console.info('[MemoryBirth] SpatialMemory.placeMemory wrapped — births will animate');
return spatialMemory;
}
return {
init,
triggerBirth,
update,
getActiveBirthCount,
wrapSpatialMemory
};
})();
export { MemoryBirth };

nexus/components/memory-pulse.js

@@ -0,0 +1,305 @@
// ═══════════════════════════════════════════
// PROJECT MNEMOSYNE — MEMORY PULSE ENGINE
// ═══════════════════════════════════════════
//
// Holographic ripple propagation: when a memory crystal is accessed,
// a visual pulse wave radiates outward through the connection graph,
// illuminating linked memories in decreasing intensity by hop distance.
//
// This makes the archive feel alive — one thought echoing through
// the holographic field of related knowledge.
//
// Issue: Mnemosyne Pulse Effect
// ═══════════════════════════════════════════
import * as THREE from 'three'; // needed: this module constructs RingGeometry, Mesh, Line, etc.
const MemoryPulse = (() => {
let _scene = null;
let _spatialMemory = null;
let _activePulses = []; // Currently propagating pulse waves
let _pulseRings = []; // Active ring meshes being rendered
let _connectionFlashes = []; // Active connection line flashes
const PULSE_SPEED = 8; // Units per second propagation
const PULSE_MAX_HOPS = 5; // Max graph depth to traverse
const RING_DURATION = 1.5; // Seconds each ring is visible
const RING_MAX_RADIUS = 2.0; // Max expansion of pulse ring
const FLASH_DURATION = 0.8; // Seconds connection lines flash
const BASE_INTENSITY = 3.0; // Emissive boost at pulse origin
const HOP_DECAY = 0.65; // Intensity multiplier per hop
// ─── INIT ────────────────────────────────────────────
function init(scene, spatialMemory) {
_scene = scene;
_spatialMemory = spatialMemory;
console.info('[Mnemosyne] Pulse engine initialized');
}
// ─── TRIGGER PULSE ──────────────────────────────────
/**
* Fire a pulse from a memory crystal. Propagates through
* connected memories by BFS, creating visual rings and
* connection line flashes at each hop.
* @param {string} sourceId - Memory ID to pulse from
*/
function triggerPulse(sourceId) {
if (!_scene || !_spatialMemory) return;
const memories = _spatialMemory.getAllMemories();
const source = memories.find(m => m.id === sourceId);
if (!source) return;
// BFS through connection graph
const visited = new Set();
const queue = [{ id: sourceId, hop: 0, delay: 0 }];
visited.add(sourceId);
const memMap = {};
memories.forEach(m => { memMap[m.id] = m; });
while (queue.length > 0) {
const { id, hop, delay } = queue.shift();
if (hop > PULSE_MAX_HOPS) continue;
const mem = memMap[id];
if (!mem) continue;
// Schedule ring spawn
_scheduleRing(id, hop, delay);
// Schedule connection flashes to neighbors
const connections = mem.connections || [];
connections.forEach(targetId => {
if (visited.has(targetId)) return;
visited.add(targetId);
const target = memMap[targetId];
if (!target) return;
const travelDelay = delay + _travelTime(mem, target);
_scheduleConnectionFlash(id, targetId, delay, travelDelay);
queue.push({ id: targetId, hop: hop + 1, delay: travelDelay });
});
}
}
// ─── TRAVEL TIME ────────────────────────────────────
function _travelTime(src, dst) {
const sp = src.position || [0, 0, 0];
const dp = dst.position || [0, 0, 0];
const dx = sp[0] - dp[0], dy = sp[1] - dp[1], dz = sp[2] - dp[2];
const dist = Math.sqrt(dx * dx + dy * dy + dz * dz);
return dist / PULSE_SPEED;
}
// ─── SCHEDULE RING ──────────────────────────────────
function _scheduleRing(memId, hop, delay) {
const startTime = performance.now() + delay * 1000;
_activePulses.push({
type: 'ring',
memId,
hop,
startTime,
duration: RING_DURATION,
intensity: BASE_INTENSITY * Math.pow(HOP_DECAY, hop),
});
}
// ─── SCHEDULE CONNECTION FLASH ─────────────────────
function _scheduleConnectionFlash(fromId, toId, startDelay, endDelay) {
const startTime = performance.now() + startDelay * 1000;
_activePulses.push({
type: 'flash',
fromId,
toId,
startTime,
duration: endDelay - startDelay + FLASH_DURATION,
intensity: BASE_INTENSITY * Math.pow(HOP_DECAY, 0),
});
}
// ─── UPDATE (called per frame) ──────────────────────
function update(delta) {
const now = performance.now();
// Process scheduled pulses
for (let i = _activePulses.length - 1; i >= 0; i--) {
const pulse = _activePulses[i];
if (now < pulse.startTime) continue; // Not yet active
const elapsed = (now - pulse.startTime) / 1000;
const progress = Math.min(1, elapsed / pulse.duration);
if (progress >= 1) {
_activePulses.splice(i, 1);
continue;
}
if (pulse.type === 'ring') {
_renderRing(pulse, elapsed, progress);
} else if (pulse.type === 'flash') {
_renderConnectionFlash(pulse, elapsed, progress);
}
}
// Update existing ring meshes
for (let i = _pulseRings.length - 1; i >= 0; i--) {
const ring = _pulseRings[i];
ring.age += delta;
if (ring.age >= ring.maxAge) {
if (ring.mesh.parent) ring.mesh.parent.remove(ring.mesh);
ring.mesh.geometry.dispose();
ring.mesh.material.dispose();
_pulseRings.splice(i, 1);
continue;
}
const t = ring.age / ring.maxAge;
const scale = 1 + t * RING_MAX_RADIUS;
ring.mesh.scale.set(scale, scale, scale);
ring.mesh.material.opacity = ring.baseOpacity * (1 - t * t);
}
// Update connection flashes
for (let i = _connectionFlashes.length - 1; i >= 0; i--) {
const flash = _connectionFlashes[i];
flash.age += delta;
if (flash.age >= flash.maxAge) {
// Restore original material
if (flash.line && flash.line.material) {
flash.line.material.opacity = flash.originalOpacity;
flash.line.material.color.setHex(flash.originalColor);
}
_connectionFlashes.splice(i, 1);
continue;
}
const t = flash.age / flash.maxAge;
if (flash.line && flash.line.material) {
// Pulse opacity with travel effect
const wave = Math.sin(t * Math.PI);
flash.line.material.opacity = flash.originalOpacity + wave * 0.6;
flash.line.material.color.setHex(
_lerpColor(flash.originalColor, flash.flashColor, wave * 0.8)
);
}
}
}
// ─── RENDER RING ────────────────────────────────────
function _renderRing(pulse, elapsed, progress) {
// Find crystal position
const allMeshes = _spatialMemory.getCrystalMeshes();
let sourceMesh = null;
const memories = _spatialMemory.getAllMemories();
for (const mem of memories) {
if (mem.id === pulse.memId) {
// Find matching mesh
sourceMesh = allMeshes.find(m => m.userData.memId === pulse.memId);
break;
}
}
if (!sourceMesh) return;
// Only create ring once (check if we already have one for this pulse)
if (pulse._ringCreated) return;
pulse._ringCreated = true;
const ringGeo = new THREE.RingGeometry(0.1, 0.15, 32);
const region = memories.find(m => m.id === pulse.memId);
const color = _getRegionColor(region ? region.category : 'working');
const ringMat = new THREE.MeshBasicMaterial({
color: color,
transparent: true,
opacity: 0.8 * pulse.intensity / BASE_INTENSITY,
side: THREE.DoubleSide,
depthWrite: false,
});
const ring = new THREE.Mesh(ringGeo, ringMat);
ring.position.copy(sourceMesh.position);
ring.position.y += 0.1; // Slight offset above crystal
ring.rotation.x = -Math.PI / 2; // Flat on XZ plane
ring.lookAt(ring.position.x, ring.position.y + 1, ring.position.z);
_scene.add(ring);
_pulseRings.push({
mesh: ring,
age: 0,
maxAge: RING_DURATION,
baseOpacity: ringMat.opacity,
});
}
// ─── RENDER CONNECTION FLASH ────────────────────────
function _renderConnectionFlash(pulse, elapsed, progress) {
if (pulse._flashCreated) return;
// Find the connection line between from and to
const fromMesh = _findMesh(pulse.fromId);
const toMesh = _findMesh(pulse.toId);
if (!fromMesh || !toMesh) return;
// Create a temporary line for the flash
const points = [fromMesh.position.clone(), toMesh.position.clone()];
const geo = new THREE.BufferGeometry().setFromPoints(points);
const mat = new THREE.LineBasicMaterial({
color: 0x4af0c0,
transparent: true,
opacity: 0.0,
linewidth: 2,
});
const line = new THREE.Line(geo, mat);
_scene.add(line);
pulse._flashCreated = true;
_connectionFlashes.push({
line,
age: 0,
maxAge: pulse.duration,
originalOpacity: 0.0,
originalColor: 0x334455,
flashColor: 0x4af0c0,
});
}
// ─── HELPERS ────────────────────────────────────────
function _findMesh(memId) {
const meshes = _spatialMemory.getCrystalMeshes();
return meshes.find(m => m.userData.memId === memId) || null;
}
function _getRegionColor(category) {
const colors = {
documents: 0x4af0c0,
projects: 0xff6b35,
code: 0x7b5cff,
social: 0xff4488,
working: 0xffd700,
archive: 0x445566,
};
return colors[category] || colors.working;
}
function _lerpColor(a, b, t) {
const ar = (a >> 16) & 0xff, ag = (a >> 8) & 0xff, ab = a & 0xff;
const br = (b >> 16) & 0xff, bg = (b >> 8) & 0xff, bb = b & 0xff;
const rr = Math.round(ar + (br - ar) * t);
const rg = Math.round(ag + (bg - ag) * t);
const rb = Math.round(ab + (bb - ab) * t);
return (rr << 16) | (rg << 8) | rb;
}
// ─── PUBLIC API ─────────────────────────────────────
return {
init,
triggerPulse,
update,
};
})();
export { MemoryPulse };

nexus/components/spatial-memory.js

@@ -1,4 +1,41 @@
// ═══════════════════════════════════════════
// ═══
// ─── REGION VISIBILITY (Memory Filter) ──────────────
let _regionVisibility = {}; // category -> boolean (undefined = visible)
setRegionVisibility(category, visible) {
_regionVisibility[category] = visible;
for (const obj of Object.values(_memoryObjects)) {
if (obj.data.category === category && obj.mesh) {
obj.mesh.visible = visible !== false;
}
}
},
setAllRegionsVisible(visible) {
const cats = Object.keys(REGIONS);
for (const cat of cats) {
_regionVisibility[cat] = visible;
for (const obj of Object.values(_memoryObjects)) {
if (obj.data.category === cat && obj.mesh) {
obj.mesh.visible = visible;
}
}
}
},
getMemoryCountByRegion() {
const counts = {};
for (const obj of Object.values(_memoryObjects)) {
const cat = obj.data.category || 'working';
counts[cat] = (counts[cat] || 0) + 1;
}
return counts;
},
isRegionVisible(category) {
return _regionVisibility[category] !== false;
},
// PROJECT MNEMOSYNE — SPATIAL MEMORY SCHEMA
// ═══════════════════════════════════════════
//

nexus/mnemosyne/__init__.py

@@ -0,0 +1,24 @@
"""nexus.mnemosyne — The Living Holographic Archive.
Phase 1: Foundation — core archive, entry model, holographic linker,
ingestion pipeline, and CLI.
Builds on MemPalace vector memory to create interconnected meaning:
entries auto-reference related entries via semantic similarity,
forming a living archive that surfaces relevant context autonomously.
"""
from __future__ import annotations
from nexus.mnemosyne.archive import MnemosyneArchive
from nexus.mnemosyne.entry import ArchiveEntry
from nexus.mnemosyne.linker import HolographicLinker
from nexus.mnemosyne.ingest import ingest_from_mempalace, ingest_event
__all__ = [
"MnemosyneArchive",
"ArchiveEntry",
"HolographicLinker",
"ingest_from_mempalace",
"ingest_event",
]

nexus/mnemosyne/archive.py

@@ -0,0 +1,243 @@
"""MnemosyneArchive — core archive class.
The living holographic archive. Stores entries, maintains links,
and provides query interfaces for retrieving connected knowledge.
"""
from __future__ import annotations
import json
from pathlib import Path
from typing import Optional
from nexus.mnemosyne.entry import ArchiveEntry
from nexus.mnemosyne.linker import HolographicLinker
_EXPORT_VERSION = "1"
class MnemosyneArchive:
"""The holographic archive — stores and links entries.
Phase 1 uses JSON file storage. Phase 2 will integrate with
MemPalace (ChromaDB) for vector-semantic search.
"""
def __init__(self, archive_path: Optional[Path] = None):
self.path = archive_path or Path.home() / ".hermes" / "mnemosyne" / "archive.json"
self.path.parent.mkdir(parents=True, exist_ok=True)
self.linker = HolographicLinker()
self._entries: dict[str, ArchiveEntry] = {}
self._load()
def _load(self):
if self.path.exists():
try:
with open(self.path) as f:
data = json.load(f)
for entry_data in data.get("entries", []):
entry = ArchiveEntry.from_dict(entry_data)
self._entries[entry.id] = entry
        except (json.JSONDecodeError, KeyError):
            pass  # Start fresh on corrupt data

    def _save(self):
        data = {
            "entries": [e.to_dict() for e in self._entries.values()],
            "count": len(self._entries),
        }
        with open(self.path, "w") as f:
            json.dump(data, f, indent=2)

    def add(self, entry: ArchiveEntry, auto_link: bool = True) -> ArchiveEntry:
        """Add an entry to the archive. Auto-links to related entries."""
        self._entries[entry.id] = entry
        if auto_link:
            self.linker.apply_links(entry, list(self._entries.values()))
        self._save()
        return entry

    def get(self, entry_id: str) -> Optional[ArchiveEntry]:
        return self._entries.get(entry_id)

    def search(self, query: str, limit: int = 10) -> list[ArchiveEntry]:
        """Simple keyword search across titles, content, and topics."""
        query_tokens = set(query.lower().split())
        scored = []
        for entry in self._entries.values():
            text = f"{entry.title} {entry.content} {' '.join(entry.topics)}".lower()
            hits = sum(1 for t in query_tokens if t in text)
            if hits > 0:
                scored.append((hits, entry))
        scored.sort(key=lambda x: x[0], reverse=True)
        return [e for _, e in scored[:limit]]

    def semantic_search(self, query: str, limit: int = 10, threshold: float = 0.05) -> list[ArchiveEntry]:
        """Semantic search using holographic linker similarity.

        Scores each entry by Jaccard similarity between query tokens and entry
        tokens, then boosts entries with more inbound links (more "holographic").
        Falls back to keyword search if no entry meets the similarity threshold.

        Args:
            query: Natural language query string.
            limit: Maximum number of results to return.
            threshold: Minimum Jaccard similarity to count as a semantic match.

        Returns:
            List of ArchiveEntry sorted by combined relevance score, descending.
        """
        query_tokens = HolographicLinker._tokenize(query)
        if not query_tokens:
            return []
        # Count inbound links for each entry (how many entries link TO this one)
        inbound: dict[str, int] = {eid: 0 for eid in self._entries}
        for entry in self._entries.values():
            for linked_id in entry.links:
                if linked_id in inbound:
                    inbound[linked_id] += 1
        max_inbound = max(inbound.values(), default=1) or 1
        scored = []
        for entry in self._entries.values():
            entry_tokens = HolographicLinker._tokenize(
                f"{entry.title} {entry.content} {' '.join(entry.topics)}"
            )
            if not entry_tokens:
                continue
            intersection = query_tokens & entry_tokens
            union = query_tokens | entry_tokens
            jaccard = len(intersection) / len(union)
            if jaccard >= threshold:
                link_boost = inbound[entry.id] / max_inbound * 0.2  # up to 20% boost
                scored.append((jaccard + link_boost, entry))
        if scored:
            scored.sort(key=lambda x: x[0], reverse=True)
            return [e for _, e in scored[:limit]]
        # Graceful fallback to keyword search
        return self.search(query, limit=limit)
    def get_linked(self, entry_id: str, depth: int = 1) -> list[ArchiveEntry]:
        """Get entries linked to a given entry, up to the specified depth."""
        visited = set()
        seen_ids = set()  # guards against duplicate results when paths converge
        frontier = {entry_id}
        result = []
        for _ in range(depth):
            next_frontier = set()
            for eid in frontier:
                if eid in visited:
                    continue
                visited.add(eid)
                entry = self._entries.get(eid)
                if entry:
                    for linked_id in entry.links:
                        if linked_id not in visited and linked_id not in seen_ids:
                            linked = self._entries.get(linked_id)
                            if linked:
                                seen_ids.add(linked_id)
                                result.append(linked)
                                next_frontier.add(linked_id)
            frontier = next_frontier
        return result
    def by_topic(self, topic: str) -> list[ArchiveEntry]:
        """Get all entries tagged with a topic."""
        topic_lower = topic.lower()
        return [
            e for e in self._entries.values()
            if topic_lower in [t.lower() for t in e.topics]
        ]

    def remove(self, entry_id: str) -> bool:
        """Remove an entry and clean up all bidirectional links.

        Returns True if the entry existed and was removed, False otherwise.
        """
        if entry_id not in self._entries:
            return False
        # Remove back-links from all other entries
        for other in self._entries.values():
            if entry_id in other.links:
                other.links.remove(entry_id)
        del self._entries[entry_id]
        self._save()
        return True

    def export(
        self,
        query: Optional[str] = None,
        topics: Optional[list[str]] = None,
    ) -> dict:
        """Export a filtered subset of the archive.

        Args:
            query: keyword filter applied to title + content (case-insensitive)
            topics: list of topic tags; entries must match at least one

        Returns a JSON-serialisable dict with an ``entries`` list and metadata.
        """
        candidates = list(self._entries.values())
        if topics:
            lower_topics = {t.lower() for t in topics}
            candidates = [
                e for e in candidates
                if any(t.lower() in lower_topics for t in e.topics)
            ]
        if query:
            query_tokens = set(query.lower().split())
            candidates = [
                e for e in candidates
                if any(
                    token in f"{e.title} {e.content} {' '.join(e.topics)}".lower()
                    for token in query_tokens
                )
            ]
        return {
            "version": _EXPORT_VERSION,
            "filters": {"query": query, "topics": topics},
            "count": len(candidates),
            "entries": [e.to_dict() for e in candidates],
        }

    def topic_counts(self) -> dict[str, int]:
        """Return a dict mapping topic name → entry count, sorted by count desc."""
        counts: dict[str, int] = {}
        for entry in self._entries.values():
            for topic in entry.topics:
                counts[topic] = counts.get(topic, 0) + 1
        return dict(sorted(counts.items(), key=lambda x: x[1], reverse=True))

    @property
    def count(self) -> int:
        return len(self._entries)

    def stats(self) -> dict:
        entries = list(self._entries.values())
        total_links = sum(len(e.links) for e in entries)
        topics: set[str] = set()
        for e in entries:
            topics.update(e.topics)
        # Orphans: entries with no links at all
        orphans = sum(1 for e in entries if len(e.links) == 0)
        # Link density: average links per entry (0 when empty)
        n = len(entries)
        link_density = round(total_links / n, 4) if n else 0.0
        # Age distribution
        timestamps = sorted(e.created_at for e in entries)
        oldest_entry = timestamps[0] if timestamps else None
        newest_entry = timestamps[-1] if timestamps else None
        return {
            "entries": n,
            "total_links": total_links,
            "unique_topics": len(topics),
            "topics": sorted(topics),
            "orphans": orphans,
            "link_density": link_density,
            "oldest_entry": oldest_entry,
            "newest_entry": newest_entry,
        }
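For reviewers, the scoring rule in `semantic_search` is easy to sanity-check in isolation. A minimal standalone sketch of the same arithmetic (the `tokenize` and `combined_score` helpers here are hypothetical, not part of the module, but mirror `HolographicLinker._tokenize` and the Jaccard-plus-boost formula):

```python
import re

def tokenize(text: str) -> set[str]:
    # Mirrors HolographicLinker._tokenize: word tokens longer than 2 chars
    return {t for t in re.findall(r"\w+", text.lower()) if len(t) > 2}

def combined_score(query: str, text: str, inbound: int, max_inbound: int) -> float:
    """Jaccard similarity plus an inbound-link boost of up to 0.2."""
    q, e = tokenize(query), tokenize(text)
    if not q or not e:
        return 0.0
    jaccard = len(q & e) / len(q | e)
    return jaccard + inbound / max(max_inbound, 1) * 0.2

# A heavily linked entry edges out an identical-text entry with no inbound links
a = combined_score("python scripting", "Python scripting automation", 3, 3)
b = combined_score("python scripting", "Python scripting automation", 0, 3)
assert a > b
```

Because the boost tops out at 0.2, a well-linked entry can outrank a slightly better textual match, but never one whose Jaccard score is more than 0.2 higher.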

nexus/mnemosyne/cli.py

@@ -0,0 +1,140 @@
"""CLI interface for Mnemosyne.

Provides: mnemosyne ingest, mnemosyne search, mnemosyne link, mnemosyne stats,
mnemosyne topics, mnemosyne remove, mnemosyne export
"""
from __future__ import annotations

import argparse
import json
import sys

from nexus.mnemosyne.archive import MnemosyneArchive
from nexus.mnemosyne.ingest import ingest_event


def cmd_stats(args):
    archive = MnemosyneArchive()
    stats = archive.stats()
    print(json.dumps(stats, indent=2))


def cmd_search(args):
    archive = MnemosyneArchive()
    if getattr(args, "semantic", False):
        results = archive.semantic_search(args.query, limit=args.limit)
    else:
        results = archive.search(args.query, limit=args.limit)
    if not results:
        print("No results found.")
        return
    for entry in results:
        linked = len(entry.links)
        print(f"[{entry.id[:8]}] {entry.title}")
        print(f"  Source: {entry.source} | Topics: {', '.join(entry.topics)} | Links: {linked}")
        print(f"  {entry.content[:120]}...")
        print()


def cmd_ingest(args):
    archive = MnemosyneArchive()
    entry = ingest_event(
        archive,
        title=args.title,
        content=args.content,
        topics=args.topics.split(",") if args.topics else [],
    )
    print(f"Ingested: [{entry.id[:8]}] {entry.title} ({len(entry.links)} links)")


def cmd_link(args):
    archive = MnemosyneArchive()
    entry = archive.get(args.entry_id)
    if not entry:
        print(f"Entry not found: {args.entry_id}")
        sys.exit(1)
    linked = archive.get_linked(entry.id, depth=args.depth)
    if not linked:
        print("No linked entries found.")
        return
    for e in linked:
        print(f"  [{e.id[:8]}] {e.title} (source: {e.source})")


def cmd_topics(args):
    archive = MnemosyneArchive()
    counts = archive.topic_counts()
    if not counts:
        print("No topics found.")
        return
    for topic, count in counts.items():
        print(f"  {topic}: {count}")


def cmd_remove(args):
    archive = MnemosyneArchive()
    removed = archive.remove(args.entry_id)
    if removed:
        print(f"Removed entry: {args.entry_id}")
    else:
        print(f"Entry not found: {args.entry_id}")
        sys.exit(1)


def cmd_export(args):
    archive = MnemosyneArchive()
    topics = [t.strip() for t in args.topics.split(",")] if args.topics else None
    data = archive.export(query=args.query or None, topics=topics)
    print(json.dumps(data, indent=2))


def main():
    parser = argparse.ArgumentParser(prog="mnemosyne", description="The Living Holographic Archive")
    sub = parser.add_subparsers(dest="command")
    sub.add_parser("stats", help="Show archive statistics")
    s = sub.add_parser("search", help="Search the archive")
    s.add_argument("query", help="Search query")
    s.add_argument("-n", "--limit", type=int, default=10)
    s.add_argument("--semantic", action="store_true", help="Use holographic linker similarity scoring")
    i = sub.add_parser("ingest", help="Ingest a new entry")
    i.add_argument("--title", required=True)
    i.add_argument("--content", required=True)
    i.add_argument("--topics", default="", help="Comma-separated topics")
    l = sub.add_parser("link", help="Show linked entries")
    l.add_argument("entry_id", help="Exact entry ID")
    l.add_argument("-d", "--depth", type=int, default=1)
    sub.add_parser("topics", help="List all topics with entry counts")
    r = sub.add_parser("remove", help="Remove an entry by ID")
    r.add_argument("entry_id", help="Entry ID to remove")
    ex = sub.add_parser("export", help="Export filtered archive data as JSON")
    ex.add_argument("-q", "--query", default="", help="Keyword filter")
    ex.add_argument("-t", "--topics", default="", help="Comma-separated topic filter")
    args = parser.parse_args()
    if not args.command:
        parser.print_help()
        sys.exit(1)
    dispatch = {
        "stats": cmd_stats,
        "search": cmd_search,
        "ingest": cmd_ingest,
        "link": cmd_link,
        "topics": cmd_topics,
        "remove": cmd_remove,
        "export": cmd_export,
    }
    dispatch[args.command](args)


if __name__ == "__main__":
    main()

nexus/mnemosyne/entry.py

@@ -0,0 +1,44 @@
"""Archive entry model for Mnemosyne.

Each entry is a node in the holographic graph — a piece of meaning
with metadata, content, and links to related entries.
"""
from __future__ import annotations

import uuid
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional


@dataclass
class ArchiveEntry:
    """A single node in the Mnemosyne holographic archive."""

    id: str = field(default_factory=lambda: str(uuid.uuid4()))
    title: str = ""
    content: str = ""
    source: str = ""  # "mempalace", "event", "manual", etc.
    source_ref: Optional[str] = None  # original MemPalace ID, event URI, etc.
    topics: list[str] = field(default_factory=list)
    metadata: dict = field(default_factory=dict)
    created_at: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())
    links: list[str] = field(default_factory=list)  # IDs of related entries

    def to_dict(self) -> dict:
        return {
            "id": self.id,
            "title": self.title,
            "content": self.content,
            "source": self.source,
            "source_ref": self.source_ref,
            "topics": self.topics,
            "metadata": self.metadata,
            "created_at": self.created_at,
            "links": self.links,
        }

    @classmethod
    def from_dict(cls, data: dict) -> ArchiveEntry:
        return cls(**{k: v for k, v in data.items() if k in cls.__dataclass_fields__})
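One detail worth noting in review: `from_dict` filters incoming keys through `__dataclass_fields__`, so stored JSON from an older or newer schema loads without raising `TypeError`. A minimal standalone sketch of the same pattern, using a hypothetical `Node` dataclass rather than `ArchiveEntry` itself:

```python
from dataclasses import dataclass, field, fields

@dataclass
class Node:
    id: str = "n1"
    title: str = ""
    links: list[str] = field(default_factory=list)

    @classmethod
    def from_dict(cls, data: dict) -> "Node":
        # Drop keys that are not dataclass fields, e.g. leftovers from an older schema
        names = {f.name for f in fields(cls)}
        return cls(**{k: v for k, v in data.items() if k in names})

# An unknown "legacy" key is silently ignored instead of crashing the load
n = Node.from_dict({"id": "abc", "title": "T", "links": [], "legacy": 1})
assert (n.id, n.title) == ("abc", "T")
```

The trade-off is that typos in field names are also silently dropped, which is acceptable for a load path that must tolerate corrupt or stale data.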

nexus/mnemosyne/ingest.py

@@ -0,0 +1,62 @@
"""Ingestion pipeline — feeds data into the archive.

Supports ingesting from MemPalace, raw events, and manual entries.
"""
from __future__ import annotations

from typing import Optional

from nexus.mnemosyne.archive import MnemosyneArchive
from nexus.mnemosyne.entry import ArchiveEntry


def ingest_from_mempalace(
    archive: MnemosyneArchive,
    mempalace_entries: list[dict],
) -> int:
    """Ingest entries from a MemPalace export.

    Each dict should have at least: content, metadata (optional).
    Returns count of new entries added.
    """
    added = 0
    for mp_entry in mempalace_entries:
        content = mp_entry.get("content", "")
        metadata = mp_entry.get("metadata", {})
        source_ref = mp_entry.get("id", "")
        # Skip if already ingested
        if any(e.source_ref == source_ref for e in archive._entries.values()):
            continue
        entry = ArchiveEntry(
            title=metadata.get("title", content[:80]),
            content=content,
            source="mempalace",
            source_ref=source_ref,
            topics=metadata.get("topics", []),
            metadata=metadata,
        )
        archive.add(entry)
        added += 1
    return added


def ingest_event(
    archive: MnemosyneArchive,
    title: str,
    content: str,
    topics: Optional[list[str]] = None,
    source: str = "event",
    metadata: Optional[dict] = None,
) -> ArchiveEntry:
    """Ingest a single event into the archive."""
    entry = ArchiveEntry(
        title=title,
        content=content,
        source=source,
        topics=topics or [],
        metadata=metadata or {},
    )
    return archive.add(entry)
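The `source_ref` check is what makes re-running a MemPalace import idempotent: an entry whose ref is already in the archive is skipped rather than duplicated. A standalone sketch of that dedupe logic (the `ingest_batch` helper is hypothetical and operates on plain dicts instead of the archive):

```python
def ingest_batch(existing_refs: set[str], batch: list[dict]) -> list[dict]:
    """Return only the batch items whose 'id' has not been ingested yet.

    Sketch of the dedupe check in ingest_from_mempalace, which scans the
    archive's existing source_ref values before adding each entry.
    """
    fresh = []
    for item in batch:
        ref = item.get("id", "")
        if ref and ref in existing_refs:
            continue  # already ingested; re-running the import is a no-op
        fresh.append(item)
        if ref:
            existing_refs.add(ref)  # later duplicates in the same batch are skipped too
    return fresh

batch = [{"id": "mp-1"}, {"id": "mp-2"}, {"id": "mp-1"}]
assert [i["id"] for i in ingest_batch(set(), batch)] == ["mp-1", "mp-2"]
```

As in the real function, duplicates within a single batch are caught as well, because each accepted ref joins the known set before the next item is checked.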

nexus/mnemosyne/linker.py

@@ -0,0 +1,73 @@
"""Holographic link engine.

Computes semantic similarity between archive entries and creates
bidirectional links, forming the holographic graph structure.
"""
from __future__ import annotations

import re

from nexus.mnemosyne.entry import ArchiveEntry


class HolographicLinker:
    """Links archive entries via semantic similarity.

    Phase 1 uses simple keyword overlap as the similarity metric.
    Phase 2 will integrate ChromaDB embeddings from MemPalace.
    """

    def __init__(self, similarity_threshold: float = 0.15):
        self.threshold = similarity_threshold

    def compute_similarity(self, a: ArchiveEntry, b: ArchiveEntry) -> float:
        """Compute similarity score between two entries.

        Returns float in [0, 1]. Phase 1: Jaccard similarity on
        combined title+content tokens. Phase 2: cosine similarity
        on ChromaDB embeddings.
        """
        tokens_a = self._tokenize(f"{a.title} {a.content}")
        tokens_b = self._tokenize(f"{b.title} {b.content}")
        if not tokens_a or not tokens_b:
            return 0.0
        intersection = tokens_a & tokens_b
        union = tokens_a | tokens_b
        return len(intersection) / len(union)

    def find_links(self, entry: ArchiveEntry, candidates: list[ArchiveEntry]) -> list[tuple[str, float]]:
        """Find entries worth linking to.

        Returns list of (entry_id, similarity_score) tuples above threshold.
        """
        results = []
        for candidate in candidates:
            if candidate.id == entry.id:
                continue
            score = self.compute_similarity(entry, candidate)
            if score >= self.threshold:
                results.append((candidate.id, score))
        results.sort(key=lambda x: x[1], reverse=True)
        return results

    def apply_links(self, entry: ArchiveEntry, candidates: list[ArchiveEntry]) -> int:
        """Auto-link an entry to related entries. Returns count of new links."""
        matches = self.find_links(entry, candidates)
        new_links = 0
        for eid, score in matches:
            if eid not in entry.links:
                entry.links.append(eid)
                new_links += 1
            # Bidirectional: ensure the matched entry links back
            for c in candidates:
                if c.id == eid and entry.id not in c.links:
                    c.links.append(entry.id)
        return new_links

    @staticmethod
    def _tokenize(text: str) -> set[str]:
        """Simple whitespace + punctuation tokenizer."""
        tokens = set(re.findall(r"\w+", text.lower()))
        # Remove very short tokens
        return {t for t in tokens if len(t) > 2}
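The default `similarity_threshold` of 0.15 is doing real work here, separating topically related pairs from unrelated ones. A standalone sketch showing which pairs clear it (the `jaccard` helper is hypothetical but mirrors `compute_similarity`, including the drop of tokens shorter than three characters):

```python
import re

def jaccard(a: str, b: str) -> float:
    """Token-set Jaccard similarity, as in HolographicLinker.compute_similarity."""
    def tok(s: str) -> set[str]:
        return {t for t in re.findall(r"\w+", s.lower()) if len(t) > 2}
    ta, tb = tok(a), tok(b)
    if not ta or not tb:
        return 0.0
    return len(ta & tb) / len(ta | tb)

# Related entries clear the 0.15 default threshold; unrelated ones do not
assert jaccard("Python coding scripts", "Python scripts automation") >= 0.15
assert jaccard("Python coding scripts", "pasta carbonara recipe") < 0.15
```

Because Jaccard divides by the union, long entries dilute their own score; that is one motivation for the planned Phase 2 move to embedding cosine similarity.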


@@ -0,0 +1,276 @@
"""Tests for Mnemosyne archive core."""
import tempfile
from pathlib import Path

from nexus.mnemosyne.entry import ArchiveEntry
from nexus.mnemosyne.linker import HolographicLinker
from nexus.mnemosyne.archive import MnemosyneArchive
from nexus.mnemosyne.ingest import ingest_event, ingest_from_mempalace


def test_entry_roundtrip():
    e = ArchiveEntry(title="Test", content="Hello world", topics=["test"])
    d = e.to_dict()
    e2 = ArchiveEntry.from_dict(d)
    assert e2.id == e.id
    assert e2.title == "Test"


def test_linker_similarity():
    linker = HolographicLinker()
    a = ArchiveEntry(title="Python coding", content="Writing Python scripts for automation")
    b = ArchiveEntry(title="Python scripting", content="Automating tasks with Python scripts")
    c = ArchiveEntry(title="Cooking recipes", content="How to make pasta carbonara")
    assert linker.compute_similarity(a, b) > linker.compute_similarity(a, c)


def test_archive_add_and_search():
    with tempfile.TemporaryDirectory() as tmp:
        path = Path(tmp) / "test_archive.json"
        archive = MnemosyneArchive(archive_path=path)
        ingest_event(archive, title="First entry", content="Hello archive", topics=["test"])
        ingest_event(archive, title="Second entry", content="Another record", topics=["test", "demo"])
        assert archive.count == 2
        results = archive.search("hello")
        assert len(results) == 1
        assert results[0].title == "First entry"


def test_archive_auto_linking():
    with tempfile.TemporaryDirectory() as tmp:
        path = Path(tmp) / "test_archive.json"
        archive = MnemosyneArchive(archive_path=path)
        e1 = ingest_event(archive, title="Python automation", content="Building automation tools in Python")
        e2 = ingest_event(archive, title="Python scripting", content="Writing automation scripts using Python")
        # Both should be linked due to shared tokens
        assert len(e1.links) > 0 or len(e2.links) > 0


def test_ingest_from_mempalace():
    with tempfile.TemporaryDirectory() as tmp:
        path = Path(tmp) / "test_archive.json"
        archive = MnemosyneArchive(archive_path=path)
        mp_entries = [
            {"id": "mp-1", "content": "Test memory content", "metadata": {"title": "Test", "topics": ["demo"]}},
            {"id": "mp-2", "content": "Another memory", "metadata": {"title": "Memory 2"}},
        ]
        count = ingest_from_mempalace(archive, mp_entries)
        assert count == 2
        assert archive.count == 2


def test_archive_persistence():
    with tempfile.TemporaryDirectory() as tmp:
        path = Path(tmp) / "test_archive.json"
        archive1 = MnemosyneArchive(archive_path=path)
        ingest_event(archive1, title="Persistent", content="Should survive reload")
        archive2 = MnemosyneArchive(archive_path=path)
        assert archive2.count == 1
        results = archive2.search("persistent")
        assert len(results) == 1


def test_archive_remove_basic():
    with tempfile.TemporaryDirectory() as tmp:
        path = Path(tmp) / "test_archive.json"
        archive = MnemosyneArchive(archive_path=path)
        e1 = ingest_event(archive, title="Alpha", content="First entry", topics=["x"])
        assert archive.count == 1
        result = archive.remove(e1.id)
        assert result is True
        assert archive.count == 0
        assert archive.get(e1.id) is None


def test_archive_remove_nonexistent():
    with tempfile.TemporaryDirectory() as tmp:
        path = Path(tmp) / "test_archive.json"
        archive = MnemosyneArchive(archive_path=path)
        result = archive.remove("does-not-exist")
        assert result is False


def test_archive_remove_cleans_backlinks():
    with tempfile.TemporaryDirectory() as tmp:
        path = Path(tmp) / "test_archive.json"
        archive = MnemosyneArchive(archive_path=path)
        e1 = ingest_event(archive, title="Python automation", content="Building automation tools in Python")
        e2 = ingest_event(archive, title="Python scripting", content="Writing automation scripts using Python")
        # At least one direction should be linked
        assert e1.id in e2.links or e2.id in e1.links
        # Remove e1; e2 must no longer reference it
        archive.remove(e1.id)
        e2_fresh = archive.get(e2.id)
        assert e2_fresh is not None
        assert e1.id not in e2_fresh.links


def test_archive_remove_persists():
    with tempfile.TemporaryDirectory() as tmp:
        path = Path(tmp) / "test_archive.json"
        a1 = MnemosyneArchive(archive_path=path)
        e = ingest_event(a1, title="Gone", content="Will be removed")
        a1.remove(e.id)
        a2 = MnemosyneArchive(archive_path=path)
        assert a2.count == 0


def test_archive_export_unfiltered():
    with tempfile.TemporaryDirectory() as tmp:
        path = Path(tmp) / "test_archive.json"
        archive = MnemosyneArchive(archive_path=path)
        ingest_event(archive, title="A", content="content a", topics=["alpha"])
        ingest_event(archive, title="B", content="content b", topics=["beta"])
        data = archive.export()
        assert data["count"] == 2
        assert len(data["entries"]) == 2
        assert data["filters"] == {"query": None, "topics": None}


def test_archive_export_by_topic():
    with tempfile.TemporaryDirectory() as tmp:
        path = Path(tmp) / "test_archive.json"
        archive = MnemosyneArchive(archive_path=path)
        ingest_event(archive, title="A", content="content a", topics=["alpha"])
        ingest_event(archive, title="B", content="content b", topics=["beta"])
        data = archive.export(topics=["alpha"])
        assert data["count"] == 1
        assert data["entries"][0]["title"] == "A"


def test_archive_export_by_query():
    with tempfile.TemporaryDirectory() as tmp:
        path = Path(tmp) / "test_archive.json"
        archive = MnemosyneArchive(archive_path=path)
        ingest_event(archive, title="Hello world", content="greetings", topics=[])
        ingest_event(archive, title="Goodbye", content="farewell", topics=[])
        data = archive.export(query="hello")
        assert data["count"] == 1
        assert data["entries"][0]["title"] == "Hello world"


def test_archive_export_combined_filters():
    with tempfile.TemporaryDirectory() as tmp:
        path = Path(tmp) / "test_archive.json"
        archive = MnemosyneArchive(archive_path=path)
        ingest_event(archive, title="Hello world", content="greetings", topics=["alpha"])
        ingest_event(archive, title="Hello again", content="greetings again", topics=["beta"])
        data = archive.export(query="hello", topics=["alpha"])
        assert data["count"] == 1
        assert data["entries"][0]["title"] == "Hello world"


def test_archive_stats_richer():
    with tempfile.TemporaryDirectory() as tmp:
        path = Path(tmp) / "test_archive.json"
        archive = MnemosyneArchive(archive_path=path)
        # All four new fields present when archive is empty
        s = archive.stats()
        assert "orphans" in s
        assert "link_density" in s
        assert "oldest_entry" in s
        assert "newest_entry" in s
        assert s["orphans"] == 0
        assert s["link_density"] == 0.0
        assert s["oldest_entry"] is None
        assert s["newest_entry"] is None


def test_archive_stats_orphan_count():
    with tempfile.TemporaryDirectory() as tmp:
        path = Path(tmp) / "test_archive.json"
        archive = MnemosyneArchive(archive_path=path)
        # Two entries with very different content → unlikely to auto-link
        ingest_event(archive, title="Zebras", content="Zebra stripes savannah Africa", topics=[])
        ingest_event(archive, title="Compiler", content="Lexer parser AST bytecode", topics=[])
        s = archive.stats()
        # Structural checks only: linking behavior may vary with the threshold
        assert s["orphans"] >= 0
        assert s["link_density"] >= 0.0
        assert s["oldest_entry"] is not None
        assert s["newest_entry"] is not None


def test_semantic_search_returns_results():
    with tempfile.TemporaryDirectory() as tmp:
        path = Path(tmp) / "test_archive.json"
        archive = MnemosyneArchive(archive_path=path)
        ingest_event(archive, title="Python automation", content="Building automation tools in Python")
        ingest_event(archive, title="Cooking recipes", content="How to make pasta carbonara with cheese")
        results = archive.semantic_search("python scripting", limit=5)
        assert len(results) > 0
        assert results[0].title == "Python automation"


def test_semantic_search_link_boost():
    """Entries with more inbound links rank higher when Jaccard is equal."""
    with tempfile.TemporaryDirectory() as tmp:
        path = Path(tmp) / "test_archive.json"
        archive = MnemosyneArchive(archive_path=path)
        # Create two similar entries; manually give one more links
        e1 = ingest_event(archive, title="Machine learning", content="Neural networks deep learning models")
        e2 = ingest_event(archive, title="Machine learning basics", content="Neural networks deep learning intro")
        # Add a third entry that links to e1 so e1 has more inbound links
        e3 = ingest_event(archive, title="AI overview", content="Artificial intelligence machine learning")
        # Manually give e1 an extra inbound link by adding e3 -> e1
        if e1.id not in e3.links:
            e3.links.append(e1.id)
            archive._save()
        results = archive.semantic_search("machine learning neural networks", limit=5)
        assert len(results) >= 2
        # e1 should rank at or near top
        assert results[0].id in {e1.id, e2.id}


def test_semantic_search_fallback_to_keyword():
    """Falls back to keyword search when no entry meets the Jaccard threshold."""
    with tempfile.TemporaryDirectory() as tmp:
        path = Path(tmp) / "test_archive.json"
        archive = MnemosyneArchive(archive_path=path)
        ingest_event(archive, title="Exact match only", content="unique xyzzy token here")
        # threshold=1.0 ensures no semantic match, triggering fallback
        results = archive.semantic_search("xyzzy", limit=5, threshold=1.0)
        # Fallback keyword search should find it
        assert len(results) == 1
        assert results[0].title == "Exact match only"


def test_semantic_search_empty_archive():
    with tempfile.TemporaryDirectory() as tmp:
        path = Path(tmp) / "test_archive.json"
        archive = MnemosyneArchive(archive_path=path)
        results = archive.semantic_search("anything", limit=5)
        assert results == []


def test_semantic_search_vs_keyword_relevance():
    """Semantic search ranks token-overlapping entries above unrelated ones."""
    with tempfile.TemporaryDirectory() as tmp:
        path = Path(tmp) / "test_archive.json"
        archive = MnemosyneArchive(archive_path=path)
        ingest_event(archive, title="Python scripting", content="Writing scripts with Python for automation tasks")
        ingest_event(archive, title="Baking bread", content="Mix flour water yeast knead bake oven")
        # "coding" is semantically unrelated to baking but related to python scripting
        results = archive.semantic_search("coding scripts automation")
        assert len(results) > 0
        assert results[0].title == "Python scripting"


def test_archive_topic_counts():
    with tempfile.TemporaryDirectory() as tmp:
        path = Path(tmp) / "test_archive.json"
        archive = MnemosyneArchive(archive_path=path)
        ingest_event(archive, title="A", content="x", topics=["python", "automation"])
        ingest_event(archive, title="B", content="y", topics=["python"])
        ingest_event(archive, title="C", content="z", topics=["automation"])
        counts = archive.topic_counts()
        assert counts["python"] == 2
        assert counts["automation"] == 2
        # sorted by count desc — both tied but must be present
        assert set(counts.keys()) == {"python", "automation"}

style.css

@@ -1546,3 +1546,170 @@ canvas#nexus-canvas {
    padding-top: 4px;
    border-top: 1px solid rgba(255, 255, 255, 0.06);
}

/* ═══ MNEMOSYNE: Memory Filter Panel ═══ */

.memory-filter {
    position: fixed;
    top: 50%;
    right: 20px;
    transform: translateY(-50%);
    width: 300px;
    max-height: 70vh;
    background: rgba(10, 12, 20, 0.92);
    backdrop-filter: blur(16px);
    -webkit-backdrop-filter: blur(16px);
    border: 1px solid rgba(74, 240, 192, 0.2);
    border-radius: 12px;
    display: flex;
    flex-direction: column;
    z-index: 100;
    animation: slideInRight 0.3s ease;
    box-shadow: 0 8px 32px rgba(0, 0, 0, 0.5), inset 0 1px 0 rgba(255, 255, 255, 0.05);
    overflow: hidden;
}

@keyframes slideInRight {
    from { transform: translateY(-50%) translateX(30px); opacity: 0; }
    to { transform: translateY(-50%) translateX(0); opacity: 1; }
}

.filter-header {
    display: flex;
    align-items: center;
    justify-content: space-between;
    padding: 14px 16px 10px;
    border-bottom: 1px solid rgba(74, 240, 192, 0.1);
}

.filter-title {
    color: #4af0c0;
    font-size: 14px;
    font-weight: 600;
    letter-spacing: 0.5px;
}

.filter-close {
    background: none;
    border: none;
    color: rgba(255, 255, 255, 0.4);
    font-size: 16px;
    cursor: pointer;
    padding: 2px 6px;
    border-radius: 4px;
    transition: all 0.2s;
}

.filter-close:hover {
    color: #fff;
    background: rgba(255, 255, 255, 0.1);
}

.filter-controls {
    display: flex;
    gap: 8px;
    padding: 10px 16px;
}

.filter-btn {
    flex: 1;
    padding: 6px 0;
    background: rgba(74, 240, 192, 0.08);
    border: 1px solid rgba(74, 240, 192, 0.2);
    border-radius: 6px;
    color: rgba(255, 255, 255, 0.7);
    font-size: 12px;
    cursor: pointer;
    transition: all 0.2s;
}

.filter-btn:hover {
    background: rgba(74, 240, 192, 0.15);
    color: #fff;
}

.filter-list {
    overflow-y: auto;
    padding: 6px 8px 12px;
    flex: 1;
}

.filter-item {
    display: flex;
    align-items: center;
    justify-content: space-between;
    padding: 8px;
    border-radius: 6px;
    transition: background 0.15s;
}

.filter-item:hover {
    background: rgba(255, 255, 255, 0.04);
}

.filter-item-left {
    display: flex;
    align-items: center;
    gap: 8px;
}

.filter-dot {
    width: 10px;
    height: 10px;
    border-radius: 50%;
    flex-shrink: 0;
}

.filter-label {
    color: rgba(255, 255, 255, 0.85);
    font-size: 13px;
}

.filter-item-right {
    display: flex;
    align-items: center;
    gap: 10px;
}

.filter-count {
    color: rgba(255, 255, 255, 0.35);
    font-size: 12px;
    min-width: 20px;
    text-align: right;
}

/* Toggle switch */
.filter-toggle {
    position: relative;
    width: 34px;
    height: 18px;
    display: inline-block;
}

.filter-toggle input {
    opacity: 0;
    width: 0;
    height: 0;
}

.filter-slider {
    position: absolute;
    inset: 0;
    background: rgba(255, 255, 255, 0.12);
    border-radius: 9px;
    cursor: pointer;
    transition: all 0.2s;
}

.filter-slider::before {
    content: '';
    position: absolute;
    height: 14px;
    width: 14px;
    left: 2px;
    bottom: 2px;
    background: rgba(255, 255, 255, 0.6);
    border-radius: 50%;
    transition: all 0.2s;
}

.filter-toggle input:checked + .filter-slider {
    background: rgba(74, 240, 192, 0.4);
}

.filter-toggle input:checked + .filter-slider::before {
    transform: translateX(16px);
    background: #4af0c0;
}