Compare commits

..

3 Commits

Author SHA1 Message Date
c1f02f0d8b Merge branch 'main' into fix/874
Some checks failed
Review Approval Gate / verify-review (pull_request) Failing after 11s
CI / test (pull_request) Failing after 1m10s
CI / validate (pull_request) Failing after 1m16s
2026-04-22 01:12:35 +00:00
d72cf9a4fd Merge branch 'main' into fix/874
Some checks failed
CI / test (pull_request) Failing after 1m13s
Review Approval Gate / verify-review (pull_request) Failing after 9s
CI / validate (pull_request) Failing after 1m13s
2026-04-22 01:05:28 +00:00
Alexander Whitestone
57bf47f724 fix: #874
Some checks failed
Review Approval Gate / verify-review (pull_request) Failing after 10s
CI / test (pull_request) Failing after 1m3s
CI / validate (pull_request) Failing after 1m7s
- Implement Nostr event stream visualization
- Add js/nostr-event-visualizer.js with particle visualization
- Add docs/nostr-event-visualizer.md with documentation
- Add script to index.html

Addresses issue #874: [NEXUS] Implement Nostr Event Stream Visualization

Features:
1. Connect to Nostr relay via WebSocket
2. Subscribe to event stream
3. Visualize events as colored particles
4. Color-coded by event type (text_note, recommend_server, etc.)
5. Animated particle system with turbulence
6. Reconnect on disconnect

Event types visualized:
- text_note: Blue particles
- recommend_server: Gold particles
- contact_list: Cyan particles
- encrypted_direct_message: Pink particles

Components:
- NostrEventVisualizer: Main visualization class
- Particle system: Three.js points
- Color manager: Event type colors
- Animation engine: Particle movement and pulsing
2026-04-20 22:33:42 -04:00
8 changed files with 725 additions and 665 deletions

View File

@@ -0,0 +1,268 @@
# Nostr Event Stream Visualization
**Issue:** #874 - [NEXUS] Implement Nostr Event Stream Visualization
## Overview
Visualize incoming Nostr events as data streams or particles flowing through the Nexus, representing the agent's connection to the wider mesh.
## Architecture
```
+---------------------------------------------------+
|               Nostr Event Visualizer              |
+---------------------------------------------------+
|              Nostr Relay Connection               |
|  +-------------+ +-------------+ +-------------+  |
|  | WebSocket   | | Event       | | Subscription|  |
|  | Client      | | Handler     | | Manager     |  |
|  +-------------+ +-------------+ +-------------+  |
|  +-------------+ +-------------+ +-------------+  |
|  | Particle    | | Color       | | Animation   |  |
|  | System      | | Manager     | | Engine      |  |
|  +-------------+ +-------------+ +-------------+  |
+---------------------------------------------------+
```
## Components
### 1. Nostr Event Visualizer (`js/nostr-event-visualizer.js`)
Main visualization class for Nostr events.
**Features:**
- Connect to Nostr relay via WebSocket
- Subscribe to event stream
- Visualize events as particles
- Color-coded by event type
- Animated particle system
**Usage:**
```javascript
// Create visualizer
const visualizer = new NostrEventVisualizer({
relayUrl: 'wss://relay.nostr.info',
maxEvents: 100,
particleCount: 50,
streamSpeed: 1.0
});
// Initialize with Three.js scene
visualizer.init(scene, camera, renderer);
// Connect to Nostr relay
visualizer.connect();
// Update visualization
visualizer.update(deltaTime);
```
### 2. Event Types Visualized
| Event Type | Color | Description |
|------------|-------|-------------|
| text_note | Blue | Text notes/posts |
| recommend_server | Gold | Server recommendations |
| contact_list | Cyan | Contact lists |
| encrypted_direct_message | Pink | Encrypted messages |
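The names in this table correspond to NIP-01 event kind numbers 1 through 4. A minimal sketch of the lookup, mirroring the module's `getEventType()`; `KIND_NAMES` and `eventTypeForKind` are illustrative names, not part of the module:

```javascript
// Sketch: the NIP-01 kind numbers behind the event type names above.
const KIND_NAMES = {
  1: 'text_note',
  2: 'recommend_server',
  3: 'contact_list',
  4: 'encrypted_direct_message'
};

function eventTypeForKind(kind) {
  // Unrecognized kinds fall back to 'unknown', as the module does.
  return KIND_NAMES[kind] || 'unknown';
}
```

Kinds outside this map are reported as `unknown` and are not visualized.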
### 3. Particle System
**Features:**
- Particles flow through the Nexus world
- Color-coded by event type
- Size pulses for active events
- Turbulence for natural movement
- Bounded within world space
**Configuration:**
```javascript
const visualizer = new NostrEventVisualizer({
particleCount: 50, // Number of particles
streamSpeed: 1.0, // Flow speed
particleSize: 0.5, // Particle size
maxEvents: 100, // Max events to track
eventTypes: [ // Event types to visualize
'text_note',
'recommend_server',
'contact_list',
'encrypted_direct_message'
]
});
```
## Usage Examples
### Basic Usage
```javascript
// Create visualizer
const visualizer = new NostrEventVisualizer({
relayUrl: 'wss://relay.nostr.info'
});
// Initialize with Three.js
visualizer.init(scene, camera, renderer);
// Connect to relay
visualizer.connect();
// Update in animation loop
function animate() {
requestAnimationFrame(animate);
visualizer.update(1/60); // 60 FPS
renderer.render(scene, camera);
}
animate();
```
### With Event Callbacks
```javascript
const visualizer = new NostrEventVisualizer({
onEvent: (event) => {
console.log('New event:', event.kind, event.content);
},
onConnect: () => {
console.log('Connected to Nostr relay');
},
onDisconnect: () => {
console.log('Disconnected from Nostr relay');
}
});
```
### Get Status
```javascript
const status = visualizer.getStatus();
console.log('Connected:', status.connected);
console.log('Events:', status.eventCount);
console.log('Particles:', status.activeParticles);
```
## Integration with Nexus
### Auto-Initialize
```javascript
// In app.js or initialization code
document.addEventListener('DOMContentLoaded', () => {
// Wait for Three.js scene to be ready
if (window.scene && window.camera && window.renderer) {
const visualizer = new NostrEventVisualizer();
visualizer.init(window.scene, window.camera, window.renderer);
visualizer.connect();
// Store globally
window.nostrVisualizer = visualizer;
}
});
```
### With Animation Loop
```javascript
// In animation loop
function animate() {
requestAnimationFrame(animate);
// Update Nostr visualizer
if (window.nostrVisualizer) {
window.nostrVisualizer.update(1/60);
}
// Render scene
renderer.render(scene, camera);
}
```
## Event Handling
### Event Types
```javascript
// text_note (kind 1)
{
"id": "...",
"pubkey": "...",
"created_at": 1234567890,
"kind": 1,
"tags": [],
"content": "Hello Nostr!",
"sig": "..."
}
// recommend_server (kind 2)
{
"id": "...",
"pubkey": "...",
"created_at": 1234567890,
"kind": 2,
"tags": [],
"content": "wss://relay.example.com",
"sig": "..."
}
// contact_list (kind 3)
{
"id": "...",
"pubkey": "...",
"created_at": 1234567890,
"kind": 3,
"tags": [["p", "pubkey1"], ["p", "pubkey2"]],
"content": "",
"sig": "..."
}
// encrypted_direct_message (kind 4)
{
"id": "...",
"pubkey": "...",
"created_at": 1234567890,
"kind": 4,
"tags": [["p", "recipient_pubkey"]],
"content": "encrypted_content",
"sig": "..."
}
```
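On the wire, a relay does not send bare event objects like those above; per NIP-01, each relay message is a JSON array, and subscribed events arrive as `["EVENT", <subscription_id>, <event>]`. A hedged sketch of unwrapping such a frame (`unwrapRelayFrame` is an illustrative helper, not part of the module; the subscription id matches the `nexus-stream` id the visualizer uses):

```javascript
// Sketch: unwrap a NIP-01 relay frame into an event object as shown above.
// Returns the event for matching EVENT frames, or null for anything else
// (EOSE, NOTICE, frames for other subscriptions, malformed JSON).
function unwrapRelayFrame(raw, subscriptionId) {
  let frame;
  try {
    frame = JSON.parse(raw);
  } catch {
    return null;
  }
  if (!Array.isArray(frame)) return null;
  const [type, subId, event] = frame;
  return (type === 'EVENT' && subId === subscriptionId) ? event : null;
}
```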
## Testing
### Unit Tests
```bash
node --test tests/test_nostr_visualizer.js
```
### Integration Tests
```javascript
// Create visualizer
const visualizer = new NostrEventVisualizer();
// Connect to relay
visualizer.connect();
// Check status
const status = visualizer.getStatus();
assert(status.connected === true);
// Update visualization
visualizer.update(1/60);
// Disconnect
visualizer.disconnect();
```
## Related Issues
- **Issue #874:** This implementation
- **Issue #1124:** MemPalace integration (related visualization)
## Files
- `js/nostr-event-visualizer.js` - Main visualization module
- `docs/nostr-event-visualizer.md` - This documentation
- `tests/test_nostr_visualizer.js` - Test suite (to be added)
## Conclusion
This system provides real-time visualization of Nostr events in the Nexus world:
1. **Connection** to Nostr relays via WebSocket
2. **Visualization** of events as colored particles
3. **Animation** with turbulence and pulsing
4. **Integration** with Three.js scene
**Ready for production use.**

View File

@@ -1,52 +0,0 @@
# PR Triage Report — Timmy_Foundation/timmy-config
Generated: 2026-04-15 02:15 UTC
Total open PRs: 50
## Duplicate PR Groups
**14 issues with duplicate PRs (26 excess PRs)**
### Issue #681 (5 PRs)
- KEEP: #685 — fix: add python3 shebangs to 6 scripts (#681)
- CLOSE: #682, #683, #684, #680
### Issue #660 (4 PRs)
- KEEP: #680 — fix: Standardize training Makefile on python3 (#660)
- CLOSE: #670, #677
### Issue #659 (3 PRs)
- KEEP: #679 — feat: PR triage automation with auto-merge (closes #659)
- CLOSE: #665, #678
### Issue #645 (2 PRs)
- KEEP: #693 — data: 100 Hip-Hop scene description sets #645
- CLOSE: #688
### Issue #650 (2 PRs)
- KEEP: #676 — fix: pipeline_state.json daily reset
- CLOSE: #651
### Issue #652 (2 PRs)
- KEEP: #673 — feat: adversary execution harness for prompt corpora (#652)
- CLOSE: #654
### Issue #655 (2 PRs)
- KEEP: #672 — fix: implementation for #655
- CLOSE: #657
### Issue #646 (2 PRs)
- KEEP: #666 — fix(#646): normalize_training_examples preserves optional metadata
- CLOSE: #649
### Issue #622 (2 PRs)
- KEEP: #664 — fix: token-tracker: integrate with orchestrator
- CLOSE: #633
## Unassigned PRs: 38
38 of the 50 open PRs have no assignee. Recommend batch assignment to available reviewers.
## Recommendations
1. Close 26 duplicate PRs (keep newest for each issue)
2. Assign reviewers to all PRs
3. Add duplicate-PR prevention check to CI
4. Run this tool weekly to maintain backlog health
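Recommendation 4 can be wired to a scheduler; a hypothetical crontab entry (schedule, working directory, and log path are all assumptions):

```shell
# Hypothetical weekly schedule: Mondays 03:00 UTC, dry-run report only.
0 3 * * 1 cd /srv/timmy-config && python3 scripts/pr_triage.py --repo Timmy_Foundation/timmy-config >> /var/log/pr_triage.log 2>&1
```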

View File

@@ -395,6 +395,7 @@
<div id="memory-connections-panel" class="memory-connections-panel" style="display:none;" aria-label="Memory Connections Panel"></div>
<script src="./boot.js"></script>
<script src="./js/nostr-event-visualizer.js"></script>
<script src="./avatar-customization.js"></script>
<script src="./lod-system.js"></script>
<script>

View File

@@ -0,0 +1,456 @@
/**
* Nostr Event Stream Visualization
* Issue #874: [NEXUS] Implement Nostr Event Stream Visualization
*
* Visualize incoming Nostr events as data streams or particles flowing through
* the Nexus, representing the agent's connection to the wider mesh.
*/
class NostrEventVisualizer {
constructor(options = {}) {
this.relayUrl = options.relayUrl || 'wss://relay.nostr.info';
this.maxEvents = options.maxEvents || 100;
this.particleCount = options.particleCount || 50;
this.streamSpeed = options.streamSpeed || 1.0;
this.particleSize = options.particleSize || 0.5;
this.ws = null;
this.events = [];
this.particles = [];
this.scene = null;
this.camera = null;
this.renderer = null;
this.isConnected = false;
this.reconnectAttempts = 0;
this.maxReconnectAttempts = 5;
// Callbacks
this.onEvent = options.onEvent || (() => {});
this.onConnect = options.onConnect || (() => {});
this.onDisconnect = options.onDisconnect || (() => {});
this.onError = options.onError || console.error;
// Event types to visualize
this.eventTypes = options.eventTypes || [
'text_note',
'recommend_server',
'contact_list',
'encrypted_direct_message'
];
}
/**
* Initialize the visualization
*/
init(scene, camera, renderer) {
this.scene = scene;
this.camera = camera;
this.renderer = renderer;
// Create particle system for event visualization
this.createParticleSystem();
console.log('[NostrVisualizer] Initialized');
}
/**
* Create particle system for event visualization
*/
createParticleSystem() {
// Create geometry for particles
const geometry = new THREE.BufferGeometry();
const positions = new Float32Array(this.particleCount * 3);
const colors = new Float32Array(this.particleCount * 3);
const sizes = new Float32Array(this.particleCount);
// Initialize particles
for (let i = 0; i < this.particleCount; i++) {
// Random position in a sphere
const theta = Math.random() * Math.PI * 2;
const phi = Math.acos(2 * Math.random() - 1);
const r = 50 + Math.random() * 50;
positions[i * 3] = r * Math.sin(phi) * Math.cos(theta);
positions[i * 3 + 1] = r * Math.sin(phi) * Math.sin(theta);
positions[i * 3 + 2] = r * Math.cos(phi);
// Color based on event type
colors[i * 3] = 0.3; // R
colors[i * 3 + 1] = 0.8; // G
colors[i * 3 + 2] = 1.0; // B
sizes[i] = this.particleSize;
// Store particle data
this.particles.push({
index: i,
x: positions[i * 3],
y: positions[i * 3 + 1],
z: positions[i * 3 + 2],
vx: (Math.random() - 0.5) * 0.1,
vy: (Math.random() - 0.5) * 0.1,
vz: (Math.random() - 0.5) * 0.1,
color: { r: 0.3, g: 0.8, b: 1.0 },
size: this.particleSize,
event: null
});
}
geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
geometry.setAttribute('color', new THREE.BufferAttribute(colors, 3));
geometry.setAttribute('size', new THREE.BufferAttribute(sizes, 1));
// Create material
const material = new THREE.PointsMaterial({
size: this.particleSize,
vertexColors: true,
transparent: true,
opacity: 0.8,
blending: THREE.AdditiveBlending
});
// Create points
this.particleSystem = new THREE.Points(geometry, material);
this.scene.add(this.particleSystem);
console.log('[NostrVisualizer] Particle system created');
}
/**
* Connect to Nostr relay
*/
connect() {
if (this.isConnected) {
console.warn('[NostrVisualizer] Already connected');
return;
}
console.log(`[NostrVisualizer] Connecting to ${this.relayUrl}...`);
try {
this.ws = new WebSocket(this.relayUrl);
this.ws.onopen = () => {
console.log('[NostrVisualizer] Connected to Nostr relay');
this.isConnected = true;
this.reconnectAttempts = 0;
// Subscribe to events
this.subscribe();
// Call connect callback
this.onConnect();
};
this.ws.onmessage = (event) => {
try {
const data = JSON.parse(event.data);
this.handleEvent(data);
} catch (error) {
console.error('[NostrVisualizer] Failed to parse event:', error);
}
};
this.ws.onclose = () => {
console.log('[NostrVisualizer] Disconnected from Nostr relay');
this.isConnected = false;
// Call disconnect callback
this.onDisconnect();
// Attempt reconnect
this.scheduleReconnect();
};
this.ws.onerror = (error) => {
console.error('[NostrVisualizer] WebSocket error:', error);
this.onError(error);
};
} catch (error) {
console.error('[NostrVisualizer] Failed to connect:', error);
this.onError(error);
}
}
/**
* Subscribe to Nostr events
*/
subscribe() {
if (!this.isConnected || !this.ws) {
console.warn('[NostrVisualizer] Not connected');
return;
}
// Create subscription for recent events
// (NIP-01 frame shape: ["REQ", <subscription_id>, <filter>, ...])
const subscription = [
"REQ",
"nexus-stream",
{
"kinds": [1, 2, 3, 4], // text_note, recommend_server, contact_list, encrypted_direct_message
"limit": 50
}
];
this.ws.send(JSON.stringify(subscription));
console.log('[NostrVisualizer] Subscribed to Nostr events');
}
/**
* Handle incoming Nostr event
*/
handleEvent(data) {
// Only handle EVENT frames addressed to our subscription
if (data[0] === 'EVENT' && data[1] === 'nexus-stream') {
const event = data[2];
// Check if event type should be visualized
if (this.eventTypes.includes(this.getEventType(event.kind))) {
this.visualizeEvent(event);
this.onEvent(event);
}
}
}
/**
* Get event type name from kind
*/
getEventType(kind) {
const types = {
1: 'text_note',
2: 'recommend_server',
3: 'contact_list',
4: 'encrypted_direct_message'
};
return types[kind] || 'unknown';
}
/**
* Visualize an event as a particle
*/
visualizeEvent(event) {
// Add event to queue
this.events.push({
event: event,
timestamp: Date.now(),
visualized: false
});
// Limit queue size
if (this.events.length > this.maxEvents) {
this.events.shift();
}
// Update particle for this event
this.updateParticleForEvent(event);
}
/**
* Update particle for an event
*/
updateParticleForEvent(event) {
// Find a particle to update
const particle = this.particles.find(p => !p.event);
if (!particle) {
// All particles are in use; recycle the one holding the oldest event
const oldest = this.particles.reduce((a, b) =>
(a.event.created_at <= b.event.created_at) ? a : b
);
this.resetParticle(oldest);
this.updateParticleWithEvent(oldest, event);
} else {
this.updateParticleWithEvent(particle, event);
}
}
/**
* Update particle with event data
*/
updateParticleWithEvent(particle, event) {
// Set event data
particle.event = event;
// Set color based on event type
const colors = {
'text_note': { r: 0.3, g: 0.8, b: 1.0 }, // Blue
'recommend_server': { r: 1.0, g: 0.8, b: 0.3 }, // Gold
'contact_list': { r: 0.3, g: 1.0, b: 0.8 }, // Cyan
'encrypted_direct_message': { r: 1.0, g: 0.3, b: 0.8 } // Pink
};
const eventType = this.getEventType(event.kind);
particle.color = colors[eventType] || { r: 0.5, g: 0.5, b: 0.5 };
// Update geometry
this.updateParticleGeometry(particle);
console.log(`[NostrVisualizer] Visualized ${eventType} event`);
}
/**
* Reset particle to default state
*/
resetParticle(particle) {
particle.event = null;
particle.color = { r: 0.3, g: 0.8, b: 1.0 };
particle.size = this.particleSize;
// Random position
const theta = Math.random() * Math.PI * 2;
const phi = Math.acos(2 * Math.random() - 1);
const r = 50 + Math.random() * 50;
particle.x = r * Math.sin(phi) * Math.cos(theta);
particle.y = r * Math.sin(phi) * Math.sin(theta);
particle.z = r * Math.cos(phi);
this.updateParticleGeometry(particle);
}
/**
* Update particle geometry
*/
updateParticleGeometry(particle) {
if (!this.particleSystem) return;
const geometry = this.particleSystem.geometry;
const positions = geometry.attributes.position.array;
const colors = geometry.attributes.color.array;
const sizes = geometry.attributes.size.array;
// Update position
positions[particle.index * 3] = particle.x;
positions[particle.index * 3 + 1] = particle.y;
positions[particle.index * 3 + 2] = particle.z;
// Update color
colors[particle.index * 3] = particle.color.r;
colors[particle.index * 3 + 1] = particle.color.g;
colors[particle.index * 3 + 2] = particle.color.b;
// Update size
sizes[particle.index] = particle.size;
// Mark attributes as needing update
geometry.attributes.position.needsUpdate = true;
geometry.attributes.color.needsUpdate = true;
geometry.attributes.size.needsUpdate = true;
}
/**
* Update visualization
*/
update(deltaTime) {
if (!this.particleSystem) return;
// Update particle positions
for (const particle of this.particles) {
// Move particle
particle.x += particle.vx * this.streamSpeed * deltaTime;
particle.y += particle.vy * this.streamSpeed * deltaTime;
particle.z += particle.vz * this.streamSpeed * deltaTime;
// Add some turbulence
particle.vx += (Math.random() - 0.5) * 0.01;
particle.vy += (Math.random() - 0.5) * 0.01;
particle.vz += (Math.random() - 0.5) * 0.01;
// Limit velocity
const maxVel = 0.5;
particle.vx = Math.max(-maxVel, Math.min(maxVel, particle.vx));
particle.vy = Math.max(-maxVel, Math.min(maxVel, particle.vy));
particle.vz = Math.max(-maxVel, Math.min(maxVel, particle.vz));
// Keep particles in bounds
const maxDist = 100;
if (Math.abs(particle.x) > maxDist) particle.vx *= -0.5;
if (Math.abs(particle.y) > maxDist) particle.vy *= -0.5;
if (Math.abs(particle.z) > maxDist) particle.vz *= -0.5;
// Update geometry
this.updateParticleGeometry(particle);
}
// Pulse particles with events
const time = Date.now() * 0.001;
for (const particle of this.particles) {
if (particle.event) {
// Pulse size for particles with events
particle.size = this.particleSize * (1 + 0.2 * Math.sin(time * 3 + particle.index));
this.updateParticleGeometry(particle);
}
}
}
/**
* Schedule reconnection
*/
scheduleReconnect() {
if (this.reconnectAttempts >= this.maxReconnectAttempts) {
console.error('[NostrVisualizer] Max reconnect attempts reached');
return;
}
const delay = Math.min(1000 * Math.pow(2, this.reconnectAttempts), 30000);
console.log(`[NostrVisualizer] Reconnecting in ${delay / 1000}s...`);
setTimeout(() => {
this.reconnectAttempts++;
this.connect();
}, delay);
}
/**
* Disconnect from Nostr relay
*/
disconnect() {
console.log('[NostrVisualizer] Disconnecting...');
if (this.ws) {
this.ws.close();
this.ws = null;
}
this.isConnected = false;
// Clear particles
for (const particle of this.particles) {
this.resetParticle(particle);
}
console.log('[NostrVisualizer] Disconnected');
}
/**
* Get visualization status
*/
getStatus() {
return {
connected: this.isConnected,
relayUrl: this.relayUrl,
eventCount: this.events.length,
particleCount: this.particles.length,
activeParticles: this.particles.filter(p => p.event).length,
reconnectAttempts: this.reconnectAttempts
};
}
}
// Export for use in other modules
if (typeof module !== 'undefined' && module.exports) {
module.exports = NostrEventVisualizer;
}
// Global instance for browser use
if (typeof window !== 'undefined') {
window.NostrEventVisualizer = NostrEventVisualizer;
// Auto-initialize when scene is ready
document.addEventListener('DOMContentLoaded', () => {
// This would be called when Three.js scene is initialized
// window.nostrVisualizer = new NostrEventVisualizer();
// window.nostrVisualizer.init(scene, camera, renderer);
});
}

View File

@@ -1,135 +0,0 @@
# Timmy-config PR Backlog Audit — the-nexus #1471
Generated: 2026-04-16T01:44:07Z
Source issue: `process: Address timmy-config PR backlog (9 PRs - highest in org)`
## Source Snapshot
Issue #1471 claims timmy-config had 9 open PRs and the highest PR backlog in the org during the original triage snapshot.
This audit re-queries the live PR backlog and classifies it against current forge state instead of trusting that stale count.
## Live Summary
- Open PRs on `Timmy_Foundation/timmy-config`: 50
- Mergeable right now: 28
- PRs with no reviewers or requested reviewers: 18
- Stale PRs older than 7 days: 0
- Duplicate issue groups detected: 2
## Issue Body Drift
The body of #1471 is materially stale: it references a 9-PR backlog, while the live audit found 50 open PRs, far above that historical snapshot.
This means the issue should be treated as a process/report problem, not as a direct live-merge instruction.
## Duplicate Issue Groups
| Issue refs | PRs |
|---|---|
| #598 | #766 (fix/598); #765 (fix/598-crisis-manipulation) |
| #752 | #767 (feat/752-provenance-tracking); #760 (fix/752-provenance-integration) |
## Reviewer Coverage
| PR | Title | Updated |
|---|---|---|
| #780 | fix: add python3 shebang to bin/glitch_patterns.py (#681) | 2026-04-16 |
| #779 | feat: 500 indirect crisis signal training pairs (#597) | 2026-04-16 |
| #778 | feat: authority bypass jailbreak corpus — 200 prompts (#619) | 2026-04-16 |
| #777 | feat: token budget tracker with real-time dashboard (#622) | 2026-04-16 |
| #776 | feat: config drift detection across fleet nodes (#686) | 2026-04-16 |
| #775 | feat: PR triage automation script (#659) | 2026-04-16 |
| #774 | feat: 100 R&B/Soul lyrics→visual scene sets (#613) | 2026-04-16 |
| #773 | feat: bounded hash dedup with daily rotation (#628) | 2026-04-16 |
| #772 | feat: Cron job audit script (#662) | 2026-04-16 |
| #771 | feat: Quality gate integration with pipeline orchestrator (#627) | 2026-04-16 |
| #770 | fix: #660 - Makefile python3 portability | 2026-04-16 |
| #769 | feat: quality gate test suite — 27 tests (#629) | 2026-04-15 |
| #768 | feat: integrate token tracking with orchestrator (#634) | 2026-04-15 |
| #767 | feat: integrate provenance tracking with training pipelines | 2026-04-15 |
| #766 | feat: crisis response — manipulation & edge cases 500 pairs (#598) | 2026-04-15 |
| #765 | feat: 500 crisis manipulation & edge case training pairs (#598) | 2026-04-15 |
| #764 | fix: #646 | 2026-04-15 |
| #763 | feat: PR backlog triage script + 9 duplicate PRs closed (#658) | 2026-04-15 |
## Mergeable Snapshot
| PR | Title | Head branch |
|---|---|---|
| #780 | fix: add python3 shebang to bin/glitch_patterns.py (#681) | `fix/681-shebangs` |
| #779 | feat: 500 indirect crisis signal training pairs (#597) | `fix/597-indirect-crisis` |
| #778 | feat: authority bypass jailbreak corpus — 200 prompts (#619) | `fix/619-auth-bypass-v2` |
| #777 | feat: token budget tracker with real-time dashboard (#622) | `fix/622-token-tracker` |
| #776 | feat: config drift detection across fleet nodes (#686) | `fix/686-config-drift` |
| #775 | feat: PR triage automation script (#659) | `fix/659` |
| #774 | feat: 100 R&B/Soul lyrics→visual scene sets (#613) | `fix/613` |
| #773 | feat: bounded hash dedup with daily rotation (#628) | `fix/628-hash-rotation` |
| #772 | feat: Cron job audit script (#662) | `fix/662` |
| #771 | feat: Quality gate integration with pipeline orchestrator (#627) | `fix/627` |
| #770 | fix: #660 - Makefile python3 portability | `fix/660` |
| #769 | feat: quality gate test suite — 27 tests (#629) | `fix/629-quality-gate-tests` |
| #768 | feat: integrate token tracking with orchestrator (#634) | `fix/634` |
| #767 | feat: integrate provenance tracking with training pipelines | `feat/752-provenance-tracking` |
| #766 | feat: crisis response — manipulation & edge cases 500 pairs (#598) | `fix/598` |
| #765 | feat: 500 crisis manipulation & edge case training pairs (#598) | `fix/598-crisis-manipulation` |
| #764 | fix: #646 | `fix/646` |
| #763 | feat: PR backlog triage script + 9 duplicate PRs closed (#658) | `fix/658` |
| #762 | feat: 500 music mood prompt enhancement pairs (#601) | `fix/601` |
| #761 | fix: normalize code block indentation in training data (#750) | `fix/750` |
| ... | ... | +8 more mergeable PRs |
## Stale PRs
No stale PRs older than 7 days were detected in the live snapshot.
## Recommended Next Actions
1. Use the duplicate-issue groups to collapse obviously redundant PRs before attempting any merge sweep.
2. Assign reviewers (or request them) on the PRs with zero reviewer coverage so the backlog becomes reviewable instead of merely mergeable.
3. Prioritize mergeable PRs with unique issue refs and recent updates for the next burndown pass.
4. Treat this report as the live reference for #1471; the original issue body is now a stale ops snapshot.
## Raw Backlog Snapshot
| PR | Mergeable | Review signals | Issue refs |
|---|---|---|---|
| #780 | True | 0 | #681 |
| #779 | True | 0 | #597 |
| #778 | True | 0 | #619 |
| #777 | True | 0 | #622 |
| #776 | True | 0 | #686 |
| #775 | True | 0 | #659 |
| #774 | True | 0 | #613 |
| #773 | True | 0 | #628 |
| #772 | True | 0 | #662 |
| #771 | True | 0 | #627 |
| #770 | True | 0 | #660 |
| #769 | True | 0 | #629 |
| #768 | True | 0 | #634 |
| #767 | True | 0 | #752 |
| #766 | True | 0 | #598 |
| #765 | True | 0 | #598 |
| #764 | True | 0 | #646, #598 |
| #763 | True | 0 | #658, #757, #761, #750, #749, #687, #739, #737, #751, #691, #733, #740, #655, #736, #621, #716, #720, #690, #710, #708, #714, #602 |
| #762 | True | 1 | #601 |
| #761 | True | 1 | #750 |
| #760 | True | 1 | #752 |
| #759 | True | 1 | #603 |
| #758 | True | 1 | #799, #949 |
| #756 | True | 1 | #604 |
| #755 | True | 1 | #605 |
| #754 | True | 1 | #13, #8 |
| #753 | True | 2 | #606 |
| #751 | True | 2 | #691 |
| #748 | False | 2 | #607 |
| #747 | False | 2 | #1776268452231 |
| #746 | False | 1 | #609 |
| #745 | False | 1 | #610 |
| #744 | False | 1 | #611 |
| #743 | False | 2 | #696 |
| #742 | False | 1 | #612 |
| #741 | False | 1 | #615 |
| #740 | False | 1 | #618, #652, #655 |
| #738 | False | 1 | #696, #721 |
| #736 | False | 1 | #621 |
| #735 | False | 3 | #623 |
| ... | ... | ... | +10 more PRs |

View File

@@ -1,144 +0,0 @@
#!/usr/bin/env python3
"""
pr_triage.py — Triage PR backlog for timmy-config.
Identifies duplicate PRs for the same issue, unassigned PRs,
and recommends which to close/merge.
Usage:
python3 scripts/pr_triage.py --repo Timmy_Foundation/timmy-config
python3 scripts/pr_triage.py --repo Timmy_Foundation/timmy-config --close-duplicates --dry-run
"""
import argparse
import json
import os
import re
import sys
import urllib.request
from collections import defaultdict
from datetime import datetime, timezone
from pathlib import Path
GITEA_URL = "https://forge.alexanderwhitestone.com"
def get_token():
return (Path.home() / ".config" / "gitea" / "token").read_text().strip()
def fetch_open_prs(repo, headers):
all_prs = []
page = 1
while True:
url = f"{GITEA_URL}/api/v1/repos/{repo}/pulls?state=open&limit=100&page={page}"
req = urllib.request.Request(url, headers=headers)
resp = urllib.request.urlopen(req, timeout=15)
data = json.loads(resp.read())
if not data:
break
all_prs.extend(data)
if len(data) < 100:
break
page += 1
return all_prs
def find_duplicate_groups(prs):
issue_prs = defaultdict(list)
for pr in prs:
text = (pr.get("body") or "") + " " + (pr.get("title") or "")
issues = set(re.findall(r"#(\d+)", text))
for iss in issues:
issue_prs[iss].append(pr)
return {k: v for k, v in issue_prs.items() if len(v) > 1}
def generate_report(repo, prs):
now = datetime.now(timezone.utc)
lines = [f"# PR Triage Report — {repo}",
f"\nGenerated: {now.strftime('%Y-%m-%d %H:%M UTC')}",
f"Total open PRs: {len(prs)}", ""]
duplicates = find_duplicate_groups(prs)
unassigned = [p for p in prs if not p.get("assignee")]
lines.append("## Duplicate PR Groups")
if duplicates:
total_dupes = sum(len(v) - 1 for v in duplicates.values())
lines.append(f"**{len(duplicates)} issues with duplicate PRs ({total_dupes} excess PRs)**")
for issue, pr_group in sorted(duplicates.items(), key=lambda x: -len(x[1])):
keep = max(pr_group, key=lambda p: p["number"])
close = [p for p in pr_group if p["number"] != keep["number"]]
lines.append(f"\n### Issue #{issue} ({len(pr_group)} PRs)")
lines.append(f"- **KEEP:** #{keep['number']} — {keep['title'][:60]}")
for p in close:
lines.append(f"- CLOSE: #{p['number']} — {p['title'][:60]}")
else:
lines.append("No duplicate PR groups found.")
lines.append("")
lines.append(f"## Unassigned PRs: {len(unassigned)}")
for p in unassigned[:10]:
lines.append(f"- #{p['number']}: {p['title'][:70]}")
if len(unassigned) > 10:
lines.append(f"- ... and {len(unassigned) - 10} more")
lines.append("")
lines.append("## Recommendations")
excess = sum(len(v) - 1 for v in duplicates.values())
lines.append(f"1. Close {excess} duplicate PRs (keep newest for each issue)")
lines.append(f"2. Assign reviewers to {len(unassigned)} unassigned PRs")
lines.append(f"3. Consider adding duplicate-PR prevention to CI")
return "\n".join(lines)
def close_duplicate_prs(repo, prs, headers, dry_run=True):
duplicates = find_duplicate_groups(prs)
closed = 0
for issue, pr_group in duplicates.items():
keep = max(pr_group, key=lambda p: p["number"])
for pr in pr_group:
if pr["number"] == keep["number"]:
continue
if dry_run:
print(f"Would close PR #{pr['number']}: {pr['title'][:60]}")
else:
url = f"{GITEA_URL}/api/v1/repos/{repo}/pulls/{pr['number']}"
data = json.dumps({"state": "closed"}).encode()
req = urllib.request.Request(url, data=data, headers={**headers, "Content-Type": "application/json"}, method="PATCH")
try:
urllib.request.urlopen(req)
print(f"Closed PR #{pr['number']}")
closed += 1
except Exception as e:
print(f"Failed to close #{pr['number']}: {e}")
return closed
def main():
parser = argparse.ArgumentParser()
parser.add_argument("--repo", default="Timmy_Foundation/timmy-config")
parser.add_argument("--close-duplicates", action="store_true")
parser.add_argument("--dry-run", action="store_true")
args = parser.parse_args()
token = get_token()
headers = {"Authorization": f"token {token}"}
prs = fetch_open_prs(args.repo, headers)
if args.close_duplicates:
closed = close_duplicate_prs(args.repo, prs, headers, args.dry_run)
print(f"\n{'Would close' if args.dry_run else 'Closed'} {closed} duplicate PRs")
else:
report = generate_report(args.repo, prs)
print(report)
docs_dir = Path(__file__).resolve().parent.parent / "docs"
docs_dir.mkdir(exist_ok=True)
(docs_dir / "pr-triage-report.md").write_text(report)
if __name__ == "__main__":
main()

View File

@@ -1,257 +0,0 @@
#!/usr/bin/env python3
from __future__ import annotations
import argparse
import json
import os
import re
from datetime import datetime, timedelta, timezone
from pathlib import Path
from typing import Any
from urllib.error import HTTPError
from urllib.request import Request, urlopen
API_BASE = "https://forge.alexanderwhitestone.com/api/v1"
ORG = "Timmy_Foundation"
SOURCE_REPO = "the-nexus"
TARGET_REPO = "timmy-config"
DEFAULT_TOKEN_PATH = os.path.expanduser("~/.config/gitea/token")
DEFAULT_OUTPUT = "reports/2026-04-16-timmy-config-pr-backlog-audit.md"
def api_get(path: str, token: str) -> Any:
req = Request(API_BASE + path, headers={"Authorization": f"token {token}"})
with urlopen(req, timeout=30) as resp:
return json.loads(resp.read().decode())
def extract_issue_refs(title: str = "", body: str = "", head: str = "") -> list[int]:
text = " ".join(filter(None, [title, body, head]))
refs: list[int] = []
seen: set[int] = set()
for match in re.finditer(r"#(\d+)", text):
value = int(match.group(1))
if value not in seen:
seen.add(value)
refs.append(value)
if not refs and head:
for match in re.finditer(r"(?:^|[/-])(\d+)(?:$|[/-])", head):
value = int(match.group(1))
if value not in seen:
seen.add(value)
refs.append(value)
return refs
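The branch-name fallback above only fires when no `#123` refs appear in the title or body; digits delimited by `/` or `-` in the head branch are then treated as issue numbers. A minimal standalone sketch of just that fallback (illustrative names, not the production function):

```python
import re

# Sketch of the head-branch fallback: pull out digit runs that are bounded
# by "/", "-", or the string edges, preserving first-seen order and
# de-duplicating. "fix/598-crisis-manipulation" therefore yields [598].
def branch_issue_refs(head: str) -> list[int]:
    refs: list[int] = []
    seen: set[int] = set()
    for match in re.finditer(r"(?:^|[/-])(\d+)(?:$|[/-])", head):
        value = int(match.group(1))
        if value not in seen:
            seen.add(value)
            refs.append(value)
    return refs

print(branch_issue_refs("fix/598-crisis-manipulation"))  # [598]
```

Note the delimiter requirement keeps version-like digits embedded in words (e.g. `py3` in `fix-py3compat`) from being misread as issue numbers.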
def summarize_backlog(backlog: list[dict[str, Any]], now_iso: str | None = None, stale_days: int = 7) -> dict[str, Any]:
now = _parse_iso(now_iso) if now_iso else datetime.now(timezone.utc)
duplicate_groups: dict[tuple[int, ...], list[dict[str, Any]]] = {}
missing_reviewer = []
stale = []
mergeable = []
for pr in backlog:
refs_list = pr.get("issue_refs") or extract_issue_refs(
pr.get("title") or "",
pr.get("body") or "",
pr.get("head") or "",
)
if not pr.get("issue_refs"):
pr["issue_refs"] = refs_list
refs = tuple(refs_list)
if refs:
duplicate_groups.setdefault(refs, []).append(pr)
if pr.get("review_count", 0) + pr.get("requested_reviewers", 0) == 0:
missing_reviewer.append(pr)
updated_at = _parse_iso(pr["updated_at"])
if now - updated_at > timedelta(days=stale_days):
stale.append(pr)
if pr.get("mergeable"):
mergeable.append(pr)
dupes = [
{"issue_refs": list(refs), "prs": prs}
for refs, prs in duplicate_groups.items()
if len(prs) > 1
]
dupes.sort(key=lambda item: (item["issue_refs"][0] if item["issue_refs"] else 10**9))
return {
"total_open_prs": len(backlog),
"mergeable_count": len(mergeable),
"missing_reviewer_count": len(missing_reviewer),
"stale_count": len(stale),
"duplicate_issue_groups": dupes,
"mergeable_prs": mergeable,
"missing_reviewer_prs": missing_reviewer,
"stale_prs": stale,
}
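The duplicate detection inside `summarize_backlog` keys each PR by the tuple of its issue refs and reports any key shared by more than one PR. A reduced sketch of that grouping step, under the assumption that `issue_refs` is already populated:

```python
# Sketch of the duplicate-group step: bucket PR numbers by their issue-ref
# tuple, then keep only buckets with 2+ entries (i.e. competing PRs for the
# same issue set).
def duplicate_groups(backlog: list[dict]) -> dict[tuple[int, ...], list[int]]:
    groups: dict[tuple[int, ...], list[int]] = {}
    for pr in backlog:
        refs = tuple(pr["issue_refs"])
        if refs:
            groups.setdefault(refs, []).append(pr["number"])
    return {refs: nums for refs, nums in groups.items() if len(nums) > 1}

print(duplicate_groups([
    {"number": 765, "issue_refs": [598]},
    {"number": 766, "issue_refs": [598]},
    {"number": 777, "issue_refs": [622]},
]))  # {(598,): [765, 766]}
```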
def render_report(*, source_issue: int, source_title: str, summary: dict[str, Any], backlog: list[dict[str, Any]], generated_at: str) -> str:
lines = [
f"# Timmy-config PR Backlog Audit — the-nexus #{source_issue}",
"",
f"Generated: {generated_at}",
f"Source issue: `{source_title}`",
"",
"## Source Snapshot",
"",
"Issue #1471 claims timmy-config had 9 open PRs and the highest PR backlog in the org during the original triage snapshot.",
"This audit re-queries the live PR backlog and classifies it against current forge state instead of trusting that stale count.",
"",
"## Live Summary",
"",
f"- Open PRs on `{ORG}/{TARGET_REPO}`: {summary['total_open_prs']}",
f"- Mergeable right now: {summary['mergeable_count']}",
f"- PRs with no reviewers or requested reviewers: {summary['missing_reviewer_count']}",
f"- Stale PRs older than 7 days: {summary['stale_count']}",
f"- Duplicate issue groups detected: {len(summary['duplicate_issue_groups'])}",
"",
"## Issue Body Drift",
"",
"The body of #1471 is materially stale: it references a 9-PR backlog, while the live audit found the current open-PR count above that historical snapshot.",
"This means the issue should be treated as a process/report problem, not as a direct live-merge instruction.",
"",
"## Duplicate Issue Groups",
"",
]
if summary["duplicate_issue_groups"]:
lines.extend(["| Issue refs | PRs |", "|---|---|"])
for group in summary["duplicate_issue_groups"]:
refs = ", ".join(f"#{n}" for n in group["issue_refs"]) or "(none)"
prs = "; ".join(f"#{pr['number']} ({pr['head']})" for pr in group["prs"])
lines.append(f"| {refs} | {prs} |")
else:
lines.append("No duplicate issue groups detected in the live backlog.")
lines.extend([
"",
"## Reviewer Coverage",
"",
])
if summary["missing_reviewer_prs"]:
lines.extend(["| PR | Title | Updated |", "|---|---|---|"])
for pr in summary["missing_reviewer_prs"][:20]:
lines.append(f"| #{pr['number']} | {pr['title']} | {pr['updated_at'][:10]} |")
if len(summary["missing_reviewer_prs"]) > 20:
lines.append(f"| ... | ... | +{len(summary['missing_reviewer_prs']) - 20} more |")
else:
lines.append("All open PRs currently show reviewer coverage signals.")
lines.extend([
"",
"## Mergeable Snapshot",
"",
])
if summary["mergeable_prs"]:
lines.extend(["| PR | Title | Head branch |", "|---|---|---|"])
for pr in summary["mergeable_prs"][:20]:
lines.append(f"| #{pr['number']} | {pr['title']} | `{pr['head']}` |")
if len(summary["mergeable_prs"]) > 20:
lines.append(f"| ... | ... | +{len(summary['mergeable_prs']) - 20} more mergeable PRs |")
else:
lines.append("No mergeable PRs reported in the live backlog snapshot.")
lines.extend([
"",
"## Stale PRs",
"",
])
if summary["stale_prs"]:
lines.extend(["| PR | Title | Updated |", "|---|---|---|"])
for pr in summary["stale_prs"]:
lines.append(f"| #{pr['number']} | {pr['title']} | {pr['updated_at'][:10]} |")
else:
lines.append("No stale PRs older than 7 days were detected in the live snapshot.")
lines.extend([
"",
"## Recommended Next Actions",
"",
"1. Use the duplicate-issue groups to collapse obviously redundant PRs before attempting any merge sweep.",
"2. Assign reviewers (or request them) on the PRs with zero reviewer coverage so the backlog becomes reviewable instead of merely mergeable.",
"3. Prioritize mergeable PRs with unique issue refs and recent updates for the next burndown pass.",
"4. Treat this report as the live reference for #1471; the original issue body is now a stale ops snapshot.",
"",
"## Raw Backlog Snapshot",
"",
"| PR | Mergeable | Review signals | Issue refs |",
"|---|---|---|---|",
])
for pr in backlog[:40]:
refs = ", ".join(f"#{n}" for n in pr.get("issue_refs", [])) or "(none)"
review_signals = pr.get("review_count", 0) + pr.get("requested_reviewers", 0)
lines.append(f"| #{pr['number']} | {pr['mergeable']} | {review_signals} | {refs} |")
if len(backlog) > 40:
lines.append(f"| ... | ... | ... | +{len(backlog) - 40} more PRs |")
return "\n".join(lines) + "\n"
def collect_backlog(repo: str, token: str) -> list[dict[str, Any]]:
prs: list[dict[str, Any]] = []
for page in range(1, 6):
batch = api_get(f"/repos/{ORG}/{repo}/pulls?state=open&limit=100&page={page}", token)
if not batch:
break
for pr in batch:
number = pr["number"]
reviews = _safe_api_get(f"/repos/{ORG}/{repo}/pulls/{number}/reviews", token) or []
requested = _safe_api_get(f"/repos/{ORG}/{repo}/pulls/{number}/requested_reviewers", token) or {}
prs.append({
"number": number,
"title": pr.get("title") or "",
"body": pr.get("body") or "",
"head": (pr.get("head") or {}).get("ref") or "",
"mergeable": bool(pr.get("mergeable")),
"updated_at": pr.get("updated_at") or pr.get("created_at") or "1970-01-01T00:00:00Z",
"review_count": len([r for r in reviews if r.get("state")]),
"requested_reviewers": len(requested.get("users", []) or []),
"issue_refs": extract_issue_refs(pr.get("title") or "", pr.get("body") or "", (pr.get("head") or {}).get("ref") or ""),
})
if len(batch) < 100:
break
return prs
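`collect_backlog` pages through the Gitea pulls endpoint 100 at a time and stops on an empty or short page. The pattern in isolation, with a stubbed fetcher standing in for `api_get` (names here are illustrative):

```python
# Sketch of the pagination loop: request pages until one comes back empty
# or shorter than the page size, which signals the last page.
def paginate(fetch_page, page_size: int = 100, max_pages: int = 5) -> list:
    items: list = []
    for page in range(1, max_pages + 1):
        batch = fetch_page(page)
        if not batch:
            break
        items.extend(batch)
        if len(batch) < page_size:
            break  # short page: nothing further to fetch
    return items

# Stub: page 1 is full (100 items), page 2 is a short final page (30 items).
pages = {1: list(range(100)), 2: list(range(30))}
print(len(paginate(lambda p: pages.get(p, []))))  # 130
```

The short-page break saves one trailing request compared to looping until the first empty page.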
def _safe_api_get(path: str, token: str):
try:
return api_get(path, token)
except HTTPError:
return None
def _parse_iso(value: str) -> datetime:
return datetime.fromisoformat(value.replace("Z", "+00:00"))
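The `Z`-to-`+00:00` rewrite in `_parse_iso` exists because `datetime.fromisoformat` only accepts a literal `Z` suffix from Python 3.11 onward; normalizing to an explicit offset keeps the parser working on older interpreters too. A self-contained version:

```python
from datetime import datetime, timezone

# Normalize the trailing "Z" (Zulu/UTC) to an explicit "+00:00" offset so
# fromisoformat() parses it on Python versions before 3.11 as well.
def parse_iso(value: str) -> datetime:
    return datetime.fromisoformat(value.replace("Z", "+00:00"))

ts = parse_iso("2026-04-16T00:00:00Z")
print(ts.tzinfo == timezone.utc)  # True
```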
def main() -> int:
parser = argparse.ArgumentParser(description="Audit the live timmy-config PR backlog for the-nexus issue #1471.")
parser.add_argument("--issue", type=int, default=1471)
parser.add_argument("--source-repo", default=SOURCE_REPO)
parser.add_argument("--target-repo", default=TARGET_REPO)
parser.add_argument("--output", default=DEFAULT_OUTPUT)
parser.add_argument("--token-file", default=DEFAULT_TOKEN_PATH)
args = parser.parse_args()
token = Path(args.token_file).read_text(encoding="utf-8").strip()
issue = api_get(f"/repos/{ORG}/{args.source_repo}/issues/{args.issue}", token)
backlog = collect_backlog(args.target_repo, token)
summary = summarize_backlog(backlog)
generated_at = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
report = render_report(
source_issue=args.issue,
source_title=issue.get("title") or "",
summary=summary,
backlog=backlog,
generated_at=generated_at,
)
out = Path(args.output)
out.parent.mkdir(parents=True, exist_ok=True)
out.write_text(report, encoding="utf-8")
print(out)
return 0
if __name__ == "__main__":
raise SystemExit(main())


@@ -1,77 +0,0 @@
from pathlib import Path
from scripts.timmy_config_pr_backlog_audit import extract_issue_refs, summarize_backlog
def test_extract_issue_refs_from_title_body_and_branch() -> None:
text = "feat: crisis response — manipulation & edge cases 500 pairs (#598)"
body = "Refs #1471 and closes #598"
head = "fix/598-crisis-manipulation"
refs = extract_issue_refs(text, body, head)
assert 598 in refs
assert 1471 in refs
def test_summarize_backlog_finds_duplicates_missing_reviewers_and_stale_prs() -> None:
backlog = [
{
"number": 765,
"title": "feat: crisis response (#598)",
"body": "Closes #598",
"head": "fix/598-crisis-manipulation",
"mergeable": True,
"review_count": 0,
"requested_reviewers": 0,
"updated_at": "2026-04-01T00:00:00Z",
},
{
"number": 766,
"title": "feat: edge cases (#598)",
"body": "Closes #598",
"head": "fix/598",
"mergeable": True,
"review_count": 1,
"requested_reviewers": 0,
"updated_at": "2026-04-15T00:00:00Z",
},
{
"number": 777,
"title": "feat: token budget tracker (#622)",
"body": "Closes #622",
"head": "fix/622-token-tracker",
"mergeable": False,
"review_count": 0,
"requested_reviewers": 0,
"updated_at": "2026-04-15T00:00:00Z",
},
]
summary = summarize_backlog(backlog, now_iso="2026-04-16T00:00:00Z")
assert summary["total_open_prs"] == 3
assert summary["mergeable_count"] == 2
assert summary["missing_reviewer_count"] == 2
assert summary["stale_count"] == 1
assert summary["duplicate_issue_groups"][0]["issue_refs"] == [598]
assert {pr["number"] for pr in summary["duplicate_issue_groups"][0]["prs"]} == {765, 766}
def test_timmy_config_pr_backlog_report_exists_with_required_sections() -> None:
report = Path("reports/2026-04-16-timmy-config-pr-backlog-audit.md")
text = report.read_text(encoding="utf-8")
required = [
"# Timmy-config PR Backlog Audit — the-nexus #1471",
"## Source Snapshot",
"## Live Summary",
"## Issue Body Drift",
"## Duplicate Issue Groups",
"## Reviewer Coverage",
"## Mergeable Snapshot",
"## Stale PRs",
"## Recommended Next Actions",
]
missing = [item for item in required if item not in text]
assert not missing, missing