Compare commits

...

9 Commits

Author SHA1 Message Date
0990e5b807 Add parse/route step types, wider modal, automated link troubleshooter
- Add 'parse' step type: reads KEY=VALUE lines from last command output into execution state
- Add 'route' step type: evaluates JS conditions list, auto-jumps to first match with no user input
- Support condition field on parse steps (skip when conditional execute was also skipped)
- Widen execution detail modal to min(1100px, 96vw) to eliminate horizontal scrolling
- Show last command output in prompt boxes so user has context before clicking
- Add formatLogEntry support for parse_complete and route_taken log actions
- Apply applyParams() substitution to prompt messages ({{server_name}}, {{iface}}, etc.)
- Fix prompt button onclick using data-opt attribute to avoid JSON double-quote breakage
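A minimal sketch of what the 'parse' step described above might do — extract KEY=VALUE lines from the previous command's stdout into a variables object. Function and field names here are illustrative, not the actual PULSE implementation:

```javascript
// Hypothetical sketch of the 'parse' step: read KEY=VALUE lines from the
// last command's output into execution state. Lines that don't match the
// pattern (prose, blank lines) are ignored.
function parseKeyValueOutput(output) {
  const vars = {};
  for (const line of String(output || '').split('\n')) {
    // Only accept identifier-style keys followed by '=' and a value
    const m = line.match(/^([A-Za-z_][A-Za-z0-9_]*)=(.*)$/);
    if (m) vars[m[1]] = m[2].trim();
  }
  return vars;
}
```

Parsed variables would then be available for `{{name}}` substitution in later steps via applyParams().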

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-17 19:29:28 -04:00
2290d52f8b Fix prompt buttons, add command output to prompt steps, add worker to repo
- Fix onclick buttons broken by JSON.stringify double-quotes inside HTML
  attributes — use data-opt attribute + this.dataset.opt instead
- Track last command stdout in execution state when command_result arrives
- executePromptStep: include last command output in log entry and broadcast
  so users can review results alongside the question in the same view
- GET /api/executions/:id: propagate output field to pending prompt response
- Add .prompt-output CSS class for scrollable terminal-style output block
- Fix MariaDB CAST(? AS JSON) → JSON_EXTRACT(?, '$') (MariaDB 10.11 compat)
- Add worker/worker.js to repo (deployed on pulse-worker-01 / LXC 153)
  Fix: worker was not echoing command_id back in the result — resolvers always
  got undefined, so every workflow step timed out and failed

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-17 19:06:02 -04:00
3f6e04d1ab Apply LotusGuild design system convergence (aesthetic_diff.md)
- §5: Section headers now ╠═══ TITLE ═══╣ (was ═══ TITLE ═══)
- §8+§18: Replace inline-style showTerminalNotification() with lt.toast.*
  delegate wrapper; load base.js from /base.js
- §12: Fix --text-muted #008822→#00bb33 (WCAG AA contrast)

base.js symlinked from web_template into public/ so lt.* is available.
showTerminalNotification() is kept as a thin wrapper so all existing
call sites continue to work unchanged.

README: Remove completed pending items (toast, text-muted, position)

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-14 21:40:36 -04:00
18a6b642d1 Fix all-workers-offline silent success bug, noisy logging, UX polish
server.js:
- Fix bug: when all targeted workers disconnect before a step runs, results[] is empty
  and results.every() returns true vacuously, silently reporting success. Now tracks sentCount
  and fails with a 'no_workers' log entry if nothing was actually dispatched
- Remove per-message console.log on every WebSocket message (high noise)
- Only log a warning for failed commands (not every success)
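The vacuous-truth bug is easy to reproduce: `Array.prototype.every()` returns true for an empty array, so a step that dispatched nothing looked successful. A sketch of the fix under assumed names (`sentCount`, `no_workers` match the commit text; the rest is illustrative):

```javascript
// Why the silent-success bug happened, and the shape of the fix:
// [].every(...) is vacuously true, so an empty results[] passed the
// success check. Guard on how many commands were actually sent.
function stepSucceeded(results, sentCount) {
  if (sentCount === 0) {
    return { ok: false, reason: 'no_workers' }; // nothing was dispatched
  }
  return { ok: results.every(r => r.success), reason: null };
}
```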

index.html:
- loadSchedules() catch now shows error message in scheduleList (was silent)
- abortExecution() shows server's error message from JSON body instead of generic string
  (e.g. "Execution is not running" instead of "Failed to abort execution")

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-13 13:48:35 -04:00
8ff3700601 Fix XSS, add per-step timeout/retry to workflow engine
index.html:
- Escape started_by in execution list, execution cards, and execution detail modal
- Escape schedule name in schedules list

server.js:
- Per-step timeout: step.timeout (seconds) overrides global COMMAND_TIMEOUT_MS (5s-600s range)
- Per-step retry: step.retries (max 5) with step.retryDelayMs (max 30s) re-sends command on failure
  with command_retry log entries showing attempt/max_retries progress
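The retry behavior above can be sketched as a simple re-send loop; the caps (5 retries, 30s delay) come from the commit message, everything else — names and defaults — is assumed:

```javascript
// Hedged sketch of per-step retry: re-send up to step.retries times,
// waiting step.retryDelayMs between attempts. Caps mirror the commit
// message (max 5 retries, max 30s delay).
async function runWithRetries(sendCommand, step) {
  const maxRetries = Math.min(step.retries ?? 0, 5);
  const delayMs = Math.min(step.retryDelayMs ?? 1000, 30000);
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    const result = await sendCommand(); // one attempt
    if (result.success) return result;
    if (attempt < maxRetries) {
      await new Promise(res => setTimeout(res, delayMs));
    }
  }
  return { success: false }; // exhausted all attempts
}
```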

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-12 17:36:35 -04:00
ba5ba6f899 Security and reliability fixes: XSS, race conditions, logging
- Fix XSS in workflow list: escape name/description/created_by with escapeHtml()
- Fix broadcast() race: snapshot browserClients Set before iterating
- Fix waitForCommandResult: use position-based matching (worker omits command_id in result)
- Fix scheduler duplicate-run race: atomic UPDATE with affectedRows check
- Suppress toast alerts for automated executions (gandalf:/scheduler: prefix)
- Add waiting_for_input + prompt fields to GET /executions/:id
- Add GET /api/workflows/:id endpoint
- Add interactive prompt UI: respondToPrompt(), WebSocket execution_prompt handler
- Add workflow editor modal (admin-only)
- Remove sensitive console.log calls (WS payload, command output)
- Show error messages in list containers on load failure
- evalCondition: log warning instead of silently swallowing errors
- Worker reconnect: clean up stale entries for same dbWorkerId
- Null-check execState before continuing workflow steps
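The broadcast() race fix above boils down to iterating over a snapshot of the client Set so that removals triggered mid-broadcast (for example, on a send error) operate on the live Set, not the one being walked. A sketch with assumed structure:

```javascript
// Sketch of the broadcast() fix: snapshot browserClients before iterating
// so deleting a dead client during the loop can't interfere with iteration.
function broadcast(browserClients, payload, send) {
  const msg = JSON.stringify(payload);
  for (const client of [...browserClients]) { // snapshot, then iterate
    try {
      send(client, msg);
    } catch {
      browserClients.delete(client); // drop dead clients from the live Set
    }
  }
}
```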

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-12 17:30:32 -04:00
a596c6075c Fix ESC handler, form reset, and scheduler next-run countdown
- Fix ESC key handler to use .modal.show class selector instead of style.display check
- Reset create workflow form fields when opening the create modal
- Show relative countdown (e.g. "in 5m") alongside next run timestamp in scheduler list

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-12 11:26:48 -04:00
95f6554cc2 Frontend improvements: safety, UX, and WebSocket handling
- Guard all new Date().toLocaleString() calls with safeDate() to prevent 'Invalid Date'
- Guard escapeHtml() for null/undefined input
- Guard getTimeAgo() for null/NaN dates; add safeDate() and formatElapsed() helpers
- Show elapsed time for running executions in the execution list
- Add status-running CSS pulse animation class to running execution items
- Add explicit executions_bulk_deleted WebSocket handler
- clearCompletedExecutions() uses new bulk DELETE endpoint instead of N individual requests
- switchTab() persists active tab to localStorage; init restores it on load
- refreshData() updates lastRefreshed timestamp in header
- Add Ctrl+Enter shortcut for quick command form
- Wrap rerunCommand worker_id with escapeHtml() to prevent XSS

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-12 11:24:34 -04:00
ba75a61bd2 Security hardening, bug fixes, and backend improvements
Security:
- validateWebhookUrl() rejects non-http/https and private/internal IPs
- Validate webhook URL on workflow create and update
- Replace error.message in all HTTP 500 responses with 'Internal server error'
- Add requireJSON middleware (HTTP 415 if Content-Type wrong) on POST/PUT routes
- Reject missing API keys in worker heartbeat (not just wrong ones)
- Validate prompt response against allowed options before accepting
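A minimal sketch of what validateWebhookUrl() might check, per the first bullet — scheme allowlist plus a pattern match on literal private/link-local IPv4 ranges. The real implementation likely also resolves hostnames to catch DNS-based SSRF; this sketch only handles literal addresses:

```javascript
// Hedged sketch of validateWebhookUrl(): reject non-http/https schemes
// and literal private/internal IPv4 targets. Hostname resolution is NOT
// performed here, so DNS names pointing at internal IPs would pass.
function validateWebhookUrl(raw) {
  let url;
  try { url = new URL(raw); } catch { return false; } // unparseable
  if (url.protocol !== 'http:' && url.protocol !== 'https:') return false;
  // RFC 1918 ranges, loopback, and link-local, as literal IPs
  const privateIp = /^(127\.|10\.|192\.168\.|169\.254\.|172\.(1[6-9]|2\d|3[01])\.)/;
  if (privateIp.test(url.hostname) || url.hostname === 'localhost') return false;
  return true;
}
```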

Bugs fixed:
- goto infinite loop protection: stepVisits[] counter, fails at GOTO_MAX_VISITS (100)
- wait step: validate duration (no NaN/negative), cap at WAIT_STEP_MAX_MS (24h)
- _executionPrompts now stores {resolve, options} for option validation
- JSON.parse wrapped in try/catch: workflows/:id, executions/:id, internal/executions/:id, POST /api/executions, scheduler worker_ids
- pong handler uses ws.dbWorkerId (set on connect) not message.worker_id
- Worker disconnect now marks worker offline in DB and broadcasts update
- command validation (type + empty check) on POST /api/workers/:id/command
- workflow_id required check on POST /api/executions
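The goto loop guard in the first bullet can be sketched as a per-step visit counter; the limit (100) matches GOTO_MAX_VISITS in the commit, the rest is an assumed shape:

```javascript
// Sketch of the goto infinite-loop guard: count how many times each step
// index is entered; once any step exceeds GOTO_MAX_VISITS, the caller
// should fail the execution instead of looping forever.
const GOTO_MAX_VISITS = 100;
function visitStep(stepVisits, stepIndex) {
  stepVisits[stepIndex] = (stepVisits[stepIndex] || 0) + 1;
  return stepVisits[stepIndex] <= GOTO_MAX_VISITS; // false => abort execution
}
```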

Performance & reliability:
- markStaleWorkersOffline() runs every 60s, marking workers without a recent heartbeat offline
- Named constants: PROMPT_TIMEOUT_MS, COMMAND_TIMEOUT_MS, QUICK_CMD_TIMEOUT_MS,
  WEBHOOK_TIMEOUT_MS, WAIT_STEP_MAX_MS, GOTO_MAX_VISITS, WORKER_STALE_MINUTES

New features:
- GET /api/health (auth required): version, uptime, worker counts
- DELETE /api/executions/completed: bulk delete finished executions (admin)
- schedule_value positive-integer validation for interval/hourly schedule types
- Request logging middleware: [HTTP] METHOD /path STATUS Xms
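The request-logging middleware in the last bullet is a standard Express pattern — hook the response's 'finish' event to capture status code and elapsed time. A sketch matching the `[HTTP] METHOD /path STATUS Xms` format:

```javascript
// Sketch of the [HTTP] request logger: record start time, then log
// method, path, status, and elapsed ms once the response finishes.
function requestLogger(req, res, next) {
  const start = Date.now();
  res.on('finish', () => {
    console.log(`[HTTP] ${req.method} ${req.path} ${res.statusCode} ${Date.now() - start}ms`);
  });
  next();
}
```

Registered before the routes with `app.use(requestLogger)`, it logs every request exactly once, after the status code is known.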

Code quality:
- All console.log on error paths changed to console.error
- Removed stray debug console.log in POST /api/workflows

Co-Authored-By: Claude Sonnet 4.6 <noreply@anthropic.com>
2026-03-12 10:59:07 -04:00
5 changed files with 921 additions and 244 deletions


@@ -4,6 +4,22 @@ A distributed workflow orchestration platform for managing and executing complex
> **Security Notice:** This repository is hosted on Gitea and is version-controlled. **Never commit secrets, credentials, passwords, API keys, or any sensitive information to this repo.** All sensitive configuration belongs exclusively in `.env` files which are listed in `.gitignore` and must never be committed. This includes database passwords, worker API keys, webhook secrets, and internal IP details.
**Design System**: [web_template](https://code.lotusguild.org/LotusGuild/web_template) — shared CSS, JS, and layout patterns for all LotusGuild apps
## Styling & Layout
PULSE uses the **LotusGuild Terminal Design System**. For all styling, component, and layout documentation see:
- [`web_template/README.md`](https://code.lotusguild.org/LotusGuild/web_template/src/branch/main/README.md) — full component reference, CSS variables, JS API
- [`web_template/base.css`](https://code.lotusguild.org/LotusGuild/web_template/src/branch/main/base.css) — unified CSS (`.lt-*` classes)
- [`web_template/base.js`](https://code.lotusguild.org/LotusGuild/web_template/src/branch/main/base.js) — `window.lt` utilities (toast, modal, WebSocket helpers, fetch)
- [`web_template/aesthetic_diff.md`](https://code.lotusguild.org/LotusGuild/web_template/src/branch/main/aesthetic_diff.md) — cross-app divergence analysis and convergence guide
- [`web_template/node/middleware.js`](https://code.lotusguild.org/LotusGuild/web_template/src/branch/main/node/middleware.js) — Express auth, CSRF, CSP nonce middleware
**Pending convergence items (see aesthetic_diff.md):**
- Extract inline `<style>` from `public/index.html` into `public/style.css` and extend `base.css`
- Use `lt.autoRefresh.start(refreshData, 30000)` instead of raw `setInterval`
## Overview
PULSE is a centralized workflow execution system designed to orchestrate operations across distributed infrastructure. It provides a powerful web-based interface with a vintage CRT terminal aesthetic for defining, managing, and executing workflows that can span multiple servers, require human interaction, and perform complex automation tasks at scale.
@@ -307,12 +323,34 @@ MAX_CONCURRENT_TASKS=5 # Max parallel tasks (default: 5)
PULSE uses MariaDB with the following tables:
- **users**: User accounts from Authelia SSO
- **workers**: Worker node registry with metadata
- **workflows**: Workflow definitions (JSON)
- **executions**: Execution history with logs
| Table | Purpose |
|-------|---------|
| `users` | User accounts synced from Authelia SSO |
| `workers` | Worker node registry with connection metadata |
| `workflows` | Workflow definitions stored as JSON |
| `executions` | Execution history with logs, status, and timestamps |
See [Claude.md](Claude.md) for complete schema details.
### `executions` Table Key Columns
| Column | Description |
|--------|-------------|
| `id` | Auto-increment primary key |
| `worker_id` | Foreign key to workers |
| `command` | The command that was executed |
| `status` | `running`, `completed`, `failed` |
| `output` | Command output / log (JSON or text) |
| `created_at` | Execution start timestamp |
| `completed_at` | Execution end timestamp |
### `workers` Table Key Columns
| Column | Description |
|--------|-------------|
| `id` | Auto-increment primary key |
| `name` | Worker name (from `WORKER_NAME` env) |
| `last_seen` | Last heartbeat timestamp |
| `status` | `online`, `offline` |
| `metadata` | JSON blob of system info |
## Troubleshooting

public/base.js Symbolic link

@@ -0,0 +1 @@
/root/code/web_template/base.js


@@ -15,9 +15,12 @@
--terminal-green: #00ff41;
--terminal-amber: #ffb000;
--terminal-cyan: #00ffff;
--terminal-red: #ff4444;
--bg-terminal: #001a00;
--bg-terminal-border: #003300;
--text-primary: #00ff41;
--text-secondary: #00cc33;
--text-muted: #008822;
--text-muted: #00bb33;
/* Border & UI */
--border-color: #00ff41;
@@ -40,6 +43,7 @@
--glow-green-intense: 0 0 8px #00ff41, 0 0 16px #00ff41, 0 0 24px #00ff41, 0 0 32px rgba(0, 255, 65, 0.5);
--glow-amber: 0 0 5px #ffb000, 0 0 10px #ffb000, 0 0 15px #ffb000;
--glow-amber-intense: 0 0 8px #ffb000, 0 0 16px #ffb000, 0 0 24px #ffb000;
--glow-red: 0 0 5px #ff4444, 0 0 10px #ff4444;
}
* { margin: 0; padding: 0; box-sizing: border-box; }
@@ -72,6 +76,7 @@
pointer-events: none;
z-index: 9999;
animation: scanline 8s linear infinite;
will-change: transform;
}
@keyframes scanline {
@@ -266,11 +271,11 @@
text-shadow: var(--glow-amber);
}
.card h3::before {
content: '═══ ';
content: '╠═══ ';
color: var(--terminal-green);
}
.card h3::after {
content: ' ═══';
content: ' ═══╣';
color: var(--terminal-green);
}
.status {
@@ -361,7 +366,12 @@
button::after { content: ' ]'; flex-shrink: 0; }
/* Suppress bracket pseudo-elements for tab/nav buttons and inline-styled sub-tabs */
button.tab::before, button.tab::after,
button[style*="border:none"]::before, button[style*="border:none"]::after { content: none; }
button[style*="border:none"]::before, button[style*="border:none"]::after,
button[style*="border: none"]::before, button[style*="border: none"]::after,
button[style*="flex:0"]::before, button[style*="flex:0"]::after,
button[style*="flex: 0"]::before, button[style*="flex: 0"]::after,
button[style*="flex:1"]::before, button[style*="flex:1"]::after,
button[style*="flex: 1"]::before, button[style*="flex: 1"]::after { content: none; }
button:hover {
background: rgba(0, 255, 65, 0.15);
color: var(--terminal-amber);
@@ -464,9 +474,9 @@
padding: 0;
border: 3px double var(--terminal-green);
border-radius: 0;
max-width: 600px;
width: 90%;
max-height: 80vh;
max-width: min(1100px, 96vw);
width: 96vw;
max-height: 90vh;
overflow-y: auto;
box-shadow: 0 0 30px rgba(0, 255, 65, 0.3);
position: relative;
@@ -564,6 +574,19 @@
font-family: var(--font-mono);
margin-bottom: 14px;
}
.prompt-output {
background: rgba(0, 0, 0, 0.4);
border: 1px solid rgba(0, 255, 65, 0.25);
color: var(--terminal-green);
font-family: var(--font-mono);
font-size: 0.78em;
padding: 10px;
max-height: 280px;
overflow-y: auto;
white-space: pre-wrap;
word-break: break-all;
margin-bottom: 14px;
}
.prompt-opt-btn {
padding: 7px 16px;
margin: 4px 4px 4px 0;
@@ -624,7 +647,7 @@
}
.log-entry.failed {
border-left-color: #ff4444;
border-left-color: var(--terminal-red);
}
.log-timestamp {
@@ -642,8 +665,8 @@
}
.log-entry.failed .log-title {
color: #ff4444;
text-shadow: 0 0 5px #ff4444;
color: var(--terminal-red);
text-shadow: var(--glow-red);
}
.log-details {
@@ -662,8 +685,8 @@
}
.log-output {
background: #0a0a0a;
border: 1px solid #003300;
background: var(--bg-primary);
border: 1px solid var(--bg-terminal-border);
padding: 10px;
margin: 6px 0;
color: var(--terminal-green);
@@ -675,14 +698,36 @@
}
.log-error {
color: #ff6666;
color: var(--terminal-red);
border-color: #330000;
}
/* parse_complete and route_taken log entries */
.log-parse-table {
display: grid;
grid-template-columns: max-content 1fr;
gap: 2px 12px;
font-size: 0.82em;
margin-top: 6px;
}
.log-parse-key { color: var(--terminal-amber); opacity: .8; }
.log-parse-val { color: var(--terminal-green); word-break: break-all; }
.log-route-label {
color: var(--terminal-cyan);
font-size: 0.9em;
margin-top: 4px;
}
.log-route-goto {
color: var(--terminal-amber);
font-size: 0.82em;
opacity: .75;
margin-top: 2px;
}
.log-entry code {
background: #001a00;
background: var(--bg-terminal);
padding: 2px 6px;
border: 1px solid #003300;
border: 1px solid var(--bg-terminal-border);
color: var(--terminal-green);
font-family: var(--font-mono);
}
@@ -699,15 +744,15 @@
.worker-stats span {
padding: 2px 6px;
background: #001a00;
border: 1px solid #003300;
background: var(--bg-terminal);
border: 1px solid var(--bg-terminal-border);
}
.worker-metadata {
margin-top: 12px;
padding: 10px;
background: #001a00;
border: 1px solid #003300;
background: var(--bg-terminal);
border: 1px solid var(--bg-terminal-border);
font-family: var(--font-mono);
font-size: 0.85em;
}
@@ -743,18 +788,18 @@
}
.execution-item:hover {
background: #001a00;
background: var(--bg-terminal);
border-left-width: 5px;
transform: translateX(3px);
}
.worker-item:hover {
background: #001a00;
background: var(--bg-terminal);
border-left-width: 5px;
}
.workflow-item:hover {
background: #001a00;
background: var(--bg-terminal);
border-left-width: 5px;
}
@@ -768,6 +813,15 @@
50% { opacity: 1; }
}
/* Running execution pulse */
.execution-item.status-running {
animation: exec-running-pulse 2s ease-in-out infinite;
}
@keyframes exec-running-pulse {
0%, 100% { border-color: var(--terminal-green); }
50% { border-color: var(--status-running); box-shadow: 0 0 8px rgba(255,193,7,0.35); }
}
/* Success/Error message animations */
@keyframes slide-in {
from {
@@ -784,6 +838,7 @@
animation: slide-in 0.3s ease-out;
}
</style>
<script src="/base.js"></script>
</head>
<body>
<div class="container">
@@ -792,8 +847,11 @@
<h1>⚡ PULSE</h1>
<p>Pipelined Unified Logic & Server Engine</p>
</div>
<div class="user-info" id="userInfo">
<div class="loading">Loading user...</div>
<div style="text-align:right;">
<div class="user-info" id="userInfo">
<div class="loading">Loading user...</div>
</div>
<div id="lastRefreshed" style="font-size:0.72em;color:var(--text-muted);font-family:var(--font-mono);margin-top:4px;"></div>
</div>
</div>
@@ -1232,6 +1290,7 @@
document.getElementById('workerList').innerHTML = fullHtml;
} catch (error) {
console.error('Error loading workers:', error);
document.getElementById('workerList').innerHTML = '<div class="empty" style="color:var(--terminal-red);">⚠ Failed to load workers</div>';
}
}
@@ -1261,9 +1320,9 @@
const def = _workflowRegistry[w.id] || {};
return `
<div class="workflow-item">
<div class="workflow-name">${w.name}${paramBadge(def)}</div>
<div class="workflow-desc">${w.description || 'No description'}</div>
<div class="timestamp">Created by ${w.created_by || 'Unknown'} on ${new Date(w.created_at).toLocaleString()}</div>
<div class="workflow-name">${escapeHtml(w.name)}${paramBadge(def)}</div>
<div class="workflow-desc">${escapeHtml(w.description || 'No description')}</div>
<div class="timestamp">Created by ${escapeHtml(w.created_by || 'Unknown')} on ${safeDate(w.created_at)?.toLocaleString() ?? 'N/A'}</div>
<div style="margin-top: 10px;">
<button onclick="executeWorkflow('${w.id}')">▶️ Execute</button>
${currentUser && currentUser.isAdmin ?
@@ -1276,6 +1335,7 @@
document.getElementById('workflowList').innerHTML = html;
} catch (error) {
console.error('Error loading workflows:', error);
document.getElementById('workflowList').innerHTML = '<div class="empty" style="color:var(--terminal-red);">⚠ Failed to load workflows</div>';
}
}
@@ -1304,14 +1364,22 @@
scheduleDesc = `Cron: ${s.schedule_value}`;
}
const nextRun = s.next_run ? new Date(s.next_run).toLocaleString() : 'Not scheduled';
const lastRun = s.last_run ? new Date(s.last_run).toLocaleString() : 'Never';
const nextRunDate = safeDate(s.next_run);
const nextRunIn = nextRunDate ? (() => {
const secs = Math.round((nextRunDate - Date.now()) / 1000);
if (secs <= 0) return 'now';
if (secs < 60) return `${secs}s`;
if (secs < 3600) return `${Math.round(secs/60)}m`;
return `${Math.round(secs/3600)}h`;
})() : null;
const nextRun = nextRunDate ? `${nextRunDate.toLocaleString()}${nextRunIn ? ` (in ${nextRunIn})` : ''}` : 'Not scheduled';
const lastRun = safeDate(s.last_run)?.toLocaleString() ?? 'Never';
return `
<div class="workflow-item" style="opacity: ${s.enabled ? 1 : 0.6};">
<div style="display: flex; justify-content: space-between; align-items: start;">
<div style="flex: 1;">
<div class="workflow-name">${s.name}</div>
<div class="workflow-name">${escapeHtml(s.name || '')}</div>
<div style="color: var(--terminal-green); font-family: var(--font-mono); font-size: 0.9em; margin: 8px 0;">
Command: <code>${escapeHtml(s.command)}</code>
</div>
@@ -1344,6 +1412,7 @@
document.getElementById('scheduleList').innerHTML = html;
} catch (error) {
console.error('Error loading schedules:', error);
document.getElementById('scheduleList').innerHTML = '<div class="empty" style="color:var(--terminal-red);">⚠ Failed to load schedules</div>';
}
}
@@ -1497,7 +1566,7 @@
<div class="execution-item" onclick="viewExecution('${e.id}')">
<span class="status ${e.status}">${e.status}</span>
<strong>${e.workflow_name || '[Quick Command]'}</strong>
<div class="timestamp">by ${e.started_by} at ${new Date(e.started_at).toLocaleString()}</div>
<div class="timestamp">by ${escapeHtml(e.started_by || '')} at ${safeDate(e.started_at)?.toLocaleString() ?? 'N/A'}</div>
</div>
`).join('');
document.getElementById('dashExecutions').innerHTML = dashHtml;
@@ -1514,6 +1583,7 @@
} catch (error) {
console.error('Error loading executions:', error);
document.getElementById('executionList').innerHTML = '<div class="empty" style="color:var(--terminal-red);">⚠ Failed to load executions</div>';
}
}
@@ -1615,14 +1685,15 @@
const clickHandler = compareMode ? `toggleExecutionSelection('${e.id}')` : `viewExecution('${e.id}')`;
const selectedStyle = isSelected ? 'background: rgba(255, 176, 0, 0.2); border-left-width: 5px; border-left-color: var(--terminal-amber);' : '';
const elapsed = e.status === 'running' ? `${formatElapsed(e.started_at)}` : '';
return `
<div class="execution-item" onclick="${clickHandler}" style="${selectedStyle} cursor: pointer;">
<div class="execution-item${e.status === 'running' ? ' status-running' : ''}" onclick="${clickHandler}" style="${selectedStyle} cursor: pointer;">
${compareMode && isSelected ? '<span style="color: var(--terminal-amber); margin-right: 8px;">✓</span>' : ''}
<span class="status ${e.status}">${e.status}</span>
<strong>${e.workflow_name || '[Quick Command]'}</strong>
<div class="timestamp">
Started by ${e.started_by} at ${new Date(e.started_at).toLocaleString()}
${e.completed_at ? ` • Completed at ${new Date(e.completed_at).toLocaleString()}` : ''}
Started by ${escapeHtml(e.started_by || '')} at ${safeDate(e.started_at)?.toLocaleString() ?? 'N/A'}
${e.completed_at ? ` • Completed at ${safeDate(e.completed_at)?.toLocaleString() ?? 'N/A'}` : elapsed}
</div>
</div>
`;
@@ -1737,7 +1808,7 @@
<tr style="border-bottom: 1px solid #003300;">
<td style="padding: 8px; color: var(--terminal-green);">Execution ${idx + 1}</td>
<td style="padding: 8px;"><span class="status ${exec.status}" style="font-size: 0.85em;">${exec.status}</span></td>
<td style="padding: 8px; color: var(--terminal-green);">${new Date(exec.started_at).toLocaleString()}</td>
<td style="padding: 8px; color: var(--terminal-green);">${safeDate(exec.started_at)?.toLocaleString() ?? 'N/A'}</td>
<td style="padding: 8px; color: var(--terminal-green);">${duration}</td>
</tr>
`;
@@ -1853,24 +1924,14 @@
if (!confirm('Delete all completed and failed executions?')) return;
try {
const response = await fetch('/api/executions?limit=9999'); // Get all executions
const data = await response.json();
const executions = data.executions || data; // Handle new pagination format
const toDelete = executions.filter(e => e.status === 'completed' || e.status === 'failed');
if (toDelete.length === 0) {
alert('No completed or failed executions to delete');
const response = await fetch('/api/executions/completed', { method: 'DELETE' });
if (!response.ok) {
const err = await response.json().catch(() => ({}));
showTerminalNotification(err.error || 'Failed to delete executions', 'error');
return;
}
let deleted = 0;
for (const execution of toDelete) {
const deleteResponse = await fetch(`/api/executions/${execution.id}`, { method: 'DELETE' });
if (deleteResponse.ok) deleted++;
}
showTerminalNotification(`Deleted ${deleted} execution(s)`, 'success');
const data = await response.json();
showTerminalNotification(`Deleted ${data.deleted} execution(s)`, 'success');
refreshData();
} catch (error) {
console.error('Error clearing executions:', error);
@@ -1970,19 +2031,21 @@
let html = `
<div><strong>Status:</strong> <span class="status ${execution.status}">${execution.status}</span></div>
<div><strong>Started:</strong> ${new Date(execution.started_at).toLocaleString()}</div>
${execution.completed_at ? `<div><strong>Completed:</strong> ${new Date(execution.completed_at).toLocaleString()}</div>` : ''}
<div><strong>Started by:</strong> ${execution.started_by}</div>
<div><strong>Started:</strong> ${safeDate(execution.started_at)?.toLocaleString() ?? 'N/A'}</div>
${execution.completed_at ? `<div><strong>Completed:</strong> ${safeDate(execution.completed_at)?.toLocaleString() ?? 'N/A'}</div>` : ''}
<div><strong>Started by:</strong> ${escapeHtml(execution.started_by || '')}</div>
`;
if (execution.waiting_for_input && execution.prompt) {
const promptOutput = execution.prompt.output || '';
html += `
<div class="prompt-box">
<h3>Waiting for Input</h3>
${promptOutput ? `<pre class="prompt-output">${escapeHtml(promptOutput)}</pre>` : ''}
<p>${escapeHtml(execution.prompt.message || '')}</p>
<div style="margin-top: 10px;">
${(execution.prompt.options || []).map(opt =>
`<button class="prompt-opt-btn" onclick="respondToPrompt('${executionId}', ${JSON.stringify(opt)})">${escapeHtml(opt)}</button>`
`<button class="prompt-opt-btn" data-opt="${opt.replace(/&/g,'&amp;').replace(/"/g,'&quot;')}" onclick="respondToPrompt('${executionId}', this.dataset.opt)">${escapeHtml(opt)}</button>`
).join('')}
</div>
</div>
@@ -2015,7 +2078,7 @@
// Re-run button (only for quick commands with command in logs)
const commandLog = execution.logs?.find(l => l.action === 'command_sent');
if (commandLog && commandLog.command) {
html += `<button onclick="rerunCommand('${escapeHtml(commandLog.command)}', '${commandLog.worker_id}')">🔄 Re-run Command</button>`;
html += `<button onclick="rerunCommand('${escapeHtml(commandLog.command)}', '${escapeHtml(commandLog.worker_id || '')}')">🔄 Re-run Command</button>`;
}
// Download logs button
@@ -2095,6 +2158,31 @@
`;
}
if (log.action === 'parse_complete') {
const pairs = log.parsed || {};
const keys = Object.keys(pairs);
const tableRows = keys.map(k =>
`<div class="log-parse-key">${escapeHtml(k)}</div><div class="log-parse-val">${escapeHtml(pairs[k])}</div>`
).join('');
return `
<div class="log-entry" style="border-left-color: #444; opacity:.85;">
<div class="log-timestamp">[${timestamp}]</div>
<div class="log-title" style="color:#666;">⚙ Parsed ${keys.length} variable${keys.length !== 1 ? 's' : ''}</div>
${keys.length > 0 ? `<div class="log-details"><div class="log-parse-table">${tableRows}</div></div>` : ''}
</div>
`;
}
if (log.action === 'route_taken') {
return `
<div class="log-entry" style="border-left-color: var(--terminal-cyan); opacity:.9;">
<div class="log-timestamp">[${timestamp}]</div>
<div class="log-title" style="color: var(--terminal-cyan);">⇒ Auto-route: Step ${log.step}</div>
${log.label ? `<div class="log-details"><div class="log-route-label">${escapeHtml(log.label)}</div>${log.goto ? `<div class="log-route-goto">→ ${escapeHtml(log.goto)}</div>` : ''}</div>` : ''}
</div>
`;
}
if (log.action === 'no_workers') {
return `
<div class="log-entry failed">
@@ -2146,7 +2234,7 @@
if (log.action === 'prompt') {
const optionsHtml = (log.options || []).map(opt => {
if (executionId) {
return `<button class="prompt-opt-btn" onclick="respondToPrompt('${executionId}', ${JSON.stringify(opt)})">${escapeHtml(opt)}</button>`;
return `<button class="prompt-opt-btn" data-opt="${opt.replace(/&/g,'&amp;').replace(/"/g,'&quot;')}" onclick="respondToPrompt('${executionId}', this.dataset.opt)">${escapeHtml(opt)}</button>`;
}
return `<button class="prompt-opt-btn answered" disabled>${escapeHtml(opt)}</button>`;
}).join('');
@@ -2155,6 +2243,7 @@
<div class="log-timestamp">[${timestamp}]</div>
<div class="log-title" style="color: var(--terminal-cyan);">❓ Step ${log.step}: ${escapeHtml(log.step_name || 'Prompt')}</div>
<div class="log-details">
${log.output ? `<pre class="log-output" style="max-height:300px;overflow-y:auto;margin-bottom:10px;">${escapeHtml(log.output)}</pre>` : ''}
<div style="color: var(--terminal-green); margin-bottom: 10px;">${escapeHtml(log.message || '')}</div>
<div>${optionsHtml}</div>
</div>
@@ -2264,8 +2353,9 @@
}
function escapeHtml(text) {
if (text == null) return '';
const div = document.createElement('div');
div.textContent = text;
div.textContent = String(text);
return div.innerHTML;
}
@@ -2288,7 +2378,9 @@
}
function getTimeAgo(date) {
if (!date || isNaN(date)) return 'N/A';
const seconds = Math.floor((new Date() - date) / 1000);
if (seconds < 0) return 'just now';
if (seconds < 60) return `${seconds}s ago`;
const minutes = Math.floor(seconds / 60);
if (minutes < 60) return `${minutes}m ago`;
@@ -2298,6 +2390,22 @@
return `${days}d ago`;
}
function safeDate(val) {
if (!val) return null;
const d = new Date(val);
return isNaN(d) ? null : d;
}
function formatElapsed(startedAt) {
const start = safeDate(startedAt);
if (!start) return '';
const secs = Math.floor((Date.now() - start) / 1000);
if (secs < 60) return `${secs}s`;
const mins = Math.floor(secs / 60);
if (mins < 60) return `${mins}m ${secs % 60}s`;
return `${Math.floor(mins / 60)}h ${mins % 60}m`;
}
async function rerunCommand(command, workerId) {
if (!confirm(`Re-run command: ${command}?`)) return;
@@ -2326,7 +2434,8 @@
closeModal('viewExecutionModal');
refreshData();
} else {
alert('Failed to abort execution');
const err = await response.json().catch(() => ({}));
alert(err.error || 'Failed to abort execution');
}
} catch (error) {
console.error('Error aborting execution:', error);
@@ -2431,7 +2540,7 @@
const html = history.map((item, index) => `
<div class="history-item" onclick="useHistoryCommand(${index})" style="cursor: pointer; padding: 12px; margin: 8px 0; background: #001a00; border: 1px solid #003300; border-left: 3px solid var(--terminal-amber);">
<div style="color: var(--terminal-green); font-family: var(--font-mono); margin-bottom: 4px;"><code>${escapeHtml(item.command)}</code></div>
<div style="color: #666; font-size: 0.85em;">${new Date(item.timestamp).toLocaleString()} - ${item.worker}</div>
<div style="color: #666; font-size: 0.85em;">${safeDate(item.timestamp)?.toLocaleString() ?? 'N/A'} - ${item.worker}</div>
</div>
`).join('');
document.getElementById('historyList').innerHTML = html;
@@ -2603,6 +2712,10 @@
}
function showCreateWorkflow() {
document.getElementById('workflowName').value = '';
document.getElementById('workflowDescription').value = '';
document.getElementById('workflowDefinition').value = '';
document.getElementById('workflowWebhookUrl').value = '';
document.getElementById('createWorkflowModal').classList.add('show');
}
@@ -2800,37 +2913,24 @@
document.querySelectorAll('.tab').forEach(t => t.classList.remove('active'));
document.querySelectorAll('.tab-content').forEach(c => c.classList.remove('active'));
// Find the button by its onclick attribute rather than relying on bare `event`
const tabBtn = document.querySelector(`.tab[onclick*="'${tabName}'"]`);
if (tabBtn) tabBtn.classList.add('active');
const tabContent = document.getElementById(tabName);
if (tabContent) tabContent.classList.add('active');
// Persist active tab across page loads
try { localStorage.setItem('pulse_activeTab', tabName); } catch {}
}
async function refreshData() {
try {
await loadWorkers();
} catch (e) {
console.error('Error loading workers:', e);
}
try { await loadWorkers(); } catch (e) { console.error('Error loading workers:', e); }
try { await loadWorkflows(); } catch (e) { console.error('Error loading workflows:', e); }
try { await loadExecutions(); } catch (e) { console.error('Error loading executions:', e); }
try { await loadSchedules(); } catch (e) { console.error('Error loading schedules:', e); }
try {
await loadWorkflows();
} catch (e) {
console.error('Error loading workflows:', e);
}
try {
await loadExecutions();
} catch (e) {
console.error('Error loading executions:', e);
}
try {
await loadSchedules();
} catch (e) {
console.error('Error loading schedules:', e);
}
// Update "last refreshed" indicator
const el = document.getElementById('lastRefreshed');
if (el) el.textContent = `Refreshed: ${new Date().toLocaleTimeString()}`;
}
// Terminal beep sound (Web Audio API)
@@ -2863,45 +2963,12 @@
}
}
// Show terminal notification
// Show terminal notification — delegates to lt.toast from base.js
function showTerminalNotification(message, type = 'info') {
const notification = document.createElement('div');
notification.style.cssText = `
position: fixed;
top: 80px;
right: 20px;
background: #001a00;
border: 2px solid var(--terminal-green);
color: var(--terminal-green);
padding: 15px 20px;
font-family: var(--font-mono);
z-index: 10000;
animation: slide-in 0.3s ease-out;
box-shadow: 0 0 20px rgba(0, 255, 65, 0.3);
`;
if (type === 'error') {
notification.style.borderColor = '#ff4444';
notification.style.color = '#ff4444';
message = '✗ ' + message;
} else if (type === 'success') {
message = '✓ ' + message;
} else {
message = ' ' + message;
}
notification.textContent = message;
document.body.appendChild(notification);
// Play beep
terminalBeep(type);
// Remove after 3 seconds
setTimeout(() => {
notification.style.opacity = '0';
notification.style.transition = 'opacity 0.5s';
setTimeout(() => notification.remove(), 500);
}, 3000);
if (type === 'success') return lt.toast.success(message);
if (type === 'error') return lt.toast.error(message);
if (type === 'warning') return lt.toast.warning(message);
return lt.toast.info(message);
}
function connectWebSocket() {
@@ -2911,18 +2978,9 @@
ws.onmessage = (event) => {
try {
const data = JSON.parse(event.data);
console.log('WebSocket message:', data);
// Handle specific message types
if (data.type === 'command_result') {
// Display command result in real-time
console.log(`Command result received for execution ${data.execution_id}`);
console.log(`Success: ${data.success}`);
console.log(`Output: ${data.stdout}`);
if (data.stderr) {
console.log(`Error: ${data.stderr}`);
}
// Show terminal notification only for manual executions
if (!data.is_automated) {
if (data.success) {
@@ -2947,7 +3005,6 @@
}
if (data.type === 'workflow_result') {
console.log(`Workflow ${data.status} for execution ${data.execution_id}`);
// Refresh execution list
loadExecutions();
@@ -2963,7 +3020,6 @@
}
if (data.type === 'worker_update') {
console.log(`Worker ${data.worker_id} status: ${data.status}`);
loadWorkers();
}
@@ -2992,8 +3048,12 @@
loadExecutions();
}
if (data.type === 'executions_bulk_deleted') {
loadExecutions();
}
// Generic refresh for other message types
if (!['command_result', 'workflow_result', 'worker_update', 'execution_started', 'execution_status', 'workflow_created', 'workflow_deleted', 'workflow_updated', 'execution_prompt'].includes(data.type)) {
if (!['command_result', 'workflow_result', 'worker_update', 'execution_started', 'execution_status', 'workflow_created', 'workflow_deleted', 'workflow_updated', 'execution_prompt', 'executions_bulk_deleted'].includes(data.type)) {
refreshData();
}
} catch (error) {
@@ -3015,10 +3075,8 @@
// Close any open modal on ESC key
document.addEventListener('keydown', (e) => {
if (e.key === 'Escape') {
document.querySelectorAll('.modal').forEach(modal => {
if (modal.style.display && modal.style.display !== 'none') {
modal.style.display = 'none';
}
document.querySelectorAll('.modal.show').forEach(modal => {
modal.classList.remove('show');
});
}
});
@@ -3026,12 +3084,29 @@
// Initialize
loadUser().then((success) => {
if (success) {
// Restore last-active tab
try {
const saved = localStorage.getItem('pulse_activeTab');
if (saved) switchTab(saved);
} catch {}
setExecutionView(executionView);
refreshData();
connectWebSocket();
setInterval(refreshData, 30000);
}
});
// Ctrl+Enter submits the quick command form
document.addEventListener('keydown', (e) => {
if ((e.ctrlKey || e.metaKey) && e.key === 'Enter') {
const activeTab = document.querySelector('.tab-content.active');
if (activeTab && activeTab.id === 'quickcommand') {
e.preventDefault();
executeQuickCommand();
}
}
});
</script>
<!-- Terminal Boot Sequence -->

server.js

@@ -33,6 +33,58 @@ const executionLimiter = rateLimit({
});
app.use('/api/', apiLimiter);
// Named constants for timeouts and limits
const PROMPT_TIMEOUT_MS = 60 * 60 * 1000; // 60 min — how long a prompt waits for user input
const COMMAND_TIMEOUT_MS = 120_000; // 2 min — workflow step command timeout
const QUICK_CMD_TIMEOUT_MS = 60_000; // 1 min — quick/direct/gandalf command timeout
const WEBHOOK_TIMEOUT_MS = 10_000; // 10 s — outbound webhook HTTP timeout
const WAIT_STEP_MAX_MS = 24 * 60 * 60_000; // 24 h — cap on workflow wait step
const GOTO_MAX_VISITS = 100; // max times a step may be revisited via goto
const WORKER_STALE_MINUTES = 5; // minutes before a worker is marked offline
const SERVER_VERSION = '1.0.0';
// Request logging middleware
app.use((req, res, next) => {
const start = Date.now();
res.on('finish', () => {
console.log(`[HTTP] ${req.method} ${req.path} ${res.statusCode} ${Date.now() - start}ms`);
});
next();
});
// Content-Type guard for JSON endpoints
function requireJSON(req, res, next) {
const ct = req.headers['content-type'] || '';
if (!ct.includes('application/json')) {
return res.status(415).json({ error: 'Content-Type must be application/json' });
}
next();
}
// Validate and parse a webhook URL; returns { ok, url, reason }
function validateWebhookUrl(raw) {
if (!raw) return { ok: true, url: null };
let url;
try { url = new URL(raw); } catch { return { ok: false, reason: 'Invalid URL format' }; }
if (!['http:', 'https:'].includes(url.protocol)) {
return { ok: false, reason: 'Webhook URL must use http or https' };
}
const host = url.hostname.toLowerCase();
if (
host === 'localhost' ||
host === '::1' ||
/^127\./.test(host) ||
/^10\./.test(host) ||
/^192\.168\./.test(host) ||
/^172\.(1[6-9]|2\d|3[01])\./.test(host) ||
/^169\.254\./.test(host) ||
/^fe80:/i.test(host)
) {
return { ok: false, reason: 'Webhook URL must not point to a private/internal address' };
}
return { ok: true, url };
}
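The private-range guard above can be exercised on its own; a minimal sketch, where `isPrivateHost` is a hypothetical helper mirroring the same regexes (hostname already extracted from the URL):

```javascript
// Hypothetical standalone helper mirroring validateWebhookUrl's
// private/internal-address rules (same regexes as above).
function isPrivateHost(hostname) {
  const host = hostname.toLowerCase();
  return host === 'localhost' ||
    host === '::1' ||
    /^127\./.test(host) ||
    /^10\./.test(host) ||
    /^192\.168\./.test(host) ||
    /^172\.(1[6-9]|2\d|3[01])\./.test(host) ||
    /^169\.254\./.test(host) ||
    /^fe80:/i.test(host);
}
console.log(isPrivateHost('192.168.1.5')); // true — RFC 1918 range
console.log(isPrivateHost('example.com')); // false — routable hostname
```

Note the `172.` regex only matches 172.16.0.0/12 (second octet 16–31), so e.g. 172.32.x.x is allowed through.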
// Database pool
const pool = mysql.createPool({
host: process.env.DB_HOST,
@@ -136,7 +188,7 @@ async function initDatabase() {
[exec.id]
);
await connection.query(
"UPDATE executions SET logs = JSON_ARRAY_APPEND(COALESCE(logs, '[]'), '$', CAST(? AS JSON)) WHERE id = ?",
"UPDATE executions SET logs = JSON_ARRAY_APPEND(COALESCE(logs, '[]'), '$', JSON_EXTRACT(?, '$')) WHERE id = ?",
[JSON.stringify({ action: 'server_restart_recovery', message: 'Execution marked failed due to server restart', timestamp: new Date().toISOString() }), exec.id]
);
}
@@ -185,7 +237,20 @@ async function processScheduledCommands() {
);
for (const schedule of schedules) {
// Prevent overlapping execution — skip if a previous run is still active
// Atomically claim this run by advancing next_run before doing any work.
// If two scheduler instances race, only the one that updates a row proceeds.
const claimNextRun = calculateNextRun(schedule.schedule_type, schedule.schedule_value);
const [claimed] = await pool.query(
`UPDATE scheduled_commands SET next_run = ?, last_run = NOW()
WHERE id = ? AND (next_run IS NULL OR next_run <= NOW())`,
[claimNextRun, schedule.id]
);
if (claimed.affectedRows === 0) {
console.log(`[Scheduler] Skipping "${schedule.name}" - already claimed by another run`);
continue;
}
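The claim-before-work idea reduces to a conditional state update where only one racer observes the precondition. A framework-free sketch — `makeClaimer` is purely illustrative; in the real code the atomicity comes from the database serializing the conditional `UPDATE`, not from JavaScript:

```javascript
// Single-process stand-in for the atomic claim: whoever advances next_run
// first "wins"; everyone else sees the equivalent of affectedRows === 0.
function makeClaimer(intervalMs = 60_000) {
  let nextRun = 0; // stand-in for the next_run column
  return function claim(now) {
    if (nextRun <= now) {         // WHERE next_run IS NULL OR next_run <= NOW()
      nextRun = now + intervalMs; // SET next_run = <next slot>
      return true;                // affectedRows === 1 — we claimed the run
    }
    return false;                 // affectedRows === 0 — another run claimed it
  };
}
const claim = makeClaimer();
console.log(claim(1_000)); // true  — first caller claims the slot
console.log(claim(1_000)); // false — second caller skips
```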
// Also skip if a previous run is still active
const [runningExecs] = await pool.query(
"SELECT id FROM executions WHERE started_by = ? AND status = 'running'",
[`scheduler:${schedule.name}`]
@@ -198,9 +263,15 @@ async function processScheduledCommands() {
console.log(`[Scheduler] Running scheduled command: ${schedule.name}`);
// Handle both string (raw SQL) and object (auto-parsed by MySQL2 JSON column)
const workerIds = typeof schedule.worker_ids === 'string'
? JSON.parse(schedule.worker_ids)
: schedule.worker_ids;
let workerIds;
try {
workerIds = typeof schedule.worker_ids === 'string'
? JSON.parse(schedule.worker_ids)
: schedule.worker_ids;
} catch {
console.error(`[Scheduler] Invalid worker_ids JSON for "${schedule.name}" — skipping`);
continue;
}
// Execute command on each worker
for (const workerId of workerIds) {
@@ -226,25 +297,14 @@ async function processScheduledCommands() {
execution_id: executionId,
command: schedule.command,
worker_id: workerId,
timeout: 300000 // 5 minute timeout for scheduled commands
timeout: 5 * 60_000 // 5 min — scheduled command timeout
}));
broadcast({ type: 'execution_started', execution_id: executionId, workflow_id: null });
}
}
// Update last_run and calculate next_run
let nextRun;
try {
nextRun = calculateNextRun(schedule.schedule_type, schedule.schedule_value);
} catch (err) {
console.error(`[Scheduler] Invalid schedule config for "${schedule.name}": ${err.message}`);
continue;
}
await pool.query(
'UPDATE scheduled_commands SET last_run = NOW(), next_run = ? WHERE id = ?',
[nextRun, schedule.id]
);
// next_run and last_run already updated atomically above when we claimed the slot
}
} catch (error) {
console.error('[Scheduler] Error processing scheduled commands:', error);
@@ -286,6 +346,25 @@ setInterval(processScheduledCommands, 60 * 1000);
// Initial run on startup
setTimeout(processScheduledCommands, 5000);
// Mark workers offline when their heartbeat goes stale
async function markStaleWorkersOffline() {
try {
const [stale] = await pool.query(
`SELECT id FROM workers WHERE status = 'online'
AND last_heartbeat < DATE_SUB(NOW(), INTERVAL ? MINUTE)`,
[WORKER_STALE_MINUTES]
);
for (const w of stale) {
await pool.query(`UPDATE workers SET status='offline' WHERE id=?`, [w.id]);
broadcast({ type: 'worker_update', worker_id: w.id, status: 'offline' });
console.log(`[Worker] Marked stale worker ${w.id} offline`);
}
} catch (error) {
console.error('[Worker] Stale check error:', error);
}
}
setInterval(markStaleWorkersOffline, 60_000);
// WebSocket connections
const browserClients = new Set(); // Browser UI connections
const workerClients = new Set(); // Worker agent connections
@@ -299,7 +378,6 @@ wss.on('connection', (ws) => {
ws.on('message', async (data) => {
try {
const message = JSON.parse(data.toString());
console.log('WebSocket message received:', message.type);
if (message.type === 'command_result') {
// Handle command result from worker
@@ -337,6 +415,12 @@ wss.on('connection', (ws) => {
}
}
// Store last command output in execution state so prompt steps can surface it
const execStateForOutput = _executionState.get(execution_id);
if (execStateForOutput) {
execStateForOutput.state._lastCommandOutput = stdout || '';
}
// Broadcast result to browser clients only
broadcast({
type: 'command_result',
@@ -348,7 +432,7 @@ wss.on('connection', (ws) => {
is_automated: isAutomated,
});
console.log(`Command result received for execution ${execution_id}: ${success ? 'success' : 'failed'}`);
if (!success) console.warn(`[Worker] Command failed for execution ${execution_id} on worker ${worker_id}`);
}
if (message.type === 'workflow_result') {
@@ -403,6 +487,12 @@ wss.on('connection', (ws) => {
if (dbWorkers.length > 0) {
const dbWorkerId = dbWorkers[0].id;
// Clean up any stale entry for this db worker before storing the new one
// (handles reconnect: old runtime-ID entry would otherwise linger).
for (const [key, val] of workers) {
if (val.dbWorkerId === dbWorkerId && val !== ws) workers.delete(key);
}
// Store worker WebSocket connection using BOTH IDs
workers.set(worker_id, ws); // Runtime ID
workers.set(dbWorkerId, ws); // Database ID
@@ -431,12 +521,14 @@ wss.on('connection', (ws) => {
}
if (message.type === 'pong') {
// Handle worker pong response
const { worker_id } = message;
await pool.query(
`UPDATE workers SET last_heartbeat=NOW() WHERE id=?`,
[worker_id]
);
// Use the DB worker ID stored on connect; fall back to message payload
const dbId = ws.dbWorkerId || message.worker_id;
if (dbId) {
await pool.query(
`UPDATE workers SET last_heartbeat=NOW() WHERE id=?`,
[dbId]
);
}
}
} catch (error) {
@@ -447,7 +539,6 @@ wss.on('connection', (ws) => {
ws.on('close', () => {
browserClients.delete(ws);
workerClients.delete(ws);
// Remove worker from workers map when disconnected (both runtime and db IDs)
if (ws.workerId) {
workers.delete(ws.workerId);
console.log(`Worker ${ws.workerId} (runtime ID) disconnected`);
@@ -455,13 +546,20 @@ wss.on('connection', (ws) => {
if (ws.dbWorkerId) {
workers.delete(ws.dbWorkerId);
console.log(`Worker ${ws.dbWorkerId} (database ID) disconnected`);
// Mark worker offline in DB
pool.query(`UPDATE workers SET status='offline' WHERE id=?`, [ws.dbWorkerId])
.then(() => broadcast({ type: 'worker_update', worker_id: ws.dbWorkerId, status: 'offline' }))
.catch(err => console.error('[Worker] Failed to mark worker offline:', err));
}
});
});
// Broadcast to browser clients only (NOT worker agents)
function broadcast(data) {
browserClients.forEach(client => {
// Snapshot the Set before iterating — a close event firing mid-broadcast
// would otherwise delete entries from the Set during iteration, causing
// not-yet-visited clients to be silently skipped.
const snapshot = Array.from(browserClients);
snapshot.forEach(client => {
if (client.readyState === WebSocket.OPEN) {
client.send(JSON.stringify(data));
}
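Why the snapshot matters can be shown with a plain Set, no WebSockets involved — deleting a not-yet-visited entry mid-iteration means it is never visited:

```javascript
// Self-contained demo: mutating a Set during iteration skips entries,
// which is why broadcast() iterates a snapshot of browserClients instead.
const clients = new Set(['a', 'b', 'c']);
const visited = [];
for (const c of clients) {
  visited.push(c);
  if (c === 'a') clients.delete('b'); // e.g. a close handler firing mid-send
}
console.log(visited); // ['a', 'c'] — 'b' was never visited
```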
@@ -472,7 +570,7 @@ function broadcast(data) {
async function addExecutionLog(executionId, logEntry) {
try {
await pool.query(
"UPDATE executions SET logs = JSON_ARRAY_APPEND(COALESCE(logs, '[]'), '$', CAST(? AS JSON)) WHERE id = ?",
"UPDATE executions SET logs = JSON_ARRAY_APPEND(COALESCE(logs, '[]'), '$', JSON_EXTRACT(?, '$')) WHERE id = ?",
[JSON.stringify(logEntry), executionId]
);
} catch (error) {
@@ -520,6 +618,12 @@ async function fireWebhook(executionId, status) {
if (!rows.length || !rows[0].webhook_url) return;
const exec = rows[0];
const { ok, url, reason } = validateWebhookUrl(exec.webhook_url);
if (!ok) {
console.warn(`[Webhook] Skipping invalid stored URL for execution ${executionId}: ${reason}`);
return;
}
const payload = {
execution_id: executionId,
workflow_id: exec.workflow_id,
@@ -531,15 +635,12 @@ async function fireWebhook(executionId, status) {
timestamp: new Date().toISOString()
};
const url = new URL(exec.webhook_url);
const body = JSON.stringify(payload);
const options = {
const response = await fetch(exec.webhook_url, {
method: 'POST',
headers: { 'Content-Type': 'application/json', 'User-Agent': 'PULSE-Webhook/1.0' }
};
// Use native fetch (Node 18+)
const response = await fetch(exec.webhook_url, { ...options, body, signal: AbortSignal.timeout(10000) });
headers: { 'Content-Type': 'application/json', 'User-Agent': 'PULSE-Webhook/1.0' },
body: JSON.stringify(payload),
signal: AbortSignal.timeout(WEBHOOK_TIMEOUT_MS)
});
console.log(`[Webhook] ${url.hostname} responded ${response.status} for execution ${executionId}`);
}
@@ -631,6 +732,7 @@ function evalCondition(condition, state, params) {
const context = vm.createContext({ state, params, promptResponse: state.promptResponse });
return !!vm.runInNewContext(condition, context, { timeout: 100 });
} catch (e) {
console.warn(`[Workflow] evalCondition error (treated as false): ${e.message} — condition: ${condition}`);
return false;
}
}
@@ -660,8 +762,20 @@ async function executeWorkflowSteps(executionId, workflowId, definition, usernam
steps.forEach((step, i) => { if (step.id) stepIdMap.set(step.id, i); });
let currentIndex = 0;
const stepVisits = new Array(steps.length).fill(0);
while (currentIndex < steps.length) {
// Detect goto infinite loops
stepVisits[currentIndex] = (stepVisits[currentIndex] || 0) + 1;
if (stepVisits[currentIndex] > GOTO_MAX_VISITS) {
await addExecutionLog(executionId, {
action: 'workflow_error',
error: `Infinite loop detected at step ${currentIndex + 1} (visited ${stepVisits[currentIndex]} times)`,
timestamp: new Date().toISOString()
});
await updateExecutionStatus(executionId, 'failed');
return;
}
// Enforce global execution timeout
if (Date.now() > executionDeadline) {
await addExecutionLog(executionId, {
@@ -675,6 +789,16 @@ async function executeWorkflowSteps(executionId, workflowId, definition, usernam
const step = steps[currentIndex];
const execState = _executionState.get(executionId);
if (!execState) {
// State was lost (server restart or bug) — fail cleanly rather than throwing TypeError
await addExecutionLog(executionId, {
action: 'workflow_error',
error: 'Execution state lost unexpectedly; aborting.',
timestamp: new Date().toISOString()
});
await updateExecutionStatus(executionId, 'failed');
return;
}
const stepLabel = step.name || step.id || `Step ${currentIndex + 1}`;
console.log(`[Workflow] ${executionId} — step ${currentIndex + 1}: ${stepLabel}`);
@@ -723,12 +847,55 @@ async function executeWorkflowSteps(executionId, workflowId, definition, usernam
}
} else if (step.type === 'wait') {
const ms = (step.duration || 5) * 1000;
let ms = parseFloat(step.duration || 5) * 1000;
if (isNaN(ms) || ms < 0) ms = 5000;
if (ms > WAIT_STEP_MAX_MS) ms = WAIT_STEP_MAX_MS;
await addExecutionLog(executionId, {
step: currentIndex + 1, action: 'waiting', duration_ms: ms,
timestamp: new Date().toISOString()
});
await new Promise(r => setTimeout(r, ms));
} else if (step.type === 'parse') {
// Parse KEY=VALUE lines from last command output into execution state.
// Matches lines of the form: UPPER_KEY=value (key must start with uppercase letter)
const condOkParse = !step.condition || evalCondition(step.condition, execState?.state || {}, execState?.params || {});
const parseState = condOkParse ? _executionState.get(executionId) : null;
if (parseState) {
const output = parseState.state._lastCommandOutput || '';
const parsed = {};
for (const line of output.split('\n')) {
const m = line.match(/^([A-Z][A-Z0-9_]*)=(.*)$/);
if (m) {
const k = m[1].toLowerCase();
const v = m[2].trim();
parsed[k] = v;
parseState.state[k] = v;
}
}
await addExecutionLog(executionId, {
step: currentIndex + 1, step_name: stepLabel, action: 'parse_complete',
parsed_count: Object.keys(parsed).length, parsed,
timestamp: new Date().toISOString()
});
}
} else if (step.type === 'route') {
// Evaluate conditions in order; jump to the first match (no user input needed).
const routeState = execState?.state || {};
const routeParams = execState?.params || {};
let routeTaken = null;
for (const cond of (step.conditions || [])) {
const matches = cond.default || evalCondition(cond.if || 'false', routeState, routeParams);
if (matches) { routeTaken = cond; gotoId = cond.goto || null; break; }
}
await addExecutionLog(executionId, {
step: currentIndex + 1, step_name: stepLabel, action: 'route_taken',
condition: routeTaken?.if || 'default',
label: routeTaken?.label || null,
goto: gotoId,
timestamp: new Date().toISOString()
});
}
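The parse step's extraction can be tried in isolation with the same `^([A-Z][A-Z0-9_]*)=(.*)$` regex; the sample command output below is made up for illustration:

```javascript
// Demo of the KEY=VALUE extraction: keys must start with an uppercase
// letter and are lowercased before being written into execution state.
const output = 'IFACE=eth0\nLINK_STATE=up\nsome unrelated line\nSPEED=1000';
const parsed = {};
for (const line of output.split('\n')) {
  const m = line.match(/^([A-Z][A-Z0-9_]*)=(.*)$/);
  if (m) parsed[m[1].toLowerCase()] = m[2].trim();
}
console.log(parsed); // { iface: 'eth0', link_state: 'up', speed: '1000' }
```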
await addExecutionLog(executionId, {
@@ -776,18 +943,28 @@ async function executeWorkflowSteps(executionId, workflowId, definition, usernam
// Pause execution and wait for user to respond via POST /api/executions/:id/respond.
// Resolves with the chosen option string, or null on 60-minute timeout.
async function executePromptStep(executionId, step, stepNumber) {
const message = step.message || 'Please choose an option:';
const execState = _executionState.get(executionId);
// Apply param substitution to the message so {{server_name}}, {{iface}} etc. work
let message = step.message || 'Please choose an option:';
if (execState && Object.keys(execState.params).length > 0) {
message = applyParams(message, execState.params);
}
const options = step.options || ['Continue'];
await addExecutionLog(executionId, {
// Include the last command output so the user can review results alongside the question
const lastOutput = (execState?.state?._lastCommandOutput) || null;
const logEntry = {
step: stepNumber, step_name: step.name, action: 'prompt',
message, options, timestamp: new Date().toISOString()
});
};
if (lastOutput) logEntry.output = lastOutput;
await addExecutionLog(executionId, logEntry);
broadcast({
type: 'execution_prompt',
execution_id: executionId,
prompt: { message, options, step: stepNumber, step_name: step.name }
prompt: { message, options, step: stepNumber, step_name: step.name, output: lastOutput || undefined }
});
return new Promise(resolve => {
@@ -795,12 +972,15 @@ async function executePromptStep(executionId, step, stepNumber) {
_executionPrompts.delete(executionId);
console.warn(`[Workflow] Prompt timed out for execution ${executionId}`);
resolve(null);
}, 60 * 60 * 1000); // 60 minute timeout
}, PROMPT_TIMEOUT_MS);
_executionPrompts.set(executionId, (response) => {
clearTimeout(timer);
_executionPrompts.delete(executionId);
resolve(response);
_executionPrompts.set(executionId, {
resolve: (response) => {
clearTimeout(timer);
_executionPrompts.delete(executionId);
resolve(response);
},
options
});
});
}
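The pending-prompt mechanism is a Map of `{ resolve, options }` entries bridging the awaiting workflow step and the HTTP responder; a minimal framework-free sketch (`askUser` and `answer` are illustrative names, not functions from this codebase):

```javascript
// Storing both the resolver and the allowed options lets the responder
// validate the answer before unblocking the awaiting workflow step.
const prompts = new Map();
function askUser(executionId, options) {
  return new Promise(resolve => prompts.set(executionId, { resolve, options }));
}
function answer(executionId, response) {
  const pending = prompts.get(executionId);
  if (!pending) return false;                            // no pending prompt → 404
  if (!pending.options.includes(response)) return false; // not an allowed option → 400
  prompts.delete(executionId);
  pending.resolve(response);                             // unblocks the awaiting step
  return true;
}
```

In the real code the 60-minute timer wraps the same resolver, so an abandoned prompt eventually resolves to `null`.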
@@ -848,6 +1028,7 @@ async function executeCommandStep(executionId, step, stepNumber, params = {}) {
// Execute command on each target worker and wait for results
const results = [];
let sentCount = 0;
for (const workerId of targetWorkerIds) {
const workerWs = workers.get(workerId);
@@ -861,9 +1042,10 @@ async function executeCommandStep(executionId, step, stepNumber, params = {}) {
});
continue;
}
sentCount++;
// Send command to worker
const commandId = crypto.randomUUID();
let commandId = crypto.randomUUID();
await addExecutionLog(executionId, {
step: stepNumber,
@@ -874,27 +1056,72 @@ async function executeCommandStep(executionId, step, stepNumber, params = {}) {
timestamp: new Date().toISOString()
});
// Per-step timeout override: step.timeout (seconds) overrides global default
const stepTimeoutMs = step.timeout
? Math.min(Math.max(parseInt(step.timeout, 10) * 1000, 5000), 600000)
: COMMAND_TIMEOUT_MS;
workerWs.send(JSON.stringify({
type: 'execute_command',
execution_id: executionId,
command_id: commandId,
command: command,
worker_id: workerId,
timeout: 120000 // 2 minute timeout
timeout: stepTimeoutMs
}));
// Wait for command result (with timeout)
const result = await waitForCommandResult(executionId, commandId, 120000);
// Per-step retry: step.retries (default 0) with step.retryDelayMs (default 2000)
const maxRetries = Math.min(parseInt(step.retries || 0, 10), 5);
const retryDelayMs = Math.min(parseInt(step.retryDelayMs || 2000, 10), 30000);
let result;
let attempt = 0;
while (true) {
result = await waitForCommandResult(executionId, commandId, stepTimeoutMs);
if (result.success || attempt >= maxRetries) break;
attempt++;
await addExecutionLog(executionId, {
step: stepNumber,
action: 'command_retry',
worker_id: workerId,
attempt,
max_retries: maxRetries,
message: `Command failed, retrying (attempt ${attempt}/${maxRetries})...`,
timestamp: new Date().toISOString()
});
await new Promise(r => setTimeout(r, retryDelayMs));
// Re-send command for retry
const retryCommandId = crypto.randomUUID();
await addExecutionLog(executionId, {
step: stepNumber, action: 'command_sent', worker_id: workerId,
command: command, command_id: retryCommandId, timestamp: new Date().toISOString()
});
workerWs.send(JSON.stringify({
type: 'execute_command', execution_id: executionId, command_id: retryCommandId,
command: command, worker_id: workerId, timeout: stepTimeoutMs
}));
commandId = retryCommandId; // update for next waitForCommandResult call
}
results.push(result);
if (!result.success) {
// Command failed, workflow should stop
// Command failed after all retries, workflow should stop
return false;
}
}
// All commands succeeded
return results.every(r => r.success);
// If every target worker was offline, treat as step failure
if (sentCount === 0) {
await addExecutionLog(executionId, {
step: stepNumber,
action: 'no_workers',
message: 'All target workers were offline when step executed',
timestamp: new Date().toISOString()
});
return false;
}
return true; // all sent commands succeeded (failures return false above)
} catch (error) {
console.error(`[Workflow] Error executing command step:`, error);
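The retry loop above follows a standard send-wait-retry shape; stripped of logging and command-ID re-send bookkeeping it reduces to this sketch (`runWithRetries` and `sendOnce` are hypothetical names):

```javascript
// Retry a send/wait cycle until it succeeds or maxRetries is exhausted,
// sleeping retryDelayMs between attempts; returns the final result.
async function runWithRetries(sendOnce, maxRetries, retryDelayMs) {
  let attempt = 0;
  while (true) {
    const result = await sendOnce(attempt);
    if (result.success || attempt >= maxRetries) return result;
    attempt++;
    await new Promise(r => setTimeout(r, retryDelayMs));
  }
}
// Fails twice, then succeeds on the third attempt:
let calls = 0;
runWithRetries(async () => ({ success: ++calls >= 3 }), 5, 10)
  .then(r => console.log(r.success, calls)); // true 3
```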
@@ -936,7 +1163,8 @@ app.get('/api/workflows', authenticateSSO, async (req, res) => {
const [rows] = await pool.query('SELECT * FROM workflows ORDER BY created_at DESC');
res.json(rows);
} catch (error) {
res.status(500).json({ error: error.message });
console.error('[API] GET /api/workflows error:', error);
res.status(500).json({ error: 'Internal server error' });
}
});
@@ -945,18 +1173,24 @@ app.get('/api/workflows/:id', authenticateSSO, async (req, res) => {
const [rows] = await pool.query('SELECT * FROM workflows WHERE id = ?', [req.params.id]);
if (rows.length === 0) return res.status(404).json({ error: 'Not found' });
const wf = rows[0];
res.json({ ...wf, definition: JSON.parse(wf.definition || '{}') });
let definition = {};
try { definition = JSON.parse(wf.definition || '{}'); } catch { /* corrupt definition — return empty */ }
res.json({ ...wf, definition });
} catch (error) {
res.status(500).json({ error: error.message });
console.error('[API] GET /api/workflows/:id error:', error);
res.status(500).json({ error: 'Internal server error' });
}
});
app.post('/api/workflows', authenticateSSO, async (req, res) => {
app.post('/api/workflows', authenticateSSO, requireJSON, async (req, res) => {
try {
const { name, description, definition, webhook_url } = req.body;
const id = crypto.randomUUID();
if (!name || !definition) return res.status(400).json({ error: 'name and definition are required' });
console.log('[Workflow] Creating workflow:', name);
const webhookCheck = validateWebhookUrl(webhook_url);
if (!webhookCheck.ok) return res.status(400).json({ error: webhookCheck.reason });
const id = crypto.randomUUID();
await pool.query(
'INSERT INTO workflows (id, name, description, definition, webhook_url, created_by) VALUES (?, ?, ?, ?, ?, ?)',
@@ -964,13 +1198,10 @@ app.post('/api/workflows', authenticateSSO, async (req, res) => {
);
res.json({ id, name, description, definition, webhook_url: webhook_url || null });
console.log('[Workflow] Broadcasting workflow_created');
broadcast({ type: 'workflow_created', workflow_id: id });
console.log('[Workflow] Broadcast complete');
} catch (error) {
console.error('[Workflow] Error creating workflow:', error);
res.status(500).json({ error: error.message });
res.status(500).json({ error: 'Internal server error' });
}
});
@@ -985,7 +1216,7 @@ app.delete('/api/workflows/:id', authenticateSSO, async (req, res) => {
res.json({ success: true });
broadcast({ type: 'workflow_deleted', workflow_id: req.params.id });
} catch (error) {
res.status(500).json({ error: error.message });
res.status(500).json({ error: 'Internal server error' });
}
});
@@ -994,7 +1225,8 @@ app.get('/api/workers', authenticateSSO, async (req, res) => {
const [rows] = await pool.query('SELECT id, name, status, last_heartbeat, metadata FROM workers ORDER BY name');
res.json(rows);
} catch (error) {
res.status(500).json({ error: error.message });
console.error('[API] GET /api/workers error:', error);
res.status(500).json({ error: 'Internal server error' });
}
});
@@ -1002,9 +1234,9 @@ app.post('/api/workers/heartbeat', async (req, res) => {
try {
const { worker_id, name, metadata } = req.body;
const apiKey = req.headers['x-api-key'];
// Verify API key
if (apiKey !== process.env.WORKER_API_KEY) {
// Verify API key — reject missing or wrong keys
if (!apiKey || apiKey !== process.env.WORKER_API_KEY) {
return res.status(401).json({ error: 'Invalid API key' });
}
@@ -1021,13 +1253,15 @@ app.post('/api/workers/heartbeat', async (req, res) => {
broadcast({ type: 'worker_update', worker_id, status: 'online' });
res.json({ success: true });
} catch (error) {
res.status(500).json({ error: error.message });
console.error('[Worker] Heartbeat error:', error);
res.status(500).json({ error: 'Internal server error' });
}
});
app.post('/api/executions', executionLimiter, authenticateSSO, async (req, res) => {
app.post('/api/executions', executionLimiter, authenticateSSO, requireJSON, async (req, res) => {
try {
const { workflow_id, params = {}, dry_run = false } = req.body;
if (!workflow_id) return res.status(400).json({ error: 'workflow_id is required' });
const id = crypto.randomUUID();
// Get workflow definition
@@ -1037,7 +1271,12 @@ app.post('/api/executions', executionLimiter, authenticateSSO, async (req, res)
}
const workflow = workflows[0];
const definition = typeof workflow.definition === 'string' ? JSON.parse(workflow.definition) : workflow.definition;
let definition;
try {
definition = typeof workflow.definition === 'string' ? JSON.parse(workflow.definition) : workflow.definition;
} catch {
return res.status(500).json({ error: 'Workflow definition is corrupt' });
}
// Validate required params
const paramDefs = definition.params || [];
@@ -1069,7 +1308,8 @@ app.post('/api/executions', executionLimiter, authenticateSSO, async (req, res)
res.json({ id, workflow_id, status: 'running', dry_run: !!dry_run });
} catch (error) {
res.status(500).json({ error: error.message });
console.error('[Execution] Error starting execution:', error);
res.status(500).json({ error: 'Internal server error' });
}
});
@@ -1138,7 +1378,21 @@ app.get('/api/executions', authenticateSSO, async (req, res) => {
hasMore: offset + rows.length < total
});
} catch (error) {
res.status(500).json({ error: error.message });
res.status(500).json({ error: 'Internal server error' });
}
});
app.delete('/api/executions/completed', authenticateSSO, async (req, res) => {
try {
if (!req.user.isAdmin) return res.status(403).json({ error: 'Admin access required' });
const [result] = await pool.query(
"DELETE FROM executions WHERE status IN ('completed', 'failed')"
);
broadcast({ type: 'executions_bulk_deleted' });
res.json({ success: true, deleted: result.affectedRows });
} catch (error) {
console.error('[Execution] Bulk delete error:', error);
res.status(500).json({ error: 'Internal server error' });
}
});
@@ -1149,7 +1403,8 @@ app.delete('/api/executions/:id', authenticateSSO, async (req, res) => {
broadcast({ type: 'execution_deleted', execution_id: req.params.id });
res.json({ success: true });
} catch (error) {
res.status(500).json({ error: error.message });
console.error('[Execution] Error deleting execution:', error);
res.status(500).json({ error: 'Internal server error' });
}
});
@@ -1180,14 +1435,14 @@ app.post('/api/executions/:id/abort', authenticateSSO, async (req, res) => {
// Unblock any pending prompt so the thread can exit
const pending = _executionPrompts.get(executionId);
if (pending) pending(null);
if (pending) pending.resolve(null);
console.log(`[Execution] Execution ${executionId} aborted by ${req.user.username}`);
res.json({ success: true });
} catch (error) {
console.error('[Execution] Error aborting execution:', error);
res.status(500).json({ error: error.message });
res.status(500).json({ error: 'Internal server error' });
}
});
@@ -1197,11 +1452,20 @@ app.post('/api/executions/:id/respond', authenticateSSO, async (req, res) => {
const { id } = req.params;
const { response } = req.body;
if (!response) return res.status(400).json({ error: 'response is required' });
if (!response || typeof response !== 'string') {
return res.status(400).json({ error: 'response is required' });
}
const pending = _executionPrompts.get(id);
if (!pending) return res.status(404).json({ error: 'No pending prompt for this execution' });
// Validate response is one of the allowed options
if (pending.options && !pending.options.includes(response)) {
return res.status(400).json({
error: `Invalid response. Allowed values: ${pending.options.join(', ')}`
});
}
await addExecutionLog(id, {
action: 'prompt_response', response,
responded_by: req.user.username,
@@ -1210,15 +1474,15 @@ app.post('/api/executions/:id/respond', authenticateSSO, async (req, res) => {
broadcast({ type: 'prompt_response', execution_id: id, response });
pending(response);
pending.resolve(response);
res.json({ success: true });
} catch (error) {
res.status(500).json({ error: error.message });
res.status(500).json({ error: 'Internal server error' });
}
});
// Edit a workflow definition (admin only)
-app.put('/api/workflows/:id', authenticateSSO, async (req, res) => {
+app.put('/api/workflows/:id', authenticateSSO, requireJSON, async (req, res) => {
try {
if (!req.user.isAdmin) return res.status(403).json({ error: 'Admin only' });
@@ -1227,11 +1491,13 @@ app.put('/api/workflows/:id', authenticateSSO, async (req, res) => {
if (!name || !definition) return res.status(400).json({ error: 'name and definition required' });
// Validate definition is parseable JSON
const webhookCheck = validateWebhookUrl(webhook_url);
if (!webhookCheck.ok) return res.status(400).json({ error: webhookCheck.reason });
let defObj;
try {
defObj = typeof definition === 'string' ? JSON.parse(definition) : definition;
-} catch (e) {
+} catch {
return res.status(400).json({ error: 'Invalid JSON in definition' });
}
@@ -1245,7 +1511,8 @@ app.put('/api/workflows/:id', authenticateSSO, async (req, res) => {
broadcast({ type: 'workflow_updated', workflow_id: id });
res.json({ success: true });
} catch (error) {
-res.status(500).json({ error: error.message });
+console.error('[Workflow] Error updating workflow:', error);
+res.status(500).json({ error: 'Internal server error' });
}
});
@@ -1257,11 +1524,11 @@ app.get('/api/scheduled-commands', authenticateSSO, async (req, res) => {
);
res.json(schedules);
} catch (error) {
-res.status(500).json({ error: error.message });
+res.status(500).json({ error: 'Internal server error' });
}
});
-app.post('/api/scheduled-commands', authenticateSSO, async (req, res) => {
+app.post('/api/scheduled-commands', authenticateSSO, requireJSON, async (req, res) => {
try {
if (!req.user.isAdmin) return res.status(403).json({ error: 'Admin access required' });
const { name, command, worker_ids, schedule_type, schedule_value } = req.body;
@@ -1270,6 +1537,14 @@ app.post('/api/scheduled-commands', authenticateSSO, async (req, res) => {
return res.status(400).json({ error: 'Missing required fields' });
}
// Validate schedule_value for numeric types
if (schedule_type === 'interval' || schedule_type === 'hourly') {
const n = parseInt(schedule_value, 10);
if (!Number.isInteger(n) || n <= 0) {
return res.status(400).json({ error: `schedule_value for type "${schedule_type}" must be a positive integer` });
}
}
const id = crypto.randomUUID();
const nextRun = calculateNextRun(schedule_type, schedule_value);
@@ -1282,7 +1557,7 @@ app.post('/api/scheduled-commands', authenticateSSO, async (req, res) => {
res.json({ success: true, id });
} catch (error) {
-res.status(500).json({ error: error.message });
+res.status(500).json({ error: 'Internal server error' });
}
});
@@ -1301,7 +1576,7 @@ app.put('/api/scheduled-commands/:id/toggle', authenticateSSO, async (req, res)
res.json({ success: true, enabled: newEnabled });
} catch (error) {
-res.status(500).json({ error: error.message });
+res.status(500).json({ error: 'Internal server error' });
}
});
@@ -1312,7 +1587,7 @@ app.delete('/api/scheduled-commands/:id', authenticateSSO, async (req, res) => {
await pool.query('DELETE FROM scheduled_commands WHERE id = ?', [id]);
res.json({ success: true });
} catch (error) {
-res.status(500).json({ error: error.message });
+res.status(500).json({ error: 'Internal server error' });
}
});
@@ -1347,12 +1622,12 @@ app.post('/api/internal/command', authenticateGandalf, async (req, res) => {
execution_id: executionId,
command: command,
worker_id: worker_id,
-timeout: 60000
+timeout: QUICK_CMD_TIMEOUT_MS
}));
res.json({ execution_id: executionId });
} catch (error) {
-res.status(500).json({ error: error.message });
+res.status(500).json({ error: 'Internal server error' });
}
});
@@ -1363,12 +1638,32 @@ app.get('/api/internal/executions/:id', authenticateGandalf, async (req, res) =>
return res.status(404).json({ error: 'Not found' });
}
const execution = rows[0];
+let logs = [];
+try { logs = JSON.parse(execution.logs || '[]'); } catch { logs = []; }
+res.json({ ...execution, logs });
} catch (error) {
+console.error('[API] GET /api/internal/executions/:id error:', error);
+res.status(500).json({ error: 'Internal server error' });
}
});
// Detailed health + stats (auth required)
app.get('/api/health', authenticateSSO, async (req, res) => {
try {
await pool.query('SELECT 1');
const [workerRows] = await pool.query('SELECT status, COUNT(*) as count FROM workers GROUP BY status');
const workerCounts = Object.fromEntries(workerRows.map(r => [r.status, Number(r.count)]));
res.json({
-...execution,
-logs: JSON.parse(execution.logs || '[]')
+status: 'ok',
+version: SERVER_VERSION,
+uptime_seconds: Math.floor(process.uptime()),
+timestamp: new Date().toISOString(),
+workers: { online: workerCounts.online || 0, offline: workerCounts.offline || 0 },
+database: 'connected'
});
} catch (error) {
-res.status(500).json({ error: error.message });
+console.error('[Health] /api/health error:', error);
+res.status(500).json({ status: 'error', timestamp: new Date().toISOString() });
}
});
@@ -1387,7 +1682,7 @@ app.get('/health', async (req, res) => {
status: 'error',
timestamp: new Date().toISOString(),
database: 'disconnected',
-error: error.message
+error: 'Internal server error'
});
}
});
@@ -1401,13 +1696,20 @@ app.get('/api/executions/:id', authenticateSSO, async (req, res) => {
}
const execution = rows[0];
-const parsedLogs = typeof execution.logs === 'string' ? JSON.parse(execution.logs || '[]') : (execution.logs || []);
+let parsedLogs = [];
+try {
+parsedLogs = typeof execution.logs === 'string' ? JSON.parse(execution.logs || '[]') : (execution.logs || []);
+} catch { parsedLogs = []; }
const waitingForInput = _executionPrompts.has(req.params.id);
let pendingPrompt = null;
if (waitingForInput) {
for (let i = parsedLogs.length - 1; i >= 0; i--) {
if (parsedLogs[i].action === 'prompt') {
-pendingPrompt = { message: parsedLogs[i].message, options: parsedLogs[i].options };
+pendingPrompt = {
+message: parsedLogs[i].message,
+options: parsedLogs[i].options,
+output: parsedLogs[i].output || null
+};
break;
}
}
@@ -1420,7 +1722,8 @@ app.get('/api/executions/:id', authenticateSSO, async (req, res) => {
prompt: pendingPrompt,
});
} catch (error) {
-res.status(500).json({ error: error.message });
+console.error('[API] GET /api/executions/:id error:', error);
+res.status(500).json({ error: 'Internal server error' });
}
});
@@ -1435,14 +1738,17 @@ app.delete('/api/workers/:id', authenticateSSO, async (req, res) => {
res.json({ success: true });
broadcast({ type: 'worker_deleted', worker_id: req.params.id });
} catch (error) {
-res.status(500).json({ error: error.message });
+res.status(500).json({ error: 'Internal server error' });
}
});
// Send direct command to specific worker
-app.post('/api/workers/:id/command', authenticateSSO, async (req, res) => {
+app.post('/api/workers/:id/command', authenticateSSO, requireJSON, async (req, res) => {
try {
const { command } = req.body;
if (!command || typeof command !== 'string' || !command.trim()) {
return res.status(400).json({ error: 'command is required' });
}
const executionId = crypto.randomUUID();
const workerId = req.params.id;
@@ -1464,7 +1770,7 @@ app.post('/api/workers/:id/command', authenticateSSO, async (req, res) => {
execution_id: executionId,
command: command,
worker_id: workerId,
-timeout: 60000
+timeout: QUICK_CMD_TIMEOUT_MS
};
const workerWs = workers.get(workerId);
@@ -1478,7 +1784,7 @@ app.post('/api/workers/:id/command', authenticateSSO, async (req, res) => {
broadcast({ type: 'execution_started', execution_id: executionId, workflow_id: null });
res.json({ success: true, execution_id: executionId });
} catch (error) {
-res.status(500).json({ error: error.message });
+res.status(500).json({ error: 'Internal server error' });
}
});

worker/worker.js Normal file

@@ -0,0 +1,257 @@
const axios = require('axios');
const WebSocket = require('ws');
const { exec } = require('child_process');
const { promisify } = require('util');
const os = require('os');
const crypto = require('crypto');
require('dotenv').config();
const execAsync = promisify(exec);
class PulseWorker {
constructor() {
this.workerId = crypto.randomUUID();
this.workerName = process.env.WORKER_NAME || os.hostname();
this.serverUrl = process.env.PULSE_SERVER || 'http://localhost:8080';
this.wsUrl = process.env.PULSE_WS || 'ws://localhost:8080';
this.apiKey = process.env.WORKER_API_KEY;
this.heartbeatInterval = parseInt(process.env.HEARTBEAT_INTERVAL || '30', 10) * 1000;
this.maxConcurrentTasks = parseInt(process.env.MAX_CONCURRENT_TASKS || '5', 10);
this.activeTasks = 0;
this.ws = null;
this.heartbeatTimer = null;
}
async start() {
console.log(`[PULSE Worker] Starting worker: ${this.workerName}`);
console.log(`[PULSE Worker] Worker ID: ${this.workerId}`);
console.log(`[PULSE Worker] Server: ${this.serverUrl}`);
// Send initial heartbeat
await this.sendHeartbeat();
// Start heartbeat timer
this.startHeartbeat();
// Connect to WebSocket for real-time commands
this.connectWebSocket();
console.log(`[PULSE Worker] Worker started successfully`);
}
startHeartbeat() {
this.heartbeatTimer = setInterval(async () => {
try {
await this.sendHeartbeat();
} catch (error) {
console.error('[PULSE Worker] Heartbeat failed:', error.message);
}
}, this.heartbeatInterval);
}
async sendHeartbeat() {
const metadata = {
hostname: os.hostname(),
platform: os.platform(),
arch: os.arch(),
cpus: os.cpus().length,
totalMem: os.totalmem(),
freeMem: os.freemem(),
uptime: os.uptime(),
loadavg: os.loadavg(),
activeTasks: this.activeTasks,
maxConcurrentTasks: this.maxConcurrentTasks
};
try {
const response = await axios.post(
`${this.serverUrl}/api/workers/heartbeat`,
{
worker_id: this.workerId,
name: this.workerName,
metadata: metadata
},
{
headers: {
'X-API-Key': this.apiKey,
'Content-Type': 'application/json'
}
}
);
console.log(`[PULSE Worker] Heartbeat sent - Status: online`);
return response.data;
} catch (error) {
console.error('[PULSE Worker] Heartbeat error:', error.message);
throw error;
}
}
connectWebSocket() {
console.log(`[PULSE Worker] Connecting to WebSocket...`);
this.ws = new WebSocket(this.wsUrl);
this.ws.on('open', () => {
console.log('[PULSE Worker] WebSocket connected');
// Identify this worker
this.ws.send(JSON.stringify({
type: 'worker_connect',
worker_id: this.workerId,
worker_name: this.workerName
}));
});
this.ws.on('message', async (data) => {
try {
const message = JSON.parse(data.toString());
await this.handleMessage(message);
} catch (error) {
console.error('[PULSE Worker] Message handling error:', error);
}
});
this.ws.on('close', () => {
console.log('[PULSE Worker] WebSocket disconnected, reconnecting...');
setTimeout(() => this.connectWebSocket(), 5000);
});
this.ws.on('error', (error) => {
console.error('[PULSE Worker] WebSocket error:', error.message);
});
}
async handleMessage(message) {
console.log(`[PULSE Worker] Received message:`, message.type);
switch (message.type) {
case 'execute_command':
await this.executeCommand(message);
break;
case 'execute_workflow':
await this.executeWorkflow(message);
break;
case 'ping':
this.sendPong();
break;
default:
console.log(`[PULSE Worker] Unknown message type: ${message.type}`);
}
}
async executeCommand(message) {
const { command, execution_id, command_id, timeout = 300000 } = message;
if (this.activeTasks >= this.maxConcurrentTasks) {
console.log(`[PULSE Worker] Max concurrent tasks reached, rejecting command`);
// Report the rejection instead of silently dropping it, so the server resolver fails fast rather than timing out
this.sendResult({ type: 'command_result', execution_id, worker_id: this.workerId, command_id, success: false, error: 'Worker at max concurrent tasks', timestamp: new Date().toISOString() });
return;
}
this.activeTasks++;
console.log(`[PULSE Worker] Executing command (active tasks: ${this.activeTasks})`);
try {
const startTime = Date.now();
const { stdout, stderr } = await execAsync(command, {
timeout: timeout,
maxBuffer: 10 * 1024 * 1024 // 10MB buffer
});
const duration = Date.now() - startTime;
const result = {
type: 'command_result',
execution_id,
worker_id: this.workerId,
command_id,
success: true,
stdout: stdout,
stderr: stderr,
duration: duration,
timestamp: new Date().toISOString()
};
this.sendResult(result);
console.log(`[PULSE Worker] Command completed in ${duration}ms`);
} catch (error) {
const result = {
type: 'command_result',
execution_id,
worker_id: this.workerId,
command_id,
success: false,
error: error.message,
stdout: error.stdout || '',
stderr: error.stderr || '',
timestamp: new Date().toISOString()
};
this.sendResult(result);
console.error(`[PULSE Worker] Command failed:`, error.message);
} finally {
this.activeTasks--;
}
}
async executeWorkflow(message) {
const { workflow, execution_id } = message;
console.log(`[PULSE Worker] Executing workflow: ${workflow.name}`);
// Workflow execution will be implemented in phase 2
// For now, just acknowledge receipt
this.sendResult({
type: 'workflow_result',
execution_id,
worker_id: this.workerId,
success: true,
message: 'Workflow execution not yet implemented',
timestamp: new Date().toISOString()
});
}
sendResult(result) {
if (this.ws && this.ws.readyState === WebSocket.OPEN) {
this.ws.send(JSON.stringify(result));
}
}
sendPong() {
if (this.ws && this.ws.readyState === WebSocket.OPEN) {
this.ws.send(JSON.stringify({ type: 'pong', worker_id: this.workerId }));
}
}
async stop() {
console.log('[PULSE Worker] Shutting down...');
if (this.heartbeatTimer) {
clearInterval(this.heartbeatTimer);
}
if (this.ws) {
this.ws.close();
}
console.log('[PULSE Worker] Shutdown complete');
}
}
// Start worker
const worker = new PulseWorker();
// Handle graceful shutdown
process.on('SIGTERM', async () => {
await worker.stop();
process.exit(0);
});
process.on('SIGINT', async () => {
await worker.stop();
process.exit(0);
});
// Start the worker
worker.start().catch((error) => {
console.error('[PULSE Worker] Fatal error:', error);
process.exit(1);
});
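For reference, the worker's configuration is read entirely from environment variables in the constructor above, so it can be driven by a `.env` file next to `worker.js` (loaded via `require('dotenv').config()`). A minimal sketch follows; the hostnames and key are illustrative placeholders, not real deployment values:

```ini
# Identity and endpoints (defaults: os.hostname(), http://localhost:8080, ws://localhost:8080)
WORKER_NAME=pulse-worker-01
PULSE_SERVER=http://pulse.example.internal:8080
PULSE_WS=ws://pulse.example.internal:8080
# Sent as the X-API-Key header on heartbeats (required; no default)
WORKER_API_KEY=replace-with-real-key
# Heartbeat period in seconds and concurrent task cap (defaults: 30 and 5)
HEARTBEAT_INTERVAL=30
MAX_CONCURRENT_TASKS=5
```

With this file in place, `node worker.js` starts the heartbeat loop and WebSocket connection using these values.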