Overview
The complete node triggers when another node in the flow has
finished processing a message. It listens for completion events from one or more target nodes,
allowing you to chain operations, run cleanup tasks, or confirm that an action was successful.
This is especially useful for nodes that perform asynchronous work like file writes, HTTP requests,
or database operations.
Properties
| Property | Type | Default | Description |
|---|---|---|---|
| scope | string | "target" | "all" listens to every node in the flow; "target" listens only to the nodes listed in nodeIds |
| nodeIds | array | [] | Array of node IDs to monitor for completion (used only when scope is "target") |
How It Works
Target Mode
Listen for completion events from specific nodes selected in the configuration panel.
// Triggers only when "File Write"
// or "DB Insert" nodes complete
scope: "target"
nodeIds: ["file-write-id", "db-insert-id"]
All Nodes Mode
Listen for completion events from every node in the flow.
// Triggers when ANY node in the
// current flow finishes processing
scope: "all"
nodeIds: [] // ignored
Example: Chain Operations After File Write
Trigger a notification and log entry after a file write node completes successfully.
// Complete node configuration
{
"scope": "target",
"nodeIds": ["file-write-node-id"]
}
// The complete node receives the original message
// that was processed by the file write node.
// msg object available in downstream nodes:
{
"payload": "data that was written",
"filename": "/var/log/sensor-data.csv",
"_msgid": "abc123"
}
// Connect complete node output to:
// 1. Debug node - log confirmation
// 2. Email node - send notification
// 3. Change node - update flow status
Example: Cleanup After Data Processing
Run cleanup logic once a data processing pipeline node finishes its work.
// Complete node watches the "Data Processor" node
{
"scope": "target",
"nodeIds": ["data-processor-id"]
}
// Downstream function node performs cleanup:
// Clear temporary context variables
flow.set("processingActive", false);
flow.set("tempBuffer", null);
// Update processing statistics
var stats = flow.get("stats") || {};
stats.lastCompleted = Date.now();
stats.totalProcessed = (stats.totalProcessed || 0) + 1;
flow.set("stats", stats);
msg.payload = {
"status": "cleanup_complete",
"processedAt": new Date().toISOString(),
"totalRuns": stats.totalProcessed
};
return msg;
Common Use Cases
Sequential Pipelines
Ensure step A finishes before starting step B, even when step A is asynchronous.
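One way to sketch this gating pattern: a function node wired after the complete node sets a flow flag once step A is done, and step B's entry function drops messages until that flag exists. This is a minimal illustration, not the only wiring; the in-memory `flow` stub and the `stepADone` key are stand-ins for Node-RED's flow context, which is provided automatically inside a real function node.

```javascript
// Stand-in for Node-RED's flow context (hypothetical stub for this demo)
const store = new Map();
const flow = { get: (k) => store.get(k), set: (k, v) => store.set(k, v) };

// Wired after the complete node that watches step A:
function markStepADone(msg) {
  flow.set("stepADone", true);
  return msg;
}

// At the start of step B; returning null drops the message in Node-RED:
function guardStepB(msg) {
  return flow.get("stepADone") ? msg : null;
}

console.log(guardStepB({ payload: 1 })); // null - step A has not completed yet
markStepADone({ payload: 0 });
console.log(guardStepB({ payload: 1 })); // passes the message through
```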
Confirmation Logging
Log or notify that a critical operation (file write, API call, DB insert) completed.
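A downstream function node can shape the completed message into a structured log entry before it reaches a debug, file, or email node. The sketch below assumes the msg properties from the file-write example above (payload, filename, _msgid); the formatConfirmation helper and its field names are illustrative, not part of the complete node itself.

```javascript
// Build a confirmation log entry from the message the complete node passed on.
// The timestamp is taken as a parameter so the function stays deterministic.
function formatConfirmation(msg, timestamp) {
  return {
    payload: {
      event: "node_complete",
      filename: msg.filename || null, // present only if the completed node set it
      msgid: msg._msgid,
      at: timestamp,
    },
  };
}

const out = formatConfirmation(
  { payload: "data", filename: "/var/log/sensor-data.csv", _msgid: "abc123" },
  "2024-01-01T00:00:00Z"
);
console.log(JSON.stringify(out.payload));
```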
Resource Cleanup
Release locks, clear temporary variables, or close connections after processing.
Flow Metrics
Track processing counts, durations, and throughput by monitoring node completions.
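A function node after the complete node can accumulate counts and durations in flow context. This is a minimal sketch: the `flow` stub replaces Node-RED's flow context for the demo, and `msg._startTime` is a hypothetical timestamp assumed to have been stamped on the message earlier in the flow (e.g. by a change node).

```javascript
// Stand-in for Node-RED's flow context (hypothetical stub for this demo)
const store = new Map();
const flow = { get: (k) => store.get(k), set: (k, v) => store.set(k, v) };

// Runs each time the complete node fires; `now` is passed in for determinism.
function onComplete(msg, now) {
  const stats = flow.get("stats") || { total: 0, totalMs: 0 };
  // msg._startTime is an assumed timestamp set earlier in the flow
  const duration = msg._startTime ? now - msg._startTime : 0;
  stats.total += 1;
  stats.totalMs += duration;
  stats.avgMs = stats.totalMs / stats.total;
  flow.set("stats", stats);
  msg.payload = stats;
  return msg;
}

// Two completions taking 120 ms and 80 ms respectively:
onComplete({ _startTime: 1000 }, 1120);
const out = onComplete({ _startTime: 2000 }, 2080);
console.log(out.payload.total, out.payload.avgMs); // 2 100
```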