Overview
The Link In node acts as a virtual wire receiver. It receives
messages sent by Link Out nodes without requiring a physical
wire connection on the canvas. This keeps the canvas organized by connecting distant parts of
a flow or linking across entirely separate flows. A Link In node can receive from multiple Link Out
nodes simultaneously.
Properties
| Property | Type | Default | Description |
|---|---|---|---|
| linkId | string | auto | Unique identifier for this link endpoint |
| scope | string | "flow" | Visibility scope: "flow" (same flow only) or "global" (all flows) |
| flowId | string | auto | ID of the flow this node belongs to |
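Putting the table together, a complete Link In configuration might look like the sketch below. The property names follow the table above; the `canReceiveFrom` helper is a hypothetical illustration of the scope rules, not part of any runtime API.

```javascript
// Hypothetical Link In configuration using the properties above.
const linkInConfig = {
  linkId: "metrics-in", // unique endpoint identifier (auto-generated if omitted)
  scope: "flow",        // "flow" (default) or "global"
  flowId: "flow-1"      // set automatically to the containing flow
};

// Illustrative helper: a sender may target this Link In only if the
// scope rules allow it.
function canReceiveFrom(linkIn, senderFlowId) {
  if (linkIn.scope === "global") return true; // visible to all flows
  return linkIn.flowId === senderFlowId;      // flow scope: same tab only
}

console.log(canReceiveFrom(linkInConfig, "flow-1")); // true
console.log(canReceiveFrom(linkInConfig, "flow-2")); // false
```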
Scope Modes
Flow Scope
Only visible to Link Out nodes on the same flow tab. Use for internal flow wiring.
scope: "flow"
// Only Link Out nodes on the same flow tab can send here
Global Scope
Visible to Link Out nodes across all flows. Use for shared services and utilities.
scope: "global"
// Any Link Out node in any flow can send messages to this node
Example: Reusable Error Handler
Create a centralized error handler that receives errors from multiple flows via virtual wires.
// Link In node: "Error Handler" (global scope)
{
"linkId": "error-handler-link",
"scope": "global",
"name": "Error Handler Receiver"
}
// This Link In receives errors from multiple flows:
// - Flow 1: API Gateway -> Link Out -> [Error Handler]
// - Flow 2: MQTT Handler -> Link Out -> [Error Handler]
// - Flow 3: File Watcher -> Link Out -> [Error Handler]
// Downstream processing after Link In:
// 1. Switch node routes by error severity
// 2. Critical errors -> Email notification
// 3. All errors -> Database log
// 4. Warning errors -> Dashboard update
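The routing in steps 1–4 above could be sketched as a single function placed after the Link In. The target names ("email", "db-log", "dashboard") are hypothetical labels for the downstream branches, not fixed identifiers.

```javascript
// Illustrative severity routing for the error-handler pipeline.
// Returns the downstream targets a message should reach.
function routeBySeverity(msg) {
  const targets = ["db-log"];                 // all errors are logged
  if (msg.payload.severity === "critical") {
    targets.push("email");                    // critical -> notification
  } else if (msg.payload.severity === "warning") {
    targets.push("dashboard");                // warnings -> dashboard update
  }
  return targets;
}

const msg = { payload: { error: "Connection timeout", severity: "critical" } };
console.log(routeBySeverity(msg)); // [ 'db-log', 'email' ]
```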
// Message format received:
{
"payload": {
"error": "Connection timeout",
"severity": "critical",
"source": "mqtt-broker-1",
"timestamp": "2025-01-15T10:30:00Z"
}
}
Example: Shared Data Processing Pipeline
Build a reusable data normalization pipeline that multiple input flows feed into.
// Link In node: "Data Normalizer" (global scope)
{
"linkId": "data-normalizer-input",
"scope": "global",
"name": "Normalize Pipeline Entry"
}
// Multiple data sources feed into this pipeline:
// - REST API endpoint sends raw JSON
// - CSV file reader sends parsed rows
// - MQTT subscriber sends sensor payloads
// The pipeline after this Link In:
// [Link In] -> [Validate Schema] -> [Transform] -> [Enrich]
// -> [Store in DB] -> [Link Out: "processed"]
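The Validate Schema and Transform stages of the pipeline above might look like the sketch below. The field names (`value`, `source`, `timestamp`) are assumptions chosen for illustration, not a fixed schema.

```javascript
// Illustrative normalization stage: each source delivers differently
// shaped payloads, and this stage maps them onto one common record.
function normalize(msg) {
  const p = msg.payload;
  if (p == null || p.value === undefined) {
    throw new Error("schema validation failed: missing value");
  }
  return {
    value: Number(p.value),                      // coerce to number
    source: p.source ?? "unknown",               // default when absent
    receivedAt: p.timestamp ?? new Date().toISOString()
  };
}

console.log(normalize({ payload: { value: "42", source: "mqtt" } }));
```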
// Each source uses a Link Out node pointed at this Link In,
// so the normalization logic is defined only once.
// Any changes to the pipeline apply to all data sources.
Common Use Cases
Shared Error Handling
Centralize error processing logic that multiple flows can send to.
Reusable Pipelines
Define data processing once and let many sources feed in.
Clean Canvas Layout
Eliminate long wires that cross the entire canvas by using virtual connections.
Shared Logging
Aggregate log messages from all flows into a single logging pipeline.
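A shared logging receiver behind a global Link In could reduce every flow's messages to one uniform log line, as in the sketch below. The payload fields (`level`, `source`, `text`) are assumed conventions agreed on by the senders, not a built-in format.

```javascript
// Illustrative log aggregator: every flow's Link Out sends here, and
// this stage produces one uniform log line per message.
function formatLogLine(msg) {
  const { level = "info", source = "unknown", text = "" } = msg.payload;
  return `[${level.toUpperCase()}] ${source}: ${text}`;
}

const line = formatLogLine({
  payload: { level: "warn", source: "flow-2", text: "queue nearly full" }
});
console.log(line); // [WARN] flow-2: queue nearly full
```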