30 Commits

Author SHA1 Message Date
znetsixe
4a9521154b fix: safety overfill keeps pumps running + minHeightBasedOn=inlet
Some checks failed
CI / lint-and-test (push) Has been cancelled
pumpingStation 5e2ebe4: overfill safety no longer shuts down machine
groups or blocks level control. Pumps keep running during overfill
(sewer can't stop receiving). Only upstream equipment is shut down.

Demo config: minHeightBasedOn=inlet (not outlet). The minimum height
reference for the basin is the inlet pipe elevation — sewage flows
in by gravity and the basin level can't go below the inlet without
the sewer backing up.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-14 14:11:08 +02:00
znetsixe
732b5a3380 fix: realistic sinus + continuous pump control + dead zone elimination
Sinus inflow: 54-270 m³/h (base 0.015 + amplitude 0.06 m³/s), 4 min
period. Peak needs 1-2 pumps, never all 3 = realistic headroom.
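The stated 54-270 m³/h range implies a raised sine (the sine term scaled into [0, 1], since base minus amplitude would otherwise go negative). A minimal sketch under that assumption, with illustrative names (the real generator lives in build_flow.py):

```javascript
// Sinus inflow sketch: base 0.015 m³/s, amplitude 0.06 m³/s, 4 min period.
// The raised-sine shaping is inferred from the stated 54-270 m³/h range.
const BASE = 0.015;    // m³/s
const AMP = 0.06;      // m³/s
const PERIOD_S = 240;  // 4 minutes

function inflowM3h(tSeconds) {
  const s = 0.5 * (1 + Math.sin((2 * Math.PI * tSeconds) / PERIOD_S)); // in [0, 1]
  return (BASE + AMP * s) * 3600; // m³/s → m³/h: 54 at trough, 270 at peak
}
```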

PS control: continuous proportional demand when level > stopLevel, not
just when > startLevel && filling. Pumps now ramp down smoothly as
basin drains toward stopLevel instead of staying stuck at last setpoint.

pumpingStation e8dd657: dead zone elimination
build_flow.py: sinus tuned for gradual pump scaling visibility

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-14 13:42:55 +02:00
znetsixe
c8f149e204 feat(dashboard): split basin charts by unit + add y-axis labels to all charts
Flow: m³/h, Power: kW, Basin Level: m, Basin Fill: % (0-100 fixed).
Level and fill in separate chart groups with their own gauges.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-14 13:19:32 +02:00
znetsixe
b693e0b90c fix: graduated pump control + mass balance corrections
Three fixes:

1. PS outflow triple-counted (pumpingStation c62d8bc): MGC registered
   twice + individual pumps registered alongside MGC + dual event
   subscription per child. Now: one registration per aggregation level,
   one event per child. Volume integration tracks correctly.

2. All 3 pumps always on: minFlowLevel was 1.0 m but startLevel was
   2.0 m, so at the moment pumps started the percControl was already
   40% → MGC mapped to 356 m³/h → all 3 pumps. Fixed: minFlowLevel
   = startLevel (2.0 m) so percControl starts at 0% and ramps
   linearly. Now pumps graduate: 1-2 pumps at low level, 3 at high.

3. Generalizable registration rule added as code comments: when a group
   aggregator exists (MGC), subscribe to it, not its children. Pick
   one event name per measurement type per child.

E2E verified: 2/3 pumps active at 56% fill, volume draining correctly,
pump C at 5.2% ctrl delivering 99 m³/h while pump A stays off.
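Fix 2's linear ramp can be sketched as follows. Only minFlowLevel = startLevel (2.0 m) is stated above, so the upper reference (maxFlowLevel = 3.5 m) is a hypothetical placeholder:

```javascript
// Graduated level → demand mapping: 0 % exactly at startLevel, then a
// linear ramp. maxFlowLevel = 3.5 m is an illustrative assumption.
function percControl(level, minFlowLevel = 2.0, maxFlowLevel = 3.5) {
  const frac = (level - minFlowLevel) / (maxFlowLevel - minFlowLevel);
  return Math.min(1, Math.max(0, frac)) * 100;
}
```

With the old minFlowLevel of 1.0 m the same formula already returns 40 % at startLevel, which is exactly the jump that switched all three pumps on at once.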

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-14 13:10:32 +02:00
znetsixe
2b0c4e89b1 fix: abort recovery bounce loop broke MGC → pump control
generalFunctions 086e5fe -> 693517c:
  abortCurrentMovement now takes options.returnToOperational (default
  false). Routine MGC demand-update aborts leave pumps in their current
  state. Only shutdown/emergency-stop paths pass returnToOperational:true.

rotatingMachine 510a423 -> 11d196f:
  executeSequence passes returnToOperational:true for shutdown/estop.

Verified E2E: PS fills to startLevel → MGC distributes demand → all 3
pumps at 1.31% ctrl delivering 121 m³/h each → basin draining at
-234 m³/h net. Full fill/drain cycle operational.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-14 12:11:36 +02:00
znetsixe
faaeb2efd3 fix: realistic basin sizing + boosted sinus inflow for visible fill cycle
Basin was 30 m³ with 72 m³/h average sinus inflow → took 10+ minutes
to reach startLevel, looking static on the dashboard. Boosted sinus to
base=0.02 + amplitude=0.10 m³/s (avg ~252 m³/h, peak ~432 m³/h). Basin
fills from outlet to startLevel in ~3 minutes now.

Also removed initBasinProperties trace from previous debug session.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-14 11:17:17 +02:00
znetsixe
53b55d81c3 fix: fully configure PS basin + add node-completeness rule
Basin undersized (10 m³) for sinus peak (126 m³/h) → overflow → 122%.
Now 30 m³ with 4m height, all PS fields set. New rule: always configure
every field of every node.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-14 11:00:27 +02:00
znetsixe
eb97670179 fix(dashboard): use correct basin capacity for fill % + clamp to 0-100
maxVol was hardcoded to 9.33 (overflow volume at 2.8 m height) instead
of 10.0 (basin capacity = basinVolume config). Volumes above 9.33 m³
produced fill > 100% (e.g. 122% at vol=11.4). Fixed to use 10.0 and
clamp to [0, 100].
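In code terms, the corrected fill computation is roughly (function name illustrative):

```javascript
// Fill % against the real basin capacity (basinVolume = 10 m³), clamped.
// The old code divided by 9.33 (the overflow volume), so vol = 11.4 m³
// rendered as ~122 %.
const BASIN_CAPACITY_M3 = 10.0;

function fillPct(volM3) {
  return Math.min(100, Math.max(0, (volM3 / BASIN_CAPACITY_M3) * 100));
}
```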

Patched via nodes-only deploy — basin not reset.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-14 10:53:41 +02:00
znetsixe
cc4ee670ea fix(dashboard): move basin gauges to trend pages next to basin chart
The tank gauge (basin level) and 270° arc gauge (fill %) now live on
the trend pages alongside the basin metrics chart — not on the control
page. Each trend page (10 min / 1 hour) gets its own pair of gauges.

Layout per trend page Basin group:
  - Chart (width 8): Basin fill % + Level + Net flow series
  - Tank gauge (width 2): 0–3 m with color zones at stop/start levels
  - Arc gauge (width 2): 0–100% fill with red/orange/green zones

Deployed via partial (nodes-only) deploy so the basin wasn't reset.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-14 10:46:38 +02:00
znetsixe
a51bc46e26 feat(dashboard): add tank gauge for basin level + 270° arc for fill %
Basin Status group on the Control page now has two visual gauges:

1. gauge-tank (vertical tank with fill gradient) for basin level 0–3 m.
   Color zones: red < 0.6 m (below stopLevel) → orange → blue 1.2–2.5 m
   (normal operating range) → orange → red > 2.8 m (overflow zone).

2. gauge-34 (270° arc) for fill percentage 0–100%.
   Color zones: red < 10% → orange → green 30–80% → orange → red > 95%.

Both gauges are fed from the PS dispatcher's numeric outputs (fillPctNum
and levelNum) which also feed the basin trend charts — same data, two
visual forms.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-14 10:40:43 +02:00
znetsixe
b18c47c07e feat(dashboard): add basin fill gauge, countdown, and basin trend charts
PS control page now shows 7 fields instead of 5:
  - Direction (filling/draining/steady)
  - Basin level (m)
  - Basin volume (m³)
  - Fill level (%)
  - Net flow (m³/h, signed)
  - Time to full/empty (countdown in min or s)
  - Inflow (m³/h)

Two new trend pages per time window (short 10 min / long 1 hour):
  - Basin chart: 3 series (Basin fill %, Basin level m, Net flow m³/h)
    on both Trends 10 min and Trends 1 hour pages.

PS formatter now extracts direction, netFlow, seconds from the delta-
compressed port 0 cache and computes fillPct from vol/maxVol. Dispatcher
sends 10 outputs (7 text + 3 trend numerics to both short+long basin
charts).
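A hypothetical sketch of the derivation the formatter performs (field names are illustrative, not the actual cache schema):

```javascript
// Derive direction, fill % and a time-to-full/empty countdown from basin
// volume and signed net flow. A positive netFlow fills the basin.
function basinStatus(volM3, maxVolM3, netFlowM3h) {
  const fillPct = Math.min(100, Math.max(0, (volM3 / maxVolM3) * 100));
  const direction = netFlowM3h > 0 ? 'filling'
                  : netFlowM3h < 0 ? 'draining' : 'steady';
  let seconds = null; // countdown only meaningful when flow is non-zero
  if (netFlowM3h > 0) seconds = ((maxVolM3 - volM3) / netFlowM3h) * 3600;
  if (netFlowM3h < 0) seconds = (volM3 / -netFlowM3h) * 3600;
  return { fillPct, direction, seconds };
}
```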

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-14 10:35:44 +02:00
znetsixe
60c8d0ff66 fix: root-cause bogus machineCurve default poisoning spline predictions
generalFunctions 29b78a3 -> 086e5fe:
  Schema default machineCurve.nq had a dummy pressure slice at key "1"
  with fake data. Deep merge injected it alongside real curve data,
  pulling the pressure-dimension spline negative at low pressures.
  Fix: default to empty {nq: {}, np: {}}.

rotatingMachine 26e253d -> 510a423:
  Tests updated for corrected fValues.min (70000 vs old 1).
  Trace instrumentation removed. 91/91 green.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-14 10:28:24 +02:00
znetsixe
658915c53e chore: bump rotatingMachine — clamp negative flow/power at ctrl≤0
rotatingMachine: c464b66 -> 26e253d

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-14 10:07:10 +02:00
znetsixe
0cbd6a4077 wip: sinus-driven pumping station demo + PS levelbased control to MGC
Architecture change: demo is now driven by a sinusoidal inflow into the
pumping station basin, rather than a random demand generator. The basin
fills from the sinus, and PS's levelbased control should start/stop
pumps via MGC when level crosses start/stop thresholds.

Changes:
- Demo Drivers tab: sinus generator (period 120s, base 0.005 + amp 0.03
  m³/s) replaces the random demand. Sends q_in to PS via link channel.
- PS config: levelbased mode, 10 m³ basin, startLevel 1.2 m / stopLevel
  0.6 m. Volume-based safeties on, time-based off.
- MGC scaling = normalized (was absolute) so PS's percent-based level
  control maps correctly.
- Dashboard mode toggle now drives PS mode (levelbased ↔ manual) instead
  of per-pump setMode. Slider sends Qd to PS (only effective in manual).
- PS code (committed separately): _controlLevelBased now calls
  _applyMachineGroupLevelControl + new Qd topic + forwardDemandToChildren.

KNOWN ISSUE: Basin fills correctly (visible on dashboard), but pumps
don't start when level exceeds startLevel. Likely cause: _pickVariant
for 'level' in _controlLevelBased may not be resolving the predicted
level correctly, or the safetyController is interfering despite
time-threshold being 0. Needs source-level tracing of the PS tick →
_safetyController → _controlLogic → _controlLevelBased path with
logging enabled. To be debugged in the next session.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-14 08:42:22 +02:00
znetsixe
bc8138c3dc fix(charts): add all required FlowFuse ui-chart properties + document in rule set
Charts rendered blank because the helper was missing 15+ required
FlowFuse properties. The critical three:
  - interpolation: "linear" (no line drawn without it)
  - yAxisProperty: "payload" + yAxisPropertyType: "msg" (chart didn't
    know which msg field to plot)
  - xAxisPropertyType: "timestamp" (chart didn't know the x source)

Also: width/height must be numbers not strings, colors/textColor/
gridColor arrays must be present, and stackSeries/bins/xAxisFormat/
xAxisFormatType all need explicit values.
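Assuming the property names reported above, a hypothetical minimal ui-chart node fragment might look like this (values are illustrative, not verified FlowFuse defaults):

```json
{
  "type": "ui-chart",
  "chartType": "line",
  "interpolation": "linear",
  "xAxisType": "time",
  "xAxisFormat": "",
  "xAxisFormatType": "auto",
  "xAxisPropertyType": "timestamp",
  "yAxisProperty": "payload",
  "yAxisPropertyType": "msg",
  "stackSeries": false,
  "bins": 10,
  "width": 8,
  "height": 6,
  "colors": ["#0095ff", "#ff0000", "#ff7f0e"],
  "textColor": ["#666666"],
  "gridColor": ["#e5e5e5"]
}
```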

Fixed the ui_chart helper to include every property from the working
rotatingMachine/examples/03-Dashboard.json charts. Added the full
required-property template + gotcha list to the flow-layout rule set
(Section 4) so this class of bug is caught by reference on the next
chart build.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-14 08:04:43 +02:00
znetsixe
06d81169e8 fix(trends): add msg.timestamp to chart data points
FlowFuse ui-chart with xAxisType=time may need an explicit timestamp
on each msg for the time axis to render. Added Date.now() as
msg.timestamp on the per-pump dispatcher flow/power outputs.
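A sketch of the change, assuming the dispatcher is a plain function node (names illustrative):

```javascript
// Stamp chart-bound msgs so the ui-chart time axis has an explicit source.
function stampForChart(msg) {
  msg.timestamp = Date.now();
  return msg;
}
```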

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-14 07:59:04 +02:00
znetsixe
82db2953e9 fix(dashboard): resolve [object Object] in ui-text widgets + use dispatcher pattern
FlowFuse ui-text only supports {{msg.payload}} — not nested paths
like {{msg.payload.state}}. Every ui-text was showing [object Object]
because the formatter sent a fat object as msg.payload and the format
template tried to access sub-fields.

Fix: per-pump (and per-MGC, per-PS) "dispatcher" function on the
Dashboard UI tab. The dispatcher receives the fat object via one
link-in, then returns 7-9 plain-string outputs — one per ui-text
widget — each with msg.payload set to the formatted string value.
Outputs 8+9 carry numeric values (flowNum/powerNum) tagged with
msg.topic for the trend charts, wired directly to both short-term
and long-term chart nodes.

Pattern documented as the recommended approach in the rule set:
"FlowFuse ui-text receives plain strings only — use a dispatcher
function to split a fat object into per-widget outputs."
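A minimal sketch of the dispatcher pattern, under illustrative field and topic names (the real dispatchers have 7-9 text outputs plus the numeric ones):

```javascript
// One fat status object in, one plain value per widget out. ui-text
// outputs carry strings only; chart outputs carry numbers + topic.
function dispatch(status) {
  const text = (v) => ({ payload: String(v) });
  const num = (v, topic) => ({ payload: v, topic, timestamp: Date.now() });
  return [
    text(status.state),                      // → State ui-text
    text(status.mode),                       // → Mode ui-text
    text(status.flow.toFixed(1) + ' m³/h'),  // → Flow ui-text
    num(status.flow, 'Pump A'),              // → flow charts (short + long)
    num(status.power, 'Pump A'),             // → power charts
  ];
}
```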

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-14 07:54:02 +02:00
znetsixe
d439b048f2 docs: add CLAUDE.md to all 11 node submodules — S88 classification + rule reference
Each node repo now has a CLAUDE.md that declares its S88 hierarchy
level (Control Module / Equipment Module / Unit / Process Cell), the
associated S88 colour, and the placement lane per the superproject's
flow-layout rule set (.claude/rules/node-red-flow-layout.md).

The rule set lives in the superproject only (single source of truth).
Per-node repos reference it. When Claude Code opens a node repo, it
reads the local CLAUDE.md and knows which lane / colour / group to
use when building a multi-node demo or production flow.

Submodule pointer bumps for all 11 nodes.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-14 07:48:37 +02:00
znetsixe
e280d87e6a fix(dashboard): split trends into 3 pages + fix chart dimensions
Dashboard was a single page — 30+ widgets + tiny charts competing for
space. Trends were invisible or very small (width/height both "0"
meant "inherit from group" which gave near-zero chart area).

Split into 3 dashboard pages:
  1. Control — Process Demand, Station Controls, MGC/Basin status,
     per-pump panels (unchanged, just moved off trend groups)
  2. Trends — 10 min — rolling 10-minute flow + power charts with
     width=12 (full group), height=8 (tall charts), 300 max points
  3. Trends — 1 hour — same layout with 60-minute window, 1800 points

All 3 pages auto-nav via the FlowFuse sidebar. Same data feed: the
per-pump trend_split function now wires to 4 charts (2 outputs × 2
pages) instead of 2.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-14 07:46:51 +02:00
znetsixe
64944aa9d8 docs(rules): add S88-hierarchical placement rules for Node-RED flows
Sections 10-16 extend the existing flow-layout rule with a deterministic
lane-and-group convention anchored in the S88 hierarchy:

- 8 logical lanes: L0 inputs -> L1 adapters -> L2 CM -> L3 EM -> L4 UN
  -> L5 PC -> L6 formatters -> L7 outputs. 240 px between lanes.
- Lane assignment is by S88 level, not by node name. New nodes inherit
  a lane via a NODE_LEVEL registry, no rule change needed.
- Every parent + its direct children is wrapped in a Node-RED group box
  coloured by the parent's S88 level (Pump A = EM blue, MGC = Unit blue,
  PS = Process Cell blue, ...). Search the parent's name -> group
  highlights.
- Utility clusters (mode broadcast, station-wide commands, demand
  fan-out) use neutral-grey group boxes.
- Dashboard / setup / demo-driver tabs each get a variant of the rule.
- Spacing constants, place() and wrap_in_group() helpers, an 8-step
  verification checklist.
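The registry idea can be sketched like this; the entries, the 120 px x-origin, and the function name are illustrative, only the 240 px lane pitch and the L0-L7 ordering come from the rule text:

```javascript
// Lane assignment by S88 level, not by node name. A new node only needs a
// NODE_LEVEL entry; the lane math never changes.
const NODE_LEVEL = {                  // illustrative subset
  measurement: 'CM',
  rotatingmachine: 'EM',
  machinegroupcontrol: 'UN',
  pumpingstation: 'PC',
};
const LANE = { inputs: 0, adapters: 1, CM: 2, EM: 3, UN: 4, PC: 5, formatters: 6, outputs: 7 };
const LANE_PITCH = 240;               // px between lanes (stated)
const LANE_ORIGIN = 120;              // px, assumed

function laneX(softwareType) {
  return LANE_ORIGIN + LANE_PITCH * LANE[NODE_LEVEL[softwareType]];
}
```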

Off-spec colours (settler orange, monster teal, diffuser and
dashboardAPI missing) are flagged in Section 16 as a follow-up cleanup.
The NODE_LEVEL registry already maps those nodes to their semantic S88
level regardless of what the node's own colour currently says.

Rule lives in the superproject only; per-node repos will reference it
from their own CLAUDE.md files (separate commits per submodule).

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-13 17:31:57 +02:00
znetsixe
0d7af6bfff refactor(examples): split pumpingstation demo across 4 concern-based tabs + add layout rule set
The demo was a single 96-node tab with everything wired directly. Now
4 tabs wired only through named link-out / link-in pairs, and a
permanent rule set for future Claude sessions to follow.

Tabs (by concern, not by data flow):

  🏭 Process Plant   only EVOLV nodes (3 pumps + MGC + PS + 6 measurements)
                     + per-node output formatters
  📊 Dashboard UI   only ui-* widgets, button/setpoint wrappers, trend
                     splitters
  🎛️ Demo Drivers   random demand generator + state holder. Removable
                     in production
  ⚙️ Setup & Init   one-shot deploy-time injects (mode, scaling,
                     auto-startup, random-on)

Cross-tab wiring uses a fixed named-channel contract (cmd:demand,
cmd:mode, cmd:setpoint-A, evt:pump-A, etc.) — multiple emitters can
target a single link-in for fan-in, e.g. both the slider and the random
generator feed cmd:demand.

Bug fixes folded in:

1. Trend chart was empty / scrambled. Root cause: the trend-feeder
   function had ONE output that wired to BOTH flow and power charts,
   so each chart received both flow and power msgs and the legend got
   garbled. Now: 2 outputs (flow → flow chart, power → power chart),
   one msg per output.

2. Every ui-text and ui-chart landed at the (0, 0) corner of the editor
   canvas. Root cause: the helper functions accepted x/y parameters
   but never assigned them on the returned node dict — Node-RED
   defaulted every widget to (0, 0) and they piled on top of each
   other. The dashboard render was unaffected (it lays out by group/
   order), but the editor was unreadable. Fixed both helpers and added
   a verification step ("no node should be at (0, 0)") to the rule set.

Spacing convention (now codified):
- 6 lanes per tab at x = [120, 380, 640, 900, 1160, 1420]
- 80 px standard row pitch, 30-40 px for tight ui-text stacks
- 200 px gap between sections, with a comment header per section
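The spacing constants translate directly into a place() helper. build_flow.py is the actual home of this logic, so treat this JavaScript version as an illustrative sketch (the 120 px top margin is an assumption):

```javascript
// Place a node at (lane, row) per the codified spacing convention.
const LANES_X = [120, 380, 640, 900, 1160, 1420]; // 6 lanes, 260 px apart
const ROW_PITCH = 80;                              // standard row pitch
const TOP_MARGIN = 120;                            // assumed

function place(node, lane, row) {
  node.x = LANES_X[lane];
  node.y = TOP_MARGIN + row * ROW_PITCH;
  return node;
}
```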

New rule set: .claude/rules/node-red-flow-layout.md
- Tab boundaries by concern
- Link-channel naming convention (cmd:/evt:/setup: prefixes)
- Spacing constants
- Trend-split chart pattern
- Inject node payload typing pitfall (per-prop v/vt)
- Dashboard widget rules (every ui-* needs x/y!)
- Do/don't checklist
- Link-out/link-in JSON cheat sheet
- 5-step layout verification before declaring a flow done

CLAUDE.md updated to point at the new rule set.

Verified end-to-end on Dockerized Node-RED 2026-04-13: 168 nodes across
4 tabs, all wired via 22 link-out / 19 link-in pairs, no nodes at
(0, 0), pumps reach operational ~5 s after deploy, MGC distributes
random demand, trends populate per pump.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-13 16:13:27 +02:00
znetsixe
7aacee6482 feat(examples): pumpingstation-3pumps-dashboard end-to-end demo + bump generalFunctions
New top-level examples/ folder for end-to-end demos that show how multiple
EVOLV nodes work together (complementing the per-node example flows under
nodes/<name>/examples/). Future end-to-end demos will live as siblings.

First demo: pumpingstation-3pumps-dashboard
- 1 pumpingStation (basin model, manual mode for the demo so it observes
  rather than auto-shutting pumps; safety guards disabled — see README)
- 1 machineGroupControl (optimalcontrol mode, absolute scaling)
- 3 rotatingMachine pumps (hidrostal-H05K-S03R curve)
- 6 measurement nodes (per pump: upstream + downstream pressure mbar,
  simulator mode for continuous activity)
- Process demand input via dashboard slider (0-300 m³/h) AND auto random
  generator (3s tick, [40, 240] m³/h) — both feed PS q_in + MGC Qd
- Auto/Manual mode toggle (broadcasts setMode to all 3 pumps)
- Station-wide Start / Stop / Emergency-Stop buttons
- Per-pump setpoint slider, individual buttons, full status text
- Two trend charts (flow per pump, power per pump)
- FlowFuse dashboard at /dashboard/pumping-station-demo

build_flow.py is the source of truth — it generates flow.json
deterministically and is the right place to extend the demo.

Bumps:
  nodes/generalFunctions  43f6906 -> 29b78a3
    Fix: childRegistrationUtils now aliases the production
    softwareType values (rotatingmachine, machinegroupcontrol) to the
    dispatch keys parent nodes check for (machine, machinegroup). Without
    this, MGC <-> rotatingMachine and pumpingStation <-> MGC wiring
    silently never matched in production even though tests passed.
    Demo confirms: MGC reports '3 machine(s) connected'.

Verified end-to-end on Dockerized Node-RED 2026-04-13: pumps reach
operational ~5s after deploy, MGC distributes random demand across them,
basin tracks net flow direction, all dashboard widgets update each second.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-13 15:53:47 +02:00
znetsixe
d7d106773e chore: bump generalFunctions submodule — fix asset menu supplier->type->model cascade
generalFunctions: e50be2e -> 43f6906

Fixes the bug where picking a supplier and then a type left the model
dropdown stuck on "Awaiting Type Selection". Affects every node that
uses the shared assetMenu (measurement, rotatingMachine, pumpingStation,
monster, …). The chained dropdowns now use an explicit downward
cascade with no synthetic change-event dispatch, so the parent handler
can no longer wipe a child after the child was populated.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-13 14:51:02 +02:00
znetsixe
89f3b5ddc4 chore: bump measurement submodule — fix asset menu render (TDZ ReferenceError)
measurement: d6f8af4 -> <new>

Fixes a regression in the previous measurement editor commit where a
const Temporal Dead Zone error in oneditprepare aborted the function
before the asset / logger / position menu init ran. Menus are now
kicked off first, mode logic is guarded with try/catch and null-checks.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-13 14:15:11 +02:00
znetsixe
d0fe4d0583 chore: bump measurement submodule — editor UX fix (mode as top-level switch)
measurement: 495b4cf -> d6f8af4

Makes Input Mode the top-level hierarchy in the editor: analog-only and
digital-only field blocks toggle visibility live based on the dropdown,
legacy nodes default to 'analog', channels JSON gets live validation,
and runtime logs an actionable warning when the payload shape doesn't
match the selected mode.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-13 14:00:44 +02:00
znetsixe
0300a76ae8 docs: measurement trial-ready — digital mode + dispatcher fix + 71 tests
Bumps:
- nodes/generalFunctions  75d16c6 -> e50be2e  (permissive unit check + measurement schema additions)
- nodes/measurement       f7c3dc2 -> 495b4cf  (digital mode + dispatcher fix + 59 new tests + rewritten README + UI)

Wiki:
- wiki/manuals/nodes/measurement.md — new user manual covering analog and
  digital modes, topic reference, smoothing/outlier methods, unit policy,
  and the pre-fix dispatcher bug advisory.
- wiki/sessions/2026-04-13-measurement-digital-mode.md — session note with
  findings, fix scope, test additions, and dual-mode E2E results.
- wiki/index.md — links both pages and adds the missing 2026-04-13
  rotatingMachine session entry that was omitted from the earlier commit.

Status: measurement is now trial-ready in both analog and digital modes.
71/71 unit tests green (was 12), dual-mode E2E on live Dockerized
Node-RED verifies analog regression and a three-channel MQTT-style
payload (temperature/humidity/pressure) dispatching independently with
per-channel smoothing.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-13 13:46:00 +02:00
znetsixe
a1aa44f6ca docs: rotatingMachine trial-ready — submodule bumps, wiki manual, session note
Bumps:
- nodes/generalFunctions  024db55 -> 75d16c6  (FSM abort recovery + schema sync)
- nodes/rotatingMachine   07af7ce -> 17b8887  (interruptible sequences, dual-curve tests, rewritten README)

Wiki:
- wiki/manuals/nodes/rotatingMachine.md — new user manual covering inputs,
  outputs, state machine, supported curves, and troubleshooting.
- wiki/sessions/2026-04-13-rotatingMachine-trial-ready.md — session note
  with findings, fixes, test additions, and dual-curve E2E results.
- wiki/index.md — link both and bump updated date.

Status: rotatingMachine is now trial-ready. 91/91 unit tests green, live
Docker E2E verifies shutdown/emergency-stop during ramps and prediction
behaviour across both shipped pump curves (hidrostal-H05K-S03R,
hidrostal-C5-D03R-SHN1).

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-13 13:22:10 +02:00
znetsixe
6cf1821161 chore: remove redundant Makefile and .npmignore, fix .dockerignore
- Makefile: all useful targets duplicate package.json scripts, and
  referenced deleted e2e files. Use npm run instead.
- .npmignore: contained only node_modules/ which npm ignores by default.
- .dockerignore: remove stale paths (manuals/, third_party/, AGENTS.md,
  FUNCTIONAL_ISSUES_BACKLOG.md), add wiki/.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-07 20:53:26 +02:00
znetsixe
48f790d123 chore: clean up superproject structure
Move content to correct locations:
- AGENTS.md → .agents/AGENTS.md (with orchestrator reference update)
- third_party/docs/ (8 reference docs) → wiki/concepts/
- manuals/ (12 Node-RED docs) → wiki/manuals/

Delete 23 unreferenced one-off scripts from scripts/ (keeping 5 active).
Delete stale Dockerfile.e2e, docker-compose.e2e.yml, test/e2e/.
Remove empty third_party/ directory.

Root is now: README, CLAUDE.md, LICENSE, package.json, Makefile,
Dockerfile, docker-compose.yml, docker/, scripts/ (5), nodes/, wiki/,
plus dotfiles (.agents, .claude, .gitea).

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-07 18:01:04 +02:00
znetsixe
bac6c620b1 docs: rewrite README with actual project content
Replace generic Dutch template (with placeholder text) with a proper
README showing: node inventory table, architecture summary, install
instructions, test commands, documentation links, and license.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
2026-04-07 17:25:03 +02:00
81 changed files with 7012 additions and 5890 deletions

View File

@@ -2,7 +2,7 @@
## Context
- Task/request: Adapt EVOLV agents/skills using Harness Engineering patterns and set owner-controlled operating defaults.
-- Impacted files/contracts: `AGENTS.md`, `.agents/skills/*/SKILL.md`, `.agents/skills/*/agents/openai.yaml`, decision-log policy.
+- Impacted files/contracts: `.agents/AGENTS.md`, `.agents/skills/*/SKILL.md`, `.agents/skills/*/agents/openai.yaml`, decision-log policy.
- Why a decision is required now: New harness workflow needs explicit defaults for compatibility, safety bias, and governance discipline.
## Options
@@ -30,9 +30,9 @@
- Data/operations impact: Decision traceability improves cross-turn consistency and auditability.
## Implementation Notes
-- Required code/doc updates: Set defaults in `AGENTS.md` and orchestrator skill instructions; keep decision-log template active.
+- Required code/doc updates: Set defaults in `.agents/AGENTS.md` and orchestrator skill instructions; keep decision-log template active.
- Validation evidence required: Presence of defaults in policy docs and this decision artifact under `.agents/decisions/`.
## Rollback / Migration
-- Rollback strategy: Update defaults in `AGENTS.md` and orchestrator SKILL; create a superseding decision log entry.
+- Rollback strategy: Update defaults in `.agents/AGENTS.md` and orchestrator SKILL; create a superseding decision log entry.
- Migration/deprecation plan: For any future hard-break preference, require explicit migration plan and effective date in a new decision entry.

View File

@@ -42,7 +42,7 @@ You are the EVOLV orchestrator agent. You decompose complex tasks, route to spec
## Reference Files
- `.agents/skills/evolv-orchestrator/SKILL.md` — Full orchestration protocol
-- `AGENTS.md` — Agent invocation policy, routing table, decision governance
+- `.agents/AGENTS.md` — Agent invocation policy, routing table, decision governance
- `.agents/decisions/` — Decision log directory
- `.agents/improvements/IMPROVEMENTS_BACKLOG.md` — Deferred improvements
@@ -52,4 +52,4 @@ You are the EVOLV orchestrator agent. You decompose complex tasks, route to spec
- Owner-approved defaults: compatibility=controlled, safety=availability-first
## Reasoning Difficulty: Medium-High
-This agent handles multi-domain task decomposition, cross-cutting impact analysis, and decision governance enforcement. The primary challenge is correctly mapping changes across node boundaries — a single modification can cascade through parent-child relationships, shared contracts, and InfluxDB semantics. When uncertain about cross-domain impact, consult `.agents/skills/evolv-orchestrator/SKILL.md` and `AGENTS.md` before routing to specialist agents.
+This agent handles multi-domain task decomposition, cross-cutting impact analysis, and decision governance enforcement. The primary challenge is correctly mapping changes across node boundaries — a single modification can cascade through parent-child relationships, shared contracts, and InfluxDB semantics. When uncertain about cross-domain impact, consult `.agents/skills/evolv-orchestrator/SKILL.md` and `.agents/AGENTS.md` before routing to specialist agents.

View File

@@ -0,0 +1,501 @@
# Node-RED Flow Layout Rules
How to lay out a multi-tab Node-RED demo or production flow so it is readable, debuggable, and trivially extendable. These rules apply to anything you build with `examples/` flows, dashboards, or production deployments.
## 1. Tab boundaries — by CONCERN, not by data
Every node lives on the tab matching its **concern**, never where it happens to be wired:
| Tab | Lives here | Never here |
|---|---|---|
| **🏭 Process Plant** | EVOLV nodes (rotatingMachine, MGC, pumpingStation, measurement, reactor, settler, …) + small per-node output formatters | UI widgets, demo drivers, one-shot setup injects |
| **📊 Dashboard UI** | All `ui-*` widgets, the wrapper functions that turn a button click into a typed `msg`, the trend-feeder split functions | Anything that produces data autonomously, anything that talks to EVOLV nodes directly |
| **🎛️ Demo Drivers** | Random generators, scripted scenarios, schedule injectors, anything that exists only to drive the demo | Real production data sources (those go on Process Plant or are wired in externally) |
| **⚙️ Setup & Init** | One-shot `once: true` injects (setMode, setScaling, auto-startup) | Anything that fires more than once |
**Why these four:** each tab can be disabled or deleted independently. Disable Demo Drivers → demo becomes inert until a real data source is wired. Disable Setup → fresh deploys don't auto-configure (good for debugging). Disable Dashboard UI → headless mode for tests. Process Plant always stays.
If you find yourself wanting a node "between" two tabs, you've named your concerns wrong — re-split.
## 2. Cross-tab wiring — link nodes only, named channels
Never wire a node on tab A directly to a node on tab B. Use **named link-out / link-in pairs**:
```text
[ui-slider] ──► [link out cmd:demand] ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ┐
[random gen] ─► [link out cmd:demand] ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─ ─► [link in cmd:demand] ──► [router] ──► [MGC]
many link-outs may target one link-in
```
### Naming convention
Channels follow `<direction>:<topic>` lowercase, kebab-case after the colon:
- `cmd:` — UI / drivers → process. Carries commands.
- `evt:` — process → UI / external. Carries state events.
- `setup:` — setup tab → wherever. Carries one-shot init.
Examples used in the pumping-station demo:
- `cmd:demand`, `cmd:randomToggle`, `cmd:mode`
- `cmd:station-startup`, `cmd:station-shutdown`, `cmd:station-estop`
- `cmd:setpoint-A`, `cmd:setpoint-B`, `cmd:setpoint-C`
- `cmd:pump-A-seq` (start/stop for pump A specifically)
- `evt:pump-A`, `evt:pump-B`, `evt:pump-C`, `evt:mgc`, `evt:ps`
- `setup:to-mgc`
### Channels are the contract
The list of channel names IS the inter-tab API. Document it in the demo's README. Renaming a channel is a breaking change.
### When to use one channel vs many
- One channel, many emitters: same kind of message from multiple sources (e.g. `cmd:demand` is fired by both the slider and the random generator).
- Different channels: messages with different *meaning* even if they go to the same node (e.g. don't fold `cmd:setpoint-A` into a generic `cmd:pump-A` — keep setpoint and start/stop separate).
- Avoid one mega-channel: a "process commands" channel that the receiver routes-by-topic is harder to read than separate channels per concern.
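The naming convention can be enforced in the flow generator. A minimal sketch (the helper name is ours, not from the repo; note the pattern tolerates mixed case inside a segment, since the demo's own examples include `cmd:randomToggle` and `cmd:setpoint-A` alongside the stated kebab-case rule):

```python
import re

# <direction>:<topic>, direction one of cmd/evt/setup, topic in dash-separated
# alphanumeric segments (mixed case tolerated for names like pump-A-seq).
CHANNEL_RE = re.compile(r"^(cmd|evt|setup):[A-Za-z0-9]+(-[A-Za-z0-9]+)*$")

def is_valid_channel(name: str) -> bool:
    """True if a link-node channel name follows the convention above."""
    return CHANNEL_RE.fullmatch(name) is not None
```

Running this over every `link in` / `link out` name before emitting the flow catches typos that would otherwise surface as silently dead channels.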
### Don't use link-call for fan-out
`link call` is for synchronous request/response (waits for a paired `link out` in `return` mode). For fan-out, use plain `link out` (mode=`link`) with multiple targets, or a single link out → single link in → function-node fan-out (whichever is clearer for your case).
## 3. Spacing and visual layout
Nodes need air to be readable. Apply these constants in any flow generator:
```python
LANE_X = [120, 380, 640, 900, 1160, 1420] # 6 vertical lanes per tab
ROW = 80 # standard row pitch
SECTION_GAP = 200 # extra y-shift between sections
```
### Lane assignment (process plant tab as example)
| Lane | Contents |
|---|---|
| 0 (x=120) | Inputs from outside the tab — link-in nodes, injects |
| 1 (x=380) | First-level transformers — wrappers, fan-outs, routers |
| 2 (x=640) | Mid-level — section comments live here too |
| 3 (x=900) | Target nodes — the EVOLV node itself (pump, MGC, PS) |
| 4 (x=1160) | Output formatters — function nodes that build dashboard-friendly payloads |
| 5 (x=1420) | Outputs to outside the tab — link-out nodes, debug taps |
Inputs flow left → right. Don't loop wires backwards across the tab.
### Section comments
Every logical group within a tab gets a comment header at lane 2 with a `── Section name ──` style label. Use them liberally: every group of 3-5 nodes deserves a header. The `info` field on the comment carries the multi-line description.
### Section spacing
`SECTION_GAP = 200` between sections, on top of the standard row pitch. Don't pack sections together — when a tab holds several pumps with their 6 measurements, give each pump's cluster 4 rows plus a 200 px gap to the next pump. Yes, it makes tabs scroll. Scroll is cheap; visual confusion is expensive.
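The two constants combine into a simple y-position rule for a generator. A sketch assuming a uniform row count per section and an 80 px top margin (both assumptions; the helper name is ours):

```python
LANE_X = [120, 380, 640, 900, 1160, 1420]  # 6 vertical lanes per tab
ROW = 80            # standard row pitch
SECTION_GAP = 200   # extra y-shift between sections
TOP_MARGIN = 80     # assumed top margin, not specified by the rules above

def section_top(section_index: int, rows_per_section: int) -> int:
    """y of a section's first row: earlier sections occupy their rows plus one gap each."""
    return TOP_MARGIN + section_index * (rows_per_section * ROW + SECTION_GAP)
```

With 4 rows per section, consecutive sections start 520 px apart, which is the intended "air" between clusters.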
## 4. Charts — the trend-split rule
ui-chart with `category: "topic"` + `categoryType: "msg"` plots one series per unique `msg.topic`. So:
- One chart per **metric type** (one chart for flow, one for power).
- Each chart receives msgs whose `topic` is the **series label** (e.g. `Pump A`, `Pump B`, `Pump C`).
### Required chart properties (FlowFuse ui-chart renders blank without ALL of these)
Derived from working charts in rotatingMachine/examples/03-Dashboard. Every property listed below is mandatory — omit any one and the chart renders blank with no error message.
```json
{
"type": "ui-chart",
"chartType": "line",
"interpolation": "linear",
"category": "topic",
"categoryType": "msg",
"xAxisType": "time",
"xAxisProperty": "",
"xAxisPropertyType": "timestamp",
"xAxisFormat": "",
"xAxisFormatType": "auto",
"yAxisProperty": "payload",
"yAxisPropertyType": "msg",
"action": "append",
"stackSeries": false,
"pointShape": "circle",
"pointRadius": 4,
"showLegend": true,
"bins": 10,
"width": 12,
"height": 6,
"removeOlder": "15",
"removeOlderUnit": "60",
"removeOlderPoints": "",
"colors": ["#0095FF","#FF0000","#FF7F0E","#2CA02C","#A347E1","#D62728","#FF9896","#9467BD","#C5B0D5"],
"textColor": ["#666666"],
"textColorDefault": true,
"gridColor": ["#e5e5e5"],
"gridColorDefault": true
}
```
**Key gotchas:**
- `interpolation` MUST be set (`"linear"`, `"step"`, `"bezier"`, `"cubic"`, `"cubic-mono"`). Without it: no line drawn.
- `yAxisProperty: "payload"` + `yAxisPropertyType: "msg"` tells the chart WHERE in the msg to find the y-value. Without these: chart has no data to plot.
- `xAxisPropertyType: "timestamp"` tells the chart to use `msg.timestamp` (or auto-generated) for the x-axis.
- `width` and `height` are **numbers, not strings**. `width: 12` (correct) vs `width: "12"` (may break).
- `removeOlderPoints: ""` (empty string) → retention is controlled by removeOlder + removeOlderUnit only. Set to a number string to additionally cap points per series.
- `colors` array defines the palette for auto-assigned series colours. Provide at least 3.
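Because a missing property fails silently, it is worth asserting completeness at build time. A sketch (the helper name is ours; the key set mirrors the gotchas above rather than the full property list):

```python
# Properties whose absence renders a ui-chart blank with no error message.
REQUIRED_CHART_KEYS = {
    "chartType", "interpolation", "category", "categoryType",
    "xAxisType", "xAxisPropertyType", "yAxisProperty", "yAxisPropertyType",
    "action", "removeOlder", "removeOlderUnit", "colors",
}

def check_chart(node: dict) -> list:
    """Return problems for a ui-chart node dict; empty list means it looks complete."""
    if node.get("type") != "ui-chart":
        return []
    problems = sorted(REQUIRED_CHART_KEYS - node.keys())
    for key in ("width", "height"):  # must be numbers, not strings
        if isinstance(node.get(key), str):
            problems.append(f"{key} (must be a number, not a string)")
    return problems
```

Run it over every node in the generated flow and fail the build on a non-empty result.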
### The trend-split function pattern
A common bug: feeding both flow and power msgs to a single function output that wires to both charts. Both charts then plot all metrics, garbling the legend.
**Fix:** the trend-feeder function MUST have one output per chart, and split:
```js
// outputs: 2
// wires: [["chart_flow"], ["chart_power"]]
const p = msg.payload || {};
const flowMsg = p.flowNum != null ? { topic: 'Pump A', payload: p.flowNum } : null;
const powerMsg = p.powerNum != null ? { topic: 'Pump A', payload: p.powerNum } : null;
return [flowMsg, powerMsg];
```
A null msg on a given output sends nothing on that output — exactly what we want.
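In a Python flow builder the same pattern can be stamped out per pump. A sketch (the function and field names are ours; the generated `func` body matches the JS above, taking `p` from `msg.payload`):

```python
def trend_split_node(node_id, tab_id, series_label, chart_flow_id, chart_power_id, x, y):
    """Emit a two-output function node: output 0 -> flow chart, output 1 -> power chart."""
    func = (
        "const p = msg.payload || {};\n"
        f"const flowMsg = p.flowNum != null ? {{ topic: '{series_label}', payload: p.flowNum }} : null;\n"
        f"const powerMsg = p.powerNum != null ? {{ topic: '{series_label}', payload: p.powerNum }} : null;\n"
        "return [flowMsg, powerMsg];"
    )
    return {
        "id": node_id, "type": "function", "z": tab_id,
        "name": f"trend split {series_label}",
        "func": func, "outputs": 2,
        "x": x, "y": y,
        "wires": [[chart_flow_id], [chart_power_id]],
    }
```

Called once per pump, this guarantees the split is identical across Pump A/B/C instead of hand-edited three times.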
### Chart axis settings to actually configure
- `removeOlder` + `removeOlderUnit`: how much history to keep (e.g. 10 minutes).
- `removeOlderPoints`: cap on points per series (200 is sensible for a demo).
- `ymin` / `ymax`: leave blank for autoscale, or set numeric strings if you want a fixed range.
## 5. Inject node — payload typing
Multi-prop inject must populate `v` and `vt` **per prop**, not just the legacy top-level `payload` + `payloadType`:
```json
{
"props": [
{"p": "topic", "vt": "str"},
{"p": "payload", "v": "{\"action\":\"startup\"}", "vt": "json"}
],
"topic": "execSequence",
"payload": "{\"action\":\"startup\"}",
"payloadType": "json"
}
```
If you only fill the top-level fields, `payloadType: "json"` is silently treated as `str`.
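A builder helper can keep the per-prop typing and the legacy mirrors in sync automatically. A sketch (the helper name is ours):

```python
import json

def make_json_inject(node_id, tab_id, topic, payload_obj, x, y, once=False):
    """Build an inject node whose JSON payload is typed per prop AND in the legacy fields."""
    payload_str = json.dumps(payload_obj)
    return {
        "id": node_id, "type": "inject", "z": tab_id,
        "props": [
            {"p": "topic", "vt": "str"},
            {"p": "payload", "v": payload_str, "vt": "json"},
        ],
        "topic": topic,
        "payload": payload_str, "payloadType": "json",  # legacy mirrors of the props entry
        "once": once,
        "x": x, "y": y, "wires": [[]],
    }
```

Because both representations come from one `json.dumps` call, they cannot drift apart across regenerations.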
## 6. Dashboard widget rules
- **Widget = display only.** No business logic in `ui-text` formats or `ui-template` HTML.
- **Buttons emit a typed string payload** (`"fired"` or similar). Convert to the real msg shape with a tiny wrapper function on the same tab, before the link-out.
- **Sliders use `passthru: true`** so they re-emit on input messages (useful for syncing initial state from the process side later).
- **One ui-page per demo.** Multiple groups under one page is the natural split.
- **Group widths should sum to a multiple of 12.** The page grid is 12 columns. A row of `4 + 4 + 4` or `6 + 6` works; mixing arbitrary widths leaves gaps.
- **EVERY ui-* node needs `x` and `y` keys.** Without them Node-RED dumps the node at (0,0) — every text widget and chart piles up in the top-left of the editor canvas. The dashboard itself still renders correctly (it lays out by group/order, not editor x/y), but the editor view is unreadable. If you write a flow generator helper, set `x` and `y` on the dict EVERY time. Test with `jq '[.[] | select(.x==0 and .y==0 and (.type|tostring|startswith("ui-")))]'` after generating.
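The same check as the `jq` one-liner, for generators that already hold the flow as Python objects (helper name is ours):

```python
def ui_nodes_at_origin(flow_nodes):
    """Return ids of ui-* nodes left at the default (0, 0) editor position.

    Nodes missing x/y entirely count as (0, 0), since that is where
    Node-RED dumps them in the editor.
    """
    return [
        n.get("id")
        for n in flow_nodes
        if str(n.get("type", "")).startswith("ui-")
        and n.get("x", 0) == 0
        and n.get("y", 0) == 0
    ]
```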
## 7. Do / don't checklist
✅ Do:
- Generate flows from a Python builder (`build_flow.py`) — it's the source of truth.
- Use deterministic IDs (`pump_a`, `meas_pump_a_u`, `lin_demand_to_mgc`) — reproducible diffs across regenerations.
- Tag every channel name with `cmd:` / `evt:` / `setup:`.
- Comment every section, even short ones.
- Verify trends with a `ui-chart` of synthetic data first, before plumbing real data through.
❌ Don't:
- Don't use `replace_all` on a Python identifier that appears in a node's own wires definition — you'll create self-loops (>250k msg/s discovered the hard way).
- Don't wire across tabs directly. Node-RED allows the wire, but it makes the editor unreadable; use link nodes instead (Section 2).
- Don't put dashboard widgets next to EVOLV nodes — different concerns.
- Don't pack nodes within 40 px of each other — labels overlap, wires snap to wrong handles.
- Don't ship `enableLog: "debug"` in a demo — fills the container log within seconds and obscures real errors.
## 8. The link-out / link-in JSON shape (cheat sheet)
```json
{
"id": "lout_demand_dash",
"type": "link out",
"z": "tab_ui",
"name": "cmd:demand",
"mode": "link",
"links": ["lin_demand_to_mgc"],
"x": 380, "y": 140,
"wires": []
}
```
```json
{
"id": "lin_demand_to_mgc",
"type": "link in",
"z": "tab_process",
"name": "cmd:demand",
"links": ["lout_demand_dash", "lout_demand_drivers"],
"x": 120, "y": 1500,
"wires": [["demand_fanout_mgc_ps"]]
}
```
Both ends store the paired ids in `links`. The `name` is cosmetic (label only) — Node-RED routes by id. Multiple emitters can target one receiver; one emitter can target multiple receivers.
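A generator can guarantee the cross-registration by always creating both ends together. A sketch (helper name and argument shape are ours):

```python
def make_link_pair(channel, out_id, in_id, out_tab, in_tab, out_xy, in_xy):
    """Create a named link-out / link-in pair with ids cross-registered in both `links`."""
    link_out = {
        "id": out_id, "type": "link out", "z": out_tab, "name": channel,
        "mode": "link", "links": [in_id],
        "x": out_xy[0], "y": out_xy[1], "wires": [],
    }
    link_in = {
        "id": in_id, "type": "link in", "z": in_tab, "name": channel,
        "links": [out_id],
        "x": in_xy[0], "y": in_xy[1], "wires": [[]],
    }
    return link_out, link_in
```

For a second emitter on the same channel, create another link-out targeting the same `in_id` and append its id to the link-in's `links` list.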
## 9. Node configuration completeness — ALWAYS set every field
When placing an EVOLV node in a flow (demo or production), configure **every config field** the node's schema defines — don't rely on schema defaults for operational parameters. Schema defaults exist to make the validator happy, not to represent a realistic plant.
**Why this matters:** A pumpingStation with `basinVolume: 10` but default `heightOverflow: 2.5` and default `heightOutlet: 0.2` creates an internally inconsistent basin where the fill % exceeds 100%, safety guards fire at wrong thresholds, and the demo looks broken. Every field interacts with every other field.
**The rule:**
1. Read the node's config schema (`generalFunctions/src/configs/<nodeName>.json`) before writing the flow.
2. For each section (basin, hydraulics, control, safety, scaling, smoothing, …), set EVERY field explicitly in the flow JSON — even if you'd pick the same value as the default.
3. Add a comment in the flow generator per section explaining WHY you chose each value (e.g. "basin sized so sinus peak takes 6 min to fill from startLevel to overflow").
4. Cross-check computed values: `surfaceArea = volume / height`, `maxVolOverflow = heightOverflow × surfaceArea`, gauge `max` = basin `height`, fill % denominator = `volume` (not overflow volume).
5. If a gauge or chart references a config value (basin height, maxVol), derive it from the same source — never hardcode a number that was computed elsewhere.
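Step 4's arithmetic is cheap to assert in the flow generator. A sketch — the field names (`basinVolume`, `basinHeight`, `heightOverflow`, `heightOutlet`, `surfaceArea`, `maxVolOverflow`) are assumptions modelled on the demo's config, not guaranteed schema names:

```python
def check_basin(cfg, tol=1e-6):
    """Cross-check derived basin values; returns a list of inconsistencies."""
    surface_area = cfg["basinVolume"] / cfg["basinHeight"]
    errors = []
    if abs(cfg["surfaceArea"] - surface_area) > tol:
        errors.append("surfaceArea != volume / height")
    if abs(cfg["maxVolOverflow"] - cfg["heightOverflow"] * surface_area) > tol:
        errors.append("maxVolOverflow != heightOverflow * surfaceArea")
    if not (cfg["heightOutlet"] < cfg["heightOverflow"] <= cfg["basinHeight"]):
        errors.append("outlet / overflow / basin height ordering broken")
    return errors
```

Failing the build on a non-empty result prevents exactly the "fill % exceeds 100%" class of internally inconsistent basin described above.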
## 10. Verifying the layout
Before declaring a flow done:
1. **Open the tab in the editor — every wire should run left → right.** No backward loops.
2. **Open each section by section comment — visible in 1 screen height.** If not, raise `SECTION_GAP`.
3. **Hit the dashboard URL — every widget has data.** `n/a` everywhere is a contract failure.
4. **For charts, watch a series populate over 30 s.** A blank chart after 30 s = bug.
5. **Disable each tab one at a time and re-deploy.** Process Plant alone should still load (just inert). Dashboard UI alone should serve a page (just empty). If disabling a tab errors out, the tab boundaries are wrong.
## 11. Hierarchical placement — by S88 level, not by node name
The lane assignment maps to the **S88 hierarchy**, not to specific node names. Any node that lives at a given S88 level goes in the same lane regardless of what kind of equipment it is. New node types added to the platform inherit a lane by their S88 category — no rule change needed.
### 11.1 Lane convention (x-axis = S88 level)
| Lane | x | Purpose | S88 level | Colour | Current EVOLV nodes |
|---:|---:|---|---|---|---|
| **L0** | 120 | Tab inputs | — | (none) | `link in`, `inject` |
| **L1** | 360 | Adapters | — | (none) | `function` (msg-shape wrappers) |
| **L2** | 600 | Control Module | CM | `#a9daee` | `measurement` |
| **L3** | 840 | Equipment Module | EM | `#86bbdd` | `rotatingMachine`, `valve`, `diffuser` |
| **L4** | 1080 | Unit | UN | `#50a8d9` | `machineGroupControl`, `valveGroupControl`, `reactor`, `settler`, `monster` |
| **L5** | 1320 | Process Cell | PC | `#0c99d9` | `pumpingStation` |
| **L6** | 1560 | Output formatters | — | (none) | `function` (build dashboard payload from port 0) |
| **L7** | 1800 | Tab outputs | — | (none) | `link out`, `debug` |
Spacing: **240 px** between lanes. Tab width ≤ 1920 px (fits standard monitors without horizontal scroll in the editor).
**Area level** (`#0f52a5`) is reserved for plant-wide coordination and currently unused — when added, allocate a new lane and shift formatter/output one lane right (i.e. expand to 9 lanes if and when needed).
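Because lanes key off the S88 level, placing a new node type reduces to one dictionary lookup. A sketch using the `NODE_LEVEL` registry shape from the spacing-constants section (lane indices from the table above; helper name is ours):

```python
# Lane index per S88 level, per the lane-convention table (L2=CM ... L5=PC).
LEVEL_LANE = {"CM": 2, "EM": 3, "UN": 4, "PC": 5}

def lane_for(node_type: str, node_level: dict) -> int:
    """node_level is a NODE_LEVEL-style registry mapping node type -> S88 level."""
    return LEVEL_LANE[node_level[node_type]]
```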
### 11.2 The group rule (Node-RED `group` boxes anchor each parent + its children)
Use Node-RED's native `group` node (the visual box around a set of nodes — not to be confused with `ui-group`) to anchor every "parent + direct children" cluster. The box makes ownership unambiguous and lets you collapse the cluster in the editor.
**Group rules:**
- **One Node-RED group per parent + its direct children.**
Example: `Pump A + meas-A-up + meas-A-dn` is one group, named `Pump A`.
- **Group colour = parent's S88 colour.**
So a Pump-A group is `#86bbdd` (Equipment Module). A reactor group is `#50a8d9` (Unit).
- **Group `style.label = true`** so the box shows the parent's name.
- **Group must contain all the children's adapters / wrappers / formatters** too if those exclusively belong to the parent. The box is the visual anchor for "this is everything that owns / serves Pump A".
- **Utility groups for cross-cutting logic** (mode broadcast, station-wide commands, demand fan-out) use a neutral colour (`#dddddd`).
JSON shape:
```json
{
"id": "grp_pump_a",
"type": "group",
"z": "tab_process",
"name": "Pump A",
"style": { "label": true, "stroke": "#000000", "fill": "#86bbdd", "fill-opacity": "0.10" },
"nodes": ["meas_pump_a_u", "meas_pump_a_d", "pump_a", "format_pump_a", "lin_setpoint_pump_a", "build_setpoint_pump_a", "lin_seq_pump_a", "lout_evt_pump_a"],
"x": 80, "y": 100, "w": 1800, "h": 200
}
```
`x/y/w/h` is the bounding box of contained nodes + padding — compute it from the children's positions.
### 11.3 The hierarchy rule, restated
> Nodes at the **same S88 level** (siblings sharing one parent) **stack vertically in the same lane**.
>
> Nodes at **different S88 levels** (parent ↔ child) sit **next to each other on different lanes**.
### 11.4 Worked example — pumping station demo
```
L0 L1 L2 L3 L4 L5 L6 L7
(input) (adapter) (CM) (EM) (Unit) (PC) (formatter) (output)
┌── group: Pump A (#86bbdd) ─────────────────────────────────────────────────────────────────────────────────────────┐
│ [lin-set-A] [build-A] │
│ [lin-seq-A] │
│ [meas-A-up] │
│ [meas-A-dn] → [Pump A] → │
│ [format-A] →[lout-evt-A]
└────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┘
┌── group: Pump B (#86bbdd) ─────────────────────────────────────────────────────────────────────────────────────────┐
│ ... same shape ... │
└────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┘
┌── group: Pump C (#86bbdd) ─────────────────────────────────────────────────────────────────────────────────────────┐
│ ... same shape ... │
└────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┘
┌── group: MGC — Pump Group (#50a8d9) ──────────────────────────────────────────────────────────────────────────────┐
│ [lin-demand] [demand→MGC+PS] [MGC] [format-MGC]→[lout-evt-MGC]
└────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┘
┌── group: Pumping Station (#0c99d9) ───────────────────────────────────────────────────────────────────────────────┐
│ [PS] [format-PS]→[lout-evt-PS]
└────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┘
┌── group: Mode broadcast (#dddddd, neutral) ───────────────────────────────────────────────────────────────────────┐
│ [lin-mode] [fan-mode] ─────────────► to all 3 pumps in the Pump A/B/C groups │
└────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┘
┌── group: Station-wide commands (#dddddd) ─────────────────────────────────────────────────────────────────────────┐
│ [lin-start] [fan-start] ─► to pumps │
│ [lin-stop] [fan-stop] │
│ [lin-estop] [fan-estop] │
└────────────────────────────────────────────────────────────────────────────────────────────────────────────────────┘
```
What that buys:
- Search "Pump A" highlights the whole group box (parent + sensors + adapters + formatter).
- S88 colour of the group box tells you the level at a glance.
- Wires are horizontal within a group; cross-group wires (Pump A port 2 → MGC) cross only one band.
- Collapse a group in the editor and it becomes a single tile — clutter disappears during reviews.
### 11.5 Multi-input fan-in rule
Stack link-ins tightly at L0, centred on the destination's y. Merge node one lane right at the same y.
### 11.6 Multi-output fan-out rule
Source at the y-centre of its destinations; destinations stack vertically in the next lane. Wires fork cleanly without jogging.
### 11.7 Link-in placement (within a tab)
- All link-ins on **L0**.
- Order them top-to-bottom by the y of their **first downstream target**.
- Link-ins that feed the same destination share the same y-band as that destination.
### 11.8 Link-out placement (within a tab)
- All link-outs on **L7** (the rightmost lane).
- Each link-out's y matches its **upstream source's** y, so the wire is horizontal.
### 11.9 Cross-tab wire rule
Cross-tab wires use `link out` / `link in` pairs (see Section 2). Direct cross-tab wires are forbidden.
### 11.10 The "no jog" verification
- A wire whose source y == destination y is fine (perfectly horizontal).
- A wire that jogs vertically by ≤ 80 px is fine (one row of slop).
- A wire that jogs by > 80 px means **the destination is in the wrong group y-band**. Move the destination, not the source — the source's position was determined by its own group.
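The no-jog rule is mechanical enough to verify automatically. A sketch over a flat list of node dicts (helper name is ours; assumes every wired node carries `x`/`y`):

```python
JOG_LIMIT = 80  # px; one row of slop is fine, more means a wrong y-band

def find_jogs(nodes):
    """Return (source_id, dest_id, jog_px) for every wire jogging more than JOG_LIMIT."""
    by_id = {n["id"]: n for n in nodes if "id" in n}
    bad = []
    for src in nodes:
        for port in src.get("wires", []):
            for dst_id in port:
                dst = by_id.get(dst_id)
                if dst and abs(src["y"] - dst["y"]) > JOG_LIMIT:
                    bad.append((src["id"], dst_id, abs(src["y"] - dst["y"])))
    return bad
```

Each hit names the destination to move, per the rule above (move the destination, not the source).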
## 12. Dashboard tab variant
Dashboard widgets are stamped to the real grid by the FlowFuse renderer; editor x/y is for the editor's readability.
- Use only **L0, L2, L4, L7**:
- L0 = `link in` (events from process)
- L2 = `ui-*` inputs (sliders, switches, buttons)
- L4 = wrapper / format / trend-split functions
- L7 = `link out` (commands going back)
- **One Node-RED group per `ui-group`.** Editor group's name matches the `ui-group` name. Colour follows the S88 level of the represented equipment (MGC group = `#50a8d9`, Pump A group = `#86bbdd`, …) so the editor view mirrors the dashboard structure.
- Within the group, widgets stack vertically by their visual order in the dashboard.
## 13. Setup tab variant
Single-column ladder L0 → L7, ordered top-to-bottom by `onceDelay`. Wrap in a single neutral-grey Node-RED group named `Deploy-time setup`.
## 14. Demo Drivers tab variant
Same as Process Plant but typically only L0, L2, L4, L7 are used. Wrap each driver (random gen, scripted scenario, …) in its own neutral Node-RED group.
## 15. Spacing constants (final)
```python
LANE_X = [120, 360, 600, 840, 1080, 1320, 1560, 1800]
SIBLING_PITCH = 40
GROUP_GAP = 200
TAB_TOP_MARGIN = 80
GROUP_PADDING = 20 # extra px around child bounding box for the Node-RED group box
S88_COLORS = {
    "AR": "#0f52a5",  # Area (currently unused)
    "PC": "#0c99d9",  # Process Cell
    "UN": "#50a8d9",  # Unit
    "EM": "#86bbdd",  # Equipment Module
    "CM": "#a9daee",  # Control Module
    "neutral": "#dddddd",
}
# Registry: drop a new node type here to place it automatically.
NODE_LEVEL = {
    "measurement": "CM",
    "rotatingMachine": "EM",
    "valve": "EM",
    "diffuser": "EM",
    "machineGroupControl": "UN",
    "valveGroupControl": "UN",
    "reactor": "UN",
    "settler": "UN",
    "monster": "UN",
    "pumpingStation": "PC",
    "dashboardAPI": "neutral",
}
```
Helpers for the build script:
```python
def place(lane, group_index, position_in_group, group_size):
    """Compute (x, y) for a node in a process group.

    Assumes every group band uses the same group_size; with mixed group
    sizes, accumulate the band heights instead of multiplying by group_index.
    """
    x = LANE_X[lane]
    band_centre = TAB_TOP_MARGIN + group_index * (group_size * SIBLING_PITCH + GROUP_GAP) \
        + (group_size - 1) * SIBLING_PITCH / 2
    y = band_centre + (position_in_group - (group_size - 1) / 2) * SIBLING_PITCH
    return int(x), int(y)

def wrap_in_group(child_ids, name, s88_color, nodes_by_id, padding=GROUP_PADDING):
    """Compute the Node-RED group box around a set of children."""
    xs = [nodes_by_id[c]["x"] for c in child_ids]
    ys = [nodes_by_id[c]["y"] for c in child_ids]
    return {
        "type": "group", "name": name,
        "style": {"label": True, "stroke": "#000000", "fill": s88_color, "fill-opacity": "0.10"},
        "nodes": list(child_ids),
        "x": min(xs) - padding, "y": min(ys) - padding,
        "w": max(xs) - min(xs) + 160 + 2 * padding,  # 160 ~ editor node width
        "h": max(ys) - min(ys) + 40 + 2 * padding,   # 40 ~ editor node height
    }
```
## 16. Verification checklist (extends the layout verification in Section 10)
After building a tab:
1. **No wire jogs > 80 px vertically within a group.**
2. **Each lane contains nodes of one purpose only** (never a `ui-text` on L3; never a `rotatingMachine` on L2).
3. **Peers share a lane; parents and children sit on adjacent lanes.**
4. **Every parent + direct children sit inside one Node-RED group box, coloured by the parent's S88 level.**
5. **Utility groups** (mode broadcast, station commands, demand fan-out) wrapped in neutral-grey Node-RED groups.
6. **Section comments at the top of each group band.**
7. **Editor scrollable in y but NOT in x** on a normal monitor.
8. **Search test:** typing the parent's name in the editor highlights the whole group box.
## 17. S88 colour cleanup (separate follow-up task)
These nodes don't currently follow the S88 palette. They should be brought in line in a separate session before the placement rule is fully consistent across the editor:
- `settler` (`#e4a363` orange) → should be `#50a8d9` (Unit)
- `monster` (`#4f8582` teal) → should be `#50a8d9` (Unit)
- `diffuser` (no colour set) → should be `#86bbdd` (Equipment Module)
- `dashboardAPI` (no colour set) → utility, no S88 colour needed
Until cleaned up, the placement rule still works — the `NODE_LEVEL` registry (in the spacing-constants section) already maps these to their semantic S88 level regardless of the node's own colour.


@@ -11,7 +11,9 @@ node_modules/
# Agent/Claude metadata (not needed at runtime)
.agents/
.claude/
manuals/
# Documentation (not needed at runtime)
wiki/
# IDE
.vscode/
@@ -23,10 +25,3 @@ manuals/
# OS
.DS_Store
Thumbs.db
# Documentation (not needed at runtime)
third_party/
FUNCTIONAL_ISSUES_BACKLOG.md
AGENTS.md
README.md
LICENSE


@@ -1,2 +0,0 @@
# Ignore test files
node_modules/


@@ -24,6 +24,7 @@ Each node follows a three-layer pattern:
- Config JSON files in `generalFunctions/src/configs/` define defaults, types, enums per node
- Tick loop runs at 1000ms intervals for time-based updates
- Three outputs per node: [process, dbase, parent]
- **Multi-tab demo flows**: see `.claude/rules/node-red-flow-layout.md` for the tab/link-channel/spacing rule set used by `examples/`
## Development Notes
- No build step required - pure Node.js


@@ -1,29 +0,0 @@
FROM nodered/node-red:latest
# Switch to root for setup
USER root
# Copy EVOLV directly into where Node-RED looks for custom nodes
COPY package.json /data/node_modules/EVOLV/package.json
COPY nodes/ /data/node_modules/EVOLV/nodes/
# Rewrite generalFunctions dependency to local file path (no-op if already local)
RUN sed -i 's|"generalFunctions": "git+https://[^"]*"|"generalFunctions": "file:./nodes/generalFunctions"|' \
/data/node_modules/EVOLV/package.json
# Fix ownership for node-red user
RUN chown -R node-red:root /data
USER node-red
# Install EVOLV's own dependencies inside the EVOLV package directory
WORKDIR /data/node_modules/EVOLV
RUN npm install --ignore-scripts --production
# Copy test flows into Node-RED data directory
COPY --chown=node-red:root test/e2e/flows.json /data/flows.json
# Reset workdir to Node-RED default
WORKDIR /usr/src/node-red
EXPOSE 1880


@@ -1,50 +0,0 @@
.PHONY: install lint lint-fix test test-jest test-node test-legacy ci docker-ci docker-test docker-lint e2e e2e-up e2e-down
install:
@sed -i 's|"generalFunctions": "git+https://[^"]*"|"generalFunctions": "file:./nodes/generalFunctions"|' package.json
npm install
@git checkout -- package.json 2>/dev/null || true
lint:
npx eslint nodes/
lint-fix:
npx eslint nodes/ --fix
test-jest:
npx jest --forceExit
test-node:
node --test \
nodes/valve/test/basic/*.test.js \
nodes/valve/test/edge/*.test.js \
nodes/valve/test/integration/*.test.js \
nodes/valveGroupControl/test/basic/*.test.js \
nodes/valveGroupControl/test/edge/*.test.js \
nodes/valveGroupControl/test/integration/*.test.js
test-legacy:
node nodes/machineGroupControl/src/groupcontrol.test.js
node nodes/generalFunctions/src/nrmse/errorMetric.test.js
test: test-jest test-node test-legacy
ci: lint test
docker-ci:
docker compose run --rm ci
docker-test:
docker compose run --rm test
docker-lint:
docker compose run --rm lint
e2e:
bash test/e2e/run-e2e.sh
e2e-up:
docker compose -f docker-compose.e2e.yml up -d --build
e2e-down:
docker compose -f docker-compose.e2e.yml down

README.md

@@ -1,147 +1,77 @@
# R&D Building Block: EVOLV (Edge-Layer Evolution for Optimized Virtualization)
# EVOLV Edge-Layer Evolution for Optimized Virtualization
## About
Node-RED custom nodes package for automating wastewater treatment plants. Developed by the R&D team of Waterschap Brabantse Delta. Follows the ISA-88 (S88) batch control standard.
This building block was developed by the R&D team of Waterschap Brabantse Delta for use in Node-RED.
## Nodes
| Node | Function | S88 level |
|------|----------|-----------|
| **rotatingMachine** | Individual pump/compressor/blower control | Equipment |
| **machineGroupControl** | Multi-pump optimization (BEP-Gravitation) | Unit |
| **pumpingStation** | Pumping station with hydraulic context | Unit |
| **valve** | Individual valve modelling | Equipment |
| **valveGroupControl** | Valve group coordination | Unit |
| **reactor** | Biological reactor (ASM kinetics) | Unit |
| **settler** | Secondary clarifier / sludge separation | Unit |
| **monster** | Multi-parameter biological monitoring | Equipment |
| **measurement** | Sensor signal conditioning | Control Module |
| **diffuser** | Aeration control | Equipment |
| **dashboardAPI** | InfluxDB telemetry + FlowFuse dashboards | — |
| **generalFunctions** | Shared library (predict, PID, convert, etc.) | — |
> *[Add a short explanation of this building block's specific functional behaviour here]*
## Architecture
---
Each node follows a three-layer pattern:
1. **Entry file** (`<naam>.js`) — registration with Node-RED, admin endpoints
2. **nodeClass** (`src/nodeClass.js`) — Node-RED adapter (tick loop, routing, status)
3. **specificClass** (`src/specificClass.js`) — pure domain logic (physics, state machines)
## License
Three output ports per node: **Port 0** = process data, **Port 1** = InfluxDB telemetry, **Port 2** = registration/control.
This software is licensed under the **Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0)** license.
- Use, modification, and distribution are permitted for **non-commercial purposes**, provided clear attribution to Waterschap Brabantse Delta.
- **Commercial use** requires prior permission.
📧 Contact: [rdlab@brabantsedelta.nl](mailto:rdlab@brabantsedelta.nl)
🔗 License: [https://creativecommons.org/licenses/by-nc/4.0/](https://creativecommons.org/licenses/by-nc/4.0/)
---
## Generic structure of building blocks
- React automatically to incoming data (e.g. an object's position determines the calculation).
- Support linking complex data chains between processes.
- Standardized input/output:
- Output = process data
- Storage information + relative positioning with respect to other objects
- Designed to be combined with other building blocks (including third-party ones).
- Open source and freely available to everyone.
---
## Installation: all building blocks (via EVOLV)
All building blocks from the R&D team are bundled in the **EVOLV repository**, which uses Git submodules.
### First-time clone:
## Installation
```bash
git clone --recurse-submodules https://gitea.wbd-rd.nl/RnD/EVOLV.git
cd EVOLV
npm install
```
Or, if you cloned without submodules:
```bash
git submodule init
git submodule update
```
### Updating submodules:
To update all submodules to the latest version of their own repository:
Update submodules:
```bash
git submodule update --remote --merge
```
Updating an individual submodule:
Installing a single building block in Node-RED:
```bash
cd nodes/<bouwblok-naam>
git checkout main
git pull origin main
cd ../..
git add nodes/<bouwblok-naam>
git commit -m "Update submodule <bouwblok-naam>"
mkdir -p ~/.node-red/nodes
cp -r nodes/<bouwblok-naam> ~/.node-red/nodes/
```
---
## Testing
## Installation: single building block
```bash
# All nodes
bash scripts/test-all.sh
1. Clone the desired repository:
# Specific node
node --test nodes/<nodeName>/test/basic/*.test.js
node --test nodes/<nodeName>/test/integration/*.test.js
node --test nodes/<nodeName>/test/edge/*.test.js
```
```bash
git clone https://gitea.wbd-rd.nl/<repo-naam>.git
```
## Documentation
2. Copy the building block into your Node-RED folder:
- **`wiki/`** — project wiki with architecture, findings, and metrics ([index](wiki/index.md))
- **`CLAUDE.md`** — Claude Code project guide
- **`manuals/node-red/`** — FlowFuse and Node-RED reference documentation
- **`.agents/`** — agent skills, decisions, and function anchors
```bash
mkdir -p ~/.node-red/nodes
cp -r <pad-naar-geclonede-map> ~/.node-red/nodes/
```
## License
3. Check that `settings.js` contains the following:
**Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0)**
```js
nodesDir: './nodes',
```
4. Restart Node-RED:
```bash
node-red-stop
node-red-start
```
---
## Contributing (Fork & Pull Request)
Want to contribute to the R&D building blocks? Follow these steps:
1. Create a fork
- Fork the desired R&D repository in Gitea.
- This gives you your own copy of the repository in your account.
2. Make your changes
- Clone your fork locally and create a new branch (e.g. feature/my-change).
- Make your changes, then commit and push the branch back to your fork.
3. Submit a pull request
- In Gitea, go to your fork and open the branch.
- Click New Pull Request.
- Set the R&D repository as the "merge into" target.
- Set your fork/branch as the "pull from" source.
4. Add a description
- Provide a clear title and description.
- Where applicable, reference an issue using the #<number> notation (e.g. #42).
5. Code review and merge
- The maintainers of the R&D repository review your change.
- After approval, your change is merged into the R&D repository.
---
Use, modification, and distribution are permitted for non-commercial purposes, provided attribution to Waterschap Brabantse Delta. Commercial use requires prior permission.
## Contact
📧 rdlab@brabantsedelta.nl
rdlab@brabantsedelta.nl


@@ -1,49 +0,0 @@
services:
influxdb:
image: influxdb:2.7
environment:
- DOCKER_INFLUXDB_INIT_MODE=setup
- DOCKER_INFLUXDB_INIT_USERNAME=admin
- DOCKER_INFLUXDB_INIT_PASSWORD=adminpassword
- DOCKER_INFLUXDB_INIT_ORG=evolv
- DOCKER_INFLUXDB_INIT_BUCKET=evolv
- DOCKER_INFLUXDB_INIT_ADMIN_TOKEN=evolv-e2e-token
ports:
- "8086:8086"
healthcheck:
test: ["CMD", "influx", "ping"]
interval: 5s
timeout: 5s
retries: 5
nodered:
build:
context: .
dockerfile: Dockerfile.e2e
ports:
- "1880:1880"
depends_on:
influxdb:
condition: service_healthy
environment:
- INFLUXDB_URL=http://influxdb:8086
- INFLUXDB_TOKEN=evolv-e2e-token
- INFLUXDB_ORG=evolv
- INFLUXDB_BUCKET=evolv
volumes:
- ./test/e2e/flows.json:/data/flows.json
healthcheck:
test: ["CMD", "curl", "-f", "http://localhost:1880/"]
interval: 5s
timeout: 5s
retries: 10
grafana:
image: grafana/grafana:latest
ports:
- "3000:3000"
environment:
- GF_SECURITY_ADMIN_PASSWORD=admin
- GF_AUTH_ANONYMOUS_ENABLED=true
depends_on:
- influxdb

examples/README.md Normal file

@@ -0,0 +1,42 @@
# EVOLV — End-to-End Example Flows
Demo flows that show how multiple EVOLV nodes work together in a realistic wastewater-automation scenario. Each example is self-contained: its folder has a `flow.json` you can import directly into Node-RED plus a `README.md` that walks through the topology, control modes, and dashboard layout.
These flows complement the per-node example flows under `nodes/<name>/examples/` (which exercise a single node in isolation). Use the per-node flows for smoke tests during development; use the flows here when you want to see how a real plant section behaves end-to-end.
## Catalogue
| Folder | What it shows |
|---|---|
| [`pumpingstation-3pumps-dashboard/`](pumpingstation-3pumps-dashboard/) | Wet-well basin + machineGroupControl orchestrating 3 pumps (each with up/downstream pressure measurements), individual + auto control, process-demand input via dashboard slider or random generator, full FlowFuse dashboard. |
## How to import
1. Bring up the EVOLV stack: `docker compose up -d` from the superproject root.
2. Open Node-RED at `http://localhost:1880`.
3. Menu → **Import** → drop in the example's `flow.json` (or paste the contents).
4. Open the FlowFuse dashboard at `http://localhost:1880/dashboard`.
Each example uses a unique dashboard `path` so they can coexist in the same Node-RED runtime.
## Adding new examples
When you create a new end-to-end example:
1. Make a subfolder under `examples/` named `<scenario>-<focus>`.
2. Include `flow.json` (Node-RED export) and `README.md` (topology, control modes, dashboard map, things to try).
3. Test it on a fresh Dockerized Node-RED — clean import, no errors, dashboard loads.
4. Add a row to the catalogue table above.
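Step 3's "clean import, no errors" can be pre-checked before you even start Docker. A minimal sketch (not an existing repo script) that catches the most common import killers, malformed JSON, a non-array export, and duplicate node ids:

```javascript
// Sanity-check a flow.json export before committing it: parse the file,
// confirm it is an array of nodes, and reject missing or duplicate ids.
function validateFlow(jsonText) {
  const flow = JSON.parse(jsonText); // throws on malformed JSON
  if (!Array.isArray(flow)) throw new Error('flow.json must be an array of nodes');
  const seen = new Set();
  for (const n of flow) {
    if (!n.id || !n.type) throw new Error('every node needs an id and a type');
    if (seen.has(n.id)) throw new Error('duplicate node id: ' + n.id);
    seen.add(n.id);
  }
  return flow.length; // node count, handy for the catalogue row
}

// Example: validateFlow(require('fs').readFileSync('flow.json', 'utf8'))
```

This does not replace the Docker test (it cannot see missing node types or wiring problems), but it fails fast on corrupted exports.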
## Wishlist for future examples
These are scenarios worth building when there's a session for it:
- **Pump failure + MGC re-routing** — kill pump 2 mid-run, watch MGC redistribute to pumps 1 and 3.
- **Energy-optimal vs equal-flow control** — same demand profile run through `optimalcontrol` and `prioritycontrol` modes side-by-side, energy comparison chart.
- **Schedule-driven demand** — diurnal flow pattern (low at night, peak at 7 am), MGC auto-tuning over 24 simulated hours.
- **Reactor + clarifier loop** — `reactor` upstream feeding `settler`, return sludge controlled by a small `pumpingStation`.
- **Diffuser + DO control** — aeration grid driven by a PID controller from a dissolved-oxygen sensor.
- **Digital sensor bundle** — MQTT-style sensor (BME280, ATAS, etc.) feeding a `measurement` node in digital mode + parent equipment node.
- **Maintenance window** — entermaintenance / exitmaintenance cycle with operator handover dashboard.
- **Calibration walk-through** — measurement node calibrate cycle with stable / unstable input demonstrations.


@@ -0,0 +1,140 @@
# Pumping Station — 3 Pumps with Dashboard
A complete end-to-end EVOLV stack: a wet-well basin model, a `machineGroupControl` orchestrating three `rotatingMachine` pumps (each with upstream/downstream pressure measurements), process-demand input from either a dashboard slider or an auto random generator, individual + auto control modes, and a FlowFuse dashboard with status, gauges, and trend charts.
This is the canonical "make sure everything works together" demo for the platform. Use it after any cross-node refactor to confirm the architecture still hangs together end-to-end.
## Quick start
```bash
cd /mnt/d/gitea/EVOLV
docker compose up -d
# Wait for http://localhost:1880/nodes to return 200, then:
curl -s -X POST http://localhost:1880/flows \
-H "Content-Type: application/json" \
-H "Node-RED-Deployment-Type: full" \
--data-binary @examples/pumpingstation-3pumps-dashboard/flow.json
```
Or open Node-RED at <http://localhost:1880>, **Import → drop the `flow.json`**, click **Deploy**.
Then open the dashboard:
- <http://localhost:1880/dashboard/pumping-station-demo>
## Tabs
The flow is split across four tabs by **concern**:
| Tab | Lives here | Why |
|---|---|---|
| 🏭 **Process Plant** | EVOLV nodes (3 pumps + MGC + PS + 6 measurements) and per-node output formatters | The "real plant" layer. Lift this tab into production unchanged. |
| 📊 **Dashboard UI** | All `ui-*` widgets, button/setpoint wrappers, trend-split functions | Display + operator inputs only. No business logic. |
| 🎛️ **Demo Drivers** | Random demand generator, random-toggle state | Demo-only stimulus. In production, delete this tab and feed `cmd:demand` from your real demand source. |
| ⚙️ **Setup & Init** | One-shot `once: true` injects (MGC scaling/mode, pumps mode, auto-startup, random-on) | Runs at deploy time only. Disable for production runtimes. |
Cross-tab wiring uses **named link-out / link-in pairs**, never direct cross-tab wires. The channel names form the contract:
| Channel | Direction | What it carries |
|---|---|---|
| `cmd:demand` | UI / drivers → process | numeric demand in m³/h |
| `cmd:randomToggle` | UI → drivers | `'on'` / `'off'` |
| `cmd:mode` | UI / setup → process | `'auto'` / `'virtualControl'` setMode broadcast |
| `cmd:station-startup` / `cmd:station-shutdown` / `cmd:station-estop` | UI / setup → process | station-wide command, fanned to all 3 pumps |
| `cmd:setpoint-A` / `-B` / `-C` | UI → process | per-pump setpoint slider value |
| `cmd:pump-A-seq` / `-B-seq` / `-C-seq` | UI → process | per-pump start/stop |
| `evt:pump-A` / `-B` / `-C` | process → UI | formatted per-pump status |
| `evt:mgc` | process → UI | MGC totals (flow / power / efficiency) |
| `evt:ps` | process → UI | basin state + level + volume + flows |
| `setup:to-mgc` | setup → process | MGC scaling/mode init |
See `.claude/rules/node-red-flow-layout.md` for the full layout rule set this demo follows.
## What the flow contains
| Layer | Node(s) | Role |
|---|---|---|
| Top | `pumpingStation` "Pumping Station" | Wet-well basin model. Tracks inflow (`q_in`), outflow (from machine-group child predictions), basin level/volume. PS is in `manual` control mode for the demo so it observes without taking control. |
| Mid | `machineGroupControl` "MGC — Pump Group" | Distributes Qd flow demand across the 3 pumps via `optimalcontrol` (BEP-driven). Scaling: `absolute` (Qd is in m³/h directly). |
| Low | `rotatingMachine` × 3 — Pump A / B / C | Hidrostal H05K-S03R curve. `auto` mode by default so MGC's `parent` commands are accepted. Manual setpoint slider overrides per-pump when each is in `virtualControl`. |
| Sensors | `measurement` × 6 | Per pump: upstream + downstream pressure (mbar). Simulator mode — each ticks a random-walk value continuously. Registered as children of their pump. |
| Demand | inject `demand_rand_tick` + function `demand_rand_fn` + `ui-slider` | Random generator (3 s tick, [40, 240] m³/h) AND a manual slider. Both feed a router that fans out to PS (`q_in` in m³/s) and MGC (`Qd` in m³/h). |
| Glue | `setMode` fanouts + station-wide buttons | Mode toggle broadcasts `setMode` to all 3 pumps. Station-wide Start / Stop / Emergency-Stop buttons fan out to all 3. |
| Dashboard | FlowFuse `ui-page` + 6 groups | Process Demand · Pumping Station · Pump A · Pump B · Pump C · Trends. |
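The demand router in the table above is the one unit-sensitive spot in the flow: PS takes `q_in` in m³/s while MGC takes `Qd` in m³/h, and a demand of ≤ 0 must be gated out. A sketch of that function-node body, with assumed topic names and message shapes (the real `flow.json` may differ):

```javascript
// Sketch of a demand_router function-node body: fan one demand value
// (m³/h) out to the pumping station and the machine group control.
function routeDemand(demand) {
  // Gate non-positive demand: Qd <= 0 would shut every running pump via
  // MGC.turnOffAllMachines, so drop it and rely on the Stop All button.
  if (typeof demand !== 'number' || demand <= 0) return null;
  const psMsg = { topic: 'q_in', payload: demand / 3600 }; // m³/h -> m³/s
  const mgcMsg = { topic: 'Qd', payload: demand };         // m³/h as-is
  return [psMsg, mgcMsg];
}
```

In a two-output function node this would end with `return [[psMsg], [mgcMsg]];`; the flat array here just keeps the sketch testable outside the Node-RED runtime.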
## Dashboard map
The page (`/dashboard/pumping-station-demo`) is laid out top-to-bottom:
1. **Process Demand**
- Slider 0-300 m³/h (`manualDemand` topic)
- Random demand toggle (auto cycles every 3 s)
- Live "current demand" text
2. **Pumping Station**
- Auto/Manual mode toggle (drives all pumps' `setMode` simultaneously)
- Station-wide buttons: Start all · Stop all · Emergency stop
- Basin state, level (m), volume (m³), inflow / pumped-out flow (m³/h)
3. **Pump A / B / C** (one group each)
- Setpoint slider 0-100 % (only effective when that pump is in `virtualControl`)
- Per-pump Startup + Shutdown buttons
- Live state, mode, controller %, flow, power, upstream/downstream pressure
4. **Trends**
- Flow per pump chart (m³/h)
- Power per pump chart (kW)
## Control model
- **AUTO** — the default. `setMode auto` → MGC's `optimalcontrol` decides which pumps run and at what flow. Operator drives only the **Process Demand** slider (or leaves the random generator on); the per-pump setpoint sliders are ignored.
- **MANUAL** — flip the Auto/Manual switch. All 3 pumps go to `virtualControl`. MGC commands are now ignored. Per-pump setpoint sliders / Start / Stop are the only inputs that affect the pumps.
The Emergency Stop button always works regardless of mode and uses the new interruptible-movement path so it stops a pump mid-ramp.
## Notable design choices
- **PS is in `manual` control mode** (`controlMode: "manual"`). The default `levelbased` mode would auto-shut all pumps as soon as basin level dips below `stopLevel` (1 m default), which masks the demo. Manual = observation only.
- **PS safety guards (dry-run / overfill) disabled.** With no real inflow the basin will frequently look "empty" — that's expected for a demo, not a fault. In production you'd configure a real `q_in` source and leave safeties on.
- **MGC scaling = `absolute`, mode = `optimalcontrol`.** Set via inject at deploy. Demand in m³/h, BEP-driven distribution.
- **demand_router gates Qd ≤ 0.** A demand of 0 would shut every running pump (via MGC.turnOffAllMachines). Use the explicit Stop All button to actually take pumps down.
- **Auto-startup on deploy.** All three pumps fire `execSequence startup` 4 s after deploy so the dashboard shows activity immediately.
- **Auto-enable random demand** 5 s after deploy so the trends fill in without operator action.
- **Verbose logging is OFF.** All EVOLV nodes are at `warn`. Crank the per-node `logLevel` to `info` or `debug` if you're diagnosing a flow.
## Things to try
- Drag the **Process Demand slider** with random off — watch MGC distribute that target across pumps and the basin start filling/draining accordingly.
- Flip to **Manual** mode and use the per-pump setpoint sliders — note that MGC stops driving them.
- Hit **Emergency Stop** while a pump is ramping — confirms the interruptible-movement fix shipped in `rotatingMachine` v1.0.3.
- Watch the **Trends** chart over a few minutes — flow distribution shifts as MGC re-balances around the BEP.
## Verification (last green run, 2026-04-13)
Deployed via `POST /flows` to a Dockerized Node-RED, observed for ~15 s after auto-startup:
- All 3 measurement nodes per pump tick (6 total): pressure values stream every second.
- Each pump reaches `operational` ~5 s after the auto-startup inject (3 s starting + 1 s warmup + 1 s for setpoint=0 settle).
- MGC reports `3 machine(s) connected` with mode `optimalcontrol`.
- Pumping Station shows non-zero basin volume + tracks net flow direction (⬆ / ⬇ / ⏸).
- Random demand cycles between ~40 and ~240 m³/h every 3 s.
- Per-pump status text + trend chart update on every tick.
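The structural half of that verification can be automated the same way the repo's check scripts do it: fetch `GET /flows` from the runtime and count node types. The expected counts below come from the "What the flow contains" table; the HTTP fetch is left out so the counting logic stays self-contained:

```javascript
// Count node types in a deployed flow (the array GET /flows returns)
// and compare against what this demo should contain:
// 1 pumpingStation, 1 machineGroupControl, 3 rotatingMachine, 6 measurement.
function countTypes(flow) {
  const counts = {};
  for (const n of flow) counts[n.type] = (counts[n.type] || 0) + 1;
  return counts;
}

function checkDemoTopology(flow) {
  const c = countTypes(flow);
  return (c.pumpingStation || 0) === 1 &&
         (c.machineGroupControl || 0) === 1 &&
         (c.rotatingMachine || 0) === 3 &&
         (c.measurement || 0) === 6;
}
```

This only proves the deploy landed intact; the runtime observations above (pumps reaching `operational`, values streaming) still need the timed watch.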
## Regenerating `flow.json`
`flow.json` is generated from `build_flow.py`. Edit the Python (cleaner diff) and regenerate:
```bash
cd examples/pumpingstation-3pumps-dashboard
python3 build_flow.py > flow.json
```
The `build_flow.py` is the source of truth — keep it in sync if you tweak the demo.
## Wishlist (not in this demo, build separately)
- **Pump failure + MGC re-routing** — kill pump 2 mid-run, watch MGC redistribute. Would demonstrate fault-tolerance.
- **Energy-optimal vs equal-flow control** — same demand profile run through `optimalcontrol` and `prioritycontrol` modes side-by-side, energy comparison chart.
- **Schedule-driven demand** — diurnal flow pattern (low at night, peak at 7 am), MGC auto-tuning over 24 simulated hours.
- **PS with real `q_in` source + safeties on** — show the basin auto-shut behaviour as a feature, not a bug.
- **Real flow sensor per pump** (vs. relying on rotatingMachine's predicted flow) — would let the demo also show measurement-vs-prediction drift indicators.
- **Reactor or settler downstream** — close the loop on a real wastewater scenario.
See the parent `examples/README.md` for the full follow-up catalogue.

File diff suppressed because it is too large

File diff suppressed because it is too large


@@ -1,114 +0,0 @@
#!/usr/bin/env node
/**
* Add monitoring/debug nodes to the demo flow for process visibility.
* Adds a function node per PS that logs volume, level, flow rate every 10 ticks.
* Also adds a status debug node for the overall system.
*/
const fs = require('fs');
const path = require('path');
const flowPath = path.join(__dirname, '..', 'docker', 'demo-flow.json');
const flow = JSON.parse(fs.readFileSync(flowPath, 'utf8'));
// Remove existing monitoring nodes
const monitorIds = flow.filter(n => n.id && n.id.startsWith('demo_mon_')).map(n => n.id);
if (monitorIds.length > 0) {
console.log('Removing existing monitoring nodes:', monitorIds);
for (const id of monitorIds) {
const idx = flow.findIndex(n => n.id === id);
if (idx !== -1) flow.splice(idx, 1);
}
// Also remove from wires
flow.forEach(n => {
if (n.wires) {
n.wires = n.wires.map(portWires =>
Array.isArray(portWires) ? portWires.filter(w => !monitorIds.includes(w)) : portWires
);
}
});
}
// Add monitoring function nodes for each PS
const monitors = [
{
id: 'demo_mon_west',
name: 'Monitor PS West',
ps: 'demo_ps_west',
x: 800, y: 50,
},
{
id: 'demo_mon_north',
name: 'Monitor PS North',
ps: 'demo_ps_north',
x: 800, y: 100,
},
{
id: 'demo_mon_south',
name: 'Monitor PS South',
ps: 'demo_ps_south',
x: 800, y: 150,
},
];
// Each PS sends process data on port 0. Wire monitoring nodes to PS port 0.
monitors.forEach(mon => {
// Function node that extracts key metrics and logs them periodically
const fnNode = {
id: mon.id,
type: 'function',
z: 'demo_tab_wwtp',
name: mon.name,
func: `// Extract key metrics from PS process output
const p = msg.payload || {};
// Keys have .default suffix in PS output format
const vol = p["volume.predicted.atequipment.default"];
const level = p["level.predicted.atequipment.default"];
const netFlow = p["netFlowRate.predicted.atequipment.default"];
const volPct = p["volumePercent.predicted.atequipment.default"];
// Only log when we have volume data
if (vol !== null && vol !== undefined) {
const ctx = context.get("tickCount") || 0;
context.set("tickCount", ctx + 1);
// Log every 10 ticks
if (ctx % 10 === 0) {
const fmt = (v, dec) => typeof v === "number" ? v.toFixed(dec) : String(v);
const parts = ["vol=" + fmt(vol, 1) + "m3"];
if (level !== null && level !== undefined) parts.push("lvl=" + fmt(level, 3) + "m");
if (volPct !== null && volPct !== undefined) parts.push("fill=" + fmt(volPct, 1) + "%");
if (netFlow !== null && netFlow !== undefined) parts.push("net=" + fmt(netFlow, 1) + "m3/h");
node.warn(parts.join(" | "));
}
}
return msg;`,
outputs: 1,
timeout: '',
noerr: 0,
initialize: '',
finalize: '',
libs: [],
x: mon.x,
y: mon.y,
wires: [[]],
};
flow.push(fnNode);
// Wire PS port 0 to this monitor (append to existing wires)
const psNode = flow.find(n => n.id === mon.ps);
if (psNode && psNode.wires && psNode.wires[0]) {
if (!psNode.wires[0].includes(mon.id)) {
psNode.wires[0].push(mon.id);
}
}
console.log(`Added ${mon.id}: ${mon.name} → wired to ${mon.ps} port 0`);
});
fs.writeFileSync(flowPath, JSON.stringify(flow, null, 2) + '\n');
console.log(`\nDone. ${monitors.length} monitoring nodes added.`);


@@ -1,138 +0,0 @@
#!/usr/bin/env node
/**
* Comprehensive runtime analysis of the WWTP demo flow.
* Captures process debug output, pumping station state, measurements,
* and analyzes filling/draining behavior over time.
*/
const http = require('http');
const NR_URL = 'http://localhost:1880';
function fetchJSON(url) {
return new Promise((resolve, reject) => {
http.get(url, res => {
const chunks = [];
res.on('data', c => chunks.push(c));
res.on('end', () => {
try { resolve(JSON.parse(Buffer.concat(chunks))); }
catch (e) { reject(new Error('Parse error from ' + url + ': ' + e.message)); }
});
}).on('error', reject);
});
}
// Inject a debug-capture subflow to intercept process messages
async function injectDebugCapture() {
const flows = await fetchJSON(NR_URL + '/flows');
// Find all nodes on WWTP tab
const wwtp = flows.filter(n => n.z === 'demo_tab_wwtp');
console.log('=== WWTP Node Inventory ===');
const byType = {};
wwtp.forEach(n => {
if (!byType[n.type]) byType[n.type] = [];
byType[n.type].push(n);
});
Object.entries(byType).sort().forEach(([type, nodes]) => {
console.log(type + ' (' + nodes.length + '):');
nodes.forEach(n => {
const extra = [];
if (n.simulator) extra.push('sim=ON');
if (n.model) extra.push('model=' + n.model);
if (n.basinVolume) extra.push('basin=' + n.basinVolume + 'm3');
if (n.basinHeight) extra.push('h=' + n.basinHeight + 'm');
if (n.positionVsParent) extra.push('pos=' + n.positionVsParent);
if (n.control) extra.push('ctrl=' + JSON.stringify(n.control));
console.log(' ' + n.id + ' "' + (n.name || '') + '" ' + (extra.length ? '[' + extra.join(', ') + ']' : ''));
});
});
// Analyze pumping station configurations
console.log('\n=== Pumping Station Configs ===');
const pss = wwtp.filter(n => n.type === 'pumpingStation');
pss.forEach(ps => {
console.log('\n' + ps.id + ' "' + ps.name + '"');
console.log(' Basin: vol=' + ps.basinVolume + 'm3, h=' + ps.basinHeight + 'm');
console.log(' Inlet: h=' + ps.heightInlet + 'm, Outlet: h=' + ps.heightOutlet + 'm');
console.log(' Simulator: ' + ps.simulator);
console.log(' Control mode: ' + (ps.controlMode || 'not set'));
// Check q_in inject wiring
const qinInject = wwtp.find(n => n.id === 'demo_inj_' + ps.id.replace('demo_ps_', '') + '_flow');
if (qinInject) {
console.log(' q_in inject: repeat=' + qinInject.repeat + 's, wired to ' + JSON.stringify(qinInject.wires));
}
// Check what's wired to this PS (port 2 = parent registration)
const children = wwtp.filter(n => {
if (!n.wires) return false;
return n.wires.some(portWires => portWires && portWires.includes(ps.id));
});
console.log(' Children wired to it: ' + children.map(c => c.id + '(' + c.type + ')').join(', '));
});
// Analyze inject timers
console.log('\n=== Active Inject Timers ===');
const injects = wwtp.filter(n => n.type === 'inject');
injects.forEach(inj => {
const targets = (inj.wires || []).flat();
console.log(inj.id + ' "' + (inj.name || '') + '"');
console.log(' topic=' + inj.topic + ' payload=' + inj.payload);
console.log(' once=' + inj.once + ' repeat=' + (inj.repeat || 'none'));
console.log(' → ' + targets.join(', '));
});
// Analyze q_in function nodes
console.log('\n=== q_in Flow Simulation Functions ===');
const fnNodes = wwtp.filter(n => n.type === 'function' && n.name && n.name.includes('Flow'));
fnNodes.forEach(fn => {
console.log(fn.id + ' "' + fn.name + '"');
console.log(' func: ' + (fn.func || '').substring(0, 200));
const targets = (fn.wires || []).flat();
console.log(' → ' + targets.join(', '));
});
// Analyze measurement nodes
console.log('\n=== Measurement Nodes ===');
const meas = wwtp.filter(n => n.type === 'measurement');
meas.forEach(m => {
console.log(m.id + ' "' + (m.name || '') + '"');
console.log(' type=' + m.assetType + ' sim=' + m.simulator + ' range=[' + m.o_min + ',' + m.o_max + '] unit=' + m.unit);
console.log(' pos=' + (m.positionVsParent || 'none'));
// Check port 2 wiring (parent registration)
const port2 = m.wires && m.wires[2] ? m.wires[2] : [];
console.log(' port2→ ' + (port2.length ? port2.join(', ') : 'none'));
});
// Analyze rotating machines
console.log('\n=== Rotating Machine Nodes ===');
const machines = wwtp.filter(n => n.type === 'rotatingMachine');
machines.forEach(m => {
console.log(m.id + ' "' + (m.name || '') + '"');
console.log(' model=' + m.model + ' mode=' + m.movementMode);
console.log(' pos=' + m.positionVsParent + ' supplier=' + m.supplier);
console.log(' speed=' + m.speed + ' startup=' + m.startup + ' shutdown=' + m.shutdown);
const port2 = m.wires && m.wires[2] ? m.wires[2] : [];
console.log(' port2→ ' + (port2.length ? port2.join(', ') : 'none'));
});
// Check wiring integrity
console.log('\n=== Wiring Analysis ===');
pss.forEach(ps => {
const psPort0 = ps.wires && ps.wires[0] ? ps.wires[0] : [];
const psPort1 = ps.wires && ps.wires[1] ? ps.wires[1] : [];
const psPort2 = ps.wires && ps.wires[2] ? ps.wires[2] : [];
console.log(ps.id + ' wiring:');
console.log(' port0 (process): ' + psPort0.join(', '));
console.log(' port1 (influx): ' + psPort1.join(', '));
console.log(' port2 (parent): ' + psPort2.join(', '));
});
}
injectDebugCapture().catch(err => {
console.error('Analysis failed:', err);
process.exit(1);
});


@@ -1,145 +0,0 @@
#!/usr/bin/env node
/**
* Capture live process data from Node-RED WebSocket debug sidebar.
* Collects samples over a time window and analyzes trends.
*/
const http = require('http');
const NR_URL = 'http://localhost:1880';
const CAPTURE_SECONDS = 30;
// Rather than subscribing to the Node-RED debug WebSocket, this script
// polls the admin context API for each node's current state.
async function fetchJSON(url) {
return new Promise((resolve, reject) => {
http.get(url, res => {
const chunks = [];
res.on('data', c => chunks.push(c));
res.on('end', () => {
try { resolve(JSON.parse(Buffer.concat(chunks))); }
catch (e) { reject(new Error('Parse: ' + e.message)); }
});
}).on('error', reject);
});
}
async function postJSON(url, data) {
return new Promise((resolve, reject) => {
const body = JSON.stringify(data);
const parsed = new URL(url);
const req = http.request({
hostname: parsed.hostname,
port: parsed.port,
path: parsed.pathname,
method: 'POST',
headers: {
'Content-Type': 'application/json',
'Content-Length': Buffer.byteLength(body),
},
}, res => {
const chunks = [];
res.on('data', c => chunks.push(c));
res.on('end', () => {
const text = Buffer.concat(chunks).toString();
try { resolve(JSON.parse(text)); } catch { resolve(text); }
});
});
req.on('error', reject);
req.write(body);
req.end();
});
}
(async () => {
console.log('=== Capturing Process Data (' + CAPTURE_SECONDS + 's) ===\n');
  // Read node context via the admin API; it holds the current state
  // without needing to trigger any debug output.
// Get flows to find node IDs
const flows = await fetchJSON(NR_URL + '/flows');
const wwtp = flows.filter(n => n.z === 'demo_tab_wwtp');
// Pumping stations store state in node context
const pss = wwtp.filter(n => n.type === 'pumpingStation');
const pumps = wwtp.filter(n => n.type === 'rotatingMachine');
const samples = [];
const startTime = Date.now();
console.log('Sampling every 3 seconds for ' + CAPTURE_SECONDS + 's...\n');
for (let i = 0; i < Math.ceil(CAPTURE_SECONDS / 3); i++) {
const t = Date.now();
const elapsed = ((t - startTime) / 1000).toFixed(1);
// Read PS context data via Node-RED context API
const sample = { t: elapsed, stations: {} };
for (const ps of pss) {
try {
const ctx = await fetchJSON(NR_URL + '/context/node/' + ps.id + '?store=default');
sample.stations[ps.id] = ctx;
} catch (e) {
sample.stations[ps.id] = { error: e.message };
}
}
for (const pump of pumps) {
try {
const ctx = await fetchJSON(NR_URL + '/context/node/' + pump.id + '?store=default');
sample.stations[pump.id] = ctx;
} catch (e) {
sample.stations[pump.id] = { error: e.message };
}
}
samples.push(sample);
// Print summary for this sample
console.log('--- Sample at t=' + elapsed + 's ---');
for (const ps of pss) {
const ctx = sample.stations[ps.id];
if (ctx && ctx.data) {
console.log(ps.name + ':');
// Print all context keys
Object.entries(ctx.data).forEach(([key, val]) => {
if (typeof val === 'object') {
console.log(' ' + key + ': ' + JSON.stringify(val).substring(0, 200));
} else {
console.log(' ' + key + ': ' + val);
}
});
} else {
console.log(ps.name + ': ' + JSON.stringify(ctx).substring(0, 200));
}
}
for (const pump of pumps) {
const ctx = sample.stations[pump.id];
if (ctx && ctx.data && Object.keys(ctx.data).length > 0) {
console.log(pump.name + ':');
Object.entries(ctx.data).forEach(([key, val]) => {
if (typeof val === 'object') {
console.log(' ' + key + ': ' + JSON.stringify(val).substring(0, 200));
} else {
console.log(' ' + key + ': ' + val);
}
});
}
}
console.log('');
if (i < Math.ceil(CAPTURE_SECONDS / 3) - 1) {
await new Promise(r => setTimeout(r, 3000));
}
}
console.log('\n=== Summary ===');
console.log('Collected ' + samples.length + ' samples over ' + CAPTURE_SECONDS + 's');
})().catch(err => {
console.error('Capture failed:', err);
process.exit(1);
});


@@ -1,109 +0,0 @@
#!/usr/bin/env node
/**
* Verify asset selection fields are correct in deployed flow.
* Checks that supplier/assetType/model/unit values match asset data IDs
* so the editor dropdowns will pre-select correctly.
*/
const http = require('http');
const NR_URL = 'http://localhost:1880';
async function fetchJSON(url) {
return new Promise((resolve, reject) => {
http.get(url, res => {
const chunks = [];
res.on('data', c => chunks.push(c));
res.on('end', () => {
try { resolve(JSON.parse(Buffer.concat(chunks))); }
catch (e) { reject(new Error(`Parse error: ${e.message}`)); }
});
}).on('error', reject);
});
}
(async () => {
const flows = await fetchJSON(`${NR_URL}/flows`);
const errors = [];
console.log('=== Pump Asset Selection Checks ===');
const pumps = flows.filter(n => n.type === 'rotatingMachine' && n.z === 'demo_tab_wwtp');
pumps.forEach(p => {
const checks = [
{ field: 'supplier', expected: 'hidrostal', actual: p.supplier },
{ field: 'assetType', expected: 'pump-centrifugal', actual: p.assetType },
{ field: 'category', expected: 'machine', actual: p.category },
];
checks.forEach(c => {
if (c.actual === c.expected) {
console.log(` PASS: ${p.id} ${c.field} = "${c.actual}"`);
} else {
console.log(` FAIL: ${p.id} ${c.field} = "${c.actual}" (expected "${c.expected}")`);
errors.push(`${p.id}.${c.field}`);
}
});
// Model should be one of the known models
const validModels = ['hidrostal-H05K-S03R', 'hidrostal-C5-D03R-SHN1'];
if (validModels.includes(p.model)) {
console.log(` PASS: ${p.id} model = "${p.model}"`);
} else {
console.log(` FAIL: ${p.id} model = "${p.model}" (expected one of ${validModels})`);
errors.push(`${p.id}.model`);
}
});
console.log('\n=== Measurement Asset Selection Checks ===');
const measurements = flows.filter(n => n.type === 'measurement' && n.z === 'demo_tab_wwtp');
// Valid supplier→type→model combinations from measurement.json
const validSuppliers = {
'Endress+Hauser': {
types: ['flow', 'pressure', 'level'],
models: { flow: ['Promag-W400', 'Promag-W300'], pressure: ['Cerabar-PMC51', 'Cerabar-PMC71'], level: ['Levelflex-FMP50'] }
},
'Hach': {
types: ['dissolved-oxygen', 'ammonium', 'nitrate', 'tss'],
models: { 'dissolved-oxygen': ['LDO2'], ammonium: ['Amtax-sc'], nitrate: ['Nitratax-sc'], tss: ['Solitax-sc'] }
},
'vega': {
types: ['temperature', 'pressure', 'flow', 'level', 'oxygen'],
models: {} // not checking Vega models for now
}
};
measurements.forEach(m => {
const supplierData = validSuppliers[m.supplier];
if (!supplierData) {
console.log(` FAIL: ${m.id} supplier "${m.supplier}" not in asset data`);
errors.push(`${m.id}.supplier`);
return;
}
console.log(` PASS: ${m.id} supplier = "${m.supplier}"`);
if (!supplierData.types.includes(m.assetType)) {
console.log(` FAIL: ${m.id} assetType "${m.assetType}" not in ${m.supplier} types`);
errors.push(`${m.id}.assetType`);
} else {
console.log(` PASS: ${m.id} assetType = "${m.assetType}"`);
}
const validModels = supplierData.models[m.assetType] || [];
if (validModels.length > 0 && !validModels.includes(m.model)) {
console.log(` FAIL: ${m.id} model "${m.model}" not in ${m.supplier}/${m.assetType} models`);
errors.push(`${m.id}.model`);
} else if (m.model) {
console.log(` PASS: ${m.id} model = "${m.model}"`);
}
});
console.log('\n=== RESULT ===');
if (errors.length === 0) {
console.log('ALL ASSET SELECTION CHECKS PASSED');
} else {
console.log(`${errors.length} FAILURE(S):`, errors.join(', '));
process.exit(1);
}
})().catch(err => {
console.error('Check failed:', err.message);
process.exit(1);
});


@@ -1,142 +0,0 @@
#!/usr/bin/env node
/**
* Check the deployed Node-RED flow for correctness after changes.
*/
const http = require('http');
function fetch(url) {
return new Promise((resolve, reject) => {
http.get(url, res => {
const chunks = [];
res.on('data', c => chunks.push(c));
res.on('end', () => resolve(JSON.parse(Buffer.concat(chunks))));
}).on('error', reject);
});
}
(async () => {
let errors = 0;
// 1. Check deployed flow structure
console.log('=== Checking deployed flow structure ===');
const flow = await fetch('http://localhost:1880/flows');
console.log('Total deployed nodes:', flow.length);
// Check MGC exists
const mgc = flow.find(n => n.id === 'demo_mgc_west');
if (mgc) {
console.log('PASS: MGC West exists, position:', mgc.positionVsParent);
} else {
console.log('FAIL: MGC West missing from deployed flow');
errors++;
}
// Check reactor speedUpFactor
const reactor = flow.find(n => n.id === 'demo_reactor');
if (reactor && reactor.speedUpFactor === 1) {
console.log('PASS: Reactor speedUpFactor = 1');
} else {
console.log('FAIL: Reactor speedUpFactor =', reactor?.speedUpFactor);
errors++;
}
// Check sim mode on measurements
const simMeasIds = [
'demo_meas_flow', 'demo_meas_do', 'demo_meas_nh4',
'demo_meas_ft_n1', 'demo_meas_eff_flow', 'demo_meas_eff_do',
'demo_meas_eff_nh4', 'demo_meas_eff_no3', 'demo_meas_eff_tss'
];
let simOk = 0;
simMeasIds.forEach(id => {
const n = flow.find(x => x.id === id);
if (n && n.simulator === true) simOk++;
else { console.log('FAIL: simulator not true on', id); errors++; }
});
  if (simOk === simMeasIds.length) console.log(`PASS: ${simOk}/9 measurement nodes have simulator=true`);
// Check pressure nodes exist
const ptIds = ['demo_meas_pt_w_up','demo_meas_pt_w_down','demo_meas_pt_n_up','demo_meas_pt_n_down','demo_meas_pt_s_up','demo_meas_pt_s_down'];
let ptOk = 0;
ptIds.forEach(id => {
const n = flow.find(x => x.id === id);
if (n && n.type === 'measurement') ptOk++;
else { console.log('FAIL: pressure node missing:', id); errors++; }
});
  if (ptOk === ptIds.length) console.log(`PASS: ${ptOk}/6 pressure measurement nodes present`);
// Check removed nodes are gone
const removedIds = [
'demo_inj_meas_flow', 'demo_fn_sim_flow', 'demo_inj_meas_do', 'demo_fn_sim_do',
'demo_inj_meas_nh4', 'demo_fn_sim_nh4', 'demo_inj_ft_n1', 'demo_fn_sim_ft_n1',
'demo_inj_eff_flow', 'demo_fn_sim_eff_flow', 'demo_inj_eff_do', 'demo_fn_sim_eff_do',
'demo_inj_eff_nh4', 'demo_fn_sim_eff_nh4', 'demo_inj_eff_no3', 'demo_fn_sim_eff_no3',
'demo_inj_eff_tss', 'demo_fn_sim_eff_tss',
'demo_inj_w1_startup', 'demo_inj_w1_setpoint', 'demo_inj_w2_startup', 'demo_inj_w2_setpoint',
'demo_inj_n1_startup', 'demo_inj_s1_startup'
];
const stillPresent = removedIds.filter(id => flow.find(x => x.id === id));
if (stillPresent.length === 0) {
console.log('PASS: All 24 removed nodes are gone');
} else {
console.log('FAIL: These removed nodes are still present:', stillPresent);
errors++;
}
// Check kept nodes still exist
const keptIds = [
'demo_inj_west_flow', 'demo_fn_west_flow_sim',
'demo_inj_north_flow', 'demo_fn_north_flow_sim',
'demo_inj_south_flow', 'demo_fn_south_flow_sim',
'demo_inj_w1_mode', 'demo_inj_w2_mode', 'demo_inj_n1_mode', 'demo_inj_s1_mode',
'demo_inj_west_mode', 'demo_inj_north_mode', 'demo_inj_south_mode'
];
const keptMissing = keptIds.filter(id => !flow.find(x => x.id === id));
if (keptMissing.length === 0) {
console.log('PASS: All kept nodes still present');
} else {
console.log('FAIL: These nodes should exist but are missing:', keptMissing);
errors++;
}
// Check wiring: W1/W2 register to MGC, MGC registers to PS West
const w1 = flow.find(n => n.id === 'demo_pump_w1');
const w2 = flow.find(n => n.id === 'demo_pump_w2');
if (w1?.wires?.[2]?.includes('demo_mgc_west')) {
console.log('PASS: W1 port 2 wired to MGC');
} else {
console.log('FAIL: W1 port 2 not wired to MGC, got:', w1?.wires?.[2]);
errors++;
}
if (w2?.wires?.[2]?.includes('demo_mgc_west')) {
console.log('PASS: W2 port 2 wired to MGC');
} else {
console.log('FAIL: W2 port 2 not wired to MGC, got:', w2?.wires?.[2]);
errors++;
}
if (mgc?.wires?.[2]?.includes('demo_ps_west')) {
console.log('PASS: MGC port 2 wired to PS West');
} else {
console.log('FAIL: MGC port 2 not wired to PS West');
errors++;
}
// Check PS outputs wire to level-to-pressure functions
const psWest = flow.find(n => n.id === 'demo_ps_west');
if (psWest?.wires?.[0]?.includes('demo_fn_level_to_pressure_w')) {
console.log('PASS: PS West port 0 wired to level-to-pressure function');
} else {
console.log('FAIL: PS West port 0 missing level-to-pressure wire');
errors++;
}
console.log('\n=== RESULT ===');
if (errors === 0) {
console.log('ALL CHECKS PASSED');
} else {
console.log(`${errors} FAILURE(S)`);
process.exit(1);
}
})().catch(err => {
console.error('Failed to connect to Node-RED:', err.message);
process.exit(1);
});
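The port-wiring assertions above repeat one pattern: look up a node by id, then check that a given output port includes a target id. A minimal sketch of that check as a reusable helper (`portWiredTo` and the two-node flow below are illustrative, not part of the repo):

```javascript
// Check whether a node's output port is wired to a target node id.
// `flow` is the exported Node-RED flow array (same shape as demo-flow.json).
function portWiredTo(flow, nodeId, port, targetId) {
  const node = flow.find((n) => n.id === nodeId);
  if (!node || !Array.isArray(node.wires) || !Array.isArray(node.wires[port])) return false;
  return node.wires[port].includes(targetId);
}

// Tiny in-memory flow for illustration.
const demoFlow = [
  { id: 'demo_pump_w1', type: 'rotatingMachine', wires: [[], [], ['demo_mgc_west']] },
  { id: 'demo_mgc_west', type: 'machineGroupControl', wires: [[], [], ['demo_ps_west']] },
];

console.log(portWiredTo(demoFlow, 'demo_pump_w1', 2, 'demo_mgc_west')); // true
```

A helper like this also keeps the FAIL messages uniform, since the caller only has to report the three ids involved.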

@@ -1,78 +0,0 @@
#!/usr/bin/env node
/**
* Runtime smoke test: connect to Node-RED WebSocket debug and verify
* that key nodes are producing output within a timeout period.
*/
const http = require('http');
const TIMEOUT_MS = 15000;
const NR_URL = 'http://localhost:1880';
async function fetchJSON(url) {
return new Promise((resolve, reject) => {
http.get(url, res => {
const chunks = [];
res.on('data', c => chunks.push(c));
res.on('end', () => {
try { resolve(JSON.parse(Buffer.concat(chunks))); }
catch (e) { reject(new Error(`Parse error from ${url}: ${e.message}`)); }
});
}).on('error', reject);
});
}
(async () => {
const errors = [];
// REST-based checks: verify Node-RED is healthy
console.log('=== Runtime Health Checks ===');
try {
const settings = await fetchJSON(`${NR_URL}/settings`);
console.log('PASS: Node-RED is responding, theme:', settings.editorTheme ? 'custom' : 'default');
} catch (e) {
console.log('FAIL: Node-RED not responding:', e.message);
errors.push('Node-RED not responding');
}
// Check that flows are loaded
try {
const flows = await fetchJSON(`${NR_URL}/flows`);
const wwtp = flows.filter(n => n.z === 'demo_tab_wwtp');
if (wwtp.length > 50) {
console.log(`PASS: ${wwtp.length} nodes loaded on WWTP tab`);
} else {
console.log(`FAIL: Only ${wwtp.length} nodes on WWTP tab (expected >50)`);
errors.push('Too few nodes');
}
} catch (e) {
console.log('FAIL: Cannot read flows:', e.message);
errors.push('Cannot read flows');
}
// Check inject nodes are running (they have repeat timers)
try {
const flows = await fetchJSON(`${NR_URL}/flows`);
const injects = flows.filter(n => n.type === 'inject' && n.repeat && n.z === 'demo_tab_wwtp');
console.log(`PASS: ${injects.length} inject nodes with timers on WWTP tab`);
// Verify the q_in inject nodes are still there
const qinInjects = injects.filter(n => n.id.includes('_flow') || n.id.includes('_tick'));
console.log(`PASS: ${qinInjects.length} q_in/tick inject timers active`);
} catch (e) {
console.log('FAIL: Cannot check inject nodes:', e.message);
errors.push('Cannot check inject nodes');
}
console.log('\n=== RESULT ===');
if (errors.length === 0) {
console.log('ALL RUNTIME CHECKS PASSED');
} else {
console.log(`${errors.length} FAILURE(S):`, errors.join(', '));
process.exit(1);
}
})().catch(err => {
console.error('Runtime check failed:', err.message);
process.exit(1);
});
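The inject-timer check above is a pure filter over the exported flow array, so it can be exercised without a live Node-RED; a sketch (the node ids in the sample are illustrative):

```javascript
// Return inject nodes on a given tab that have a repeat timer configured.
// `repeat` is stored as a string in Node-RED exports; '' means fire-once.
function activeInjects(flow, tabId) {
  return flow.filter((n) => n.type === 'inject' && n.repeat && n.z === tabId);
}

const sample = [
  { id: 'demo_inj_west_flow', type: 'inject', repeat: '2', z: 'demo_tab_wwtp' },
  { id: 'demo_inj_once', type: 'inject', repeat: '', z: 'demo_tab_wwtp' },
  { id: 'other_inj', type: 'inject', repeat: '5', z: 'other_tab' },
];
console.log(activeInjects(sample, 'demo_tab_wwtp').length); // 1
```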

@@ -1,294 +0,0 @@
#!/usr/bin/env node
/**
* Comprehensive WWTP Demo Test Suite
*
* Tests:
* 1. Deploy succeeds
* 2. All nodes healthy (no errors)
* 3. PS volumes above safety threshold after calibration
* 4. q_in flowing to all PSs (volume rising)
* 5. Measurement simulators producing values
* 6. MGC pressure handling working
* 7. No persistent safety triggers
* 8. Level-based control (PS West) stays idle at low level
* 9. Flow-based control (PS North) responds to flow
* 10. PS output format correct
*/
const http = require('http');
const fs = require('fs');
const path = require('path');
const { execSync } = require('child_process');
const NR_URL = 'http://localhost:1880';
const FLOW_FILE = path.join(__dirname, '..', 'docker', 'demo-flow.json');
let passed = 0;
let failed = 0;
let warnings = 0;
function test(name, condition, detail) {
if (condition) {
console.log(` ✅ PASS: ${name}${detail ? ' — ' + detail : ''}`);
passed++;
} else {
console.log(` ❌ FAIL: ${name}${detail ? ' — ' + detail : ''}`);
failed++;
}
}
function warn(name, detail) {
console.log(` ⚠️ WARN: ${name}${detail ? ' — ' + detail : ''}`);
warnings++;
}
function httpReq(method, urlPath, body) {
return new Promise((resolve, reject) => {
const parsed = new URL(NR_URL + urlPath);
const opts = {
hostname: parsed.hostname,
port: parsed.port,
path: parsed.pathname,
method,
headers: { 'Content-Type': 'application/json', 'Node-RED-Deployment-Type': 'full' },
};
if (body) opts.headers['Content-Length'] = Buffer.byteLength(JSON.stringify(body));
const req = http.request(opts, (res) => {
const chunks = [];
res.on('data', (c) => chunks.push(c));
res.on('end', () => resolve({ status: res.statusCode, body: Buffer.concat(chunks).toString() }));
});
req.on('error', reject);
if (body) req.write(JSON.stringify(body));
req.end();
});
}
function getLogs(since) {
try {
return execSync(`docker logs evolv-nodered --since ${since} 2>&1`, {
encoding: 'utf8', timeout: 5000,
});
} catch (e) { return ''; }
}
function fetchJSON(url) {
return new Promise((resolve, reject) => {
http.get(url, (res) => {
const chunks = [];
res.on('data', (c) => chunks.push(c));
res.on('end', () => {
try { resolve(JSON.parse(Buffer.concat(chunks))); }
catch (e) { reject(e); }
});
}).on('error', reject);
});
}
(async () => {
console.log('═══════════════════════════════════════');
console.log(' WWTP Demo Flow — Comprehensive Test');
console.log('═══════════════════════════════════════\n');
// ==========================================================
console.log('1. DEPLOYMENT');
console.log('─────────────');
const flow = JSON.parse(fs.readFileSync(FLOW_FILE, 'utf8'));
test('Flow file loads', flow.length > 0, `${flow.length} nodes`);
const deployTime = new Date().toISOString();
const res = await httpReq('POST', '/flows', flow);
test('Deploy succeeds', res.status === 204 || res.status === 200, `HTTP ${res.status}`);
// Wait for init + calibration
console.log(' Waiting 5s for initialization...');
await new Promise((r) => setTimeout(r, 5000));
// Check for errors in logs
const initLogs = getLogs(deployTime);
const initErrors = initLogs.split('\n').filter((l) => l.includes('[ERROR]') || l.includes('Error'));
test('No initialization errors', initErrors.length === 0,
initErrors.length > 0 ? initErrors.slice(0, 3).join('; ') : 'clean');
// ==========================================================
console.log('\n2. NODE INVENTORY');
console.log('─────────────────');
const flows = await fetchJSON(NR_URL + '/flows');
const processTabs = ['demo_tab_wwtp', 'demo_tab_ps_west', 'demo_tab_ps_north', 'demo_tab_ps_south', 'demo_tab_treatment'];
const wwtp = flows.filter((n) => processTabs.includes(n.z));
const byType = {};
wwtp.forEach((n) => {
if (!n.type || n.type === 'tab' || n.type === 'comment') return;
byType[n.type] = (byType[n.type] || 0) + 1;
});
test('Has pumping stations', (byType['pumpingStation'] || 0) === 3, `${byType['pumpingStation'] || 0} PS nodes`);
test('Has rotating machines', (byType['rotatingMachine'] || 0) === 5, `${byType['rotatingMachine'] || 0} pumps`);
test('Has measurements', (byType['measurement'] || 0) >= 15, `${byType['measurement'] || 0} measurement nodes`);
test('Has reactor', (byType['reactor'] || 0) === 1, `${byType['reactor'] || 0} reactor`);
test('Has machineGroupControl', (byType['machineGroupControl'] || 0) >= 1, `${byType['machineGroupControl'] || 0} MGC`);
test('Has inject nodes', (byType['inject'] || 0) >= 10, `${byType['inject'] || 0} injects`);
console.log(` Node types: ${JSON.stringify(byType)}`);
// ==========================================================
console.log('\n3. PS CONFIGURATION');
console.log('───────────────────');
const pss = flows.filter((n) => n.type === 'pumpingStation');
pss.forEach((ps) => {
const vol = Number(ps.basinVolume);
const h = Number(ps.basinHeight);
const hOut = Number(ps.heightOutlet);
const sa = vol / h;
const minVol = hOut * sa;
test(`${ps.name} basin config valid`, vol > 0 && h > 0 && hOut >= 0, `vol=${vol} h=${h} hOut=${hOut}`);
test(`${ps.name} has safety enabled`, ps.enableDryRunProtection === true || ps.enableDryRunProtection === 'true');
});
// Check calibration nodes exist
const calibNodes = flows.filter((n) => n.id && n.id.startsWith('demo_inj_calib_'));
test('Calibration inject nodes exist', calibNodes.length === 3, `${calibNodes.length} calibration nodes`);
// ==========================================================
console.log('\n4. MEASUREMENT SIMULATORS');
console.log('─────────────────────────');
const measurements = flows.filter((n) => n.type === 'measurement' && processTabs.includes(n.z));
const simEnabled = measurements.filter((n) => n.simulator === true || n.simulator === 'true');
test('Measurement simulators enabled', simEnabled.length >= 10, `${simEnabled.length} of ${measurements.length} have sim=true`);
// List measurement nodes
measurements.forEach((m) => {
const sim = m.simulator === true || m.simulator === 'true';
const range = `[${m.o_min}-${m.o_max}] ${m.unit}`;
if (!sim && !m.id.includes('level') && !m.id.includes('pt_')) {
warn(`${m.name || m.id} sim=${sim}`, `range ${range}`);
}
});
// ==========================================================
console.log('\n5. PUMP CONFIGURATION');
console.log('─────────────────────');
const pumps = flows.filter((n) => n.type === 'rotatingMachine' && processTabs.includes(n.z));
pumps.forEach((p) => {
test(`${p.name} has model`, !!p.model, p.model);
test(`${p.name} supplier lowercase`, p.supplier === 'hidrostal', `supplier="${p.supplier}"`);
});
// ==========================================================
console.log('\n6. PRESSURE MEASUREMENTS');
console.log('────────────────────────');
const pts = flows.filter((n) => n.type === 'measurement' && n.id && n.id.includes('_pt_'));
test('6 pressure transmitters', pts.length === 6, `found ${pts.length}`);
pts.forEach((pt) => {
const range = `${pt.o_min}-${pt.o_max} ${pt.unit}`;
const sim = pt.simulator === true || pt.simulator === 'true';
const pos = pt.positionVsParent;
test(`${pt.name} valid`, pt.assetType === 'pressure', `pos=${pos} sim=${sim} range=${range}`);
// Check reasonable pressure ranges (not 0-5000)
if (pos === 'downstream' || pos === 'Downstream') {
test(`${pt.name} realistic range`, Number(pt.o_max) <= 2000, `o_max=${pt.o_max} (should be <=2000)`);
}
});
// ==========================================================
console.log('\n7. RUNTIME BEHAVIOR (30s observation)');
console.log('─────────────────────────────────────');
const obsStart = new Date().toISOString();
// Wait 30 seconds and observe
console.log(' Observing for 30 seconds...');
await new Promise((r) => setTimeout(r, 30000));
const obsLogs = getLogs(obsStart);
const obsLines = obsLogs.split('\n');
// Count message types
const safetyLines = obsLines.filter((l) => l.includes('Safe guard'));
const errorLines = obsLines.filter((l) => l.includes('[ERROR]'));
const monitorLines = obsLines.filter((l) => l.includes('[function:Monitor'));
test('No safety triggers in 30s', safetyLines.length === 0, `${safetyLines.length} triggers`);
test('No errors in 30s', errorLines.length === 0,
errorLines.length > 0 ? errorLines[0].substring(0, 100) : 'clean');
test('Monitor nodes producing data', monitorLines.length > 0, `${monitorLines.length} monitor lines`);
// Parse monitoring data
if (monitorLines.length > 0) {
console.log('\n Monitor data:');
monitorLines.forEach((l) => {
const clean = l.replace(/^\[WARN\] -> /, ' ');
console.log(' ' + clean.trim().substring(0, 150));
});
// Check volume per PS
const psVolumes = {};
monitorLines.forEach((l) => {
const psMatch = l.match(/Monitor (PS \w+)/);
const volMatch = l.match(/vol=([\d.]+)m3/);
if (psMatch && volMatch) {
const ps = psMatch[1];
if (!psVolumes[ps]) psVolumes[ps] = [];
psVolumes[ps].push(parseFloat(volMatch[1]));
}
});
Object.entries(psVolumes).forEach(([ps, vols]) => {
const first = vols[0];
const last = vols[vols.length - 1];
test(`${ps} volume above 0`, first > 0, `vol=${first.toFixed(1)} m3`);
test(`${ps} volume reasonable`, first < 1000, `vol=${first.toFixed(1)} m3`);
if (vols.length >= 2) {
const trend = last - first;
test(`${ps} volume stable/rising`, trend >= -0.5, `${first.toFixed(1)} → ${last.toFixed(1)} m3 (${trend >= 0 ? '+' : ''}${trend.toFixed(2)})`);
}
});
} else {
warn('No monitor data', 'monitoring function nodes may not have fired yet');
}
// ==========================================================
console.log('\n8. WIRING INTEGRITY');
console.log('───────────────────');
// Check all PS have q_in inject
pss.forEach((ps) => {
const qinFn = flows.find((n) => n.wires && n.wires.flat && n.wires.flat().includes(ps.id) && n.type === 'function');
test(`${ps.name} has q_in source`, !!qinFn, qinFn ? qinFn.name : 'none');
});
// Check all pumps have pressure measurements (RAS pump has flow sensor instead)
pumps.forEach((p) => {
const childSensors = flows.filter((n) => n.type === 'measurement' && n.wires && n.wires[2] && n.wires[2].includes(p.id));
const isRAS = p.id === 'demo_pump_ras';
const minSensors = isRAS ? 1 : 2;
test(`${p.name} has ${isRAS ? 'sensors' : 'pressure PTs'}`, childSensors.length >= minSensors,
`${childSensors.length} ${isRAS ? 'sensors' : 'PTs'} (${childSensors.map((pt) => pt.positionVsParent).join(', ')})`);
});
// ==========================================================
console.log('\n═══════════════════════════════════════');
console.log(` Results: ${passed} passed, ${failed} failed, ${warnings} warnings`);
console.log('═══════════════════════════════════════');
if (failed > 0) {
console.log('\n ❌ SOME TESTS FAILED');
process.exit(1);
} else if (warnings > 0) {
console.log('\n ⚠️ ALL TESTS PASSED (with warnings)');
} else {
console.log('\n ✅ ALL TESTS PASSED');
}
})().catch((err) => {
console.error('Test suite failed:', err);
process.exit(1);
});
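Section 3 derives the minimum safe basin volume from the PS config: with vertical walls, the surface area is basinVolume / basinHeight, and the outlet is uncovered once the volume drops below heightOutlet × area. That arithmetic as a sketch (`minBasinVolume` is an illustrative name, not part of the repo):

```javascript
// Minimum volume before the outlet is uncovered, assuming vertical basin walls:
// surfaceArea = totalVolume / totalHeight (m²); minVolume = outletHeight * surfaceArea (m³).
function minBasinVolume(basinVolume, basinHeight, heightOutlet) {
  const surfaceArea = basinVolume / basinHeight;
  return heightOutlet * surfaceArea;
}

console.log(minBasinVolume(60, 4, 0.5)); // 7.5 (m³)
```

A dry-run-protection check can then compare the observed volume against this threshold instead of a hard-coded constant.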

@@ -1,217 +0,0 @@
#!/usr/bin/env node
/**
* Deploy the demo flow fresh and trace the first 60 seconds of behavior.
* Captures: container logs, PS volume evolution, flow events.
*/
const http = require('http');
const fs = require('fs');
const path = require('path');
const { execSync } = require('child_process');
const NR_URL = 'http://localhost:1880';
const FLOW_FILE = path.join(__dirname, '..', 'docker', 'demo-flow.json');
const TRACE_SECONDS = 45;
function httpReq(method, urlPath, body) {
return new Promise((resolve, reject) => {
const parsed = new URL(NR_URL + urlPath);
const opts = {
hostname: parsed.hostname,
port: parsed.port,
path: parsed.pathname,
method,
headers: {
'Content-Type': 'application/json',
'Node-RED-Deployment-Type': 'full',
},
};
if (body) {
const buf = Buffer.from(JSON.stringify(body));
opts.headers['Content-Length'] = buf.length;
}
const req = http.request(opts, (res) => {
const chunks = [];
res.on('data', (c) => chunks.push(c));
res.on('end', () => {
const text = Buffer.concat(chunks).toString();
resolve({ status: res.statusCode, body: text });
});
});
req.on('error', reject);
if (body) req.write(JSON.stringify(body));
req.end();
});
}
function getLogs(since) {
try {
// Get ALL logs since our timestamp
const cmd = `docker logs evolv-nodered --since ${since} 2>&1`;
return execSync(cmd, { encoding: 'utf8', timeout: 5000 });
} catch (e) {
return 'Log error: ' + e.message;
}
}
(async () => {
console.log('=== Deploy & Trace ===');
console.log('Loading flow from', FLOW_FILE);
const flow = JSON.parse(fs.readFileSync(FLOW_FILE, 'utf8'));
console.log(`Flow has ${flow.length} nodes`);
// Deploy
const deployTime = new Date().toISOString();
console.log(`\nDeploying at ${deployTime}...`);
const res = await httpReq('POST', '/flows', flow);
console.log(`Deploy response: ${res.status}`);
if (res.status !== 204 && res.status !== 200) {
console.error('Deploy failed:', res.body);
process.exit(1);
}
// Wait 3 seconds for initial setup
console.log('Waiting 3s for init...\n');
await new Promise((r) => setTimeout(r, 3000));
// Trace loop
const traceStart = Date.now();
const volumeHistory = [];
let lastLogPos = 0;
for (let i = 0; i < Math.ceil(TRACE_SECONDS / 3); i++) {
const elapsed = ((Date.now() - traceStart) / 1000).toFixed(1);
// Get new logs since deploy
const logs = getLogs(deployTime);
const logLines = logs.split('\n');
const newLines = logLines.slice(lastLogPos);
lastLogPos = logLines.length;
// Parse interesting log lines
const safeGuards = [];
const pressureChanges = [];
const modeChanges = [];
const stateChanges = [];
const other = [];
newLines.forEach((line) => {
if (!line.trim()) return;
const volMatch = line.match(/vol=([-\d.]+) m3.*remainingTime=([\w.]+)/);
if (volMatch) {
safeGuards.push({ vol: parseFloat(volMatch[1]), remaining: volMatch[2] });
return;
}
if (line.includes('Pressure change detected')) {
pressureChanges.push(1);
return;
}
if (line.includes('Mode changed') || line.includes('setMode') || line.includes('Control mode')) {
modeChanges.push(line.trim().substring(0, 200));
return;
}
if (line.includes('machine state') || line.includes('State:') || line.includes('startup') || line.includes('shutdown')) {
stateChanges.push(line.trim().substring(0, 200));
return;
}
if (line.includes('q_in') || line.includes('netflow') || line.includes('Volume') ||
line.includes('Height') || line.includes('Level') || line.includes('Controllevel')) {
other.push(line.trim().substring(0, 200));
return;
}
});
console.log(`--- t=${elapsed}s ---`);
if (safeGuards.length > 0) {
const latest = safeGuards[safeGuards.length - 1];
const first = safeGuards[0];
console.log(` SAFETY: ${safeGuards.length} triggers, vol: ${first.vol} → ${latest.vol} m3, remaining: ${latest.remaining}s`);
volumeHistory.push({ t: parseFloat(elapsed), vol: latest.vol });
} else {
console.log(' SAFETY: none (good)');
}
if (pressureChanges.length > 0) {
console.log(` PRESSURE: ${pressureChanges.length} changes`);
}
if (modeChanges.length > 0) {
modeChanges.forEach((m) => console.log(` MODE: ${m}`));
}
if (stateChanges.length > 0) {
stateChanges.slice(-5).forEach((s) => console.log(` STATE: ${s}`));
}
if (other.length > 0) {
other.slice(-5).forEach((o) => console.log(` INFO: ${o}`));
}
console.log('');
await new Promise((r) => setTimeout(r, 3000));
}
// Final analysis
console.log('\n=== Volume Trajectory ===');
volumeHistory.forEach((v) => {
const bar = '#'.repeat(Math.max(0, Math.round(v.vol / 2)));
console.log(` t=${String(v.t).padStart(5)}s: ${String(v.vol.toFixed(2)).padStart(8)} m3 ${bar}`);
});
if (volumeHistory.length >= 2) {
const first = volumeHistory[0];
const last = volumeHistory[volumeHistory.length - 1];
const dt = last.t - first.t;
const dv = last.vol - first.vol;
const rate = dt > 0 ? (dv / dt * 3600).toFixed(1) : 'N/A';
console.log(`\n Rate: ${rate} m3/h (${dv > 0 ? 'FILLING' : 'DRAINING'})`);
}
// Get ALL logs for comprehensive analysis
console.log('\n=== Full Log Analysis ===');
const allLogs = getLogs(deployTime);
const allLines = allLogs.split('\n');
// Count different message types
const counts = { safety: 0, pressure: 0, mode: 0, state: 0, error: 0, warn: 0, flow: 0 };
allLines.forEach((l) => {
if (l.includes('Safe guard')) counts.safety++;
if (l.includes('Pressure change')) counts.pressure++;
if (l.includes('Mode') || l.includes('mode')) counts.mode++;
if (l.includes('startup') || l.includes('shutdown') || l.includes('machine state')) counts.state++;
if (l.includes('[ERROR]') || l.includes('Error')) counts.error++;
if (l.includes('[WARN]')) counts.warn++;
if (l.includes('netflow') || l.includes('q_in') || l.includes('flow')) counts.flow++;
});
console.log('Message counts:', JSON.stringify(counts, null, 2));
// Print errors
const errors = allLines.filter((l) => l.includes('[ERROR]') || l.includes('Error'));
if (errors.length > 0) {
console.log('\nErrors:');
errors.slice(0, 20).forEach((e) => console.log(' ' + e.trim().substring(0, 200)));
}
// Print first few non-pressure, non-safety lines
console.log('\nKey events (first 30):');
let keyCount = 0;
allLines.forEach((l) => {
if (keyCount >= 30) return;
if (l.includes('Pressure change detected')) return;
if (l.includes('Safe guard triggered')) return;
if (!l.trim()) return;
console.log(' ' + l.trim().substring(0, 200));
keyCount++;
});
})().catch((err) => {
console.error('Failed:', err);
process.exit(1);
});
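The rate computation at the end of the trace — m³/h from the first and last volume samples — can be factored into a small pure function; a sketch (`volumeRate` is an illustrative name):

```javascript
// Fill/drain rate in m³/h from a volume history of { t: seconds, vol: m³ } samples.
// Returns null when fewer than two samples exist or time has not advanced.
function volumeRate(history) {
  if (history.length < 2) return null;
  const first = history[0];
  const last = history[history.length - 1];
  const dt = last.t - first.t; // seconds
  if (dt <= 0) return null;
  return ((last.vol - first.vol) / dt) * 3600; // m³/s → m³/h
}

console.log(volumeRate([{ t: 0, vol: 10 }, { t: 8, vol: 10.5 }])); // 225 (filling)
```

A positive result means the basin is filling, negative means draining, matching the FILLING/DRAINING label in the trace output.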

@@ -1,36 +0,0 @@
#!/usr/bin/env node
/**
* Fix asset selection in demo-flow.json so editor dropdowns correctly
* pre-select the configured supplier/type/model when a node is opened.
*
* Issues fixed:
* 1. Pump nodes: supplier "Hidrostal" → "hidrostal" (matches machine.json id)
* 2. demo_meas_flow: assetType "flow-electromagnetic" → "flow" (matches measurement.json type id)
*/
const fs = require('fs');
const path = require('path');
const flowPath = path.join(__dirname, '..', 'docker', 'demo-flow.json');
const flow = JSON.parse(fs.readFileSync(flowPath, 'utf8'));
let changes = 0;
flow.forEach(node => {
// Fix 1: Pump supplier id mismatch
if (node.type === 'rotatingMachine' && node.supplier === 'Hidrostal') {
node.supplier = 'hidrostal';
changes++;
console.log(`Fixed pump ${node.id}: supplier "Hidrostal" → "hidrostal"`);
}
// Fix 2: Standardize flow measurement assetType
if (node.type === 'measurement' && node.assetType === 'flow-electromagnetic') {
node.assetType = 'flow';
changes++;
console.log(`Fixed ${node.id}: assetType "flow-electromagnetic" → "flow"`);
}
});
fs.writeFileSync(flowPath, JSON.stringify(flow, null, 2) + '\n');
console.log(`\nDone. ${changes} node(s) updated.`);
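Both fixes follow the same read-modify-write migration shape: match nodes by type, rewrite one field, count the changes. Generalized as a sketch (`migrateField` is an illustrative helper, not part of the repo):

```javascript
// Rewrite one field on all nodes of a given type; returns how many nodes changed.
function migrateField(flow, type, field, from, to) {
  let changes = 0;
  for (const node of flow) {
    if (node.type === type && node[field] === from) {
      node[field] = to;
      changes++;
    }
  }
  return changes;
}

const nodes = [
  { id: 'p1', type: 'rotatingMachine', supplier: 'Hidrostal' },
  { id: 'm1', type: 'measurement', assetType: 'flow' },
];
console.log(migrateField(nodes, 'rotatingMachine', 'supplier', 'Hidrostal', 'hidrostal')); // 1
```

Because the helper returns a change count, a migration script can skip the file write when nothing matched.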

@@ -1,243 +0,0 @@
#!/usr/bin/env node
/**
* Fix display issues:
* 1. Set positionIcon on all nodes based on positionVsParent
* 2. Switch reactor from CSTR to PFR with proper length/resolution
* 3. Add missing default fields to all dashboard widgets (gauges, sliders, button-groups)
*/
const fs = require('fs');
const path = require('path');
const FLOW_PATH = path.join(__dirname, '..', 'docker', 'demo-flow.json');
const flow = JSON.parse(fs.readFileSync(FLOW_PATH, 'utf8'));
const byId = (id) => flow.find(n => n.id === id);
// =============================================
// FIX 1: positionIcon on all process nodes
// =============================================
// Icon mapping from physicalPosition.js
const positionIconMap = {
'upstream': '→',
'atEquipment': '⊥',
'downstream': '←',
};
let iconFixed = 0;
for (const node of flow) {
if (node.positionVsParent !== undefined && node.positionVsParent !== '') {
const icon = positionIconMap[node.positionVsParent];
if (icon && node.positionIcon !== icon) {
node.positionIcon = icon;
iconFixed++;
}
}
// Also ensure positionIcon has a fallback if positionVsParent is set
if (node.positionVsParent && !node.positionIcon) {
node.positionIcon = positionIconMap[node.positionVsParent] || '⊥';
iconFixed++;
}
}
console.log(`Fixed positionIcon on ${iconFixed} nodes`);
// =============================================
// FIX 2: Switch reactor from CSTR to PFR
// =============================================
const reactor = byId('demo_reactor');
if (reactor) {
reactor.reactor_type = 'PFR';
reactor.length = 50; // 50m plug flow reactor
reactor.resolution_L = 10; // 10 slices for spatial resolution
reactor.alpha = 0; // Danckwerts BC (dispersive flow, more realistic)
console.log(`Switched reactor to PFR: length=${reactor.length}m, resolution=${reactor.resolution_L} slices`);
// Update influent measurements with positions along the reactor
// FT-001 at inlet (position 0), DO-001 at 1/3, NH4-001 at 2/3
const measFlow = byId('demo_meas_flow');
if (measFlow) {
measFlow.hasDistance = true;
measFlow.distance = 0; // at inlet
measFlow.distanceUnit = 'm';
measFlow.distanceDescription = 'reactor inlet';
measFlow.positionVsParent = 'upstream';
measFlow.positionIcon = '→';
console.log(' FT-001 positioned at reactor inlet (0m)');
}
const measDo = byId('demo_meas_do');
if (measDo) {
measDo.hasDistance = true;
measDo.distance = 15; // 15m along the reactor (30% of length)
measDo.distanceUnit = 'm';
measDo.distanceDescription = 'aeration zone';
measDo.positionVsParent = 'atEquipment';
measDo.positionIcon = '⊥';
console.log(' DO-001 positioned at 15m (aeration zone)');
}
const measNh4 = byId('demo_meas_nh4');
if (measNh4) {
measNh4.hasDistance = true;
measNh4.distance = 35; // 35m along the reactor (70% of length)
measNh4.distanceUnit = 'm';
measNh4.distanceDescription = 'post-aeration zone';
measNh4.positionVsParent = 'atEquipment';
measNh4.positionIcon = '⊥';
console.log(' NH4-001 positioned at 35m (post-aeration zone)');
}
}
// =============================================
// FIX 3: Add missing defaults to dashboard widgets
// =============================================
// --- ui-gauge: add missing fields ---
const gaugeDefaults = {
value: 'payload',
valueType: 'msg',
sizeThickness: 16,
sizeGap: 4,
sizeKeyThickness: 8,
styleRounded: true,
styleGlow: false,
alwaysShowTitle: false,
floatingTitlePosition: 'top-left',
icon: '',
};
let gaugeFixed = 0;
for (const node of flow) {
if (node.type !== 'ui-gauge') continue;
let changed = false;
for (const [key, defaultVal] of Object.entries(gaugeDefaults)) {
if (node[key] === undefined) {
node[key] = defaultVal;
changed = true;
}
}
// Ensure className exists
if (node.className === undefined) node.className = '';
// Ensure outputs (gauges have 1 output in newer versions)
if (changed) gaugeFixed++;
}
console.log(`Fixed ${gaugeFixed} ui-gauge nodes with missing defaults`);
// --- ui-button-group: add missing fields ---
const buttonGroupDefaults = {
rounded: true,
useThemeColors: true,
topic: 'topic',
topicType: 'msg',
className: '',
};
let bgFixed = 0;
for (const node of flow) {
if (node.type !== 'ui-button-group') continue;
let changed = false;
for (const [key, defaultVal] of Object.entries(buttonGroupDefaults)) {
if (node[key] === undefined) {
node[key] = defaultVal;
changed = true;
}
}
// Ensure options have valueType
if (node.options && Array.isArray(node.options)) {
for (const opt of node.options) {
if (!opt.valueType) opt.valueType = 'str';
}
}
if (changed) bgFixed++;
}
console.log(`Fixed ${bgFixed} ui-button-group nodes with missing defaults`);
// --- ui-slider: add missing fields ---
const sliderDefaults = {
topic: 'topic',
topicType: 'msg',
thumbLabel: true,
showTicks: 'always',
className: '',
iconPrepend: '',
iconAppend: '',
color: '',
colorTrack: '',
colorThumb: '',
showTextField: false,
};
let sliderFixed = 0;
for (const node of flow) {
if (node.type !== 'ui-slider') continue;
let changed = false;
for (const [key, defaultVal] of Object.entries(sliderDefaults)) {
if (node[key] === undefined) {
node[key] = defaultVal;
changed = true;
}
}
if (changed) sliderFixed++;
}
console.log(`Fixed ${sliderFixed} ui-slider nodes with missing defaults`);
// --- ui-chart: add missing fields ---
const chartDefaults = {
className: '',
};
let chartFixed = 0;
for (const node of flow) {
if (node.type !== 'ui-chart') continue;
let changed = false;
for (const [key, defaultVal] of Object.entries(chartDefaults)) {
if (node[key] === undefined) {
node[key] = defaultVal;
changed = true;
}
}
if (changed) chartFixed++;
}
console.log(`Fixed ${chartFixed} ui-chart nodes with missing defaults`);
// --- ui-template: add missing fields ---
for (const node of flow) {
if (node.type !== 'ui-template') continue;
if (node.templateScope === undefined) node.templateScope = 'local';
if (node.className === undefined) node.className = '';
}
// --- ui-text: add missing fields ---
for (const node of flow) {
if (node.type !== 'ui-text') continue;
if (node.className === undefined) node.className = '';
}
// =============================================
// Validate
// =============================================
const allIds = new Set(flow.map(n => n.id));
let issues = 0;
for (const n of flow) {
if (!n.wires) continue;
for (const port of n.wires) {
for (const target of port) {
if (!allIds.has(target)) {
console.warn(`BROKEN WIRE: ${n.id} → ${target}`);
issues++;
}
}
}
}
if (issues === 0) console.log('All wire references valid ✓');
// List all nodes with positionIcon to verify
console.log('\nNodes with positionIcon:');
for (const n of flow) {
if (n.positionIcon) {
console.log(` ${n.positionIcon} ${n.name || n.id} (${n.positionVsParent})`);
}
}
// Write
fs.writeFileSync(FLOW_PATH, JSON.stringify(flow, null, 2) + '\n');
console.log(`\nWrote ${FLOW_PATH} (${flow.length} nodes)`);
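Each widget fix above is the same operation: fill in missing keys from a defaults map without overwriting values that are already set. Factored out as a sketch (`applyDefaults` is an illustrative name):

```javascript
// Fill missing keys on a node from a defaults map; returns true if anything changed.
// Existing values (including falsy ones like '' or false) are never overwritten.
function applyDefaults(node, defaults) {
  let changed = false;
  for (const [key, val] of Object.entries(defaults)) {
    if (node[key] === undefined) {
      node[key] = val;
      changed = true;
    }
  }
  return changed;
}

const gauge = { type: 'ui-gauge', sizeThickness: 20 };
console.log(applyDefaults(gauge, { sizeThickness: 16, sizeGap: 4 })); // true
console.log(gauge.sizeThickness); // 20 (existing value kept)
console.log(gauge.sizeGap);       // 4 (default filled in)
```

With this helper, the per-widget loops collapse to one call per node type with the matching defaults object.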

@@ -1,154 +0,0 @@
#!/usr/bin/env node
/**
* Fix layout of demo-flow.json so nodes are nicely grouped and don't overlap.
*
* Layout structure (on demo_tab_wwtp):
*
* Row 1 (y=40-300): PS West section (comment, mode injects, pumps, MGC, PS, q_in sim)
* Row 2 (y=340-500): PS North section
* Row 3 (y=520-680): PS South section
* Row 4 (y=720-920): Biological Treatment (measurements, reactor, settler, monster)
* Row 5 (y=960-1120): Pressure Measurements section
* Row 6 (y=1140-1440): Effluent measurements
* Row 7 (y=1460+): Telemetry & Dashboard API
*
* Column layout:
* x=140: Inject nodes (left)
* x=370: Function nodes
* x=580: Intermediate nodes (measurements feeding other nodes)
* x=700: Main equipment nodes (PS, pumps, measurement nodes)
* x=935: Link out nodes
* x=1050+: Right side (reactor, settler, telemetry)
*/
const fs = require('fs');
const path = require('path');
const flowPath = path.join(__dirname, '..', 'docker', 'demo-flow.json');
const flow = JSON.parse(fs.readFileSync(flowPath, 'utf8'));
function setPos(id, x, y) {
const node = flow.find(n => n.id === id);
if (node) {
node.x = x;
node.y = y;
} else {
console.warn('Layout: node not found:', id);
}
}
// === PS West section (y: 40-300) ===
setPos('demo_comment_ps', 340, 40);
// Mode + q_in injects (left column)
setPos('demo_inj_w1_mode', 140, 80);
setPos('demo_inj_w2_mode', 140, 260);
setPos('demo_inj_west_mode', 140, 160);
setPos('demo_inj_west_flow', 140, 200);
// q_in function node
setPos('demo_fn_west_flow_sim', 370, 200);
// MGC sits between PS and pumps
setPos('demo_pump_w1', 700, 100);
setPos('demo_mgc_west', 700, 180);
setPos('demo_pump_w2', 700, 260);
setPos('demo_ps_west', 940, 180);
// === PS North section (y: 340-500) ===
setPos('demo_comment_ps_north', 330, 340);
setPos('demo_inj_n1_mode', 140, 380);
setPos('demo_inj_north_mode', 140, 420);
setPos('demo_inj_north_flow', 140, 460);
setPos('demo_fn_north_flow_sim', 370, 460);
// North outflow measurement
setPos('demo_comment_north_outflow', 200, 500);
setPos('demo_meas_ft_n1', 580, 500);
setPos('demo_pump_n1', 700, 400);
setPos('demo_ps_north', 940, 440);
// === PS South section (y: 540-680) ===
setPos('demo_comment_ps_south', 320, 540);
setPos('demo_inj_s1_mode', 140, 580);
setPos('demo_inj_south_mode', 140, 620);
setPos('demo_inj_south_flow', 140, 660);
setPos('demo_fn_south_flow_sim', 370, 660);
setPos('demo_pump_s1', 700, 580);
setPos('demo_ps_south', 940, 620);
// === Biological Treatment (y: 720-920) ===
setPos('demo_comment_treatment', 200, 720);
setPos('demo_meas_flow', 700, 760);
setPos('demo_meas_do', 700, 820);
setPos('demo_meas_nh4', 700, 880);
setPos('demo_reactor', 1100, 820);
setPos('demo_inj_reactor_tick', 900, 760);
setPos('demo_settler', 1100, 920);
setPos('demo_monster', 1100, 1000);
setPos('demo_inj_monster_flow', 850, 1000);
setPos('demo_fn_monster_flow', 930, 1040);
// === Pressure Measurements (y: 960-1120) — new section ===
setPos('demo_comment_pressure', 320, 960);
// West pressure (grouped together)
setPos('demo_fn_level_to_pressure_w', 370, 1000);
setPos('demo_meas_pt_w_up', 580, 1000);
setPos('demo_meas_pt_w_down', 580, 1040);
// North pressure
setPos('demo_fn_level_to_pressure_n', 370, 1080);
setPos('demo_meas_pt_n_up', 580, 1080);
setPos('demo_meas_pt_n_down', 580, 1120);
// South pressure
setPos('demo_fn_level_to_pressure_s', 370, 1160);
setPos('demo_meas_pt_s_up', 580, 1160);
setPos('demo_meas_pt_s_down', 580, 1200);
// === Effluent Measurements (y: 1240-1520) ===
setPos('demo_comment_effluent_meas', 300, 1240);
setPos('demo_meas_eff_flow', 700, 1280);
setPos('demo_meas_eff_do', 700, 1340);
setPos('demo_meas_eff_nh4', 700, 1400);
setPos('demo_meas_eff_no3', 700, 1460);
setPos('demo_meas_eff_tss', 700, 1520);
// === Telemetry section (right side, y: 40-240) ===
setPos('demo_comment_telemetry', 1300, 40);
setPos('demo_link_influx_out', 1135, 500);
setPos('demo_link_influx_in', 1175, 100);
setPos('demo_fn_influx_convert', 1350, 100);
setPos('demo_http_influx', 1560, 100);
setPos('demo_fn_influx_count', 1740, 100);
// Process debug
setPos('demo_comment_process_out', 1300, 160);
setPos('demo_link_process_out', 1135, 540);
setPos('demo_link_process_in', 1175, 200);
setPos('demo_dbg_process', 1360, 200);
setPos('demo_dbg_registration', 1370, 240);
// Dashboard link outs
setPos('demo_link_ps_west_dash', 1135, 160);
setPos('demo_link_ps_north_dash', 1135, 420);
setPos('demo_link_ps_south_dash', 1135, 600);
setPos('demo_link_reactor_dash', 1300, 820);
setPos('demo_link_meas_dash', 1135, 860);
setPos('demo_link_eff_meas_dash', 1135, 1300);
// Dashboard API
setPos('demo_dashapi', 1100, 1100);
setPos('demo_inj_dashapi', 850, 1100);
setPos('demo_http_grafana', 1300, 1100);
setPos('demo_dbg_grafana', 1500, 1100);
// InfluxDB status link
setPos('demo_link_influx_status_out', 1940, 100);
fs.writeFileSync(flowPath, JSON.stringify(flow, null, 2) + '\n');
console.log('Layout fixed. Deploying...');


@@ -1,103 +0,0 @@
#!/usr/bin/env node
/**
* Add initial volume calibration inject nodes to the demo flow.
*
* Problem: All 3 pumping stations start with initial volume = minVol,
* which is below the dryRun safety threshold. This causes the safety
* guard to trigger immediately on every tick, preventing normal control.
*
* Fix: Add inject nodes that fire once at deploy, sending
* calibratePredictedVolume to each PS with a reasonable starting volume.
*
* PS West: 500m3 basin, startLevel=2.5m → start at 200m3 (level 1.6m)
* Below startLevel, pumps stay off. q_in fills basin naturally.
* PS North: 200m3 basin, flowbased → start at 100m3 (50% fill)
* PS South: 100m3 basin, manual → start at 50m3 (50% fill)
*/
const fs = require('fs');
const path = require('path');
const flowPath = path.join(__dirname, '..', 'docker', 'demo-flow.json');
const flow = JSON.parse(fs.readFileSync(flowPath, 'utf8'));
// Check if calibration nodes already exist
const existingCalib = flow.filter(n => n.id && n.id.startsWith('demo_inj_calib_'));
if (existingCalib.length > 0) {
console.log('Calibration nodes already exist:', existingCalib.map(n => n.id));
console.log('Removing existing calibration nodes first...');
for (const node of existingCalib) {
const idx = flow.findIndex(n => n.id === node.id);
if (idx !== -1) flow.splice(idx, 1);
}
}
// Find the WWTP tab for positioning
const wwtpTab = flow.find(n => n.id === 'demo_tab_wwtp');
if (!wwtpTab) {
console.error('WWTP tab not found!');
process.exit(1);
}
// Calibration configs: { ps_id, name, volume, x, y }
const calibrations = [
{
id: 'demo_inj_calib_west',
name: 'Cal: PS West → 200m3',
target: 'demo_ps_west',
volume: 200,
x: 100, y: 50,
},
{
id: 'demo_inj_calib_north',
name: 'Cal: PS North → 100m3',
target: 'demo_ps_north',
volume: 100,
x: 100, y: 100,
},
{
id: 'demo_inj_calib_south',
name: 'Cal: PS South → 50m3',
target: 'demo_ps_south',
volume: 50,
x: 100, y: 150,
},
];
let added = 0;
calibrations.forEach(cal => {
const injectNode = {
id: cal.id,
type: 'inject',
z: 'demo_tab_wwtp',
name: cal.name,
props: [
{
p: 'payload',
vt: 'num',
},
{
p: 'topic',
vt: 'str',
},
],
repeat: '',
crontab: '',
once: true,
onceDelay: '0.5',
topic: 'calibratePredictedVolume',
payload: String(cal.volume),
payloadType: 'num',
x: cal.x,
y: cal.y,
wires: [[cal.target]],
};
flow.push(injectNode);
added++;
console.log(`Added ${cal.id}: ${cal.name} -> ${cal.target} (${cal.volume} m3)`);
});
fs.writeFileSync(flowPath, JSON.stringify(flow, null, 2) + '\n');
console.log(`\nDone. ${added} calibration node(s) added.`);
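The doc comment above maps calibration volumes to basin levels (200 m3 → 1.6 m for PS West). That ratio implies a vertical-walled basin about 4 m deep; a minimal sketch of the conversion, with a hypothetical helper name not part of the pumpingStation node API:

```javascript
// Sketch: volume-to-level for a vertical-walled basin.
// Assumption: constant cross-section, area = maxVolume / maxHeight.
function levelForVolume(volume, maxVolume, maxHeight) {
  const area = maxVolume / maxHeight; // m2
  return volume / area;               // m
}

// PS West: 500 m3 basin, ~4 m deep -> 200 m3 sits at 1.6 m,
// safely below startLevel = 2.5 m, so pumps stay off after calibration.
console.log(levelForVolume(200, 500, 4)); // 1.6
```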


@@ -1,25 +0,0 @@
const fs = require("fs");
const flowPath = "docker/demo-flow.json";
const flow = JSON.parse(fs.readFileSync(flowPath, "utf8"));
let newFlow = flow.filter(n => n.id !== "demo_dbg_reactor_inspect");
const reactor = newFlow.find(n => n.id === "demo_reactor");
if (!reactor) {
console.error("demo_reactor not found!");
process.exit(1);
}
reactor.wires[0] = reactor.wires[0].filter(id => id !== "demo_dbg_reactor_inspect");
reactor.kla = 70;
newFlow.push({
id: "demo_dbg_reactor_inspect",
type: "function",
z: "demo_tab_treatment",
name: "Reactor State Inspector",
func: 'if (msg.topic !== "GridProfile") return null;\nconst p = msg.payload;\nif (!p || !p.grid) return null;\nconst now = Date.now();\nif (global.get("lastInspect") && now - global.get("lastInspect") < 5000) return null;\nglobal.set("lastInspect", now);\nconst profile = p.grid.map((row, i) => "cell" + i + "(" + (i*p.d_x).toFixed(0) + "m): NH4=" + row[3].toFixed(2) + " DO=" + row[0].toFixed(2));\nnode.warn("GRID: " + profile.join(" | "));\nreturn null;',
outputs: 1,
x: 840,
y: 320,
wires: [[]]
});
reactor.wires[0].push("demo_dbg_reactor_inspect");
fs.writeFileSync(flowPath, JSON.stringify(newFlow, null, 2) + "\n");
console.log("kla:", reactor.kla, "X_A_init:", reactor.X_A_init);
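The inspector's label format can be checked in isolation. A sketch using a hypothetical 2-cell GridProfile payload, with the grid layout assumed from how the inspector reads it (row[0] = DO, row[3] = NH4, d_x = cell spacing in m):

```javascript
// Hypothetical GridProfile payload; only the fields the inspector touches.
const p = { d_x: 10, grid: [[2.0, 0, 0, 15.0], [1.5, 0, 0, 8.25]] };

// Same mapping as the inspector function above.
const profile = p.grid.map((row, i) =>
  'cell' + i + '(' + (i * p.d_x).toFixed(0) + 'm): NH4=' +
  row[3].toFixed(2) + ' DO=' + row[0].toFixed(2));

console.log(profile.join(' | '));
// cell0(0m): NH4=15.00 DO=2.00 | cell1(10m): NH4=8.25 DO=1.50
```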


@@ -1,72 +0,0 @@
#!/usr/bin/env node
/**
* Fix downstream pressure simulator ranges and add a monitoring debug node.
*
* Problems found:
* 1. Downstream pressure simulator range 0-5000 mbar is unrealistic.
* Real WWTP system backpressure: 800-1500 mbar (0.8-1.5 bar).
* The pump curve operates in 700-3900 mbar. With upstream ~300 mbar
* (hydrostatic from 3m basin) and downstream at 5000 mbar, the
* pressure differential pushes the curve to extreme predictions.
*
* 2. No way to see runtime state visually. We'll leave visual monitoring
* to the Grafana/dashboard layer, but fix the root cause here.
*
* Fix: Set downstream pressure simulators to realistic ranges:
* - West: o_min=800, o_max=1500, i_min=800, i_max=1500
* - North: o_min=600, o_max=1200, i_min=600, i_max=1200
* - South: o_min=500, o_max=1000, i_min=500, i_max=1000
*
* This keeps pressure differential in ~500-1200 mbar range,
* well within the pump curve (700-3900 mbar).
*/
const fs = require('fs');
const path = require('path');
const flowPath = path.join(__dirname, '..', 'docker', 'demo-flow.json');
const flow = JSON.parse(fs.readFileSync(flowPath, 'utf8'));
let changes = 0;
// Fix downstream pressure simulator ranges
const pressureFixes = {
'demo_meas_pt_w_down': { i_min: 800, i_max: 1500, o_min: 800, o_max: 1500 },
'demo_meas_pt_n_down': { i_min: 600, i_max: 1200, o_min: 600, o_max: 1200 },
'demo_meas_pt_s_down': { i_min: 500, i_max: 1000, o_min: 500, o_max: 1000 },
};
flow.forEach(node => {
const fix = pressureFixes[node.id];
if (fix) {
const old = { i_min: node.i_min, i_max: node.i_max, o_min: node.o_min, o_max: node.o_max };
Object.assign(node, fix);
console.log(`Fixed ${node.id} "${node.name}":`);
console.log(` Was: i=[${old.i_min},${old.i_max}] o=[${old.o_min},${old.o_max}]`);
console.log(` Now: i=[${fix.i_min},${fix.i_max}] o=[${fix.o_min},${fix.o_max}]`);
changes++;
}
});
// Also fix upstream pressure ranges to match realistic hydrostatic range
// Basin level 0-4m → hydrostatic 0-392 mbar → use 0-500 mbar range
const upstreamFixes = {
'demo_meas_pt_w_up': { i_min: 0, i_max: 500, o_min: 0, o_max: 500 },
'demo_meas_pt_n_up': { i_min: 0, i_max: 400, o_min: 0, o_max: 400 },
'demo_meas_pt_s_up': { i_min: 0, i_max: 300, o_min: 0, o_max: 300 },
};
flow.forEach(node => {
const fix = upstreamFixes[node.id];
if (fix) {
const old = { i_min: node.i_min, i_max: node.i_max, o_min: node.o_min, o_max: node.o_max };
Object.assign(node, fix);
console.log(`Fixed ${node.id} "${node.name}":`);
console.log(` Was: i=[${old.i_min},${old.i_max}] o=[${old.o_min},${old.o_max}]`);
console.log(` Now: i=[${fix.i_min},${fix.i_max}] o=[${fix.o_min},${fix.o_max}]`);
changes++;
}
});
fs.writeFileSync(flowPath, JSON.stringify(flow, null, 2) + '\n');
console.log(`\nDone. ${changes} node(s) updated.`);
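The "basin level 0-4m → hydrostatic 0-392 mbar" range in the comment above follows from p = ρgh for a water column; a minimal sketch of the conversion:

```javascript
// Hydrostatic pressure of a water column in mbar.
// p [Pa] = rho * g * h; 1 mbar = 100 Pa.
const hydrostaticMbar = (h) => 1000 * 9.81 * h / 100;

console.log(Math.round(hydrostaticMbar(3))); // 294 mbar at 3 m
console.log(Math.round(hydrostaticMbar(4))); // 392 mbar at 4 m
```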


@@ -1,142 +0,0 @@
#!/usr/bin/env node
/**
* Monitor WWTP system health and process state.
* Captures PS volume, flow rates, pump states, and control actions.
*/
const { execSync } = require('child_process');
const SAMPLE_INTERVAL = 5000;
const NUM_SAMPLES = 20; // 100 seconds
function getLogs(lines = 50) {
try {
return execSync('docker logs evolv-nodered --tail ' + lines + ' 2>&1', {
encoding: 'utf8', timeout: 5000,
});
} catch (e) { return ''; }
}
function parseLogs(logs) {
const result = { safety: [], pressure: 0, control: [], state: [], errors: [], flow: [] };
logs.split('\n').forEach(line => {
if (!line.trim()) return;
const volMatch = line.match(/vol=([-\d.]+) m3.*remainingTime=([\w.]+)/);
if (volMatch) {
result.safety.push({ vol: parseFloat(volMatch[1]), remaining: volMatch[2] });
return;
}
if (line.includes('Pressure change detected')) { result.pressure++; return; }
if (line.includes('Controllevel') || line.includes('flowbased') || line.includes('control applying')) {
result.control.push(line.trim().substring(0, 200));
return;
}
if (line.includes('startup') || line.includes('shutdown') || line.includes('machine state') ||
line.includes('Handling input') || line.includes('execSequence') || line.includes('execsequence')) {
result.state.push(line.trim().substring(0, 200));
return;
}
if (line.includes('[ERROR]') || line.includes('Error')) {
result.errors.push(line.trim().substring(0, 200));
return;
}
if (line.includes('netflow') || line.includes('Height') || line.includes('flow')) {
result.flow.push(line.trim().substring(0, 200));
}
});
return result;
}
(async () => {
console.log('=== WWTP Health Monitor ===');
console.log(`Sampling every ${SAMPLE_INTERVAL/1000}s for ${NUM_SAMPLES * SAMPLE_INTERVAL / 1000}s\n`);
const history = [];
for (let i = 0; i < NUM_SAMPLES; i++) {
const elapsed = (i * SAMPLE_INTERVAL / 1000).toFixed(0);
const logs = getLogs(40);
const parsed = parseLogs(logs);
console.log(`--- Sample ${i+1}/${NUM_SAMPLES} (t=${elapsed}s) ---`);
// Safety status
if (parsed.safety.length > 0) {
const latest = parsed.safety[parsed.safety.length - 1];
console.log(` ⚠️ SAFETY: ${parsed.safety.length} triggers, vol=${latest.vol} m3`);
} else {
console.log(' ✅ SAFETY: OK');
}
// Pressure changes
if (parsed.pressure > 0) {
console.log(` 📊 PRESSURE: ${parsed.pressure} changes (sim active)`);
}
// Control actions
if (parsed.control.length > 0) {
parsed.control.slice(-3).forEach(c => console.log(` 🎛️ CONTROL: ${c}`));
}
// State changes
if (parsed.state.length > 0) {
parsed.state.slice(-3).forEach(s => console.log(` 🔄 STATE: ${s}`));
}
// Flow info
if (parsed.flow.length > 0) {
parsed.flow.slice(-2).forEach(f => console.log(` 💧 FLOW: ${f}`));
}
// Errors
if (parsed.errors.length > 0) {
parsed.errors.forEach(e => console.log(` ❌ ERROR: ${e}`));
}
history.push({
t: parseInt(elapsed),
safety: parsed.safety.length,
pressure: parsed.pressure,
control: parsed.control.length,
state: parsed.state.length,
errors: parsed.errors.length,
});
console.log('');
if (i < NUM_SAMPLES - 1) {
await new Promise(r => setTimeout(r, SAMPLE_INTERVAL));
}
}
// Summary
console.log('\n=== Health Summary ===');
const totalSafety = history.reduce((a, h) => a + h.safety, 0);
const totalErrors = history.reduce((a, h) => a + h.errors, 0);
const totalControl = history.reduce((a, h) => a + h.control, 0);
const totalState = history.reduce((a, h) => a + h.state, 0);
console.log(`Safety triggers: ${totalSafety} ${totalSafety === 0 ? '✅' : '⚠️'}`);
console.log(`Errors: ${totalErrors} ${totalErrors === 0 ? '✅' : '❌'}`);
console.log(`Control actions: ${totalControl}`);
console.log(`State changes: ${totalState}`);
if (totalSafety === 0 && totalErrors === 0) {
console.log('\n🟢 SYSTEM HEALTHY');
} else if (totalErrors > 0) {
console.log('\n🔴 ERRORS DETECTED');
} else {
console.log('\n🟡 SAFETY ACTIVE (may be normal during startup)');
}
})().catch(err => {
console.error('Monitor failed:', err);
process.exit(1);
});
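The safety regex in parseLogs can be exercised against a sample line. The exact log wording below is an assumption; only the `vol=` and `remainingTime=` fields the regex captures matter:

```javascript
// Hypothetical log line in the format the monitor's regex expects.
const line = 'Safe guard triggered: vol=-3.25 m3, remainingTime=12.5';
const m = line.match(/vol=([-\d.]+) m3.*remainingTime=([\w.]+)/);

console.log(parseFloat(m[1]), m[2]); // -3.25 12.5
```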


@@ -1,158 +0,0 @@
#!/usr/bin/env node
/**
* Monitor WWTP runtime via Node-RED debug WebSocket and container logs.
* Captures process data every few seconds and displays trends.
*/
const http = require('http');
const { execSync } = require('child_process');
const NR_URL = 'http://localhost:1880';
const SAMPLE_INTERVAL = 5000; // ms
const NUM_SAMPLES = 12; // 60 seconds total
function fetchJSON(url) {
return new Promise((resolve, reject) => {
http.get(url, res => {
const chunks = [];
res.on('data', c => chunks.push(c));
res.on('end', () => {
try { resolve(JSON.parse(Buffer.concat(chunks))); }
catch (e) { reject(new Error('Parse: ' + e.message)); }
});
}).on('error', reject);
});
}
function getRecentLogs(lines = 50) {
try {
return execSync('docker logs evolv-nodered --tail ' + lines + ' 2>&1', {
encoding: 'utf8',
timeout: 5000,
});
} catch (e) {
return 'Failed to get logs: ' + e.message;
}
}
function parseSafeGuardLogs(logs) {
const lines = logs.split('\n');
const safeGuards = [];
const pressures = [];
const others = [];
lines.forEach(line => {
const volMatch = line.match(/Safe guard triggered: vol=([-\d.]+) m3/);
if (volMatch) {
safeGuards.push(parseFloat(volMatch[1]));
}
const pressMatch = line.match(/New f =([\d.]+) is constrained/);
if (pressMatch) {
pressures.push(parseFloat(pressMatch[1]));
}
if (line.includes('_controlLevelBased') || line.includes('Mode changed') ||
line.includes('execSequence') || line.includes('startup') ||
line.includes('shutdown') || line.includes('setMode')) {
others.push(line.trim().substring(0, 200));
}
});
return { safeGuards, pressures, others };
}
(async () => {
console.log('=== WWTP Runtime Monitor ===');
console.log('Capturing ' + NUM_SAMPLES + ' samples at ' + (SAMPLE_INTERVAL/1000) + 's intervals\n');
// Wait for nodes to initialize after deploy
console.log('Waiting 10s for nodes to initialize...\n');
await new Promise(r => setTimeout(r, 10000));
for (let i = 0; i < NUM_SAMPLES; i++) {
const elapsed = (i * SAMPLE_INTERVAL / 1000 + 10).toFixed(0);
console.log('--- Sample ' + (i+1) + '/' + NUM_SAMPLES + ' (t=' + elapsed + 's after deploy) ---');
// Capture container logs (last 30 lines since last sample)
const logs = getRecentLogs(30);
const parsed = parseSafeGuardLogs(logs);
if (parsed.safeGuards.length > 0) {
const latest = parsed.safeGuards[parsed.safeGuards.length - 1];
const trend = parsed.safeGuards.length > 1
? (parsed.safeGuards[parsed.safeGuards.length-1] - parsed.safeGuards[0] > 0 ? 'RISING' : 'FALLING')
: 'STABLE';
console.log(' SAFETY: vol=' + latest.toFixed(2) + ' m3 (' + parsed.safeGuards.length + ' triggers, ' + trend + ')');
} else {
console.log(' SAFETY: No safe guard triggers (GOOD)');
}
if (parsed.pressures.length > 0) {
const avg = parsed.pressures.reduce((a,b) => a+b, 0) / parsed.pressures.length;
console.log(' PRESSURE CLAMP: avg f=' + avg.toFixed(0) + ' (' + parsed.pressures.length + ' warnings)');
} else {
console.log(' PRESSURE: No interpolation warnings (GOOD)');
}
if (parsed.others.length > 0) {
console.log(' CONTROL: ' + parsed.others.slice(-3).join('\n '));
}
// Check if there are state change or mode messages
const logLines = logs.split('\n');
const stateChanges = logLines.filter(l =>
l.includes('machine state') || l.includes('State:') ||
l.includes('draining') || l.includes('filling') ||
l.includes('q_in') || l.includes('netFlow')
);
if (stateChanges.length > 0) {
console.log(' STATE: ' + stateChanges.slice(-3).map(s => s.trim().substring(0, 150)).join('\n '));
}
console.log('');
if (i < NUM_SAMPLES - 1) {
await new Promise(r => setTimeout(r, SAMPLE_INTERVAL));
}
}
// Final log dump
console.log('\n=== Final Log Analysis (last 200 lines) ===');
const finalLogs = getRecentLogs(200);
const finalParsed = parseSafeGuardLogs(finalLogs);
console.log('Safe guard triggers: ' + finalParsed.safeGuards.length);
if (finalParsed.safeGuards.length > 0) {
console.log(' First vol: ' + finalParsed.safeGuards[0].toFixed(2) + ' m3');
console.log(' Last vol: ' + finalParsed.safeGuards[finalParsed.safeGuards.length-1].toFixed(2) + ' m3');
const delta = finalParsed.safeGuards[finalParsed.safeGuards.length-1] - finalParsed.safeGuards[0];
console.log(' Delta: ' + (delta > 0 ? '+' : '') + delta.toFixed(2) + ' m3 (' + (delta > 0 ? 'RECOVERING' : 'STILL DRAINING') + ')');
}
console.log('Pressure clamp warnings: ' + finalParsed.pressures.length);
if (finalParsed.pressures.length > 0) {
const min = Math.min(...finalParsed.pressures);
const max = Math.max(...finalParsed.pressures);
console.log(' Range: ' + min.toFixed(0) + ' - ' + max.toFixed(0));
}
console.log('\nControl events: ' + finalParsed.others.length);
finalParsed.others.slice(-10).forEach(l => console.log(' ' + l));
// Overall assessment
console.log('\n=== ASSESSMENT ===');
if (finalParsed.safeGuards.length === 0 && finalParsed.pressures.length === 0) {
console.log('HEALTHY: No safety triggers, no pressure warnings');
} else if (finalParsed.safeGuards.length > 0) {
const trend = finalParsed.safeGuards[finalParsed.safeGuards.length-1] - finalParsed.safeGuards[0];
if (trend > 0) {
console.log('RECOVERING: Volume rising but still negative');
} else {
console.log('CRITICAL: Volume still dropping - control issue persists');
}
} else if (finalParsed.pressures.length > 0) {
console.log('WARNING: Pressure values exceeding curve bounds');
}
})().catch(err => {
console.error('Monitor failed:', err);
process.exit(1);
});


@@ -1,184 +0,0 @@
#!/usr/bin/env node
/**
* Patch demo-flow.json:
* 1. Fix NH4 chart — remove demo_link_meas_dash from new NH4 nodes
* 2. Update parse function — use "NH4 @ Xm" label format
* 3. Reorganize entire treatment tab — logical left-to-right layout
*/
const fs = require('fs');
const path = require('path');
const flowPath = path.join(__dirname, '..', 'docker', 'demo-flow.json');
const flow = JSON.parse(fs.readFileSync(flowPath, 'utf8'));
const find = (id) => flow.find(n => n.id === id);
// ============================================================
// 1. FIX NH4 CHART WIRING
// Remove demo_link_meas_dash from the 4 new NH4 nodes.
// They should only go to process link + NH4 profile link.
// ============================================================
const newNh4Ids = ['demo_meas_nh4_in', 'demo_meas_nh4_a', 'demo_meas_nh4_b', 'demo_meas_nh4_c'];
for (const id of newNh4Ids) {
const n = find(id);
if (n) {
n.wires[0] = n.wires[0].filter(w => w !== 'demo_link_meas_dash');
console.log(` ${id} Port 0 wires: ${JSON.stringify(n.wires[0])}`);
}
}
console.log('1. Fixed: removed demo_link_meas_dash from new NH4 nodes');
// ============================================================
// 2. UPDATE PARSE FUNCTION — "NH4 @ Xm" format
// Also make it generic: read distance from payload metadata
// if available, fall back to topic matching.
// ============================================================
const parseFn = find('demo_fn_nh4_profile_parse');
if (parseFn) {
parseFn.func = `const p = msg.payload || {};
const topic = msg.topic || '';
const now = Date.now();
const val = Number(p.mAbs);
if (!Number.isFinite(val)) return null;
// Build label from distance metadata if available, else match by tag
const dist = p.distance;
const tag = p.assetTagNumber || topic;
let label;
if (dist !== undefined && dist !== null) {
label = 'NH4 @ ' + dist + 'm';
} else if (tag.includes('NH4-IN')) label = 'NH4 @ 0m';
else if (tag.includes('NH4-A')) label = 'NH4 @ 10m';
else if (tag.includes('NH4-B')) label = 'NH4 @ 25m';
else if (tag.includes('NH4-001')) label = 'NH4 @ 35m';
else if (tag.includes('NH4-C')) label = 'NH4 @ 45m';
else label = 'NH4 @ ?m';
return { topic: label, payload: Math.round(val * 100) / 100, timestamp: now };`;
console.log('2. Updated NH4 profile parse function to "NH4 @ Xm" format');
}
// ============================================================
// 3. REORGANIZE TREATMENT TAB LAYOUT
//
// Logical left-to-right process flow:
//
// Col 1 (x=80): Comments / section headers
// Col 2 (x=200): Injects (reactor tick, monster flow)
// Col 3 (x=420): Inlet measurements (flow, DO, NH4 profile)
// Col 4 (x=640): Link outs (meas dash, NH4 profile dash)
// Col 5 (x=820): Reactor
// Col 6 (x=1060): Settler
// Col 7 (x=1280): Effluent measurements
// Col 8 (x=1500): Effluent link outs
//
// Row zones (y):
// Row A (y=40): Section comment
// Row B (y=100-440): Main process: reactor measurements → reactor → settler
// Row C (y=500-700): Effluent measurements (downstream of settler)
// Row D (y=760-900): RAS recycle loop (below main flow)
// Row E (y=960-1120): Merge collection / influent composition
//
// ============================================================
const layout = {
// ── SECTION COMMENT ──
'demo_comment_treatment': { x: 80, y: 40 },
// ── INJECTS ──
'demo_inj_reactor_tick': { x: 200, y: 120 },
'demo_inj_monster_flow': { x: 200, y: 560 },
// ── INLET MEASUREMENTS (column, spaced 60px) ──
'demo_meas_flow': { x: 420, y: 100 }, // FT-001 flow
'demo_meas_do': { x: 420, y: 160 }, // DO-001
'demo_meas_nh4_in': { x: 420, y: 220 }, // NH4-IN 0m
'demo_meas_nh4_a': { x: 420, y: 280 }, // NH4-A 10m
'demo_meas_nh4': { x: 420, y: 340 }, // NH4-001 35m (existing; y overridden below so NH4-B at 25m sorts first)
'demo_meas_nh4_b': { x: 420, y: 400 }, // NH4-B 25m
'demo_meas_nh4_c': { x: 420, y: 460 }, // NH4-C 45m
// ── LINK OUTS (from measurements) ──
'demo_link_meas_dash': { x: 640, y: 130 },
'demo_link_nh4_profile_dash': { x: 640, y: 340 },
// ── REACTOR ──
'demo_reactor': { x: 820, y: 220 },
// ── REACTOR LINK OUTS ──
'demo_link_reactor_dash': { x: 1020, y: 180 },
'demo_link_overview_reactor_out': { x: 1020, y: 220 },
// ── SETTLER ──
'demo_settler': { x: 1060, y: 320 },
// ── SHARED LINK OUTS (process + influx) ──
'demo_link_influx_out_treatment': { x: 1020, y: 260 },
'demo_link_process_out_treatment': { x: 1020, y: 300 },
// ── EFFLUENT SECTION ──
'demo_comment_effluent_meas': { x: 80, y: 520 },
'demo_meas_eff_flow': { x: 1280, y: 320 },
'demo_meas_eff_do': { x: 1280, y: 380 },
'demo_meas_eff_nh4': { x: 1280, y: 440 },
'demo_meas_eff_no3': { x: 1280, y: 500 },
'demo_meas_eff_tss': { x: 1280, y: 560 },
'demo_link_eff_meas_dash': { x: 1500, y: 440 },
'demo_link_overview_eff_out': { x: 1500, y: 500 },
// ── MONSTER (downstream of settler, parallel to effluent meas) ──
'demo_monster': { x: 1060, y: 440 },
'demo_fn_monster_flow': { x: 400, y: 560 },
// ── RAS RECYCLE LOOP (below main process) ──
'demo_fn_ras_filter': { x: 1060, y: 760 },
'demo_pump_ras': { x: 1280, y: 760 },
'demo_meas_ft_ras': { x: 1500, y: 760 },
'demo_inj_ras_mode': { x: 1280, y: 820 },
'demo_inj_ras_speed': { x: 1280, y: 880 },
'demo_comment_pressure': { x: 80, y: 740 },
// ── MERGE COLLECTION (bottom section) ──
'demo_comment_merge': { x: 80, y: 960 },
'demo_link_merge_west_in': { x: 100, y: 1000 },
'demo_link_merge_north_in': { x: 100, y: 1060 },
'demo_link_merge_south_in': { x: 100, y: 1120 },
'demo_fn_tag_west': { x: 300, y: 1000 },
'demo_fn_tag_north': { x: 300, y: 1060 },
'demo_fn_tag_south': { x: 300, y: 1120 },
'demo_fn_merge_collect': { x: 520, y: 1060 },
'demo_link_merge_dash': { x: 720, y: 1020 },
'demo_fn_influent_compose': { x: 720, y: 1100 },
};
// Sort NH4 measurements by distance for visual order
// NH4-IN=0m, NH4-A=10m, NH4-B=25m, NH4-001=35m, NH4-C=45m
// Adjust y to be in distance order:
layout['demo_meas_nh4_in'] = { x: 420, y: 220 }; // 0m
layout['demo_meas_nh4_a'] = { x: 420, y: 280 }; // 10m
layout['demo_meas_nh4_b'] = { x: 420, y: 340 }; // 25m
layout['demo_meas_nh4'] = { x: 420, y: 400 }; // 35m
layout['demo_meas_nh4_c'] = { x: 420, y: 460 }; // 45m
let moved = 0;
for (const [id, pos] of Object.entries(layout)) {
const n = find(id);
if (n) {
n.x = pos.x;
n.y = pos.y;
moved++;
} else {
console.warn(` WARN: node ${id} not found`);
}
}
console.log(`3. Repositioned ${moved} nodes on treatment tab`);
// ============================================================
// WRITE OUTPUT
// ============================================================
fs.writeFileSync(flowPath, JSON.stringify(flow, null, 2) + '\n', 'utf8');
console.log(`\nDone. Wrote ${flow.length} nodes to ${flowPath}`);
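The updated parse function builds chart series labels from distance metadata. A sketch of just that label logic, run on a hypothetical measurement payload (field names taken from the function above):

```javascript
// Hypothetical payload as emitted by a measurement node with distance metadata.
const p = { mAbs: 12.5, distance: 25 };

// Same label rule as the parse function: prefer distance, else '?m'.
const label = (p.distance !== undefined && p.distance !== null)
  ? 'NH4 @ ' + p.distance + 'm'
  : 'NH4 @ ?m';

const out = { topic: label, payload: Math.round(p.mAbs * 100) / 100 };
console.log(out); // { topic: 'NH4 @ 25m', payload: 12.5 }
```

Using the topic as the series label means the ui-chart (category: 'topic') plots one line per probe position.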


@@ -1,455 +0,0 @@
#!/usr/bin/env node
/**
* Patch demo-flow.json:
* Phase A: Add 4 NH4 measurement nodes + ui-group + ui-chart
* Phase B: Add influent composer function node + wire merge collector
* Phase C: Fix biomass init on reactor
* Phase D: Add RAS pump, flow sensor, 2 injects, filter function + wiring
*/
const fs = require('fs');
const path = require('path');
const flowPath = path.join(__dirname, '..', 'docker', 'demo-flow.json');
const flow = JSON.parse(fs.readFileSync(flowPath, 'utf8'));
// Helper: find node by id
const findNode = (id) => flow.find(n => n.id === id);
// ============================================================
// PHASE A: Add 4 NH4 measurement nodes + ui-group + ui-chart
// ============================================================
const nh4Measurements = [
{
id: 'demo_meas_nh4_in',
name: 'NH4-IN (Ammonium Inlet)',
uuid: 'nh4-in-001',
assetTagNumber: 'NH4-IN',
distance: 0,
distanceDescription: 'reactor inlet',
y: 280
},
{
id: 'demo_meas_nh4_a',
name: 'NH4-A (Early Aeration)',
uuid: 'nh4-a-001',
assetTagNumber: 'NH4-A',
distance: 10,
distanceDescription: 'early aeration zone',
y: 320
},
{
id: 'demo_meas_nh4_b',
name: 'NH4-B (Mid-Reactor)',
uuid: 'nh4-b-001',
assetTagNumber: 'NH4-B',
distance: 25,
distanceDescription: 'mid-reactor',
y: 360
},
{
id: 'demo_meas_nh4_c',
name: 'NH4-C (Near Outlet)',
uuid: 'nh4-c-001',
assetTagNumber: 'NH4-C',
distance: 45,
distanceDescription: 'near outlet',
y: 400
}
];
for (const m of nh4Measurements) {
flow.push({
id: m.id,
type: 'measurement',
z: 'demo_tab_treatment',
name: m.name,
scaling: true,
i_min: 0,
i_max: 50,
i_offset: 0,
o_min: 0,
o_max: 50,
smooth_method: 'mean',
count: 3,
simulator: true,
uuid: m.uuid,
supplier: 'Hach',
category: 'sensor',
assetType: 'ammonium',
model: 'Amtax-sc',
unit: 'mg/L',
assetTagNumber: m.assetTagNumber,
enableLog: false,
logLevel: 'error',
positionVsParent: 'atEquipment',
x: 400,
y: m.y,
wires: [
['demo_link_meas_dash', 'demo_link_process_out_treatment'],
['demo_link_influx_out_treatment'],
['demo_reactor']
],
positionIcon: '⊥',
hasDistance: true,
distance: m.distance,
distanceUnit: 'm',
distanceDescription: m.distanceDescription
});
}
// NH4 profile ui-group
flow.push({
id: 'demo_ui_grp_nh4_profile',
type: 'ui-group',
name: 'NH4 Profile Along Reactor',
page: 'demo_ui_page_treatment',
width: '6',
height: '1',
order: 6,
showTitle: true,
className: ''
});
// NH4 profile chart
flow.push({
id: 'demo_chart_nh4_profile',
type: 'ui-chart',
z: 'demo_tab_dashboard',
group: 'demo_ui_grp_nh4_profile',
name: 'NH4 Profile',
label: 'NH4 Along Reactor (mg/L)',
order: 1,
width: '6',
height: '5',
chartType: 'line',
category: 'topic',
categoryType: 'msg',
xAxisType: 'time',
yAxisLabel: 'mg/L',
removeOlder: '10',
removeOlderUnit: '60',
action: 'append',
pointShape: 'false',
pointRadius: 0,
interpolation: 'linear',
x: 510,
y: 1060,
wires: [],
showLegend: true,
xAxisProperty: '',
xAxisPropertyType: 'timestamp',
yAxisProperty: 'payload',
yAxisPropertyType: 'msg',
colors: [
'#0094ce',
'#FF7F0E',
'#2CA02C',
'#D62728',
'#A347E1',
'#D62728',
'#FF9896',
'#9467BD',
'#C5B0D5'
],
textColor: ['#aaaaaa'],
textColorDefault: false,
gridColor: ['#333333'],
gridColorDefault: false,
className: ''
});
// Link out + link in for NH4 profile chart
flow.push({
id: 'demo_link_nh4_profile_dash',
type: 'link out',
z: 'demo_tab_treatment',
name: '→ NH4 Profile Dashboard',
mode: 'link',
links: ['demo_link_nh4_profile_dash_in'],
x: 620,
y: 340
});
flow.push({
id: 'demo_link_nh4_profile_dash_in',
type: 'link in',
z: 'demo_tab_dashboard',
name: '← NH4 Profile',
links: ['demo_link_nh4_profile_dash'],
x: 75,
y: 1060,
wires: [['demo_fn_nh4_profile_parse']]
});
// Parse function for NH4 profile chart
flow.push({
id: 'demo_fn_nh4_profile_parse',
type: 'function',
z: 'demo_tab_dashboard',
name: 'Parse NH4 Profile',
func: `const p = msg.payload || {};
const topic = msg.topic || '';
const now = Date.now();
const val = Number(p.mAbs);
if (!Number.isFinite(val)) return null;
let label = topic;
if (topic.includes('NH4-IN')) label = 'NH4-IN (0m)';
else if (topic.includes('NH4-A')) label = 'NH4-A (10m)';
else if (topic.includes('NH4-B')) label = 'NH4-B (25m)';
else if (topic.includes('NH4-001')) label = 'NH4-001 (35m)';
else if (topic.includes('NH4-C')) label = 'NH4-C (45m)';
return { topic: label, payload: Math.round(val * 100) / 100, timestamp: now };`,
outputs: 1,
x: 280,
y: 1060,
wires: [['demo_chart_nh4_profile']]
});
// Wire existing NH4-001 and new NH4 measurements to the profile link out
const existingNh4 = findNode('demo_meas_nh4');
if (existingNh4) {
if (!existingNh4.wires[0].includes('demo_link_nh4_profile_dash')) {
existingNh4.wires[0].push('demo_link_nh4_profile_dash');
}
}
for (const m of nh4Measurements) {
const node = findNode(m.id);
if (node && !node.wires[0].includes('demo_link_nh4_profile_dash')) {
node.wires[0].push('demo_link_nh4_profile_dash');
}
}
console.log('Phase A: Added 4 NH4 measurements + ui-group + chart + wiring');
// ============================================================
// PHASE B: Add influent composer + wire merge collector
// ============================================================
flow.push({
id: 'demo_fn_influent_compose',
type: 'function',
z: 'demo_tab_treatment',
name: 'Influent Composer',
func: `// Convert merge collector output to Fluent messages for reactor
// ASM3: [S_O, S_I, S_S, S_NH, S_N2, S_NO, S_HCO, X_I, X_S, X_H, X_STO, X_A, X_TS]
const p = msg.payload || {};
const MUNICIPAL = [0.5, 30, 200, 40, 0, 0, 5, 25, 150, 30, 0, 0, 200];
const INDUSTRIAL = [0.5, 40, 300, 25, 0, 0, 4, 30, 100, 20, 0, 0, 150];
const RESIDENTIAL = [0.5, 25, 180, 45, 0, 0, 5, 20, 130, 25, 0, 0, 175];
const Fw = (p.west?.netFlow || 0) * 24; // m3/h -> m3/d
const Fn = (p.north?.netFlow || 0) * 24;
const Fs = (p.south?.netFlow || 0) * 24;
const msgs = [];
if (Fw > 0) msgs.push({ topic: 'Fluent', payload: { inlet: 0, F: Fw, C: MUNICIPAL }});
if (Fn > 0) msgs.push({ topic: 'Fluent', payload: { inlet: 1, F: Fn, C: INDUSTRIAL }});
if (Fs > 0) msgs.push({ topic: 'Fluent', payload: { inlet: 2, F: Fs, C: RESIDENTIAL }});
return [msgs];`,
outputs: 1,
x: 480,
y: 1040,
wires: [['demo_reactor']]
});
// Wire merge collector → influent composer (add to existing wires)
const mergeCollect = findNode('demo_fn_merge_collect');
if (mergeCollect) {
if (!mergeCollect.wires[0].includes('demo_fn_influent_compose')) {
mergeCollect.wires[0].push('demo_fn_influent_compose');
}
console.log('Phase B: Wired merge collector → influent composer → reactor');
} else {
console.error('Phase B: ERROR — demo_fn_merge_collect not found!');
}
// ============================================================
// PHASE C: Fix biomass initialization
// ============================================================
const reactor = findNode('demo_reactor');
if (reactor) {
reactor.X_A_init = 300;
reactor.X_H_init = 1500;
reactor.X_TS_init = 2500;
reactor.S_HCO_init = 8;
console.log('Phase C: Updated reactor biomass init values');
} else {
console.error('Phase C: ERROR — demo_reactor not found!');
}
// ============================================================
// PHASE D: Return Activated Sludge
// ============================================================
// D1: RAS pump
flow.push({
id: 'demo_pump_ras',
type: 'rotatingMachine',
z: 'demo_tab_treatment',
name: 'RAS Pump',
speed: '1',
startup: '5',
warmup: '3',
shutdown: '4',
cooldown: '2',
movementMode: 'dynspeed',
machineCurve: '',
uuid: 'pump-ras-001',
supplier: 'hidrostal',
category: 'machine',
assetType: 'pump-centrifugal',
model: 'hidrostal-RAS',
unit: 'm3/h',
enableLog: true,
logLevel: 'info',
positionVsParent: 'downstream',
positionIcon: '←',
hasDistance: false,
distance: 0,
distanceUnit: 'm',
distanceDescription: '',
x: 1000,
y: 380,
wires: [
['demo_link_process_out_treatment'],
['demo_link_influx_out_treatment'],
['demo_settler']
],
curveFlowUnit: 'l/s',
curvePressureUnit: 'mbar',
curvePowerUnit: 'kW'
});
// D2: RAS flow sensor
flow.push({
id: 'demo_meas_ft_ras',
type: 'measurement',
z: 'demo_tab_treatment',
name: 'FT-RAS (RAS Flow)',
scaling: true,
i_min: 20,
i_max: 80,
i_offset: 0,
o_min: 20,
o_max: 80,
smooth_method: 'mean',
count: 3,
simulator: true,
uuid: 'ft-ras-001',
supplier: 'Endress+Hauser',
category: 'sensor',
assetType: 'flow',
model: 'Promag-W400',
unit: 'm3/h',
assetTagNumber: 'FT-RAS',
enableLog: false,
logLevel: 'error',
positionVsParent: 'atEquipment',
positionIcon: '⊥',
hasDistance: false,
distance: 0,
distanceUnit: 'm',
distanceDescription: '',
x: 1200,
y: 380,
wires: [
['demo_link_process_out_treatment'],
['demo_link_influx_out_treatment'],
['demo_pump_ras']
]
});
// D3: Inject to set pump mode
flow.push({
id: 'demo_inj_ras_mode',
type: 'inject',
z: 'demo_tab_treatment',
name: 'RAS → virtualControl',
props: [
{ p: 'topic', vt: 'str' },
{ p: 'payload', vt: 'str' }
],
topic: 'setMode',
payload: 'virtualControl',
payloadType: 'str',
once: true,
onceDelay: '3',
x: 1000,
y: 440,
wires: [['demo_pump_ras']],
repeatType: 'none',
crontab: '',
repeat: ''
});
// D3: Inject to set pump speed
flow.push({
id: 'demo_inj_ras_speed',
type: 'inject',
z: 'demo_tab_treatment',
name: 'RAS speed → 50%',
props: [
{ p: 'topic', vt: 'str' },
{ p: 'payload', vt: 'json' }
],
topic: 'execMovement',
payload: '{"source":"auto","action":"setpoint","setpoint":50}',
payloadType: 'json',
once: true,
onceDelay: '4',
x: 1000,
y: 480,
wires: [['demo_pump_ras']],
repeatType: 'none',
crontab: '',
repeat: ''
});
// D4: RAS filter function
flow.push({
id: 'demo_fn_ras_filter',
type: 'function',
z: 'demo_tab_treatment',
name: 'RAS Filter',
func: `// Only pass RAS (inlet 2) from settler to reactor as inlet 3
if (msg.topic === 'Fluent' && msg.payload && msg.payload.inlet === 2) {
msg.payload.inlet = 3; // reactor inlet 3 = RAS
return msg;
}
return null;`,
outputs: 1,
x: 1000,
y: 320,
wires: [['demo_reactor']]
});
// D5: Wire settler Port 0 → RAS filter
const settler = findNode('demo_settler');
if (settler) {
if (!settler.wires[0].includes('demo_fn_ras_filter')) {
settler.wires[0].push('demo_fn_ras_filter');
}
console.log('Phase D: Wired settler → RAS filter → reactor');
} else {
console.error('Phase D: ERROR — demo_settler not found!');
}
// D5: Update reactor n_inlets: 3 → 4
if (reactor) {
reactor.n_inlets = 4;
console.log('Phase D: Updated reactor n_inlets to 4');
}
console.log('Phase D: Added RAS pump, flow sensor, 2 injects, filter function');
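The D4 RAS filter's inlet remap is easy to get wrong (mutating the upstream message versus dropping it). A standalone sketch of the same predicate, assuming — as the node above does — that the settler emits Fluent messages with `payload.inlet === 2` for the sludge return:

```javascript
// Sketch of the D4 RAS-filter logic: pass only settler inlet-2 (RAS)
// messages through, remapped to reactor inlet 3; drop everything else.
function rasFilter(msg) {
  if (msg && msg.topic === 'Fluent' && msg.payload && msg.payload.inlet === 2) {
    // Copy instead of mutating, so other wires from the settler see the original
    return { ...msg, payload: { ...msg.payload, inlet: 3 } };
  }
  return null; // Node-RED convention: returning null drops the message
}
```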
// ============================================================
// WRITE OUTPUT
// ============================================================
fs.writeFileSync(flowPath, JSON.stringify(flow, null, 2) + '\n', 'utf8');
console.log(`\nDone. Wrote ${flow.length} nodes to ${flowPath}`);


@@ -1,380 +0,0 @@
#!/usr/bin/env node
/**
* Step 1: Tab Restructure + Per-tab link-outs
* - Creates 4 new tabs (PS West, PS North, PS South, Treatment)
* - Renames WWTP tab to "Telemetry / InfluxDB"
* - Moves nodes to their new tabs
* - Creates per-tab link-out nodes for influx + process
* - Rewires nodes to use local link-outs
* - Recalculates coordinates for clean layout
*/
const fs = require('fs');
const path = require('path');
const FLOW_PATH = path.join(__dirname, '..', 'docker', 'demo-flow.json');
const flow = JSON.parse(fs.readFileSync(FLOW_PATH, 'utf8'));
const byId = (id) => flow.find(n => n.id === id);
// =============================================
// 1a. Create 4 new tabs
// =============================================
flow.push(
{ id: "demo_tab_ps_west", type: "tab", label: "PS West", disabled: false, info: "Pumping Station West (Urban Catchment - 2 pumps, Level-based)" },
{ id: "demo_tab_ps_north", type: "tab", label: "PS North", disabled: false, info: "Pumping Station North (Industrial - 1 pump, Flow-based)" },
{ id: "demo_tab_ps_south", type: "tab", label: "PS South", disabled: false, info: "Pumping Station South (Residential - 1 pump, Manual)" },
{ id: "demo_tab_treatment", type: "tab", label: "Biological Treatment", disabled: false, info: "Merge point, Reactor, Settler, Effluent Measurements" }
);
// =============================================
// 1b. Rename existing WWTP tab
// =============================================
const wwtpTab = byId("demo_tab_wwtp");
wwtpTab.label = "Telemetry / InfluxDB";
wwtpTab.info = "InfluxDB write chain, process debug, Grafana dashboard API, shared infrastructure";
// =============================================
// 1c. Move nodes to new tabs
// =============================================
const moveMap = {
// PS West tab
"demo_comment_ps": "demo_tab_ps_west",
"demo_ps_west": "demo_tab_ps_west",
"demo_pump_w1": "demo_tab_ps_west",
"demo_pump_w2": "demo_tab_ps_west",
"demo_mgc_west": "demo_tab_ps_west",
"demo_inj_west_mode": "demo_tab_ps_west",
"demo_inj_west_flow": "demo_tab_ps_west",
"demo_fn_west_flow_sim": "demo_tab_ps_west",
"demo_inj_w1_mode": "demo_tab_ps_west",
"demo_inj_w2_mode": "demo_tab_ps_west",
"demo_inj_calib_west": "demo_tab_ps_west",
"demo_fn_level_to_pressure_w": "demo_tab_ps_west",
"demo_meas_pt_w_up": "demo_tab_ps_west",
"demo_meas_pt_w_down": "demo_tab_ps_west",
"demo_mon_west": "demo_tab_ps_west",
"demo_link_ps_west_dash": "demo_tab_ps_west",
// PS North tab
"demo_comment_ps_north": "demo_tab_ps_north",
"demo_ps_north": "demo_tab_ps_north",
"demo_pump_n1": "demo_tab_ps_north",
"demo_inj_north_mode": "demo_tab_ps_north",
"demo_inj_north_flow": "demo_tab_ps_north",
"demo_fn_north_flow_sim": "demo_tab_ps_north",
"demo_inj_n1_mode": "demo_tab_ps_north",
"demo_inj_calib_north": "demo_tab_ps_north",
"demo_comment_north_outflow": "demo_tab_ps_north",
"demo_meas_ft_n1": "demo_tab_ps_north",
"demo_fn_level_to_pressure_n": "demo_tab_ps_north",
"demo_meas_pt_n_up": "demo_tab_ps_north",
"demo_meas_pt_n_down": "demo_tab_ps_north",
"demo_mon_north": "demo_tab_ps_north",
"demo_link_ps_north_dash": "demo_tab_ps_north",
// PS South tab
"demo_comment_ps_south": "demo_tab_ps_south",
"demo_ps_south": "demo_tab_ps_south",
"demo_pump_s1": "demo_tab_ps_south",
"demo_inj_south_mode": "demo_tab_ps_south",
"demo_inj_south_flow": "demo_tab_ps_south",
"demo_fn_south_flow_sim": "demo_tab_ps_south",
"demo_inj_s1_mode": "demo_tab_ps_south",
"demo_inj_calib_south": "demo_tab_ps_south",
"demo_fn_level_to_pressure_s": "demo_tab_ps_south",
"demo_meas_pt_s_up": "demo_tab_ps_south",
"demo_meas_pt_s_down": "demo_tab_ps_south",
"demo_mon_south": "demo_tab_ps_south",
"demo_link_ps_south_dash": "demo_tab_ps_south",
// Treatment tab
"demo_comment_treatment": "demo_tab_treatment",
"demo_meas_flow": "demo_tab_treatment",
"demo_meas_do": "demo_tab_treatment",
"demo_meas_nh4": "demo_tab_treatment",
"demo_reactor": "demo_tab_treatment",
"demo_inj_reactor_tick": "demo_tab_treatment",
"demo_settler": "demo_tab_treatment",
"demo_monster": "demo_tab_treatment",
"demo_inj_monster_flow": "demo_tab_treatment",
"demo_fn_monster_flow": "demo_tab_treatment",
"demo_comment_effluent_meas": "demo_tab_treatment",
"demo_meas_eff_flow": "demo_tab_treatment",
"demo_meas_eff_do": "demo_tab_treatment",
"demo_meas_eff_nh4": "demo_tab_treatment",
"demo_meas_eff_no3": "demo_tab_treatment",
"demo_meas_eff_tss": "demo_tab_treatment",
"demo_comment_pressure": "demo_tab_treatment",
"demo_link_reactor_dash": "demo_tab_treatment",
"demo_link_meas_dash": "demo_tab_treatment",
"demo_link_eff_meas_dash": "demo_tab_treatment"
};
for (const [nodeId, tabId] of Object.entries(moveMap)) {
const node = byId(nodeId);
if (node) {
node.z = tabId;
} else {
console.warn(`WARNING: Node ${nodeId} not found for move`);
}
}
// =============================================
// 1c-coords. Recalculate coordinates per tab
// =============================================
// PS West layout (2 pumps + MGC)
const psWestCoords = {
"demo_comment_ps": { x: 340, y: 40 },
"demo_inj_calib_west": { x: 120, y: 80 },
"demo_inj_w1_mode": { x: 120, y: 120 },
"demo_inj_west_mode": { x: 120, y: 200 },
"demo_inj_west_flow": { x: 120, y: 240 },
"demo_inj_w2_mode": { x: 120, y: 320 },
"demo_fn_west_flow_sim": { x: 360, y: 240 },
"demo_pump_w1": { x: 600, y: 120 },
"demo_pump_w2": { x: 600, y: 320 },
"demo_mgc_west": { x: 600, y: 220 },
"demo_ps_west": { x: 860, y: 220 },
"demo_fn_level_to_pressure_w": { x: 360, y: 420 },
"demo_meas_pt_w_up": { x: 560, y: 420 },
"demo_meas_pt_w_down": { x: 560, y: 480 },
"demo_mon_west": { x: 1080, y: 160 },
"demo_link_ps_west_dash": { x: 1080, y: 220 },
};
// PS North layout (1 pump, no MGC)
const psNorthCoords = {
"demo_comment_ps_north": { x: 340, y: 40 },
"demo_inj_calib_north": { x: 120, y: 80 },
"demo_inj_n1_mode": { x: 120, y: 120 },
"demo_inj_north_mode": { x: 120, y: 200 },
"demo_inj_north_flow": { x: 120, y: 240 },
"demo_fn_north_flow_sim": { x: 360, y: 240 },
"demo_pump_n1": { x: 600, y: 120 },
"demo_ps_north": { x: 860, y: 200 },
"demo_comment_north_outflow":{ x: 200, y: 320 },
"demo_meas_ft_n1": { x: 560, y: 340 },
"demo_fn_level_to_pressure_n":{ x: 360, y: 420 },
"demo_meas_pt_n_up": { x: 560, y: 420 },
"demo_meas_pt_n_down": { x: 560, y: 480 },
"demo_mon_north": { x: 1080, y: 140 },
"demo_link_ps_north_dash": { x: 1080, y: 200 },
};
// PS South layout (1 pump, no MGC)
const psSouthCoords = {
"demo_comment_ps_south": { x: 340, y: 40 },
"demo_inj_calib_south": { x: 120, y: 80 },
"demo_inj_s1_mode": { x: 120, y: 120 },
"demo_inj_south_mode": { x: 120, y: 200 },
"demo_inj_south_flow": { x: 120, y: 240 },
"demo_fn_south_flow_sim": { x: 360, y: 240 },
"demo_pump_s1": { x: 600, y: 120 },
"demo_ps_south": { x: 860, y: 200 },
"demo_fn_level_to_pressure_s":{ x: 360, y: 380 },
"demo_meas_pt_s_up": { x: 560, y: 380 },
"demo_meas_pt_s_down": { x: 560, y: 440 },
"demo_mon_south": { x: 1080, y: 140 },
"demo_link_ps_south_dash": { x: 1080, y: 200 },
};
// Treatment layout
const treatmentCoords = {
"demo_comment_treatment": { x: 200, y: 40 },
"demo_meas_flow": { x: 400, y: 120 },
"demo_meas_do": { x: 400, y: 180 },
"demo_meas_nh4": { x: 400, y: 240 },
"demo_inj_reactor_tick": { x: 600, y: 80 },
"demo_reactor": { x: 800, y: 180 },
"demo_settler": { x: 800, y: 320 },
"demo_monster": { x: 800, y: 420 },
"demo_inj_monster_flow": { x: 560, y: 420 },
"demo_fn_monster_flow": { x: 660, y: 460 },
"demo_comment_effluent_meas":{ x: 200, y: 520 },
"demo_meas_eff_flow": { x: 400, y: 560 },
"demo_meas_eff_do": { x: 400, y: 620 },
"demo_meas_eff_nh4": { x: 400, y: 680 },
"demo_meas_eff_no3": { x: 400, y: 740 },
"demo_meas_eff_tss": { x: 400, y: 800 },
"demo_comment_pressure": { x: 200, y: 860 },
"demo_link_reactor_dash": { x: 1020, y: 180 },
"demo_link_meas_dash": { x: 620, y: 180 },
"demo_link_eff_meas_dash": { x: 620, y: 620 },
};
// Apply coordinates
for (const [nodeId, coords] of Object.entries({...psWestCoords, ...psNorthCoords, ...psSouthCoords, ...treatmentCoords})) {
const node = byId(nodeId);
if (node) {
node.x = coords.x;
node.y = coords.y;
}
}
// =============================================
// 1d. Create per-tab link-out nodes
// =============================================
// Determine which tab each moved node belongs to
const tabForNode = {};
for (const n of flow) {
if (n.z) tabForNode[n.id] = n.z;
}
// Map from tab → influx link-out ID
const influxLinkOutMap = {
"demo_tab_ps_west": "demo_link_influx_out_west",
"demo_tab_ps_north": "demo_link_influx_out_north",
"demo_tab_ps_south": "demo_link_influx_out_south",
"demo_tab_treatment": "demo_link_influx_out_treatment",
};
// Map from tab → process link-out ID
const processLinkOutMap = {
"demo_tab_ps_west": "demo_link_process_out_west",
"demo_tab_ps_north": "demo_link_process_out_north",
"demo_tab_ps_south": "demo_link_process_out_south",
"demo_tab_treatment": "demo_link_process_out_treatment",
};
// Link-out node positions per tab
const linkOutPositions = {
"demo_tab_ps_west": { influx: { x: 1080, y: 280 }, process: { x: 1080, y: 320 } },
"demo_tab_ps_north": { influx: { x: 1080, y: 260 }, process: { x: 1080, y: 300 } },
"demo_tab_ps_south": { influx: { x: 1080, y: 260 }, process: { x: 1080, y: 300 } },
"demo_tab_treatment": { influx: { x: 1020, y: 280 }, process: { x: 1020, y: 320 } },
};
// Create influx link-out nodes
for (const [tabId, nodeId] of Object.entries(influxLinkOutMap)) {
const pos = linkOutPositions[tabId].influx;
flow.push({
id: nodeId,
type: "link out",
z: tabId,
name: "→ InfluxDB",
mode: "link",
links: ["demo_link_influx_in"],
x: pos.x,
y: pos.y
});
}
// Create process link-out nodes
for (const [tabId, nodeId] of Object.entries(processLinkOutMap)) {
const pos = linkOutPositions[tabId].process;
flow.push({
id: nodeId,
type: "link out",
z: tabId,
name: "→ Process debug",
mode: "link",
links: ["demo_link_process_in"],
x: pos.x,
y: pos.y
});
}
// =============================================
// 1d-rewire. Rewire nodes to use local link-outs
// =============================================
// For every node that references "demo_link_influx_out" or "demo_link_process_out"
// in its wires, replace with the per-tab version
for (const node of flow) {
if (!node.wires || !node.z) continue;
const tab = node.z;
const localInflux = influxLinkOutMap[tab];
const localProcess = processLinkOutMap[tab];
for (let portIdx = 0; portIdx < node.wires.length; portIdx++) {
for (let wireIdx = 0; wireIdx < node.wires[portIdx].length; wireIdx++) {
if (node.wires[portIdx][wireIdx] === "demo_link_influx_out" && localInflux) {
node.wires[portIdx][wireIdx] = localInflux;
}
if (node.wires[portIdx][wireIdx] === "demo_link_process_out" && localProcess) {
node.wires[portIdx][wireIdx] = localProcess;
}
}
}
}

// Update the link-in nodes to reference all new link-out IDs
const influxIn = byId("demo_link_influx_in");
influxIn.links = Object.values(influxLinkOutMap);
// Also keep the old one if any nodes on the telemetry tab still reference it
// (the dashapi, telemetry nodes that stayed on demo_tab_wwtp)
influxIn.links.push("demo_link_influx_out");
const processIn = byId("demo_link_process_in");
processIn.links = Object.values(processLinkOutMap);
processIn.links.push("demo_link_process_out");
// Keep old link-out nodes on telemetry tab (they may still be needed
// by nodes that remain there, like dashapi)
// Update their links arrays too
const oldInfluxOut = byId("demo_link_influx_out");
if (oldInfluxOut) {
oldInfluxOut.links = ["demo_link_influx_in"];
// Move to bottom of telemetry tab
oldInfluxOut.x = 1135;
oldInfluxOut.y = 500;
}
const oldProcessOut = byId("demo_link_process_out");
if (oldProcessOut) {
oldProcessOut.links = ["demo_link_process_in"];
oldProcessOut.x = 1135;
oldProcessOut.y = 540;
}
// =============================================
// Validate
// =============================================
const tabCounts = {};
for (const n of flow) {
if (n.z) {
tabCounts[n.z] = (tabCounts[n.z] || 0) + 1;
}
}
console.log('Nodes per tab:', JSON.stringify(tabCounts, null, 2));
console.log('Total nodes:', flow.length);
// Check for broken wire references
const allIds = new Set(flow.map(n => n.id));
let brokenWires = 0;
for (const n of flow) {
if (!n.wires) continue;
for (const port of n.wires) {
for (const target of port) {
if (!allIds.has(target)) {
console.warn(`BROKEN WIRE: ${n.id} → ${target}`);
brokenWires++;
}
}
}
}
if (brokenWires === 0) console.log('All wire references valid ✓');
// Check link-in/link-out pairing
for (const n of flow) {
if (n.type === 'link out' && n.links) {
for (const linkTarget of n.links) {
if (!allIds.has(linkTarget)) {
console.warn(`BROKEN LINK: ${n.id} links to missing ${linkTarget}`);
}
}
}
if (n.type === 'link in' && n.links) {
for (const linkSource of n.links) {
if (!allIds.has(linkSource)) {
console.warn(`BROKEN LINK: ${n.id} expects link from missing ${linkSource}`);
}
}
}
}
// Write
fs.writeFileSync(FLOW_PATH, JSON.stringify(flow, null, 2) + '\n');
console.log(`\nWrote ${FLOW_PATH} (${flow.length} nodes)`);


@@ -1,219 +0,0 @@
#!/usr/bin/env node
/**
* Step 2: Merge Collection Point
* - Adds link-out from each PS tab to merge on treatment tab
* - Creates link-in, tag, collect, and dashboard link-out nodes on treatment
* - Wires PS outputs through merge to feed reactor
*/
const fs = require('fs');
const path = require('path');
const FLOW_PATH = path.join(__dirname, '..', 'docker', 'demo-flow.json');
const flow = JSON.parse(fs.readFileSync(FLOW_PATH, 'utf8'));
const byId = (id) => flow.find(n => n.id === id);
// =============================================
// 2a. Link-out nodes on each PS tab
// =============================================
flow.push(
{
id: "demo_link_merge_west_out",
type: "link out",
z: "demo_tab_ps_west",
name: "→ Merge (West)",
mode: "link",
links: ["demo_link_merge_west_in"],
x: 1080, y: 360
},
{
id: "demo_link_merge_north_out",
type: "link out",
z: "demo_tab_ps_north",
name: "→ Merge (North)",
mode: "link",
links: ["demo_link_merge_north_in"],
x: 1080, y: 340
},
{
id: "demo_link_merge_south_out",
type: "link out",
z: "demo_tab_ps_south",
name: "→ Merge (South)",
mode: "link",
links: ["demo_link_merge_south_in"],
x: 1080, y: 340
}
);
// Add merge link-outs to each PS node's wires[0]
const psWest = byId("demo_ps_west");
psWest.wires[0].push("demo_link_merge_west_out");
const psNorth = byId("demo_ps_north");
psNorth.wires[0].push("demo_link_merge_north_out");
const psSouth = byId("demo_ps_south");
psSouth.wires[0].push("demo_link_merge_south_out");
// =============================================
// 2b. Merge nodes on Treatment tab
// =============================================
// Link-in nodes
flow.push(
{
id: "demo_link_merge_west_in",
type: "link in",
z: "demo_tab_treatment",
name: "← PS West",
links: ["demo_link_merge_west_out"],
x: 100, y: 920,
wires: [["demo_fn_tag_west"]]
},
{
id: "demo_link_merge_north_in",
type: "link in",
z: "demo_tab_treatment",
name: "← PS North",
links: ["demo_link_merge_north_out"],
x: 100, y: 980,
wires: [["demo_fn_tag_north"]]
},
{
id: "demo_link_merge_south_in",
type: "link in",
z: "demo_tab_treatment",
name: "← PS South",
links: ["demo_link_merge_south_out"],
x: 100, y: 1040,
wires: [["demo_fn_tag_south"]]
}
);
// Tag functions
flow.push(
{
id: "demo_fn_tag_west",
type: "function",
z: "demo_tab_treatment",
name: "Tag: west",
func: "msg._psSource = 'west';\nreturn msg;",
outputs: 1,
x: 280, y: 920,
wires: [["demo_fn_merge_collect"]]
},
{
id: "demo_fn_tag_north",
type: "function",
z: "demo_tab_treatment",
name: "Tag: north",
func: "msg._psSource = 'north';\nreturn msg;",
outputs: 1,
x: 280, y: 980,
wires: [["demo_fn_merge_collect"]]
},
{
id: "demo_fn_tag_south",
type: "function",
z: "demo_tab_treatment",
name: "Tag: south",
func: "msg._psSource = 'south';\nreturn msg;",
outputs: 1,
x: 280, y: 1040,
wires: [["demo_fn_merge_collect"]]
}
);
// Merge collect function
flow.push({
id: "demo_fn_merge_collect",
type: "function",
z: "demo_tab_treatment",
name: "Merge Collector",
func: `// Cache each PS output by _psSource tag, compute totals
const p = msg.payload || {};
const ps = msg._psSource;
const cache = flow.get('merge_cache') || { west: {}, north: {}, south: {} };
const keys = Object.keys(p);
const pick = (prefix) => { const k = keys.find(k => k.startsWith(prefix)); return k ? Number(p[k]) : null; };
if (ps && cache[ps]) {
const nf = pick('netFlowRate.predicted'); if (nf !== null) cache[ps].netFlow = nf;
const fp = pick('volumePercent.predicted'); if (fp !== null) cache[ps].fillPct = fp;
cache[ps].direction = p.direction || cache[ps].direction;
cache[ps].ts = Date.now();
}
flow.set('merge_cache', cache);
const totalFlow = (cache.west.netFlow||0) + (cache.north.netFlow||0) + (cache.south.netFlow||0);
const avgFill = ((cache.west.fillPct||0) + (cache.north.fillPct||0) + (cache.south.fillPct||0)) / 3;
return {
topic: 'merge_combined_influent',
payload: { totalInfluentFlow: +totalFlow.toFixed(1), avgFillPercent: +avgFill.toFixed(1),
west: cache.west, north: cache.north, south: cache.south }
};`,
outputs: 1,
x: 480, y: 980,
wires: [["demo_link_merge_dash"]]
});
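The Merge Collector above caches the latest `netFlow`/`fillPct` per station in flow context and recomputes totals on every incoming message. A minimal sketch of the same caching logic, with the flow context replaced by a closure (hypothetical factory name `makeMergeCollector`) so it can run outside Node-RED:

```javascript
// Sketch of the Merge Collector: cache the latest netFlow per station,
// recompute the combined influent flow on every message.
function makeMergeCollector() {
  const cache = { west: {}, north: {}, south: {} }; // stands in for flow.get/set
  return function collect(source, payload) {
    const p = payload || {};
    const keys = Object.keys(p);
    // Prefix match, since measurement keys may carry suffixes
    const pick = (prefix) => {
      const k = keys.find(key => key.startsWith(prefix));
      return k ? Number(p[k]) : null;
    };
    if (cache[source]) { // ignore unknown sources
      const nf = pick('netFlowRate.predicted');
      if (nf !== null) cache[source].netFlow = nf;
    }
    const totalFlow = (cache.west.netFlow || 0) + (cache.north.netFlow || 0) + (cache.south.netFlow || 0);
    return +totalFlow.toFixed(1);
  };
}
```

Because each station only overwrites its own slot, a burst of messages from one PS cannot erase the cached state of the others.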
// Dashboard link-out for merge data
flow.push({
id: "demo_link_merge_dash",
type: "link out",
z: "demo_tab_treatment",
name: "→ Merge Dashboard",
mode: "link",
links: ["demo_link_merge_dash_in"],
x: 680, y: 980
});
// Create a comment for the merge section
flow.push({
id: "demo_comment_merge",
type: "comment",
z: "demo_tab_treatment",
name: "=== MERGE COLLECTION POINT ===",
info: "Combines output from all 3 pumping stations",
x: 200, y: 880
});
// =============================================
// Validate
// =============================================
const allIds = new Set(flow.map(n => n.id));
let brokenWires = 0;
for (const n of flow) {
if (!n.wires) continue;
for (const port of n.wires) {
for (const target of port) {
if (!allIds.has(target)) {
console.warn(`BROKEN WIRE: ${n.id} → ${target}`);
brokenWires++;
}
}
}
}
for (const n of flow) {
if (n.type === 'link out' && n.links) {
for (const lt of n.links) {
if (!allIds.has(lt)) console.warn(`BROKEN LINK: ${n.id} links to missing ${lt}`);
}
}
if (n.type === 'link in' && n.links) {
for (const ls of n.links) {
if (!allIds.has(ls)) console.warn(`BROKEN LINK: ${n.id} expects link from missing ${ls}`);
}
}
}
if (brokenWires === 0) console.log('All wire references valid ✓');
console.log('Total nodes:', flow.length);
// Write
fs.writeFileSync(FLOW_PATH, JSON.stringify(flow, null, 2) + '\n');
console.log(`Wrote ${FLOW_PATH}`);


@@ -1,583 +0,0 @@
#!/usr/bin/env node
/**
* Step 3: Overview Dashboard Page + KPI Gauges
* - Creates overview page with chain visualization
* - Adds KPI gauges (Total Flow, DO, TSS, NH4)
* - Link-in nodes to feed overview from merge + reactor + effluent data
* - Reorders all page navigation
*/
const fs = require('fs');
const path = require('path');
const FLOW_PATH = path.join(__dirname, '..', 'docker', 'demo-flow.json');
const flow = JSON.parse(fs.readFileSync(FLOW_PATH, 'utf8'));
const byId = (id) => flow.find(n => n.id === id);
// =============================================
// 3a. New config nodes
// =============================================
// Overview page
flow.push({
id: "demo_ui_page_overview",
type: "ui-page",
name: "Plant Overview",
ui: "demo_ui_base",
path: "/overview",
icon: "dashboard",
layout: "grid",
theme: "demo_ui_theme",
breakpoints: [{ name: "Default", px: "0", cols: "12" }],
order: 0,
className: ""
});
// Overview groups
flow.push(
{
id: "demo_ui_grp_overview_chain",
type: "ui-group",
name: "Process Chain",
page: "demo_ui_page_overview",
width: "12",
height: "1",
order: 1,
showTitle: true,
className: ""
},
{
id: "demo_ui_grp_overview_kpi",
type: "ui-group",
name: "Key Indicators",
page: "demo_ui_page_overview",
width: "12",
height: "1",
order: 2,
showTitle: true,
className: ""
}
);
// =============================================
// 3b. Chain visualization - link-in nodes on dashboard tab
// =============================================
// Link-in for merge data (this is what step 2's demo_link_merge_dash links to)
flow.push({
id: "demo_link_merge_dash_in",
type: "link in",
z: "demo_tab_dashboard",
name: "← Merge Data",
links: ["demo_link_merge_dash"],
x: 75, y: 960,
wires: [["demo_fn_overview_parse"]]
});
// We also need reactor and effluent data for the overview.
// Create link-out nodes on treatment tab for overview data
flow.push(
{
id: "demo_link_overview_reactor_out",
type: "link out",
z: "demo_tab_treatment",
name: "→ Overview (Reactor)",
mode: "link",
links: ["demo_link_overview_reactor_in"],
x: 1020, y: 220
},
{
id: "demo_link_overview_reactor_in",
type: "link in",
z: "demo_tab_dashboard",
name: "← Reactor (Overview)",
links: ["demo_link_overview_reactor_out"],
x: 75, y: 1020,
wires: [["demo_fn_overview_reactor_parse"]]
}
);
// Add overview reactor link-out to reactor's wires[0]
const reactor = byId("demo_reactor");
reactor.wires[0].push("demo_link_overview_reactor_out");
// Effluent measurements link for overview KPIs
flow.push(
{
id: "demo_link_overview_eff_out",
type: "link out",
z: "demo_tab_treatment",
name: "→ Overview (Effluent)",
mode: "link",
links: ["demo_link_overview_eff_in"],
x: 620, y: 660
},
{
id: "demo_link_overview_eff_in",
type: "link in",
z: "demo_tab_dashboard",
name: "← Effluent (Overview)",
links: ["demo_link_overview_eff_out"],
x: 75, y: 1080,
wires: [["demo_fn_overview_eff_parse"]]
}
);
// Add overview eff link-out to effluent measurement nodes wires[0]
// TSS and NH4 are the key effluent quality indicators
const effTss = byId("demo_meas_eff_tss");
effTss.wires[0].push("demo_link_overview_eff_out");
const effNh4 = byId("demo_meas_eff_nh4");
effNh4.wires[0].push("demo_link_overview_eff_out");
// =============================================
// 3b. Parse functions for overview
// =============================================
// Parse merge data for chain visualization + total flow gauge
flow.push({
id: "demo_fn_overview_parse",
type: "function",
z: "demo_tab_dashboard",
name: "Parse Overview (Merge)",
func: `const p = msg.payload || {};
// Store in flow context for the template
flow.set('overview_merge', p);
// Output 1: chain vis data, Output 2: total flow gauge
return [
{ topic: 'overview_chain', payload: p },
p.totalInfluentFlow !== undefined ? { topic: 'Total Influent Flow', payload: p.totalInfluentFlow } : null
];`,
outputs: 2,
x: 280, y: 960,
wires: [
["demo_overview_template"],
["demo_gauge_overview_flow"]
]
});
// Parse reactor data for overview
flow.push({
id: "demo_fn_overview_reactor_parse",
type: "function",
z: "demo_tab_dashboard",
name: "Parse Overview (Reactor)",
func: `const p = msg.payload || {};
if (!p.C || !Array.isArray(p.C)) return null;
flow.set('overview_reactor', p);
// Output: DO gauge value
return { topic: 'Reactor DO', payload: Math.round(p.C[0]*100)/100 };`,
outputs: 1,
x: 280, y: 1020,
wires: [["demo_gauge_overview_do"]]
});
// Parse effluent data for overview KPIs
flow.push({
id: "demo_fn_overview_eff_parse",
type: "function",
z: "demo_tab_dashboard",
name: "Parse Overview (Effluent)",
func: `const p = msg.payload || {};
const topic = msg.topic || '';
const val = Number(p.mAbs);
if (!Number.isFinite(val)) return null;
// Route to appropriate gauge based on measurement type
if (topic.includes('TSS') || topic.includes('tss')) {
return [{ topic: 'Effluent TSS', payload: Math.round(val*100)/100 }, null];
}
if (topic.includes('NH4') || topic.includes('ammonium')) {
return [null, { topic: 'Effluent NH4', payload: Math.round(val*100)/100 }];
}
return [null, null];`,
outputs: 2,
x: 280, y: 1080,
wires: [
["demo_gauge_overview_tss"],
["demo_gauge_overview_nh4"]
]
});
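The effluent parser above routes one measurement stream to two gauge outputs by topic substring. A standalone sketch of the same routing (hypothetical name `routeEffluent`), returning the `[TSS, NH4]` output array a two-output function node would emit:

```javascript
// Sketch of the effluent-parse routing: extract the numeric mAbs value,
// then route by topic substring to the [TSS, NH4] output slots
// (null in a slot = no message on that output; null overall = drop).
function routeEffluent(topic, payload) {
  const val = Number((payload || {}).mAbs);
  if (!Number.isFinite(val)) return null;
  const rounded = Math.round(val * 100) / 100;
  const t = topic || '';
  if (t.includes('TSS') || t.includes('tss')) {
    return [{ topic: 'Effluent TSS', payload: rounded }, null];
  }
  if (t.includes('NH4') || t.includes('ammonium')) {
    return [null, { topic: 'Effluent NH4', payload: rounded }];
  }
  return [null, null];
}
```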
// =============================================
// 3b. Chain visualization template
// =============================================
flow.push({
id: "demo_overview_template",
type: "ui-template",
z: "demo_tab_dashboard",
group: "demo_ui_grp_overview_chain",
name: "Process Chain Diagram",
order: 1,
width: "12",
height: "6",
head: "",
format: `<template>
<div class="chain-container">
<svg viewBox="0 0 900 320" class="chain-svg">
<defs>
<marker id="arrowhead" markerWidth="8" markerHeight="7" refX="8" refY="3.5" orient="auto">
<polygon points="0 0, 8 3.5, 0 7" fill="#4fc3f7"/>
</marker>
</defs>
<!-- PS West -->
<g @click="navigateTo('/ps-west')" class="chain-block clickable">
<rect x="20" y="20" width="160" height="80" rx="8" :fill="blockColor(merge?.west)"/>
<text x="100" y="50" class="block-title">PS West</text>
<text x="100" y="70" class="block-value">{{ formatPct(merge?.west?.fillPct) }}</text>
<text x="100" y="86" class="block-sub">{{ formatDir(merge?.west?.direction) }}</text>
</g>
<!-- PS North -->
<g @click="navigateTo('/ps-north')" class="chain-block clickable">
<rect x="20" y="120" width="160" height="80" rx="8" :fill="blockColor(merge?.north)"/>
<text x="100" y="150" class="block-title">PS North</text>
<text x="100" y="170" class="block-value">{{ formatPct(merge?.north?.fillPct) }}</text>
<text x="100" y="186" class="block-sub">{{ formatDir(merge?.north?.direction) }}</text>
</g>
<!-- PS South -->
<g @click="navigateTo('/ps-south')" class="chain-block clickable">
<rect x="20" y="220" width="160" height="80" rx="8" :fill="blockColor(merge?.south)"/>
<text x="100" y="250" class="block-title">PS South</text>
<text x="100" y="270" class="block-value">{{ formatPct(merge?.south?.fillPct) }}</text>
<text x="100" y="286" class="block-sub">{{ formatDir(merge?.south?.direction) }}</text>
</g>
<!-- Merge arrows -->
<line x1="180" y1="60" x2="260" y2="160" class="chain-arrow"/>
<line x1="180" y1="160" x2="260" y2="160" class="chain-arrow"/>
<line x1="180" y1="260" x2="260" y2="160" class="chain-arrow"/>
<!-- Merge point -->
<g class="chain-block">
<rect x="260" y="120" width="120" height="80" rx="8" fill="#0f3460"/>
<text x="320" y="150" class="block-title">Merge</text>
<text x="320" y="170" class="block-value">{{ formatFlow(merge?.totalInfluentFlow) }}</text>
<text x="320" y="186" class="block-sub">m³/h total</text>
</g>
<!-- Arrow merge → reactor -->
<line x1="380" y1="160" x2="420" y2="160" class="chain-arrow"/>
<!-- Reactor -->
<g @click="navigateTo('/treatment')" class="chain-block clickable">
<rect x="420" y="120" width="140" height="80" rx="8" :fill="reactorColor"/>
<text x="490" y="150" class="block-title">Reactor</text>
<text x="490" y="170" class="block-value">DO: {{ reactorDO }}</text>
<text x="490" y="186" class="block-sub">mg/L</text>
</g>
<!-- Arrow reactor → settler -->
<line x1="560" y1="160" x2="600" y2="160" class="chain-arrow"/>
<!-- Settler -->
<g @click="navigateTo('/treatment')" class="chain-block clickable">
<rect x="600" y="120" width="120" height="80" rx="8" fill="#0f3460"/>
<text x="660" y="150" class="block-title">Settler</text>
<text x="660" y="170" class="block-value">TSS: {{ effTSS }}</text>
<text x="660" y="186" class="block-sub">mg/L</text>
</g>
<!-- Arrow settler → effluent -->
<line x1="720" y1="160" x2="760" y2="160" class="chain-arrow"/>
<!-- Effluent -->
<g class="chain-block">
<rect x="760" y="120" width="120" height="80" rx="8" :fill="effluentColor"/>
<text x="820" y="150" class="block-title">Effluent</text>
<text x="820" y="170" class="block-value">NH4: {{ effNH4 }}</text>
<text x="820" y="186" class="block-sub">mg/L</text>
</g>
</svg>
</div>
</template>
<script>
export default {
data() {
return {
merge: null,
reactorDO: '--',
effTSS: '--',
effNH4: '--'
}
},
computed: {
reactorColor() {
const d = parseFloat(this.reactorDO);
if (isNaN(d)) return '#0f3460';
if (d < 1) return '#f44336';
if (d < 2) return '#ff9800';
return '#1b5e20';
},
effluentColor() {
const n = parseFloat(this.effNH4);
if (isNaN(n)) return '#0f3460';
if (n > 10) return '#f44336';
if (n > 5) return '#ff9800';
return '#1b5e20';
}
},
watch: {
msg(val) {
if (!val) return;
const t = val.topic || '';
if (t === 'overview_chain') {
this.merge = val.payload;
} else if (t === 'Reactor DO') {
this.reactorDO = val.payload?.toFixed(1) || '--';
} else if (t === 'Effluent TSS') {
this.effTSS = val.payload?.toFixed(1) || '--';
} else if (t === 'Effluent NH4') {
this.effNH4 = val.payload?.toFixed(1) || '--';
}
}
},
methods: {
navigateTo(path) {
this.$router.push('/dashboard' + path);
},
blockColor(ps) {
if (!ps || ps.fillPct === undefined) return '#0f3460';
if (ps.fillPct > 90) return '#f44336';
if (ps.fillPct > 75) return '#ff9800';
if (ps.fillPct < 10) return '#f44336';
return '#0f3460';
},
formatPct(v) { return v !== undefined && v !== null ? v.toFixed(0) + '%' : '--'; },
formatFlow(v) { return v !== undefined && v !== null ? v.toFixed(0) : '--'; },
formatDir(d) { return d === 'filling' ? '\\u2191 filling' : d === 'emptying' ? '\\u2193 emptying' : '--'; }
}
}
</script>
<style>
.chain-container { width: 100%; overflow-x: auto; }
.chain-svg { width: 100%; height: auto; min-height: 200px; }
.chain-block text { text-anchor: middle; fill: #e0e0e0; }
.block-title { font-size: 14px; font-weight: bold; }
.block-value { font-size: 13px; fill: #4fc3f7; }
.block-sub { font-size: 10px; fill: #90a4ae; }
.chain-arrow { stroke: #4fc3f7; stroke-width: 2; marker-end: url(#arrowhead); }
.clickable { cursor: pointer; }
.clickable:hover rect { opacity: 0.8; }
</style>`,
templateScope: "local",
className: "",
x: 510, y: 960,
wires: [[]]
});
// =============================================
// 3c. KPI gauges on overview
// =============================================
// Total Influent Flow gauge
flow.push({
id: "demo_gauge_overview_flow",
type: "ui-gauge",
z: "demo_tab_dashboard",
group: "demo_ui_grp_overview_kpi",
name: "Total Influent Flow",
gtype: "gauge-34",
gstyle: "Rounded",
title: "Influent Flow",
units: "m\u00b3/h",
prefix: "",
suffix: "m\u00b3/h",
min: 0,
max: 500,
segments: [
{ color: "#2196f3", from: 0 },
{ color: "#4caf50", from: 50 },
{ color: "#ff9800", from: 350 },
{ color: "#f44336", from: 450 }
],
width: 3,
height: 4,
order: 1,
className: "",
x: 510, y: 1020,
wires: []
});
// Reactor DO gauge
flow.push({
id: "demo_gauge_overview_do",
type: "ui-gauge",
z: "demo_tab_dashboard",
group: "demo_ui_grp_overview_kpi",
name: "Reactor DO",
gtype: "gauge-34",
gstyle: "Rounded",
title: "Reactor DO",
units: "mg/L",
prefix: "",
suffix: "mg/L",
min: 0,
max: 10,
segments: [
{ color: "#f44336", from: 0 },
{ color: "#ff9800", from: 1 },
{ color: "#4caf50", from: 2 },
{ color: "#ff9800", from: 6 },
{ color: "#f44336", from: 8 }
],
width: 3,
height: 4,
order: 2,
className: "",
x: 510, y: 1060,
wires: []
});
// Effluent TSS gauge
flow.push({
id: "demo_gauge_overview_tss",
type: "ui-gauge",
z: "demo_tab_dashboard",
group: "demo_ui_grp_overview_kpi",
name: "Effluent TSS",
gtype: "gauge-34",
gstyle: "Rounded",
title: "Effluent TSS",
units: "mg/L",
prefix: "",
suffix: "mg/L",
min: 0,
max: 50,
segments: [
{ color: "#4caf50", from: 0 },
{ color: "#ff9800", from: 25 },
{ color: "#f44336", from: 40 }
],
width: 3,
height: 4,
order: 3,
className: "",
x: 510, y: 1100,
wires: []
});
// Effluent NH4 gauge
flow.push({
id: "demo_gauge_overview_nh4",
type: "ui-gauge",
z: "demo_tab_dashboard",
group: "demo_ui_grp_overview_kpi",
name: "Effluent NH4",
gtype: "gauge-34",
gstyle: "Rounded",
title: "Effluent NH4",
units: "mg/L",
prefix: "",
suffix: "mg/L",
min: 0,
max: 20,
segments: [
{ color: "#4caf50", from: 0 },
{ color: "#ff9800", from: 5 },
{ color: "#f44336", from: 10 }
],
width: 3,
height: 4,
order: 4,
className: "",
x: 510, y: 1140,
wires: []
});
// =============================================
// 3d. Reorder all page navigation
// =============================================
const pageOrders = {
"demo_ui_page_overview": 0,
"demo_ui_page_influent": 1,
"demo_ui_page_treatment": 5,
"demo_ui_page_telemetry": 6,
};
for (const [pageId, order] of Object.entries(pageOrders)) {
const page = byId(pageId);
if (page) page.order = order;
}
// =============================================
// Feed chain visualisation and KPIs from merge + reactor + effluent data.
// The parse functions already drive the gauges, but the overview template
// needs every data source, so wire the reactor and effluent parse outputs
// to it as well.
// =============================================
// Modify overview reactor parse to also send to template
const reactorParse = byId("demo_fn_overview_reactor_parse");
// Currently wires to demo_gauge_overview_do. Add template as well.
reactorParse.func = `const p = msg.payload || {};
if (!p.C || !Array.isArray(p.C)) return null;
flow.set('overview_reactor', p);
// Output 1: DO gauge, Output 2: to chain template
const doVal = Math.round(p.C[0]*100)/100;
return [
{ topic: 'Reactor DO', payload: doVal },
{ topic: 'Reactor DO', payload: doVal }
];`;
reactorParse.outputs = 2;
reactorParse.wires = [["demo_gauge_overview_do"], ["demo_overview_template"]];
// Same for effluent parse - add template output
const effParse = byId("demo_fn_overview_eff_parse");
effParse.func = `const p = msg.payload || {};
const topic = msg.topic || '';
const val = Number(p.mAbs);
if (!Number.isFinite(val)) return null;
const rounded = Math.round(val*100)/100;
// Route to appropriate gauge + template based on measurement type
if (topic.includes('TSS') || topic.includes('tss')) {
return [{ topic: 'Effluent TSS', payload: rounded }, null, { topic: 'Effluent TSS', payload: rounded }];
}
if (topic.includes('NH4') || topic.includes('ammonium')) {
return [null, { topic: 'Effluent NH4', payload: rounded }, { topic: 'Effluent NH4', payload: rounded }];
}
return [null, null, null];`;
effParse.outputs = 3;
effParse.wires = [["demo_gauge_overview_tss"], ["demo_gauge_overview_nh4"], ["demo_overview_template"]];
// =============================================
// Validate
// =============================================
const allIds = new Set(flow.map(n => n.id));
let issues = 0;
for (const n of flow) {
if (!n.wires) continue;
for (const port of n.wires) {
for (const target of port) {
if (!allIds.has(target)) {
console.warn(`BROKEN WIRE: ${n.id} -> ${target}`);
issues++;
}
}
}
if (n.type === 'link out' && n.links) {
for (const lt of n.links) {
if (!allIds.has(lt)) { console.warn(`BROKEN LINK OUT: ${n.id} -> ${lt}`); issues++; }
}
}
if (n.type === 'link in' && n.links) {
for (const ls of n.links) {
if (!allIds.has(ls)) { console.warn(`BROKEN LINK IN: ${n.id} -> ${ls}`); issues++; }
}
}
}
if (issues === 0) console.log('All references valid ✓'); else console.log(`Found ${issues} issues`);
console.log('Total nodes:', flow.length);
// Write
fs.writeFileSync(FLOW_PATH, JSON.stringify(flow, null, 2) + '\n');
console.log(`Wrote ${FLOW_PATH}`);
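The validation pass above catches broken wires and broken link references but not duplicate node ids (the step-4 script adds that check separately). A standalone sketch of the same duplicate-detection idea, with illustrative node data rather than the real flow:

```javascript
// Sketch: detect duplicate node ids in a Node-RED flow array.
// The demo nodes below are illustrative, not taken from the actual flow.
function findDuplicateIds(flow) {
  const counts = {};
  for (const n of flow) counts[n.id] = (counts[n.id] || 0) + 1;
  return Object.entries(counts)
    .filter(([, c]) => c > 1)
    .map(([id, c]) => ({ id, count: c }));
}

const demo = [
  { id: 'a', type: 'function' },
  { id: 'b', type: 'function' },
  { id: 'a', type: 'ui-gauge' } // same id used twice
];
console.log(findDuplicateIds(demo)); // → [ { id: 'a', count: 2 } ]
```

Duplicate ids are worth checking in generated flows because Node-RED silently keeps only one of the clashing nodes on import.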


@@ -1,613 +0,0 @@
#!/usr/bin/env node
/**
* Step 4: Manual Controls per PS Detail Page
* - Creates 3 PS detail pages (/ps-west, /ps-north, /ps-south) with control groups
* - Adds control widgets: mode switches, pump speed sliders
* - Format functions to convert dashboard inputs to process node messages
* - Link-in/out routing between dashboard tab and PS tabs
* - Per-PS monitoring charts on detail pages
*/
const fs = require('fs');
const path = require('path');
const FLOW_PATH = path.join(__dirname, '..', 'docker', 'demo-flow.json');
const flow = JSON.parse(fs.readFileSync(FLOW_PATH, 'utf8'));
const byId = (id) => flow.find(n => n.id === id);
// =============================================
// Helper to create a standard set of controls for a PS
// =============================================
function createPSDetailPage(config) {
const {
psKey, // 'west', 'north', 'south'
psLabel, // 'PS West', 'PS North', 'PS South'
pagePath, // '/ps-west'
pageOrder, // 2, 3, 4
psNodeId, // 'demo_ps_west'
pumps, // [{id: 'demo_pump_w1', label: 'W1'}, ...]
controlModes, // ['levelbased','flowbased','manual']
defaultMode, // 'levelbased'
maxFlow, // 300
basinHeight, // 4
tabId, // 'demo_tab_ps_west'
} = config;
const prefix = `demo_ctrl_${psKey}`;
const nodes = [];
// === Page ===
nodes.push({
id: `demo_ui_page_ps_${psKey}_detail`,
type: "ui-page",
name: `${psLabel} Detail`,
ui: "demo_ui_base",
path: pagePath,
icon: "water_drop",
layout: "grid",
theme: "demo_ui_theme",
breakpoints: [{ name: "Default", px: "0", cols: "12" }],
order: pageOrder,
className: ""
});
// === Groups ===
nodes.push(
{
id: `${prefix}_grp_controls`,
type: "ui-group",
name: `${psLabel} Controls`,
page: `demo_ui_page_ps_${psKey}_detail`,
width: "6",
height: "1",
order: 1,
showTitle: true,
className: ""
},
{
id: `${prefix}_grp_monitoring`,
type: "ui-group",
name: `${psLabel} Monitoring`,
page: `demo_ui_page_ps_${psKey}_detail`,
width: "6",
height: "1",
order: 2,
showTitle: true,
className: ""
},
{
id: `${prefix}_grp_charts`,
type: "ui-group",
name: `${psLabel} Trends`,
page: `demo_ui_page_ps_${psKey}_detail`,
width: "12",
height: "1",
order: 3,
showTitle: true,
className: ""
}
);
// === PS Mode button group ===
const modeOptions = controlModes.map(m => ({
label: m === 'levelbased' ? 'Level' : m === 'flowbased' ? 'Flow' : m.charAt(0).toUpperCase() + m.slice(1),
value: m,
valueType: "str"
}));
nodes.push({
id: `${prefix}_mode`,
type: "ui-button-group",
z: "demo_tab_dashboard",
group: `${prefix}_grp_controls`,
name: `${psLabel} Mode`,
label: "Station Mode",
tooltip: "",
order: 1,
width: "6",
height: "1",
passthru: false,
options: modeOptions,
x: 120, y: 100 + pageOrder * 300,
wires: [[`${prefix}_fn_mode`]]
});
// Format: PS mode → setMode message
nodes.push({
id: `${prefix}_fn_mode`,
type: "function",
z: "demo_tab_dashboard",
name: `Fmt ${psLabel} Mode`,
func: `msg.topic = 'setMode';\nreturn msg;`,
outputs: 1,
x: 320, y: 100 + pageOrder * 300,
wires: [[`${prefix}_link_cmd_out`]]
});
// === Manual Flow slider ===
nodes.push({
id: `${prefix}_flow`,
type: "ui-slider",
z: "demo_tab_dashboard",
group: `${prefix}_grp_controls`,
name: `${psLabel} Flow`,
label: "Manual Flow (m\u00b3/h)",
tooltip: "",
order: 2,
width: "6",
height: "1",
passthru: false,
outs: "end",
min: 0,
max: maxFlow,
step: 1,
x: 120, y: 140 + pageOrder * 300,
wires: [[`${prefix}_fn_flow`]]
});
// Format: flow slider → q_in message
nodes.push({
id: `${prefix}_fn_flow`,
type: "function",
z: "demo_tab_dashboard",
name: `Fmt ${psLabel} Flow`,
func: `msg.topic = 'q_in';\nmsg.payload = { value: Number(msg.payload), unit: 'm3/h' };\nreturn msg;`,
outputs: 1,
x: 320, y: 140 + pageOrder * 300,
wires: [[`${prefix}_link_cmd_out`]]
});
// === Pump controls ===
pumps.forEach((pump, pIdx) => {
const yOff = 180 + pageOrder * 300 + pIdx * 80;
// Pump mode button group
nodes.push({
id: `${prefix}_pump_${pump.label.toLowerCase()}_mode`,
type: "ui-button-group",
z: "demo_tab_dashboard",
group: `${prefix}_grp_controls`,
name: `${pump.label} Mode`,
label: `${pump.label} Mode`,
tooltip: "",
order: 3 + pIdx * 2,
width: "3",
height: "1",
passthru: false,
options: [
{ label: "Auto", value: "auto", valueType: "str" },
{ label: "Virtual", value: "virtualControl", valueType: "str" },
{ label: "Physical", value: "fysicalControl", valueType: "str" }
],
x: 120, y: yOff,
wires: [[`${prefix}_fn_pump_${pump.label.toLowerCase()}_mode`]]
});
// Format: pump mode
nodes.push({
id: `${prefix}_fn_pump_${pump.label.toLowerCase()}_mode`,
type: "function",
z: "demo_tab_dashboard",
name: `Fmt ${pump.label} Mode`,
func: `msg.topic = 'setMode';\nmsg.payload = msg.payload;\nmsg._targetNode = '${pump.id}';\nreturn msg;`,
outputs: 1,
x: 320, y: yOff,
wires: [[`${prefix}_link_pump_${pump.label.toLowerCase()}_out`]]
});
// Pump speed slider
nodes.push({
id: `${prefix}_pump_${pump.label.toLowerCase()}_speed`,
type: "ui-slider",
z: "demo_tab_dashboard",
group: `${prefix}_grp_controls`,
name: `${pump.label} Speed`,
label: `${pump.label} Speed (%)`,
tooltip: "",
order: 4 + pIdx * 2,
width: "3",
height: "1",
passthru: false,
outs: "end",
min: 0,
max: 100,
step: 1,
x: 120, y: yOff + 40,
wires: [[`${prefix}_fn_pump_${pump.label.toLowerCase()}_speed`]]
});
// Format: pump speed → execMovement
nodes.push({
id: `${prefix}_fn_pump_${pump.label.toLowerCase()}_speed`,
type: "function",
z: "demo_tab_dashboard",
name: `Fmt ${pump.label} Speed`,
func: `msg.topic = 'execMovement';\nmsg.payload = { source: 'dashboard', action: 'setpoint', setpoint: Number(msg.payload) };\nmsg._targetNode = '${pump.id}';\nreturn msg;`,
outputs: 1,
x: 320, y: yOff + 40,
wires: [[`${prefix}_link_pump_${pump.label.toLowerCase()}_out`]]
});
// Link-out for pump commands (dashboard → PS tab)
nodes.push({
id: `${prefix}_link_pump_${pump.label.toLowerCase()}_out`,
type: "link out",
z: "demo_tab_dashboard",
name: `${pump.label} Cmd`,
mode: "link",
links: [`${prefix}_link_pump_${pump.label.toLowerCase()}_in`],
x: 520, y: yOff + 20
});
// Link-in on PS tab
nodes.push({
id: `${prefix}_link_pump_${pump.label.toLowerCase()}_in`,
type: "link in",
z: tabId,
name: `${pump.label} Cmd`,
links: [`${prefix}_link_pump_${pump.label.toLowerCase()}_out`],
x: 120, y: 540 + pIdx * 60,
wires: [[pump.id]]
});
});
// === PS command link-out (dashboard → PS tab) ===
nodes.push({
id: `${prefix}_link_cmd_out`,
type: "link out",
z: "demo_tab_dashboard",
name: `${psLabel} Cmd`,
mode: "link",
links: [`${prefix}_link_cmd_in`],
x: 520, y: 120 + pageOrder * 300
});
// Link-in on PS tab for PS-level commands
nodes.push({
id: `${prefix}_link_cmd_in`,
type: "link in",
z: tabId,
name: `${psLabel} Cmd`,
links: [`${prefix}_link_cmd_out`],
x: 120, y: 480,
wires: [[psNodeId]]
});
// === Monitoring widgets on detail page ===
// Re-use existing data from the PS parse functions on dashboard tab
// Create a link-in to receive PS data and parse for detail page
nodes.push({
id: `${prefix}_link_detail_data_out`,
type: "link out",
z: tabId,
name: `${psLabel} Detail`,
mode: "link",
links: [`${prefix}_link_detail_data_in`],
x: 1080, y: 400
});
// Add to PS node wires[0]
const psNode = byId(psNodeId);
if (psNode && psNode.wires && psNode.wires[0]) {
psNode.wires[0].push(`${prefix}_link_detail_data_out`);
}
nodes.push({
id: `${prefix}_link_detail_data_in`,
type: "link in",
z: "demo_tab_dashboard",
name: `${psLabel} Detail`,
links: [`${prefix}_link_detail_data_out`],
x: 75, y: 50 + pageOrder * 300,
wires: [[`${prefix}_fn_detail_parse`]]
});
// Parse function for detail monitoring
nodes.push({
id: `${prefix}_fn_detail_parse`,
type: "function",
z: "demo_tab_dashboard",
name: `Parse ${psLabel} Detail`,
func: `const p = msg.payload || {};
const cache = context.get('c') || {};
const keys = Object.keys(p);
const pick = (prefixes) => { for (const pfx of prefixes) { const k = keys.find(k => k.startsWith(pfx)); if (k) { const v = Number(p[k]); if (Number.isFinite(v)) return v; } } return null; };
const level = pick(['level.predicted.atequipment','level.measured.atequipment']);
const volume = pick(['volume.predicted.atequipment']);
const netFlow = pick(['netFlowRate.predicted.atequipment']);
const fillPct = pick(['volumePercent.predicted.atequipment']);
const direction = p.direction || cache.direction || '?';
if (level !== null) cache.level = level;
if (volume !== null) cache.volume = volume;
if (netFlow !== null) cache.netFlow = netFlow;
if (fillPct !== null) cache.fillPct = fillPct;
cache.direction = direction;
context.set('c', cache);
const now = Date.now();
const dirArrow = cache.direction === 'filling' ? '\\u2191' : cache.direction === 'emptying' ? '\\u2193' : '\\u2014';
const status = [
dirArrow + ' ' + (cache.direction || ''),
cache.netFlow !== undefined ? Math.abs(cache.netFlow).toFixed(0) + ' m\\u00b3/h' : '',
].filter(s => s.trim()).join(' | ');
return [
cache.level !== undefined ? {topic:'${psLabel} Level', payload: cache.level, timestamp: now} : null,
cache.netFlow !== undefined ? {topic:'${psLabel} Flow', payload: cache.netFlow, timestamp: now} : null,
{topic:'${psLabel} Status', payload: status},
cache.fillPct !== undefined ? {payload: Number(cache.fillPct.toFixed(1))} : null,
cache.level !== undefined ? {payload: Number(cache.level.toFixed(2))} : null
];`,
outputs: 5,
x: 280, y: 50 + pageOrder * 300,
wires: [
[`${prefix}_chart_level`],
[`${prefix}_chart_flow`],
[`${prefix}_text_status`],
[`${prefix}_gauge_fill`],
[`${prefix}_gauge_tank`]
]
});
// Level chart
nodes.push({
id: `${prefix}_chart_level`,
type: "ui-chart",
z: "demo_tab_dashboard",
group: `${prefix}_grp_charts`,
name: `${psLabel} Level`,
label: "Basin Level (m)",
order: 1,
width: "6",
height: "5",
chartType: "line",
category: "topic",
categoryType: "msg",
xAxisType: "time",
yAxisLabel: "m",
removeOlder: "10",
removeOlderUnit: "60",
action: "append",
pointShape: "false",
pointRadius: 0,
interpolation: "linear",
showLegend: true,
xAxisProperty: "",
xAxisPropertyType: "timestamp",
yAxisProperty: "payload",
yAxisPropertyType: "msg",
colors: ["#0094ce", "#FF7F0E", "#2CA02C"],
textColor: ["#aaaaaa"],
textColorDefault: false,
gridColor: ["#333333"],
gridColorDefault: false,
x: 510, y: 30 + pageOrder * 300,
wires: []
});
// Flow chart
nodes.push({
id: `${prefix}_chart_flow`,
type: "ui-chart",
z: "demo_tab_dashboard",
group: `${prefix}_grp_charts`,
name: `${psLabel} Flow`,
label: "Net Flow (m\u00b3/h)",
order: 2,
width: "6",
height: "5",
chartType: "line",
category: "topic",
categoryType: "msg",
xAxisType: "time",
yAxisLabel: "m\u00b3/h",
removeOlder: "10",
removeOlderUnit: "60",
action: "append",
pointShape: "false",
pointRadius: 0,
interpolation: "linear",
showLegend: true,
xAxisProperty: "",
xAxisPropertyType: "timestamp",
yAxisProperty: "payload",
yAxisPropertyType: "msg",
colors: ["#4fc3f7", "#FF7F0E", "#2CA02C"],
textColor: ["#aaaaaa"],
textColorDefault: false,
gridColor: ["#333333"],
gridColorDefault: false,
x: 510, y: 60 + pageOrder * 300,
wires: []
});
// Status text
nodes.push({
id: `${prefix}_text_status`,
type: "ui-text",
z: "demo_tab_dashboard",
group: `${prefix}_grp_monitoring`,
name: `${psLabel} Status`,
label: "Status",
order: 1,
width: "6",
height: "1",
format: "{{msg.payload}}",
layout: "row-spread",
x: 510, y: 80 + pageOrder * 300,
wires: []
});
// Fill % gauge
nodes.push({
id: `${prefix}_gauge_fill`,
type: "ui-gauge",
z: "demo_tab_dashboard",
group: `${prefix}_grp_monitoring`,
name: `${psLabel} Fill`,
gtype: "gauge-34",
gstyle: "Rounded",
title: "Fill",
units: "%",
prefix: "",
suffix: "%",
min: 0,
max: 100,
segments: [
{ color: "#f44336", from: 0 },
{ color: "#ff9800", from: 10 },
{ color: "#4caf50", from: 25 },
{ color: "#ff9800", from: 75 },
{ color: "#f44336", from: 90 }
],
width: 3,
height: 3,
order: 2,
className: "",
x: 700, y: 80 + pageOrder * 300,
wires: []
});
// Tank gauge
nodes.push({
id: `${prefix}_gauge_tank`,
type: "ui-gauge",
z: "demo_tab_dashboard",
group: `${prefix}_grp_monitoring`,
name: `${psLabel} Tank`,
gtype: "gauge-tank",
gstyle: "Rounded",
title: "Level",
units: "m",
prefix: "",
suffix: "m",
min: 0,
max: basinHeight,
segments: [
{ color: "#f44336", from: 0 },
{ color: "#ff9800", from: basinHeight * 0.08 },
{ color: "#2196f3", from: basinHeight * 0.25 },
{ color: "#ff9800", from: basinHeight * 0.62 },
{ color: "#f44336", from: basinHeight * 0.8 }
],
width: 3,
height: 4,
order: 3,
className: "",
x: 700, y: 40 + pageOrder * 300,
wires: []
});
return nodes;
}
// =============================================
// Create detail pages for each PS
// =============================================
const westNodes = createPSDetailPage({
psKey: 'west',
psLabel: 'PS West',
pagePath: '/ps-west',
pageOrder: 2,
psNodeId: 'demo_ps_west',
pumps: [
{ id: 'demo_pump_w1', label: 'W1' },
{ id: 'demo_pump_w2', label: 'W2' }
],
controlModes: ['levelbased', 'flowbased', 'manual'],
defaultMode: 'levelbased',
maxFlow: 300,
basinHeight: 4,
tabId: 'demo_tab_ps_west',
});
const northNodes = createPSDetailPage({
psKey: 'north',
psLabel: 'PS North',
pagePath: '/ps-north',
pageOrder: 3,
psNodeId: 'demo_ps_north',
pumps: [
{ id: 'demo_pump_n1', label: 'N1' }
],
controlModes: ['levelbased', 'flowbased', 'manual'],
defaultMode: 'flowbased',
maxFlow: 200,
basinHeight: 3,
tabId: 'demo_tab_ps_north',
});
const southNodes = createPSDetailPage({
psKey: 'south',
psLabel: 'PS South',
pagePath: '/ps-south',
pageOrder: 4,
psNodeId: 'demo_ps_south',
pumps: [
{ id: 'demo_pump_s1', label: 'S1' }
],
controlModes: ['levelbased', 'flowbased', 'manual'],
defaultMode: 'manual',
maxFlow: 100,
basinHeight: 2.5,
tabId: 'demo_tab_ps_south',
});
flow.push(...westNodes, ...northNodes, ...southNodes);
// =============================================
// Validate
// =============================================
const allIds = new Set(flow.map(n => n.id));
let issues = 0;
// Check for duplicate IDs
const idCounts = {};
flow.forEach(n => { idCounts[n.id] = (idCounts[n.id] || 0) + 1; });
for (const [id, count] of Object.entries(idCounts)) {
if (count > 1) { console.warn(`DUPLICATE ID: ${id} (${count} instances)`); issues++; }
}
for (const n of flow) {
if (!n.wires) continue;
for (const port of n.wires) {
for (const target of port) {
if (!allIds.has(target)) {
console.warn(`BROKEN WIRE: ${n.id} -> ${target}`);
issues++;
}
}
}
if (n.type === 'link out' && n.links) {
for (const lt of n.links) {
if (!allIds.has(lt)) { console.warn(`BROKEN LINK OUT: ${n.id} -> ${lt}`); issues++; }
}
}
if (n.type === 'link in' && n.links) {
for (const ls of n.links) {
if (!allIds.has(ls)) { console.warn(`BROKEN LINK IN: ${n.id} -> ${ls}`); issues++; }
}
}
}
if (issues === 0) console.log('All references valid ✓');
else console.log(`Found ${issues} issues`);
// Count nodes per tab
const tabCounts = {};
for (const n of flow) {
if (n.z) tabCounts[n.z] = (tabCounts[n.z] || 0) + 1;
}
console.log('Nodes per tab:', JSON.stringify(tabCounts, null, 2));
console.log('Total nodes:', flow.length);
// Count new nodes added
const newNodeCount = westNodes.length + northNodes.length + southNodes.length;
console.log(`Added ${newNodeCount} new nodes (${westNodes.length} west + ${northNodes.length} north + ${southNodes.length} south)`);
// Write
fs.writeFileSync(FLOW_PATH, JSON.stringify(flow, null, 2) + '\n');
console.log(`Wrote ${FLOW_PATH}`);
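Every command path in `createPSDetailPage` relies on Node-RED link pairs whose `links` arrays must name each other; the validation pass checks exactly that symmetry. A minimal illustrative pair (ids hypothetical, not from the real flow):

```javascript
// Minimal link-out / link-in pair: each node's `links` array names its peer,
// mirroring the cmd routing built in createPSDetailPage.
const linkOut = {
  id: 'cmd_out', type: 'link out', z: 'tab_a',
  mode: 'link', links: ['cmd_in'], x: 100, y: 100
};
const linkIn = {
  id: 'cmd_in', type: 'link in', z: 'tab_b',
  links: ['cmd_out'], x: 100, y: 140, wires: [['target_node']]
};

// Symmetry check, the same condition the validation pass enforces:
console.log(linkOut.links[0] === linkIn.id &&
            linkIn.links[0] === linkOut.id); // → true
```

If either side of the pair is missing its back-reference, the link delivers nothing at runtime, which is why the scripts warn on any `links` entry that resolves to no node.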


@@ -1,279 +0,0 @@
#!/usr/bin/env node
/**
* Script to update docker/demo-flow.json with Fixes 2-5 from the plan.
* Run from project root: node scripts/update-demo-flow.js
*/
const fs = require('fs');
const path = require('path');
const flowPath = path.join(__dirname, '..', 'docker', 'demo-flow.json');
const flow = JSON.parse(fs.readFileSync(flowPath, 'utf8'));
// === Fix 2: Enable simulator on 9 measurement nodes ===
const simMeasIds = [
'demo_meas_flow', 'demo_meas_do', 'demo_meas_nh4',
'demo_meas_ft_n1', 'demo_meas_eff_flow', 'demo_meas_eff_do',
'demo_meas_eff_nh4', 'demo_meas_eff_no3', 'demo_meas_eff_tss'
];
simMeasIds.forEach(id => {
const node = flow.find(n => n.id === id);
if (node) {
node.simulator = true;
console.log('Enabled simulator on', id);
} else {
console.error('NOT FOUND:', id);
}
});
// === Fix 2: Remove 18 inject+function sim pairs ===
const removeSimIds = [
'demo_inj_meas_flow', 'demo_fn_sim_flow',
'demo_inj_meas_do', 'demo_fn_sim_do',
'demo_inj_meas_nh4', 'demo_fn_sim_nh4',
'demo_inj_ft_n1', 'demo_fn_sim_ft_n1',
'demo_inj_eff_flow', 'demo_fn_sim_eff_flow',
'demo_inj_eff_do', 'demo_fn_sim_eff_do',
'demo_inj_eff_nh4', 'demo_fn_sim_eff_nh4',
'demo_inj_eff_no3', 'demo_fn_sim_eff_no3',
'demo_inj_eff_tss', 'demo_fn_sim_eff_tss'
];
// === Fix 5: Remove manual pump startup/setpoint injectors ===
const removeManualIds = [
'demo_inj_w1_startup', 'demo_inj_w1_setpoint',
'demo_inj_w2_startup', 'demo_inj_w2_setpoint',
'demo_inj_n1_startup',
'demo_inj_s1_startup'
];
const allRemoveIds = new Set([...removeSimIds, ...removeManualIds]);
const before = flow.length;
const filtered = flow.filter(n => !allRemoveIds.has(n.id));
console.log(`Removed ${before - filtered.length} nodes (expected 24)`);
// Remove wires to removed nodes from remaining nodes
filtered.forEach(n => {
if (n.wires && Array.isArray(n.wires)) {
n.wires = n.wires.map(wireGroup => {
if (Array.isArray(wireGroup)) {
return wireGroup.filter(w => !allRemoveIds.has(w));
}
return wireGroup;
});
}
});
// === Fix 3 (demo part): Add speedUpFactor to reactor ===
const reactor = filtered.find(n => n.id === 'demo_reactor');
if (reactor) {
reactor.speedUpFactor = 1;
console.log('Added speedUpFactor=1 to reactor');
}
// === Fix 4: Add pressure measurement nodes ===
const maxY = Math.max(...filtered.filter(n => n.z === 'demo_tab_wwtp').map(n => n.y || 0));
const ptBaseConfig = {
scaling: true,
i_offset: 0,
smooth_method: 'mean',
count: 3,
category: 'sensor',
assetType: 'pressure',
enableLog: false,
logLevel: 'error',
positionIcon: '',
hasDistance: false
};
// Function to extract level from PS output and convert to hydrostatic pressure
const levelExtractFunc = [
'// Extract basin level from PS output and convert to hydrostatic pressure (mbar)',
'// P = rho * g * h, rho=1000 kg/m3, g=9.81 m/s2',
'const p = msg.payload || {};',
'const keys = Object.keys(p);',
'const levelKey = keys.find(k => k.startsWith("level.predicted.atequipment") || k.startsWith("level.measured.atequipment"));',
'if (!levelKey) return null;',
'const h = Number(p[levelKey]);',
'if (!Number.isFinite(h)) return null;',
'msg.topic = "measurement";',
'msg.payload = Math.round(h * 98.1 * 10) / 10; // mbar',
'return msg;'
].join('\n');
const newNodes = [
// Comment
{
id: 'demo_comment_pressure',
type: 'comment',
z: 'demo_tab_wwtp',
name: '=== PRESSURE MEASUREMENTS (per pumping station) ===',
info: '',
x: 320,
y: maxY + 40
},
// --- PS West upstream PT ---
{
id: 'demo_fn_level_to_pressure_w',
type: 'function',
z: 'demo_tab_wwtp',
name: 'Level\u2192Pressure (West)',
func: levelExtractFunc,
outputs: 1,
x: 370,
y: maxY + 80,
wires: [['demo_meas_pt_w_up']]
},
{
id: 'demo_meas_pt_w_up',
type: 'measurement',
z: 'demo_tab_wwtp',
name: 'PT-W-UP (West Upstream)',
...ptBaseConfig,
i_min: 0, i_max: 5000, o_min: 0, o_max: 5000,
simulator: false,
uuid: 'pt-w-up-001',
supplier: 'Endress+Hauser',
model: 'Cerabar-PMC51',
unit: 'mbar',
assetTagNumber: 'PT-W-UP',
positionVsParent: 'upstream',
x: 580,
y: maxY + 80,
wires: [['demo_link_process_out'], ['demo_link_influx_out'], ['demo_pump_w1', 'demo_pump_w2']]
},
// PS West downstream PT (simulated)
{
id: 'demo_meas_pt_w_down',
type: 'measurement',
z: 'demo_tab_wwtp',
name: 'PT-W-DN (West Downstream)',
...ptBaseConfig,
i_min: 0, i_max: 5000, o_min: 0, o_max: 5000,
simulator: true,
uuid: 'pt-w-dn-001',
supplier: 'Endress+Hauser',
model: 'Cerabar-PMC51',
unit: 'mbar',
assetTagNumber: 'PT-W-DN',
positionVsParent: 'downstream',
x: 580,
y: maxY + 140,
wires: [['demo_link_process_out'], ['demo_link_influx_out'], ['demo_pump_w1', 'demo_pump_w2']]
},
// --- PS North upstream PT ---
{
id: 'demo_fn_level_to_pressure_n',
type: 'function',
z: 'demo_tab_wwtp',
name: 'Level\u2192Pressure (North)',
func: levelExtractFunc,
outputs: 1,
x: 370,
y: maxY + 220,
wires: [['demo_meas_pt_n_up']]
},
{
id: 'demo_meas_pt_n_up',
type: 'measurement',
z: 'demo_tab_wwtp',
name: 'PT-N-UP (North Upstream)',
...ptBaseConfig,
i_min: 0, i_max: 5000, o_min: 0, o_max: 5000,
simulator: false,
uuid: 'pt-n-up-001',
supplier: 'Endress+Hauser',
model: 'Cerabar-PMC51',
unit: 'mbar',
assetTagNumber: 'PT-N-UP',
positionVsParent: 'upstream',
x: 580,
y: maxY + 220,
wires: [['demo_link_process_out'], ['demo_link_influx_out'], ['demo_pump_n1']]
},
{
id: 'demo_meas_pt_n_down',
type: 'measurement',
z: 'demo_tab_wwtp',
name: 'PT-N-DN (North Downstream)',
...ptBaseConfig,
i_min: 0, i_max: 5000, o_min: 0, o_max: 5000,
simulator: true,
uuid: 'pt-n-dn-001',
supplier: 'Endress+Hauser',
model: 'Cerabar-PMC51',
unit: 'mbar',
assetTagNumber: 'PT-N-DN',
positionVsParent: 'downstream',
x: 580,
y: maxY + 280,
wires: [['demo_link_process_out'], ['demo_link_influx_out'], ['demo_pump_n1']]
},
// --- PS South upstream PT ---
{
id: 'demo_fn_level_to_pressure_s',
type: 'function',
z: 'demo_tab_wwtp',
name: 'Level\u2192Pressure (South)',
func: levelExtractFunc,
outputs: 1,
x: 370,
y: maxY + 360,
wires: [['demo_meas_pt_s_up']]
},
{
id: 'demo_meas_pt_s_up',
type: 'measurement',
z: 'demo_tab_wwtp',
name: 'PT-S-UP (South Upstream)',
...ptBaseConfig,
i_min: 0, i_max: 5000, o_min: 0, o_max: 5000,
simulator: false,
uuid: 'pt-s-up-001',
supplier: 'Endress+Hauser',
model: 'Cerabar-PMC51',
unit: 'mbar',
assetTagNumber: 'PT-S-UP',
positionVsParent: 'upstream',
x: 580,
y: maxY + 360,
wires: [['demo_link_process_out'], ['demo_link_influx_out'], ['demo_pump_s1']]
},
{
id: 'demo_meas_pt_s_down',
type: 'measurement',
z: 'demo_tab_wwtp',
name: 'PT-S-DN (South Downstream)',
...ptBaseConfig,
i_min: 0, i_max: 5000, o_min: 0, o_max: 5000,
simulator: true,
uuid: 'pt-s-dn-001',
supplier: 'Endress+Hauser',
model: 'Cerabar-PMC51',
unit: 'mbar',
assetTagNumber: 'PT-S-DN',
positionVsParent: 'downstream',
x: 580,
y: maxY + 420,
wires: [['demo_link_process_out'], ['demo_link_influx_out'], ['demo_pump_s1']]
}
];
// Wire PS output port 0 to the level-to-pressure function nodes
const psWest = filtered.find(n => n.id === 'demo_ps_west');
const psNorth = filtered.find(n => n.id === 'demo_ps_north');
const psSouth = filtered.find(n => n.id === 'demo_ps_south');
if (psWest && psWest.wires[0]) psWest.wires[0].push('demo_fn_level_to_pressure_w');
if (psNorth && psNorth.wires[0]) psNorth.wires[0].push('demo_fn_level_to_pressure_n');
if (psSouth && psSouth.wires[0]) psSouth.wires[0].push('demo_fn_level_to_pressure_s');
// Combine and write
const result = [...filtered, ...newNodes];
console.log(`Final flow has ${result.length} nodes`);
fs.writeFileSync(flowPath, JSON.stringify(result, null, 2) + '\n');
console.log('Done! Written to docker/demo-flow.json');
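The `levelExtractFunc` above converts basin level to hydrostatic pressure via P = ρ·g·h with ρ = 1000 kg/m³ and g = 9.81 m/s², i.e. 98.1 mbar per metre of water column. A quick standalone check of that arithmetic (the function name here is illustrative):

```javascript
// Hydrostatic pressure of a water column, rounded to 0.1 mbar,
// mirroring the conversion embedded in levelExtractFunc.
function levelToMbar(h) {
  // 1000 kg/m3 * 9.81 m/s2 * h m = 9810*h Pa = 98.1*h mbar
  return Math.round(h * 98.1 * 10) / 10;
}

console.log(levelToMbar(1)); // → 98.1
console.log(levelToMbar(2)); // → 196.2
```

So a 2 m basin level maps comfortably inside the 0–5000 mbar range configured on the PT measurement nodes.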


@@ -1,440 +0,0 @@
[
{
"id": "e2e-flow-tab",
"type": "tab",
"label": "E2E Test Flow",
"disabled": false,
"info": "End-to-end test flow that verifies EVOLV nodes load, accept input, and produce output."
},
{
"id": "inject-trigger",
"type": "inject",
"z": "e2e-flow-tab",
"name": "Trigger once on start",
"props": [
{ "p": "payload" },
{ "p": "topic", "vt": "str" }
],
"repeat": "",
"crontab": "",
"once": true,
"onceDelay": "3",
"topic": "e2e-test",
"payload": "",
"payloadType": "date",
"x": 160,
"y": 80,
"wires": [["build-measurement-msg"]]
},
{
"id": "build-measurement-msg",
"type": "function",
"z": "e2e-flow-tab",
"name": "Build measurement input",
"func": "// Simulate an analog sensor reading sent to the measurement node.\n// The measurement node expects a numeric payload on topic 'analogInput'.\nmsg.payload = 4.2 + Math.random() * 15.8; // 4-20 mA range\nmsg.topic = 'analogInput';\nnode.status({ fill: 'green', shape: 'dot', text: 'sent ' + msg.payload.toFixed(2) });\nreturn msg;",
"outputs": 1,
"timeout": "",
"noerr": 0,
"initialize": "",
"finalize": "",
"libs": [],
"x": 380,
"y": 80,
"wires": [["measurement-e2e-node"]]
},
{
"id": "measurement-e2e-node",
"type": "measurement",
"z": "e2e-flow-tab",
"name": "E2E-Level-Sensor",
"scaling": true,
"i_min": 4,
"i_max": 20,
"i_offset": 0,
"o_min": 0,
"o_max": 5,
"simulator": false,
"smooth_method": "",
"count": "10",
"uuid": "",
"supplier": "e2e-test",
"category": "level",
"assetType": "sensor",
"model": "e2e-virtual",
"unit": "m",
"enableLog": false,
"logLevel": "error",
"positionVsParent": "upstream",
"positionIcon": "",
"hasDistance": false,
"distance": 0,
"distanceUnit": "m",
"distanceDescription": "",
"x": 600,
"y": 80,
"wires": [
["debug-process"],
["debug-dbase"],
["debug-parent"]
]
},
{
"id": "debug-process",
"type": "debug",
"z": "e2e-flow-tab",
"name": "Process Output",
"active": true,
"tosidebar": true,
"console": true,
"tostatus": true,
"complete": "true",
"targetType": "full",
"statusVal": "",
"statusType": "auto",
"x": 830,
"y": 40,
"wires": []
},
{
"id": "debug-dbase",
"type": "debug",
"z": "e2e-flow-tab",
"name": "Database Output",
"active": true,
"tosidebar": true,
"console": true,
"tostatus": true,
"complete": "true",
"targetType": "full",
"statusVal": "",
"statusType": "auto",
"x": 840,
"y": 80,
"wires": []
},
{
"id": "debug-parent",
"type": "debug",
"z": "e2e-flow-tab",
"name": "Parent Output",
"active": true,
"tosidebar": true,
"console": true,
"tostatus": true,
"complete": "true",
"targetType": "full",
"statusVal": "",
"statusType": "auto",
"x": 830,
"y": 120,
"wires": []
},
{
"id": "inject-periodic",
"type": "inject",
"z": "e2e-flow-tab",
"name": "Periodic (5s)",
"props": [
{ "p": "payload" },
{ "p": "topic", "vt": "str" }
],
"repeat": "5",
"crontab": "",
"once": true,
"onceDelay": "6",
"topic": "e2e-heartbeat",
"payload": "",
"payloadType": "date",
"x": 160,
"y": 200,
"wires": [["heartbeat-func"]]
},
{
"id": "heartbeat-func",
"type": "function",
"z": "e2e-flow-tab",
"name": "Heartbeat check",
"func": "// Verify the EVOLV measurement node is running by querying its presence\nmsg.payload = {\n check: 'heartbeat',\n timestamp: Date.now(),\n nodeCount: global.get('_e2e_msg_count') || 0\n};\n// Increment message counter\nlet count = global.get('_e2e_msg_count') || 0;\nglobal.set('_e2e_msg_count', count + 1);\nnode.status({ fill: 'blue', shape: 'ring', text: 'beat #' + (count+1) });\nreturn msg;",
"outputs": 1,
"timeout": "",
"noerr": 0,
"initialize": "",
"finalize": "",
"libs": [],
"x": 380,
"y": 200,
"wires": [["debug-heartbeat"]]
},
{
"id": "debug-heartbeat",
"type": "debug",
"z": "e2e-flow-tab",
"name": "Heartbeat Debug",
"active": true,
"tosidebar": true,
"console": true,
"tostatus": false,
"complete": "payload",
"targetType": "msg",
"statusVal": "",
"statusType": "auto",
"x": 600,
"y": 200,
"wires": []
},
{
"id": "inject-monster-prediction",
"type": "inject",
"z": "e2e-flow-tab",
"name": "Monster prediction",
"props": [
{ "p": "payload" },
{ "p": "topic", "vt": "str" }
],
"repeat": "",
"crontab": "",
"once": true,
"onceDelay": "4",
"topic": "model_prediction",
"payload": "120",
"payloadType": "num",
"x": 150,
"y": 320,
"wires": [["evolv-monster"]]
},
{
"id": "inject-monster-flow",
"type": "inject",
"z": "e2e-flow-tab",
"name": "Monster flow",
"props": [
{ "p": "payload" },
{ "p": "topic", "vt": "str" }
],
"repeat": "3",
"crontab": "",
"once": true,
"onceDelay": "5",
"topic": "i_flow",
"payload": "3600",
"payloadType": "num",
"x": 140,
"y": 360,
"wires": [["evolv-monster"]]
},
{
"id": "inject-monster-start",
"type": "inject",
"z": "e2e-flow-tab",
"name": "Monster start",
"props": [
{ "p": "payload" },
{ "p": "topic", "vt": "str" }
],
"repeat": "",
"crontab": "",
"once": true,
"onceDelay": "6",
"topic": "start",
"payload": "",
"payloadType": "date",
"x": 140,
"y": 400,
"wires": [["evolv-monster"]]
},
{
"id": "evolv-monster",
"type": "monster",
"z": "e2e-flow-tab",
"name": "E2E-Monster",
"samplingtime": 1,
"minvolume": 5,
"maxweight": 23,
"emptyWeightBucket": 3,
"aquon_sample_name": "112100",
"supplier": "e2e-test",
"subType": "samplingCabinet",
"model": "e2e-virtual",
"unit": "m3/h",
"enableLog": false,
"logLevel": "error",
"x": 390,
"y": 360,
"wires": [
["debug-monster-process"],
["debug-monster-dbase"],
[],
[]
]
},
{
"id": "debug-monster-process",
"type": "debug",
"z": "e2e-flow-tab",
"name": "Monster Process Output",
"active": true,
"tosidebar": true,
"console": true,
"tostatus": true,
"complete": "true",
"targetType": "full",
"statusVal": "",
"statusType": "auto",
"x": 660,
"y": 340,
"wires": []
},
{
"id": "debug-monster-dbase",
"type": "debug",
"z": "e2e-flow-tab",
"name": "Monster Database Output",
"active": true,
"tosidebar": true,
"console": true,
"tostatus": true,
"complete": "true",
"targetType": "full",
"statusVal": "",
"statusType": "auto",
"x": 670,
"y": 380,
"wires": []
},
{
"id": "inject-dashboardapi-register",
"type": "inject",
"z": "e2e-flow-tab",
"name": "DashboardAPI register child",
"props": [
{ "p": "payload" },
{ "p": "topic", "vt": "str" }
],
"repeat": "",
"crontab": "",
"once": true,
"onceDelay": "12",
"payload": "",
"payloadType": "date",
"x": 160,
"y": 500,
"wires": [["build-dashboardapi-msg"]]
},
{
"id": "build-dashboardapi-msg",
"type": "function",
"z": "e2e-flow-tab",
"name": "Build dashboardapi input",
"func": "msg.topic = 'registerChild';\nmsg.payload = {\n config: {\n general: {\n name: 'E2E-Level-Sensor'\n },\n functionality: {\n softwareType: 'measurement'\n }\n }\n};\nreturn msg;",
"outputs": 1,
"timeout": "",
"noerr": 0,
"initialize": "",
"finalize": "",
"libs": [],
"x": 400,
"y": 500,
"wires": [["dashboardapi-e2e"]]
},
{
"id": "dashboardapi-e2e",
"type": "dashboardapi",
"z": "e2e-flow-tab",
"name": "E2E-DashboardAPI",
"logLevel": "error",
"enableLog": false,
"host": "grafana",
"port": "3000",
"bearerToken": "",
"x": 660,
"y": 500,
"wires": [["debug-dashboardapi-output"]]
},
{
"id": "debug-dashboardapi-output",
"type": "debug",
"z": "e2e-flow-tab",
"name": "DashboardAPI Output",
"active": true,
"tosidebar": true,
"console": true,
"tostatus": true,
"complete": "true",
"targetType": "full",
"statusVal": "",
"statusType": "auto",
"x": 920,
"y": 500,
"wires": []
},
{
"id": "inject-diffuser-flow",
"type": "inject",
"z": "e2e-flow-tab",
"name": "Diffuser airflow",
"props": [
{ "p": "payload" },
{ "p": "topic", "vt": "str" }
],
"repeat": "",
"crontab": "",
"once": true,
"onceDelay": "9",
"topic": "air_flow",
"payload": "24",
"payloadType": "num",
"x": 150,
"y": 620,
"wires": [["diffuser-e2e"]]
},
{
"id": "diffuser-e2e",
"type": "diffuser",
"z": "e2e-flow-tab",
"name": "E2E-Diffuser",
"number": 1,
"i_elements": 4,
"i_diff_density": 2.4,
"i_m_water": 4.5,
"alfaf": 0.7,
"enableLog": false,
"logLevel": "error",
"x": 390,
"y": 620,
"wires": [["debug-diffuser-process"], ["debug-diffuser-dbase"], []]
},
{
"id": "debug-diffuser-process",
"type": "debug",
"z": "e2e-flow-tab",
"name": "Diffuser Process Output",
"active": true,
"tosidebar": true,
"console": true,
"tostatus": true,
"complete": "true",
"targetType": "full",
"statusVal": "",
"statusType": "auto",
"x": 670,
"y": 600,
"wires": []
},
{
"id": "debug-diffuser-dbase",
"type": "debug",
"z": "e2e-flow-tab",
"name": "Diffuser Database Output",
"active": true,
"tosidebar": true,
"console": true,
"tostatus": true,
"complete": "true",
"targetType": "full",
"statusVal": "",
"statusType": "auto",
"x": 680,
"y": 640,
"wires": []
}
]

View File

@@ -1,213 +0,0 @@
#!/usr/bin/env bash
#
# End-to-end test runner for EVOLV Node-RED stack.
# Starts Node-RED + InfluxDB + Grafana via Docker Compose,
# verifies that EVOLV nodes are registered in the palette,
# and tears down the stack on exit.
#
set -euo pipefail
SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
PROJECT_ROOT="$(cd "$SCRIPT_DIR/../.." && pwd)"
COMPOSE_FILE="$PROJECT_ROOT/docker-compose.e2e.yml"
NODERED_URL="http://localhost:1880"
MAX_WAIT=120 # seconds to wait for Node-RED to become healthy
GRAFANA_URL="http://localhost:3000/api/health"
MAX_GRAFANA_WAIT=60
LOG_WAIT=20
# EVOLV node types that must appear in the palette (from package.json node-red.nodes)
EXPECTED_NODES=(
"dashboardapi"
"diffuser"
"machineGroupControl"
"measurement"
"monster"
"pumpingstation"
"reactor"
"rotatingMachine"
"settler"
"valve"
"valveGroupControl"
)
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
NC='\033[0m' # No Color
log_info() { echo -e "${GREEN}[INFO]${NC} $*"; }
log_warn() { echo -e "${YELLOW}[WARN]${NC} $*"; }
log_error() { echo -e "${RED}[ERROR]${NC} $*"; }
wait_for_log_pattern() {
local pattern="$1"
local description="$2"
local required="${3:-false}"
local elapsed=0
local logs=""
while [ $elapsed -lt $LOG_WAIT ]; do
logs=$(run_compose logs nodered 2>&1)
if echo "$logs" | grep -q "$pattern"; then
log_info " [PASS] $description"
return 0
fi
sleep 2
elapsed=$((elapsed + 2))
done
if [ "$required" = true ]; then
log_error " [FAIL] $description not detected in logs"
FAILURES=$((FAILURES + 1))
else
log_warn " [WARN] $description not detected in logs"
fi
return 1
}
# Determine docker compose command (handle permission via sg docker if needed)
USE_SG_DOCKER=false
if ! docker info >/dev/null 2>&1; then
if sg docker -c "docker info" >/dev/null 2>&1; then
USE_SG_DOCKER=true
log_info "Using sg docker for Docker access"
else
log_error "Docker is not accessible. Please ensure Docker is running and you have permissions."
exit 1
fi
fi
run_compose() {
if [ "$USE_SG_DOCKER" = true ]; then
local cmd="docker compose -f $(printf '%q' "$COMPOSE_FILE")"
local arg
for arg in "$@"; do
cmd+=" $(printf '%q' "$arg")"
done
sg docker -c "$cmd"
else
docker compose -f "$COMPOSE_FILE" "$@"
fi
}
cleanup() {
log_info "Tearing down E2E stack..."
run_compose down --volumes --remove-orphans 2>/dev/null || true
}
# Always clean up on exit
trap cleanup EXIT
# --- Step 1: Build and start the stack ---
log_info "Building and starting E2E stack..."
run_compose up -d --build
# --- Step 2: Wait for Node-RED to be healthy ---
log_info "Waiting for Node-RED to become healthy (max ${MAX_WAIT}s)..."
elapsed=0
while [ $elapsed -lt $MAX_WAIT ]; do
if curl -sf "$NODERED_URL/" >/dev/null 2>&1; then
log_info "Node-RED is up after ${elapsed}s"
break
fi
sleep 2
elapsed=$((elapsed + 2))
done
if [ $elapsed -ge $MAX_WAIT ]; then
log_error "Node-RED did not become healthy within ${MAX_WAIT}s"
log_error "Container logs:"
run_compose logs nodered
exit 1
fi
# Give Node-RED a few extra seconds to finish loading all nodes and editor metadata
sleep 8
# --- Step 3: Verify EVOLV nodes are registered in the palette ---
log_info "Querying Node-RED for registered nodes..."
NODES_RESPONSE=$(curl -sf "$NODERED_URL/nodes" 2>&1) || {
log_error "Failed to query Node-RED /nodes endpoint"
exit 1
}
FAILURES=0
PALETTE_MISSES=0
for node_type in "${EXPECTED_NODES[@]}"; do
if echo "$NODES_RESPONSE" | grep -qi "$node_type"; then
log_info " [PASS] Node type '$node_type' found in palette"
else
log_warn " [WARN] Node type '$node_type' not found in /nodes response"
PALETTE_MISSES=$((PALETTE_MISSES + 1))
fi
done
# --- Step 4: Verify flows are deployed ---
log_info "Checking deployed flows..."
FLOWS_RESPONSE=$(curl -sf "$NODERED_URL/flows" 2>&1) || {
log_error "Failed to query Node-RED /flows endpoint"
exit 1
}
if echo "$FLOWS_RESPONSE" | grep -q "e2e-flow-tab"; then
log_info " [PASS] E2E test flow is deployed"
else
log_warn " [WARN] E2E test flow not found in deployed flows (may need manual deploy)"
fi
# --- Step 5: Verify InfluxDB is reachable ---
log_info "Checking InfluxDB health..."
INFLUX_HEALTH=$(curl -sf "http://localhost:8086/health" 2>&1) || {
log_error "Failed to reach InfluxDB health endpoint"
INFLUX_HEALTH=""
}
if echo "$INFLUX_HEALTH" | grep -q '"status":"pass"'; then
log_info " [PASS] InfluxDB is healthy"
else
log_error " [FAIL] InfluxDB health check failed"
FAILURES=$((FAILURES + 1))
fi
# --- Step 5b: Verify Grafana is reachable ---
log_info "Checking Grafana health..."
GRAFANA_HEALTH=""
elapsed=0
while [ $elapsed -lt $MAX_GRAFANA_WAIT ]; do
GRAFANA_HEALTH=$(curl -sf "$GRAFANA_URL" 2>&1) && break
sleep 2
elapsed=$((elapsed + 2))
done
if echo "$GRAFANA_HEALTH" | grep -Eq '"database"[[:space:]]*:[[:space:]]*"ok"'; then
log_info " [PASS] Grafana is healthy"
else
log_error " [FAIL] Grafana health check failed"
FAILURES=$((FAILURES + 1))
fi
# --- Step 5c: Verify EVOLV measurement node produced output ---
log_info "Checking EVOLV measurement node output in container logs..."
wait_for_log_pattern "Database Output" "EVOLV measurement node produced database output" true || true
wait_for_log_pattern "Process Output" "EVOLV measurement node produced process output" true || true
wait_for_log_pattern "Monster Process Output" "EVOLV monster node produced process output" true || true
wait_for_log_pattern "Monster Database Output" "EVOLV monster node produced database output" true || true
wait_for_log_pattern "Diffuser Process Output" "EVOLV diffuser node produced process output" true || true
wait_for_log_pattern "Diffuser Database Output" "EVOLV diffuser node produced database output" true || true
wait_for_log_pattern "DashboardAPI Output" "EVOLV dashboardapi node produced create output" true || true
# --- Step 6: Summary ---
echo ""
if [ $FAILURES -eq 0 ]; then
log_info "========================================="
log_info " E2E tests PASSED - all checks green"
log_info "========================================="
exit 0
else
log_error "========================================="
log_error " E2E tests FAILED - $FAILURES check(s) failed"
log_error "========================================="
exit 1
fi

View File

@@ -1,40 +0,0 @@
# EVOLV Scientific & Technical Reference Library
## Purpose
This directory contains curated reference documents for EVOLV's domain-specialist agents. These summaries distill authoritative sources into actionable knowledge that agents should consult **before making scientific or engineering claims**.
## How Agents Should Use This
1. **Before making domain claims**: Read the relevant reference doc to verify your reasoning
2. **Cite sources**: When referencing scientific facts, point to the specific reference doc and its cited sources
3. **Acknowledge uncertainty**: If the reference docs don't cover a topic, say so rather than guessing
4. **Cross-reference with skills**: Combine these references with `.agents/skills/` SKILL.md files for implementation context
## Index
| File | Domain | Used By Agents |
|------|--------|---------------|
| [`asm-models.md`](asm-models.md) | Activated Sludge Models (ASM1-ASM3) | biological-process-engineer |
| [`settling-models.md`](settling-models.md) | Sludge Settling & Clarifier Models | biological-process-engineer |
| [`pump-affinity-laws.md`](pump-affinity-laws.md) | Pump Affinity Laws & Curve Theory | mechanical-process-engineer |
| [`pid-control-theory.md`](pid-control-theory.md) | PID Control for Process Applications | mechanical-process-engineer, node-red-runtime |
| [`signal-processing-sensors.md`](signal-processing-sensors.md) | Sensor Signal Conditioning | instrumentation-measurement |
| [`wastewater-compliance-nl.md`](wastewater-compliance-nl.md) | Dutch Wastewater Regulations | commissioning-compliance, biological-process-engineer |
| [`influxdb-schema-design.md`](influxdb-schema-design.md) | InfluxDB Time-Series Best Practices | telemetry-database |
| [`ot-security-iec62443.md`](ot-security-iec62443.md) | OT Security Standards | ot-security-integration |
## Sources Directory
The `sources/` subdirectory is for placing actual PDFs of scientific papers, standards, and technical manuals. Agents should prefer these curated summaries but can reference originals when available.
## Validation Status
All reference documents have been validated against authoritative sources including:
- IWA Scientific and Technical Reports (ASM models)
- Peer-reviewed publications (Takacs 1991, Vesilind, Burger-Diehl)
- Engineering Toolbox (pump affinity laws)
- ISA publications (Astrom & Hagglund PID control)
- IEC standards (61298, 62443)
- EU Directive 91/271/EEC (wastewater compliance)
- InfluxDB official documentation (schema design)

View File

@@ -1,6 +1,6 @@
---
title: Wiki Index
updated: 2026-04-07
updated: 2026-04-13
---
# EVOLV Project Wiki Index
@@ -20,6 +20,14 @@ updated: 2026-04-07
## Core Concepts
- [generalFunctions API](concepts/generalfunctions-api.md) — logger, MeasurementContainer, configManager, etc.
- [Pump Affinity Laws](concepts/pump-affinity-laws.md) — Q ∝ N, H ∝ N², P ∝ N³
- [ASM Models](concepts/asm-models.md) — activated sludge model kinetics
- [PID Control Theory](concepts/pid-control-theory.md) — proportional-integral-derivative control
- [Settling Models](concepts/settling-models.md) — secondary clarifier sludge settling
- [Signal Processing for Sensors](concepts/signal-processing-sensors.md) — sensor conditioning
- [InfluxDB Schema Design](concepts/influxdb-schema-design.md) — telemetry data model
- [OT Security (IEC 62443)](concepts/ot-security-iec62443.md) — industrial security standard
- [Wastewater Compliance NL](concepts/wastewater-compliance-nl.md) — Dutch regulatory requirements
## Findings
- [BEP-Gravitation Proof](findings/bep-gravitation-proof.md) — within 0.1% of brute-force optimum (proven)
@@ -28,21 +36,22 @@ updated: 2026-04-07
- [Pump Switching Stability](findings/pump-switching-stability.md) — 1-2 transitions, no hysteresis (proven)
- [Open Issues (2026-03)](findings/open-issues-2026-03.md) — diffuser, monster refactor, ML relocation, etc.
## Manuals
- [rotatingMachine User Manual](manuals/nodes/rotatingMachine.md) — inputs, outputs, state machine, examples
- [measurement User Manual](manuals/nodes/measurement.md) — analog + digital modes, smoothing, outlier filtering
- [FlowFuse Dashboard Layout](manuals/node-red/flowfuse-dashboard-layout-manual.md)
- [FlowFuse Widget Catalog](manuals/node-red/flowfuse-widgets-catalog.md)
- [Node-RED Function Patterns](manuals/node-red/function-node-patterns.md)
- [Node-RED Runtime](manuals/node-red/runtime-node-js.md)
- [Messages and Editor Structure](manuals/node-red/messages-and-editor-structure.md)
## Sessions
- [2026-04-07: Production Hardening](sessions/2026-04-07-production-hardening.md) — rotatingMachine + machineGroupControl
- [2026-04-13: rotatingMachine Trial-Ready](sessions/2026-04-13-rotatingMachine-trial-ready.md) — FSM interruptibility, config schema sync, UX polish, dual-curve tests
- [2026-04-13: measurement Digital Mode](sessions/2026-04-13-measurement-digital-mode.md) — silent dispatcher bug fix, 59 new tests, MQTT-style multi-channel input mode
## Other Documentation (outside wiki)
- `CLAUDE.md` — Claude Code project guide (root)
- `AGENTS.md` — agent routing table, orchestrator policy (root, used by `.claude/agents/`)
- `.agents/AGENTS.md` — agent routing table, orchestrator policy
- `.agents/` — skills, decisions, function-anchors, improvements
- `.claude/` — Claude Code agents and rules
- `manuals/node-red/` — FlowFuse dashboard and Node-RED reference docs
## Not Yet Documented
- Parent-child registration protocol (Port 2 handshake)
- Prediction health scoring algorithm (confidence 0-1)
- MeasurementContainer internals (chainable API, delta compression)
- PID controller implementation
- reactor / settler / monster / measurement / valve nodes
- pumpingStation node (uses rotatingMachine children)
- InfluxDB telemetry format (Port 1)

View File

@@ -0,0 +1,203 @@
---
title: measurement — User Manual
node: measurement
updated: 2026-04-13
status: trial-ready
---
# measurement — User Manual
The `measurement` node is the sensor side of every EVOLV flow. It takes raw signal data, applies offset / scaling / smoothing / outlier rejection, and publishes a conditioned value into the shared `MeasurementContainer`. A parent equipment node (rotatingMachine, pumpingStation, reactor, ...) subscribes automatically via the child-registration handshake on port 2.
## At a glance
| Item | Value |
|---|---|
| Node category | EVOLV |
| Inputs | 1 (message-driven) |
| Outputs | 3 — `process` / `dbase` / `parent` |
| Tick period | 1 s |
| Input modes | `analog` (default) — one scalar per msg. `digital` — object payload with many keys. |
| Smoothing methods | 12 (`none`, `mean`, `min`, `max`, `sd`, `lowPass`, `highPass`, `weightedMovingAverage`, `bandPass`, `median`, `kalman`, `savitzkyGolay`) |
| Outlier methods | 3 (`zScore`, `iqr`, `modifiedZScore`) |
## Choosing a mode
### Analog — one scalar per message (PLC / 4-20 mA)
The classic pattern — what the node did before v1.1. `msg.payload` is a single number. The node runs one offset → scaling → smoothing → outlier pipeline and emits exactly one MeasurementContainer slot keyed by the asset's type + position.
```json
{ "topic": "measurement", "payload": 12.34 }
```
Use when one Node-RED `measurement` node represents one physical sensor.
### Digital — object payload, many channels (MQTT / IoT / JSON)
Use when one Node-RED `measurement` node represents one physical **device** that publishes multiple readings. Common shapes:
```json
{ "topic": "measurement",
"payload": { "temperature": 22.5, "humidity": 45, "pressure": 1013 } }
```
```json
{ "topic": "measurement",
"payload": { "co2": 618, "voc": 122, "pm25": 8 } }
```
Each top-level key maps to a **channel** with its own `type`, `position`, `unit`, and pipeline parameters. Unknown keys are ignored (logged at debug).
## Configuration
### Common (both modes)
- **Asset** (menu): supplier, category, asset type (`assetType`), model, unit.
- **Logger** (menu): log level + enable flag.
- **Position** (menu): `upstream` / `atEquipment` / `downstream`, optional distance offset.
### Analog fields
| Field | Meaning |
|---|---|
| **Scaling** | enables linear interpolation from source range to process range |
| **Source Min / Max** | raw input bounds (e.g. `4` / `20` for mA) |
| **Input Offset** | additive bias applied before scaling |
| **Process Min / Max** | mapped output bounds (e.g. `0` / `3000` for mbar) |
| **Simulator** | internal random-walk source for testing |
| **Smoothing** | method (dropdown) |
| **Window** | smoothing window size |
### Digital fields
- **Input Mode**: set to `digital` in the dropdown.
- **Channels (JSON)**: array of channel definitions.
Each channel:
```json
{
"key": "temperature",
"type": "temperature",
"position": "atEquipment",
"unit": "C",
"scaling": { "enabled": false, "inputMin": 0, "inputMax": 1, "absMin": -50, "absMax": 150, "offset": 0 },
"smoothing": { "smoothWindow": 5, "smoothMethod": "mean" },
"outlierDetection": { "enabled": true, "method": "zScore", "threshold": 3 }
}
```
`scaling` / `smoothing` / `outlierDetection` are optional — missing sections inherit the top-level analog-mode fields. `key` is the JSON field name inside `msg.payload`; `type` is the MeasurementContainer axis — any string works, not just the physical-unit-backed defaults.
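The inheritance rule above can be sketched as a shallow merge of each channel over the node-level analog defaults. This is an illustrative sketch, not the node's actual code; `resolveChannel` and the default values are hypothetical:

```javascript
// Hypothetical sketch: merge a channel definition with analog-mode defaults.
// Missing sections (scaling / smoothing / outlierDetection) inherit the
// top-level analog-mode fields, as described above.
const analogDefaults = {
  scaling: { enabled: false, inputMin: 0, inputMax: 1, absMin: 0, absMax: 100, offset: 0 },
  smoothing: { smoothWindow: 5, smoothMethod: "mean" },
  outlierDetection: { enabled: false, method: "zScore", threshold: 3 }
};

function resolveChannel(channel, defaults = analogDefaults) {
  return {
    key: channel.key,
    type: channel.type,
    position: channel.position || "atEquipment",
    unit: channel.unit,
    scaling: { ...defaults.scaling, ...(channel.scaling || {}) },
    smoothing: { ...defaults.smoothing, ...(channel.smoothing || {}) },
    outlierDetection: { ...defaults.outlierDetection, ...(channel.outlierDetection || {}) }
  };
}

// A channel with no smoothing section inherits the node-level method:
const ch = resolveChannel({ key: "humidity", type: "humidity", unit: "%" });
// ch.smoothing.smoothMethod -> "mean"
```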
## Input topics
| Topic | Payload | Effect |
|---|---|---|
| `measurement` | number (analog) / object (digital) | drives the pipeline |
| `simulator` | — | toggle the internal random-walk simulator |
| `outlierDetection` | — | toggle outlier rejection |
| `calibrate` | — | set the offset so the current output matches `Source Min` (scaling on) or `Process Min` (scaling off). Requires a stable window — aborts if the signal is fluctuating. |
## Output ports
### Port 0 — process
Delta-compressed payload.
**Analog** shape:
```json
{ "mAbs": 4.2, "mPercent": 42, "totalMinValue": 0, "totalMaxValue": 100,
"totalMinSmooth": 0, "totalMaxSmooth": 4.2 }
```
**Digital** shape:
```json
{ "channels": {
"temperature": { "key": "temperature", "type": "temperature", "position": "atEquipment",
"unit": "C", "mAbs": 24, "mPercent": 37,
"totalMinValue": 22.5, "totalMaxValue": 25.5,
"totalMinSmooth": 22.5, "totalMaxSmooth": 24 },
"humidity": { ... },
"pressure": { ... }
} }
```
### Port 1 — dbase
InfluxDB line-protocol telemetry. Tags = asset metadata; fields = measurements. See [InfluxDB Schema Design](../../concepts/influxdb-schema-design.md).
### Port 2 — parent
`{ topic: "registerChild", payload: <nodeId>, positionVsParent, distance }` — emitted once ~200 ms after deploy so the parent equipment node registers this sensor.
## Pipeline per value
1. **Outlier check** (if enabled) — rejects via zScore / IQR / modifiedZScore. Rejected values never advance, don't update min/max, don't emit.
2. **Offset** — `value + scaling.offset`.
3. **Scaling** (if enabled) — linear interpolation from `[inputMin, inputMax]` to `[absMin, absMax]` with boundary clamping.
4. **Smoothing** — current value pushed into the rolling window; the configured method produces the smoothed output.
5. **Min/Max tracking** — both raw (pre-smoothing) and smoothed min/max tracked for display.
6. **Constrain** — smoothed value clamped to `[absMin, absMax]`.
7. **Emit** — `MeasurementContainer.type(...).variant('measured').position(...).distance(...).value(out, ts, unit)` triggers the event `<type>.measured.<position>` (lowercase) that the parent equipment subscribes to.
In digital mode, each channel runs this pipeline independently.
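The ordering of the steps can be sketched as a small closure. This is illustrative only (a mean filter stands in for the 12 smoothing methods, and the outlier stage is omitted); it is not the node's implementation:

```javascript
// Illustrative sketch of the per-value pipeline order:
// offset -> scaling -> smoothing -> constrain.
function makePipeline(cfg) {
  const win = [];                        // rolling smoothing window
  return function process(raw) {
    // Offset: additive bias applied before scaling
    let v = raw + (cfg.offset || 0);
    // Scaling: linear interpolation with boundary clamping
    if (cfg.scalingEnabled) {
      const t = (v - cfg.inputMin) / (cfg.inputMax - cfg.inputMin);
      v = cfg.absMin + Math.min(Math.max(t, 0), 1) * (cfg.absMax - cfg.absMin);
    }
    // Smoothing: rolling-window mean stands in for the configured method
    win.push(v);
    if (win.length > cfg.smoothWindow) win.shift();
    let out = win.reduce((a, b) => a + b, 0) / win.length;
    // Constrain: clamp the smoothed value to [absMin, absMax]
    if (cfg.scalingEnabled) out = Math.min(Math.max(out, cfg.absMin), cfg.absMax);
    return out;
  };
}

// 4-20 mA mapped to 0-3000 mbar, window of 1 (no averaging):
const p = makePipeline({ offset: 0, scalingEnabled: true,
  inputMin: 4, inputMax: 20, absMin: 0, absMax: 3000, smoothWindow: 1 });
// p(12) -> 1500 (mid-range)
```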
## Smoothing methods — quick reference
| Method | Use case |
|---|---|
| `none` | pass raw value through — useful for testing |
| `mean` | simple arithmetic average over window |
| `min` / `max` | worst-case / peak reporting |
| `sd` | outputs standard deviation (noise indicator) |
| `median` | outlier-resistant central tendency |
| `weightedMovingAverage` | later samples weighted higher |
| `lowPass` | EMA-style attenuation of high-frequency noise |
| `highPass` | emphasises rapid changes (step detection) |
| `bandPass` | `lowPass + highPass - raw` — band-of-interest filtering |
| `kalman` | recursive noise filter, converges to steady value |
| `savitzkyGolay` | polynomial smoothing over 5-point window |
## Outlier methods — quick reference
| Method | Best when |
|---|---|
| `zScore` | signal is approximately normal; threshold = # of SDs |
| `iqr` | signal is non-normal; robust to skewed distributions |
| `modifiedZScore` | small samples; uses median / MAD instead of mean / SD |
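As a concrete illustration of the median/MAD approach, a minimal modified z-score test might look like the sketch below. The 0.6745 constant scales MAD to be consistent with the SD of a normal distribution; the 3.5 threshold is a common textbook default, not necessarily EVOLV's:

```javascript
// Sketch of a modified z-score outlier test (median / MAD based).
function median(xs) {
  const s = [...xs].sort((a, b) => a - b);
  const m = Math.floor(s.length / 2);
  return s.length % 2 ? s[m] : (s[m - 1] + s[m]) / 2;
}

function isOutlierModZ(value, window, threshold = 3.5) {
  const med = median(window);
  const mad = median(window.map(x => Math.abs(x - med)));
  if (mad === 0) return false;                 // flat window: nothing to compare against
  const mz = 0.6745 * (value - med) / mad;
  return Math.abs(mz) > threshold;
}

// Window of quiet readings around 10: 50 is flagged, 10.4 is not.
const w = [9.8, 10.1, 10.0, 9.9, 10.2];
// isOutlierModZ(50, w)   -> true
// isOutlierModZ(10.4, w) -> false
```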
> **Historical bug fixed 2026-04-13:** The dispatcher compared against camelCase keys (`lowPass`, `zScore`, ...) but the validator lowercases enum values. Result: 4 smoothing methods and 2 outlier methods were silently no-ops when chosen from the editor — they fell through to the "unknown" branch and emitted the raw last value. Review any flow deployed before 2026-04-13 that relied on these methods.
## Unit policy
Unknown measurement types (anything not in the container's built-in `measureMap`: `pressure`, `flow`, `power`, `temperature`, `volume`, `length`, `mass`, `energy`) are accepted without unit compatibility checks. This lets digital channels use `humidity` (`%`), `co2` (`ppm`), arbitrary IoT units. Known types still validate strictly.
## Example flow (digital)
```json
[
{ "id": "dig", "type": "measurement",
"mode": "digital",
"channels": "[{\"key\":\"temperature\",\"type\":\"temperature\",\"position\":\"atEquipment\",\"unit\":\"C\",\"scaling\":{\"enabled\":false,\"absMin\":-50,\"absMax\":150},\"smoothing\":{\"smoothWindow\":5,\"smoothMethod\":\"mean\"}},{\"key\":\"humidity\",\"type\":\"humidity\",\"position\":\"atEquipment\",\"unit\":\"%\",\"scaling\":{\"enabled\":false,\"absMin\":0,\"absMax\":100},\"smoothing\":{\"smoothWindow\":5,\"smoothMethod\":\"mean\"}}]",
...
}
]
```
## Testing
```bash
cd nodes/measurement
npm test
```
71 tests — coverage includes every smoothing method, every outlier strategy, scaling, interpolation, constrain, calibration, stability, simulation, per-channel pipelines, digital-mode dispatch, malformed-channel handling, event emits.
End-to-end benchmark scripts live in the superproject at `/tmp/m_e2e_baseline.py` (analog) and `/tmp/m_digital_e2e.py` (digital). Run against a Dockerized Node-RED stack (`docker compose up -d nodered`).
## Production status
Trial-ready as of 2026-04-13 after the session that fixed the silent dispatcher bug and added digital mode. See [session 2026-04-13](../../sessions/2026-04-13-measurement-digital-mode.md) and the memory file `node_measurement.md`.

View File

@@ -0,0 +1,247 @@
---
title: rotatingMachine — User Manual
node: rotatingMachine
updated: 2026-04-13
status: trial-ready
---
# rotatingMachine — User Manual
The `rotatingMachine` node models a single pump, compressor, or blower. It runs an S88-style state machine, predicts flow and power from a supplier curve, and publishes process and telemetry data every second. It is the atomic control module beneath `machineGroupControl` and `pumpingStation`.
This manual is the operator-facing reference. For architecture and the 3-tier code layout see [Node Architecture](../../architecture/node-architecture.md); for curve theory see [3D Pump Curves](../../architecture/3d-pump-curves.md).
## At a glance
| Item | Value |
|---|---|
| Node category | EVOLV |
| Inputs | 1 (message-driven) |
| Outputs | 3 — `process` / `dbase` / `parent` |
| Tick period | 1 s |
| State machine | 10 states (S88) |
| Predictions | curve-backed (nq flow, np power, reversed nq for ctrl) |
| Canonical units | Pa, m³/s, W, K |
## Editor configuration
| Field | Default | Meaning |
|---|---|---|
| **Reaction Speed** | `1` | Ramp rate in controller-position units per second. `1` = 1 %/s. |
| **Startup Time** | `0` | Seconds in the `starting` state. |
| **Warmup Time** | `0` | Seconds in the protected `warmingup` state. |
| **Shutdown Time** | `0` | Seconds in the `stopping` state. |
| **Cooldown Time** | `0` | Seconds in the protected `coolingdown` state. |
| **Movement Mode** | `staticspeed` | `staticspeed` = linear ramp; `dynspeed` = ease-in/out. |
| **Process Output** | `process` | Port 0 payload format: `process` (delta-compressed) / `json` / `csv`. |
| **Database Output** | `influxdb` | Port 1 payload format: `influxdb` line protocol / `json` / `csv`. |
| **Asset** (menu) | — | Supplier, category, model (must match a curve file in `generalFunctions/datasets`), output flow unit, curve units. |
| **Logger** (menu) | `info`, enabled | Log level and toggle. |
| **Position** (menu) | `atEquipment` | `upstream` / `atEquipment` / `downstream` relative to parent. Icon and optional distance offset. |
> **Tip.** With `Reaction Speed = 1` and `Set 60%` from idle, the controller takes ~60 s to reach 60 %. Scale `Reaction Speed` up to emulate a faster actuator (e.g. `20` gives 1 second per 20 % = 3 s to reach 60 %).
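The tip's arithmetic generalizes to a one-line ramp-time estimate for `staticspeed` mode (this helper is illustrative, not part of the node's API):

```javascript
// Ramp time for a linear (staticspeed) ramp:
// seconds = |target - current| / reactionSpeed, in controller-% per second.
const rampSeconds = (from, to, reactionSpeed) => Math.abs(to - from) / reactionSpeed;

// rampSeconds(0, 60, 1)  -> 60   (the tip's first example)
// rampSeconds(0, 60, 20) -> 3    (the faster-actuator example)
```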
## Input topics
Every command enters on the single input port. `msg.topic` selects the handler; `msg.payload` carries the arguments.
### `setMode`
```json
{ "topic": "setMode", "payload": "virtualControl" }
```
Valid values: `auto`, `virtualControl`, `fysicalControl`. The current mode gates *which source* may issue *which action* (mode/action/source policy lives in `generalFunctions/src/configs/rotatingMachine.json`).
### `execSequence`
```json
{ "topic": "execSequence",
"payload": { "source": "GUI", "action": "execSequence", "parameter": "startup" } }
```
`parameter` values: `startup`, `shutdown`, `entermaintenance`, `exitmaintenance`. Case is normalized.
If a `shutdown` is issued while the machine is mid-ramp (`accelerating` / `decelerating`), the active movement is aborted and the shutdown proceeds as soon as the FSM has returned to `operational`.
### `execMovement`
```json
{ "topic": "execMovement",
"payload": { "source": "GUI", "action": "execMovement", "setpoint": 60 } }
```
`setpoint` is expressed in controller units (0–100 %).
### `flowMovement`
```json
{ "topic": "flowMovement",
"payload": { "source": "parent", "action": "flowMovement", "setpoint": 150 } }
```
`setpoint` is expressed in the configured **output flow unit** (e.g. m³/h). The node converts flow → controller-% via the reversed nq curve and then drives `execMovement`.
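Conceptually, the reversed nq curve is a monotone flow-vs-controller mapping that the node inverts by interpolation. The sketch below shows the idea with invented curve points; the real node uses the supplier's curve slice at the current differential pressure:

```javascript
// Hypothetical flow -> controller-% conversion via an inverted flow curve.
// Points are [ctrlPercent, flow m3/h], monotone increasing (illustrative data).
const curve = [[20, 30], [40, 80], [60, 150], [80, 220], [100, 270]];

function flowToCtrl(flowSetpoint) {
  if (flowSetpoint <= curve[0][1]) return curve[0][0];   // clamp below the curve
  for (let i = 1; i < curve.length; i++) {
    const [c0, q0] = curve[i - 1];
    const [c1, q1] = curve[i];
    if (flowSetpoint <= q1) {
      // Linear interpolation of the inverse mapping on this segment
      return c0 + ((flowSetpoint - q0) / (q1 - q0)) * (c1 - c0);
    }
  }
  return curve[curve.length - 1][0];                     // clamp above max flow
}

// flowToCtrl(150) -> 60 (exact curve point)
```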
### `emergencystop`
```json
{ "topic": "emergencystop",
"payload": { "source": "GUI", "action": "emergencystop" } }
```
Aborts any active movement, runs the `emergencystop``off` transition. Allowed from every active state. Case-insensitive.
### `simulateMeasurement`
Inject a dashboard-side measurement without wiring a sensor child. Useful for validation, smoke tests, demo flows.
```json
{ "topic": "simulateMeasurement",
"payload": { "type": "pressure", "position": "upstream", "value": 200, "unit": "mbar" } }
```
`type`: `pressure` / `flow` / `temperature` / `power`. `unit` is required and must be convertible to the canonical unit for the type.
### Diagnostics
- `showWorkingCurves` — snapshot of current curve slices + computed metrics; reply on port 0.
- `CoG` — current centre-of-gravity (peak efficiency point) indicators; reply on port 0.
### `registerChild`
Internal. Sensor children (typically `measurement` nodes) send this to bind themselves to the machine. The machine also emits one on port 2 shortly after deploy so a parent group/station can register it.
## Output ports
### Port 0 — process data
Delta-compressed payload. Only *changed* fields are emitted each tick. Keys use a **4-segment** format:
```
<type>.<variant>.<position>.<childId>
```
Examples:
| Key | Meaning |
|---|---|
| `flow.predicted.downstream.default` | predicted flow at discharge |
| `flow.predicted.atequipment.default` | predicted flow at equipment |
| `power.predicted.atequipment.default` | predicted electrical power draw |
| `pressure.measured.downstream.dashboard-sim-downstream` | simulated discharge pressure |
| `pressure.measured.upstream.<childId>` | real upstream sensor reading |
| `state` | current FSM state |
| `mode` | current mode |
| `ctrl` | current controller position (0–100 %) |
| `NCog` / `cog` | normalized / absolute centre-of-gravity |
| `runtime` | cumulative operational hours |
Consumers must cache and merge deltas. The example flow `01 - Basic Manual Control.json` includes a function node that does exactly this — reuse its logic in your own flows.
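The cache-and-merge logic amounts to last-write-wins per key. A minimal sketch (not the example flow's literal code; in a Node-RED function node the cache would live in `context` rather than a module-level object):

```javascript
// Minimal delta-merge cache for the 4-segment delta-compressed payload.
const state = {};

function mergeDelta(delta) {
  for (const [key, value] of Object.entries(delta)) {
    state[key] = value;        // last-write-wins per key
  }
  return state;
}

mergeDelta({ "state": "operational", "ctrl": 40 });
mergeDelta({ "ctrl": 60, "flow.predicted.downstream.default": 151.2 });
// state.ctrl -> 60; state.state is still "operational"
```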
### Port 1 — dbase (InfluxDB)
InfluxDB line-protocol payload formatted for the `telemetry` bucket. Tags are low-cardinality fields (node name, machine type); measurements are numeric values. See the [InfluxDB Schema Design](../../concepts/influxdb-schema-design.md) page for the full tag/field contract.
### Port 2 — parent
`{ topic: "registerChild", payload: <this-node-id>, positionVsParent }` — emitted once ~180 ms after deploy so a downstream parent group can discover this machine. Subsequent commands and data flow through the parent's input port.
## State machine
```
                ┌──────────────────────────────┐
                │         operational          │◄────────────┐
                └──┬───────────┬──────────┬────┘             │
     execMovement  │           │          │                  │
                   ▼           ▼          ▼                  │
            accelerating  decelerating  emergencystop ─► off │
                   │           │                             │
                   └──(abort)──┘                             │
                         ▼                                   │
                     stopping                                │
                         ▼                                   │
                    coolingdown                              │
                         ▼                                   │
                       idle                                  │
                         ▼                                   │
                     starting                                │
                         ▼                                   │
                     warmingup ──────────────────────────────┘
```
Protected states (cannot be aborted by a new command): `warmingup`, `coolingdown`.
Interruptible states: `accelerating`, `decelerating`. A `shutdown` or `emergencystop` issued during a ramp aborts the ramp and drives the FSM correctly to `idle` / `off`.
Active states (contribute to `runtime`): `operational`, `starting`, `warmingup`, `accelerating`, `decelerating`.
## Predictions and pressure
Flow and power are curve-backed. The curve set is indexed by the differential pressure across the machine:
1. Best: both upstream and downstream pressures present → real Δp.
2. Degraded: only one side present → falls back to that side with a warn.
3. Minimum: no pressure → `fDimension = 0`; flow and power predictions use the lowest curve slice and will look unrealistic.
Pressure sources are resolved in priority order **real sensor child > virtual dashboard child > aggregated fallback**. Real-child values always win.
Predictions are only emitted while the FSM is in an active state (`operational`, `starting`, `warmingup`, `accelerating`, `decelerating`). In `idle`, `stopping`, `coolingdown`, `off`, `maintenance` the outputs are clamped to zero.
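A hedged sketch of the three-tier resolution and state gating described above (all names and shapes are simplified assumptions, not the node's internal API):

```javascript
// States that contribute to runtime and permit predictions.
const ACTIVE_STATES = new Set(["operational", "starting", "warmingup", "accelerating", "decelerating"]);

// Priority: real sensor child > virtual dashboard child > aggregated fallback.
function resolvePressure(sources) {
  return sources.real ?? sources.dashboard ?? sources.aggregated ?? null;
}

function deltaP(upstream, downstream) {
  const up = resolvePressure(upstream);
  const down = resolvePressure(downstream);
  if (up !== null && down !== null) return down - up;    // best: real Δp
  if (up !== null || down !== null) return down ?? up;   // degraded: one side, with a warn
  return 0;                                              // minimum: fDimension = 0
}

function predict(state, dp, curve) {
  // Outside active states the outputs are clamped to zero.
  if (!ACTIVE_STATES.has(state)) return { flow: 0, power: 0 };
  return curve(dp);
}
```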
### Supported curves and verification
| Model | Pressure envelope | Flow envelope | Power envelope |
|---|---|---|---|
| `hidrostal-H05K-S03R` | 700–3900 mbar (33 slices) | 9.5–227 m³/h | 8.2–65.1 kW |
| `hidrostal-C5-D03R-SHN1` | 400–2900 mbar (26 slices) | 6.4–52.5 m³/h | 0.55–31.5 kW |
Both curves are covered by unit tests (`test/integration/curve-prediction.integration.test.js`) and a live E2E benchmark (`test/e2e/curve-prediction-benchmark.py`) that sweeps each pump through its own pressure × controller envelope. Last green run: **2026-04-13** — 12/12 samples per curve inside envelope, ctrl-monotonic, inverse-pressure monotonic.
> **Pressure out of envelope is not clamped.** If a measured pressure falls *below* the curve's minimum slice, the node extrapolates and may produce implausibly large flow values (e.g. H05K at 400 mbar, ctrl 20 % → flow ≈ 30 000 m³/h; real envelope max is 227). Use realistic sensor ranges on your pressure `measurement` children.
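One way to keep out-of-envelope values off the prediction path is a simple clamp on the sensor side; this is a hypothetical helper, not part of the node:

```javascript
// Hypothetical guard: pin a pressure reading to the curve envelope before it
// reaches the machine, so below-envelope extrapolation cannot occur.
function clampToEnvelope(valueMbar, envelope) {
  const { min, max } = envelope;                 // e.g. { min: 700, max: 3900 } for the H05K curve
  if (!Number.isFinite(valueMbar)) return min;   // reject NaN / Infinity outright
  return Math.min(max, Math.max(min, valueMbar));
}

// H05K at 400 mbar would extrapolate; the guard pins it to the lowest valid slice.
clampToEnvelope(400, { min: 700, max: 3900 });   // → 700
```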
## Example flows
In the editor: **Import ▸ Examples ▸ EVOLV ▸ rotatingMachine**.
- `01 - Basic Manual Control.json` — single machine, inject-only. Good for smoke-testing a node installation.
- `02 - Integration with Machine Group.json``machineGroupControl` with two pumps as children. Good for verifying registration and parent orchestration.
- `03 - Dashboard Visualization.json` — FlowFuse dashboard with live charts. Depends on `@flowfuse/node-red-dashboard`.
## Troubleshooting
| Symptom | Likely cause | Fix |
|---|---|---|
| Editor says `pressure not initialized`, status ring is yellow | No pressure child wired yet and no simulated pressure injected. | Inject a `simulateMeasurement` of type `pressure` (both sides preferred) or wire a `measurement` child. |
| Predictions are enormous at `ctrl = 0 %` | At near-zero controller position with high backpressure, the intercept of the curve gives a nominally-nonzero flow. This is a curve-data artefact, not a runtime bug. | Confirm the curve with Rene / supplier data. For a conservative prediction use a lower `Reaction Speed` or constrain `setpoint` ≥ 10 %. |
| "Transition aborted" / "Movement aborted" in logs | Expected during `shutdown` / `emergencystop` issued during a ramp — the fix path intentionally aborts the active move. | None — informational only. |
| Status bar shows `pressure not initialized` even after inject | `simulateMeasurement` payload missing `unit` or with a non-convertible value. | Include `unit` (e.g. `"mbar"`) and a finite number in `value`. |
| Shutdown does nothing and no error | Machine is in `warmingup` or `coolingdown` (protected). | Wait for the phase to complete (≤ configured seconds) and retry. |
## Running it locally
```bash
git clone --recurse-submodules https://gitea.wbd-rd.nl/RnD/EVOLV.git
cd EVOLV
docker compose up -d
# Node-RED: http://localhost:1880 InfluxDB: :8086 Grafana: :3000
```
Then in Node-RED: **Import ▸ Examples ▸ EVOLV ▸ rotatingMachine ▸ 01 - Basic Manual Control**.
## Testing
```bash
cd nodes/rotatingMachine
npm test
```
Unit tests (79) cover construction, mode gating, sequences, interruptible movement, emergency stop, shutdown, efficiency/CoG, pressure initialization, output formatting, listener cleanup. See also `examples/README.md` for the flow-level test matrix.
## Production status
See the project memory entry `node_rotatingMachine.md` for the latest benchmarks and wishlist. Trial-ready as of 2026-04-13 following the interruptibility + schema-sync fixes documented in [session 2026-04-13](../../sessions/2026-04-13-rotatingMachine-trial-ready.md).

---
title: "Session: measurement node — dispatcher bug fix + digital/MQTT mode"
created: 2026-04-13
updated: 2026-04-13
status: proven
tags: [session, measurement, smoothing, outlier, mqtt, iot]
---
# 2026-04-13 — measurement trial-ready + digital mode
## Scope
Honest review of the `measurement` node. Benchmark every method, reason about keeping the node agnostic across analog and digital sources, add a digital (MQTT/IoT) mode without breaking analog.
## Findings
### Silent dispatcher bug (critical)
`validateEnum` in `generalFunctions` lowercases enum values (`zScore``zscore`, `lowPass``lowpass`). But `specificClass.outlierDetection` and `specificClass.applySmoothing` compared against camelCase keys. Effect:
- 5 of 11 smoothing methods silently fell through to a no-op: `lowPass`, `highPass`, `weightedMovingAverage`, `bandPass`, `savitzkyGolay`.
- 2 of 3 outlier methods silently disabled: `zScore`, `modifiedZScore`.
- Only `mean`, `median`, `sd`, `min`, `max`, `none`, `kalman`, `iqr` (the already-lowercase ones) actually worked.
Users who picked any camelCase method from the dropdown got the raw last value or no outlier filtering, with no error. Flows deployed before this session that relied on these filters got no filtering at all.
### Test coverage was thin
Pre-session: **12 tests** — 1 for scaling, 1 for outlier toggle, 1 for event emit, 3 for example flow shape, 1 constructor, 1 routing, 1 invalid payload, 2 other. Every smoothing method beyond `mean` and every outlier method beyond a toggle-flip was untested. The dispatcher bug would have been caught immediately by per-method unit tests.
### Analog-only input shape
The node only accepted scalar `msg.payload`. MQTT / IoT devices commonly publish a single JSON blob with many readings per message. Every user wanting that pattern had to fan out into N measurement nodes — ugly, and the device's shared timestamp is lost.
## Fixes + additions
### Dispatcher normalization (`specificClass.js`)
Both `outlierDetection()` and `applySmoothing()` now lowercase the configured method and the lookup table keys. Legacy camelCase config values and normalized lowercase config values both work.
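A minimal sketch of the normalization pattern, with a simplified lookup table (not the actual `specificClass.js` code):

```javascript
// Table keys are lowercase by construction; the configured method name is
// lowercased before lookup, so legacy camelCase config values still resolve.
const smoothers = {
  lowpass: xs => xs,  // stand-in; the real method filters the window
  mean: xs => xs.reduce((a, b) => a + b, 0) / xs.length,
};

function applySmoothing(method, window) {
  const fn = smoothers[String(method).toLowerCase()];
  if (!fn) return window[window.length - 1];  // explicit fall-through: raw last value
  return fn(window);
}

applySmoothing("lowPass", [1, 2, 3]);  // now resolves; pre-fix this silently fell through
```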
### `MeasurementContainer.isUnitCompatible` permissive short-circuit
Previously: if the unit couldn't be described by the convert module, compatibility returned false regardless of type. This blocked user-defined types like `humidity` with unit `%`. Now: when `measureMap[type]` is undefined (unknown type), accept any unit. Known types still validate strictly.
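The short-circuit can be sketched as follows (hypothetical `measureMap` contents; the real module validates via the convert library):

```javascript
// Known types validate strictly; unknown (user-defined) types accept any unit.
const measureMap = { pressure: ["mbar", "bar", "Pa"], flow: ["m3/h", "l/s"] };

function isUnitCompatible(type, unit) {
  const units = measureMap[type];
  if (units === undefined) return true;  // unknown type, e.g. humidity with "%": accept
  return units.includes(unit);
}
```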
### Digital mode (new)
`config.mode.current === 'digital'` opts into a new input shape. `config.channels` declares one entry per JSON key. The new `Channel` class (`src/channel.js`) is a self-contained per-channel pipeline — outlier → offset → scaling → smoothing → min/max → constrain → emit. Analog behaviour is preserved exactly; flows built before this session work unchanged.
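A condensed sketch of the per-channel pipeline order (illustrative only; the real `Channel` class carries full per-stage method configuration):

```javascript
// Hypothetical per-channel pipeline in the order described above:
// outlier → offset → scaling → smoothing → min/max → constrain → emit.
class Channel {
  constructor(cfg) {
    this.cfg = cfg;        // { key, offset, scale, min, max }
    this.window = [];
    this.min = Infinity;   // running min/max of the smoothed value
    this.max = -Infinity;
  }
  process(raw, emit) {
    if (!Number.isFinite(raw)) return;                  // outlier / garbage guard
    let v = (raw + this.cfg.offset) * this.cfg.scale;   // offset → scaling
    this.window.push(v);
    if (this.window.length > 3) this.window.shift();    // mean smoothing, window of 3
    v = this.window.reduce((a, b) => a + b, 0) / this.window.length;
    this.min = Math.min(this.min, v);
    this.max = Math.max(this.max, v);
    v = Math.min(this.cfg.max, Math.max(this.cfg.min, v));  // constrain
    emit({ key: this.cfg.key, mAbs: v });
  }
}
```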
## Test additions
Before → after: **12 → 71** tests.
New files:
- `test/basic/smoothing-methods.basic.test.js` — every smoothing method covered, 16 tests.
- `test/basic/outlier-detection.basic.test.js` — every outlier method + toggle + fall-through, 10 tests.
- `test/basic/scaling-and-interpolation.basic.test.js` — offset / interpolateLinear / constrain / handleScaling / updateMinMaxValues / updateOutputPercent / updateOutputAbs / getOutput, 10 tests.
- `test/basic/calibration-and-stability.basic.test.js` — calibrate / isStable / evaluateRepeatability / toggleSimulation / tick / simulateInput, 11 tests.
- `test/integration/digital-mode.integration.test.js` — 12 tests covering channel build, payload dispatch, multi-channel emit, unknown keys, per-channel scaling / smoothing / outlier, empty channels, malformed entries, non-numeric values, digital-output shape.
## E2E verification (Dockerized Node-RED)
### Analog baseline — `/tmp/m_e2e_baseline.py`
Deploys `examples/basic.flow.json`, fires `{topic:"measurement", payload:42}` repeatedly. Observed port-0 output: `mAbs` climbed 0 → 2.1 → 2.8 → 3.15 → 3.36 → 4.2 across five ticks as the mean window filled with 42s (scaling 0..100 → 0..10). Tick cadence 909–1001 ms (avg 981 ms). Registration at t=0.22 s.
### Digital end-to-end — `/tmp/m_digital_e2e.py`
Deploys a single measurement node in digital mode with three channels (`temperature` / `humidity` / `pressure`) and fires two MQTT-shaped payloads.
| Tick | Channel | mAbs | totalMinSmooth | totalMaxSmooth |
|---|---|---:|---:|---:|
| after inject 1 | temperature | 22.5 | 22.5 | 22.5 |
| after inject 1 | humidity | 45 | 45 | 45 |
| after inject 1 | pressure | 1013 | 1013 | 1013 |
| after inject 2 | temperature | 24 | 22.5 | 24 |
| after inject 2 | humidity | 42.5 | 42.5 | 45 |
| after inject 2 | pressure | 1014 | 1013 | 1014 |
Mean smoothing across a window of 3 was computed per channel, the `unknown` key in the payload was ignored, and all three events were emitted on `<type>.measured.atequipment`.
## Files changed
```
nodes/generalFunctions/src/measurements/MeasurementContainer.js # permissive unit check for user-defined types
nodes/generalFunctions/src/configs/measurement.json # mode + channels schema
nodes/measurement/src/channel.js # new per-channel pipeline class
nodes/measurement/src/specificClass.js # dispatcher fix + digital dispatch
nodes/measurement/src/nodeClass.js # mode-aware input handler + tick
nodes/measurement/measurement.html # Mode dropdown + Channels JSON + help panel
nodes/measurement/README.md # rewrite
nodes/measurement/test/basic/smoothing-methods.basic.test.js # +16 tests
nodes/measurement/test/basic/outlier-detection.basic.test.js # +10 tests
nodes/measurement/test/basic/scaling-and-interpolation.basic.test.js # +10 tests
nodes/measurement/test/basic/calibration-and-stability.basic.test.js # +11 tests
nodes/measurement/test/integration/digital-mode.integration.test.js # +12 tests
```
## Production status
Trial-ready for both modes. Supervised trial recommended for digital-mode deployments until the channels-editor UI (currently a JSON textarea) lands.
## Follow-ups
- Repeatable-row editor widget for channels.
- `validateArray.minLength=0` evaluates as falsy: a pre-existing generalFunctions bug affecting this node's `channels` and also `measurement.assetRegistration.childAssets`. Harmless warn at deploy time.
- Per-channel calibration + simulation for digital mode.
- Runtime channel reconfiguration via a dedicated topic (`addChannel` / `removeChannel`).
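The `minLength=0` follow-up above is the classic `||`-default pitfall; a minimal reproduction (hypothetical code, not the generalFunctions source):

```javascript
// `||` treats a configured 0 as "not set"; `??` only defaults on null/undefined.
function normalizeMinLength(opts) {
  const buggy = opts.minLength || 1;   // 0 is falsy → silently becomes 1
  const fixed = opts.minLength ?? 1;   // nullish coalescing keeps an explicit 0
  return { buggy, fixed };
}

normalizeMinLength({ minLength: 0 });  // → { buggy: 1, fixed: 0 }
```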

---
title: "Session: rotatingMachine trial-ready — FSM interruptibility, config schema, UX fixes"
created: 2026-04-13
updated: 2026-04-13
status: proven
tags: [session, rotatingMachine, state-machine, docker, e2e]
---
# 2026-04-13 — rotatingMachine trial-ready
## Scope
Honest review + production-hardening pass on `rotatingMachine`. Fixes landed on top of the 2026-04-07 hardening and are verified against a Docker-hosted Node-RED stack.
## Findings (before fixes)
From a live E2E run captured via the Node-RED debug websocket (`/comms`):
- **Clean startup→operational→shutdown→idle path** works to spec: 3 s starting + 2 s warmup + 3 s stopping + 2 s cooldown, matching config exactly.
- **Tick cadence:** 1000 ms (min 1000, max 1005, avg 1002.5).
- **Predictions** gate correctly on pressure injection; at 900 mbar Δp the hidrostal-H05K-S03R curve yields a monotonic flow/power response.
- **The FSM** *rejects* `stopping`/`coolingdown`/`idle` transitions while the machine is in `accelerating`/`decelerating`, so a shutdown command issued mid-ramp is silently dropped. Log symptom: `Invalid transition from accelerating to stopping. Transition not executed.`
- **Sequence `emergencyStop` not defined** warn appears when a parent orchestrator with the capital-S casing (e.g. `machineGroupControl` config) forwards the sequence name.
- **Config validator strips** `functionality.distance` and top-level `output` that `buildConfig` adds; every deploy prints removal warnings.
- Cosmetic: typo "acurate" in single-side pressure warn; editor lacks unit hints for `speed` / `startup` / etc.
## Fixes
### 1. Interruptible movement (`generalFunctions/src/state/state.js`)
`moveTo`'s `catch` block now detects `Movement aborted` / `Transition aborted` errors and transitions the FSM back to `operational`, unblocking subsequent sequence transitions. A new `movementAborted` event is emitted for observability.
### 2. Auto-abort on shutdown/emergency-stop (`rotatingMachine/src/specificClass.js`)
`executeSequence` now:
- Normalizes the sequence name to lowercase (defensive against parent callers using mixed case).
- When `shutdown` or `emergencystop` is requested from `accelerating`/`decelerating`, calls `state.abortCurrentMovement(...)` and waits up to 2 s for the FSM to return to `operational` via the new `_waitForOperational(timeoutMs)` helper that listens on the state emitter.
### 3. Config schema sync (`generalFunctions/src/configs/rotatingMachine.json`)
Added to the schema:
- `functionality.distance`, `.distanceUnit`, `.distanceDescription` (produced by the HTML editor).
- Top-level `output.process` / `output.dbase` (produced by `buildConfig`).
Also reverted an overly broad `buildConfig` addition to only emit `distance` (not `distanceUnit`/`distanceDescription`) so other nodes aren't forced to add these to their schemas.
### 4. UX polish
- Fixed typo "acurate" → "accurate" in the single-side pressure warning, plus made the message actionable.
- Added unit hints to Reaction Speed / Startup / Warmup / Shutdown / Cooldown fields in the editor.
- Expanded the Node-RED help panel with a topic reference, state diagram, prediction rules, and port documentation.
## Tests added
`test/integration/interruptible-movement.integration.test.js` — three regression tests for the FSM fix:
- `shutdown during accelerating aborts the move and reaches idle`
- `emergency stop during accelerating reaches off`
- `executeSequence accepts mixed-case sequence names`
`test/integration/curve-prediction.integration.test.js` — 12 parametrized tests across both shipped pump curves (`hidrostal-H05K-S03R` and `hidrostal-C5-D03R-SHN1`):
- Curve loader returns nq + np with matching pressure slices.
- Predicted flow and power at mid-pressure / mid-ctrl are finite and inside the curve envelope.
- Flow is monotonically non-decreasing across a ctrl sweep at fixed pressure.
- Flow decreases (or stays level) when pressure rises at fixed ctrl — centrifugal-pump physics.
- CoG / NCog are computed, finite, and inside [0, 100] controller units.
- Reverse predictor (flow → ctrl via reversed nq) round-trips within 10 % of the known controller position.
`test/e2e/curve-prediction-benchmark.py` + `test/e2e/README.md` — live Dockerized Node-RED benchmark that deploys one rotatingMachine per curve and records a (pressure × ctrl) sweep.
Full unit suite: **91/91 passing** (was 76/76 on the morning review).
## E2E verification (Dockerized Node-RED)
Via `/tmp/rm_e2e_verify.py` — deploys the example flow to `docker compose`-hosted Node-RED, drives it via `POST /inject/:id`, captures port-output via `ws://localhost:1880/comms`.
| Scenario | Observed state sequence | Pass? |
|---|---|---|
| Shutdown fired while `accelerating` | starting → warmingup → operational → accelerating → decelerating → stopping → coolingdown → **idle** | ✅ |
| Emergency stop fired while `accelerating` | starting → warmingup → operational → accelerating → **off** | ✅ |
| Clean startup → shutdown (regression) | starting → warmingup → operational → stopping → coolingdown → idle | ✅ |
Container log scan over a 3-minute window:
- `Unknown key` warns: 0 (was 6+ per deploy)
- `acurate` typo: 0 (was 2)
- `Invalid transition from accelerating/decelerating to ...` errors: 0 (was 3+)
- `Sequence '...' not defined`: 0 (was 1)
### Dual-curve prediction sweep
Via `nodes/rotatingMachine/test/e2e/curve-prediction-benchmark.py`. Deploys two live rotatingMachines, one per pump curve, and runs a (pressure × ctrl) sweep per pump. Each pump is tested only inside its own curve envelope.
| Pump | Pressures swept (mbar) | Ctrl setpoints (%) | Samples in envelope | Flow monotonic | Flow observed (m³/h) | Power observed (kW) |
|---|---|---|---|---|---|---|
| hidrostal-H05K-S03R | 700 / 2300 / 3900 | 20 / 40 / 60 / 80 | 12/12 ✅ | ✅ | 10.3–208.3 | 12.3–50.3 |
| hidrostal-C5-D03R-SHN1 | 400 / 1700 / 2900 | 20 / 40 / 60 / 80 | 12/12 ✅ | ✅ | 8.7–45.6 | 0.7–13.0 |
Inverse-pressure monotonicity (centrifugal-pump physics) also verified: for both pumps, flow at the highest pressure slice is strictly lower than flow at the lowest pressure slice for the same ctrl.
**Known limitation** captured in the memory file: extrapolating pressure *below* the curve's minimum slice produces nonsensical flow values (e.g. H05K at 400 mbar ctrl=20% predicts ~30 000 m³/h vs envelope max 227 m³/h). Upstream `measurement` nodes are expected to clamp sensors to realistic ranges; rotatingMachine itself does not.
Separately, the C5 curve still exhibits the previously-documented power non-monotonicity at p=1700 mbar (sparse-data spline artefact noted in the 2026-04-07 session); this is compensated by the group-optimization marginal-cost refinement loop.
## Files changed
```
nodes/generalFunctions/src/state/state.js # abort recovery
nodes/generalFunctions/src/configs/index.js # buildConfig trim
nodes/generalFunctions/src/configs/rotatingMachine.json # schema sync
nodes/rotatingMachine/src/specificClass.js # exec + typo
nodes/rotatingMachine/rotatingMachine.html # UX hints + help
nodes/rotatingMachine/test/integration/interruptible-movement.integration.test.js # +3 tests (FSM)
nodes/rotatingMachine/test/integration/curve-prediction.integration.test.js # +12 tests (dual curve)
nodes/rotatingMachine/test/e2e/curve-prediction-benchmark.py # new E2E benchmark
nodes/rotatingMachine/test/e2e/README.md # benchmark docs
nodes/rotatingMachine/README.md # rewrite
```
## Production readiness
Status: **trial-ready**. The caveats flagged in the 2026-04-13 memory file (`node_rotatingMachine.md`) are resolved. Remaining items are in the wishlist (interruptible curve validation feedback, domain review of ctrl≈0% + backpressure flow prediction, opt-in full-snapshot port-0 mode, per-machine `/health` endpoint).
## Verification command
```bash
cd /mnt/d/gitea/EVOLV
docker compose up -d nodered influxdb
cd nodes/rotatingMachine && npm test
python3 /tmp/rm_e2e_verify.py # end-to-end smoke
```