Compare commits: `6a6c04d34b...main` (62 commits)
@@ -2,7 +2,7 @@
 ## Context
 
 - Task/request: Adapt EVOLV agents/skills using Harness Engineering patterns and set owner-controlled operating defaults.
-- Impacted files/contracts: `AGENTS.md`, `.agents/skills/*/SKILL.md`, `.agents/skills/*/agents/openai.yaml`, decision-log policy.
+- Impacted files/contracts: `.agents/AGENTS.md`, `.agents/skills/*/SKILL.md`, `.agents/skills/*/agents/openai.yaml`, decision-log policy.
 - Why a decision is required now: New harness workflow needs explicit defaults for compatibility, safety bias, and governance discipline.
 
 ## Options
@@ -30,9 +30,9 @@
 - Data/operations impact: Decision traceability improves cross-turn consistency and auditability.
 
 ## Implementation Notes
-- Required code/doc updates: Set defaults in `AGENTS.md` and orchestrator skill instructions; keep decision-log template active.
+- Required code/doc updates: Set defaults in `.agents/AGENTS.md` and orchestrator skill instructions; keep decision-log template active.
 - Validation evidence required: Presence of defaults in policy docs and this decision artifact under `.agents/decisions/`.
 
 ## Rollback / Migration
-- Rollback strategy: Update defaults in `AGENTS.md` and orchestrator SKILL; create a superseding decision log entry.
+- Rollback strategy: Update defaults in `.agents/AGENTS.md` and orchestrator SKILL; create a superseding decision log entry.
 - Migration/deprecation plan: For any future hard-break preference, require explicit migration plan and effective date in a new decision entry.
@@ -0,0 +1,36 @@
# DECISION-20260323-architecture-layering-resilience-and-config-authority

## Context

- Task/request: refine the EVOLV architecture baseline using the current stack drawings and owner guidance.
- Impacted files/contracts: architecture documentation, future wiki structure, telemetry/storage strategy, security boundaries, and configuration authority assumptions.
- Why a decision is required now: the architecture can no longer stay at a generic "Node-RED plus cloud" level; several operating principles were clarified by the owner and need to be treated as architectural defaults.

## Options

1. Keep the architecture intentionally broad and tool-centric
   - Benefits: fewer early commitments.
   - Risks: blurred boundaries for resilience, data ownership, and security; easier to drift into contradictory implementations.
   - Rollout notes: wiki remains descriptive but not decision-shaping.

2. Adopt explicit defaults for resilience, API boundary, telemetry layering, and configuration authority
   - Benefits: clearer target operating model; easier to design stack services and wiki pages consistently; aligns diagrams with intended operational behavior.
   - Risks: some assumptions may outpace current implementation and therefore create an architecture-debt backlog.
   - Rollout notes: document gaps clearly and treat incomplete systems as planned workstreams rather than pretending they already exist.

## Decision

- Selected option: Option 2.
- Decision owner: repository owner, confirmed during architecture review.
- Date: 2026-03-23.
- Rationale: the owner clarified concrete architecture goals that materially affect security, resilience, and platform structure. The documentation should encode those as defaults instead of leaving them implicit.

## Consequences

- Compatibility impact: low immediate code impact, but future implementations should align to these defaults.
- Safety/security impact: improved boundary clarity by making the central layer the integration entry point and keeping the edge protected behind site/central mediation.
- Data/operations impact: multi-level InfluxDB and smart-storage behavior become first-class design concerns; `tagcodering` becomes the intended configuration backbone.

## Implementation Notes

- Required code/doc updates: update the architecture review doc, add visual wiki-ready diagrams, and track follow-up work for the incomplete `tagcodering` integration and telemetry policy design.
- Validation evidence required: architecture docs reflect the agreed principles and diagrams; no contradiction with current repo evidence for implemented components.

## Rollback / Migration

- Rollback strategy: return to a generic descriptive architecture document without explicit defaults.
- Migration/deprecation plan: implement these principles incrementally, starting with configuration authority, telemetry policy, and site/central API boundaries.
@@ -0,0 +1,36 @@
# DECISION-20260323-compose-secrets-via-env

## Context

- Task/request: harden the target-state stack example so credentials are not stored directly in `temp/cloud.yml`.
- Impacted files/contracts: `temp/cloud.yml`, deployment/operations practice for target-state infrastructure examples.
- Why a decision is required now: the repository contained inline credentials in a tracked compose file, which conflicts with the intended security posture and creates avoidable secret-leak risk.

## Options

1. Keep credentials inline in the compose file
   - Benefits: simplest to run as a standalone example.
   - Risks: secrets leak into git history, reviews, copies, and local machines; encourages unsafe operational practice.
   - Rollout notes: none, but the risk remains permanent once committed.

2. Move credentials to server-side environment variables and keep only placeholders in compose
   - Benefits: aligns the manifest with a safer deployment pattern; keeps tracked config portable across environments; supports secret rotation without editing the compose file.
   - Risks: operators must manage `.env` or equivalent secret injection correctly.
   - Rollout notes: provide an example env file and document that the real `.env` stays on the server and out of version control.

## Decision

- Selected option: Option 2.
- Decision owner: repository owner, confirmed during task discussion.
- Date: 2026-03-23.
- Rationale: the target architecture should model the right operational pattern. Inline secrets in repository-tracked compose files are not acceptable for EVOLV's intended OT/IT deployment posture.

## Consequences

- Compatibility impact: low; operators now need to supply environment variables when deploying `temp/cloud.yml`.
- Safety/security impact: improved secret hygiene and lower credential-exposure risk.
- Data/operations impact: deployment requires an accompanying `.env` on the server or explicit `--env-file` usage.

## Implementation Notes

- Required code/doc updates: replace inline secrets in `temp/cloud.yml`; add `temp/cloud.env.example`; keep the real `.env` untracked on the server.
- Validation evidence required: inspect the compose file for `${...}` placeholders and verify no real credentials remain in tracked files touched by this change.

## Rollback / Migration

- Rollback strategy: reintroduce inline values, though this is not recommended.
- Migration/deprecation plan: create a server-local `.env` from `temp/cloud.env.example`, fill in real values, and run compose from that environment.
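The validation step above ("inspect the compose file for `${...}` placeholders") can be partially automated. Below is a minimal sketch, assuming a simple heuristic: lines whose key looks credential-like must reference a `${VAR}` placeholder. The function name and the key list are illustrative, not part of the repository.

```javascript
// Hypothetical checker: flag compose lines that assign a literal value to a
// credential-like key instead of a ${VAR} placeholder. Heuristic only.
function findInlineSecrets(composeText) {
  const hits = [];
  for (const line of composeText.split('\n')) {
    const m = line.match(/(PASSWORD|TOKEN|SECRET|API_KEY)\s*[:=]\s*(.+)/i);
    // A safe line references a server-side env var, e.g. ${POSTGRES_PASSWORD}.
    if (m && !/^\$\{[A-Z0-9_]+\}/.test(m[2].trim())) {
      hits.push(line.trim());
    }
  }
  return hits;
}
```

A scan like this cannot prove the absence of secrets; it only catches the obvious inline-assignment pattern the decision is meant to eliminate.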
@@ -0,0 +1,43 @@
## Context

The single demo bioreactor did not reflect the intended EVOLV biological treatment concept. The owner requested:

- four reactor zones in series
- staged aeration based on effluent NH4
- local visualization per zone for NH4, NO3, O2, and other relevant state variables
- improved PFR numerical stability by increasing reactor resolution

The localhost deployment also needed to remain usable for E2E debugging with Node-RED, InfluxDB, and Grafana.

## Options Considered

1. Keep one large PFR and add more internal profile visualization only.
2. Split the biology into four explicit reactor zones in the flow and control aeration at zone level.
3. Replace the PFR demo with a simpler CSTR train for faster visual response.

## Decision

Choose option 2.

The demo flow now uses four explicit PFR zones in series with:

- equal-zone sizing (`4 x 500 m3`, total `2000 m3`)
- explicit `Fluent` forwarding between zones
- common clocking for all zones
- external `OTR` control instead of fixed `kla`
- staged NH4-based aeration escalation with 30-minute hold logic
- per-zone telemetry to InfluxDB and Node-RED dashboard charts

For runtime stability on localhost, the demo uses a higher spatial resolution with moderate compute load rather than the earlier single-reactor setup.
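The staged NH4-based escalation with a 30-minute hold can be pictured as a small state machine: pick the aeration stage implied by the current effluent NH4, but only commit a stage change after the hold period has elapsed. The sketch below is illustrative only; the function name and the example thresholds are hypothetical and not taken from the demo flow.

```javascript
// Hypothetical sketch of staged aeration with hold logic. `thresholds` maps
// NH4 levels to aeration stages; `holdMs` is the minimum time between changes.
function createAerationStager(thresholds, holdMs) {
  let stage = 0;
  let lastChange = -Infinity;
  return function update(nh4, now) {
    // Desired stage: highest threshold the effluent NH4 currently exceeds.
    let target = 0;
    for (let i = 0; i < thresholds.length; i++) {
      if (nh4 >= thresholds[i]) target = i + 1;
    }
    // Escalate or de-escalate only after the hold period has elapsed.
    if (target !== stage && now - lastChange >= holdMs) {
      stage = target;
      lastChange = now;
    }
    return stage;
  };
}
```

The hold prevents the aeration from chattering between stages on a noisy NH4 signal, at the cost of a bounded response delay.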
## Consequences

- The flow is easier to reason about operationally because each aeration zone is explicit.
- Zone-level telemetry is available for dashboarding and debugging.
- PFR outlet response remains residence-time dependent, so zone outlet composition will not change instantly after startup or inflow changes.
- Grafana datasource query round-trip remains valid, but dashboard auto-generation still needs separate follow-up if strict dashboard creation is required in E2E checks.

## Rollback / Migration Notes

- Rolling back to the earlier demo means restoring the single `demo_reactor` topology in `docker/demo-flow.json`.
- Existing E2E checks and dashboards should prefer the explicit zone measurements (`reactor_demo_reactor_z1` ... `reactor_demo_reactor_z4`) going forward.
`.agents/improvements/EXAMPLE_FLOW_TEMPLATE.md` (new file, 123 lines)
@@ -0,0 +1,123 @@
# EVOLV Example Flow Template Standard

## Overview

Every EVOLV node MUST have example flows in its `examples/` directory. Node-RED automatically discovers these and shows them in **Import > Examples > EVOLV**.

## Naming Convention

```
examples/
  01 - Basic Manual Control.json            # Tier 1: inject-based, zero deps
  02 - Integration with Parent Node.json    # Tier 2: parent-child wiring
  03 - Dashboard Visualization.json         # Tier 3: FlowFuse dashboard (optional)
```

The filename (minus `.json`) becomes the menu label in Node-RED.

## Tier 1: Basic (inject-based, zero external dependencies)

**Purpose:** Demonstrate all key functionality using only core Node-RED nodes.

**Required elements:**
- 1x `comment` node (top-left): title + 2-3 line description of what the flow demonstrates
- 1x `comment` node (near inputs): "HOW TO USE: 1. Deploy flow. 2. Click inject nodes..."
- `inject` nodes for each control action (labeled clearly)
- The EVOLV node under test with **realistic, working configuration**
- 3x `debug` nodes: "Port 0: Process", "Port 1: InfluxDB", "Port 2: Parent"
- Optional: 1x `function` node to format output readably (keep under 20 lines)

**Forbidden:** No dashboard nodes. No FlowFuse widgets. No HTTP nodes. No third-party nodes.

**Config rules:**
- All required config fields filled with realistic values
- Model/curve fields set to existing models in the library
- `enableLog: true, logLevel: "info"` so users can see what happens
- Unit fields explicitly set (not empty strings)

**Layout rules:**
- Comment nodes: top-left
- Input section: left side (x: 100-400)
- EVOLV node: center (x: 500-600)
- Debug/output: right side (x: 700-900)
- Y spacing: ~60px between nodes
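When generating example flows programmatically, the layout rules above can be encoded in a tiny placement helper. The lane names and base offsets below are illustrative values consistent with the Tier 1 guidance, not an official EVOLV utility:

```javascript
// Hypothetical placement helper for Tier 1 example flows.
// Lanes follow the guidance above: inputs left, EVOLV node center, debug right.
const LANES = { comment: 100, input: 100, node: 550, output: 800 };
const ROW_PITCH = 60;    // ~60px vertical spacing between nodes
const TOP_MARGIN = 100;  // assumed top offset for the first row

function place(lane, row) {
  return { x: LANES[lane], y: TOP_MARGIN + row * ROW_PITCH };
}
```

Using a helper like this keeps generated flows visually consistent across nodes without hand-tuning coordinates.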
## Tier 2: Integration (parent-child relationships)

**Purpose:** Show how nodes connect as parent-child via Port 2.

**Required elements:**
- 1x `comment` node: what relationship is being demonstrated
- Parent node + child node(s) properly wired
- Port 2 of child → Port 0 input of parent (registration pathway)
- `inject` nodes to send control commands to parent
- `inject` nodes to send measurement/state to children
- `debug` nodes on all ports of both parent and children

**Node-specific integration patterns:**
- `machineGroupControl` → 2x `rotatingMachine`
- `pumpingStation` → 1x `rotatingMachine` + 1x `measurement` (assetType: "flow")
- `valveGroupControl` → 2x `valve`
- `reactor` → `settler` (downstream cascade)
- `measurement` → any parent node

## Tier 3: Dashboard Visualization (optional)

**Purpose:** Rich interactive demo with FlowFuse dashboard.

**Allowed additional dependencies:** FlowFuse dashboard nodes only (`@flowfuse/node-red-dashboard`).

**Required elements:**
- 1x `comment` node: "Requires @flowfuse/node-red-dashboard"
- Auto-initialization: `inject` node with "Inject once after 1 second" for default mode/state
- Dashboard controls clearly labeled
- Charts with proper axis labels and units
- Parser/formatter functions kept under 40 lines (split if needed)
- No null message outputs (filter before sending to charts)

## Comment Node Standard

Every comment node must use this format:

```
Title: [Node Name] - [Flow Tier]
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━
[2-3 line description]

Prerequisites: [list any requirements]
```

## ID Naming Convention

Use predictable, readable IDs for all nodes (not random hex):

```
{nodeName}_{tier}_{purpose}

Examples:
- rm_basic_tab           (rotatingMachine, basic flow, tab)
- rm_basic_node          (the actual rotatingMachine node)
- rm_basic_debug_port0   (debug on port 0)
- rm_basic_inject_start  (inject for startup)
- rm_basic_comment_title (title comment)
```
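A checker for this convention might look like the following sketch. The accepted tier tokens (`basic`, `integration`, `dashboard`) are an assumption inferred from the examples above; adjust them to the project's actual vocabulary.

```javascript
// Hypothetical validator for the {nodeName}_{tier}_{purpose} ID convention.
// Tier tokens are assumed, not taken from project source.
const ID_PATTERN = /^[a-z][a-zA-Z0-9]*_(basic|integration|dashboard)_[a-z0-9_]+$/;

function isValidId(id) {
  return ID_PATTERN.test(id);
}
```

Node-RED's default random hex IDs contain no underscores, so they fail the pattern immediately.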
## Validation Checklist

Before committing an example flow:

- [ ] Can be imported into clean Node-RED + EVOLV (no other packages needed for Tier 1/2)
- [ ] All nodes show correct status after deploy (no red triangles)
- [ ] Comment nodes present and descriptive
- [ ] All 3 output ports wired to something (debug at minimum)
- [ ] IDs follow naming convention (no random hex)
- [ ] Node config uses realistic values (not empty strings or defaults)
- [ ] File named per convention (01/02/03 prefix)
## Gitea Wiki Integration

Each node's wiki gets an "Examples" page that:

1. Lists all available example flows with descriptions
2. Links to the raw `.json` file in the repo
3. Describes prerequisites and step-by-step usage
4. Shows expected behavior after deploy
@@ -22,3 +22,6 @@ Lifecycle:
 | IMP-20260219-022 | 2026-02-19 | generalFunctions/outliers | `DynamicClusterDeviation.update()` emits verbose `console.log` traces on each call with no log-level guard, unsafe for production telemetry volume. | `nodes/generalFunctions/src/outliers/outlierDetection.js:7` | open |
 | IMP-20260224-006 | 2026-02-24 | rotatingMachine prediction fallback | When only one pressure side is available, predictor uses absolute pressure as surrogate differential, which can materially bias flow prediction under varying suction/discharge conditions. | `nodes/rotatingMachine/src/specificClass.js:573`, `nodes/rotatingMachine/src/specificClass.js:588` | open |
 | IMP-20260224-012 | 2026-02-24 | cross-node unit architecture | Canonical unit-anchor strategy is implemented in rotatingMachine plus phase-1 controllers (`machineGroupControl`, `pumpingStation`, `valve`, `valveGroupControl`); continue rollout to remaining nodes so all runtime paths use canonical storage + explicit ingress/egress units. | `nodes/machineGroupControl/src/specificClass.js:42`, `nodes/pumpingStation/src/specificClass.js:48`, `nodes/valve/src/specificClass.js:87`, `nodes/valveGroupControl/src/specificClass.js:78` | open |
+| IMP-20260323-001 | 2026-03-23 | architecture/security | `temp/cloud.yml` stores environment credentials directly in a repository-tracked target-state stack example; replace with env placeholders/secret injection and split illustrative architecture from deployable manifests. | `temp/cloud.yml:1` | open |
+| IMP-20260323-002 | 2026-03-23 | architecture/configuration | Intended database-backed configuration authority (`tagcodering`) is not yet visibly integrated as the primary runtime config backbone in this repository; define access pattern, schema ownership, and rollout path for edge/site/central consumers. | `architecture/stack-architecture-review.md:1` | open |
+| IMP-20260323-003 | 2026-03-23 | architecture/telemetry | Multi-level smart-storage strategy is a stated architecture goal, but signal classes, reconstruction guarantees, and authoritative-layer rules are not yet formalized; define telemetry policy before broad deployment. | `architecture/stack-architecture-review.md:1` | open |
@@ -42,7 +42,7 @@ You are the EVOLV orchestrator agent. You decompose complex tasks, route to spec
 
 ## Reference Files
 - `.agents/skills/evolv-orchestrator/SKILL.md` — Full orchestration protocol
-- `AGENTS.md` — Agent invocation policy, routing table, decision governance
+- `.agents/AGENTS.md` — Agent invocation policy, routing table, decision governance
 - `.agents/decisions/` — Decision log directory
 - `.agents/improvements/IMPROVEMENTS_BACKLOG.md` — Deferred improvements
 
@@ -52,4 +52,4 @@ You are the EVOLV orchestrator agent. You decompose complex tasks, route to spec
 - Owner-approved defaults: compatibility=controlled, safety=availability-first
 
 ## Reasoning Difficulty: Medium-High
-This agent handles multi-domain task decomposition, cross-cutting impact analysis, and decision governance enforcement. The primary challenge is correctly mapping changes across node boundaries — a single modification can cascade through parent-child relationships, shared contracts, and InfluxDB semantics. When uncertain about cross-domain impact, consult `.agents/skills/evolv-orchestrator/SKILL.md` and `AGENTS.md` before routing to specialist agents.
+This agent handles multi-domain task decomposition, cross-cutting impact analysis, and decision governance enforcement. The primary challenge is correctly mapping changes across node boundaries — a single modification can cascade through parent-child relationships, shared contracts, and InfluxDB semantics. When uncertain about cross-domain impact, consult `.agents/skills/evolv-orchestrator/SKILL.md` and `.agents/AGENTS.md` before routing to specialist agents.
`.claude/rules/node-red-flow-layout.md` (new file, 501 lines)
@@ -0,0 +1,501 @@
# Node-RED Flow Layout Rules

How to lay out a multi-tab Node-RED demo or production flow so it is readable, debuggable, and trivially extendable. These rules apply to anything you build with `examples/` flows, dashboards, or production deployments.

## 1. Tab boundaries — by CONCERN, not by data

Every node lives on the tab matching its **concern**, never where it happens to be wired:

| Tab | Lives here | Never here |
|---|---|---|
| **🏭 Process Plant** | EVOLV nodes (rotatingMachine, MGC, pumpingStation, measurement, reactor, settler, …) + small per-node output formatters | UI widgets, demo drivers, one-shot setup injects |
| **📊 Dashboard UI** | All `ui-*` widgets, the wrapper functions that turn a button click into a typed `msg`, the trend-feeder split functions | Anything that produces data autonomously, anything that talks to EVOLV nodes directly |
| **🎛️ Demo Drivers** | Random generators, scripted scenarios, schedule injectors, anything that exists only to drive the demo | Real production data sources (those go on Process Plant or are wired in externally) |
| **⚙️ Setup & Init** | One-shot `once: true` injects (setMode, setScaling, auto-startup) | Anything that fires more than once |

**Why these four:** each tab can be disabled or deleted independently. Disable Demo Drivers → demo becomes inert until a real data source is wired. Disable Setup → fresh deploys don't auto-configure (good for debugging). Disable Dashboard UI → headless mode for tests. Process Plant always stays.

If you find yourself wanting a node "between" two tabs, you've named your concerns wrong — re-split.

## 2. Cross-tab wiring — link nodes only, named channels

Never wire a node on tab A directly to a node on tab B. Use **named link-out / link-in pairs**:

```text
[ui-slider] ──► [link out cmd:demand] ─ ─ ─ ─ ─ ─ ─ ─┐
                                                      │
                                                      ▼
[random gen] ─► [link out cmd:demand] ─ ─ ─ ─► [link in cmd:demand] ──► [router] ──► [MGC]
                                                      ▲
                                                      │
                                 many link-outs may target one link-in
```
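The `[router]` box in a pattern like the one above is typically a small function node that validates and normalizes incoming commands before they reach the EVOLV node. A hedged sketch, where the output topic and the clamping range are illustrative assumptions rather than the demo's actual contract:

```javascript
// Illustrative router body for a demand channel: any emitter (slider,
// random generator) may send a demand; normalize it before the MGC sees it.
function routeDemand(msg) {
  const value = Number(msg.payload);
  if (!Number.isFinite(value)) return null;  // drop malformed commands
  return { topic: 'setDemand', payload: Math.min(100, Math.max(0, value)) };
}
```

Returning `null` from a function node suppresses the message, so malformed commands never reach the process tab.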
### Naming convention

Channels follow `<direction>:<topic>`, lowercase, kebab-case after the colon:

- `cmd:` — UI / drivers → process. Carries commands.
- `evt:` — process → UI / external. Carries state events.
- `setup:` — setup tab → wherever. Carries one-shot init.

Examples used in the pumping-station demo:

- `cmd:demand`, `cmd:randomToggle`, `cmd:mode`
- `cmd:station-startup`, `cmd:station-shutdown`, `cmd:station-estop`
- `cmd:setpoint-A`, `cmd:setpoint-B`, `cmd:setpoint-C`
- `cmd:pump-A-seq` (start/stop for pump A specifically)
- `evt:pump-A`, `evt:pump-B`, `evt:pump-C`, `evt:mgc`, `evt:ps`
- `setup:to-mgc`

### Channels are the contract

The list of channel names IS the inter-tab API. Document it in the demo's README. Renaming a channel is a breaking change.

### When to use one channel vs many

- One channel, many emitters: same kind of message from multiple sources (e.g. `cmd:demand` is fired by both the slider and the random generator).
- Different channels: messages with different *meaning* even if they go to the same node (e.g. don't fold `cmd:setpoint-A` into a generic `cmd:pump-A` — keep setpoint and start/stop separate).
- Avoid one mega-channel: a "process commands" channel that the receiver routes-by-topic is harder to read than separate channels per concern.

### Don't use link-call for fan-out

`link call` is for synchronous request/response (it waits for a paired `link out` in `return` mode). For fan-out, use plain `link out` (mode=`link`) with multiple targets, or a single link out → single link in → function-node fan-out (whichever is clearer for your case).
## 3. Spacing and visual layout

Nodes need air to be readable. Apply these constants in any flow generator:

```python
LANE_X = [120, 380, 640, 900, 1160, 1420]  # 6 vertical lanes per tab
ROW = 80           # standard row pitch
SECTION_GAP = 200  # extra y-shift between sections
```

### Lane assignment (process plant tab as example)

| Lane | Contents |
|---|---|
| 0 (x=120) | Inputs from outside the tab — link-in nodes, injects |
| 1 (x=380) | First-level transformers — wrappers, fan-outs, routers |
| 2 (x=640) | Mid-level — section comments live here too |
| 3 (x=900) | Target nodes — the EVOLV node itself (pump, MGC, PS) |
| 4 (x=1160) | Output formatters — function nodes that build dashboard-friendly payloads |
| 5 (x=1420) | Outputs to outside the tab — link-out nodes, debug taps |

Inputs flow left → right. Don't loop wires backwards across the tab.

### Section comments

Every logical group within a tab gets a comment header at lane 2 with a `── Section name ──` style label. Use them liberally — every 3-5 nodes deserves a header. The `info` field on the comment carries the multi-line description.

### Section spacing

`SECTION_GAP = 200` between sections, on top of the standard row pitch. Don't pack sections together — when you have 6 measurements on a tab, give each pump 4 rows + a 200 px gap to the next pump. Yes, it makes tabs scroll. Scroll is cheap; visual confusion is expensive.
|
## 4. Charts — the trend-split rule
|
||||||
|
|
||||||
|
ui-chart with `category: "topic"` + `categoryType: "msg"` plots one series per unique `msg.topic`. So:
|
||||||
|
|
||||||
|
- One chart per **metric type** (one chart for flow, one for power).
|
||||||
|
- Each chart receives msgs whose `topic` is the **series label** (e.g. `Pump A`, `Pump B`, `Pump C`).
|
||||||
|
|
||||||
|
### Required chart properties (FlowFuse ui-chart renders blank without ALL of these)
|
||||||
|
|
||||||
|
Derived from working charts in rotatingMachine/examples/03-Dashboard. Every property listed below is mandatory — omit any one and the chart renders blank with no error message.
|
||||||
|
|
||||||
|
```json
|
||||||
|
{
|
||||||
|
"type": "ui-chart",
|
||||||
|
"chartType": "line",
|
||||||
|
"interpolation": "linear",
|
||||||
|
"category": "topic",
|
||||||
|
"categoryType": "msg",
|
||||||
|
"xAxisType": "time",
|
||||||
|
"xAxisProperty": "",
|
||||||
|
"xAxisPropertyType": "timestamp",
|
||||||
|
"xAxisFormat": "",
|
||||||
|
"xAxisFormatType": "auto",
|
||||||
|
"yAxisProperty": "payload",
|
||||||
|
"yAxisPropertyType": "msg",
|
||||||
|
"action": "append",
|
||||||
|
"stackSeries": false,
|
||||||
|
"pointShape": "circle",
|
||||||
|
"pointRadius": 4,
|
||||||
|
"showLegend": true,
|
||||||
|
"bins": 10,
|
||||||
|
"width": 12,
|
||||||
|
"height": 6,
|
||||||
|
"removeOlder": "15",
|
||||||
|
"removeOlderUnit": "60",
|
||||||
|
"removeOlderPoints": "",
|
||||||
|
"colors": ["#0095FF","#FF0000","#FF7F0E","#2CA02C","#A347E1","#D62728","#FF9896","#9467BD","#C5B0D5"],
|
||||||
|
"textColor": ["#666666"],
|
||||||
|
"textColorDefault": true,
|
||||||
|
"gridColor": ["#e5e5e5"],
|
||||||
|
"gridColorDefault": true
|
||||||
|
}
|
||||||
|
```
**Key gotchas:**

- `interpolation` MUST be set (`"linear"`, `"step"`, `"bezier"`, `"cubic"`, `"cubic-mono"`). Without it: no line drawn.
- `yAxisProperty: "payload"` + `yAxisPropertyType: "msg"` tells the chart WHERE in the msg to find the y-value. Without these: chart has no data to plot.
- `xAxisPropertyType: "timestamp"` tells the chart to use `msg.timestamp` (or auto-generated) for the x-axis.
- `width` and `height` are **numbers, not strings**. `width: 12` (correct) vs `width: "12"` (may break).
- `removeOlderPoints: ""` (empty string) → retention is controlled by `removeOlder` + `removeOlderUnit` only. Set to a number string to additionally cap points per series.
- `colors` array defines the palette for auto-assigned series colours. Provide at least 3.
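Since the failure mode is a silently blank chart, it is worth asserting the list above at generation time. A minimal sketch for the Python builder (`REQUIRED_CHART_PROPS` and `check_chart` are names introduced here, not part of FlowFuse):

```python
# Build-time guard: flag missing mandatory ui-chart properties before the
# flow JSON is written, instead of discovering a blank chart in the browser.
REQUIRED_CHART_PROPS = [
    "chartType", "interpolation", "category", "categoryType",
    "xAxisType", "xAxisPropertyType", "yAxisProperty", "yAxisPropertyType",
    "action", "removeOlder", "removeOlderUnit", "colors",
]

def check_chart(node):
    """Return the list of mandatory properties missing from a ui-chart node."""
    if node.get("type") != "ui-chart":
        return []
    missing = [p for p in REQUIRED_CHART_PROPS if p not in node]
    # width/height must be numbers, not strings (see gotchas above)
    for k in ("width", "height"):
        if isinstance(node.get(k), str):
            missing.append(f"{k} (string, must be number)")
    return missing
```

Run it over every node in the generated flow list and fail the build on a non-empty result.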
### The trend-split function pattern

A common bug: feeding both flow and power msgs to a single function output that wires to both charts. Both charts then plot all metrics, garbling the legend.

**Fix:** the trend-feeder function MUST have one output per chart, and split:

```js
// outputs: 2
// wires: [["chart_flow"], ["chart_power"]]
const p = msg.payload;   // parsed event payload carrying flowNum / powerNum
const flowMsg = p.flowNum != null ? { topic: 'Pump A', payload: p.flowNum } : null;
const powerMsg = p.powerNum != null ? { topic: 'Pump A', payload: p.powerNum } : null;
return [flowMsg, powerMsg];
```

A null msg on a given output sends nothing on that output — exactly what we want.
### Chart axis settings to actually configure

- `removeOlder` + `removeOlderUnit`: how much history to keep (e.g. 10 minutes).
- `removeOlderPoints`: cap on points per series (200 is sensible for a demo).
- `ymin` / `ymax`: leave blank for autoscale, or set numeric strings if you want a fixed range.
## 5. Inject node — payload typing

Multi-prop inject must populate `v` and `vt` **per prop**, not just the legacy top-level `payload` + `payloadType`:

```json
{
  "props": [
    {"p": "topic", "vt": "str"},
    {"p": "payload", "v": "{\"action\":\"startup\"}", "vt": "json"}
  ],
  "topic": "execSequence",
  "payload": "{\"action\":\"startup\"}",
  "payloadType": "json"
}
```

If you only fill the top-level fields, `payloadType: "json"` is silently treated as `str`.
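In a Python flow builder, a small helper keeps the per-prop and legacy fields in agreement automatically (a sketch; `make_inject` is a name introduced here):

```python
import json

def make_inject(node_id, topic, payload_obj, z, x, y, wires):
    """Build an inject node whose per-prop v/vt AND legacy top-level fields
    agree, so the JSON payload is never silently demoted to a string."""
    payload_str = json.dumps(payload_obj)
    return {
        "id": node_id, "type": "inject", "z": z, "name": topic,
        "props": [
            {"p": "topic", "vt": "str"},
            {"p": "payload", "v": payload_str, "vt": "json"},
        ],
        "topic": topic,
        "payload": payload_str,
        "payloadType": "json",
        "x": x, "y": y, "wires": wires,
    }
```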
## 6. Dashboard widget rules

- **Widget = display only.** No business logic in `ui-text` formats or `ui-template` HTML.
- **Buttons emit a typed string payload** (`"fired"` or similar). Convert to the real msg shape with a tiny wrapper function on the same tab, before the link-out.
- **Sliders use `passthru: true`** so they re-emit on input messages (useful for syncing initial state from the process side later).
- **One ui-page per demo.** Multiple groups under one page is the natural split.
- **Group widths should sum to a multiple of 12.** The page grid is 12 columns. A row of `4 + 4 + 4` or `6 + 6` works; mixing arbitrary widths leaves gaps.
- **EVERY ui-* node needs `x` and `y` keys.** Without them Node-RED dumps the node at (0,0) — every text widget and chart piles up in the top-left of the editor canvas. The dashboard itself still renders correctly (it lays out by group/order, not editor x/y), but the editor view is unreadable. If you write a flow generator helper, set `x` and `y` on the dict EVERY time. Test with `jq '[.[] | select(.x==0 and .y==0 and (.type|tostring|startswith("ui-")))]'` after generating.
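The same check can live in the builder itself, as a Python equivalent of the `jq` one-liner (a sketch over the flow list; `misplaced_widgets` is a name introduced here):

```python
def misplaced_widgets(flow):
    """Return ids of dashboard widgets dumped at the editor origin,
    i.e. ui-* nodes whose x/y were never set (or left at 0,0)."""
    return [
        n.get("id")
        for n in flow
        if str(n.get("type", "")).startswith("ui-")
        and n.get("x", 0) == 0 and n.get("y", 0) == 0
    ]
```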
## 7. Do / don't checklist

✅ Do:

- Generate flows from a Python builder (`build_flow.py`) — it's the source of truth.
- Use deterministic IDs (`pump_a`, `meas_pump_a_u`, `lin_demand_to_mgc`) — reproducible diffs across regenerations.
- Tag every channel name with `cmd:` / `evt:` / `setup:`.
- Comment every section, even short ones.
- Verify trends with a `ui-chart` of synthetic data first, before plumbing real data through.

❌ Don't:

- Don't use `replace_all` on a Python identifier that appears in a node's own wires definition — you'll create self-loops (>250k msg/s discovered the hard way).
- Don't wire across tabs directly. Node-RED will accept the wire, but it makes the editor unreadable — use `link out` / `link in` pairs instead.
- Don't put dashboard widgets next to EVOLV nodes — different concerns.
- Don't pack nodes within 40 px of each other — labels overlap, wires snap to wrong handles.
- Don't ship `enableLog: "debug"` in a demo — fills the container log within seconds and obscures real errors.
## 8. The link-out / link-in JSON shape (cheat sheet)

```json
{
  "id": "lout_demand_dash",
  "type": "link out",
  "z": "tab_ui",
  "name": "cmd:demand",
  "mode": "link",
  "links": ["lin_demand_to_mgc"],
  "x": 380, "y": 140,
  "wires": []
}
```

```json
{
  "id": "lin_demand_to_mgc",
  "type": "link in",
  "z": "tab_process",
  "name": "cmd:demand",
  "links": ["lout_demand_dash", "lout_demand_drivers"],
  "x": 120, "y": 1500,
  "wires": [["demand_fanout_mgc_ps"]]
}
```

Both ends store the paired ids in `links`. The `name` is cosmetic (label only) — Node-RED routes by id. Multiple emitters can target one receiver; one emitter can target multiple receivers.
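Because routing is by id, a mispaired link fails silently at runtime. A builder-side symmetry check is cheap (a sketch; `check_link_pairs` is a name introduced here):

```python
def check_link_pairs(flow):
    """Verify every id in a link out's `links` names an existing link in
    that lists the emitter back, and vice versa. Returns problem strings."""
    nodes = {n["id"]: n for n in flow if "id" in n}
    problems = []
    for n in flow:
        if n.get("type") not in ("link out", "link in"):
            continue
        other = "link in" if n["type"] == "link out" else "link out"
        for target_id in n.get("links", []):
            target = nodes.get(target_id)
            if target is None or target.get("type") != other:
                problems.append(f"{n['id']}: links to missing {other} {target_id}")
            elif n["id"] not in target.get("links", []):
                problems.append(f"{target_id}: does not link back to {n['id']}")
    return problems
```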
## 9. Node configuration completeness — ALWAYS set every field

When placing an EVOLV node in a flow (demo or production), configure **every config field** the node's schema defines — don't rely on schema defaults for operational parameters. Schema defaults exist to make the validator happy, not to represent a realistic plant.

**Why this matters:** A pumpingStation with `basinVolume: 10` but default `heightOverflow: 2.5` and default `heightOutlet: 0.2` creates an internally inconsistent basin where the fill % exceeds 100%, safety guards fire at wrong thresholds, and the demo looks broken. Every field interacts with every other field.

**The rule:**

1. Read the node's config schema (`generalFunctions/src/configs/<nodeName>.json`) before writing the flow.
2. For each section (basin, hydraulics, control, safety, scaling, smoothing, …), set EVERY field explicitly in the flow JSON — even if you'd pick the same value as the default.
3. Add a comment in the flow generator per section explaining WHY you chose each value (e.g. "basin sized so sinus peak takes 6 min to fill from startLevel to overflow").
4. Cross-check computed values: `surfaceArea = volume / height`, `maxVolOverflow = heightOverflow × surfaceArea`, gauge `max` = basin `height`, fill % denominator = `volume` (not overflow volume).
5. If a gauge or chart references a config value (basin height, maxVol), derive it from the same source — never hardcode a number that was computed elsewhere.
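The cross-checks in step 4 are mechanical, so the builder can enforce them. A sketch, assuming the field names used above (`volume`, `height`, `surfaceArea`, `heightOverflow`, `maxVolOverflow`, `heightOutlet` are taken from the examples here, not a verified schema):

```python
def check_basin(cfg, tolerance=1e-6):
    """Cross-check derived basin values against the configured ones.
    Returns a list of inconsistencies (empty = consistent)."""
    problems = []
    surface = cfg["volume"] / cfg["height"]
    if abs(cfg["surfaceArea"] - surface) > tolerance:
        problems.append(f"surfaceArea {cfg['surfaceArea']} != volume/height {surface}")
    overflow_vol = cfg["heightOverflow"] * surface
    if abs(cfg["maxVolOverflow"] - overflow_vol) > tolerance:
        problems.append(f"maxVolOverflow {cfg['maxVolOverflow']} != heightOverflow*surfaceArea {overflow_vol}")
    # geometry sanity: outlet below overflow, overflow at or below basin rim
    if not (0 < cfg["heightOutlet"] < cfg["heightOverflow"] <= cfg["height"]):
        problems.append("outlet/overflow heights out of order")
    return problems
```

This is exactly the failure described above: pasting a default `heightOverflow: 2.5` into a 2 m basin trips both checks.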
## 10. Verifying the layout

Before declaring a flow done:

1. **Open the tab in the editor — every wire should run left → right.** No backward loops.
2. **Scroll to each section by its section comment — it should fit in one screen height.** If not, raise `SECTION_GAP`.
3. **Hit the dashboard URL — every widget has data.** `n/a` everywhere is a contract failure.
4. **For charts, watch a series populate over 30 s.** A blank chart after 30 s = bug.
5. **Disable each tab one at a time and re-deploy.** Process Plant alone should still load (just inert). Dashboard UI alone should serve a page (just empty). If disabling a tab errors out, the tab boundaries are wrong.
## 10. Hierarchical placement — by S88 level, not by node name

The lane assignment maps to the **S88 hierarchy**, not to specific node names. Any node that lives at a given S88 level goes in the same lane regardless of what kind of equipment it is. New node types added to the platform inherit a lane by their S88 category — no rule change needed.

### 10.1 Lane convention (x-axis = S88 level)

| Lane | x | Purpose | S88 level | Colour | Current EVOLV nodes |
|---:|---:|---|---|---|---|
| **L0** | 120 | Tab inputs | — | (none) | `link in`, `inject` |
| **L1** | 360 | Adapters | — | (none) | `function` (msg-shape wrappers) |
| **L2** | 600 | Control Module | CM | `#a9daee` | `measurement` |
| **L3** | 840 | Equipment Module | EM | `#86bbdd` | `rotatingMachine`, `valve`, `diffuser` |
| **L4** | 1080 | Unit | UN | `#50a8d9` | `machineGroupControl`, `valveGroupControl`, `reactor`, `settler`, `monster` |
| **L5** | 1320 | Process Cell | PC | `#0c99d9` | `pumpingStation` |
| **L6** | 1560 | Output formatters | — | (none) | `function` (build dashboard payload from port 0) |
| **L7** | 1800 | Tab outputs | — | (none) | `link out`, `debug` |

Spacing: **240 px** between lanes. Tab width ≤ 1920 px (fits standard monitors without horizontal scroll in the editor).

**Area level** (`#0f52a5`) is reserved for plant-wide coordination and currently unused — when added, allocate a new lane and shift formatter/output one lane right (i.e. expand to 9 lanes if and when needed).
### 10.2 The group rule (Node-RED `group` boxes anchor each parent + its children)

Use Node-RED's native `group` node (the visual box around a set of nodes — not to be confused with `ui-group`) to anchor every "parent + direct children" cluster. The box makes ownership unambiguous and lets you collapse the cluster in the editor.

**Group rules:**

- **One Node-RED group per parent + its direct children.**
  Example: `Pump A + meas-A-up + meas-A-dn` is one group, named `Pump A`.
- **Group colour = parent's S88 colour.**
  So a Pump-A group is `#86bbdd` (Equipment Module). A reactor group is `#50a8d9` (Unit).
- **Group `style.label = true`** so the box shows the parent's name.
- **Group must contain all the children's adapters / wrappers / formatters** too if those exclusively belong to the parent. The box is the visual anchor for "this is everything that owns / serves Pump A".
- **Utility groups for cross-cutting logic** (mode broadcast, station-wide commands, demand fan-out) use a neutral colour (`#dddddd`).

JSON shape:

```json
{
  "id": "grp_pump_a",
  "type": "group",
  "z": "tab_process",
  "name": "Pump A",
  "style": { "label": true, "stroke": "#000000", "fill": "#86bbdd", "fill-opacity": "0.10" },
  "nodes": ["meas_pump_a_u", "meas_pump_a_d", "pump_a", "format_pump_a", "lin_setpoint_pump_a", "build_setpoint_pump_a", "lin_seq_pump_a", "lout_evt_pump_a"],
  "x": 80, "y": 100, "w": 1800, "h": 200
}
```

`x/y/w/h` is the bounding box of contained nodes + padding — compute it from the children's positions.
### 10.3 The hierarchy rule, restated

> Nodes at the **same S88 level** (siblings sharing one parent) **stack vertically in the same lane**.
>
> Nodes at **different S88 levels** (parent ↔ child) sit **next to each other on different lanes**.
### 10.4 Worked example — pumping station demo

```
L0         L1          L2           L3         L4       L5      L6            L7
(input)    (adapter)   (CM)         (EM)       (Unit)   (PC)    (formatter)   (output)

┌── group: Pump A (#86bbdd) ──────────────────────────────────────────────────────────────┐
│ [lin-set-A]  [build-A]                                                                  │
│ [lin-seq-A]                                                                             │
│              [meas-A-up]                                                                │
│              [meas-A-dn] → [Pump A] →                                                   │
│                                                          [format-A] → [lout-evt-A]      │
└─────────────────────────────────────────────────────────────────────────────────────────┘

┌── group: Pump B (#86bbdd) ──────────────────────────────────────────────────────────────┐
│ ... same shape ...                                                                      │
└─────────────────────────────────────────────────────────────────────────────────────────┘

┌── group: Pump C (#86bbdd) ──────────────────────────────────────────────────────────────┐
│ ... same shape ...                                                                      │
└─────────────────────────────────────────────────────────────────────────────────────────┘

┌── group: MGC — Pump Group (#50a8d9) ────────────────────────────────────────────────────┐
│ [lin-demand] [demand→MGC+PS]           [MGC]             [format-MGC] → [lout-evt-MGC]  │
└─────────────────────────────────────────────────────────────────────────────────────────┘

┌── group: Pumping Station (#0c99d9) ─────────────────────────────────────────────────────┐
│                                                 [PS]     [format-PS] → [lout-evt-PS]    │
└─────────────────────────────────────────────────────────────────────────────────────────┘

┌── group: Mode broadcast (#dddddd, neutral) ─────────────────────────────────────────────┐
│ [lin-mode]   [fan-mode] ─────────► to all 3 pumps in the Pump A/B/C groups              │
└─────────────────────────────────────────────────────────────────────────────────────────┘

┌── group: Station-wide commands (#dddddd) ───────────────────────────────────────────────┐
│ [lin-start]  [fan-start] ─► to pumps                                                    │
│ [lin-stop]   [fan-stop]                                                                 │
│ [lin-estop]  [fan-estop]                                                                │
└─────────────────────────────────────────────────────────────────────────────────────────┘
```

What that buys:

- Search "Pump A" highlights the whole group box (parent + sensors + adapters + formatter).
- S88 colour of the group box tells you the level at a glance.
- Wires are horizontal within a group; cross-group wires (Pump A port 2 → MGC) cross only one band.
- Collapse a group in the editor and it becomes a single tile — clutter disappears during reviews.
### 10.5 Multi-input fan-in rule

Stack link-ins tightly at L0, centred on the destination's y. Merge node one lane right at the same y.

### 10.6 Multi-output fan-out rule

Source at the y-centre of its destinations; destinations stack vertically in the next lane. Wires fork cleanly without jogging.

### 10.7 Link-in placement (within a tab)

- All link-ins on **L0**.
- Order them top-to-bottom by the y of their **first downstream target**.
- Link-ins that feed the same destination share the same y-band as that destination.
### 10.8 Link-out placement (within a tab)

- All link-outs on **L7** (the rightmost lane).
- Each link-out's y matches its **upstream source's** y, so the wire is horizontal.

### 10.9 Cross-tab wire rule

Cross-tab wires use `link out` / `link in` pairs (see Section 2). Direct cross-tab wires are forbidden.

### 10.10 The "no jog" verification

- A wire whose source y == destination y is fine (perfectly horizontal).
- A wire that jogs vertically by ≤ 80 px is fine (one row of slop).
- A wire that jogs by > 80 px means **the destination is in the wrong group y-band**. Move the destination, not the source — the source's position was determined by its own group.
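The jog rule is machine-checkable over the generated flow. A sketch (`wire_jogs` is a name introduced here) that skips multi-target fan-out ports, since those legitimately jog:

```python
MAX_JOG = 80  # px; anything larger means the destination is in the wrong y-band

def wire_jogs(flow, max_jog=MAX_JOG):
    """Return (source_id, dest_id, jog) for every single-target wire whose
    vertical jog exceeds max_jog."""
    nodes = {n["id"]: n for n in flow if "id" in n}
    offenders = []
    for n in flow:
        for port in n.get("wires", []):
            if len(port) != 1:          # fan-out forks are allowed to jog
                continue
            dest = nodes.get(port[0])
            if dest is None or "y" not in n or "y" not in dest:
                continue
            jog = abs(n["y"] - dest["y"])
            if jog > max_jog:
                offenders.append((n["id"], dest["id"], jog))
    return offenders
```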
## 11. Dashboard tab variant

Dashboard widgets are stamped to the real grid by the FlowFuse renderer; editor x/y is for the editor's readability.

- Use only **L0, L2, L4, L7**:
  - L0 = `link in` (events from process)
  - L2 = `ui-*` inputs (sliders, switches, buttons)
  - L4 = wrapper / format / trend-split functions
  - L7 = `link out` (commands going back)
- **One Node-RED group per `ui-group`.** Editor group's name matches the `ui-group` name. Colour follows the S88 level of the represented equipment (MGC group = `#50a8d9`, Pump A group = `#86bbdd`, …) so the editor view mirrors the dashboard structure.
- Within the group, widgets stack vertically by their visual order in the dashboard.
## 12. Setup tab variant

Single-column ladder L0 → L7, ordered top-to-bottom by `onceDelay`. Wrap in a single neutral-grey Node-RED group named `Deploy-time setup`.

## 13. Demo Drivers tab variant

Same as Process Plant but typically only L0, L2, L4, L7 are used. Wrap each driver (random gen, scripted scenario, …) in its own neutral Node-RED group.
## 14. Spacing constants (final)

```python
LANE_X = [120, 360, 600, 840, 1080, 1320, 1560, 1800]
SIBLING_PITCH = 40
GROUP_GAP = 200
TAB_TOP_MARGIN = 80
GROUP_PADDING = 20   # extra px around child bounding box for the Node-RED group box

S88_COLORS = {
    "AR": "#0f52a5",  # Area (currently unused)
    "PC": "#0c99d9",  # Process Cell
    "UN": "#50a8d9",  # Unit
    "EM": "#86bbdd",  # Equipment Module
    "CM": "#a9daee",  # Control Module
    "neutral": "#dddddd",
}

# Registry: drop a new node type here to place it automatically.
NODE_LEVEL = {
    "measurement": "CM",
    "rotatingMachine": "EM",
    "valve": "EM",
    "diffuser": "EM",
    "machineGroupControl": "UN",
    "valveGroupControl": "UN",
    "reactor": "UN",
    "settler": "UN",
    "monster": "UN",
    "pumpingStation": "PC",
    "dashboardAPI": "neutral",
}
```
Helpers for the build script:

```python
def place(lane, group_index, position_in_group, group_size):
    """Compute (x, y) for a node in a process group.

    Note: the band offset uses the current group's size for all preceding
    groups, so it assumes every group on the tab has the same row count."""
    x = LANE_X[lane]
    band_centre = TAB_TOP_MARGIN + group_index * (group_size * SIBLING_PITCH + GROUP_GAP) \
                  + (group_size - 1) * SIBLING_PITCH / 2
    y = band_centre + (position_in_group - (group_size - 1) / 2) * SIBLING_PITCH
    return int(x), int(y)


def wrap_in_group(child_ids, name, s88_color, nodes_by_id, padding=GROUP_PADDING):
    """Compute the Node-RED group box around a set of children.
    The caller adds the "id" and "z" keys (see the JSON shape in 10.2)."""
    xs = [nodes_by_id[c]["x"] for c in child_ids]
    ys = [nodes_by_id[c]["y"] for c in child_ids]
    return {
        "type": "group", "name": name,
        "style": {"label": True, "stroke": "#000000", "fill": s88_color, "fill-opacity": "0.10"},
        "nodes": list(child_ids),
        "x": min(xs) - padding, "y": min(ys) - padding,
        "w": max(xs) - min(xs) + 160 + 2 * padding,   # 160 ≈ node width allowance
        "h": max(ys) - min(ys) + 40 + 2 * padding,    # 40 = one row pitch
    }
```
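As a worked example, here is where `place()` puts the Pump A and Pump B rows from the Section 10.4 diagram (the constants and function are copied from above so the snippet runs standalone; the row assignments are illustrative):

```python
# Standalone copies of the Section 14 constants and place() helper.
LANE_X = [120, 360, 600, 840, 1080, 1320, 1560, 1800]
SIBLING_PITCH = 40
GROUP_GAP = 200
TAB_TOP_MARGIN = 80

def place(lane, group_index, position_in_group, group_size):
    x = LANE_X[lane]
    band_centre = TAB_TOP_MARGIN + group_index * (group_size * SIBLING_PITCH + GROUP_GAP) \
                  + (group_size - 1) * SIBLING_PITCH / 2
    y = band_centre + (position_in_group - (group_size - 1) / 2) * SIBLING_PITCH
    return int(x), int(y)

# Pump A = group 0, four rows: two measurements on the CM lane, the pump one
# lane right on the EM lane, and its formatter on L6.
meas_up = place(2, 0, 0, 4)    # (600, 80)
meas_dn = place(2, 0, 1, 4)    # (600, 120)
pump_a  = place(3, 0, 2, 4)    # (840, 160)
fmt_a   = place(6, 0, 3, 4)    # (1560, 200)

# Pump B = group 1: same shape, one GROUP_GAP further down the tab.
meas_up_b = place(2, 1, 0, 4)  # (600, 440)
```

Siblings land 40 px apart in one lane, and the next group starts a full `GROUP_GAP` lower, which is exactly the band structure the diagram shows.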
## 15. Verification checklist (extends Section 10)

After building a tab:

1. **No wire jogs > 80 px vertically within a group.**
2. **Each lane contains nodes of one purpose only** (never a `ui-text` on L3; never a `rotatingMachine` on L2).
3. **Peers share a lane; parents and children sit on adjacent lanes.**
4. **Every parent + direct children sit inside one Node-RED group box, coloured by the parent's S88 level.**
5. **Utility groups** (mode broadcast, station commands, demand fan-out) wrapped in neutral-grey Node-RED groups.
6. **Section comments at the top of each group band.**
7. **Editor scrollable in y but NOT in x** on a normal monitor.
8. **Search test:** typing the parent's name in the editor highlights the whole group box.
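Check 2 is also automatable. A sketch that maps each lane x to the node types the Section 10.1 table allows there (`LANE_TYPES` and `lane_violations` are names introduced here; the adapter/formatter lanes are left unchecked because any `function` may sit there):

```python
# Allowed node types per lane x, per the Section 10.1 table. Lanes 360 (L1)
# and 1560 (L6) hold plain `function` adapters/formatters and are not listed.
LANE_TYPES = {
    120:  {"link in", "inject"},
    600:  {"measurement"},
    840:  {"rotatingMachine", "valve", "diffuser"},
    1080: {"machineGroupControl", "valveGroupControl", "reactor", "settler", "monster"},
    1320: {"pumpingStation"},
    1800: {"link out", "debug"},
}

def lane_violations(flow):
    """Flag nodes sitting on a lane their type is not allowed on.
    Section comments are exempt, since they label group bands."""
    out = []
    for n in flow:
        allowed = LANE_TYPES.get(n.get("x"))
        if allowed is not None and n.get("type") not in allowed | {"comment"}:
            out.append((n.get("id"), n.get("type"), n.get("x")))
    return out
```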
## 16. S88 colour cleanup (separate follow-up task)

These nodes don't currently follow the S88 palette. They should be brought in line in a separate session before the placement rule is fully consistent across the editor:

- `settler` (`#e4a363` orange) → should be `#50a8d9` (Unit)
- `monster` (`#4f8582` teal) → should be `#50a8d9` (Unit)
- `diffuser` (no colour set) → should be `#86bbdd` (Equipment Module)
- `dashboardAPI` (no colour set) → utility, no S88 colour needed

Until cleaned up, the placement rule still works — `NODE_LEVEL` (Section 14) already maps these to their semantic S88 level regardless of the node's own colour.
77
.claude/settings.local.json
Normal file
@@ -0,0 +1,77 @@
{
"permissions": {
"allow": [
"Bash(node --test:*)",
"Bash(node -c:*)",
"Bash(npm:*)",
"Bash(git:*)",
"Bash(ls:*)",
"Bash(tree:*)",
"Bash(wc:*)",
"Bash(head:*)",
"Bash(tail:*)",
"Bash(sort:*)",
"Bash(find:*)",
"Bash(echo:*)",
"Bash(cat:*)",
"Bash(cut:*)",
"Bash(xargs:*)",
"WebSearch",
"WebFetch(domain:nodered.org)",
"WebFetch(domain:docs.influxdata.com)",
"WebFetch(domain:github.com)",
"WebFetch(domain:docs.anthropic.com)",
"WebFetch(domain:nodejs.org)",
"WebFetch(domain:www.npmjs.com)",
"WebFetch(domain:developer.mozilla.org)",
"WebFetch(domain:flowfuse.com)",
"WebFetch(domain:www.coolprop.org)",
"WebFetch(domain:en.wikipedia.org)",
"WebFetch(domain:www.engineeringtoolbox.com)",
"mcp__ide__getDiagnostics",
"Bash(chmod +x:*)",
"Bash(docker compose:*)",
"Bash(docker:*)",
"Bash(npm run docker:*)",
"Bash(sh:*)",
"Bash(curl:*)",
"Bash(# Check Node-RED context for the parse function to see if it received data\ndocker compose exec -T nodered sh -c 'curl -sf \"http://localhost:1880/context/node/demo_fn_ps_west_parse\" 2>/dev/null' | python3 -c \"\nimport json, sys\ntry:\n data = json.load\\(sys.stdin\\)\n print\\(json.dumps\\(data, indent=2\\)[:800]\\)\nexcept Exception as e: print\\(f'Error: {e}'\\)\n\" 2>&1)",
"Bash(# Check what the deployed flow looks like for link out type nodes\ncurl -sf http://localhost:1880/flows 2>/dev/null | python3 -c \"\nimport json, sys\nflows = json.load\\(sys.stdin\\)\n# All node types and their counts\nfrom collections import Counter\ntypes = Counter\\(n.get\\('type',''\\) for n in flows if 'type' in n\\)\nfor t, c in sorted\\(types.items\\(\\)\\):\n if 'link' in t.lower\\(\\):\n print\\(f'{t}: {c}'\\)\nprint\\('---'\\)\n# Show existing link out nodes\nfor n in flows:\n if n.get\\('type'\\) == 'link out':\n print\\(f' {n[\\\\\"id\\\\\"]}: links={n.get\\(\\\\\"links\\\\\",[]\\)}'\\)\n\" 2>&1)",
"Bash(# Full count of all deployed node types\ncurl -sf http://localhost:1880/flows 2>/dev/null | python3 -c \"\nimport json, sys\nflows = json.load\\(sys.stdin\\)\nfrom collections import Counter\ntypes = Counter\\(n.get\\('type',''\\) for n in flows if 'type' in n\\)\nfor t, c in sorted\\(types.items\\(\\)\\):\n print\\(f'{t:30s}: {c}'\\)\nprint\\(f'Total nodes: {len\\(flows\\)}'\\)\n\" 2>&1)",
"Bash(# Check exact registered node type names\ncurl -sf http://localhost:1880/nodes 2>/dev/null | python3 -c \"\nimport json, sys\nnodes = json.load\\(sys.stdin\\)\nfor mod in nodes:\n if 'EVOLV' in json.dumps\\(mod\\) or 'evolv' in json.dumps\\(mod\\).lower\\(\\):\n if isinstance\\(mod, dict\\) and 'types' in mod:\n for t in mod['types']:\n print\\(f'Registered type: {t}'\\)\n elif isinstance\\(mod, dict\\) and 'nodes' in mod:\n for n in mod['nodes']:\n for t in n.get\\('types', []\\):\n print\\(f'Registered type: {t}'\\)\n\" 2>&1)",
"Bash(# Get node types from the /nodes endpoint properly\ndocker compose exec -T nodered sh -c 'curl -sf http://localhost:1880/nodes' | python3 -c \"\nimport json, sys\ndata = json.load\\(sys.stdin\\)\n# Find EVOLV node types\nfor module in data:\n if isinstance\\(module, dict\\):\n name = module.get\\('name', module.get\\('module', ''\\)\\)\n if 'EVOLV' in str\\(name\\).upper\\(\\) or 'evolv' in str\\(name\\).lower\\(\\):\n print\\(f'Module: {name}'\\)\n for node_set in module.get\\('nodes', []\\):\n for t in node_set.get\\('types', []\\):\n print\\(f' Type: {t}'\\)\n\" 2>&1)",
"Bash(# Get raw flow data directly from inside the container\ndocker compose exec -T nodered sh -c 'curl -sf http://localhost:1880/flows 2>/dev/null' | python3 -c \"\nimport json, sys\ndata = json.load\\(sys.stdin\\)\nprint\\(f'Total entries: {len\\(data\\)}'\\)\nprint\\(f'Type: {type\\(data\\)}'\\)\nif isinstance\\(data, list\\):\n print\\('First 3:'\\)\n for n in data[:3]:\n print\\(f' {n.get\\(\\\\\"id\\\\\",\\\\\"?\\\\\"\\)}: type={n.get\\(\\\\\"type\\\\\",\\\\\"?\\\\\"\\)}'\\)\n # Count\n from collections import Counter\n types = Counter\\(n.get\\('type',''\\) for n in data\\)\n for t, c in sorted\\(types.items\\(\\)\\):\n print\\(f' {t}: {c}'\\)\nelif isinstance\\(data, dict\\):\n print\\(f'Keys: {list\\(data.keys\\(\\)\\)}'\\)\n if 'flows' in data:\n flows = data['flows']\n print\\(f'Flows count: {len\\(flows\\)}'\\)\n from collections import Counter\n types = Counter\\(n.get\\('type',''\\) for n in flows\\)\n for t, c in sorted\\(types.items\\(\\)\\):\n print\\(f' {t}: {c}'\\)\n\" 2>&1)",
"Bash(# Check individual tab flows\ndocker compose exec -T nodered sh -c 'curl -sf http://localhost:1880/flow/demo_tab_wwtp' | python3 -c \"\nimport json, sys\ndata = json.load\\(sys.stdin\\)\nif isinstance\\(data, dict\\):\n print\\(f'Tab: {data.get\\(\\\\\"label\\\\\",\\\\\"?\\\\\"\\)}'\\)\n nodes = data.get\\('nodes', []\\)\n print\\(f'Nodes: {len\\(nodes\\)}'\\)\n from collections import Counter\n types = Counter\\(n.get\\('type',''\\) for n in nodes\\)\n for t, c in sorted\\(types.items\\(\\)\\):\n print\\(f' {t}: {c}'\\)\nelse:\n print\\(data\\)\n\" 2>&1)",
"Bash(sleep 5:*)",
"Bash(sleep 15:*)",
"Bash(# Get all dashboard UIDs and update the bucket variable from lvl2 to telemetry\ncurl -sf -H \"Authorization: Bearer glsa_4tbdInvrkQ6c7J6N3InjSsH8de83vZ66_9db7efa3\" \\\\\n \"http://localhost:3000/api/search?type=dash-db\" | python3 -c \"\nimport json, sys\ndashboards = json.load\\(sys.stdin\\)\nfor d in dashboards:\n print\\(d['uid']\\)\n\" 2>&1)",
"Bash(sleep 20:*)",
"Bash(# Check reactor parse function context\ncurl -sf http://localhost:1880/flows 2>/dev/null | python3 -c \"\nimport json, sys\nflows = json.load\\(sys.stdin\\)\n# Find parse functions by name\nfor n in flows:\n if n.get\\('type'\\) == 'function' and 'reactor' in n.get\\('name',''\\).lower\\(\\):\n print\\(f\\\\\"Reactor parse: id={n['id']}, name={n.get\\('name'\\)}\\\\\"\\)\" 2>&1)",
"Bash(# Check if reactor node is sending output — look at debug info\ncurl -sf http://localhost:1880/flows 2>/dev/null | python3 -c \"\nimport json, sys\nflows = json.load\\(sys.stdin\\)\n# Find the reactor node and its wires\nfor n in flows:\n if n.get\\('type'\\) == 'reactor':\n print\\(f\\\\\"Reactor: id={n['id']}, name={n.get\\('name',''\\)}\\\\\"\\)\n wires = n.get\\('wires', []\\)\n for i, port in enumerate\\(wires\\):\n print\\(f' Port {i}: {port}'\\)\n if n.get\\('type'\\) == 'link out' and 'reactor' in n.get\\('name',''\\).lower\\(\\):\n print\\(f\\\\\"Link-out reactor: id={n['id']}, name={n.get\\('name',''\\)}, links={n.get\\('links',[]\\)}\\\\\"\\)\" 2>&1)",
"Bash(# Check measurement node wiring and output\ncurl -sf http://localhost:1880/flows 2>/dev/null | python3 -c \"\nimport json, sys\nflows = json.load\\(sys.stdin\\)\nfor n in flows:\n if n.get\\('type'\\) == 'measurement':\n print\\(f\\\\\"Measurement: id={n['id']}, name={n.get\\('name',''\\)}\\\\\"\\)\n wires = n.get\\('wires', []\\)\n for i, port in enumerate\\(wires\\):\n print\\(f' Port {i}: {port}'\\)\n if n.get\\('type'\\) == 'link out' and 'meas' in n.get\\('name',''\\).lower\\(\\):\n print\\(f\\\\\"Link-out meas: id={n['id']}, name={n.get\\('name',''\\)}, links={n.get\\('links',[]\\)}\\\\\"\\)\" 2>&1)",
"Bash(# Check reactor node config and measurement configs\ncurl -sf http://localhost:1880/flows 2>/dev/null | python3 -c \"\nimport json, sys\nflows = json.load\\(sys.stdin\\)\nfor n in flows:\n if n.get\\('type'\\) == 'reactor':\n print\\('=== REACTOR CONFIG ==='\\)\n for k,v in sorted\\(n.items\\(\\)\\):\n if k not in \\('wires','x','y','z'\\):\n print\\(f' {k}: {v}'\\)\n if n.get\\('type'\\) == 'measurement' and n.get\\('id'\\) == 'demo_meas_flow':\n print\\('=== MEASUREMENT FT-001 CONFIG ==='\\)\n for k,v in sorted\\(n.items\\(\\)\\):\n if k not in \\('wires','x','y','z'\\):\n print\\(f' {k}: {v}'\\)\" 2>&1)",
"Bash(# Check what inject/input nodes target the measurement nodes\ncurl -sf http://localhost:1880/flows 2>/dev/null | python3 -c \"\nimport json, sys\nflows = json.load\\(sys.stdin\\)\n\n# Find all nodes that wire INTO the measurement nodes\nmeas_ids = {'demo_meas_flow', 'demo_meas_do', 'demo_meas_nh4'}\nfor n in flows:\n wires = n.get\\('wires', []\\)\n for port_idx, port_wires in enumerate\\(wires\\):\n for target in port_wires:\n if target in meas_ids:\n print\\(f'{n.get\\(\\\\\"type\\\\\"\\)}:{n.get\\(\\\\\"name\\\\\",\\\\\"\\\\\"\\)} \\(id={n.get\\(\\\\\"id\\\\\"\\)}\\) port {port_idx} → {target}'\\)\n\n# Check inject nodes that send to measurements \nprint\\(\\)\nprint\\('=== Inject nodes ==='\\)\nfor n in flows:\n if n.get\\('type'\\) == 'inject':\n wires = n.get\\('wires', []\\)\n all_targets = [t for port in wires for t in port]\n print\\(f'inject: {n.get\\(\\\\\"name\\\\\",\\\\\"\\\\\"\\)} id={n.get\\(\\\\\"id\\\\\"\\)} → targets={all_targets} repeat={n.get\\(\\\\\"repeat\\\\\",\\\\\"\\\\\"\\)} topic={n.get\\(\\\\\"topic\\\\\",\\\\\"\\\\\"\\)}'\\)\" 2>&1)",
"Bash(# Check the simulator function code for measurements\ncurl -sf http://localhost:1880/flows 2>/dev/null | python3 -c \"\nimport json, sys\nflows = json.load\\(sys.stdin\\)\nfor n in flows:\n if n.get\\('id'\\) in \\('demo_fn_sim_flow', 'demo_fn_sim_do', 'demo_fn_sim_nh4'\\):\n print\\(f'=== {n.get\\(\\\\\"name\\\\\"\\)} ==='\\)\n print\\(n.get\\('func',''\\)\\)\n print\\(\\)\" 2>&1)",
"Bash(# Check what the reactor tick inject sends\ncurl -sf http://localhost:1880/flows 2>/dev/null | python3 -c \"\nimport json, sys\nflows = json.load\\(sys.stdin\\)\nfor n in flows:\n if n.get\\('id'\\) == 'demo_inj_reactor_tick':\n print\\('=== Reactor tick inject ==='\\)\n for k,v in sorted\\(n.items\\(\\)\\):\n if k not in \\('x','y','z','wires'\\):\n print\\(f' {k}: {v}'\\)\n if n.get\\('id'\\) == 'demo_inj_meas_flow':\n print\\('=== Flow sensor inject ==='\\)\n for k,v in sorted\\(n.items\\(\\)\\):\n if k not in \\('x','y','z','wires'\\):\n print\\(f' {k}: {v}'\\)\" 2>&1)",
|
||||||
|
"Bash(# Check measurement parse function code\ncurl -sf http://localhost:1880/flows 2>/dev/null | python3 -c \"\nimport json, sys\nflows = json.load\\(sys.stdin\\)\nfor n in flows:\n if n.get\\('id'\\) == 'demo_fn_reactor_parse':\n print\\('=== Parse Reactor ==='\\)\n print\\(n.get\\('func',''\\)\\)\n print\\(\\)\n if n.get\\('id'\\) == 'demo_fn_meas_parse':\n print\\('=== Parse Measurements ==='\\)\n print\\(n.get\\('func',''\\)\\)\n print\\(\\)\n if n.get\\('type'\\) == 'function' and 'meas' in n.get\\('name',''\\).lower\\(\\) and 'parse' in n.get\\('name',''\\).lower\\(\\):\n print\\(f'=== {n.get\\(\\\\\"name\\\\\"\\)} \\(id={n.get\\(\\\\\"id\\\\\"\\)}\\) ==='\\)\n print\\(n.get\\('func',''\\)\\)\n print\\(\\)\" 2>&1)",
|
||||||
|
"Bash(# Check the link node pairs are properly paired\ncurl -sf http://localhost:1880/flows 2>/dev/null | python3 -c \"\nimport json, sys\nflows = json.load\\(sys.stdin\\)\nnodes = {n['id']: n for n in flows if 'id' in n}\n\nlink_outs = [n for n in flows if n.get\\('type'\\) == 'link out']\nlink_ins = [n for n in flows if n.get\\('type'\\) == 'link in']\n\nprint\\('=== Link-out nodes ==='\\)\nfor lo in link_outs:\n links = lo.get\\('links', []\\)\n targets = [nodes.get\\(l, {}\\).get\\('name', f'MISSING:{l}'\\) for l in links]\n tab = nodes.get\\(lo.get\\('z',''\\), {}\\).get\\('label', '?'\\)\n print\\(f' [{tab}] {lo.get\\(\\\\\"name\\\\\",\\\\\"\\\\\"\\)} \\(id={lo[\\\\\"id\\\\\"]}\\) → {targets}'\\)\n\nprint\\(\\)\nprint\\('=== Link-in nodes ==='\\) \nfor li in link_ins:\n links = li.get\\('links', []\\)\n tab = nodes.get\\(li.get\\('z',''\\), {}\\).get\\('label', '?'\\)\n print\\(f' [{tab}] {li.get\\(\\\\\"name\\\\\",\\\\\"\\\\\"\\)} \\(id={li[\\\\\"id\\\\\"]}\\) links={links}'\\)\" 2>&1)",
|
||||||
|
"Bash(sleep 8:*)",
|
||||||
|
"Bash(# Check the InfluxDB convert function and HTTP request config\ncurl -sf http://localhost:1880/flows 2>/dev/null | python3 -c \"\nimport json, sys\nflows = json.load\\(sys.stdin\\)\nfor n in flows:\n if n.get\\('id'\\) == 'demo_fn_influx_convert':\n print\\('=== InfluxDB Convert Function ==='\\)\n print\\(f'func: {n.get\\(\\\\\"func\\\\\",\\\\\"\\\\\"\\)}'\\)\n print\\(f'wires: {n.get\\(\\\\\"wires\\\\\",[]\\)}'\\)\n print\\(\\)\n if n.get\\('id'\\) == 'demo_http_influx':\n print\\('=== Write InfluxDB HTTP ==='\\)\n for k,v in sorted\\(n.items\\(\\)\\):\n if k not in \\('x','y','z','wires'\\):\n print\\(f' {k}: {v}'\\)\n print\\(f' wires: {n.get\\(\\\\\"wires\\\\\",[]\\)}'\\)\n\" 2>&1)",
|
||||||
|
"Bash(echo Grafana API not accessible:*)",
|
||||||
|
"Bash(python3 -c \":*)",
|
||||||
|
"Bash(__NEW_LINE_6565c53f4a65adcb__ echo \"\")",
|
||||||
|
"Bash(__NEW_LINE_43bd4a070667d63e__ echo \"\")",
|
||||||
|
"Bash(node:*)",
|
||||||
|
"Bash(python3:*)",
|
||||||
|
"WebFetch(domain:dashboard.flowfuse.com)",
|
||||||
|
"Bash(do echo:*)",
|
||||||
|
"Bash(__NEW_LINE_5a355214e3d8caae__ git:*)",
|
||||||
|
"Bash(git add:*)",
|
||||||
|
"Bash(__NEW_LINE_4762b8ca1fb65139__ for:*)",
|
||||||
|
"Bash(docker.exe ps:*)",
|
||||||
|
"Bash(docker.exe logs:*)",
|
||||||
|
"Bash(docker.exe compose:*)",
|
||||||
|
"Bash(docker.exe exec:*)"
|
||||||
|
]
|
||||||
|
}
|
||||||
|
}
|
||||||
@@ -11,7 +11,9 @@ node_modules/
 # Agent/Claude metadata (not needed at runtime)
 .agents/
 .claude/
-manuals/
+
+# Documentation (not needed at runtime)
+wiki/
 
 # IDE
 .vscode/
@@ -23,10 +25,3 @@ manuals/
 # OS
 .DS_Store
 Thumbs.db
-
-# Documentation (not needed at runtime)
-third_party/
-FUNCTIONAL_ISSUES_BACKLOG.md
-AGENTS.md
-README.md
-LICENSE
|||||||
41	.gitea/workflows/ci.yml	Normal file
@@ -0,0 +1,41 @@
name: CI

on:
  push:
    branches: [main, develop, dev-Rene]
  pull_request:
    branches: [main]

jobs:
  lint-and-test:
    runs-on: ubuntu-latest
    container:
      image: node:20-slim

    steps:
      - name: Install git
        run: apt-get update -qq && apt-get install -y -qq git

      - name: Checkout with submodules
        uses: actions/checkout@v4
        with:
          submodules: recursive

      - name: Rewrite generalFunctions to local path
        run: |
          sed -i 's|"generalFunctions": "git+https://[^"]*"|"generalFunctions": "file:./nodes/generalFunctions"|' package.json

      - name: Install dependencies
        run: npm install --ignore-scripts

      - name: Lint
        run: npm run lint

      - name: Test (Jest)
        run: npm test

      - name: Test (node:test)
        run: npm run test:node

      - name: Test (legacy)
        run: npm run test:legacy
@@ -1,2 +0,0 @@
-# Ignore test files
-node_modules/
33	CLAUDE.md	Normal file
@@ -0,0 +1,33 @@
# EVOLV - Claude Code Project Guide

## What This Is
Node-RED custom nodes package for wastewater treatment plant automation. Developed by Waterschap Brabantse Delta R&D team. Follows ISA-88 (S88) batch control standard.

## Architecture
Each node follows a three-layer pattern:
1. **Node-RED wrapper** (`<name>.js`) - registers the node type, sets up HTTP endpoints
2. **Node adapter** (`src/nodeClass.js`) - bridges Node-RED API with domain logic, handles config loading, tick loops, events
3. **Domain logic** (`src/specificClass.js`) - pure business logic, no Node-RED dependencies

## Key Shared Library: `nodes/generalFunctions/`
- `logger` - structured logging (use this, NOT console.log)
- `MeasurementContainer` - chainable measurement storage (type/variant/position)
- `configManager` - loads JSON configs from `src/configs/`
- `MenuManager` - dynamic UI dropdowns
- `outputUtils` - formats messages for InfluxDB and process outputs
- `childRegistrationUtils` - parent-child node relationships
- `coolprop` - thermodynamic property calculations

## Conventions
- Nodes register under category `'EVOLV'` in Node-RED
- S88 color scheme: Area=#0f52a5, ProcessCell=#0c99d9, Unit=#50a8d9, Equipment=#86bbdd, ControlModule=#a9daee
- Config JSON files in `generalFunctions/src/configs/` define defaults, types, enums per node
- Tick loop runs at 1000ms intervals for time-based updates
- Three outputs per node: [process, dbase, parent]
- **Multi-tab demo flows**: see `.claude/rules/node-red-flow-layout.md` for the tab/link-channel/spacing rule set used by `examples/`

## Development Notes
- No build step required - pure Node.js
- Install: `npm install` in root
- Submodule URLs were rewritten from `gitea.centraal.wbd-rd.nl` to `gitea.wbd-rd.nl` for external access
- Dependencies: mathjs, generalFunctions (git submodule)
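The three-layer pattern and the `[process, dbase, parent]` output convention described in CLAUDE.md can be sketched as follows. All class names, the valve behaviour, and the movement rate are illustrative assumptions, not the real EVOLV classes:

```javascript
// Layer 3 (specificClass role): pure domain logic, no Node-RED dependencies.
class ValveLogic {
  constructor() { this.position = 0; } // 0..100 %
  tick(setpoint) {
    // Move at most 10 % per tick toward the setpoint.
    const step = Math.sign(setpoint - this.position) *
      Math.min(10, Math.abs(setpoint - this.position));
    this.position += step;
    return this.position;
  }
}

// Layer 2 (nodeClass role): adapter bridging a runtime API with the domain object.
class ValveNodeClass {
  constructor(send) {
    this.logic = new ValveLogic();
    this.send = send; // Node-RED's node.send in real code
  }
  onTick(setpoint) {
    const pos = this.logic.tick(setpoint);
    // Three outputs per node: [process, dbase, parent]
    this.send([
      { payload: pos },                         // process output
      { measurement: 'valve', value: pos },     // dbase (InfluxDB) output
      null,                                     // parent output unused here
    ]);
  }
}

// Layer 1 (entry file) would call RED.nodes.registerType('valve', ...) and
// delegate to ValveNodeClass; omitted because it needs a live runtime.
module.exports = { ValveLogic, ValveNodeClass };
```

Because layer 3 never touches the runtime API, it can be unit-tested directly, which is what the per-node `test/` suites rely on.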
@@ -1,6 +0,0 @@
-# Functional Issues Backlog (Deprecated Location)
-
-This backlog has moved to:
-- `.agents/improvements/IMPROVEMENTS_BACKLOG.md`
-
-Use `.agents/improvements/TOP10_PRODUCTION_PRIORITIES_YYYY-MM-DD.md` for ranked review lists.
162	README.md
@@ -1,147 +1,77 @@
-# R&D Building Block: EVOLV (Edge-Layer Evolution for Optimized Virtualization)
+# EVOLV — Edge-Layer Evolution for Optimized Virtualization
 
-## About
+Node-RED custom nodes package for wastewater treatment plant automation. Developed by the R&D team of Waterschap Brabantse Delta. Follows the ISA-88 (S88) batch control standard.
 
-This building block was developed by the R&D team of Waterschap Brabantse Delta for use in Node-RED.
+## Nodes
 
+| Node | Function | S88 level |
+|------|---------|------------|
+| **rotatingMachine** | Individual pump/compressor/blower control | Equipment |
+| **machineGroupControl** | Multi-pump optimization (BEP-Gravitation) | Unit |
+| **pumpingStation** | Pumping station with hydraulic context | Unit |
+| **valve** | Individual valve modelling | Equipment |
+| **valveGroupControl** | Valve group coordination | Unit |
+| **reactor** | Biological reactor (ASM kinetics) | Unit |
+| **settler** | Secondary clarifier / sludge separation | Unit |
+| **monster** | Multi-parameter biological monitoring | Equipment |
+| **measurement** | Sensor signal conditioning | Control Module |
+| **diffuser** | Aeration control | Equipment |
+| **dashboardAPI** | InfluxDB telemetry + FlowFuse dashboards | — |
+| **generalFunctions** | Shared library (predict, PID, convert, etc.) | — |
 
-> *[Add a short explanation of the specific functional behaviour of this building block here]*
+## Architecture
 
----
+Each node follows a three-layer pattern:
+1. **Entry file** (`<name>.js`) — registration with Node-RED, admin endpoints
+2. **nodeClass** (`src/nodeClass.js`) — Node-RED adapter (tick loop, routing, status)
+3. **specificClass** (`src/specificClass.js`) — pure domain logic (physics, state machines)
 
-## License
+Three output ports per node: **Port 0** = process data, **Port 1** = InfluxDB telemetry, **Port 2** = registration/control.
 
-This software is licensed under the **Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0)** license.
+## Installation
 
-- Use, modification, and distribution are permitted for **non-commercial purposes**, provided clear attribution is given to Waterschap Brabantse Delta.
-- **Commercial use** requires prior permission.
-
-📧 Contact: [rdlab@brabantsedelta.nl](mailto:rdlab@brabantsedelta.nl)
-🔗 License: [https://creativecommons.org/licenses/by-nc/4.0/](https://creativecommons.org/licenses/by-nc/4.0/)
-
----
-
-## Generic structure of building blocks
-
-- React automatically to incoming data (e.g. the position of an object determines the calculation).
-- Support linking of complex data chains between processes.
-- Standardized input/output:
-  - Output = process data
-  - Storage information + relative positioning with respect to other objects
-- Designed to be combined with other building blocks (including third-party ones).
-- Open source and freely available to everyone.
-
----
-
-## Installation – All building blocks (via EVOLV)
-
-All building blocks of the R&D team are bundled in the **EVOLV repository**, which uses Git submodules.
-
-### First-time clone:
-
 ```bash
-git clone --recurse-submodules https://gitea.centraal.wbd-rd.nl/RnD/EVOLV.git
+git clone --recurse-submodules https://gitea.wbd-rd.nl/RnD/EVOLV.git
 cd EVOLV
+npm install
 ```
 
-Or, if you cloned without submodules:
+Updating submodules:
 
-```bash
-git submodule init
-git submodule update
-```
-
-### Updating submodules:
-
-To update all submodules to the latest version of their own repository:
-
 ```bash
 git submodule update --remote --merge
 ```
 
-Updating an individual submodule:
+Installing a single building block in Node-RED:
 
-```bash
-cd nodes/<building-block-name>
-git checkout main
-git pull origin main
-cd ../..
-git add nodes/<building-block-name>
-git commit -m "Update submodule <building-block-name>"
-```
-
----
-
-## Installation – Single building block
-
-1. Clone the desired repository:
-
-```bash
-git clone https://gitea.centraal.wbd-rd.nl/<repo-name>.git
-```
-
-2. Copy the building block into your Node-RED folder:
-
 ```bash
 mkdir -p ~/.node-red/nodes
-cp -r <path-to-cloned-folder> ~/.node-red/nodes/
+cp -r nodes/<building-block-name> ~/.node-red/nodes/
 ```
 
-3. Check that `settings.js` contains the following:
+## Testing
 
-```js
-nodesDir: './nodes',
-```
-
-4. Restart Node-RED:
-
 ```bash
-node-red-stop
-node-red-start
+# All nodes
+bash scripts/test-all.sh
+
+# Specific node
+node --test nodes/<nodeName>/test/basic/*.test.js
+node --test nodes/<nodeName>/test/integration/*.test.js
+node --test nodes/<nodeName>/test/edge/*.test.js
 ```
 
----
+## Documentation
 
-## Contributing (Fork & Pull Request)
+- **`wiki/`** — Project wiki with architecture, findings, and metrics ([index](wiki/index.md))
+- **`CLAUDE.md`** — Claude Code project guide
+- **`manuals/node-red/`** — FlowFuse and Node-RED reference documentation
+- **`.agents/`** — Agent skills, decisions, and function anchors
 
-Would you like to contribute to the R&D building blocks? Follow these steps:
+## License
 
-1. Create a fork
+**Creative Commons Attribution-NonCommercial 4.0 International (CC BY-NC 4.0)**
 
-   - Create a fork of the desired R&D repository in Gitea.
+Use, modification, and distribution are permitted for non-commercial purposes, with attribution to Waterschap Brabantse Delta. Commercial use requires prior permission.
 
-   - This gives you your own copy of the repository in your account.
-
-2. Make changes
-
-   - Clone your fork locally and create a new branch (e.g. feature/my-change).
-
-   - Make your changes, then commit and push the branch back to your fork.
-
-3. Open a pull request
-
-   - Go to your fork in Gitea and open the branch.
-
-   - Click New Pull Request.
-
-   - Set the R&D repository as the "merge into" target.
-
-   - Set your fork/branch as the "pull from" source.
-
-4. Add a description
-
-   - Give a clear title and description.
-
-   - Where applicable, reference an issue using the #<number> notation (e.g. #42).
-
-5. Code review and merge
-
-   - The maintainers of the R&D repository review your change.
-
-   - After approval, the change is merged into the R&D repository.
-
-----
-
 ## Contact
 
-📧 rdlab@brabantsedelta.nl
+rdlab@brabantsedelta.nl
File diff suppressed because it is too large
29	eslint.config.js	Normal file
@@ -0,0 +1,29 @@
const js = require('@eslint/js');
const globals = require('globals');

module.exports = [
  js.configs.recommended,
  {
    languageOptions: {
      ecmaVersion: 2022,
      sourceType: 'commonjs',
      globals: {
        ...globals.node,
        ...globals.jest,
        RED: 'readonly',
      },
    },
    rules: {
      'no-unused-vars': ['warn', { argsIgnorePattern: '^_', varsIgnorePattern: '^_' }],
      'no-console': 'off',
      'no-prototype-builtins': 'warn',
      'no-constant-condition': 'warn',
    },
  },
  {
    ignores: [
      'node_modules/**',
      'nodes/generalFunctions/src/coolprop-node/coolprop/**',
    ],
  },
];
42	examples/README.md	Normal file
@@ -0,0 +1,42 @@
# EVOLV — End-to-End Example Flows

Demo flows that show how multiple EVOLV nodes work together in a realistic wastewater-automation scenario. Each example is self-contained: its folder has a `flow.json` you can import directly into Node-RED plus a `README.md` that walks through the topology, control modes, and dashboard layout.

These flows complement the per-node example flows under `nodes/<name>/examples/` (which exercise a single node in isolation). Use the per-node flows for smoke tests during development; use the flows here when you want to see how a real plant section behaves end-to-end.

## Catalogue

| Folder | What it shows |
|---|---|
| [`pumpingstation-3pumps-dashboard/`](pumpingstation-3pumps-dashboard/) | Wet-well basin + machineGroupControl orchestrating 3 pumps (each with up/downstream pressure measurements), individual + auto control, process-demand input via dashboard slider or random generator, full FlowFuse dashboard. |

## How to import

1. Bring up the EVOLV stack: `docker compose up -d` from the superproject root.
2. Open Node-RED at `http://localhost:1880`.
3. Menu → **Import** → drop in the example's `flow.json` (or paste the contents).
4. Open the FlowFuse dashboard at `http://localhost:1880/dashboard`.

Each example uses a unique dashboard `path` so they can coexist in the same Node-RED runtime.

## Adding new examples

When you create a new end-to-end example:

1. Make a subfolder under `examples/` named `<scenario>-<focus>`.
2. Include `flow.json` (Node-RED export) and `README.md` (topology, control modes, dashboard map, things to try).
3. Test it on a fresh Dockerized Node-RED — clean import, no errors, dashboard loads.
4. Add a row to the catalogue table above.

## Wishlist for future examples

These are scenarios worth building when there's a session for it:

- **Pump failure + MGC re-routing** — kill pump 2 mid-run, watch MGC redistribute to pumps 1 and 3.
- **Energy-optimal vs equal-flow control** — same demand profile run through `optimalcontrol` and `prioritycontrol` modes side-by-side, energy comparison chart.
- **Schedule-driven demand** — diurnal flow pattern (low at night, peak at 7 am), MGC auto-tuning over 24 simulated hours.
- **Reactor + clarifier loop** — `reactor` upstream feeding `settler`, return sludge controlled by a small `pumpingStation`.
- **Diffuser + DO control** — aeration grid driven by a PID controller from a dissolved-oxygen sensor.
- **Digital sensor bundle** — MQTT-style sensor (BME280, ATAS, etc.) feeding a `measurement` node in digital mode + parent equipment node.
- **Maintenance window** — entermaintenance / exitmaintenance cycle with operator handover dashboard.
- **Calibration walk-through** — measurement node calibrate cycle with stable / unstable input demonstrations.
140	examples/pumpingstation-3pumps-dashboard/README.md	Normal file
@@ -0,0 +1,140 @@
# Pumping Station — 3 Pumps with Dashboard

A complete end-to-end EVOLV stack: a wet-well basin model, a `machineGroupControl` orchestrating three `rotatingMachine` pumps (each with upstream/downstream pressure measurements), process-demand input from either a dashboard slider or an auto random generator, individual + auto control modes, and a FlowFuse dashboard with status, gauges, and trend charts.

This is the canonical "make sure everything works together" demo for the platform. Use it after any cross-node refactor to confirm the architecture still hangs together end-to-end.

## Quick start

```bash
cd /mnt/d/gitea/EVOLV
docker compose up -d
# Wait for http://localhost:1880/nodes to return 200, then:
curl -s -X POST http://localhost:1880/flows \
  -H "Content-Type: application/json" \
  -H "Node-RED-Deployment-Type: full" \
  --data-binary @examples/pumpingstation-3pumps-dashboard/flow.json
```

Or open Node-RED at <http://localhost:1880>, **Import → drop the `flow.json`**, click **Deploy**.

Then open the dashboard:

- <http://localhost:1880/dashboard/pumping-station-demo>

## Tabs

The flow is split across four tabs by **concern**:

| Tab | Lives here | Why |
|---|---|---|
| 🏭 **Process Plant** | EVOLV nodes (3 pumps + MGC + PS + 6 measurements) and per-node output formatters | The "real plant" layer. Lift this tab into production unchanged. |
| 📊 **Dashboard UI** | All `ui-*` widgets, button/setpoint wrappers, trend-split functions | Display + operator inputs only. No business logic. |
| 🎛️ **Demo Drivers** | Random demand generator, random-toggle state | Demo-only stimulus. In production, delete this tab and feed `cmd:demand` from your real demand source. |
| ⚙️ **Setup & Init** | One-shot `once: true` injects (MGC scaling/mode, pumps mode, auto-startup, random-on) | Runs at deploy time only. Disable for production runtimes. |

Cross-tab wiring uses **named link-out / link-in pairs**, never direct cross-tab wires. The channel names form the contract:

| Channel | Direction | What it carries |
|---|---|---|
| `cmd:demand` | UI / drivers → process | numeric demand in m³/h |
| `cmd:randomToggle` | UI → drivers | `'on'` / `'off'` |
| `cmd:mode` | UI / setup → process | `'auto'` / `'virtualControl'` setMode broadcast |
| `cmd:station-startup` / `cmd:station-shutdown` / `cmd:station-estop` | UI / setup → process | station-wide command, fanned to all 3 pumps |
| `cmd:setpoint-A` / `-B` / `-C` | UI → process | per-pump setpoint slider value |
| `cmd:pump-A-seq` / `-B-seq` / `-C-seq` | UI → process | per-pump start/stop |
| `evt:pump-A` / `-B` / `-C` | process → UI | formatted per-pump status |
| `evt:mgc` | process → UI | MGC totals (flow / power / efficiency) |
| `evt:ps` | process → UI | basin state + level + volume + flows |
| `setup:to-mgc` | setup → process | MGC scaling/mode init |

See `.claude/rules/node-red-flow-layout.md` for the full layout rule set this demo follows.
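A single channel from the contract above can be expressed as the node pair Node-RED stores in `flow.json`. The ids, tab ids, and downstream wiring here are illustrative (the real flow is generated by `build_flow.py`); what matters is that each side's `links` array references the other's id:

```javascript
// One named link channel: a 'link out' on the UI tab paired with a
// 'link in' on the process tab. Ids below are hypothetical examples.
const channel = [
  { id: 'lo_demand', type: 'link out', z: 'tab_ui', name: 'cmd:demand',
    links: ['li_demand'] },
  { id: 'li_demand', type: 'link in', z: 'tab_process', name: 'cmd:demand',
    links: ['lo_demand'], wires: [['demand_router']] },
];

// Quick consistency check over any flows array: every link-out target id
// must exist, be a link-in, and point back at the link-out.
function linkPairsConsistent(flows) {
  const byId = Object.fromEntries(flows.map((n) => [n.id, n]));
  return flows
    .filter((n) => n.type === 'link out')
    .every((lo) => (lo.links || []).every(
      (t) => byId[t] &&
             byId[t].type === 'link in' &&
             (byId[t].links || []).includes(lo.id)
    ));
}

module.exports = { channel, linkPairsConsistent };
```

A check like this catches the "MISSING:<id>" dangling-link case that the allowlisted inspection commands earlier in this diff were probing for.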

## What the flow contains

| Layer | Node(s) | Role |
|---|---|---|
| Top | `pumpingStation` "Pumping Station" | Wet-well basin model. Tracks inflow (`q_in`), outflow (from machine-group child predictions), basin level/volume. PS is in `manual` control mode for the demo so it observes without taking control. |
| Mid | `machineGroupControl` "MGC — Pump Group" | Distributes Qd flow demand across the 3 pumps via `optimalcontrol` (BEP-driven). Scaling: `absolute` (Qd is in m³/h directly). |
| Low | `rotatingMachine` × 3 — Pump A / B / C | Hidrostal H05K-S03R curve. `auto` mode by default so MGC's `parent` commands are accepted. Manual setpoint slider overrides per-pump when each is in `virtualControl`. |
| Sensors | `measurement` × 6 | Per pump: upstream + downstream pressure (mbar). Simulator mode — each ticks a random-walk value continuously. Registered as children of their pump. |
| Demand | inject `demand_rand_tick` + function `demand_rand_fn` + `ui-slider` | Random generator (3 s tick, [40, 240] m³/h) AND a manual slider. Both feed a router that fans out to PS (`q_in` in m³/s) and MGC (`Qd` in m³/h). |
| Glue | `setMode` fanouts + station-wide buttons | Mode toggle broadcasts `setMode` to all 3 pumps. Station-wide Start / Stop / Emergency-Stop buttons fan out to all 3. |
| Dashboard | FlowFuse `ui-page` + 6 groups | Process Demand · Pumping Station · Pump A · Pump B · Pump C · Trends. |

## Dashboard map

The page (`/dashboard/pumping-station-demo`) is laid out top-to-bottom:

1. **Process Demand**
   - Slider 0–300 m³/h (`manualDemand` topic)
   - Random demand toggle (auto cycles every 3 s)
   - Live "current demand" text
2. **Pumping Station**
   - Auto/Manual mode toggle (drives all pumps' `setMode` simultaneously)
   - Station-wide buttons: Start all · Stop all · Emergency stop
   - Basin state, level (m), volume (m³), inflow / pumped-out flow (m³/h)
3. **Pump A / B / C** (one group each)
   - Setpoint slider 0–100 % (only effective when that pump is in `virtualControl`)
   - Per-pump Startup + Shutdown buttons
   - Live state, mode, controller %, flow, power, upstream/downstream pressure
4. **Trends**
   - Flow per pump chart (m³/h)
   - Power per pump chart (kW)

## Control model

- **AUTO** — the default. `setMode auto` → MGC's `optimalcontrol` decides which pumps run and at what flow. Operator drives only the **Process Demand** slider (or leaves the random generator on); the per-pump setpoint sliders are ignored.
- **MANUAL** — flip the Auto/Manual switch. All 3 pumps go to `virtualControl`. MGC commands are now ignored. Per-pump setpoint sliders / Start / Stop are the only inputs that affect the pumps.

The Emergency Stop button always works regardless of mode and uses the new interruptible-movement path, so it stops a pump mid-ramp.

## Notable design choices

- **PS is in `manual` control mode** (`controlMode: "manual"`). The default `levelbased` mode would auto-shut all pumps as soon as basin level dips below `stopLevel` (1 m default), which masks the demo. Manual = observation only.
- **PS safety guards (dry-run / overfill) disabled.** With no real inflow the basin will frequently look "empty" — that's expected for a demo, not a fault. In production you'd configure a real `q_in` source and leave safeties on.
- **MGC scaling = `absolute`, mode = `optimalcontrol`.** Set via inject at deploy. Demand in m³/h, BEP-driven distribution.
- **demand_router gates Qd ≤ 0.** A demand of 0 would shut every running pump (via MGC.turnOffAllMachines). Use the explicit Stop All button to actually take pumps down.
- **Auto-startup on deploy.** All three pumps fire `execSequence startup` 4 s after deploy so the dashboard shows activity immediately.
- **Auto-enable random demand** 5 s after deploy so the trends fill in without operator action.
- **Verbose logging is OFF.** All EVOLV nodes are at `warn`. Crank the per-node `logLevel` to `info` or `debug` if you're diagnosing a flow.

## Things to try

- Drag the **Process Demand slider** with random off — watch MGC distribute that target across pumps and the basin start filling/draining accordingly.
- Flip to **Manual** mode and use the per-pump setpoint sliders — note that MGC stops driving them.
- Hit **Emergency Stop** while a pump is ramping — confirms the interruptible-movement fix shipped in `rotatingMachine` v1.0.3.
- Watch the **Trends** chart over a few minutes — flow distribution shifts as MGC re-balances around the BEP.

## Verification (last green run, 2026-04-13)

Deployed via `POST /flows` to a Dockerized Node-RED, observed for ~15 s after auto-startup:

- Both measurement nodes on each of the 3 pumps tick (6 total): pressure values stream every second.
- Each pump reaches `operational` ~5 s after the auto-startup inject (3 s starting + 1 s warmup + 1 s for setpoint=0 settle).
- MGC reports `3 machine(s) connected` with mode `optimalcontrol`.
- Pumping Station shows non-zero basin volume + tracks net flow direction (⬆ / ⬇ / ⏸).
- Random demand cycles between ~40 and ~240 m³/h every 3 s.
- Per-pump status text + trend chart update on every tick.

## Regenerating `flow.json`

`flow.json` is generated from `build_flow.py`. Edit the Python (cleaner diff) and regenerate:

```bash
cd examples/pumpingstation-3pumps-dashboard
python3 build_flow.py > flow.json
```

`build_flow.py` is the source of truth — keep it in sync if you tweak the demo.

## Wishlist (not in this demo, build separately)

- **Pump failure + MGC re-routing** — kill pump 2 mid-run, watch MGC redistribute. Would demonstrate fault-tolerance.
- **Energy-optimal vs equal-flow control** — same demand profile run through `optimalcontrol` and `prioritycontrol` modes side-by-side, energy comparison chart.
- **Schedule-driven demand** — diurnal flow pattern (low at night, peak at 7 am), MGC auto-tuning over 24 simulated hours.
- **PS with real `q_in` source + safeties on** — show the basin auto-shut behaviour as a feature, not a bug.
- **Real flow sensor per pump** (vs. relying on rotatingMachine's predicted flow) — would let the demo also show measurement-vs-prediction drift indicators.
- **Reactor or settler downstream** — close the loop on a real wastewater scenario.

See the parent `examples/README.md` for the full follow-up catalogue.
1368	examples/pumpingstation-3pumps-dashboard/build_flow.py	Normal file
File diff suppressed because it is too large
4149	examples/pumpingstation-3pumps-dashboard/flow.json	Normal file
File diff suppressed because it is too large
19	jest.config.js	Normal file
@@ -0,0 +1,19 @@
module.exports = {
  testEnvironment: 'node',
  verbose: true,
  testMatch: [
    '<rootDir>/nodes/generalFunctions/src/coolprop-node/test/**/*.test.js',
    '<rootDir>/nodes/generalFunctions/test/**/*.test.js',
    '<rootDir>/nodes/dashboardAPI/test/**/*.test.js',
    '<rootDir>/nodes/diffuser/test/specificClass.test.js',
    '<rootDir>/nodes/monster/test/**/*.test.js',
    '<rootDir>/nodes/pumpingStation/test/**/*.test.js',
    '<rootDir>/nodes/reactor/test/**/*.test.js',
    '<rootDir>/nodes/settler/test/**/*.test.js',
    '<rootDir>/nodes/measurement/test/**/*.test.js',
  ],
  testPathIgnorePatterns: [
    '/node_modules/',
  ],
  testTimeout: 15000,
};
Submodule nodes/dashboardAPI updated: 547333be7d...869ba4fca5
Submodule nodes/diffuser updated: c4dda5955f...7fbd207985
Submodule nodes/generalFunctions updated: c60aa40666...693517cc8f
Submodule nodes/machineGroupControl updated: f8012c8bad...7eafd89f4e
Submodule nodes/measurement updated: c587ed9c7b...998b2002e9
Submodule nodes/monster updated: 38013a86db...5a43f90569
Submodule nodes/pumpingStation updated: 7efd3b0a07...5e2ebe4d96
Submodule nodes/reactor updated: 460b872053...c5fc5c1b59
Submodule nodes/rotatingMachine updated: 33f3c2ef61...11d196f363
Submodule nodes/settler updated: 7f2d326612...b199663c77
Submodule nodes/valve updated: d56f8a382c...ae5bc750cd
Submodule nodes/valveGroupControl updated: cbe868a148...0aa538c2c1
package-lock.json (generated, 5883 lines; diff suppressed because it is too large)
package.json (26 lines changed)

@@ -12,18 +12,21 @@
     "node-red": {
         "nodes": {
             "dashboardapi": "nodes/dashboardAPI/dashboardapi.js",
+            "diffuser": "nodes/diffuser/diffuser.js",
             "machineGroupControl": "nodes/machineGroupControl/mgc.js",
             "measurement": "nodes/measurement/measurement.js",
             "monster": "nodes/monster/monster.js",
+            "pumpingstation": "nodes/pumpingStation/pumpingStation.js",
             "reactor": "nodes/reactor/reactor.js",
             "rotatingMachine": "nodes/rotatingMachine/rotatingMachine.js",
+            "settler": "nodes/settler/settler.js",
             "valve": "nodes/valve/valve.js",
-            "valveGroupControl": "nodes/valveGroupControl/vgc.js",
-            "pumpingstation": "nodes/pumpingStation/pumpingStation.js",
-            "settler": "nodes/settler/settler.js"
+            "valveGroupControl": "nodes/valveGroupControl/vgc.js"
         }
     },
     "scripts": {
+        "preinstall": "node scripts/patch-deps.js",
+        "postinstall": "git checkout -- package.json 2>/dev/null || true",
         "docker:build": "docker compose build",
         "docker:up": "docker compose up -d",
         "docker:down": "docker compose down",
@@ -36,7 +39,16 @@
         "docker:test:gf": "docker compose exec nodered sh /data/evolv/scripts/test-all.sh gf",
         "docker:validate": "docker compose exec nodered sh /data/evolv/scripts/validate-nodes.sh",
         "docker:deploy": "docker compose exec nodered sh /data/evolv/scripts/deploy-flow.sh",
-        "docker:reset": "docker compose down -v && docker compose up -d --build"
+        "docker:reset": "docker compose down -v && docker compose up -d --build",
+        "test": "jest --forceExit",
+        "test:node": "node --test nodes/valve/test/basic/*.test.js nodes/valve/test/edge/*.test.js nodes/valve/test/integration/*.test.js nodes/valveGroupControl/test/basic/*.test.js nodes/valveGroupControl/test/edge/*.test.js nodes/valveGroupControl/test/integration/*.test.js",
+        "test:legacy": "node nodes/machineGroupControl/src/groupcontrol.test.js && node nodes/generalFunctions/src/nrmse/errorMetric.test.js",
+        "test:all": "npm test && npm run test:node && npm run test:legacy",
+        "test:e2e:reactor": "node scripts/e2e-reactor-roundtrip.js",
+        "lint": "eslint nodes/",
+        "lint:fix": "eslint nodes/ --fix",
+        "ci": "npm run lint && npm run test:all",
+        "test:e2e": "bash test/e2e/run-e2e.sh"
     },
     "author": "Rene De Ren, Pim Moerman, Janneke Tack, Sjoerd Fijnje, Dieke Gabriels, pieter van der wilt",
     "license": "SEE LICENSE",
@@ -46,5 +58,11 @@
         "@tensorflow/tfjs-node": "^4.22.0",
         "generalFunctions": "file:nodes/generalFunctions",
         "mathjs": "^13.2.0"
+    },
+    "devDependencies": {
+        "@eslint/js": "^8.57.0",
+        "eslint": "^8.57.0",
+        "globals": "^15.0.0",
+        "jest": "^29.7.0"
     }
 }
Deleted file (114 lines):

@@ -1,114 +0,0 @@
#!/usr/bin/env node
/**
 * Add monitoring/debug nodes to the demo flow for process visibility.
 * Adds a function node per PS that logs volume, level, flow rate every 10 ticks.
 * Also adds a status debug node for the overall system.
 */

const fs = require('fs');
const path = require('path');

const flowPath = path.join(__dirname, '..', 'docker', 'demo-flow.json');
const flow = JSON.parse(fs.readFileSync(flowPath, 'utf8'));

// Remove existing monitoring nodes
const monitorIds = flow.filter(n => n.id && n.id.startsWith('demo_mon_')).map(n => n.id);
if (monitorIds.length > 0) {
  console.log('Removing existing monitoring nodes:', monitorIds);
  for (const id of monitorIds) {
    const idx = flow.findIndex(n => n.id === id);
    if (idx !== -1) flow.splice(idx, 1);
  }
  // Also remove from wires
  flow.forEach(n => {
    if (n.wires) {
      n.wires = n.wires.map(portWires =>
        Array.isArray(portWires) ? portWires.filter(w => !monitorIds.includes(w)) : portWires
      );
    }
  });
}

// Add monitoring function nodes for each PS
const monitors = [
  {
    id: 'demo_mon_west',
    name: 'Monitor PS West',
    ps: 'demo_ps_west',
    x: 800, y: 50,
  },
  {
    id: 'demo_mon_north',
    name: 'Monitor PS North',
    ps: 'demo_ps_north',
    x: 800, y: 100,
  },
  {
    id: 'demo_mon_south',
    name: 'Monitor PS South',
    ps: 'demo_ps_south',
    x: 800, y: 150,
  },
];

// Each PS sends process data on port 0. Wire monitoring nodes to PS port 0.
monitors.forEach(mon => {
  // Function node that extracts key metrics and logs them periodically
  const fnNode = {
    id: mon.id,
    type: 'function',
    z: 'demo_tab_wwtp',
    name: mon.name,
    func: `// Extract key metrics from PS process output
const p = msg.payload || {};

// Keys have .default suffix in PS output format
const vol = p["volume.predicted.atequipment.default"];
const level = p["level.predicted.atequipment.default"];
const netFlow = p["netFlowRate.predicted.atequipment.default"];
const volPct = p["volumePercent.predicted.atequipment.default"];

// Only log when we have volume data
if (vol !== null && vol !== undefined) {
  const ctx = context.get("tickCount") || 0;
  context.set("tickCount", ctx + 1);

  // Log every 10 ticks
  if (ctx % 10 === 0) {
    const fmt = (v, dec) => typeof v === "number" ? v.toFixed(dec) : String(v);
    const parts = ["vol=" + fmt(vol, 1) + "m3"];
    if (level !== null && level !== undefined) parts.push("lvl=" + fmt(level, 3) + "m");
    if (volPct !== null && volPct !== undefined) parts.push("fill=" + fmt(volPct, 1) + "%");
    if (netFlow !== null && netFlow !== undefined) parts.push("net=" + fmt(netFlow, 1) + "m3/h");

    node.warn(parts.join(" | "));
  }
}

return msg;`,
    outputs: 1,
    timeout: '',
    noerr: 0,
    initialize: '',
    finalize: '',
    libs: [],
    x: mon.x,
    y: mon.y,
    wires: [[]],
  };

  flow.push(fnNode);

  // Wire PS port 0 to this monitor (append to existing wires)
  const psNode = flow.find(n => n.id === mon.ps);
  if (psNode && psNode.wires && psNode.wires[0]) {
    if (!psNode.wires[0].includes(mon.id)) {
      psNode.wires[0].push(mon.id);
    }
  }

  console.log(`Added ${mon.id}: ${mon.name} → wired to ${mon.ps} port 0`);
});

fs.writeFileSync(flowPath, JSON.stringify(flow, null, 2) + '\n');
console.log(`\nDone. ${monitors.length} monitoring nodes added.`);
Deleted file (138 lines):

@@ -1,138 +0,0 @@
#!/usr/bin/env node
/**
 * Comprehensive runtime analysis of the WWTP demo flow.
 * Captures process debug output, pumping station state, measurements,
 * and analyzes filling/draining behavior over time.
 */

const http = require('http');

const NR_URL = 'http://localhost:1880';

function fetchJSON(url) {
  return new Promise((resolve, reject) => {
    http.get(url, res => {
      const chunks = [];
      res.on('data', c => chunks.push(c));
      res.on('end', () => {
        try { resolve(JSON.parse(Buffer.concat(chunks))); }
        catch (e) { reject(new Error('Parse error from ' + url + ': ' + e.message)); }
      });
    }).on('error', reject);
  });
}

// Inject a debug-capture subflow to intercept process messages
async function injectDebugCapture() {
  const flows = await fetchJSON(NR_URL + '/flows');

  // Find all nodes on WWTP tab
  const wwtp = flows.filter(n => n.z === 'demo_tab_wwtp');

  console.log('=== WWTP Node Inventory ===');
  const byType = {};
  wwtp.forEach(n => {
    if (!byType[n.type]) byType[n.type] = [];
    byType[n.type].push(n);
  });

  Object.entries(byType).sort().forEach(([type, nodes]) => {
    console.log(type + ' (' + nodes.length + '):');
    nodes.forEach(n => {
      const extra = [];
      if (n.simulator) extra.push('sim=ON');
      if (n.model) extra.push('model=' + n.model);
      if (n.basinVolume) extra.push('basin=' + n.basinVolume + 'm3');
      if (n.basinHeight) extra.push('h=' + n.basinHeight + 'm');
      if (n.positionVsParent) extra.push('pos=' + n.positionVsParent);
      if (n.control) extra.push('ctrl=' + JSON.stringify(n.control));
      console.log('  ' + n.id + ' "' + (n.name || '') + '" ' + (extra.length ? '[' + extra.join(', ') + ']' : ''));
    });
  });

  // Analyze pumping station configurations
  console.log('\n=== Pumping Station Configs ===');
  const pss = wwtp.filter(n => n.type === 'pumpingStation');
  pss.forEach(ps => {
    console.log('\n' + ps.id + ' "' + ps.name + '"');
    console.log('  Basin: vol=' + ps.basinVolume + 'm3, h=' + ps.basinHeight + 'm');
    console.log('  Inlet: h=' + ps.heightInlet + 'm, Outlet: h=' + ps.heightOutlet + 'm');
    console.log('  Simulator: ' + ps.simulator);
    console.log('  Control mode: ' + (ps.controlMode || 'not set'));

    // Check q_in inject wiring
    const qinInject = wwtp.find(n => n.id === 'demo_inj_' + ps.id.replace('demo_ps_', '') + '_flow');
    if (qinInject) {
      console.log('  q_in inject: repeat=' + qinInject.repeat + 's, wired to ' + JSON.stringify(qinInject.wires));
    }

    // Check what's wired to this PS (port 2 = parent registration)
    const children = wwtp.filter(n => {
      if (!n.wires) return false;
      return n.wires.some(portWires => portWires && portWires.includes(ps.id));
    });
    console.log('  Children wired to it: ' + children.map(c => c.id + '(' + c.type + ')').join(', '));
  });

  // Analyze inject timers
  console.log('\n=== Active Inject Timers ===');
  const injects = wwtp.filter(n => n.type === 'inject');
  injects.forEach(inj => {
    const targets = (inj.wires || []).flat();
    console.log(inj.id + ' "' + (inj.name || '') + '"');
    console.log('  topic=' + inj.topic + ' payload=' + inj.payload);
    console.log('  once=' + inj.once + ' repeat=' + (inj.repeat || 'none'));
    console.log('  → ' + targets.join(', '));
  });

  // Analyze q_in function nodes
  console.log('\n=== q_in Flow Simulation Functions ===');
  const fnNodes = wwtp.filter(n => n.type === 'function' && n.name && n.name.includes('Flow'));
  fnNodes.forEach(fn => {
    console.log(fn.id + ' "' + fn.name + '"');
    console.log('  func: ' + (fn.func || '').substring(0, 200));
    const targets = (fn.wires || []).flat();
    console.log('  → ' + targets.join(', '));
  });

  // Analyze measurement nodes
  console.log('\n=== Measurement Nodes ===');
  const meas = wwtp.filter(n => n.type === 'measurement');
  meas.forEach(m => {
    console.log(m.id + ' "' + (m.name || '') + '"');
    console.log('  type=' + m.assetType + ' sim=' + m.simulator + ' range=[' + m.o_min + ',' + m.o_max + '] unit=' + m.unit);
    console.log('  pos=' + (m.positionVsParent || 'none'));
    // Check port 2 wiring (parent registration)
    const port2 = m.wires && m.wires[2] ? m.wires[2] : [];
    console.log('  port2→ ' + (port2.length ? port2.join(', ') : 'none'));
  });

  // Analyze rotating machines
  console.log('\n=== Rotating Machine Nodes ===');
  const machines = wwtp.filter(n => n.type === 'rotatingMachine');
  machines.forEach(m => {
    console.log(m.id + ' "' + (m.name || '') + '"');
    console.log('  model=' + m.model + ' mode=' + m.movementMode);
    console.log('  pos=' + m.positionVsParent + ' supplier=' + m.supplier);
    console.log('  speed=' + m.speed + ' startup=' + m.startup + ' shutdown=' + m.shutdown);
    const port2 = m.wires && m.wires[2] ? m.wires[2] : [];
    console.log('  port2→ ' + (port2.length ? port2.join(', ') : 'none'));
  });

  // Check wiring integrity
  console.log('\n=== Wiring Analysis ===');
  pss.forEach(ps => {
    const psPort0 = ps.wires && ps.wires[0] ? ps.wires[0] : [];
    const psPort1 = ps.wires && ps.wires[1] ? ps.wires[1] : [];
    const psPort2 = ps.wires && ps.wires[2] ? ps.wires[2] : [];
    console.log(ps.id + ' wiring:');
    console.log('  port0 (process): ' + psPort0.join(', '));
    console.log('  port1 (influx): ' + psPort1.join(', '));
    console.log('  port2 (parent): ' + psPort2.join(', '));
  });
}

injectDebugCapture().catch(err => {
  console.error('Analysis failed:', err);
  process.exit(1);
});
Deleted file (145 lines):

@@ -1,145 +0,0 @@
#!/usr/bin/env node
/**
 * Capture live process data from Node-RED WebSocket debug sidebar.
 * Collects samples over a time window and analyzes trends.
 */

const http = require('http');

const NR_URL = 'http://localhost:1880';
const CAPTURE_SECONDS = 30;

// Alternative: poll the Node-RED comms endpoint
// But let's use a simpler approach - inject a temporary catch-all debug and read context

async function fetchJSON(url) {
  return new Promise((resolve, reject) => {
    http.get(url, res => {
      const chunks = [];
      res.on('data', c => chunks.push(c));
      res.on('end', () => {
        try { resolve(JSON.parse(Buffer.concat(chunks))); }
        catch (e) { reject(new Error('Parse: ' + e.message)); }
      });
    }).on('error', reject);
  });
}

async function postJSON(url, data) {
  return new Promise((resolve, reject) => {
    const body = JSON.stringify(data);
    const parsed = new URL(url);
    const req = http.request({
      hostname: parsed.hostname,
      port: parsed.port,
      path: parsed.pathname,
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'Content-Length': Buffer.byteLength(body),
      },
    }, res => {
      const chunks = [];
      res.on('data', c => chunks.push(c));
      res.on('end', () => {
        const text = Buffer.concat(chunks).toString();
        try { resolve(JSON.parse(text)); } catch { resolve(text); }
      });
    });
    req.on('error', reject);
    req.write(body);
    req.end();
  });
}

(async () => {
  console.log('=== Capturing Process Data (' + CAPTURE_SECONDS + 's) ===\n');

  // Use Node-RED inject API to trigger debug output
  // Instead, let's read node context which stores the current state

  // Get flows to find node IDs
  const flows = await fetchJSON(NR_URL + '/flows');
  const wwtp = flows.filter(n => n.z === 'demo_tab_wwtp');

  // Pumping stations store state in node context
  const pss = wwtp.filter(n => n.type === 'pumpingStation');
  const pumps = wwtp.filter(n => n.type === 'rotatingMachine');

  const samples = [];
  const startTime = Date.now();

  console.log('Sampling every 3 seconds for ' + CAPTURE_SECONDS + 's...\n');

  for (let i = 0; i < Math.ceil(CAPTURE_SECONDS / 3); i++) {
    const t = Date.now();
    const elapsed = ((t - startTime) / 1000).toFixed(1);

    // Read PS context data via Node-RED context API
    const sample = { t: elapsed, stations: {} };

    for (const ps of pss) {
      try {
        const ctx = await fetchJSON(NR_URL + '/context/node/' + ps.id + '?store=default');
        sample.stations[ps.id] = ctx;
      } catch (e) {
        sample.stations[ps.id] = { error: e.message };
      }
    }

    for (const pump of pumps) {
      try {
        const ctx = await fetchJSON(NR_URL + '/context/node/' + pump.id + '?store=default');
        sample.stations[pump.id] = ctx;
      } catch (e) {
        sample.stations[pump.id] = { error: e.message };
      }
    }

    samples.push(sample);

    // Print summary for this sample
    console.log('--- Sample at t=' + elapsed + 's ---');
    for (const ps of pss) {
      const ctx = sample.stations[ps.id];
      if (ctx && ctx.data) {
        console.log(ps.name + ':');
        // Print all context keys
        Object.entries(ctx.data).forEach(([key, val]) => {
          if (typeof val === 'object') {
            console.log('  ' + key + ': ' + JSON.stringify(val).substring(0, 200));
          } else {
            console.log('  ' + key + ': ' + val);
          }
        });
      } else {
        console.log(ps.name + ': ' + JSON.stringify(ctx).substring(0, 200));
      }
    }

    for (const pump of pumps) {
      const ctx = sample.stations[pump.id];
      if (ctx && ctx.data && Object.keys(ctx.data).length > 0) {
        console.log(pump.name + ':');
        Object.entries(ctx.data).forEach(([key, val]) => {
          if (typeof val === 'object') {
            console.log('  ' + key + ': ' + JSON.stringify(val).substring(0, 200));
          } else {
            console.log('  ' + key + ': ' + val);
          }
        });
      }
    }
    console.log('');

    if (i < Math.ceil(CAPTURE_SECONDS / 3) - 1) {
      await new Promise(r => setTimeout(r, 3000));
    }
  }

  console.log('\n=== Summary ===');
  console.log('Collected ' + samples.length + ' samples over ' + CAPTURE_SECONDS + 's');
})().catch(err => {
  console.error('Capture failed:', err);
  process.exit(1);
});
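The capture script above prints samples but leaves the promised trend analysis to the reader. A minimal classifier over sampled basin volumes might look like this — a sketch only; `volumeTrend` and its threshold are assumptions, not code from the repo:

```javascript
// Sketch: classify fill/drain behaviour from a series of sampled basin
// volumes (m3). `volumeTrend` is hypothetical, not part of the repo.
function volumeTrend(volumes, eps = 0.5) {
  if (volumes.length < 2) return 'unknown';
  // Compare last sample against first; eps guards against sensor noise
  const delta = volumes[volumes.length - 1] - volumes[0];
  if (delta > eps) return 'filling';
  if (delta < -eps) return 'draining';
  return 'steady';
}

console.log(volumeTrend([10.0, 11.2, 12.5, 13.9])); // filling
console.log(volumeTrend([13.9, 12.1, 10.4]));       // draining
console.log(volumeTrend([12.0, 12.1, 11.9]));       // steady
```

The `eps` threshold (in m³) is an assumed noise band; a real version would likely fit a slope over all samples rather than compare endpoints.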
Deleted file (109 lines):

@@ -1,109 +0,0 @@
#!/usr/bin/env node
/**
 * Verify asset selection fields are correct in deployed flow.
 * Checks that supplier/assetType/model/unit values match asset data IDs
 * so the editor dropdowns will pre-select correctly.
 */

const http = require('http');

const NR_URL = 'http://localhost:1880';

async function fetchJSON(url) {
  return new Promise((resolve, reject) => {
    http.get(url, res => {
      const chunks = [];
      res.on('data', c => chunks.push(c));
      res.on('end', () => {
        try { resolve(JSON.parse(Buffer.concat(chunks))); }
        catch (e) { reject(new Error(`Parse error: ${e.message}`)); }
      });
    }).on('error', reject);
  });
}

(async () => {
  const flows = await fetchJSON(`${NR_URL}/flows`);
  const errors = [];

  console.log('=== Pump Asset Selection Checks ===');
  const pumps = flows.filter(n => n.type === 'rotatingMachine' && n.z === 'demo_tab_wwtp');
  pumps.forEach(p => {
    const checks = [
      { field: 'supplier', expected: 'hidrostal', actual: p.supplier },
      { field: 'assetType', expected: 'pump-centrifugal', actual: p.assetType },
      { field: 'category', expected: 'machine', actual: p.category },
    ];
    checks.forEach(c => {
      if (c.actual === c.expected) {
        console.log(`  PASS: ${p.id} ${c.field} = "${c.actual}"`);
      } else {
        console.log(`  FAIL: ${p.id} ${c.field} = "${c.actual}" (expected "${c.expected}")`);
        errors.push(`${p.id}.${c.field}`);
      }
    });
    // Model should be one of the known models
    const validModels = ['hidrostal-H05K-S03R', 'hidrostal-C5-D03R-SHN1'];
    if (validModels.includes(p.model)) {
      console.log(`  PASS: ${p.id} model = "${p.model}"`);
    } else {
      console.log(`  FAIL: ${p.id} model = "${p.model}" (expected one of ${validModels})`);
      errors.push(`${p.id}.model`);
    }
  });

  console.log('\n=== Measurement Asset Selection Checks ===');
  const measurements = flows.filter(n => n.type === 'measurement' && n.z === 'demo_tab_wwtp');

  // Valid supplier→type→model combinations from measurement.json
  const validSuppliers = {
    'Endress+Hauser': {
      types: ['flow', 'pressure', 'level'],
      models: { flow: ['Promag-W400', 'Promag-W300'], pressure: ['Cerabar-PMC51', 'Cerabar-PMC71'], level: ['Levelflex-FMP50'] }
    },
    'Hach': {
      types: ['dissolved-oxygen', 'ammonium', 'nitrate', 'tss'],
      models: { 'dissolved-oxygen': ['LDO2'], ammonium: ['Amtax-sc'], nitrate: ['Nitratax-sc'], tss: ['Solitax-sc'] }
    },
    'vega': {
      types: ['temperature', 'pressure', 'flow', 'level', 'oxygen'],
      models: {} // not checking Vega models for now
    }
  };

  measurements.forEach(m => {
    const supplierData = validSuppliers[m.supplier];
    if (!supplierData) {
      console.log(`  FAIL: ${m.id} supplier "${m.supplier}" not in asset data`);
      errors.push(`${m.id}.supplier`);
      return;
    }
    console.log(`  PASS: ${m.id} supplier = "${m.supplier}"`);

    if (!supplierData.types.includes(m.assetType)) {
      console.log(`  FAIL: ${m.id} assetType "${m.assetType}" not in ${m.supplier} types`);
      errors.push(`${m.id}.assetType`);
    } else {
      console.log(`  PASS: ${m.id} assetType = "${m.assetType}"`);
    }

    const validModels = supplierData.models[m.assetType] || [];
    if (validModels.length > 0 && !validModels.includes(m.model)) {
      console.log(`  FAIL: ${m.id} model "${m.model}" not in ${m.supplier}/${m.assetType} models`);
      errors.push(`${m.id}.model`);
    } else if (m.model) {
      console.log(`  PASS: ${m.id} model = "${m.model}"`);
    }
  });

  console.log('\n=== RESULT ===');
  if (errors.length === 0) {
    console.log('ALL ASSET SELECTION CHECKS PASSED');
  } else {
    console.log(`${errors.length} FAILURE(S):`, errors.join(', '));
    process.exit(1);
  }
})().catch(err => {
  console.error('Check failed:', err.message);
  process.exit(1);
});
Deleted file (142 lines):

@@ -1,142 +0,0 @@
#!/usr/bin/env node
/**
 * Check the deployed Node-RED flow for correctness after changes.
 */
const http = require('http');

function fetch(url) {
  return new Promise((resolve, reject) => {
    http.get(url, res => {
      const chunks = [];
      res.on('data', c => chunks.push(c));
      res.on('end', () => resolve(JSON.parse(Buffer.concat(chunks))));
    }).on('error', reject);
  });
}

(async () => {
  let errors = 0;

  // 1. Check deployed flow structure
  console.log('=== Checking deployed flow structure ===');
  const flow = await fetch('http://localhost:1880/flows');
  console.log('Total deployed nodes:', flow.length);

  // Check MGC exists
  const mgc = flow.find(n => n.id === 'demo_mgc_west');
  if (mgc) {
    console.log('PASS: MGC West exists, position:', mgc.positionVsParent);
  } else {
    console.log('FAIL: MGC West missing from deployed flow');
    errors++;
  }

  // Check reactor speedUpFactor
  const reactor = flow.find(n => n.id === 'demo_reactor');
  if (reactor && reactor.speedUpFactor === 1) {
    console.log('PASS: Reactor speedUpFactor = 1');
  } else {
    console.log('FAIL: Reactor speedUpFactor =', reactor?.speedUpFactor);
    errors++;
  }

  // Check sim mode on measurements
  const simMeasIds = [
    'demo_meas_flow', 'demo_meas_do', 'demo_meas_nh4',
    'demo_meas_ft_n1', 'demo_meas_eff_flow', 'demo_meas_eff_do',
    'demo_meas_eff_nh4', 'demo_meas_eff_no3', 'demo_meas_eff_tss'
  ];
  let simOk = 0;
  simMeasIds.forEach(id => {
    const n = flow.find(x => x.id === id);
    if (n && n.simulator === true) simOk++;
    else { console.log('FAIL: simulator not true on', id); errors++; }
  });
  console.log(`PASS: ${simOk}/9 measurement nodes have simulator=true`);

  // Check pressure nodes exist
  const ptIds = ['demo_meas_pt_w_up','demo_meas_pt_w_down','demo_meas_pt_n_up','demo_meas_pt_n_down','demo_meas_pt_s_up','demo_meas_pt_s_down'];
  let ptOk = 0;
  ptIds.forEach(id => {
    const n = flow.find(x => x.id === id);
    if (n && n.type === 'measurement') ptOk++;
    else { console.log('FAIL: pressure node missing:', id); errors++; }
  });
  console.log(`PASS: ${ptOk}/6 pressure measurement nodes present`);

  // Check removed nodes are gone
  const removedIds = [
    'demo_inj_meas_flow', 'demo_fn_sim_flow', 'demo_inj_meas_do', 'demo_fn_sim_do',
    'demo_inj_meas_nh4', 'demo_fn_sim_nh4', 'demo_inj_ft_n1', 'demo_fn_sim_ft_n1',
    'demo_inj_eff_flow', 'demo_fn_sim_eff_flow', 'demo_inj_eff_do', 'demo_fn_sim_eff_do',
    'demo_inj_eff_nh4', 'demo_fn_sim_eff_nh4', 'demo_inj_eff_no3', 'demo_fn_sim_eff_no3',
    'demo_inj_eff_tss', 'demo_fn_sim_eff_tss',
    'demo_inj_w1_startup', 'demo_inj_w1_setpoint', 'demo_inj_w2_startup', 'demo_inj_w2_setpoint',
    'demo_inj_n1_startup', 'demo_inj_s1_startup'
  ];
  const stillPresent = removedIds.filter(id => flow.find(x => x.id === id));
  if (stillPresent.length === 0) {
    console.log('PASS: All 24 removed nodes are gone');
  } else {
    console.log('FAIL: These removed nodes are still present:', stillPresent);
    errors++;
  }

  // Check kept nodes still exist
  const keptIds = [
    'demo_inj_west_flow', 'demo_fn_west_flow_sim',
    'demo_inj_north_flow', 'demo_fn_north_flow_sim',
    'demo_inj_south_flow', 'demo_fn_south_flow_sim',
    'demo_inj_w1_mode', 'demo_inj_w2_mode', 'demo_inj_n1_mode', 'demo_inj_s1_mode',
    'demo_inj_west_mode', 'demo_inj_north_mode', 'demo_inj_south_mode'
  ];
  const keptMissing = keptIds.filter(id => !flow.find(x => x.id === id));
  if (keptMissing.length === 0) {
    console.log('PASS: All kept nodes still present');
  } else {
    console.log('FAIL: These nodes should exist but are missing:', keptMissing);
    errors++;
  }

  // Check wiring: W1/W2 register to MGC, MGC registers to PS West
  const w1 = flow.find(n => n.id === 'demo_pump_w1');
  const w2 = flow.find(n => n.id === 'demo_pump_w2');
  if (w1 && w1.wires[2] && w1.wires[2].includes('demo_mgc_west')) {
    console.log('PASS: W1 port 2 wired to MGC');
  } else {
    console.log('FAIL: W1 port 2 not wired to MGC, got:', w1?.wires?.[2]);
    errors++;
  }
  if (w2 && w2.wires[2] && w2.wires[2].includes('demo_mgc_west')) {
    console.log('PASS: W2 port 2 wired to MGC');
  } else {
    console.log('FAIL: W2 port 2 not wired to MGC, got:', w2?.wires?.[2]);
    errors++;
  }
  if (mgc && mgc.wires[2] && mgc.wires[2].includes('demo_ps_west')) {
    console.log('PASS: MGC port 2 wired to PS West');
  } else {
    console.log('FAIL: MGC port 2 not wired to PS West');
    errors++;
  }

  // Check PS outputs wire to level-to-pressure functions
  const psWest = flow.find(n => n.id === 'demo_ps_west');
  if (psWest && psWest.wires[0] && psWest.wires[0].includes('demo_fn_level_to_pressure_w')) {
    console.log('PASS: PS West port 0 wired to level-to-pressure function');
  } else {
    console.log('FAIL: PS West port 0 missing level-to-pressure wire');
    errors++;
  }

  console.log('\n=== RESULT ===');
  if (errors === 0) {
    console.log('ALL CHECKS PASSED');
  } else {
    console.log(`${errors} FAILURE(S)`);
    process.exit(1);
  }
})().catch(err => {
  console.error('Failed to connect to Node-RED:', err.message);
  process.exit(1);
});
@@ -1,78 +0,0 @@
#!/usr/bin/env node
/**
 * Runtime smoke test: connect to Node-RED WebSocket debug and verify
 * that key nodes are producing output within a timeout period.
 */
const http = require('http');

const TIMEOUT_MS = 15000;
const NR_URL = 'http://localhost:1880';

async function fetchJSON(url) {
  return new Promise((resolve, reject) => {
    http.get(url, res => {
      const chunks = [];
      res.on('data', c => chunks.push(c));
      res.on('end', () => {
        try { resolve(JSON.parse(Buffer.concat(chunks))); }
        catch (e) { reject(new Error(`Parse error from ${url}: ${e.message}`)); }
      });
    }).on('error', reject);
  });
}

(async () => {

  const errors = [];

  // REST-based checks: verify Node-RED is healthy
  console.log('=== Runtime Health Checks ===');

  try {
    const settings = await fetchJSON(`${NR_URL}/settings`);
    console.log('PASS: Node-RED is responding, version:', settings.editorTheme ? 'custom' : 'default');
  } catch (e) {
    console.log('FAIL: Node-RED not responding:', e.message);
    errors.push('Node-RED not responding');
  }

  // Check that flows are loaded
  try {
    const flows = await fetchJSON(`${NR_URL}/flows`);
    const wwtp = flows.filter(n => n.z === 'demo_tab_wwtp');
    if (wwtp.length > 50) {
      console.log(`PASS: ${wwtp.length} nodes loaded on WWTP tab`);
    } else {
      console.log(`FAIL: Only ${wwtp.length} nodes on WWTP tab (expected >50)`);
      errors.push('Too few nodes');
    }
  } catch (e) {
    console.log('FAIL: Cannot read flows:', e.message);
    errors.push('Cannot read flows');
  }

  // Check inject nodes are running (they have repeat timers)
  try {
    const flows = await fetchJSON(`${NR_URL}/flows`);
    const injects = flows.filter(n => n.type === 'inject' && n.repeat && n.z === 'demo_tab_wwtp');
    console.log(`PASS: ${injects.length} inject nodes with timers on WWTP tab`);

    // Verify the q_in inject nodes are still there
    const qinInjects = injects.filter(n => n.id.includes('_flow') || n.id.includes('_tick'));
    console.log(`PASS: ${qinInjects.length} q_in/tick inject timers active`);
  } catch (e) {
    console.log('FAIL: Cannot check inject nodes:', e.message);
    errors.push('Cannot check inject nodes');
  }

  console.log('\n=== RESULT ===');
  if (errors.length === 0) {
    console.log('ALL RUNTIME CHECKS PASSED');
  } else {
    console.log(`${errors.length} FAILURE(S):`, errors.join(', '));
    process.exit(1);
  }
})().catch(err => {
  console.error('Runtime check failed:', err.message);
  process.exit(1);
});
@@ -1,294 +0,0 @@
#!/usr/bin/env node
/**
 * Comprehensive WWTP Demo Test Suite
 *
 * Tests:
 * 1. Deploy succeeds
 * 2. All nodes healthy (no errors)
 * 3. PS volumes above safety threshold after calibration
 * 4. q_in flowing to all PSs (volume rising)
 * 5. Measurement simulators producing values
 * 6. MGC pressure handling working
 * 7. No persistent safety triggers
 * 8. Level-based control (PS West) stays idle at low level
 * 9. Flow-based control (PS North) responds to flow
 * 10. PS output format correct
 */

const http = require('http');
const fs = require('fs');
const path = require('path');
const { execSync } = require('child_process');

const NR_URL = 'http://localhost:1880';
const FLOW_FILE = path.join(__dirname, '..', 'docker', 'demo-flow.json');

let passed = 0;
let failed = 0;
let warnings = 0;

function test(name, condition, detail) {
  if (condition) {
    console.log(`  ✅ PASS: ${name}${detail ? ' — ' + detail : ''}`);
    passed++;
  } else {
    console.log(`  ❌ FAIL: ${name}${detail ? ' — ' + detail : ''}`);
    failed++;
  }
}

function warn(name, detail) {
  console.log(`  ⚠️  WARN: ${name}${detail ? ' — ' + detail : ''}`);
  warnings++;
}

function httpReq(method, urlPath, body) {
  return new Promise((resolve, reject) => {
    const parsed = new URL(NR_URL + urlPath);
    const opts = {
      hostname: parsed.hostname,
      port: parsed.port,
      path: parsed.pathname,
      method,
      headers: { 'Content-Type': 'application/json', 'Node-RED-Deployment-Type': 'full' },
    };
    if (body) opts.headers['Content-Length'] = Buffer.byteLength(JSON.stringify(body));
    const req = http.request(opts, (res) => {
      const chunks = [];
      res.on('data', (c) => chunks.push(c));
      res.on('end', () => resolve({ status: res.statusCode, body: Buffer.concat(chunks).toString() }));
    });
    req.on('error', reject);
    if (body) req.write(JSON.stringify(body));
    req.end();
  });
}

function getLogs(since) {
  try {
    return execSync(`docker logs evolv-nodered --since ${since} 2>&1`, {
      encoding: 'utf8', timeout: 5000,
    });
  } catch (e) { return ''; }
}

function fetchJSON(url) {
  return new Promise((resolve, reject) => {
    http.get(url, (res) => {
      const chunks = [];
      res.on('data', (c) => chunks.push(c));
      res.on('end', () => {
        try { resolve(JSON.parse(Buffer.concat(chunks))); }
        catch (e) { reject(e); }
      });
    }).on('error', reject);
  });
}

(async () => {
  console.log('═══════════════════════════════════════');
  console.log('  WWTP Demo Flow — Comprehensive Test');
  console.log('═══════════════════════════════════════\n');

  // ==========================================================
  console.log('1. DEPLOYMENT');
  console.log('─────────────');

  const flow = JSON.parse(fs.readFileSync(FLOW_FILE, 'utf8'));
  test('Flow file loads', flow.length > 0, `${flow.length} nodes`);

  const deployTime = new Date().toISOString();
  const res = await httpReq('POST', '/flows', flow);
  test('Deploy succeeds', res.status === 204 || res.status === 200, `HTTP ${res.status}`);

  // Wait for init + calibration
  console.log('   Waiting 5s for initialization...');
  await new Promise((r) => setTimeout(r, 5000));

  // Check for errors in logs
  const initLogs = getLogs(deployTime);
  const initErrors = initLogs.split('\n').filter((l) => l.includes('[ERROR]') || l.includes('Error'));
  test('No initialization errors', initErrors.length === 0,
    initErrors.length > 0 ? initErrors.slice(0, 3).join('; ') : 'clean');

  // ==========================================================
  console.log('\n2. NODE INVENTORY');
  console.log('─────────────────');

  const flows = await fetchJSON(NR_URL + '/flows');
  const processTabs = ['demo_tab_wwtp', 'demo_tab_ps_west', 'demo_tab_ps_north', 'demo_tab_ps_south', 'demo_tab_treatment'];
  const wwtp = flows.filter((n) => processTabs.includes(n.z));

  const byType = {};
  wwtp.forEach((n) => {
    if (!n.type || n.type === 'tab' || n.type === 'comment') return;
    byType[n.type] = (byType[n.type] || 0) + 1;
  });

  test('Has pumping stations', (byType['pumpingStation'] || 0) === 3, `${byType['pumpingStation'] || 0} PS nodes`);
  test('Has rotating machines', (byType['rotatingMachine'] || 0) === 5, `${byType['rotatingMachine'] || 0} pumps`);
  test('Has measurements', (byType['measurement'] || 0) >= 15, `${byType['measurement'] || 0} measurement nodes`);
  test('Has reactor', (byType['reactor'] || 0) === 1, `${byType['reactor'] || 0} reactor`);
  test('Has machineGroupControl', (byType['machineGroupControl'] || 0) >= 1, `${byType['machineGroupControl'] || 0} MGC`);
  test('Has inject nodes', (byType['inject'] || 0) >= 10, `${byType['inject'] || 0} injects`);

  console.log(`   Node types: ${JSON.stringify(byType)}`);

  // ==========================================================
  console.log('\n3. PS CONFIGURATION');
  console.log('───────────────────');

  const pss = flows.filter((n) => n.type === 'pumpingStation');
  pss.forEach((ps) => {
    const vol = Number(ps.basinVolume);
    const h = Number(ps.basinHeight);
    const hOut = Number(ps.heightOutlet);
    const sa = vol / h;
    const minVol = hOut * sa;
    test(`${ps.name} basin config valid`, vol > 0 && h > 0 && hOut >= 0, `vol=${vol} h=${h} hOut=${hOut}`);
    test(`${ps.name} has safety enabled`, ps.enableDryRunProtection === true || ps.enableDryRunProtection === 'true');
  });

  // Check calibration nodes exist
  const calibNodes = flows.filter((n) => n.id && n.id.startsWith('demo_inj_calib_'));
  test('Calibration inject nodes exist', calibNodes.length === 3, `${calibNodes.length} calibration nodes`);

  // ==========================================================
  console.log('\n4. MEASUREMENT SIMULATORS');
  console.log('─────────────────────────');

  const measurements = flows.filter((n) => n.type === 'measurement' && processTabs.includes(n.z));
  const simEnabled = measurements.filter((n) => n.simulator === true || n.simulator === 'true');
  test('Measurement simulators enabled', simEnabled.length >= 10, `${simEnabled.length} of ${measurements.length} have sim=true`);

  // List measurement nodes
  measurements.forEach((m) => {
    const sim = m.simulator === true || m.simulator === 'true';
    const range = `[${m.o_min}-${m.o_max}] ${m.unit}`;
    if (!sim && !m.id.includes('level') && !m.id.includes('pt_')) {
      warn(`${m.name || m.id} sim=${sim}`, `range ${range}`);
    }
  });

  // ==========================================================
  console.log('\n5. PUMP CONFIGURATION');
  console.log('─────────────────────');

  const pumps = flows.filter((n) => n.type === 'rotatingMachine' && processTabs.includes(n.z));
  pumps.forEach((p) => {
    test(`${p.name} has model`, !!p.model, p.model);
    test(`${p.name} supplier lowercase`, p.supplier === 'hidrostal', `supplier="${p.supplier}"`);
  });

  // ==========================================================
  console.log('\n6. PRESSURE MEASUREMENTS');
  console.log('────────────────────────');

  const pts = flows.filter((n) => n.type === 'measurement' && n.id && n.id.includes('_pt_'));
  test('6 pressure transmitters', pts.length === 6, `found ${pts.length}`);

  pts.forEach((pt) => {
    const range = `${pt.o_min}-${pt.o_max} ${pt.unit}`;
    const sim = pt.simulator === true || pt.simulator === 'true';
    const pos = pt.positionVsParent;
    test(`${pt.name} valid`, pt.assetType === 'pressure', `pos=${pos} sim=${sim} range=${range}`);

    // Check reasonable pressure ranges (not 0-5000)
    if (pos === 'downstream' || pos === 'Downstream') {
      test(`${pt.name} realistic range`, Number(pt.o_max) <= 2000, `o_max=${pt.o_max} (should be <=2000)`);
    }
  });

  // ==========================================================
  console.log('\n7. RUNTIME BEHAVIOR (30s observation)');
  console.log('─────────────────────────────────────');

  const obsStart = new Date().toISOString();

  // Wait 30 seconds and observe
  console.log('   Observing for 30 seconds...');
  await new Promise((r) => setTimeout(r, 30000));

  const obsLogs = getLogs(obsStart);
  const obsLines = obsLogs.split('\n');

  // Count message types
  const safetyLines = obsLines.filter((l) => l.includes('Safe guard'));
  const errorLines = obsLines.filter((l) => l.includes('[ERROR]'));
  const monitorLines = obsLines.filter((l) => l.includes('[function:Monitor'));

  test('No safety triggers in 30s', safetyLines.length === 0, `${safetyLines.length} triggers`);
  test('No errors in 30s', errorLines.length === 0,
    errorLines.length > 0 ? errorLines[0].substring(0, 100) : 'clean');
  test('Monitor nodes producing data', monitorLines.length > 0, `${monitorLines.length} monitor lines`);

  // Parse monitoring data
  if (monitorLines.length > 0) {
    console.log('\n   Monitor data:');
    monitorLines.forEach((l) => {
      const clean = l.replace(/^\[WARN\] -> /, '  ');
      console.log('   ' + clean.trim().substring(0, 150));
    });

    // Check volume per PS
    const psVolumes = {};
    monitorLines.forEach((l) => {
      const psMatch = l.match(/Monitor (PS \w+)/);
      const volMatch = l.match(/vol=([\d.]+)m3/);
      if (psMatch && volMatch) {
        const ps = psMatch[1];
        if (!psVolumes[ps]) psVolumes[ps] = [];
        psVolumes[ps].push(parseFloat(volMatch[1]));
      }
    });

    Object.entries(psVolumes).forEach(([ps, vols]) => {
      const first = vols[0];
      const last = vols[vols.length - 1];
      test(`${ps} volume above 0`, first > 0, `vol=${first.toFixed(1)} m3`);
      test(`${ps} volume reasonable`, first < 1000, `vol=${first.toFixed(1)} m3`);
      if (vols.length >= 2) {
        const trend = last - first;
        test(`${ps} volume stable/rising`, trend >= -0.5, `${first.toFixed(1)} → ${last.toFixed(1)} m3 (${trend >= 0 ? '+' : ''}${trend.toFixed(2)})`);
      }
    });
  } else {
    warn('No monitor data', 'monitoring function nodes may not have fired yet');
  }

  // ==========================================================
  console.log('\n8. WIRING INTEGRITY');
  console.log('───────────────────');

  // Check all PS have q_in inject
  pss.forEach((ps) => {
    const qinFn = flows.find((n) => n.wires && n.wires.flat && n.wires.flat().includes(ps.id) && n.type === 'function');
    test(`${ps.name} has q_in source`, !!qinFn, qinFn ? qinFn.name : 'none');
  });

  // Check all pumps have pressure measurements (RAS pump has flow sensor instead)
  pumps.forEach((p) => {
    const childSensors = flows.filter((n) => n.type === 'measurement' && n.wires && n.wires[2] && n.wires[2].includes(p.id));
    const isRAS = p.id === 'demo_pump_ras';
    const minSensors = isRAS ? 1 : 2;
    test(`${p.name} has ${isRAS ? 'sensors' : 'pressure PTs'}`, childSensors.length >= minSensors,
      `${childSensors.length} ${isRAS ? 'sensors' : 'PTs'} (${childSensors.map((pt) => pt.positionVsParent).join(', ')})`);
  });

  // ==========================================================
  console.log('\n═══════════════════════════════════════');
  console.log(`  Results: ${passed} passed, ${failed} failed, ${warnings} warnings`);
  console.log('═══════════════════════════════════════');

  if (failed > 0) {
    console.log('\n  ❌ SOME TESTS FAILED');
    process.exit(1);
  } else if (warnings > 0) {
    console.log('\n  ⚠️  ALL TESTS PASSED (with warnings)');
  } else {
    console.log('\n  ✅ ALL TESTS PASSED');
  }
})().catch((err) => {
  console.error('Test suite failed:', err);
  process.exit(1);
});
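The monitor-line parsing in section 7 above (the `Monitor (PS \w+)` and `vol=...m3` regexes) can be exercised in isolation. A minimal sketch; the function name and sample log line below are hypothetical, shaped like the logs the suite expects:

```javascript
// Standalone sketch of the test suite's monitor-line parsing.
// Returns { ps, vol } for a matching line, or null otherwise.
function parseMonitorLine(line) {
  const psMatch = line.match(/Monitor (PS \w+)/);
  const volMatch = line.match(/vol=([\d.]+)m3/);
  if (!psMatch || !volMatch) return null;
  return { ps: psMatch[1], vol: parseFloat(volMatch[1]) };
}

// Hypothetical sample line in the expected format:
const sample = '[WARN] -> [function:Monitor PS West] vol=42.5m3';
console.log(parseMonitorLine(sample)); // → { ps: 'PS West', vol: 42.5 }
```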
@@ -1,217 +0,0 @@
#!/usr/bin/env node
/**
 * Deploy the demo flow fresh and trace the first 60 seconds of behavior.
 * Captures: container logs, PS volume evolution, flow events.
 */

const http = require('http');
const fs = require('fs');
const path = require('path');
const { execSync } = require('child_process');

const NR_URL = 'http://localhost:1880';
const FLOW_FILE = path.join(__dirname, '..', 'docker', 'demo-flow.json');
const TRACE_SECONDS = 45;

function httpReq(method, urlPath, body) {
  return new Promise((resolve, reject) => {
    const parsed = new URL(NR_URL + urlPath);
    const opts = {
      hostname: parsed.hostname,
      port: parsed.port,
      path: parsed.pathname,
      method,
      headers: {
        'Content-Type': 'application/json',
        'Node-RED-Deployment-Type': 'full',
      },
    };
    if (body) {
      const buf = Buffer.from(JSON.stringify(body));
      opts.headers['Content-Length'] = buf.length;
    }
    const req = http.request(opts, (res) => {
      const chunks = [];
      res.on('data', (c) => chunks.push(c));
      res.on('end', () => {
        const text = Buffer.concat(chunks).toString();
        resolve({ status: res.statusCode, body: text });
      });
    });
    req.on('error', reject);
    if (body) req.write(JSON.stringify(body));
    req.end();
  });
}

function getLogs(since) {
  try {
    // Get ALL logs since our timestamp
    const cmd = `docker logs evolv-nodered --since ${since} 2>&1`;
    return execSync(cmd, { encoding: 'utf8', timeout: 5000 });
  } catch (e) {
    return 'Log error: ' + e.message;
  }
}

(async () => {
  console.log('=== Deploy & Trace ===');
  console.log('Loading flow from', FLOW_FILE);

  const flow = JSON.parse(fs.readFileSync(FLOW_FILE, 'utf8'));
  console.log(`Flow has ${flow.length} nodes`);

  // Deploy
  const deployTime = new Date().toISOString();
  console.log(`\nDeploying at ${deployTime}...`);
  const res = await httpReq('POST', '/flows', flow);
  console.log(`Deploy response: ${res.status}`);

  if (res.status !== 204 && res.status !== 200) {
    console.error('Deploy failed:', res.body);
    process.exit(1);
  }

  // Wait 3 seconds for initial setup
  console.log('Waiting 3s for init...\n');
  await new Promise((r) => setTimeout(r, 3000));

  // Trace loop
  const traceStart = Date.now();
  const volumeHistory = [];
  let lastLogPos = 0;

  for (let i = 0; i < Math.ceil(TRACE_SECONDS / 3); i++) {
    const elapsed = ((Date.now() - traceStart) / 1000).toFixed(1);

    // Get new logs since deploy
    const logs = getLogs(deployTime);
    const newLines = logs.split('\n').slice(lastLogPos);
    lastLogPos = logs.split('\n').length;

    // Parse interesting log lines
    const safeGuards = [];
    const pressureChanges = [];
    const modeChanges = [];
    const stateChanges = [];
    const other = [];

    newLines.forEach((line) => {
      if (!line.trim()) return;

      const volMatch = line.match(/vol=([-\d.]+) m3.*remainingTime=([\w.]+)/);
      if (volMatch) {
        safeGuards.push({ vol: parseFloat(volMatch[1]), remaining: volMatch[2] });
        return;
      }

      if (line.includes('Pressure change detected')) {
        pressureChanges.push(1);
        return;
      }

      if (line.includes('Mode changed') || line.includes('setMode') || line.includes('Control mode')) {
        modeChanges.push(line.trim().substring(0, 200));
        return;
      }

      if (line.includes('machine state') || line.includes('State:') || line.includes('startup') || line.includes('shutdown')) {
        stateChanges.push(line.trim().substring(0, 200));
        return;
      }

      if (line.includes('q_in') || line.includes('netflow') || line.includes('Volume') ||
          line.includes('Height') || line.includes('Level') || line.includes('Controllevel')) {
        other.push(line.trim().substring(0, 200));
        return;
      }
    });

    console.log(`--- t=${elapsed}s ---`);

    if (safeGuards.length > 0) {
      const latest = safeGuards[safeGuards.length - 1];
      const first = safeGuards[0];
      console.log(`  SAFETY: ${safeGuards.length} triggers, vol: ${first.vol} → ${latest.vol} m3, remaining: ${latest.remaining}s`);
      volumeHistory.push({ t: parseFloat(elapsed), vol: latest.vol });
    } else {
      console.log('  SAFETY: none (good)');
    }

    if (pressureChanges.length > 0) {
      console.log(`  PRESSURE: ${pressureChanges.length} changes`);
    }

    if (modeChanges.length > 0) {
      modeChanges.forEach((m) => console.log(`  MODE: ${m}`));
    }

    if (stateChanges.length > 0) {
      stateChanges.slice(-5).forEach((s) => console.log(`  STATE: ${s}`));
    }

    if (other.length > 0) {
      other.slice(-5).forEach((o) => console.log(`  INFO: ${o}`));
    }

    console.log('');
    await new Promise((r) => setTimeout(r, 3000));
  }

  // Final analysis
  console.log('\n=== Volume Trajectory ===');
  volumeHistory.forEach((v) => {
    const bar = '#'.repeat(Math.max(0, Math.round(v.vol / 2)));
    console.log(`  t=${String(v.t).padStart(5)}s: ${String(v.vol.toFixed(2)).padStart(8)} m3 ${bar}`);
  });

  if (volumeHistory.length >= 2) {
    const first = volumeHistory[0];
    const last = volumeHistory[volumeHistory.length - 1];
    const dt = last.t - first.t;
    const dv = last.vol - first.vol;
    const rate = dt > 0 ? (dv / dt * 3600).toFixed(1) : 'N/A';
    console.log(`\n  Rate: ${rate} m3/h (${dv > 0 ? 'FILLING' : 'DRAINING'})`);
  }

  // Get ALL logs for comprehensive analysis
  console.log('\n=== Full Log Analysis ===');
  const allLogs = getLogs(deployTime);
  const allLines = allLogs.split('\n');

  // Count different message types
  const counts = { safety: 0, pressure: 0, mode: 0, state: 0, error: 0, warn: 0, flow: 0 };
  allLines.forEach((l) => {
    if (l.includes('Safe guard')) counts.safety++;
    if (l.includes('Pressure change')) counts.pressure++;
    if (l.includes('Mode') || l.includes('mode')) counts.mode++;
    if (l.includes('startup') || l.includes('shutdown') || l.includes('machine state')) counts.state++;
    if (l.includes('[ERROR]') || l.includes('Error')) counts.error++;
    if (l.includes('[WARN]')) counts.warn++;
    if (l.includes('netflow') || l.includes('q_in') || l.includes('flow')) counts.flow++;
  });

  console.log('Message counts:', JSON.stringify(counts, null, 2));

  // Print errors
  const errors = allLines.filter((l) => l.includes('[ERROR]') || l.includes('Error'));
  if (errors.length > 0) {
    console.log('\nErrors:');
    errors.slice(0, 20).forEach((e) => console.log('  ' + e.trim().substring(0, 200)));
  }

  // Print first few non-pressure, non-safety lines
  console.log('\nKey events (first 30):');
  let keyCount = 0;
  allLines.forEach((l) => {
    if (keyCount >= 30) return;
    if (l.includes('Pressure change detected')) return;
    if (l.includes('Safe guard triggered')) return;
    if (!l.trim()) return;
    console.log('  ' + l.trim().substring(0, 200));
    keyCount++;
  });
})().catch((err) => {
  console.error('Failed:', err);
  process.exit(1);
});
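The fill-rate arithmetic at the end of the trace script (`dv / dt * 3600`, seconds to hours) can be factored into a small helper. A sketch with hypothetical sample points; the function name `fillRateM3PerHour` is not part of the script:

```javascript
// Sketch mirroring the trace script's rate calculation:
// volume delta (m3) over time delta (s), scaled to m3/h.
function fillRateM3PerHour(history) {
  if (history.length < 2) return null;
  const first = history[0];
  const last = history[history.length - 1];
  const dt = last.t - first.t;      // seconds between first and last sample
  const dv = last.vol - first.vol;  // cubic meters gained (or lost)
  return dt > 0 ? (dv / dt) * 3600 : null;
}

// 1 m3 gained over 30 s scales to 120 m3/h.
console.log(fillRateM3PerHour([{ t: 0, vol: 10 }, { t: 30, vol: 11 }])); // → 120
```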
269
scripts/e2e-reactor-roundtrip.js
Normal file
@@ -0,0 +1,269 @@
#!/usr/bin/env node
/**
 * E2E reactor round-trip test:
 * Node-RED -> InfluxDB -> Grafana proxy query
 */

const fs = require('node:fs');
const path = require('node:path');

const NR_URL = process.env.NR_URL || 'http://localhost:1880';
const INFLUX_URL = process.env.INFLUX_URL || 'http://localhost:8086';
const GRAFANA_URL = process.env.GRAFANA_URL || 'http://localhost:3000';
const GRAFANA_USER = process.env.GRAFANA_USER || 'admin';
const GRAFANA_PASSWORD = process.env.GRAFANA_PASSWORD || 'evolv';
const INFLUX_ORG = process.env.INFLUX_ORG || 'evolv';
const INFLUX_BUCKET = process.env.INFLUX_BUCKET || 'telemetry';
const INFLUX_TOKEN = process.env.INFLUX_TOKEN || 'evolv-dev-token';
const GRAFANA_DS_UID = process.env.GRAFANA_DS_UID || 'cdzg44tv250jkd';
const FLOW_FILE = path.join(__dirname, '..', 'docker', 'demo-flow.json');
const REQUIRE_GRAFANA_DASHBOARDS = process.env.REQUIRE_GRAFANA_DASHBOARDS === '1';
const REACTOR_MEASUREMENTS = [
  'reactor_demo_reactor_z1',
  'reactor_demo_reactor_z2',
  'reactor_demo_reactor_z3',
  'reactor_demo_reactor_z4',
];
const REACTOR_MEASUREMENT = REACTOR_MEASUREMENTS[3];
const QUERY_TIMEOUT_MS = 90000;
const POLL_INTERVAL_MS = 3000;
const REQUIRED_DASHBOARD_TITLES = ['Bioreactor Z1', 'Bioreactor Z2', 'Bioreactor Z3', 'Bioreactor Z4', 'Settler S1'];

async function wait(ms) {
  return new Promise((resolve) => setTimeout(resolve, ms));
}

async function fetchJson(url, options = {}) {
  const response = await fetch(url, options);
  const text = await response.text();
  let body = null;
  if (text) {
    try {
      body = JSON.parse(text);
    } catch {
      body = text;
    }
  }
  return { response, body, text };
}

async function assertReachable() {
  const checks = [
    [`${NR_URL}/settings`, 'Node-RED'],
    [`${INFLUX_URL}/health`, 'InfluxDB'],
    [`${GRAFANA_URL}/api/health`, 'Grafana'],
  ];

  for (const [url, label] of checks) {
    const { response, text } = await fetchJson(url, {
      headers: label === 'Grafana'
        ? { Authorization: `Basic ${Buffer.from(`${GRAFANA_USER}:${GRAFANA_PASSWORD}`).toString('base64')}` }
        : undefined,
    });
    if (!response.ok) {
      throw new Error(`${label} not reachable at ${url} (${response.status}): ${text}`);
    }
    console.log(`PASS: ${label} reachable`);
  }
}

async function deployDemoFlow() {
  const flow = JSON.parse(fs.readFileSync(FLOW_FILE, 'utf8'));
  const { response, text } = await fetchJson(`${NR_URL}/flows`, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      'Node-RED-Deployment-Type': 'full',
    },
    body: JSON.stringify(flow),
  });

  if (!(response.status === 200 || response.status === 204)) {
    throw new Error(`Flow deploy failed (${response.status}): ${text}`);
  }
  console.log(`PASS: Demo flow deployed (${response.status})`);
}

async function queryInfluxCsv(query) {
  const response = await fetch(`${INFLUX_URL}/api/v2/query?org=${encodeURIComponent(INFLUX_ORG)}`, {
    method: 'POST',
    headers: {
      Authorization: `Token ${INFLUX_TOKEN}`,
      'Content-Type': 'application/json',
      Accept: 'application/csv',
    },
    body: JSON.stringify({ query }),
  });

  const text = await response.text();
  if (!response.ok) {
    throw new Error(`Influx query failed (${response.status}): ${text}`);
  }
  return text;
}

function countCsvDataRows(csvText) {
  return csvText
    .split('\n')
    .map((line) => line.trim())
    .filter((line) => line && !line.startsWith('#') && line.includes(','))
    .length;
}

async function waitForReactorTelemetry() {
  const deadline = Date.now() + QUERY_TIMEOUT_MS;

  while (Date.now() < deadline) {
    const counts = {};
    for (const measurement of REACTOR_MEASUREMENTS) {
      const query = `
        from(bucket: "${INFLUX_BUCKET}")
          |> range(start: -15m)
          |> filter(fn: (r) => r._measurement == "${measurement}")
          |> limit(n: 20)
      `.trim();
      counts[measurement] = countCsvDataRows(await queryInfluxCsv(query));
    }

    const missing = Object.entries(counts)
      .filter(([, rows]) => rows === 0)
      .map(([measurement]) => measurement);

    if (missing.length === 0) {
      const summary = Object.entries(counts)
        .map(([measurement, rows]) => `${measurement}=${rows}`)
        .join(', ');
      console.log(`PASS: Reactor telemetry reached InfluxDB (${summary})`);
      return;
    }
    console.log(`WAIT: reactor telemetry not yet present in InfluxDB for ${missing.join(', ')}`);
    await wait(POLL_INTERVAL_MS);
  }

  throw new Error(`Timed out waiting for reactor telemetry measurements ${REACTOR_MEASUREMENTS.join(', ')}`);
}

async function assertGrafanaDatasource() {
  const auth = `Basic ${Buffer.from(`${GRAFANA_USER}:${GRAFANA_PASSWORD}`).toString('base64')}`;
  const { response, body, text } = await fetchJson(`${GRAFANA_URL}/api/datasources/uid/${GRAFANA_DS_UID}`, {
    headers: { Authorization: auth },
  });

  if (!response.ok) {
    throw new Error(`Grafana datasource lookup failed (${response.status}): ${text}`);
  }

  if (body?.uid !== GRAFANA_DS_UID) {
    throw new Error(`Grafana datasource UID mismatch: expected ${GRAFANA_DS_UID}, got ${body?.uid}`);
  }

  console.log(`PASS: Grafana datasource ${GRAFANA_DS_UID} is present`);
}

async function queryGrafanaDatasource() {
  const auth = `Basic ${Buffer.from(`${GRAFANA_USER}:${GRAFANA_PASSWORD}`).toString('base64')}`;
  const response = await fetch(`${GRAFANA_URL}/api/ds/query`, {
    method: 'POST',
    headers: {
      Authorization: auth,
|
||||||
|
'Content-Type': 'application/json',
|
||||||
|
},
|
||||||
|
body: JSON.stringify({
|
||||||
|
from: 'now-15m',
|
||||||
|
to: 'now',
|
||||||
|
queries: [
|
||||||
|
{
|
||||||
|
refId: 'A',
|
||||||
|
datasource: { uid: GRAFANA_DS_UID, type: 'influxdb' },
|
||||||
|
query: `
|
||||||
|
from(bucket: "${INFLUX_BUCKET}")
|
||||||
|
|> range(start: -15m)
|
||||||
|
|> filter(fn: (r) => r._measurement == "${REACTOR_MEASUREMENT}" and r._field == "S_O")
|
||||||
|
|> last()
|
||||||
|
`.trim(),
|
||||||
|
rawQuery: true,
|
||||||
|
intervalMs: 1000,
|
||||||
|
maxDataPoints: 100,
|
||||||
|
}
|
||||||
|
],
|
||||||
|
}),
|
||||||
|
});
|
||||||
|
|
||||||
|
const text = await response.text();
|
||||||
|
if (!response.ok) {
|
||||||
|
throw new Error(`Grafana datasource query failed (${response.status}): ${text}`);
|
||||||
|
}
|
||||||
|
|
||||||
|
const body = JSON.parse(text);
|
||||||
|
const frames = body?.results?.A?.frames || [];
|
||||||
|
if (frames.length === 0) {
|
||||||
|
throw new Error('Grafana datasource query returned no reactor frames');
|
||||||
|
}
|
||||||
|
|
||||||
|
console.log(`PASS: Grafana can query reactor telemetry through datasource (${frames.length} frame(s))`);
|
||||||
|
}
|
||||||
|
|
||||||
|
async function waitForGrafanaDashboards(timeoutMs = QUERY_TIMEOUT_MS) {
|
||||||
|
const deadline = Date.now() + timeoutMs;
|
||||||
|
const auth = `Basic ${Buffer.from(`${GRAFANA_USER}:${GRAFANA_PASSWORD}`).toString('base64')}`;
|
||||||
|
|
||||||
|
while (Date.now() < deadline) {
|
||||||
|
const response = await fetch(`${GRAFANA_URL}/api/search?query=`, {
|
||||||
|
headers: { Authorization: auth },
|
||||||
|
});
|
||||||
|
const text = await response.text();
|
||||||
|
if (!response.ok) {
|
||||||
|
throw new Error(`Grafana dashboard search failed (${response.status}): ${text}`);
|
||||||
|
}
|
||||||
|
|
||||||
|
const results = JSON.parse(text);
|
||||||
|
const titles = new Set(results.map((item) => item.title));
|
||||||
|
const missing = REQUIRED_DASHBOARD_TITLES.filter((title) => !titles.has(title));
|
||||||
|
const pumpingStationCount = results.filter((item) => item.title === 'pumpingStation').length;
|
||||||
|
if (missing.length === 0 && pumpingStationCount >= 3) {
|
||||||
|
console.log(`PASS: Grafana dashboards created (${REQUIRED_DASHBOARD_TITLES.join(', ')} + ${pumpingStationCount} pumpingStation dashboards)`);
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
const missingParts = [];
|
||||||
|
if (missing.length > 0) {
|
||||||
|
missingParts.push(`missing titled dashboards: ${missing.join(', ')}`);
|
||||||
|
}
|
||||||
|
if (pumpingStationCount < 3) {
|
||||||
|
missingParts.push(`pumpingStation dashboards=${pumpingStationCount}`);
|
||||||
|
}
|
||||||
|
console.log(`WAIT: Grafana dashboards not ready: ${missingParts.join(' | ')}`);
|
||||||
|
await wait(POLL_INTERVAL_MS);
|
||||||
|
}
|
||||||
|
|
||||||
|
throw new Error(`Timed out waiting for Grafana dashboards: ${REQUIRED_DASHBOARD_TITLES.join(', ')} and >=3 pumpingStation dashboards`);
|
||||||
|
}
|
||||||
|
|
||||||
|
async function main() {
|
||||||
|
console.log('=== EVOLV Reactor E2E Round Trip ===');
|
||||||
|
await assertReachable();
|
||||||
|
await deployDemoFlow();
|
||||||
|
console.log('WAIT: allowing Node-RED inject/tick loops to populate telemetry');
|
||||||
|
await wait(12000);
|
||||||
|
await waitForReactorTelemetry();
|
||||||
|
await assertGrafanaDatasource();
|
||||||
|
await queryGrafanaDatasource();
|
||||||
|
if (REQUIRE_GRAFANA_DASHBOARDS) {
|
||||||
|
await waitForGrafanaDashboards();
|
||||||
|
console.log('PASS: Node-RED -> InfluxDB -> Grafana round trip is working for reactor telemetry and dashboard generation');
|
||||||
|
return;
|
||||||
|
}
|
||||||
|
|
||||||
|
try {
|
||||||
|
await waitForGrafanaDashboards(15000);
|
||||||
|
console.log('PASS: Node-RED -> InfluxDB -> Grafana round trip is working for reactor telemetry and dashboard generation');
|
||||||
|
} catch (error) {
|
||||||
|
console.warn(`WARN: Grafana dashboard auto-generation is not ready yet: ${error.message}`);
|
||||||
|
console.log('PASS: Node-RED -> InfluxDB -> Grafana round trip is working for live reactor telemetry');
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
main().catch((error) => {
|
||||||
|
console.error(`FAIL: ${error.message}`);
|
||||||
|
process.exit(1);
|
||||||
|
});
|
||||||
@@ -1,36 +0,0 @@
#!/usr/bin/env node
/**
 * Fix asset selection in demo-flow.json so editor dropdowns correctly
 * pre-select the configured supplier/type/model when a node is opened.
 *
 * Issues fixed:
 * 1. Pump nodes: supplier "Hidrostal" → "hidrostal" (matches machine.json id)
 * 2. demo_meas_flow: assetType "flow-electromagnetic" → "flow" (matches measurement.json type id)
 */

const fs = require('fs');
const path = require('path');

const flowPath = path.join(__dirname, '..', 'docker', 'demo-flow.json');
const flow = JSON.parse(fs.readFileSync(flowPath, 'utf8'));

let changes = 0;

flow.forEach(node => {
  // Fix 1: Pump supplier id mismatch
  if (node.type === 'rotatingMachine' && node.supplier === 'Hidrostal') {
    node.supplier = 'hidrostal';
    changes++;
    console.log(`Fixed pump ${node.id}: supplier "Hidrostal" → "hidrostal"`);
  }

  // Fix 2: Standardize flow measurement assetType
  if (node.type === 'measurement' && node.assetType === 'flow-electromagnetic') {
    node.assetType = 'flow';
    changes++;
    console.log(`Fixed ${node.id}: assetType "flow-electromagnetic" → "flow"`);
  }
});

fs.writeFileSync(flowPath, JSON.stringify(flow, null, 2) + '\n');
console.log(`\nDone. ${changes} node(s) updated.`);
@@ -1,243 +0,0 @@
#!/usr/bin/env node
/**
 * Fix display issues:
 * 1. Set positionIcon on all nodes based on positionVsParent
 * 2. Switch reactor from CSTR to PFR with proper length/resolution
 * 3. Add missing default fields to all dashboard widgets (gauges, sliders, button-groups)
 */
const fs = require('fs');
const path = require('path');

const FLOW_PATH = path.join(__dirname, '..', 'docker', 'demo-flow.json');
const flow = JSON.parse(fs.readFileSync(FLOW_PATH, 'utf8'));

const byId = (id) => flow.find(n => n.id === id);

// =============================================
// FIX 1: positionIcon on all process nodes
// =============================================
// Icon mapping from physicalPosition.js
const positionIconMap = {
  'upstream': '→',
  'atEquipment': '⊥',
  'downstream': '←',
};

let iconFixed = 0;
for (const node of flow) {
  if (node.positionVsParent !== undefined && node.positionVsParent !== '') {
    const icon = positionIconMap[node.positionVsParent];
    if (icon && node.positionIcon !== icon) {
      node.positionIcon = icon;
      iconFixed++;
    }
  }
  // Also ensure positionIcon has a fallback if positionVsParent is set
  if (node.positionVsParent && !node.positionIcon) {
    node.positionIcon = positionIconMap[node.positionVsParent] || '⊥';
    iconFixed++;
  }
}
console.log(`Fixed positionIcon on ${iconFixed} nodes`);

// =============================================
// FIX 2: Switch reactor from CSTR to PFR
// =============================================
const reactor = byId('demo_reactor');
if (reactor) {
  reactor.reactor_type = 'PFR';
  reactor.length = 50; // 50m plug flow reactor
  reactor.resolution_L = 10; // 10 slices for spatial resolution
  reactor.alpha = 0; // Danckwerts BC (dispersive flow, more realistic)
  console.log(`Switched reactor to PFR: length=${reactor.length}m, resolution=${reactor.resolution_L} slices`);

  // Update influent measurements with positions along the reactor
  // FT-001 at inlet (position 0), DO-001 at 1/3, NH4-001 at 2/3
  const measFlow = byId('demo_meas_flow');
  if (measFlow) {
    measFlow.hasDistance = true;
    measFlow.distance = 0; // at inlet
    measFlow.distanceUnit = 'm';
    measFlow.distanceDescription = 'reactor inlet';
    measFlow.positionVsParent = 'upstream';
    measFlow.positionIcon = '→';
    console.log('  FT-001 positioned at reactor inlet (0m)');
  }

  const measDo = byId('demo_meas_do');
  if (measDo) {
    measDo.hasDistance = true;
    measDo.distance = 15; // 15m along the reactor (30% of length)
    measDo.distanceUnit = 'm';
    measDo.distanceDescription = 'aeration zone';
    measDo.positionVsParent = 'atEquipment';
    measDo.positionIcon = '⊥';
    console.log('  DO-001 positioned at 15m (aeration zone)');
  }

  const measNh4 = byId('demo_meas_nh4');
  if (measNh4) {
    measNh4.hasDistance = true;
    measNh4.distance = 35; // 35m along the reactor (70% of length)
    measNh4.distanceUnit = 'm';
    measNh4.distanceDescription = 'post-aeration zone';
    measNh4.positionVsParent = 'atEquipment';
    measNh4.positionIcon = '⊥';
    console.log('  NH4-001 positioned at 35m (post-aeration zone)');
  }
}

// =============================================
// FIX 3: Add missing defaults to dashboard widgets
// =============================================

// --- ui-gauge: add missing fields ---
const gaugeDefaults = {
  value: 'payload',
  valueType: 'msg',
  sizeThickness: 16,
  sizeGap: 4,
  sizeKeyThickness: 8,
  styleRounded: true,
  styleGlow: false,
  alwaysShowTitle: false,
  floatingTitlePosition: 'top-left',
  icon: '',
};

let gaugeFixed = 0;
for (const node of flow) {
  if (node.type !== 'ui-gauge') continue;
  let changed = false;
  for (const [key, defaultVal] of Object.entries(gaugeDefaults)) {
    if (node[key] === undefined) {
      node[key] = defaultVal;
      changed = true;
    }
  }
  // Ensure className exists
  if (node.className === undefined) node.className = '';
  // Ensure outputs (gauges have 1 output in newer versions)
  if (changed) gaugeFixed++;
}
console.log(`Fixed ${gaugeFixed} ui-gauge nodes with missing defaults`);

// --- ui-button-group: add missing fields ---
const buttonGroupDefaults = {
  rounded: true,
  useThemeColors: true,
  topic: 'topic',
  topicType: 'msg',
  className: '',
};

let bgFixed = 0;
for (const node of flow) {
  if (node.type !== 'ui-button-group') continue;
  let changed = false;
  for (const [key, defaultVal] of Object.entries(buttonGroupDefaults)) {
    if (node[key] === undefined) {
      node[key] = defaultVal;
      changed = true;
    }
  }
  // Ensure options have valueType
  if (node.options && Array.isArray(node.options)) {
    for (const opt of node.options) {
      if (!opt.valueType) opt.valueType = 'str';
    }
  }
  if (changed) bgFixed++;
}
console.log(`Fixed ${bgFixed} ui-button-group nodes with missing defaults`);

// --- ui-slider: add missing fields ---
const sliderDefaults = {
  topic: 'topic',
  topicType: 'msg',
  thumbLabel: true,
  showTicks: 'always',
  className: '',
  iconPrepend: '',
  iconAppend: '',
  color: '',
  colorTrack: '',
  colorThumb: '',
  showTextField: false,
};

let sliderFixed = 0;
for (const node of flow) {
  if (node.type !== 'ui-slider') continue;
  let changed = false;
  for (const [key, defaultVal] of Object.entries(sliderDefaults)) {
    if (node[key] === undefined) {
      node[key] = defaultVal;
      changed = true;
    }
  }
  if (changed) sliderFixed++;
}
console.log(`Fixed ${sliderFixed} ui-slider nodes with missing defaults`);

// --- ui-chart: add missing fields ---
const chartDefaults = {
  className: '',
};

let chartFixed = 0;
for (const node of flow) {
  if (node.type !== 'ui-chart') continue;
  let changed = false;
  for (const [key, defaultVal] of Object.entries(chartDefaults)) {
    if (node[key] === undefined) {
      node[key] = defaultVal;
      changed = true;
    }
  }
  if (changed) chartFixed++;
}
console.log(`Fixed ${chartFixed} ui-chart nodes with missing defaults`);

// --- ui-template: add missing fields ---
for (const node of flow) {
  if (node.type !== 'ui-template') continue;
  if (node.templateScope === undefined) node.templateScope = 'local';
  if (node.className === undefined) node.className = '';
}

// --- ui-text: add missing fields ---
for (const node of flow) {
  if (node.type !== 'ui-text') continue;
  if (node.className === undefined) node.className = '';
}

// =============================================
// Validate
// =============================================
const allIds = new Set(flow.map(n => n.id));
let issues = 0;
for (const n of flow) {
  if (!n.wires) continue;
  for (const port of n.wires) {
    for (const target of port) {
      if (!allIds.has(target)) {
        console.warn(`BROKEN WIRE: ${n.id} → ${target}`);
        issues++;
      }
    }
  }
}
if (issues === 0) console.log('All wire references valid ✓');

// List all nodes with positionIcon to verify
console.log('\nNodes with positionIcon:');
for (const n of flow) {
  if (n.positionIcon) {
    console.log(`  ${n.positionIcon} ${n.name || n.id} (${n.positionVsParent})`);
  }
}

// Write
fs.writeFileSync(FLOW_PATH, JSON.stringify(flow, null, 2) + '\n');
console.log(`\nWrote ${FLOW_PATH} (${flow.length} nodes)`);
@@ -1,154 +0,0 @@
#!/usr/bin/env node
/**
 * Fix layout of demo-flow.json so nodes are nicely grouped and don't overlap.
 *
 * Layout structure (on demo_tab_wwtp):
 *
 * Row 1 (y=40-300): PS West section (comment, mode injects, pumps, MGC, PS, q_in sim)
 * Row 2 (y=340-500): PS North section
 * Row 3 (y=520-680): PS South section
 * Row 4 (y=720-920): Biological Treatment (measurements, reactor, settler, monster)
 * Row 5 (y=960-1120): Pressure Measurements section
 * Row 6 (y=1140-1440): Effluent measurements
 * Row 7 (y=1460+): Telemetry & Dashboard API
 *
 * Column layout:
 * x=140: Inject nodes (left)
 * x=370: Function nodes
 * x=580: Intermediate nodes (measurements feeding other nodes)
 * x=700: Main equipment nodes (PS, pumps, measurement nodes)
 * x=935: Link out nodes
 * x=1050+: Right side (reactor, settler, telemetry)
 */

const fs = require('fs');
const path = require('path');

const flowPath = path.join(__dirname, '..', 'docker', 'demo-flow.json');
const flow = JSON.parse(fs.readFileSync(flowPath, 'utf8'));

function setPos(id, x, y) {
  const node = flow.find(n => n.id === id);
  if (node) {
    node.x = x;
    node.y = y;
  } else {
    console.warn('Layout: node not found:', id);
  }
}

// === PS West section (y: 40-300) ===
setPos('demo_comment_ps', 340, 40);

// Mode + q_in injects (left column)
setPos('demo_inj_w1_mode', 140, 80);
setPos('demo_inj_w2_mode', 140, 260);
setPos('demo_inj_west_mode', 140, 160);
setPos('demo_inj_west_flow', 140, 200);

// q_in function node
setPos('demo_fn_west_flow_sim', 370, 200);

// MGC sits between PS and pumps
setPos('demo_pump_w1', 700, 100);
setPos('demo_mgc_west', 700, 180);
setPos('demo_pump_w2', 700, 260);
setPos('demo_ps_west', 940, 180);

// === PS North section (y: 340-500) ===
setPos('demo_comment_ps_north', 330, 340);
setPos('demo_inj_n1_mode', 140, 380);
setPos('demo_inj_north_mode', 140, 420);
setPos('demo_inj_north_flow', 140, 460);
setPos('demo_fn_north_flow_sim', 370, 460);

// North outflow measurement
setPos('demo_comment_north_outflow', 200, 500);
setPos('demo_meas_ft_n1', 580, 500);

setPos('demo_pump_n1', 700, 400);
setPos('demo_ps_north', 940, 440);

// === PS South section (y: 540-680) ===
setPos('demo_comment_ps_south', 320, 540);
setPos('demo_inj_s1_mode', 140, 580);
setPos('demo_inj_south_mode', 140, 620);
setPos('demo_inj_south_flow', 140, 660);
setPos('demo_fn_south_flow_sim', 370, 660);

setPos('demo_pump_s1', 700, 580);
setPos('demo_ps_south', 940, 620);

// === Biological Treatment (y: 720-920) ===
setPos('demo_comment_treatment', 200, 720);
setPos('demo_meas_flow', 700, 760);
setPos('demo_meas_do', 700, 820);
setPos('demo_meas_nh4', 700, 880);

setPos('demo_reactor', 1100, 820);
setPos('demo_inj_reactor_tick', 900, 760);
setPos('demo_settler', 1100, 920);
setPos('demo_monster', 1100, 1000);
setPos('demo_inj_monster_flow', 850, 1000);
setPos('demo_fn_monster_flow', 930, 1040);

// === Pressure Measurements (y: 960-1120) — new section ===
setPos('demo_comment_pressure', 320, 960);

// West pressure (grouped together)
setPos('demo_fn_level_to_pressure_w', 370, 1000);
setPos('demo_meas_pt_w_up', 580, 1000);
setPos('demo_meas_pt_w_down', 580, 1040);

// North pressure
setPos('demo_fn_level_to_pressure_n', 370, 1080);
setPos('demo_meas_pt_n_up', 580, 1080);
setPos('demo_meas_pt_n_down', 580, 1120);

// South pressure
setPos('demo_fn_level_to_pressure_s', 370, 1160);
setPos('demo_meas_pt_s_up', 580, 1160);
setPos('demo_meas_pt_s_down', 580, 1200);

// === Effluent Measurements (y: 1240-1520) ===
setPos('demo_comment_effluent_meas', 300, 1240);
setPos('demo_meas_eff_flow', 700, 1280);
setPos('demo_meas_eff_do', 700, 1340);
setPos('demo_meas_eff_nh4', 700, 1400);
setPos('demo_meas_eff_no3', 700, 1460);
setPos('demo_meas_eff_tss', 700, 1520);

// === Telemetry section (right side, y: 40-240) ===
setPos('demo_comment_telemetry', 1300, 40);
setPos('demo_link_influx_out', 1135, 500);
setPos('demo_link_influx_in', 1175, 100);
setPos('demo_fn_influx_convert', 1350, 100);
setPos('demo_http_influx', 1560, 100);
setPos('demo_fn_influx_count', 1740, 100);

// Process debug
setPos('demo_comment_process_out', 1300, 160);
setPos('demo_link_process_out', 1135, 540);
setPos('demo_link_process_in', 1175, 200);
setPos('demo_dbg_process', 1360, 200);
setPos('demo_dbg_registration', 1370, 240);

// Dashboard link outs
setPos('demo_link_ps_west_dash', 1135, 160);
setPos('demo_link_ps_north_dash', 1135, 420);
setPos('demo_link_ps_south_dash', 1135, 600);
setPos('demo_link_reactor_dash', 1300, 820);
setPos('demo_link_meas_dash', 1135, 860);
setPos('demo_link_eff_meas_dash', 1135, 1300);

// Dashboard API
setPos('demo_dashapi', 1100, 1100);
setPos('demo_inj_dashapi', 850, 1100);
setPos('demo_http_grafana', 1300, 1100);
setPos('demo_dbg_grafana', 1500, 1100);

// InfluxDB status link
setPos('demo_link_influx_status_out', 1940, 100);

fs.writeFileSync(flowPath, JSON.stringify(flow, null, 2) + '\n');
console.log('Layout fixed. Deploying...');
@@ -1,103 +0,0 @@
#!/usr/bin/env node
/**
 * Add initial volume calibration inject nodes to the demo flow.
 *
 * Problem: All 3 pumping stations start with initial volume = minVol,
 * which is below the dryRun safety threshold. This causes the safety
 * guard to trigger immediately on every tick, preventing normal control.
 *
 * Fix: Add inject nodes that fire once at deploy, sending
 * calibratePredictedVolume to each PS with a reasonable starting volume.
 *
 * PS West: 500m3 basin, startLevel=2.5m → start at 200m3 (level 1.6m)
 *   Below startLevel, pumps stay off. q_in fills basin naturally.
 * PS North: 200m3 basin, flowbased → start at 100m3 (50% fill)
 * PS South: 100m3 basin, manual → start at 50m3 (50% fill)
 */

const fs = require('fs');
const path = require('path');

const flowPath = path.join(__dirname, '..', 'docker', 'demo-flow.json');
const flow = JSON.parse(fs.readFileSync(flowPath, 'utf8'));

// Check if calibration nodes already exist
const existingCalib = flow.filter(n => n.id && n.id.startsWith('demo_inj_calib_'));
if (existingCalib.length > 0) {
  console.log('Calibration nodes already exist:', existingCalib.map(n => n.id));
  console.log('Removing existing calibration nodes first...');
  for (const node of existingCalib) {
    const idx = flow.findIndex(n => n.id === node.id);
    if (idx !== -1) flow.splice(idx, 1);
  }
}

// Find the WWTP tab for positioning
const wwtpTab = flow.find(n => n.id === 'demo_tab_wwtp');
if (!wwtpTab) {
  console.error('WWTP tab not found!');
  process.exit(1);
}

// Calibration configs: { ps_id, name, volume, x, y }
const calibrations = [
  {
    id: 'demo_inj_calib_west',
    name: 'Cal: PS West → 200m3',
    target: 'demo_ps_west',
    volume: 200,
    x: 100, y: 50,
  },
  {
    id: 'demo_inj_calib_north',
    name: 'Cal: PS North → 100m3',
    target: 'demo_ps_north',
    volume: 100,
    x: 100, y: 100,
  },
  {
    id: 'demo_inj_calib_south',
    name: 'Cal: PS South → 50m3',
    target: 'demo_ps_south',
    volume: 50,
    x: 100, y: 150,
  },
];

let added = 0;

calibrations.forEach(cal => {
  const injectNode = {
    id: cal.id,
    type: 'inject',
    z: 'demo_tab_wwtp',
    name: cal.name,
    props: [
      {
        p: 'payload',
        vt: 'num',
      },
      {
        p: 'topic',
        vt: 'str',
      },
    ],
    repeat: '',
    crontab: '',
    once: true,
    onceDelay: '0.5',
    topic: 'calibratePredictedVolume',
    payload: String(cal.volume),
    payloadType: 'num',
    x: cal.x,
    y: cal.y,
    wires: [[cal.target]],
  };

  flow.push(injectNode);
  added++;
  console.log(`Added ${cal.id}: ${cal.name} → ${cal.target} (${cal.volume} m3)`);
});

fs.writeFileSync(flowPath, JSON.stringify(flow, null, 2) + '\n');
console.log(`\nDone. ${added} calibration node(s) added.`);
@@ -1,25 +0,0 @@
const fs = require("fs");
const flowPath = "docker/demo-flow.json";
const flow = JSON.parse(fs.readFileSync(flowPath, "utf8"));

let newFlow = flow.filter(n => n.id !== "demo_dbg_reactor_inspect");
const reactor = newFlow.find(n => n.id === "demo_reactor");
reactor.wires[0] = reactor.wires[0].filter(id => id !== "demo_dbg_reactor_inspect");

reactor.kla = 70;

newFlow.push({
  id: "demo_dbg_reactor_inspect",
  type: "function",
  z: "demo_tab_treatment",
  name: "Reactor State Inspector",
  func: 'if (msg.topic !== "GridProfile") return null;\nconst p = msg.payload;\nif (!p || !p.grid) return null;\nconst now = Date.now();\nif (global.get("lastInspect") && now - global.get("lastInspect") < 5000) return null;\nglobal.set("lastInspect", now);\nconst profile = p.grid.map((row, i) => "cell" + i + "(" + (i*p.d_x).toFixed(0) + "m): NH4=" + row[3].toFixed(2) + " DO=" + row[0].toFixed(2));\nnode.warn("GRID: " + profile.join(" | "));\nreturn null;',
  outputs: 1,
  x: 840,
  y: 320,
  wires: [[]]
});
reactor.wires[0].push("demo_dbg_reactor_inspect");

fs.writeFileSync(flowPath, JSON.stringify(newFlow, null, 2) + "\n");
console.log("kla:", reactor.kla, "X_A_init:", reactor.X_A_init);
@@ -1,72 +0,0 @@
#!/usr/bin/env node
|
|
||||||
/**
|
|
||||||
* Fix downstream pressure simulator ranges and add a monitoring debug node.
|
|
||||||
*
|
|
||||||
* Problems found:
|
|
||||||
* 1. Downstream pressure simulator range 0-5000 mbar is unrealistic.
|
|
||||||
* Real WWTP system backpressure: 800-1500 mbar (0.8-1.5 bar).
|
|
||||||
* The pump curve operates in 700-3900 mbar. With upstream ~300 mbar
|
|
||||||
* (hydrostatic from 3m basin) and downstream at 5000 mbar, the
|
|
||||||
* pressure differential pushes the curve to extreme predictions.
|
|
||||||
*
|
|
||||||
* 2. No way to see runtime state visually. We'll leave visual monitoring
|
|
||||||
 * to the Grafana/dashboard layer, but fix the root cause here.
 *
 * Fix: Set downstream pressure simulators to realistic ranges:
 * - West: o_min=800, o_max=1500, i_min=800, i_max=1500
 * - North: o_min=600, o_max=1200, i_min=600, i_max=1200
 * - South: o_min=500, o_max=1000, i_min=500, i_max=1000
 *
 * This keeps pressure differential in ~500-1200 mbar range,
 * well within the pump curve (700-3900 mbar).
 */

const fs = require('fs');
const path = require('path');

const flowPath = path.join(__dirname, '..', 'docker', 'demo-flow.json');
const flow = JSON.parse(fs.readFileSync(flowPath, 'utf8'));

let changes = 0;

// Fix downstream pressure simulator ranges
const pressureFixes = {
  'demo_meas_pt_w_down': { i_min: 800, i_max: 1500, o_min: 800, o_max: 1500 },
  'demo_meas_pt_n_down': { i_min: 600, i_max: 1200, o_min: 600, o_max: 1200 },
  'demo_meas_pt_s_down': { i_min: 500, i_max: 1000, o_min: 500, o_max: 1000 },
};

flow.forEach(node => {
  const fix = pressureFixes[node.id];
  if (fix) {
    const old = { i_min: node.i_min, i_max: node.i_max, o_min: node.o_min, o_max: node.o_max };
    Object.assign(node, fix);
    console.log(`Fixed ${node.id} "${node.name}":`);
    console.log(`  Was: i=[${old.i_min},${old.i_max}] o=[${old.o_min},${old.o_max}]`);
    console.log(`  Now: i=[${fix.i_min},${fix.i_max}] o=[${fix.o_min},${fix.o_max}]`);
    changes++;
  }
});

// Also fix upstream pressure ranges to match realistic hydrostatic range
// Basin level 0-4m → hydrostatic 0-392 mbar → use 0-500 mbar range
const upstreamFixes = {
  'demo_meas_pt_w_up': { i_min: 0, i_max: 500, o_min: 0, o_max: 500 },
  'demo_meas_pt_n_up': { i_min: 0, i_max: 400, o_min: 0, o_max: 400 },
  'demo_meas_pt_s_up': { i_min: 0, i_max: 300, o_min: 0, o_max: 300 },
};

flow.forEach(node => {
  const fix = upstreamFixes[node.id];
  if (fix) {
    const old = { i_min: node.i_min, i_max: node.i_max, o_min: node.o_min, o_max: node.o_max };
    Object.assign(node, fix);
    console.log(`Fixed ${node.id} "${node.name}":`);
    console.log(`  Was: i=[${old.i_min},${old.i_max}] o=[${old.o_min},${old.o_max}]`);
    console.log(`  Now: i=[${fix.i_min},${fix.i_max}] o=[${fix.o_min},${fix.o_max}]`);
    changes++;
  }
});

fs.writeFileSync(flowPath, JSON.stringify(flow, null, 2) + '\n');
console.log(`\nDone. ${changes} node(s) updated.`);
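The upstream-range comment in the script relies on a hydrostatic conversion (basin level 0-4 m giving roughly 0-392 mbar). A minimal sketch verifying that arithmetic, assuming water density 1000 kg/m3 and g = 9.81 m/s2:

```javascript
// Hydrostatic pressure sanity check: p = rho * g * h.
// 1 mbar = 100 Pa, so one metre of water head is 1000 * 9.81 / 100 ≈ 98.1 mbar.
const RHO = 1000;                       // kg/m3, density of water (assumed)
const G = 9.81;                         // m/s2
const mbarPerMetre = RHO * G / 100;     // Pa per metre, converted to mbar
const maxBasinLevel = 4;                // m, upper end of the basin level range
const maxHydrostatic = maxBasinLevel * mbarPerMetre;
console.log(maxHydrostatic.toFixed(1) + ' mbar'); // 392.4 mbar
```

This matches the "0-392 mbar" figure in the comment and motivates the 0-500 mbar sensor range with headroom.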
@@ -1,142 +0,0 @@
#!/usr/bin/env node
/**
 * Monitor WWTP system health and process state.
 * Captures PS volume, flow rates, pump states, and control actions.
 */

const http = require('http');
const { execSync } = require('child_process');

const NR_URL = 'http://localhost:1880';
const SAMPLE_INTERVAL = 5000;
const NUM_SAMPLES = 20; // 100 seconds

function getLogs(lines = 50) {
  try {
    return execSync('docker logs evolv-nodered --tail ' + lines + ' 2>&1', {
      encoding: 'utf8', timeout: 5000,
    });
  } catch (e) { return ''; }
}

function parseLogs(logs) {
  const result = { safety: [], pressure: 0, control: [], state: [], errors: [], flow: [] };
  logs.split('\n').forEach(line => {
    if (!line.trim()) return;

    const volMatch = line.match(/vol=([-\d.]+) m3.*remainingTime=([\w.]+)/);
    if (volMatch) {
      result.safety.push({ vol: parseFloat(volMatch[1]), remaining: volMatch[2] });
      return;
    }

    if (line.includes('Pressure change detected')) { result.pressure++; return; }

    if (line.includes('Controllevel') || line.includes('flowbased') || line.includes('control applying')) {
      result.control.push(line.trim().substring(0, 200));
      return;
    }

    if (line.includes('startup') || line.includes('shutdown') || line.includes('machine state') ||
        line.includes('Handling input') || line.includes('execSequence') || line.includes('execsequence')) {
      result.state.push(line.trim().substring(0, 200));
      return;
    }

    if (line.includes('[ERROR]') || line.includes('Error')) {
      result.errors.push(line.trim().substring(0, 200));
      return;
    }

    if (line.includes('netflow') || line.includes('Height') || line.includes('flow')) {
      result.flow.push(line.trim().substring(0, 200));
    }
  });
  return result;
}

(async () => {
  console.log('=== WWTP Health Monitor ===');
  console.log(`Sampling every ${SAMPLE_INTERVAL/1000}s for ${NUM_SAMPLES * SAMPLE_INTERVAL / 1000}s\n`);

  const history = [];

  for (let i = 0; i < NUM_SAMPLES; i++) {
    const elapsed = (i * SAMPLE_INTERVAL / 1000).toFixed(0);
    const logs = getLogs(40);
    const parsed = parseLogs(logs);

    console.log(`--- Sample ${i+1}/${NUM_SAMPLES} (t=${elapsed}s) ---`);

    // Safety status
    if (parsed.safety.length > 0) {
      const latest = parsed.safety[parsed.safety.length - 1];
      console.log(`  ⚠️ SAFETY: ${parsed.safety.length} triggers, vol=${latest.vol} m3`);
    } else {
      console.log('  ✅ SAFETY: OK');
    }

    // Pressure changes
    if (parsed.pressure > 0) {
      console.log(`  📊 PRESSURE: ${parsed.pressure} changes (sim active)`);
    }

    // Control actions
    if (parsed.control.length > 0) {
      parsed.control.slice(-3).forEach(c => console.log(`  🎛️ CONTROL: ${c}`));
    }

    // State changes
    if (parsed.state.length > 0) {
      parsed.state.slice(-3).forEach(s => console.log(`  🔄 STATE: ${s}`));
    }

    // Flow info
    if (parsed.flow.length > 0) {
      parsed.flow.slice(-2).forEach(f => console.log(`  💧 FLOW: ${f}`));
    }

    // Errors
    if (parsed.errors.length > 0) {
      parsed.errors.forEach(e => console.log(`  ❌ ERROR: ${e}`));
    }

    history.push({
      t: parseInt(elapsed),
      safety: parsed.safety.length,
      pressure: parsed.pressure,
      control: parsed.control.length,
      state: parsed.state.length,
      errors: parsed.errors.length,
    });

    console.log('');

    if (i < NUM_SAMPLES - 1) {
      await new Promise(r => setTimeout(r, SAMPLE_INTERVAL));
    }
  }

  // Summary
  console.log('\n=== Health Summary ===');
  const totalSafety = history.reduce((a, h) => a + h.safety, 0);
  const totalErrors = history.reduce((a, h) => a + h.errors, 0);
  const totalControl = history.reduce((a, h) => a + h.control, 0);
  const totalState = history.reduce((a, h) => a + h.state, 0);

  console.log(`Safety triggers: ${totalSafety} ${totalSafety === 0 ? '✅' : '⚠️'}`);
  console.log(`Errors: ${totalErrors} ${totalErrors === 0 ? '✅' : '❌'}`);
  console.log(`Control actions: ${totalControl}`);
  console.log(`State changes: ${totalState}`);

  if (totalSafety === 0 && totalErrors === 0) {
    console.log('\n🟢 SYSTEM HEALTHY');
  } else if (totalErrors > 0) {
    console.log('\n🔴 ERRORS DETECTED');
  } else {
    console.log('\n🟡 SAFETY ACTIVE (may be normal during startup)');
  }
})().catch(err => {
  console.error('Monitor failed:', err);
  process.exit(1);
});
@@ -1,158 +0,0 @@
#!/usr/bin/env node
/**
 * Monitor WWTP runtime via Node-RED debug WebSocket and container logs.
 * Captures process data every few seconds and displays trends.
 */

const http = require('http');
const { execSync } = require('child_process');

const NR_URL = 'http://localhost:1880';
const SAMPLE_INTERVAL = 5000; // ms
const NUM_SAMPLES = 12; // 60 seconds total

function fetchJSON(url) {
  return new Promise((resolve, reject) => {
    http.get(url, res => {
      const chunks = [];
      res.on('data', c => chunks.push(c));
      res.on('end', () => {
        try { resolve(JSON.parse(Buffer.concat(chunks))); }
        catch (e) { reject(new Error('Parse: ' + e.message)); }
      });
    }).on('error', reject);
  });
}

function getRecentLogs(lines = 50) {
  try {
    return execSync('docker logs evolv-nodered --tail ' + lines + ' 2>&1', {
      encoding: 'utf8',
      timeout: 5000,
    });
  } catch (e) {
    return 'Failed to get logs: ' + e.message;
  }
}

function parseSafeGuardLogs(logs) {
  const lines = logs.split('\n');
  const safeGuards = [];
  const pressures = [];
  const others = [];

  lines.forEach(line => {
    const volMatch = line.match(/Safe guard triggered: vol=([-\d.]+) m3/);
    if (volMatch) {
      safeGuards.push(parseFloat(volMatch[1]));
    }
    const pressMatch = line.match(/New f =([\d.]+) is constrained/);
    if (pressMatch) {
      pressures.push(parseFloat(pressMatch[1]));
    }
    if (line.includes('_controlLevelBased') || line.includes('Mode changed') ||
        line.includes('execSequence') || line.includes('startup') ||
        line.includes('shutdown') || line.includes('setMode')) {
      others.push(line.trim().substring(0, 200));
    }
  });

  return { safeGuards, pressures, others };
}

(async () => {
  console.log('=== WWTP Runtime Monitor ===');
  console.log('Capturing ' + NUM_SAMPLES + ' samples at ' + (SAMPLE_INTERVAL/1000) + 's intervals\n');

  // Wait for nodes to initialize after deploy
  console.log('Waiting 10s for nodes to initialize...\n');
  await new Promise(r => setTimeout(r, 10000));

  for (let i = 0; i < NUM_SAMPLES; i++) {
    const elapsed = (i * SAMPLE_INTERVAL / 1000 + 10).toFixed(0);
    console.log('--- Sample ' + (i+1) + '/' + NUM_SAMPLES + ' (t=' + elapsed + 's after deploy) ---');

    // Capture container logs (last 30 lines since last sample)
    const logs = getRecentLogs(30);
    const parsed = parseSafeGuardLogs(logs);

    if (parsed.safeGuards.length > 0) {
      const latest = parsed.safeGuards[parsed.safeGuards.length - 1];
      const trend = parsed.safeGuards.length > 1
        ? (parsed.safeGuards[parsed.safeGuards.length-1] - parsed.safeGuards[0] > 0 ? 'RISING' : 'FALLING')
        : 'STABLE';
      console.log('  SAFETY: vol=' + latest.toFixed(2) + ' m3 (' + parsed.safeGuards.length + ' triggers, ' + trend + ')');
    } else {
      console.log('  SAFETY: No safe guard triggers (GOOD)');
    }

    if (parsed.pressures.length > 0) {
      const avg = parsed.pressures.reduce((a,b) => a+b, 0) / parsed.pressures.length;
      console.log('  PRESSURE CLAMP: avg f=' + avg.toFixed(0) + ' (' + parsed.pressures.length + ' warnings)');
    } else {
      console.log('  PRESSURE: No interpolation warnings (GOOD)');
    }

    if (parsed.others.length > 0) {
      console.log('  CONTROL: ' + parsed.others.slice(-3).join('\n    '));
    }

    // Check if there are state change or mode messages
    const logLines = logs.split('\n');
    const stateChanges = logLines.filter(l =>
      l.includes('machine state') || l.includes('State:') ||
      l.includes('draining') || l.includes('filling') ||
      l.includes('q_in') || l.includes('netFlow')
    );
    if (stateChanges.length > 0) {
      console.log('  STATE: ' + stateChanges.slice(-3).map(s => s.trim().substring(0, 150)).join('\n    '));
    }

    console.log('');

    if (i < NUM_SAMPLES - 1) {
      await new Promise(r => setTimeout(r, SAMPLE_INTERVAL));
    }
  }

  // Final log dump
  console.log('\n=== Final Log Analysis (last 200 lines) ===');
  const finalLogs = getRecentLogs(200);
  const finalParsed = parseSafeGuardLogs(finalLogs);

  console.log('Safe guard triggers: ' + finalParsed.safeGuards.length);
  if (finalParsed.safeGuards.length > 0) {
    console.log('  First vol: ' + finalParsed.safeGuards[0].toFixed(2) + ' m3');
    console.log('  Last vol: ' + finalParsed.safeGuards[finalParsed.safeGuards.length-1].toFixed(2) + ' m3');
    const delta = finalParsed.safeGuards[finalParsed.safeGuards.length-1] - finalParsed.safeGuards[0];
    console.log('  Delta: ' + (delta > 0 ? '+' : '') + delta.toFixed(2) + ' m3 (' + (delta > 0 ? 'RECOVERING' : 'STILL DRAINING') + ')');
  }

  console.log('Pressure clamp warnings: ' + finalParsed.pressures.length);
  if (finalParsed.pressures.length > 0) {
    const min = Math.min(...finalParsed.pressures);
    const max = Math.max(...finalParsed.pressures);
    console.log('  Range: ' + min.toFixed(0) + ' - ' + max.toFixed(0));
  }

  console.log('\nControl events: ' + finalParsed.others.length);
  finalParsed.others.slice(-10).forEach(l => console.log('  ' + l));

  // Overall assessment
  console.log('\n=== ASSESSMENT ===');
  if (finalParsed.safeGuards.length === 0 && finalParsed.pressures.length === 0) {
    console.log('HEALTHY: No safety triggers, no pressure warnings');
  } else if (finalParsed.safeGuards.length > 0) {
    const trend = finalParsed.safeGuards[finalParsed.safeGuards.length-1] - finalParsed.safeGuards[0];
    if (trend > 0) {
      console.log('RECOVERING: Volume rising but still negative');
    } else {
      console.log('CRITICAL: Volume still dropping - control issue persists');
    }
  } else if (finalParsed.pressures.length > 0) {
    console.log('WARNING: Pressure values exceeding curve bounds');
  }
})().catch(err => {
  console.error('Monitor failed:', err);
  process.exit(1);
});
20 scripts/patch-deps.js Normal file
@@ -0,0 +1,20 @@
/**
 * Preinstall script: rewrites the generalFunctions dependency
 * from git+https to a local file path when the submodule exists.
 * This avoids needing Gitea credentials during npm install.
 */
const fs = require('fs');
const path = require('path');

const pkgPath = path.join(__dirname, '..', 'package.json');
const localGF = path.join(__dirname, '..', 'nodes', 'generalFunctions');

if (fs.existsSync(localGF) && fs.existsSync(path.join(localGF, 'index.js'))) {
  const pkg = JSON.parse(fs.readFileSync(pkgPath, 'utf8'));
  if (pkg.dependencies && pkg.dependencies.generalFunctions &&
      pkg.dependencies.generalFunctions.startsWith('git+')) {
    pkg.dependencies.generalFunctions = 'file:./nodes/generalFunctions';
    fs.writeFileSync(pkgPath, JSON.stringify(pkg, null, 2) + '\n');
    console.log('[patch-deps] Rewrote generalFunctions to local path');
  }
}
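The rewrite that patch-deps.js performs can be illustrated with a minimal sketch. The git URL below is a placeholder for whatever remote the real package.json points at, not a value taken from this repository:

```javascript
// Sketch of the dependency rewrite done by scripts/patch-deps.js.
// The git+https URL is hypothetical; only the 'git+' prefix matters.
const pkg = {
  dependencies: {
    generalFunctions: 'git+https://git.example.com/org/generalFunctions.git',
  },
};

if (pkg.dependencies.generalFunctions.startsWith('git+')) {
  // Point npm at the checked-out submodule instead of the remote
  pkg.dependencies.generalFunctions = 'file:./nodes/generalFunctions';
}

console.log(pkg.dependencies.generalFunctions); // file:./nodes/generalFunctions
```

Because it runs as a preinstall hook, `npm install` then resolves the package locally and never prompts for Gitea credentials.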
@@ -1,184 +0,0 @@
#!/usr/bin/env node
/**
 * Patch demo-flow.json:
 * 1. Fix NH4 chart — remove demo_link_meas_dash from new NH4 nodes
 * 2. Update parse function — use "NH4 @ Xm" label format
 * 3. Reorganize entire treatment tab — logical left-to-right layout
 */

const fs = require('fs');
const path = require('path');

const flowPath = path.join(__dirname, '..', 'docker', 'demo-flow.json');
const flow = JSON.parse(fs.readFileSync(flowPath, 'utf8'));

const find = (id) => flow.find(n => n.id === id);

// ============================================================
// 1. FIX NH4 CHART WIRING
// Remove demo_link_meas_dash from the 4 new NH4 nodes.
// They should only go to process link + NH4 profile link.
// ============================================================

const newNh4Ids = ['demo_meas_nh4_in', 'demo_meas_nh4_a', 'demo_meas_nh4_b', 'demo_meas_nh4_c'];
for (const id of newNh4Ids) {
  const n = find(id);
  if (n) {
    n.wires[0] = n.wires[0].filter(w => w !== 'demo_link_meas_dash');
    console.log(`  ${id} Port 0 wires: ${JSON.stringify(n.wires[0])}`);
  }
}
console.log('1. Fixed: removed demo_link_meas_dash from new NH4 nodes');

// ============================================================
// 2. UPDATE PARSE FUNCTION — "NH4 @ Xm" format
// Also make it generic: read distance from payload metadata
// if available, fall back to topic matching.
// ============================================================

const parseFn = find('demo_fn_nh4_profile_parse');
if (parseFn) {
  parseFn.func = `const p = msg.payload || {};
const topic = msg.topic || '';
const now = Date.now();
const val = Number(p.mAbs);
if (!Number.isFinite(val)) return null;

// Build label from distance metadata if available, else match by tag
const dist = p.distance;
const tag = p.assetTagNumber || topic;
let label;
if (dist !== undefined && dist !== null) {
  label = 'NH4 @ ' + dist + 'm';
} else if (tag.includes('NH4-IN')) label = 'NH4 @ 0m';
else if (tag.includes('NH4-A')) label = 'NH4 @ 10m';
else if (tag.includes('NH4-B')) label = 'NH4 @ 25m';
else if (tag.includes('NH4-001')) label = 'NH4 @ 35m';
else if (tag.includes('NH4-C')) label = 'NH4 @ 45m';
else label = 'NH4 @ ?m';

return { topic: label, payload: Math.round(val * 100) / 100, timestamp: now };`;
  console.log('2. Updated NH4 profile parse function to "NH4 @ Xm" format');
}

// ============================================================
// 3. REORGANIZE TREATMENT TAB LAYOUT
//
// Logical left-to-right process flow:
//
// Col 1 (x=80): Comments / section headers
// Col 2 (x=200): Injects (reactor tick, monster flow)
// Col 3 (x=420): Inlet measurements (flow, DO, NH4 profile)
// Col 4 (x=640): Link outs (meas dash, NH4 profile dash)
// Col 5 (x=820): Reactor
// Col 6 (x=1060): Settler
// Col 7 (x=1280): Effluent measurements
// Col 8 (x=1500): Effluent link outs
//
// Row zones (y):
// Row A (y=40): Section comment
// Row B (y=100-440): Main process: reactor measurements → reactor → settler
// Row C (y=500-700): Effluent measurements (downstream of settler)
// Row D (y=760-900): RAS recycle loop (below main flow)
// Row E (y=960-1120): Merge collection / influent composition
//
// ============================================================

const layout = {
  // ── SECTION COMMENT ──
  'demo_comment_treatment': { x: 80, y: 40 },

  // ── INJECTS ──
  'demo_inj_reactor_tick': { x: 200, y: 120 },
  'demo_inj_monster_flow': { x: 200, y: 560 },

  // ── INLET MEASUREMENTS (column, spaced 60px) ──
  'demo_meas_flow': { x: 420, y: 100 },   // FT-001 flow
  'demo_meas_do': { x: 420, y: 160 },     // DO-001
  'demo_meas_nh4_in': { x: 420, y: 220 }, // NH4-IN 0m
  'demo_meas_nh4_a': { x: 420, y: 280 },  // NH4-A 10m
  'demo_meas_nh4': { x: 420, y: 340 },    // NH4-001 35m (existing, keep between A & B for distance order — wait, 25m < 35m)
  'demo_meas_nh4_b': { x: 420, y: 400 },  // NH4-B 25m
  'demo_meas_nh4_c': { x: 420, y: 460 },  // NH4-C 45m

  // ── LINK OUTS (from measurements) ──
  'demo_link_meas_dash': { x: 640, y: 130 },
  'demo_link_nh4_profile_dash': { x: 640, y: 340 },

  // ── REACTOR ──
  'demo_reactor': { x: 820, y: 220 },

  // ── REACTOR LINK OUTS ──
  'demo_link_reactor_dash': { x: 1020, y: 180 },
  'demo_link_overview_reactor_out': { x: 1020, y: 220 },

  // ── SETTLER ──
  'demo_settler': { x: 1060, y: 320 },

  // ── SHARED LINK OUTS (process + influx) ──
  'demo_link_influx_out_treatment': { x: 1020, y: 260 },
  'demo_link_process_out_treatment': { x: 1020, y: 300 },

  // ── EFFLUENT SECTION ──
  'demo_comment_effluent_meas': { x: 80, y: 520 },
  'demo_meas_eff_flow': { x: 1280, y: 320 },
  'demo_meas_eff_do': { x: 1280, y: 380 },
  'demo_meas_eff_nh4': { x: 1280, y: 440 },
  'demo_meas_eff_no3': { x: 1280, y: 500 },
  'demo_meas_eff_tss': { x: 1280, y: 560 },
  'demo_link_eff_meas_dash': { x: 1500, y: 440 },
  'demo_link_overview_eff_out': { x: 1500, y: 500 },

  // ── MONSTER (downstream of settler, parallel to effluent meas) ──
  'demo_monster': { x: 1060, y: 440 },
  'demo_fn_monster_flow': { x: 400, y: 560 },

  // ── RAS RECYCLE LOOP (below main process) ──
  'demo_fn_ras_filter': { x: 1060, y: 760 },
  'demo_pump_ras': { x: 1280, y: 760 },
  'demo_meas_ft_ras': { x: 1500, y: 760 },
  'demo_inj_ras_mode': { x: 1280, y: 820 },
  'demo_inj_ras_speed': { x: 1280, y: 880 },
  'demo_comment_pressure': { x: 80, y: 740 },

  // ── MERGE COLLECTION (bottom section) ──
  'demo_comment_merge': { x: 80, y: 960 },
  'demo_link_merge_west_in': { x: 100, y: 1000 },
  'demo_link_merge_north_in': { x: 100, y: 1060 },
  'demo_link_merge_south_in': { x: 100, y: 1120 },
  'demo_fn_tag_west': { x: 300, y: 1000 },
  'demo_fn_tag_north': { x: 300, y: 1060 },
  'demo_fn_tag_south': { x: 300, y: 1120 },
  'demo_fn_merge_collect': { x: 520, y: 1060 },
  'demo_link_merge_dash': { x: 720, y: 1020 },
  'demo_fn_influent_compose': { x: 720, y: 1100 },
};

// Sort NH4 measurements by distance for visual order
// NH4-IN=0m, NH4-A=10m, NH4-B=25m, NH4-001=35m, NH4-C=45m
// Adjust y to be in distance order:
layout['demo_meas_nh4_in'] = { x: 420, y: 220 }; // 0m
layout['demo_meas_nh4_a'] = { x: 420, y: 280 };  // 10m
layout['demo_meas_nh4_b'] = { x: 420, y: 340 };  // 25m
layout['demo_meas_nh4'] = { x: 420, y: 400 };    // 35m
layout['demo_meas_nh4_c'] = { x: 420, y: 460 };  // 45m

let moved = 0;
for (const [id, pos] of Object.entries(layout)) {
  const n = find(id);
  if (n) {
    n.x = pos.x;
    n.y = pos.y;
    moved++;
  } else {
    console.warn(`  WARN: node ${id} not found`);
  }
}
console.log(`3. Repositioned ${moved} nodes on treatment tab`);

// ============================================================
// WRITE OUTPUT
// ============================================================

fs.writeFileSync(flowPath, JSON.stringify(flow, null, 2) + '\n', 'utf8');
console.log(`\nDone. Wrote ${flow.length} nodes to ${flowPath}`);
@@ -1,455 +0,0 @@
#!/usr/bin/env node
/**
 * Patch demo-flow.json:
 * Phase A: Add 4 NH4 measurement nodes + ui-group + ui-chart
 * Phase B: Add influent composer function node + wire merge collector
 * Phase C: Fix biomass init on reactor
 * Phase D: Add RAS pump, flow sensor, 2 injects, filter function + wiring
 */

const fs = require('fs');
const path = require('path');

const flowPath = path.join(__dirname, '..', 'docker', 'demo-flow.json');
const flow = JSON.parse(fs.readFileSync(flowPath, 'utf8'));

// Helper: find node by id
const findNode = (id) => flow.find(n => n.id === id);

// ============================================================
// PHASE A: Add 4 NH4 measurement nodes + ui-group + ui-chart
// ============================================================

const nh4Measurements = [
  {
    id: 'demo_meas_nh4_in',
    name: 'NH4-IN (Ammonium Inlet)',
    uuid: 'nh4-in-001',
    assetTagNumber: 'NH4-IN',
    distance: 0,
    distanceDescription: 'reactor inlet',
    y: 280
  },
  {
    id: 'demo_meas_nh4_a',
    name: 'NH4-A (Early Aeration)',
    uuid: 'nh4-a-001',
    assetTagNumber: 'NH4-A',
    distance: 10,
    distanceDescription: 'early aeration zone',
    y: 320
  },
  {
    id: 'demo_meas_nh4_b',
    name: 'NH4-B (Mid-Reactor)',
    uuid: 'nh4-b-001',
    assetTagNumber: 'NH4-B',
    distance: 25,
    distanceDescription: 'mid-reactor',
    y: 360
  },
  {
    id: 'demo_meas_nh4_c',
    name: 'NH4-C (Near Outlet)',
    uuid: 'nh4-c-001',
    assetTagNumber: 'NH4-C',
    distance: 45,
    distanceDescription: 'near outlet',
    y: 400
  }
];

for (const m of nh4Measurements) {
  flow.push({
    id: m.id,
    type: 'measurement',
    z: 'demo_tab_treatment',
    name: m.name,
    scaling: true,
    i_min: 0,
    i_max: 50,
    i_offset: 0,
    o_min: 0,
    o_max: 50,
    smooth_method: 'mean',
    count: 3,
    simulator: true,
    uuid: m.uuid,
    supplier: 'Hach',
    category: 'sensor',
    assetType: 'ammonium',
    model: 'Amtax-sc',
    unit: 'mg/L',
    assetTagNumber: m.assetTagNumber,
    enableLog: false,
    logLevel: 'error',
    positionVsParent: 'atEquipment',
    x: 400,
    y: m.y,
    wires: [
      ['demo_link_meas_dash', 'demo_link_process_out_treatment'],
      ['demo_link_influx_out_treatment'],
      ['demo_reactor']
    ],
    positionIcon: '⊥',
    hasDistance: true,
    distance: m.distance,
    distanceUnit: 'm',
    distanceDescription: m.distanceDescription
  });
}

// NH4 profile ui-group
flow.push({
  id: 'demo_ui_grp_nh4_profile',
  type: 'ui-group',
  name: 'NH4 Profile Along Reactor',
  page: 'demo_ui_page_treatment',
  width: '6',
  height: '1',
  order: 6,
  showTitle: true,
  className: ''
});

// NH4 profile chart
flow.push({
  id: 'demo_chart_nh4_profile',
  type: 'ui-chart',
  z: 'demo_tab_dashboard',
  group: 'demo_ui_grp_nh4_profile',
  name: 'NH4 Profile',
  label: 'NH4 Along Reactor (mg/L)',
  order: 1,
  width: '6',
  height: '5',
  chartType: 'line',
  category: 'topic',
  categoryType: 'msg',
  xAxisType: 'time',
  yAxisLabel: 'mg/L',
  removeOlder: '10',
  removeOlderUnit: '60',
  action: 'append',
  pointShape: 'false',
  pointRadius: 0,
  interpolation: 'linear',
  x: 510,
  y: 1060,
  wires: [],
  showLegend: true,
  xAxisProperty: '',
  xAxisPropertyType: 'timestamp',
  yAxisProperty: 'payload',
  yAxisPropertyType: 'msg',
  colors: [
    '#0094ce',
    '#FF7F0E',
    '#2CA02C',
    '#D62728',
    '#A347E1',
    '#D62728',
    '#FF9896',
    '#9467BD',
    '#C5B0D5'
  ],
  textColor: ['#aaaaaa'],
  textColorDefault: false,
  gridColor: ['#333333'],
  gridColorDefault: false,
  className: ''
});

// Link out + link in for NH4 profile chart
flow.push({
  id: 'demo_link_nh4_profile_dash',
  type: 'link out',
  z: 'demo_tab_treatment',
  name: '→ NH4 Profile Dashboard',
  mode: 'link',
  links: ['demo_link_nh4_profile_dash_in'],
  x: 620,
  y: 340
});

flow.push({
  id: 'demo_link_nh4_profile_dash_in',
  type: 'link in',
  z: 'demo_tab_dashboard',
  name: '← NH4 Profile',
  links: ['demo_link_nh4_profile_dash'],
  x: 75,
  y: 1060,
  wires: [['demo_fn_nh4_profile_parse']]
});

// Parse function for NH4 profile chart
flow.push({
  id: 'demo_fn_nh4_profile_parse',
  type: 'function',
  z: 'demo_tab_dashboard',
  name: 'Parse NH4 Profile',
  func: `const p = msg.payload || {};
const topic = msg.topic || '';
const now = Date.now();
const val = Number(p.mAbs);
if (!Number.isFinite(val)) return null;

let label = topic;
if (topic.includes('NH4-IN')) label = 'NH4-IN (0m)';
else if (topic.includes('NH4-A')) label = 'NH4-A (10m)';
else if (topic.includes('NH4-B')) label = 'NH4-B (25m)';
else if (topic.includes('NH4-001')) label = 'NH4-001 (35m)';
else if (topic.includes('NH4-C')) label = 'NH4-C (45m)';

return { topic: label, payload: Math.round(val * 100) / 100, timestamp: now };`,
  outputs: 1,
  x: 280,
  y: 1060,
  wires: [['demo_chart_nh4_profile']]
});

// Wire existing NH4-001 and new NH4 measurements to the profile link out
const existingNh4 = findNode('demo_meas_nh4');
if (existingNh4) {
  if (!existingNh4.wires[0].includes('demo_link_nh4_profile_dash')) {
    existingNh4.wires[0].push('demo_link_nh4_profile_dash');
  }
}
for (const m of nh4Measurements) {
  const node = findNode(m.id);
  if (node && !node.wires[0].includes('demo_link_nh4_profile_dash')) {
    node.wires[0].push('demo_link_nh4_profile_dash');
  }
}

console.log('Phase A: Added 4 NH4 measurements + ui-group + chart + wiring');

// ============================================================
// PHASE B: Add influent composer + wire merge collector
// ============================================================

flow.push({
  id: 'demo_fn_influent_compose',
  type: 'function',
  z: 'demo_tab_treatment',
  name: 'Influent Composer',
  func: `// Convert merge collector output to Fluent messages for reactor
// ASM3: [S_O, S_I, S_S, S_NH, S_N2, S_NO, S_HCO, X_I, X_S, X_H, X_STO, X_A, X_TS]
const p = msg.payload || {};
const MUNICIPAL = [0.5, 30, 200, 40, 0, 0, 5, 25, 150, 30, 0, 0, 200];
const INDUSTRIAL = [0.5, 40, 300, 25, 0, 0, 4, 30, 100, 20, 0, 0, 150];
const RESIDENTIAL = [0.5, 25, 180, 45, 0, 0, 5, 20, 130, 25, 0, 0, 175];

const Fw = (p.west?.netFlow || 0) * 24; // m3/h -> m3/d
const Fn = (p.north?.netFlow || 0) * 24;
const Fs = (p.south?.netFlow || 0) * 24;

const msgs = [];
if (Fw > 0) msgs.push({ topic: 'Fluent', payload: { inlet: 0, F: Fw, C: MUNICIPAL }});
if (Fn > 0) msgs.push({ topic: 'Fluent', payload: { inlet: 1, F: Fn, C: INDUSTRIAL }});
if (Fs > 0) msgs.push({ topic: 'Fluent', payload: { inlet: 2, F: Fs, C: RESIDENTIAL }});
return [msgs];`,
  outputs: 1,
  x: 480,
  y: 1040,
  wires: [['demo_reactor']]
});

// Wire merge collector → influent composer (add to existing wires)
const mergeCollect = findNode('demo_fn_merge_collect');
if (mergeCollect) {
  if (!mergeCollect.wires[0].includes('demo_fn_influent_compose')) {
    mergeCollect.wires[0].push('demo_fn_influent_compose');
  }
  console.log('Phase B: Wired merge collector → influent composer → reactor');
} else {
  console.error('Phase B: ERROR — demo_fn_merge_collect not found!');
}

// ============================================================
// PHASE C: Fix biomass initialization
// ============================================================

const reactor = findNode('demo_reactor');
if (reactor) {
  reactor.X_A_init = 300;
  reactor.X_H_init = 1500;
  reactor.X_TS_init = 2500;
  reactor.S_HCO_init = 8;
  console.log('Phase C: Updated reactor biomass init values');
} else {
  console.error('Phase C: ERROR — demo_reactor not found!');
}

// ============================================================
// PHASE D: Return Activated Sludge
// ============================================================

// D1: RAS pump
flow.push({
  id: 'demo_pump_ras',
  type: 'rotatingMachine',
  z: 'demo_tab_treatment',
|
||||||
distanceDescription: 'early aeration zone',
|
|
||||||
y: 320
|
|
||||||
},
|
|
||||||
{
|
|
||||||
id: 'demo_meas_nh4_b',
|
|
||||||
name: 'NH4-B (Mid-Reactor)',
|
|
||||||
uuid: 'nh4-b-001',
|
|
||||||
assetTagNumber: 'NH4-B',
|
|
||||||
distance: 25,
|
|
||||||
distanceDescription: 'mid-reactor',
|
|
||||||
y: 360
|
|
||||||
},
|
|
||||||
{
|
|
||||||
id: 'demo_meas_nh4_c',
|
|
||||||
name: 'NH4-C (Near Outlet)',
|
|
||||||
uuid: 'nh4-c-001',
|
|
||||||
assetTagNumber: 'NH4-C',
|
|
||||||
distance: 45,
|
|
||||||
distanceDescription: 'near outlet',
|
|
||||||
y: 400
|
|
||||||
}
|
|
||||||
];
|
|
||||||
|
|
||||||
for (const m of nh4Measurements) {
|
|
||||||
flow.push({
|
|
||||||
id: m.id,
|
|
||||||
type: 'measurement',
|
|
||||||
z: 'demo_tab_treatment',
|
|
||||||
name: m.name,
|
|
||||||
scaling: true,
|
|
||||||
i_min: 0,
|
|
||||||
i_max: 50,
|
|
||||||
i_offset: 0,
|
|
||||||
o_min: 0,
|
|
||||||
o_max: 50,
|
|
||||||
smooth_method: 'mean',
|
|
||||||
count: 3,
|
|
||||||
simulator: true,
|
|
||||||
uuid: m.uuid,
|
|
||||||
supplier: 'Hach',
|
|
||||||
category: 'sensor',
|
|
||||||
assetType: 'ammonium',
|
|
||||||
model: 'Amtax-sc',
|
|
||||||
unit: 'mg/L',
|
|
||||||
assetTagNumber: m.assetTagNumber,
|
|
||||||
enableLog: false,
|
|
||||||
logLevel: 'error',
|
|
||||||
positionVsParent: 'atEquipment',
|
|
||||||
x: 400,
|
|
||||||
y: m.y,
|
|
||||||
wires: [
|
|
||||||
['demo_link_meas_dash', 'demo_link_process_out_treatment'],
|
|
||||||
['demo_link_influx_out_treatment'],
|
|
||||||
['demo_reactor']
|
|
||||||
],
|
|
||||||
positionIcon: '⊥',
|
|
||||||
hasDistance: true,
|
|
||||||
distance: m.distance,
|
|
||||||
distanceUnit: 'm',
|
|
||||||
distanceDescription: m.distanceDescription
|
|
||||||
});
|
|
||||||
}
|
|
||||||
|
|
||||||
// NH4 profile ui-group
|
|
||||||
flow.push({
|
|
||||||
id: 'demo_ui_grp_nh4_profile',
|
|
||||||
type: 'ui-group',
|
|
||||||
name: 'NH4 Profile Along Reactor',
|
|
||||||
page: 'demo_ui_page_treatment',
|
|
||||||
width: '6',
|
|
||||||
height: '1',
|
|
||||||
order: 6,
|
|
||||||
showTitle: true,
|
|
||||||
className: ''
|
|
||||||
});
|
|
||||||
|
|
||||||
// NH4 profile chart
|
|
||||||
flow.push({
|
|
||||||
id: 'demo_chart_nh4_profile',
|
|
||||||
type: 'ui-chart',
|
|
||||||
z: 'demo_tab_dashboard',
|
|
||||||
group: 'demo_ui_grp_nh4_profile',
|
|
||||||
name: 'NH4 Profile',
|
|
||||||
label: 'NH4 Along Reactor (mg/L)',
|
|
||||||
order: 1,
|
|
||||||
width: '6',
|
|
||||||
height: '5',
|
|
||||||
chartType: 'line',
|
|
||||||
category: 'topic',
|
|
||||||
categoryType: 'msg',
|
|
||||||
xAxisType: 'time',
|
|
||||||
yAxisLabel: 'mg/L',
|
|
||||||
removeOlder: '10',
|
|
||||||
removeOlderUnit: '60',
|
|
||||||
action: 'append',
|
|
||||||
pointShape: 'false',
|
|
||||||
pointRadius: 0,
|
|
||||||
interpolation: 'linear',
|
|
||||||
x: 510,
|
|
||||||
y: 1060,
|
|
||||||
wires: [],
|
|
||||||
showLegend: true,
|
|
||||||
xAxisProperty: '',
|
|
||||||
xAxisPropertyType: 'timestamp',
|
|
||||||
yAxisProperty: 'payload',
|
|
||||||
yAxisPropertyType: 'msg',
|
|
||||||
colors: [
|
|
||||||
'#0094ce',
|
|
||||||
'#FF7F0E',
|
|
||||||
'#2CA02C',
|
|
||||||
'#D62728',
|
|
||||||
'#A347E1',
|
|
||||||
'#D62728',
|
|
||||||
'#FF9896',
|
|
||||||
'#9467BD',
|
|
||||||
'#C5B0D5'
|
|
||||||
],
|
|
||||||
textColor: ['#aaaaaa'],
|
|
||||||
textColorDefault: false,
|
|
||||||
gridColor: ['#333333'],
|
|
||||||
gridColorDefault: false,
|
|
||||||
className: ''
|
|
||||||
});
|
|
||||||
|
|
||||||
// Link out + link in for NH4 profile chart
|
|
||||||
flow.push({
|
|
||||||
id: 'demo_link_nh4_profile_dash',
|
|
||||||
type: 'link out',
|
|
||||||
z: 'demo_tab_treatment',
|
|
||||||
name: '→ NH4 Profile Dashboard',
|
|
||||||
mode: 'link',
|
|
||||||
links: ['demo_link_nh4_profile_dash_in'],
|
|
||||||
x: 620,
|
|
||||||
y: 340
|
|
||||||
});
|
|
||||||
|
|
||||||
flow.push({
|
|
||||||
id: 'demo_link_nh4_profile_dash_in',
|
|
||||||
type: 'link in',
|
|
||||||
z: 'demo_tab_dashboard',
|
|
||||||
name: '← NH4 Profile',
|
|
||||||
links: ['demo_link_nh4_profile_dash'],
|
|
||||||
x: 75,
|
|
||||||
y: 1060,
|
|
||||||
wires: [['demo_fn_nh4_profile_parse']]
|
|
||||||
});
|
|
||||||
|
|
||||||
// Parse function for NH4 profile chart
|
|
||||||
flow.push({
|
|
||||||
id: 'demo_fn_nh4_profile_parse',
|
|
||||||
type: 'function',
|
|
||||||
z: 'demo_tab_dashboard',
|
|
||||||
name: 'Parse NH4 Profile',
|
|
||||||
func: `const p = msg.payload || {};
|
|
||||||
const topic = msg.topic || '';
|
|
||||||
const now = Date.now();
|
|
||||||
const val = Number(p.mAbs);
|
|
||||||
if (!Number.isFinite(val)) return null;
|
|
||||||
|
|
||||||
let label = topic;
|
|
||||||
if (topic.includes('NH4-IN')) label = 'NH4-IN (0m)';
|
|
||||||
else if (topic.includes('NH4-A')) label = 'NH4-A (10m)';
|
|
||||||
else if (topic.includes('NH4-B')) label = 'NH4-B (25m)';
|
|
||||||
else if (topic.includes('NH4-001')) label = 'NH4-001 (35m)';
|
|
||||||
else if (topic.includes('NH4-C')) label = 'NH4-C (45m)';
return { topic: label, payload: Math.round(val * 100) / 100, timestamp: now };`,
  outputs: 1,
  x: 280,
  y: 1060,
  wires: [['demo_chart_nh4_profile']]
});

// Wire existing NH4-001 and new NH4 measurements to the profile link out
const existingNh4 = findNode('demo_meas_nh4');
if (existingNh4) {
  if (!existingNh4.wires[0].includes('demo_link_nh4_profile_dash')) {
    existingNh4.wires[0].push('demo_link_nh4_profile_dash');
  }
}
for (const m of nh4Measurements) {
  const node = findNode(m.id);
  if (node && !node.wires[0].includes('demo_link_nh4_profile_dash')) {
    node.wires[0].push('demo_link_nh4_profile_dash');
  }
}

console.log('Phase A: Added 4 NH4 measurements + ui-group + chart + wiring');

// ============================================================
// PHASE B: Add influent composer + wire merge collector
// ============================================================

flow.push({
  id: 'demo_fn_influent_compose',
  type: 'function',
  z: 'demo_tab_treatment',
  name: 'Influent Composer',
  func: `// Convert merge collector output to Fluent messages for reactor
// ASM3: [S_O, S_I, S_S, S_NH, S_N2, S_NO, S_HCO, X_I, X_S, X_H, X_STO, X_A, X_TS]
const p = msg.payload || {};
const MUNICIPAL = [0.5, 30, 200, 40, 0, 0, 5, 25, 150, 30, 0, 0, 200];
const INDUSTRIAL = [0.5, 40, 300, 25, 0, 0, 4, 30, 100, 20, 0, 0, 150];
const RESIDENTIAL = [0.5, 25, 180, 45, 0, 0, 5, 20, 130, 25, 0, 0, 175];

const Fw = (p.west?.netFlow || 0) * 24; // m3/h -> m3/d
const Fn = (p.north?.netFlow || 0) * 24;
const Fs = (p.south?.netFlow || 0) * 24;

const msgs = [];
if (Fw > 0) msgs.push({ topic: 'Fluent', payload: { inlet: 0, F: Fw, C: MUNICIPAL }});
if (Fn > 0) msgs.push({ topic: 'Fluent', payload: { inlet: 1, F: Fn, C: INDUSTRIAL }});
if (Fs > 0) msgs.push({ topic: 'Fluent', payload: { inlet: 2, F: Fs, C: RESIDENTIAL }});
return [msgs];`,
  outputs: 1,
  x: 480,
  y: 1040,
  wires: [['demo_reactor']]
});

// Wire merge collector → influent composer (add to existing wires)
const mergeCollect = findNode('demo_fn_merge_collect');
if (mergeCollect) {
  if (!mergeCollect.wires[0].includes('demo_fn_influent_compose')) {
    mergeCollect.wires[0].push('demo_fn_influent_compose');
  }
  console.log('Phase B: Wired merge collector → influent composer → reactor');
} else {
  console.error('Phase B: ERROR — demo_fn_merge_collect not found!');
}

// ============================================================
// PHASE C: Fix biomass initialization
// ============================================================

const reactor = findNode('demo_reactor');
if (reactor) {
  reactor.X_A_init = 300;
  reactor.X_H_init = 1500;
  reactor.X_TS_init = 2500;
  reactor.S_HCO_init = 8;
  console.log('Phase C: Updated reactor biomass init values');
} else {
  console.error('Phase C: ERROR — demo_reactor not found!');
}

// ============================================================
// PHASE D: Return Activated Sludge
// ============================================================

// D1: RAS pump
flow.push({
  id: 'demo_pump_ras',
  type: 'rotatingMachine',
  z: 'demo_tab_treatment',
  name: 'RAS Pump',
  speed: '1',
  startup: '5',
  warmup: '3',
  shutdown: '4',
  cooldown: '2',
  movementMode: 'dynspeed',
  machineCurve: '',
  uuid: 'pump-ras-001',
  supplier: 'hidrostal',
  category: 'machine',
  assetType: 'pump-centrifugal',
  model: 'hidrostal-RAS',
  unit: 'm3/h',
  enableLog: true,
  logLevel: 'info',
  positionVsParent: 'downstream',
  positionIcon: '←',
  hasDistance: false,
  distance: 0,
  distanceUnit: 'm',
  distanceDescription: '',
  x: 1000,
  y: 380,
  wires: [
    ['demo_link_process_out_treatment'],
    ['demo_link_influx_out_treatment'],
    ['demo_settler']
  ],
  curveFlowUnit: 'l/s',
  curvePressureUnit: 'mbar',
  curvePowerUnit: 'kW'
});

// D2: RAS flow sensor
flow.push({
  id: 'demo_meas_ft_ras',
  type: 'measurement',
  z: 'demo_tab_treatment',
  name: 'FT-RAS (RAS Flow)',
  scaling: true,
  i_min: 20,
  i_max: 80,
  i_offset: 0,
  o_min: 20,
  o_max: 80,
  smooth_method: 'mean',
  count: 3,
  simulator: true,
  uuid: 'ft-ras-001',
  supplier: 'Endress+Hauser',
  category: 'sensor',
  assetType: 'flow',
  model: 'Promag-W400',
  unit: 'm3/h',
  assetTagNumber: 'FT-RAS',
  enableLog: false,
  logLevel: 'error',
  positionVsParent: 'atEquipment',
  positionIcon: '⊥',
  hasDistance: false,
  distance: 0,
  distanceUnit: 'm',
  distanceDescription: '',
  x: 1200,
  y: 380,
  wires: [
    ['demo_link_process_out_treatment'],
    ['demo_link_influx_out_treatment'],
    ['demo_pump_ras']
  ]
});

// D3: Inject to set pump mode
flow.push({
  id: 'demo_inj_ras_mode',
  type: 'inject',
  z: 'demo_tab_treatment',
  name: 'RAS → virtualControl',
  props: [
    { p: 'topic', vt: 'str' },
    { p: 'payload', vt: 'str' }
  ],
  topic: 'setMode',
  payload: 'virtualControl',
  payloadType: 'str',
  once: true,
  onceDelay: '3',
  x: 1000,
  y: 440,
  wires: [['demo_pump_ras']],
  repeatType: 'none',
  crontab: '',
  repeat: ''
});

// D3: Inject to set pump speed
flow.push({
  id: 'demo_inj_ras_speed',
  type: 'inject',
  z: 'demo_tab_treatment',
  name: 'RAS speed → 50%',
  props: [
    { p: 'topic', vt: 'str' },
    { p: 'payload', vt: 'json' }
  ],
  topic: 'execMovement',
  payload: '{"source":"auto","action":"setpoint","setpoint":50}',
  payloadType: 'json',
  once: true,
  onceDelay: '4',
  x: 1000,
  y: 480,
  wires: [['demo_pump_ras']],
  repeatType: 'none',
  crontab: '',
  repeat: ''
});

// D4: RAS filter function
flow.push({
  id: 'demo_fn_ras_filter',
  type: 'function',
  z: 'demo_tab_treatment',
  name: 'RAS Filter',
  func: `// Only pass RAS (inlet 2) from settler to reactor as inlet 3
if (msg.topic === 'Fluent' && msg.payload && msg.payload.inlet === 2) {
  msg.payload.inlet = 3; // reactor inlet 3 = RAS
  return msg;
}
return null;`,
  outputs: 1,
  x: 1000,
  y: 320,
  wires: [['demo_reactor']]
});

// D5: Wire settler Port 0 → RAS filter
const settler = findNode('demo_settler');
if (settler) {
  if (!settler.wires[0].includes('demo_fn_ras_filter')) {
    settler.wires[0].push('demo_fn_ras_filter');
  }
  console.log('Phase D: Wired settler → RAS filter → reactor');
} else {
  console.error('Phase D: ERROR — demo_settler not found!');
}

// D5: Update reactor n_inlets: 3 → 4
if (reactor) {
  reactor.n_inlets = 4;
  console.log('Phase D: Updated reactor n_inlets to 4');
}

console.log('Phase D: Added RAS pump, flow sensor, 2 injects, filter function');

// ============================================================
// WRITE OUTPUT
// ============================================================

fs.writeFileSync(flowPath, JSON.stringify(flow, null, 2) + '\n', 'utf8');
console.log(`\nDone. Wrote ${flow.length} nodes to ${flowPath}`);
@@ -1,380 +0,0 @@
#!/usr/bin/env node
/**
 * Step 1: Tab Restructure + Per-tab link-outs
 * - Creates 4 new tabs (PS West, PS North, PS South, Treatment)
 * - Renames WWTP tab to "Telemetry / InfluxDB"
 * - Moves nodes to their new tabs
 * - Creates per-tab link-out nodes for influx + process
 * - Rewires nodes to use local link-outs
 * - Recalculates coordinates for clean layout
 */
const fs = require('fs');
const path = require('path');

const FLOW_PATH = path.join(__dirname, '..', 'docker', 'demo-flow.json');
const flow = JSON.parse(fs.readFileSync(FLOW_PATH, 'utf8'));

const byId = (id) => flow.find(n => n.id === id);

// =============================================
// 1a. Create 4 new tabs
// =============================================
flow.push(
  { id: "demo_tab_ps_west", type: "tab", label: "PS West", disabled: false, info: "Pumping Station West (Urban Catchment - 2 pumps, Level-based)" },
  { id: "demo_tab_ps_north", type: "tab", label: "PS North", disabled: false, info: "Pumping Station North (Industrial - 1 pump, Flow-based)" },
  { id: "demo_tab_ps_south", type: "tab", label: "PS South", disabled: false, info: "Pumping Station South (Residential - 1 pump, Manual)" },
  { id: "demo_tab_treatment", type: "tab", label: "Biological Treatment", disabled: false, info: "Merge point, Reactor, Settler, Effluent Measurements" }
);

// =============================================
// 1b. Rename existing WWTP tab
// =============================================
const wwtpTab = byId("demo_tab_wwtp");
wwtpTab.label = "Telemetry / InfluxDB";
wwtpTab.info = "InfluxDB write chain, process debug, Grafana dashboard API, shared infrastructure";

// =============================================
// 1c. Move nodes to new tabs
// =============================================
const moveMap = {
  // PS West tab
  "demo_comment_ps": "demo_tab_ps_west",
  "demo_ps_west": "demo_tab_ps_west",
  "demo_pump_w1": "demo_tab_ps_west",
  "demo_pump_w2": "demo_tab_ps_west",
  "demo_mgc_west": "demo_tab_ps_west",
  "demo_inj_west_mode": "demo_tab_ps_west",
  "demo_inj_west_flow": "demo_tab_ps_west",
  "demo_fn_west_flow_sim": "demo_tab_ps_west",
  "demo_inj_w1_mode": "demo_tab_ps_west",
  "demo_inj_w2_mode": "demo_tab_ps_west",
  "demo_inj_calib_west": "demo_tab_ps_west",
  "demo_fn_level_to_pressure_w": "demo_tab_ps_west",
  "demo_meas_pt_w_up": "demo_tab_ps_west",
  "demo_meas_pt_w_down": "demo_tab_ps_west",
  "demo_mon_west": "demo_tab_ps_west",
  "demo_link_ps_west_dash": "demo_tab_ps_west",

  // PS North tab
  "demo_comment_ps_north": "demo_tab_ps_north",
  "demo_ps_north": "demo_tab_ps_north",
  "demo_pump_n1": "demo_tab_ps_north",
  "demo_inj_north_mode": "demo_tab_ps_north",
  "demo_inj_north_flow": "demo_tab_ps_north",
  "demo_fn_north_flow_sim": "demo_tab_ps_north",
  "demo_inj_n1_mode": "demo_tab_ps_north",
  "demo_inj_calib_north": "demo_tab_ps_north",
  "demo_comment_north_outflow": "demo_tab_ps_north",
  "demo_meas_ft_n1": "demo_tab_ps_north",
  "demo_fn_level_to_pressure_n": "demo_tab_ps_north",
  "demo_meas_pt_n_up": "demo_tab_ps_north",
  "demo_meas_pt_n_down": "demo_tab_ps_north",
  "demo_mon_north": "demo_tab_ps_north",
  "demo_link_ps_north_dash": "demo_tab_ps_north",

  // PS South tab
  "demo_comment_ps_south": "demo_tab_ps_south",
  "demo_ps_south": "demo_tab_ps_south",
  "demo_pump_s1": "demo_tab_ps_south",
  "demo_inj_south_mode": "demo_tab_ps_south",
  "demo_inj_south_flow": "demo_tab_ps_south",
  "demo_fn_south_flow_sim": "demo_tab_ps_south",
  "demo_inj_s1_mode": "demo_tab_ps_south",
  "demo_inj_calib_south": "demo_tab_ps_south",
  "demo_fn_level_to_pressure_s": "demo_tab_ps_south",
  "demo_meas_pt_s_up": "demo_tab_ps_south",
  "demo_meas_pt_s_down": "demo_tab_ps_south",
  "demo_mon_south": "demo_tab_ps_south",
  "demo_link_ps_south_dash": "demo_tab_ps_south",

  // Treatment tab
  "demo_comment_treatment": "demo_tab_treatment",
  "demo_meas_flow": "demo_tab_treatment",
  "demo_meas_do": "demo_tab_treatment",
  "demo_meas_nh4": "demo_tab_treatment",
  "demo_reactor": "demo_tab_treatment",
  "demo_inj_reactor_tick": "demo_tab_treatment",
  "demo_settler": "demo_tab_treatment",
  "demo_monster": "demo_tab_treatment",
  "demo_inj_monster_flow": "demo_tab_treatment",
  "demo_fn_monster_flow": "demo_tab_treatment",
  "demo_comment_effluent_meas": "demo_tab_treatment",
  "demo_meas_eff_flow": "demo_tab_treatment",
  "demo_meas_eff_do": "demo_tab_treatment",
  "demo_meas_eff_nh4": "demo_tab_treatment",
  "demo_meas_eff_no3": "demo_tab_treatment",
  "demo_meas_eff_tss": "demo_tab_treatment",
  "demo_comment_pressure": "demo_tab_treatment",
  "demo_link_reactor_dash": "demo_tab_treatment",
  "demo_link_meas_dash": "demo_tab_treatment",
  "demo_link_eff_meas_dash": "demo_tab_treatment"
};

for (const [nodeId, tabId] of Object.entries(moveMap)) {
  const node = byId(nodeId);
  if (node) {
    node.z = tabId;
  } else {
    console.warn(`WARNING: Node ${nodeId} not found for move`);
  }
}

// =============================================
// 1c-coords. Recalculate coordinates per tab
// =============================================

// PS West layout (2 pumps + MGC)
const psWestCoords = {
  "demo_comment_ps": { x: 340, y: 40 },
  "demo_inj_calib_west": { x: 120, y: 80 },
  "demo_inj_w1_mode": { x: 120, y: 120 },
  "demo_inj_west_mode": { x: 120, y: 200 },
  "demo_inj_west_flow": { x: 120, y: 240 },
  "demo_inj_w2_mode": { x: 120, y: 320 },
  "demo_fn_west_flow_sim": { x: 360, y: 240 },
  "demo_pump_w1": { x: 600, y: 120 },
  "demo_pump_w2": { x: 600, y: 320 },
  "demo_mgc_west": { x: 600, y: 220 },
  "demo_ps_west": { x: 860, y: 220 },
  "demo_fn_level_to_pressure_w": { x: 360, y: 420 },
  "demo_meas_pt_w_up": { x: 560, y: 420 },
  "demo_meas_pt_w_down": { x: 560, y: 480 },
  "demo_mon_west": { x: 1080, y: 160 },
  "demo_link_ps_west_dash": { x: 1080, y: 220 },
};

// PS North layout (1 pump, no MGC)
const psNorthCoords = {
  "demo_comment_ps_north": { x: 340, y: 40 },
  "demo_inj_calib_north": { x: 120, y: 80 },
  "demo_inj_n1_mode": { x: 120, y: 120 },
  "demo_inj_north_mode": { x: 120, y: 200 },
  "demo_inj_north_flow": { x: 120, y: 240 },
  "demo_fn_north_flow_sim": { x: 360, y: 240 },
  "demo_pump_n1": { x: 600, y: 120 },
  "demo_ps_north": { x: 860, y: 200 },
  "demo_comment_north_outflow": { x: 200, y: 320 },
  "demo_meas_ft_n1": { x: 560, y: 340 },
  "demo_fn_level_to_pressure_n": { x: 360, y: 420 },
  "demo_meas_pt_n_up": { x: 560, y: 420 },
  "demo_meas_pt_n_down": { x: 560, y: 480 },
  "demo_mon_north": { x: 1080, y: 140 },
  "demo_link_ps_north_dash": { x: 1080, y: 200 },
};

// PS South layout (1 pump, no MGC)
const psSouthCoords = {
  "demo_comment_ps_south": { x: 340, y: 40 },
  "demo_inj_calib_south": { x: 120, y: 80 },
  "demo_inj_s1_mode": { x: 120, y: 120 },
  "demo_inj_south_mode": { x: 120, y: 200 },
  "demo_inj_south_flow": { x: 120, y: 240 },
  "demo_fn_south_flow_sim": { x: 360, y: 240 },
  "demo_pump_s1": { x: 600, y: 120 },
  "demo_ps_south": { x: 860, y: 200 },
  "demo_fn_level_to_pressure_s": { x: 360, y: 380 },
  "demo_meas_pt_s_up": { x: 560, y: 380 },
  "demo_meas_pt_s_down": { x: 560, y: 440 },
  "demo_mon_south": { x: 1080, y: 140 },
  "demo_link_ps_south_dash": { x: 1080, y: 200 },
};

// Treatment layout
const treatmentCoords = {
  "demo_comment_treatment": { x: 200, y: 40 },
  "demo_meas_flow": { x: 400, y: 120 },
  "demo_meas_do": { x: 400, y: 180 },
  "demo_meas_nh4": { x: 400, y: 240 },
  "demo_inj_reactor_tick": { x: 600, y: 80 },
  "demo_reactor": { x: 800, y: 180 },
  "demo_settler": { x: 800, y: 320 },
  "demo_monster": { x: 800, y: 420 },
  "demo_inj_monster_flow": { x: 560, y: 420 },
  "demo_fn_monster_flow": { x: 660, y: 460 },
  "demo_comment_effluent_meas": { x: 200, y: 520 },
  "demo_meas_eff_flow": { x: 400, y: 560 },
  "demo_meas_eff_do": { x: 400, y: 620 },
  "demo_meas_eff_nh4": { x: 400, y: 680 },
  "demo_meas_eff_no3": { x: 400, y: 740 },
  "demo_meas_eff_tss": { x: 400, y: 800 },
  "demo_comment_pressure": { x: 200, y: 860 },
  "demo_link_reactor_dash": { x: 1020, y: 180 },
  "demo_link_meas_dash": { x: 620, y: 180 },
  "demo_link_eff_meas_dash": { x: 620, y: 620 },
};

// Apply coordinates
for (const [nodeId, coords] of Object.entries({ ...psWestCoords, ...psNorthCoords, ...psSouthCoords, ...treatmentCoords })) {
  const node = byId(nodeId);
  if (node) {
    node.x = coords.x;
    node.y = coords.y;
  }
}

// =============================================
// 1d. Create per-tab link-out nodes
// =============================================

// Determine which tab each moved node belongs to
const tabForNode = {};
for (const n of flow) {
  if (n.z) tabForNode[n.id] = n.z;
}

// Map from tab → influx link-out ID
const influxLinkOutMap = {
  "demo_tab_ps_west": "demo_link_influx_out_west",
  "demo_tab_ps_north": "demo_link_influx_out_north",
  "demo_tab_ps_south": "demo_link_influx_out_south",
  "demo_tab_treatment": "demo_link_influx_out_treatment",
};

// Map from tab → process link-out ID
const processLinkOutMap = {
  "demo_tab_ps_west": "demo_link_process_out_west",
  "demo_tab_ps_north": "demo_link_process_out_north",
  "demo_tab_ps_south": "demo_link_process_out_south",
  "demo_tab_treatment": "demo_link_process_out_treatment",
};

// Link-out node positions per tab
const linkOutPositions = {
  "demo_tab_ps_west": { influx: { x: 1080, y: 280 }, process: { x: 1080, y: 320 } },
  "demo_tab_ps_north": { influx: { x: 1080, y: 260 }, process: { x: 1080, y: 300 } },
  "demo_tab_ps_south": { influx: { x: 1080, y: 260 }, process: { x: 1080, y: 300 } },
  "demo_tab_treatment": { influx: { x: 1020, y: 280 }, process: { x: 1020, y: 320 } },
};

// Create influx link-out nodes
for (const [tabId, nodeId] of Object.entries(influxLinkOutMap)) {
  const pos = linkOutPositions[tabId].influx;
  flow.push({
    id: nodeId,
    type: "link out",
    z: tabId,
    name: "→ InfluxDB",
    mode: "link",
    links: ["demo_link_influx_in"],
    x: pos.x,
    y: pos.y
  });
}

// Create process link-out nodes
for (const [tabId, nodeId] of Object.entries(processLinkOutMap)) {
  const pos = linkOutPositions[tabId].process;
  flow.push({
    id: nodeId,
    type: "link out",
    z: tabId,
    name: "→ Process debug",
    mode: "link",
    links: ["demo_link_process_in"],
    x: pos.x,
    y: pos.y
  });
}

// =============================================
// 1d-rewire. Rewire nodes to use local link-outs
// =============================================

// For every node that references "demo_link_influx_out" or "demo_link_process_out"
// in its wires, replace with the per-tab version
for (const node of flow) {
  if (!node.wires || !node.z) continue;
  const tab = node.z;
  const localInflux = influxLinkOutMap[tab];
  const localProcess = processLinkOutMap[tab];

  for (let portIdx = 0; portIdx < node.wires.length; portIdx++) {
    for (let wireIdx = 0; wireIdx < node.wires[portIdx].length; wireIdx++) {
      if (node.wires[portIdx][wireIdx] === "demo_link_influx_out" && localInflux) {
        node.wires[portIdx][wireIdx] = localInflux;
      }
      if (node.wires[portIdx][wireIdx] === "demo_link_process_out" && localProcess) {
        node.wires[portIdx][wireIdx] = localProcess;
      }
    }
  }
}

// Update the link-in nodes to reference all new link-out IDs
const influxIn = byId("demo_link_influx_in");
influxIn.links = Object.values(influxLinkOutMap);
// Also keep the old one if any nodes on the telemetry tab still reference it
// (the dashapi, telemetry nodes that stayed on demo_tab_wwtp)
influxIn.links.push("demo_link_influx_out");

const processIn = byId("demo_link_process_in");
processIn.links = Object.values(processLinkOutMap);
processIn.links.push("demo_link_process_out");

// Keep old link-out nodes on telemetry tab (they may still be needed
// by nodes that remain there, like dashapi)
// Update their links arrays too
const oldInfluxOut = byId("demo_link_influx_out");
if (oldInfluxOut) {
  oldInfluxOut.links = ["demo_link_influx_in"];
  // Move to bottom of telemetry tab
  oldInfluxOut.x = 1135;
  oldInfluxOut.y = 500;
}

const oldProcessOut = byId("demo_link_process_out");
if (oldProcessOut) {
  oldProcessOut.links = ["demo_link_process_in"];
  oldProcessOut.x = 1135;
  oldProcessOut.y = 540;
}

// =============================================
// Validate
// =============================================
const tabCounts = {};
for (const n of flow) {
  if (n.z) {
    tabCounts[n.z] = (tabCounts[n.z] || 0) + 1;
  }
}
console.log('Nodes per tab:', JSON.stringify(tabCounts, null, 2));
console.log('Total nodes:', flow.length);

// Check for broken wire references
const allIds = new Set(flow.map(n => n.id));
let brokenWires = 0;
for (const n of flow) {
  if (!n.wires) continue;
  for (const port of n.wires) {
    for (const target of port) {
      if (!allIds.has(target)) {
        console.warn(`BROKEN WIRE: ${n.id} → ${target}`);
        brokenWires++;
      }
    }
  }
}
if (brokenWires === 0) console.log('All wire references valid ✓');

// Check link-in/link-out pairing
for (const n of flow) {
  if (n.type === 'link out' && n.links) {
    for (const linkTarget of n.links) {
      if (!allIds.has(linkTarget)) {
        console.warn(`BROKEN LINK: ${n.id} links to missing ${linkTarget}`);
      }
    }
  }
  if (n.type === 'link in' && n.links) {
    for (const linkSource of n.links) {
      if (!allIds.has(linkSource)) {
        console.warn(`BROKEN LINK: ${n.id} expects link from missing ${linkSource}`);
      }
    }
  }
}

// Write
fs.writeFileSync(FLOW_PATH, JSON.stringify(flow, null, 2) + '\n');
console.log(`\nWrote ${FLOW_PATH} (${flow.length} nodes)`);
@@ -1,219 +0,0 @@
#!/usr/bin/env node
/**
 * Step 2: Merge Collection Point
 * - Adds link-out from each PS tab to merge on treatment tab
 * - Creates link-in, tag, collect, and dashboard link-out nodes on treatment
 * - Wires PS outputs through merge to feed reactor
 */
const fs = require('fs');
const path = require('path');

const FLOW_PATH = path.join(__dirname, '..', 'docker', 'demo-flow.json');
const flow = JSON.parse(fs.readFileSync(FLOW_PATH, 'utf8'));

const byId = (id) => flow.find(n => n.id === id);

// =============================================
// 2a. Link-out nodes on each PS tab
// =============================================
flow.push(
  {
    id: "demo_link_merge_west_out",
    type: "link out",
    z: "demo_tab_ps_west",
    name: "→ Merge (West)",
    mode: "link",
    links: ["demo_link_merge_west_in"],
    x: 1080, y: 360
  },
  {
    id: "demo_link_merge_north_out",
    type: "link out",
    z: "demo_tab_ps_north",
    name: "→ Merge (North)",
    mode: "link",
    links: ["demo_link_merge_north_in"],
    x: 1080, y: 340
  },
  {
    id: "demo_link_merge_south_out",
    type: "link out",
    z: "demo_tab_ps_south",
    name: "→ Merge (South)",
    mode: "link",
    links: ["demo_link_merge_south_in"],
    x: 1080, y: 340
  }
);

// Add merge link-outs to each PS node's wires[0]
const psWest = byId("demo_ps_west");
psWest.wires[0].push("demo_link_merge_west_out");

const psNorth = byId("demo_ps_north");
psNorth.wires[0].push("demo_link_merge_north_out");

const psSouth = byId("demo_ps_south");
psSouth.wires[0].push("demo_link_merge_south_out");

// =============================================
// 2b. Merge nodes on Treatment tab
// =============================================

// Link-in nodes
flow.push(
  {
    id: "demo_link_merge_west_in",
    type: "link in",
    z: "demo_tab_treatment",
    name: "← PS West",
    links: ["demo_link_merge_west_out"],
    x: 100, y: 920,
    wires: [["demo_fn_tag_west"]]
  },
  {
    id: "demo_link_merge_north_in",
    type: "link in",
    z: "demo_tab_treatment",
    name: "← PS North",
    links: ["demo_link_merge_north_out"],
    x: 100, y: 980,
    wires: [["demo_fn_tag_north"]]
  },
  {
    id: "demo_link_merge_south_in",
    type: "link in",
    z: "demo_tab_treatment",
    name: "← PS South",
    links: ["demo_link_merge_south_out"],
    x: 100, y: 1040,
    wires: [["demo_fn_tag_south"]]
  }
);

// Tag functions
flow.push(
  {
    id: "demo_fn_tag_west",
    type: "function",
    z: "demo_tab_treatment",
    name: "Tag: west",
    func: "msg._psSource = 'west';\nreturn msg;",
    outputs: 1,
    x: 280, y: 920,
    wires: [["demo_fn_merge_collect"]]
  },
  {
    id: "demo_fn_tag_north",
    type: "function",
    z: "demo_tab_treatment",
    name: "Tag: north",
    func: "msg._psSource = 'north';\nreturn msg;",
    outputs: 1,
    x: 280, y: 980,
    wires: [["demo_fn_merge_collect"]]
  },
  {
    id: "demo_fn_tag_south",
    type: "function",
    z: "demo_tab_treatment",
    name: "Tag: south",
    func: "msg._psSource = 'south';\nreturn msg;",
    outputs: 1,
    x: 280, y: 1040,
    wires: [["demo_fn_merge_collect"]]
  }
);

// Merge collect function
flow.push({
  id: "demo_fn_merge_collect",
  type: "function",
  z: "demo_tab_treatment",
  name: "Merge Collector",
  func: `// Cache each PS output by _psSource tag, compute totals
const p = msg.payload || {};
const ps = msg._psSource;
const cache = flow.get('merge_cache') || { west: {}, north: {}, south: {} };
const keys = Object.keys(p);
const pick = (prefix) => { const k = keys.find(k => k.startsWith(prefix)); return k ? Number(p[k]) : null; };

if (ps && cache[ps]) {
  const nf = pick('netFlowRate.predicted'); if (nf !== null) cache[ps].netFlow = nf;
  const fp = pick('volumePercent.predicted'); if (fp !== null) cache[ps].fillPct = fp;
  cache[ps].direction = p.direction || cache[ps].direction;
  cache[ps].ts = Date.now();
}
flow.set('merge_cache', cache);

const totalFlow = (cache.west.netFlow||0) + (cache.north.netFlow||0) + (cache.south.netFlow||0);
const avgFill = ((cache.west.fillPct||0) + (cache.north.fillPct||0) + (cache.south.fillPct||0)) / 3;

return {
  topic: 'merge_combined_influent',
  payload: { totalInfluentFlow: +totalFlow.toFixed(1), avgFillPercent: +avgFill.toFixed(1),
    west: cache.west, north: cache.north, south: cache.south }
};`,
  outputs: 1,
  x: 480, y: 980,
  wires: [["demo_link_merge_dash"]]
});

// Dashboard link-out for merge data
flow.push({
  id: "demo_link_merge_dash",
  type: "link out",
  z: "demo_tab_treatment",
  name: "→ Merge Dashboard",
  mode: "link",
  links: ["demo_link_merge_dash_in"],
  x: 680, y: 980
});

// Create a comment for the merge section
flow.push({
  id: "demo_comment_merge",
  type: "comment",
  z: "demo_tab_treatment",
  name: "=== MERGE COLLECTION POINT ===",
  info: "Combines output from all 3 pumping stations",
  x: 200, y: 880
});

// =============================================
// Validate
// =============================================
const allIds = new Set(flow.map(n => n.id));
let brokenWires = 0;
for (const n of flow) {
  if (!n.wires) continue;
  for (const port of n.wires) {
    for (const target of port) {
      if (!allIds.has(target)) {
        console.warn(`BROKEN WIRE: ${n.id} → ${target}`);
        brokenWires++;
      }
    }
  }
}

for (const n of flow) {
  if (n.type === 'link out' && n.links) {
    for (const lt of n.links) {
      if (!allIds.has(lt)) console.warn(`BROKEN LINK: ${n.id} links to missing ${lt}`);
    }
  }
  if (n.type === 'link in' && n.links) {
    for (const ls of n.links) {
      if (!allIds.has(ls)) console.warn(`BROKEN LINK: ${n.id} expects link from missing ${ls}`);
    }
  }
}

if (brokenWires === 0) console.log('All wire references valid ✓');

console.log('Total nodes:', flow.length);

// Write
fs.writeFileSync(FLOW_PATH, JSON.stringify(flow, null, 2) + '\n');
console.log(`Wrote ${FLOW_PATH}`);
@@ -1,583 +0,0 @@
#!/usr/bin/env node
/**
 * Step 3: Overview Dashboard Page + KPI Gauges
 * - Creates overview page with chain visualization
 * - Adds KPI gauges (Total Flow, DO, TSS, NH4)
 * - Link-in nodes to feed overview from merge + reactor + effluent data
 * - Reorders all page navigation
 */
const fs = require('fs');
const path = require('path');

const FLOW_PATH = path.join(__dirname, '..', 'docker', 'demo-flow.json');
const flow = JSON.parse(fs.readFileSync(FLOW_PATH, 'utf8'));

const byId = (id) => flow.find(n => n.id === id);

// =============================================
// 3a. New config nodes
// =============================================

// Overview page
flow.push({
  id: "demo_ui_page_overview",
  type: "ui-page",
  name: "Plant Overview",
  ui: "demo_ui_base",
  path: "/overview",
  icon: "dashboard",
  layout: "grid",
  theme: "demo_ui_theme",
  breakpoints: [{ name: "Default", px: "0", cols: "12" }],
  order: 0,
  className: ""
});

// Overview groups
flow.push(
  {
    id: "demo_ui_grp_overview_chain",
    type: "ui-group",
    name: "Process Chain",
    page: "demo_ui_page_overview",
    width: "12",
    height: "1",
    order: 1,
    showTitle: true,
    className: ""
  },
  {
    id: "demo_ui_grp_overview_kpi",
    type: "ui-group",
    name: "Key Indicators",
    page: "demo_ui_page_overview",
    width: "12",
    height: "1",
    order: 2,
    showTitle: true,
    className: ""
  }
);

// =============================================
// 3b. Chain visualization - link-in nodes on dashboard tab
// =============================================

// Link-in for merge data (this is what step 2's demo_link_merge_dash links to)
flow.push({
  id: "demo_link_merge_dash_in",
  type: "link in",
  z: "demo_tab_dashboard",
  name: "← Merge Data",
  links: ["demo_link_merge_dash"],
  x: 75, y: 960,
  wires: [["demo_fn_overview_parse"]]
});

// We also need reactor and effluent data for the overview.
// Create link-out nodes on treatment tab for overview data
flow.push(
  {
    id: "demo_link_overview_reactor_out",
    type: "link out",
    z: "demo_tab_treatment",
    name: "→ Overview (Reactor)",
    mode: "link",
    links: ["demo_link_overview_reactor_in"],
    x: 1020, y: 220
  },
  {
    id: "demo_link_overview_reactor_in",
    type: "link in",
    z: "demo_tab_dashboard",
    name: "← Reactor (Overview)",
    links: ["demo_link_overview_reactor_out"],
    x: 75, y: 1020,
    wires: [["demo_fn_overview_reactor_parse"]]
  }
);

// Add overview reactor link-out to reactor's wires[0]
const reactor = byId("demo_reactor");
reactor.wires[0].push("demo_link_overview_reactor_out");

// Effluent measurements link for overview KPIs
flow.push(
  {
    id: "demo_link_overview_eff_out",
    type: "link out",
    z: "demo_tab_treatment",
    name: "→ Overview (Effluent)",
    mode: "link",
    links: ["demo_link_overview_eff_in"],
    x: 620, y: 660
  },
  {
    id: "demo_link_overview_eff_in",
    type: "link in",
    z: "demo_tab_dashboard",
    name: "← Effluent (Overview)",
    links: ["demo_link_overview_eff_out"],
    x: 75, y: 1080,
    wires: [["demo_fn_overview_eff_parse"]]
  }
);

// Add overview eff link-out to effluent measurement nodes wires[0]
// TSS and NH4 are the key effluent quality indicators
const effTss = byId("demo_meas_eff_tss");
effTss.wires[0].push("demo_link_overview_eff_out");
const effNh4 = byId("demo_meas_eff_nh4");
effNh4.wires[0].push("demo_link_overview_eff_out");

// =============================================
// 3b. Parse functions for overview
// =============================================

// Parse merge data for chain visualization + total flow gauge
flow.push({
  id: "demo_fn_overview_parse",
  type: "function",
  z: "demo_tab_dashboard",
  name: "Parse Overview (Merge)",
  func: `const p = msg.payload || {};
const now = Date.now();

// Store in flow context for the template
flow.set('overview_merge', p);

// Output 1: chain vis data, Output 2: total flow gauge
return [
  { topic: 'overview_chain', payload: p },
  p.totalInfluentFlow !== undefined ? { topic: 'Total Influent Flow', payload: p.totalInfluentFlow } : null
];`,
  outputs: 2,
  x: 280, y: 960,
  wires: [
    ["demo_overview_template"],
    ["demo_gauge_overview_flow"]
  ]
});

// Parse reactor data for overview
flow.push({
  id: "demo_fn_overview_reactor_parse",
  type: "function",
  z: "demo_tab_dashboard",
  name: "Parse Overview (Reactor)",
  func: `const p = msg.payload || {};
if (!p.C || !Array.isArray(p.C)) return null;

flow.set('overview_reactor', p);

// Output: DO gauge value
return { topic: 'Reactor DO', payload: Math.round(p.C[0]*100)/100 };`,
  outputs: 1,
  x: 280, y: 1020,
  wires: [["demo_gauge_overview_do"]]
});

// Parse effluent data for overview KPIs
flow.push({
  id: "demo_fn_overview_eff_parse",
  type: "function",
  z: "demo_tab_dashboard",
  name: "Parse Overview (Effluent)",
  func: `const p = msg.payload || {};
const topic = msg.topic || '';
const val = Number(p.mAbs);
if (!Number.isFinite(val)) return null;

// Route to appropriate gauge based on measurement type
if (topic.includes('TSS') || topic.includes('tss')) {
  return [{ topic: 'Effluent TSS', payload: Math.round(val*100)/100 }, null];
}
if (topic.includes('NH4') || topic.includes('ammonium')) {
  return [null, { topic: 'Effluent NH4', payload: Math.round(val*100)/100 }];
}
return [null, null];`,
  outputs: 2,
  x: 280, y: 1080,
  wires: [
    ["demo_gauge_overview_tss"],
    ["demo_gauge_overview_nh4"]
  ]
});

// =============================================
// 3b. Chain visualization template
// =============================================
flow.push({
  id: "demo_overview_template",
  type: "ui-template",
  z: "demo_tab_dashboard",
  group: "demo_ui_grp_overview_chain",
  name: "Process Chain Diagram",
  order: 1,
  width: "12",
  height: "6",
  head: "",
  format: `<template>
  <div class="chain-container">
    <svg viewBox="0 0 900 280" class="chain-svg">
      <!-- PS West -->
      <g @click="navigateTo('/ps-west')" class="chain-block clickable">
        <rect x="20" y="20" width="160" height="80" rx="8" :fill="blockColor(merge?.west)"/>
        <text x="100" y="50" class="block-title">PS West</text>
        <text x="100" y="70" class="block-value">{{ formatPct(merge?.west?.fillPct) }}</text>
        <text x="100" y="86" class="block-sub">{{ formatDir(merge?.west?.direction) }}</text>
      </g>

      <!-- PS North -->
      <g @click="navigateTo('/ps-north')" class="chain-block clickable">
        <rect x="20" y="120" width="160" height="80" rx="8" :fill="blockColor(merge?.north)"/>
        <text x="100" y="150" class="block-title">PS North</text>
        <text x="100" y="170" class="block-value">{{ formatPct(merge?.north?.fillPct) }}</text>
        <text x="100" y="186" class="block-sub">{{ formatDir(merge?.north?.direction) }}</text>
      </g>

      <!-- PS South -->
      <g @click="navigateTo('/ps-south')" class="chain-block clickable">
        <rect x="20" y="220" width="160" height="80" rx="8" :fill="blockColor(merge?.south)"/>
        <text x="100" y="250" class="block-title">PS South</text>
        <text x="100" y="270" class="block-value">{{ formatPct(merge?.south?.fillPct) }}</text>
        <text x="100" y="286" class="block-sub">{{ formatDir(merge?.south?.direction) }}</text>
      </g>

      <!-- Merge arrows -->
      <line x1="180" y1="60" x2="260" y2="160" class="chain-arrow"/>
      <line x1="180" y1="160" x2="260" y2="160" class="chain-arrow"/>
      <line x1="180" y1="260" x2="260" y2="160" class="chain-arrow"/>

      <!-- Merge point -->
      <g class="chain-block">
        <rect x="260" y="120" width="120" height="80" rx="8" fill="#0f3460"/>
        <text x="320" y="150" class="block-title">Merge</text>
        <text x="320" y="170" class="block-value">{{ formatFlow(merge?.totalInfluentFlow) }}</text>
        <text x="320" y="186" class="block-sub">m\\u00b3/h total</text>
      </g>

      <!-- Arrow merge → reactor -->
      <line x1="380" y1="160" x2="420" y2="160" class="chain-arrow"/>

      <!-- Reactor -->
      <g @click="navigateTo('/treatment')" class="chain-block clickable">
        <rect x="420" y="120" width="140" height="80" rx="8" :fill="reactorColor"/>
        <text x="490" y="150" class="block-title">Reactor</text>
        <text x="490" y="170" class="block-value">DO: {{ reactorDO }}</text>
        <text x="490" y="186" class="block-sub">mg/L</text>
      </g>

      <!-- Arrow reactor → settler -->
      <line x1="560" y1="160" x2="600" y2="160" class="chain-arrow"/>

      <!-- Settler -->
      <g @click="navigateTo('/treatment')" class="chain-block clickable">
        <rect x="600" y="120" width="120" height="80" rx="8" fill="#0f3460"/>
        <text x="660" y="150" class="block-title">Settler</text>
        <text x="660" y="170" class="block-value">TSS: {{ effTSS }}</text>
        <text x="660" y="186" class="block-sub">mg/L</text>
      </g>

      <!-- Arrow settler → effluent -->
      <line x1="720" y1="160" x2="760" y2="160" class="chain-arrow"/>

      <!-- Effluent -->
      <g class="chain-block">
        <rect x="760" y="120" width="120" height="80" rx="8" :fill="effluentColor"/>
        <text x="820" y="150" class="block-title">Effluent</text>
        <text x="820" y="170" class="block-value">NH4: {{ effNH4 }}</text>
        <text x="820" y="186" class="block-sub">mg/L</text>
      </g>
    </svg>
  </div>
</template>

<script>
export default {
  data() {
    return {
      merge: null,
      reactorDO: '--',
      effTSS: '--',
      effNH4: '--'
    }
  },
  computed: {
    reactorColor() {
      const d = parseFloat(this.reactorDO);
      if (isNaN(d)) return '#0f3460';
      if (d < 1) return '#f44336';
      if (d < 2) return '#ff9800';
      return '#1b5e20';
    },
    effluentColor() {
      const n = parseFloat(this.effNH4);
      if (isNaN(n)) return '#0f3460';
      if (n > 10) return '#f44336';
      if (n > 5) return '#ff9800';
      return '#1b5e20';
    }
  },
  watch: {
    msg(val) {
      if (!val) return;
      const t = val.topic || '';
      if (t === 'overview_chain') {
        this.merge = val.payload;
      } else if (t === 'Reactor DO') {
        this.reactorDO = val.payload?.toFixed(1) || '--';
      } else if (t === 'Effluent TSS') {
        this.effTSS = val.payload?.toFixed(1) || '--';
      } else if (t === 'Effluent NH4') {
        this.effNH4 = val.payload?.toFixed(1) || '--';
      }
    }
  },
  methods: {
    navigateTo(path) {
      this.$router.push('/dashboard' + path);
    },
    blockColor(ps) {
      if (!ps || ps.fillPct === undefined) return '#0f3460';
      if (ps.fillPct > 90) return '#f44336';
      if (ps.fillPct > 75) return '#ff9800';
      if (ps.fillPct < 10) return '#f44336';
      return '#0f3460';
    },
    formatPct(v) { return v !== undefined && v !== null ? v.toFixed(0) + '%' : '--'; },
    formatFlow(v) { return v !== undefined && v !== null ? v.toFixed(0) : '--'; },
    formatDir(d) { return d === 'filling' ? '\\u2191 filling' : d === 'emptying' ? '\\u2193 emptying' : '--'; }
  }
}
</script>

<style>
.chain-container { width: 100%; overflow-x: auto; }
.chain-svg { width: 100%; height: auto; min-height: 200px; }
.chain-block text { text-anchor: middle; fill: #e0e0e0; }
.block-title { font-size: 14px; font-weight: bold; }
.block-value { font-size: 13px; fill: #4fc3f7; }
.block-sub { font-size: 10px; fill: #90a4ae; }
.chain-arrow { stroke: #4fc3f7; stroke-width: 2; marker-end: url(#arrowhead); }
.clickable { cursor: pointer; }
.clickable:hover rect { opacity: 0.8; }
</style>`,
  templateScope: "local",
  className: "",
  x: 510, y: 960,
  wires: [[]]
});

// =============================================
// 3c. KPI gauges on overview
// =============================================

// Total Influent Flow gauge
flow.push({
  id: "demo_gauge_overview_flow",
  type: "ui-gauge",
  z: "demo_tab_dashboard",
  group: "demo_ui_grp_overview_kpi",
  name: "Total Influent Flow",
  gtype: "gauge-34",
  gstyle: "Rounded",
  title: "Influent Flow",
  units: "m\u00b3/h",
  prefix: "",
  suffix: "m\u00b3/h",
  min: 0,
  max: 500,
  segments: [
    { color: "#2196f3", from: 0 },
    { color: "#4caf50", from: 50 },
    { color: "#ff9800", from: 350 },
    { color: "#f44336", from: 450 }
  ],
  width: 3,
  height: 4,
  order: 1,
  className: "",
  x: 510, y: 1020,
  wires: []
});

// Reactor DO gauge
flow.push({
  id: "demo_gauge_overview_do",
  type: "ui-gauge",
  z: "demo_tab_dashboard",
  group: "demo_ui_grp_overview_kpi",
  name: "Reactor DO",
  gtype: "gauge-34",
  gstyle: "Rounded",
  title: "Reactor DO",
  units: "mg/L",
  prefix: "",
  suffix: "mg/L",
  min: 0,
  max: 10,
  segments: [
    { color: "#f44336", from: 0 },
    { color: "#ff9800", from: 1 },
    { color: "#4caf50", from: 2 },
    { color: "#ff9800", from: 6 },
    { color: "#f44336", from: 8 }
  ],
  width: 3,
  height: 4,
  order: 2,
  className: "",
  x: 510, y: 1060,
  wires: []
});

// Effluent TSS gauge
flow.push({
  id: "demo_gauge_overview_tss",
  type: "ui-gauge",
  z: "demo_tab_dashboard",
  group: "demo_ui_grp_overview_kpi",
  name: "Effluent TSS",
  gtype: "gauge-34",
  gstyle: "Rounded",
  title: "Effluent TSS",
  units: "mg/L",
  prefix: "",
  suffix: "mg/L",
  min: 0,
  max: 50,
  segments: [
    { color: "#4caf50", from: 0 },
    { color: "#ff9800", from: 25 },
    { color: "#f44336", from: 40 }
  ],
  width: 3,
  height: 4,
  order: 3,
  className: "",
  x: 510, y: 1100,
  wires: []
});

// Effluent NH4 gauge
flow.push({
  id: "demo_gauge_overview_nh4",
  type: "ui-gauge",
  z: "demo_tab_dashboard",
  group: "demo_ui_grp_overview_kpi",
  name: "Effluent NH4",
  gtype: "gauge-34",
  gstyle: "Rounded",
  title: "Effluent NH4",
  units: "mg/L",
  prefix: "",
  suffix: "mg/L",
  min: 0,
  max: 20,
  segments: [
    { color: "#4caf50", from: 0 },
    { color: "#ff9800", from: 5 },
    { color: "#f44336", from: 10 }
  ],
  width: 3,
  height: 4,
  order: 4,
  className: "",
  x: 510, y: 1140,
  wires: []
});

// =============================================
// 3d. Reorder all page navigation
// =============================================
const pageOrders = {
  "demo_ui_page_overview": 0,
  "demo_ui_page_influent": 1,
  "demo_ui_page_treatment": 5,
  "demo_ui_page_telemetry": 6,
};

for (const [pageId, order] of Object.entries(pageOrders)) {
  const page = byId(pageId);
  if (page) page.order = order;
}

// =============================================
// Feed chain vis and KPIs from merge + reactor + effluent
// We need to also wire the overview_template to receive reactor/eff data
// The parse functions already wire to the template and gauges separately
// But the template needs ALL data sources - let's connect reactor and eff parsers to it too
// =============================================

// Actually, the template needs multiple inputs. Let's connect reactor and eff parse outputs too.
// Modify overview reactor parse to also send to template
const reactorParse = byId("demo_fn_overview_reactor_parse");
// Currently wires to demo_gauge_overview_do. Add template as well.
reactorParse.func = `const p = msg.payload || {};
if (!p.C || !Array.isArray(p.C)) return null;

flow.set('overview_reactor', p);

// Output 1: DO gauge, Output 2: to chain template
const doVal = Math.round(p.C[0]*100)/100;
return [
  { topic: 'Reactor DO', payload: doVal },
  { topic: 'Reactor DO', payload: doVal }
];`;
reactorParse.outputs = 2;
reactorParse.wires = [["demo_gauge_overview_do"], ["demo_overview_template"]];

// Same for effluent parse - add template output
const effParse = byId("demo_fn_overview_eff_parse");
effParse.func = `const p = msg.payload || {};
const topic = msg.topic || '';
const val = Number(p.mAbs);
if (!Number.isFinite(val)) return null;

const rounded = Math.round(val*100)/100;

// Route to appropriate gauge + template based on measurement type
if (topic.includes('TSS') || topic.includes('tss')) {
  return [{ topic: 'Effluent TSS', payload: rounded }, null, { topic: 'Effluent TSS', payload: rounded }];
}
if (topic.includes('NH4') || topic.includes('ammonium')) {
  return [null, { topic: 'Effluent NH4', payload: rounded }, { topic: 'Effluent NH4', payload: rounded }];
}
return [null, null, null];`;
effParse.outputs = 3;
effParse.wires = [["demo_gauge_overview_tss"], ["demo_gauge_overview_nh4"], ["demo_overview_template"]];

// =============================================
// Validate
// =============================================
const allIds = new Set(flow.map(n => n.id));
let issues = 0;
for (const n of flow) {
  if (!n.wires) continue;
  for (const port of n.wires) {
    for (const target of port) {
      if (!allIds.has(target)) {
        console.warn(`BROKEN WIRE: ${n.id} → ${target}`);
        issues++;
      }
    }
  }
  if (n.type === 'link out' && n.links) {
    for (const lt of n.links) {
      if (!allIds.has(lt)) { console.warn(`BROKEN LINK OUT: ${n.id} → ${lt}`); issues++; }
    }
  }
  if (n.type === 'link in' && n.links) {
    for (const ls of n.links) {
      if (!allIds.has(ls)) { console.warn(`BROKEN LINK IN: ${n.id} ← ${ls}`); issues++; }
    }
  }
}

if (issues === 0) console.log('All references valid ✓');
console.log('Total nodes:', flow.length);

// Write
fs.writeFileSync(FLOW_PATH, JSON.stringify(flow, null, 2) + '\n');
console.log(`Wrote ${FLOW_PATH}`);
@@ -1,613 +0,0 @@
#!/usr/bin/env node
|
|
||||||
/**
|
|
||||||
* Step 4: Manual Controls per PS Detail Page
|
|
||||||
* - Creates 3 PS detail pages (/ps-west, /ps-north, /ps-south) with control groups
|
|
||||||
* - Adds control widgets: mode switches, pump speed sliders
|
|
||||||
* - Format functions to convert dashboard inputs to process node messages
|
|
||||||
* - Link-in/out routing between dashboard tab and PS tabs
|
|
||||||
* - Per-PS monitoring charts on detail pages
|
|
||||||
*/
|
|
||||||
const fs = require('fs');
|
|
||||||
const path = require('path');
|
|
||||||
|
|
||||||
const FLOW_PATH = path.join(__dirname, '..', 'docker', 'demo-flow.json');
|
|
||||||
const flow = JSON.parse(fs.readFileSync(FLOW_PATH, 'utf8'));
|
|
||||||
|
|
||||||
const byId = (id) => flow.find(n => n.id === id);
|
|
||||||
|
|
||||||
// =============================================
|
|
||||||
// Helper to create a standard set of controls for a PS
|
|
||||||
// =============================================
|
|
||||||
function createPSDetailPage(config) {
|
|
||||||
const {
|
|
||||||
psKey, // 'west', 'north', 'south'
|
|
||||||
psLabel, // 'PS West', 'PS North', 'PS South'
|
|
||||||
pagePath, // '/ps-west'
|
|
||||||
pageOrder, // 2, 3, 4
|
|
||||||
psNodeId, // 'demo_ps_west'
|
|
||||||
pumps, // [{id: 'demo_pump_w1', label: 'W1'}, ...]
|
|
||||||
controlModes, // ['levelbased','flowbased','manual']
|
|
||||||
defaultMode, // 'levelbased'
|
|
||||||
maxFlow, // 300
|
|
||||||
basinHeight, // 4
|
|
||||||
tabId, // 'demo_tab_ps_west'
|
|
||||||
} = config;
|
|
||||||
|
|
||||||
const prefix = `demo_ctrl_${psKey}`;
|
|
||||||
const nodes = [];
|
|
||||||
|
|
||||||
// === Page ===
|
|
||||||
nodes.push({
|
|
||||||
id: `demo_ui_page_ps_${psKey}_detail`,
|
|
||||||
type: "ui-page",
|
|
||||||
name: `${psLabel} Detail`,
|
|
||||||
ui: "demo_ui_base",
|
|
||||||
path: pagePath,
|
|
||||||
icon: "water_drop",
|
|
||||||
layout: "grid",
|
|
||||||
theme: "demo_ui_theme",
|
|
||||||
breakpoints: [{ name: "Default", px: "0", cols: "12" }],
|
|
||||||
order: pageOrder,
|
|
||||||
className: ""
|
|
||||||
});
|
|
||||||
|
|
||||||
// === Groups ===
|
|
||||||
nodes.push(
|
|
||||||
{
|
|
||||||
id: `${prefix}_grp_controls`,
|
|
||||||
type: "ui-group",
|
|
||||||
name: `${psLabel} Controls`,
|
|
||||||
page: `demo_ui_page_ps_${psKey}_detail`,
|
|
||||||
width: "6",
|
|
||||||
height: "1",
|
|
||||||
order: 1,
|
|
||||||
showTitle: true,
|
|
||||||
className: ""
|
|
||||||
},
|
|
||||||
{
|
|
||||||
id: `${prefix}_grp_monitoring`,
|
|
||||||
type: "ui-group",
|
|
||||||
name: `${psLabel} Monitoring`,
|
|
||||||
page: `demo_ui_page_ps_${psKey}_detail`,
|
|
||||||
width: "6",
|
|
||||||
height: "1",
|
|
||||||
order: 2,
|
|
||||||
showTitle: true,
|
|
||||||
className: ""
|
|
||||||
},
|
|
||||||
{
|
|
||||||
id: `${prefix}_grp_charts`,
|
|
||||||
type: "ui-group",
|
|
||||||
name: `${psLabel} Trends`,
|
|
||||||
page: `demo_ui_page_ps_${psKey}_detail`,
|
|
||||||
width: "12",
|
|
||||||
height: "1",
|
|
||||||
order: 3,
|
|
||||||
showTitle: true,
|
|
||||||
className: ""
|
|
||||||
}
|
|
||||||
);
|
|
||||||
|
|
||||||
// === PS Mode button group ===
|
|
||||||
const modeOptions = controlModes.map(m => ({
|
|
||||||
label: m === 'levelbased' ? 'Level' : m === 'flowbased' ? 'Flow' : m.charAt(0).toUpperCase() + m.slice(1),
|
|
||||||
value: m,
|
|
||||||
valueType: "str"
|
|
||||||
}));
|
|
||||||
|
|
||||||
nodes.push({
|
|
||||||
id: `${prefix}_mode`,
|
|
||||||
type: "ui-button-group",
|
|
||||||
z: "demo_tab_dashboard",
|
|
||||||
group: `${prefix}_grp_controls`,
|
|
||||||
name: `${psLabel} Mode`,
|
|
||||||
label: "Station Mode",
|
|
||||||
tooltip: "",
|
|
||||||
order: 1,
|
|
||||||
width: "6",
|
|
||||||
height: "1",
|
|
||||||
passthru: false,
|
|
||||||
options: modeOptions,
|
|
||||||
x: 120, y: 100 + pageOrder * 300,
|
|
||||||
wires: [[`${prefix}_fn_mode`]]
|
|
||||||
});
|
|
||||||
|
|
||||||
// Format: PS mode → setMode message
|
|
||||||
nodes.push({
|
|
||||||
id: `${prefix}_fn_mode`,
|
|
||||||
type: "function",
|
|
||||||
z: "demo_tab_dashboard",
|
|
||||||
name: `Fmt ${psLabel} Mode`,
|
|
||||||
func: `msg.topic = 'setMode';\nmsg.payload = msg.payload;\nreturn msg;`,
|
|
||||||
outputs: 1,
|
|
||||||
x: 320, y: 100 + pageOrder * 300,
|
|
||||||
wires: [[`${prefix}_link_cmd_out`]]
|
|
||||||
});
|
|
||||||
|
|
||||||
// === Manual Flow slider ===
|
|
||||||
nodes.push({
|
|
||||||
id: `${prefix}_flow`,
|
|
||||||
type: "ui-slider",
|
|
||||||
z: "demo_tab_dashboard",
|
|
||||||
group: `${prefix}_grp_controls`,
|
|
||||||
name: `${psLabel} Flow`,
|
|
||||||
label: "Manual Flow (m\u00b3/h)",
|
|
||||||
tooltip: "",
|
|
||||||
order: 2,
|
|
||||||
width: "6",
|
|
||||||
height: "1",
|
|
||||||
passthru: false,
|
|
||||||
outs: "end",
|
|
||||||
min: 0,
|
|
||||||
max: maxFlow,
|
|
||||||
step: 1,
|
|
||||||
x: 120, y: 140 + pageOrder * 300,
|
|
||||||
wires: [[`${prefix}_fn_flow`]]
|
|
||||||
});
|
|
||||||
|
|
||||||
// Format: flow slider → q_in message
|
|
||||||
nodes.push({
|
|
||||||
id: `${prefix}_fn_flow`,
|
|
||||||
type: "function",
|
|
||||||
z: "demo_tab_dashboard",
|
|
||||||
name: `Fmt ${psLabel} Flow`,
|
|
||||||
func: `msg.topic = 'q_in';\nmsg.payload = { value: Number(msg.payload), unit: 'm3/h' };\nreturn msg;`,
|
|
||||||
outputs: 1,
|
|
||||||
x: 320, y: 140 + pageOrder * 300,
|
|
||||||
wires: [[`${prefix}_link_cmd_out`]]
|
|
||||||
});
|
|
||||||
|
|
||||||
// === Pump controls ===
|
|
||||||
pumps.forEach((pump, pIdx) => {
|
|
||||||
const yOff = 180 + pageOrder * 300 + pIdx * 80;
|
|
||||||
|
|
||||||
// Pump mode button group
|
|
||||||
nodes.push({
|
|
||||||
id: `${prefix}_pump_${pump.label.toLowerCase()}_mode`,
|
|
||||||
type: "ui-button-group",
|
|
||||||
z: "demo_tab_dashboard",
|
|
||||||
group: `${prefix}_grp_controls`,
|
|
||||||
name: `${pump.label} Mode`,
|
|
||||||
label: `${pump.label} Mode`,
|
|
||||||
tooltip: "",
|
|
||||||
order: 3 + pIdx * 2,
|
|
||||||
width: "3",
|
|
||||||
height: "1",
|
|
||||||
passthru: false,
|
|
||||||
options: [
|
|
||||||
{ label: "Auto", value: "auto", valueType: "str" },
|
|
||||||
{ label: "Virtual", value: "virtualControl", valueType: "str" },
|
|
||||||
{ label: "Physical", value: "fysicalControl", valueType: "str" }
|
|
||||||
],
|
|
||||||
x: 120, y: yOff,
|
|
||||||
wires: [[`${prefix}_fn_pump_${pump.label.toLowerCase()}_mode`]]
|
|
||||||
});
|
|
||||||
|
|
||||||
// Format: pump mode
|
|
||||||
nodes.push({
|
|
||||||
id: `${prefix}_fn_pump_${pump.label.toLowerCase()}_mode`,
|
|
||||||
type: "function",
|
|
||||||
z: "demo_tab_dashboard",
|
|
||||||
name: `Fmt ${pump.label} Mode`,
|
|
||||||
func: `msg.topic = 'setMode';\nmsg.payload = msg.payload;\nmsg._targetNode = '${pump.id}';\nreturn msg;`,
|
|
||||||
outputs: 1,
|
|
||||||
x: 320, y: yOff,
|
|
||||||
wires: [[`${prefix}_link_pump_${pump.label.toLowerCase()}_out`]]
|
|
||||||
});
|
|
||||||
|
|
||||||
// Pump speed slider
|
|
||||||
nodes.push({
|
|
||||||
id: `${prefix}_pump_${pump.label.toLowerCase()}_speed`,
|
|
||||||
type: "ui-slider",
|
|
||||||
z: "demo_tab_dashboard",
|
|
||||||
group: `${prefix}_grp_controls`,
|
|
||||||
name: `${pump.label} Speed`,
|
|
||||||
label: `${pump.label} Speed (%)`,
|
|
||||||
tooltip: "",
|
|
||||||
order: 4 + pIdx * 2,
|
|
||||||
width: "3",
|
|
||||||
height: "1",
|
|
||||||
passthru: false,
|
|
||||||
outs: "end",
|
|
||||||
min: 0,
|
|
||||||
max: 100,
|
|
||||||
step: 1,
|
|
||||||
x: 120, y: yOff + 40,
|
|
||||||
wires: [[`${prefix}_fn_pump_${pump.label.toLowerCase()}_speed`]]
|
|
||||||
});
|
|
||||||
|
|
||||||
// Format: pump speed → execMovement
|
|
||||||
nodes.push({
|
|
||||||
id: `${prefix}_fn_pump_${pump.label.toLowerCase()}_speed`,
|
|
||||||
type: "function",
|
|
||||||
z: "demo_tab_dashboard",
|
|
||||||
name: `Fmt ${pump.label} Speed`,
|
|
||||||
func: `msg.topic = 'execMovement';\nmsg.payload = { source: 'dashboard', action: 'setpoint', setpoint: Number(msg.payload) };\nmsg._targetNode = '${pump.id}';\nreturn msg;`,
|
|
||||||
outputs: 1,
|
|
||||||
x: 320, y: yOff + 40,
|
|
||||||
wires: [[`${prefix}_link_pump_${pump.label.toLowerCase()}_out`]]
|
|
||||||
});
|
|
||||||
|
|
||||||
// Link-out for pump commands (dashboard → PS tab)
|
|
||||||
nodes.push({
|
|
||||||
id: `${prefix}_link_pump_${pump.label.toLowerCase()}_out`,
|
|
||||||
type: "link out",
|
|
||||||
z: "demo_tab_dashboard",
|
|
||||||
name: `→ ${pump.label} Cmd`,
|
|
||||||
mode: "link",
|
|
||||||
links: [`${prefix}_link_pump_${pump.label.toLowerCase()}_in`],
|
|
||||||
x: 520, y: yOff + 20
|
|
||||||
});
|
|
||||||
|
|
||||||
// Link-in on PS tab
|
|
||||||
nodes.push({
|
|
||||||
id: `${prefix}_link_pump_${pump.label.toLowerCase()}_in`,
|
|
||||||
type: "link in",
|
|
||||||
z: tabId,
|
|
||||||
name: `← ${pump.label} Cmd`,
|
|
||||||
links: [`${prefix}_link_pump_${pump.label.toLowerCase()}_out`],
|
|
||||||
x: 120, y: 540 + pIdx * 60,
|
|
||||||
wires: [[pump.id]]
|
|
||||||
});
|
|
||||||
});
|
|
||||||
|
|
||||||
  // === PS command link-out (dashboard → PS tab) ===
  nodes.push({
    id: `${prefix}_link_cmd_out`,
    type: "link out",
    z: "demo_tab_dashboard",
    name: `→ ${psLabel} Cmd`,
    mode: "link",
    links: [`${prefix}_link_cmd_in`],
    x: 520, y: 120 + pageOrder * 300
  });

  // Link-in on PS tab for PS-level commands
  nodes.push({
    id: `${prefix}_link_cmd_in`,
    type: "link in",
    z: tabId,
    name: `← ${psLabel} Cmd`,
    links: [`${prefix}_link_cmd_out`],
    x: 120, y: 480,
    wires: [[psNodeId]]
  });

  // === Monitoring widgets on detail page ===
  // Re-use existing data from the PS parse functions on dashboard tab
  // Create a link-in to receive PS data and parse for detail page

  nodes.push({
    id: `${prefix}_link_detail_data_out`,
    type: "link out",
    z: tabId,
    name: `→ ${psLabel} Detail`,
    mode: "link",
    links: [`${prefix}_link_detail_data_in`],
    x: 1080, y: 400
  });

  // Add to PS node wires[0]
  const psNode = byId(psNodeId);
  if (psNode && psNode.wires && psNode.wires[0]) {
    psNode.wires[0].push(`${prefix}_link_detail_data_out`);
  }

  nodes.push({
    id: `${prefix}_link_detail_data_in`,
    type: "link in",
    z: "demo_tab_dashboard",
    name: `← ${psLabel} Detail`,
    links: [`${prefix}_link_detail_data_out`],
    x: 75, y: 50 + pageOrder * 300,
    wires: [[`${prefix}_fn_detail_parse`]]
  });
  // Parse function for detail monitoring
  nodes.push({
    id: `${prefix}_fn_detail_parse`,
    type: "function",
    z: "demo_tab_dashboard",
    name: `Parse ${psLabel} Detail`,
    func: `const p = msg.payload || {};
const cache = context.get('c') || {};
const keys = Object.keys(p);
const pick = (prefixes) => { for (const pfx of prefixes) { const k = keys.find(k => k.startsWith(pfx)); if (k) { const v = Number(p[k]); if (Number.isFinite(v)) return v; } } return null; };

const level = pick(['level.predicted.atequipment','level.measured.atequipment']);
const volume = pick(['volume.predicted.atequipment']);
const netFlow = pick(['netFlowRate.predicted.atequipment']);
const fillPct = pick(['volumePercent.predicted.atequipment']);
const direction = p.direction || cache.direction || '?';

if (level !== null) cache.level = level;
if (volume !== null) cache.volume = volume;
if (netFlow !== null) cache.netFlow = netFlow;
if (fillPct !== null) cache.fillPct = fillPct;
cache.direction = direction;
context.set('c', cache);

const now = Date.now();
const dirArrow = cache.direction === 'filling' ? '\\u2191' : cache.direction === 'emptying' ? '\\u2193' : '\\u2014';
const status = [
  dirArrow + ' ' + (cache.direction || ''),
  cache.netFlow !== undefined ? Math.abs(cache.netFlow).toFixed(0) + ' m\\u00b3/h' : '',
].filter(s => s.trim()).join(' | ');

return [
  cache.level !== undefined ? {topic:'${psLabel} Level', payload: cache.level, timestamp: now} : null,
  cache.netFlow !== undefined ? {topic:'${psLabel} Flow', payload: cache.netFlow, timestamp: now} : null,
  {topic:'${psLabel} Status', payload: status},
  cache.fillPct !== undefined ? {payload: Number(cache.fillPct.toFixed(1))} : null,
  cache.level !== undefined ? {payload: Number(cache.level.toFixed(2))} : null
];`,
    outputs: 5,
    x: 280, y: 50 + pageOrder * 300,
    wires: [
      [`${prefix}_chart_level`],
      [`${prefix}_chart_flow`],
      [`${prefix}_text_status`],
      [`${prefix}_gauge_fill`],
      [`${prefix}_gauge_tank`]
    ]
  });
  // Level chart
  nodes.push({
    id: `${prefix}_chart_level`,
    type: "ui-chart",
    z: "demo_tab_dashboard",
    group: `${prefix}_grp_charts`,
    name: `${psLabel} Level`,
    label: "Basin Level (m)",
    order: 1,
    width: "6",
    height: "5",
    chartType: "line",
    category: "topic",
    categoryType: "msg",
    xAxisType: "time",
    yAxisLabel: "m",
    removeOlder: "10",
    removeOlderUnit: "60",
    action: "append",
    pointShape: "false",
    pointRadius: 0,
    interpolation: "linear",
    showLegend: true,
    xAxisProperty: "",
    xAxisPropertyType: "timestamp",
    yAxisProperty: "payload",
    yAxisPropertyType: "msg",
    colors: ["#0094ce", "#FF7F0E", "#2CA02C"],
    textColor: ["#aaaaaa"],
    textColorDefault: false,
    gridColor: ["#333333"],
    gridColorDefault: false,
    x: 510, y: 30 + pageOrder * 300,
    wires: []
  });
  // Flow chart
  nodes.push({
    id: `${prefix}_chart_flow`,
    type: "ui-chart",
    z: "demo_tab_dashboard",
    group: `${prefix}_grp_charts`,
    name: `${psLabel} Flow`,
    label: "Net Flow (m\u00b3/h)",
    order: 2,
    width: "6",
    height: "5",
    chartType: "line",
    category: "topic",
    categoryType: "msg",
    xAxisType: "time",
    yAxisLabel: "m\u00b3/h",
    removeOlder: "10",
    removeOlderUnit: "60",
    action: "append",
    pointShape: "false",
    pointRadius: 0,
    interpolation: "linear",
    showLegend: true,
    xAxisProperty: "",
    xAxisPropertyType: "timestamp",
    yAxisProperty: "payload",
    yAxisPropertyType: "msg",
    colors: ["#4fc3f7", "#FF7F0E", "#2CA02C"],
    textColor: ["#aaaaaa"],
    textColorDefault: false,
    gridColor: ["#333333"],
    gridColorDefault: false,
    x: 510, y: 60 + pageOrder * 300,
    wires: []
  });
  // Status text
  nodes.push({
    id: `${prefix}_text_status`,
    type: "ui-text",
    z: "demo_tab_dashboard",
    group: `${prefix}_grp_monitoring`,
    name: `${psLabel} Status`,
    label: "Status",
    order: 1,
    width: "6",
    height: "1",
    format: "{{msg.payload}}",
    layout: "row-spread",
    x: 510, y: 80 + pageOrder * 300,
    wires: []
  });
  // Fill % gauge
  nodes.push({
    id: `${prefix}_gauge_fill`,
    type: "ui-gauge",
    z: "demo_tab_dashboard",
    group: `${prefix}_grp_monitoring`,
    name: `${psLabel} Fill`,
    gtype: "gauge-34",
    gstyle: "Rounded",
    title: "Fill",
    units: "%",
    prefix: "",
    suffix: "%",
    min: 0,
    max: 100,
    segments: [
      { color: "#f44336", from: 0 },
      { color: "#ff9800", from: 10 },
      { color: "#4caf50", from: 25 },
      { color: "#ff9800", from: 75 },
      { color: "#f44336", from: 90 }
    ],
    width: 3,
    height: 3,
    order: 2,
    className: "",
    x: 700, y: 80 + pageOrder * 300,
    wires: []
  });
  // Tank gauge
  nodes.push({
    id: `${prefix}_gauge_tank`,
    type: "ui-gauge",
    z: "demo_tab_dashboard",
    group: `${prefix}_grp_monitoring`,
    name: `${psLabel} Tank`,
    gtype: "gauge-tank",
    gstyle: "Rounded",
    title: "Level",
    units: "m",
    prefix: "",
    suffix: "m",
    min: 0,
    max: basinHeight,
    segments: [
      { color: "#f44336", from: 0 },
      { color: "#ff9800", from: basinHeight * 0.08 },
      { color: "#2196f3", from: basinHeight * 0.25 },
      { color: "#ff9800", from: basinHeight * 0.62 },
      { color: "#f44336", from: basinHeight * 0.8 }
    ],
    width: 3,
    height: 4,
    order: 3,
    className: "",
    x: 700, y: 40 + pageOrder * 300,
    wires: []
  });

  return nodes;
}

// =============================================
// Create detail pages for each PS
// =============================================
const westNodes = createPSDetailPage({
  psKey: 'west',
  psLabel: 'PS West',
  pagePath: '/ps-west',
  pageOrder: 2,
  psNodeId: 'demo_ps_west',
  pumps: [
    { id: 'demo_pump_w1', label: 'W1' },
    { id: 'demo_pump_w2', label: 'W2' }
  ],
  controlModes: ['levelbased', 'flowbased', 'manual'],
  defaultMode: 'levelbased',
  maxFlow: 300,
  basinHeight: 4,
  tabId: 'demo_tab_ps_west',
});

const northNodes = createPSDetailPage({
  psKey: 'north',
  psLabel: 'PS North',
  pagePath: '/ps-north',
  pageOrder: 3,
  psNodeId: 'demo_ps_north',
  pumps: [
    { id: 'demo_pump_n1', label: 'N1' }
  ],
  controlModes: ['levelbased', 'flowbased', 'manual'],
  defaultMode: 'flowbased',
  maxFlow: 200,
  basinHeight: 3,
  tabId: 'demo_tab_ps_north',
});

const southNodes = createPSDetailPage({
  psKey: 'south',
  psLabel: 'PS South',
  pagePath: '/ps-south',
  pageOrder: 4,
  psNodeId: 'demo_ps_south',
  pumps: [
    { id: 'demo_pump_s1', label: 'S1' }
  ],
  controlModes: ['levelbased', 'flowbased', 'manual'],
  defaultMode: 'manual',
  maxFlow: 100,
  basinHeight: 2.5,
  tabId: 'demo_tab_ps_south',
});

flow.push(...westNodes, ...northNodes, ...southNodes);
// =============================================
// Validate
// =============================================
const allIds = new Set(flow.map(n => n.id));
let issues = 0;
// Check for duplicate IDs
const idCounts = {};
flow.forEach(n => { idCounts[n.id] = (idCounts[n.id] || 0) + 1; });
for (const [id, count] of Object.entries(idCounts)) {
  if (count > 1) { console.warn(`DUPLICATE ID: ${id} (${count} instances)`); issues++; }
}

for (const n of flow) {
  if (!n.wires) continue;
  for (const port of n.wires) {
    for (const target of port) {
      if (!allIds.has(target)) {
        console.warn(`BROKEN WIRE: ${n.id} → ${target}`);
        issues++;
      }
    }
  }
  if (n.type === 'link out' && n.links) {
    for (const lt of n.links) {
      if (!allIds.has(lt)) { console.warn(`BROKEN LINK OUT: ${n.id} → ${lt}`); issues++; }
    }
  }
  if (n.type === 'link in' && n.links) {
    for (const ls of n.links) {
      if (!allIds.has(ls)) { console.warn(`BROKEN LINK IN: ${n.id} ← ${ls}`); issues++; }
    }
  }
}

if (issues === 0) console.log('All references valid ✓');
else console.log(`Found ${issues} issues`);

// Count nodes per tab
const tabCounts = {};
for (const n of flow) {
  if (n.z) tabCounts[n.z] = (tabCounts[n.z] || 0) + 1;
}
console.log('Nodes per tab:', JSON.stringify(tabCounts, null, 2));
console.log('Total nodes:', flow.length);

// Count new nodes added
const newNodeCount = westNodes.length + northNodes.length + southNodes.length;
console.log(`Added ${newNodeCount} new nodes (${westNodes.length} west + ${northNodes.length} north + ${southNodes.length} south)`);

// Write
fs.writeFileSync(FLOW_PATH, JSON.stringify(flow, null, 2) + '\n');
console.log(`Wrote ${FLOW_PATH}`);
@@ -1,279 +0,0 @@
#!/usr/bin/env node
/**
 * Script to update docker/demo-flow.json with Fixes 2-5 from the plan.
 * Run from project root: node scripts/update-demo-flow.js
 */
const fs = require('fs');
const path = require('path');

const flowPath = path.join(__dirname, '..', 'docker', 'demo-flow.json');
const flow = JSON.parse(fs.readFileSync(flowPath, 'utf8'));

// === Fix 2: Enable simulator on 9 measurement nodes ===
const simMeasIds = [
  'demo_meas_flow', 'demo_meas_do', 'demo_meas_nh4',
  'demo_meas_ft_n1', 'demo_meas_eff_flow', 'demo_meas_eff_do',
  'demo_meas_eff_nh4', 'demo_meas_eff_no3', 'demo_meas_eff_tss'
];
simMeasIds.forEach(id => {
  const node = flow.find(n => n.id === id);
  if (node) {
    node.simulator = true;
    console.log('Enabled simulator on', id);
  } else {
    console.error('NOT FOUND:', id);
  }
});
// === Fix 2: Remove 18 inject+function sim pairs ===
const removeSimIds = [
  'demo_inj_meas_flow', 'demo_fn_sim_flow',
  'demo_inj_meas_do', 'demo_fn_sim_do',
  'demo_inj_meas_nh4', 'demo_fn_sim_nh4',
  'demo_inj_ft_n1', 'demo_fn_sim_ft_n1',
  'demo_inj_eff_flow', 'demo_fn_sim_eff_flow',
  'demo_inj_eff_do', 'demo_fn_sim_eff_do',
  'demo_inj_eff_nh4', 'demo_fn_sim_eff_nh4',
  'demo_inj_eff_no3', 'demo_fn_sim_eff_no3',
  'demo_inj_eff_tss', 'demo_fn_sim_eff_tss'
];

// === Fix 5: Remove manual pump startup/setpoint injectors ===
const removeManualIds = [
  'demo_inj_w1_startup', 'demo_inj_w1_setpoint',
  'demo_inj_w2_startup', 'demo_inj_w2_setpoint',
  'demo_inj_n1_startup',
  'demo_inj_s1_startup'
];

const allRemoveIds = new Set([...removeSimIds, ...removeManualIds]);
const before = flow.length;
const filtered = flow.filter(n => !allRemoveIds.has(n.id));
console.log(`Removed ${before - filtered.length} nodes (expected 24)`);

// Remove wires to removed nodes from remaining nodes
filtered.forEach(n => {
  if (n.wires && Array.isArray(n.wires)) {
    n.wires = n.wires.map(wireGroup => {
      if (Array.isArray(wireGroup)) {
        return wireGroup.filter(w => !allRemoveIds.has(w));
      }
      return wireGroup;
    });
  }
});
// === Fix 3 (demo part): Add speedUpFactor to reactor ===
const reactor = filtered.find(n => n.id === 'demo_reactor');
if (reactor) {
  reactor.speedUpFactor = 1;
  console.log('Added speedUpFactor=1 to reactor');
}

// === Fix 4: Add pressure measurement nodes ===
const maxY = Math.max(...filtered.filter(n => n.z === 'demo_tab_wwtp').map(n => n.y || 0));

const ptBaseConfig = {
  scaling: true,
  i_offset: 0,
  smooth_method: 'mean',
  count: 3,
  category: 'sensor',
  assetType: 'pressure',
  enableLog: false,
  logLevel: 'error',
  positionIcon: '',
  hasDistance: false
};

// Function to extract level from PS output and convert to hydrostatic pressure
const levelExtractFunc = [
  '// Extract basin level from PS output and convert to hydrostatic pressure (mbar)',
  '// P = rho * g * h, rho=1000 kg/m3, g=9.81 m/s2',
  'const p = msg.payload || {};',
  'const keys = Object.keys(p);',
  'const levelKey = keys.find(k => k.startsWith("level.predicted.atequipment") || k.startsWith("level.measured.atequipment"));',
  'if (!levelKey) return null;',
  'const h = Number(p[levelKey]);',
  'if (!Number.isFinite(h)) return null;',
  'msg.topic = "measurement";',
  'msg.payload = Math.round(h * 98.1 * 10) / 10; // mbar',
  'return msg;'
].join('\n');
const newNodes = [
  // Comment
  {
    id: 'demo_comment_pressure',
    type: 'comment',
    z: 'demo_tab_wwtp',
    name: '=== PRESSURE MEASUREMENTS (per pumping station) ===',
    info: '',
    x: 320,
    y: maxY + 40
  },

  // --- PS West upstream PT ---
  {
    id: 'demo_fn_level_to_pressure_w',
    type: 'function',
    z: 'demo_tab_wwtp',
    name: 'Level\u2192Pressure (West)',
    func: levelExtractFunc,
    outputs: 1,
    x: 370,
    y: maxY + 80,
    wires: [['demo_meas_pt_w_up']]
  },
  {
    id: 'demo_meas_pt_w_up',
    type: 'measurement',
    z: 'demo_tab_wwtp',
    name: 'PT-W-UP (West Upstream)',
    ...ptBaseConfig,
    i_min: 0, i_max: 5000, o_min: 0, o_max: 5000,
    simulator: false,
    uuid: 'pt-w-up-001',
    supplier: 'Endress+Hauser',
    model: 'Cerabar-PMC51',
    unit: 'mbar',
    assetTagNumber: 'PT-W-UP',
    positionVsParent: 'upstream',
    x: 580,
    y: maxY + 80,
    wires: [['demo_link_process_out'], ['demo_link_influx_out'], ['demo_pump_w1', 'demo_pump_w2']]
  },
  // PS West downstream PT (simulated)
  {
    id: 'demo_meas_pt_w_down',
    type: 'measurement',
    z: 'demo_tab_wwtp',
    name: 'PT-W-DN (West Downstream)',
    ...ptBaseConfig,
    i_min: 0, i_max: 5000, o_min: 0, o_max: 5000,
    simulator: true,
    uuid: 'pt-w-dn-001',
    supplier: 'Endress+Hauser',
    model: 'Cerabar-PMC51',
    unit: 'mbar',
    assetTagNumber: 'PT-W-DN',
    positionVsParent: 'downstream',
    x: 580,
    y: maxY + 140,
    wires: [['demo_link_process_out'], ['demo_link_influx_out'], ['demo_pump_w1', 'demo_pump_w2']]
  },

  // --- PS North upstream PT ---
  {
    id: 'demo_fn_level_to_pressure_n',
    type: 'function',
    z: 'demo_tab_wwtp',
    name: 'Level\u2192Pressure (North)',
    func: levelExtractFunc,
    outputs: 1,
    x: 370,
    y: maxY + 220,
    wires: [['demo_meas_pt_n_up']]
  },
  {
    id: 'demo_meas_pt_n_up',
    type: 'measurement',
    z: 'demo_tab_wwtp',
    name: 'PT-N-UP (North Upstream)',
    ...ptBaseConfig,
    i_min: 0, i_max: 5000, o_min: 0, o_max: 5000,
    simulator: false,
    uuid: 'pt-n-up-001',
    supplier: 'Endress+Hauser',
    model: 'Cerabar-PMC51',
    unit: 'mbar',
    assetTagNumber: 'PT-N-UP',
    positionVsParent: 'upstream',
    x: 580,
    y: maxY + 220,
    wires: [['demo_link_process_out'], ['demo_link_influx_out'], ['demo_pump_n1']]
  },
  {
    id: 'demo_meas_pt_n_down',
    type: 'measurement',
    z: 'demo_tab_wwtp',
    name: 'PT-N-DN (North Downstream)',
    ...ptBaseConfig,
    i_min: 0, i_max: 5000, o_min: 0, o_max: 5000,
    simulator: true,
    uuid: 'pt-n-dn-001',
    supplier: 'Endress+Hauser',
    model: 'Cerabar-PMC51',
    unit: 'mbar',
    assetTagNumber: 'PT-N-DN',
    positionVsParent: 'downstream',
    x: 580,
    y: maxY + 280,
    wires: [['demo_link_process_out'], ['demo_link_influx_out'], ['demo_pump_n1']]
  },

  // --- PS South upstream PT ---
  {
    id: 'demo_fn_level_to_pressure_s',
    type: 'function',
    z: 'demo_tab_wwtp',
    name: 'Level\u2192Pressure (South)',
    func: levelExtractFunc,
    outputs: 1,
    x: 370,
    y: maxY + 360,
    wires: [['demo_meas_pt_s_up']]
  },
  {
    id: 'demo_meas_pt_s_up',
    type: 'measurement',
    z: 'demo_tab_wwtp',
    name: 'PT-S-UP (South Upstream)',
    ...ptBaseConfig,
    i_min: 0, i_max: 5000, o_min: 0, o_max: 5000,
    simulator: false,
    uuid: 'pt-s-up-001',
    supplier: 'Endress+Hauser',
    model: 'Cerabar-PMC51',
    unit: 'mbar',
    assetTagNumber: 'PT-S-UP',
    positionVsParent: 'upstream',
    x: 580,
    y: maxY + 360,
    wires: [['demo_link_process_out'], ['demo_link_influx_out'], ['demo_pump_s1']]
  },
  {
    id: 'demo_meas_pt_s_down',
    type: 'measurement',
    z: 'demo_tab_wwtp',
    name: 'PT-S-DN (South Downstream)',
    ...ptBaseConfig,
    i_min: 0, i_max: 5000, o_min: 0, o_max: 5000,
    simulator: true,
    uuid: 'pt-s-dn-001',
    supplier: 'Endress+Hauser',
    model: 'Cerabar-PMC51',
    unit: 'mbar',
    assetTagNumber: 'PT-S-DN',
    positionVsParent: 'downstream',
    x: 580,
    y: maxY + 420,
    wires: [['demo_link_process_out'], ['demo_link_influx_out'], ['demo_pump_s1']]
  }
];
// Wire PS output port 0 to the level-to-pressure function nodes
const psWest = filtered.find(n => n.id === 'demo_ps_west');
const psNorth = filtered.find(n => n.id === 'demo_ps_north');
const psSouth = filtered.find(n => n.id === 'demo_ps_south');

if (psWest && psWest.wires[0]) psWest.wires[0].push('demo_fn_level_to_pressure_w');
if (psNorth && psNorth.wires[0]) psNorth.wires[0].push('demo_fn_level_to_pressure_n');
if (psSouth && psSouth.wires[0]) psSouth.wires[0].push('demo_fn_level_to_pressure_s');

// Combine and write
const result = [...filtered, ...newNodes];
console.log(`Final flow has ${result.length} nodes`);

fs.writeFileSync(flowPath, JSON.stringify(result, null, 2) + '\n');
console.log('Done! Written to docker/demo-flow.json');
40
third_party/docs/README.md
vendored
@@ -1,40 +0,0 @@
# EVOLV Scientific & Technical Reference Library

## Purpose

This directory contains curated reference documents for EVOLV's domain-specialist agents. These summaries distill authoritative sources into actionable knowledge that agents should consult **before making scientific or engineering claims**.

## How Agents Should Use This

1. **Before making domain claims**: Read the relevant reference doc to verify your reasoning
2. **Cite sources**: When referencing scientific facts, point to the specific reference doc and its cited sources
3. **Acknowledge uncertainty**: If the reference docs don't cover a topic, say so rather than guessing
4. **Cross-reference with skills**: Combine these references with `.agents/skills/` SKILL.md files for implementation context

## Index

| File | Domain | Used By Agents |
|------|--------|---------------|
| [`asm-models.md`](asm-models.md) | Activated Sludge Models (ASM1-ASM3) | biological-process-engineer |
| [`settling-models.md`](settling-models.md) | Sludge Settling & Clarifier Models | biological-process-engineer |
| [`pump-affinity-laws.md`](pump-affinity-laws.md) | Pump Affinity Laws & Curve Theory | mechanical-process-engineer |
| [`pid-control-theory.md`](pid-control-theory.md) | PID Control for Process Applications | mechanical-process-engineer, node-red-runtime |
| [`signal-processing-sensors.md`](signal-processing-sensors.md) | Sensor Signal Conditioning | instrumentation-measurement |
| [`wastewater-compliance-nl.md`](wastewater-compliance-nl.md) | Dutch Wastewater Regulations | commissioning-compliance, biological-process-engineer |
| [`influxdb-schema-design.md`](influxdb-schema-design.md) | InfluxDB Time-Series Best Practices | telemetry-database |
| [`ot-security-iec62443.md`](ot-security-iec62443.md) | OT Security Standards | ot-security-integration |

## Sources Directory

The `sources/` subdirectory is for placing actual PDFs of scientific papers, standards, and technical manuals. Agents should prefer these curated summaries but can reference originals when available.

## Validation Status

All reference documents have been validated against authoritative sources including:
- IWA Scientific and Technical Reports (ASM models)
- Peer-reviewed publications (Takacs 1991, Vesilind, Burger-Diehl)
- Engineering Toolbox (pump affinity laws)
- ISA publications (Astrom & Hagglund PID control)
- IEC standards (61298, 62443)
- EU Directive 91/271/EEC (wastewater compliance)
- InfluxDB official documentation (schema design)
0
third_party/docs/sources/.gitkeep
vendored
89
wiki/SCHEMA.md
Normal file
@@ -0,0 +1,89 @@
# Project Wiki Schema

## Purpose

LLM-maintained knowledge base for this project. The LLM writes and maintains everything. You read it (ideally in Obsidian). Knowledge compounds across sessions instead of being lost in chat history.

## Directory Structure

```
wiki/
  SCHEMA.md             — this file (how to maintain the wiki)
  index.md              — catalog of all pages with one-line summaries
  log.md                — chronological record of updates
  overview.md           — project overview and current status
  metrics.md            — all numbers with provenance
  knowledge-graph.yaml  — structured data, machine-queryable
  tools/                — search, lint, query scripts
  concepts/             — core ideas and mechanisms
  architecture/         — design decisions, system internals
  findings/             — honest results (what worked AND what didn't)
  sessions/             — per-session summaries
```

## Page Conventions

### Frontmatter

Every page starts with YAML frontmatter:

```yaml
---
title: Page Title
created: YYYY-MM-DD
updated: YYYY-MM-DD
status: proven | disproven | evolving | speculative
tags: [tag1, tag2]
sources: [path/to/file.py, commit abc1234]
---
```
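The frontmatter contract above is easy to machine-check. Below is a minimal sketch of such a check; it is illustrative only, not part of `wiki/tools/`, and the function name and sample page are invented for the example:

```javascript
// Illustrative frontmatter validator: verifies a page starts with a YAML
// block containing the required keys and one of the allowed status values.
const REQUIRED = ['title', 'created', 'updated', 'status', 'tags'];
const STATUSES = ['proven', 'disproven', 'evolving', 'speculative'];

function checkFrontmatter(text) {
  const m = text.match(/^---\n([\s\S]*?)\n---/);
  if (!m) return ['missing frontmatter block'];
  // Collect the top-level keys (text before the first colon on each line).
  const keys = m[1].split('\n')
    .map(line => line.split(':')[0].trim())
    .filter(Boolean);
  const errors = REQUIRED.filter(k => !keys.includes(k))
    .map(k => `missing key: ${k}`);
  const status = (m[1].match(/^status:\s*(\S+)/m) || [])[1];
  if (status && !STATUSES.includes(status)) errors.push(`bad status: ${status}`);
  return errors;
}

const page = '---\ntitle: Demo\ncreated: 2026-01-01\nupdated: 2026-01-02\nstatus: proven\ntags: [demo]\n---\nBody';
console.log(checkFrontmatter(page)); // []
```

A real linter would also verify the date format and the `sources` entries; this only shows the shape of the check.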

### Status values

- **proven**: tested and verified with evidence
- **disproven**: tested and honestly shown NOT to work (document WHY)
- **evolving**: partially working, boundary not fully mapped
- **speculative**: proposed but not yet tested

### Cross-references

Use `[[Page Name]]` Obsidian-style wikilinks.

### Contradictions

When new evidence contradicts a prior claim, DON'T delete the old claim. Add:

```
> [!warning] Superseded
> This was shown to be incorrect on YYYY-MM-DD. See [[New Finding]].
```

### Honesty rule

If something doesn't work, say so. If a result was a false positive, document how it was discovered. The wiki must be trustworthy.

## Operations

### Ingest (after a session or new source)

1. Read outputs, commits, findings
2. Update relevant pages
3. Create new pages for new concepts
4. Update `index.md`, `log.md`, `knowledge-graph.yaml`
5. Check for contradictions with existing pages

### Query

1. Use `python3 wiki/tools/query.py` for structured lookup
2. Use `wiki/tools/search.sh` for full-text
3. Read `index.md` to find relevant pages
4. File valuable answers back into the wiki

### Lint (periodically)

```bash
bash wiki/tools/lint.sh
```

Checks: orphan pages, broken wikilinks, missing frontmatter, index completeness.
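The broken-wikilink check can be sketched as follows, assuming pages are supplied as an in-memory name-to-markdown map (the real `wiki/tools/lint.sh` walks the filesystem and may differ):

```javascript
// Illustrative wikilink check: report [[links]] that point to no known page.
function brokenWikilinks(pages) {
  const names = new Set(Object.keys(pages));
  const broken = [];
  for (const [page, text] of Object.entries(pages)) {
    // Match every [[Page Name]] occurrence in the page body.
    for (const m of text.matchAll(/\[\[([^\]]+)\]\]/g)) {
      if (!names.has(m[1])) broken.push(`${page} -> ${m[1]}`);
    }
  }
  return broken;
}

const pages = {
  'Overview': 'See [[Metrics]] and [[Missing Page]].',
  'Metrics': 'Numbers live here.'
};
console.log(brokenWikilinks(pages)); // [ 'Overview -> Missing Page' ]
```

The same pass can feed the orphan-page check: any page never mentioned in another page's match list is an orphan candidate.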

## Data Layer

- `knowledge-graph.yaml` — structured YAML with every metric and data point
- `metrics.md` — human-readable dashboard
- When adding new results, update BOTH the wiki page AND the knowledge graph
- The knowledge graph is the single source of truth for numbers

## Source of Truth Hierarchy

1. **Test results** (actual outputs) — highest authority
2. **Code** (current state) — second authority
3. **Knowledge graph** (knowledge-graph.yaml) — structured metrics
4. **Wiki pages** — synthesis, may lag
5. **Chat/memory** — ephemeral, may be stale
56
wiki/architecture/3d-pump-curves.md
Normal file
@@ -0,0 +1,56 @@
---
title: 3D Pump Curve Architecture
created: 2026-04-07
updated: 2026-04-07
status: proven
tags: [predict, curves, interpolation, rotatingMachine]
sources: [nodes/generalFunctions/src/predict/predict_class.js, nodes/rotatingMachine/src/specificClass.js]
---

# 3D Pump Curve Prediction

## Data Structure

A family of 2D curves indexed by pressure (f-dimension):

- **X-axis**: control position (0-100%)
- **Y-axis**: flow (nq) or power (np) in canonical units
- **F-dimension**: pressure (Pa) — the 3rd dimension

Raw curves are in curve units (m3/h, kW, mbar). `_normalizeMachineCurve()` converts to canonical (m3/s, W, Pa).

## Interpolation

Monotonic cubic spline (Fritsch-Carlson) in both dimensions:

- **X-Y splines**: at each discrete pressure level
- **F-splines**: across pressure levels for intermediate pressure interpolation
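The Fritsch-Carlson scheme named above can be sketched as a small interpolator: secant slopes give provisional tangents, which are then limited so the cubic Hermite interpolant never overshoots monotone data. This is an illustration of the technique, not the actual predict class; the function name and sample curve are invented:

```javascript
// Minimal Fritsch-Carlson monotone cubic interpolation sketch.
function monotoneSpline(xs, ys) {
  const n = xs.length;
  const d = [], m = new Array(n);
  // Secant slopes between consecutive points.
  for (let i = 0; i < n - 1; i++) d.push((ys[i + 1] - ys[i]) / (xs[i + 1] - xs[i]));
  m[0] = d[0];
  m[n - 1] = d[n - 2];
  // Interior tangents: average of adjacent secants, zero at local extrema.
  for (let i = 1; i < n - 1; i++) {
    m[i] = d[i - 1] * d[i] <= 0 ? 0 : (d[i - 1] + d[i]) / 2;
  }
  // Limit tangents so each cubic segment stays monotone.
  for (let i = 0; i < n - 1; i++) {
    if (d[i] === 0) { m[i] = 0; m[i + 1] = 0; continue; }
    const a = m[i] / d[i], b = m[i + 1] / d[i];
    const s = a * a + b * b;
    if (s > 9) {
      const t = 3 / Math.sqrt(s);
      m[i] = t * a * d[i];
      m[i + 1] = t * b * d[i];
    }
  }
  return (x) => {
    // Flat extrapolation (clamped) outside the curve range.
    if (x <= xs[0]) return ys[0];
    if (x >= xs[n - 1]) return ys[n - 1];
    let i = 0;
    while (x > xs[i + 1]) i++;
    const h = xs[i + 1] - xs[i], t = (x - xs[i]) / h;
    // Cubic Hermite basis functions.
    const h00 = (1 + 2 * t) * (1 - t) * (1 - t), h10 = t * (1 - t) * (1 - t);
    const h01 = t * t * (3 - 2 * t), h11 = t * t * (t - 1);
    return h00 * ys[i] + h10 * h * m[i] + h01 * ys[i + 1] + h11 * h * m[i + 1];
  };
}

// Example: control position (%) → flow, interpolated without overshoot.
const flowAt = monotoneSpline([0, 25, 50, 75, 100], [0, 40, 90, 120, 130]);
```

The clamping behavior matches the flat extrapolation described under Boundary Behavior below; the f-dimension lookup would apply the same construction a second time across pressure levels.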
|
||||||
|
|
||||||
|
## Prediction Flow

```
predict.y(x):
1. Clamp x to [currentFxyXMin, currentFxyXMax]
2. Normalize x to [normMin, normMax]
3. Evaluate spline at normalized x for current fDimension
4. Return y in canonical units (m3/s or W)
```

## Unit Conversion Chain

```
Raw curve (m3/h, kW, mbar)
→ _normalizeMachineCurve → canonical (m3/s, W, Pa)
→ predict class → canonical output
→ MeasurementContainer.getCurrentValue(outputUnit) → output units
```

No double conversion. Clean separation: specificClass handles units, predict handles normalization/interpolation.
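The normalization step reduces to fixed unit factors. A minimal sketch with a hypothetical helper, not the actual `_normalizeMachineCurve()` API:

```javascript
// Illustrative curve-unit → canonical-unit factors.
const TO_CANONICAL = {
  "m3/h": (q) => q / 3600,   // volumetric flow → m3/s
  "kW":   (p) => p * 1000,   // power → W
  "mbar": (p) => p * 100,    // pressure → Pa
};

function toCanonical(value, unit) {
  const fn = TO_CANONICAL[unit];
  if (!fn) throw new Error(`unknown curve unit: ${unit}`);
  return fn(value);
}
```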
## Three Predict Instances per Machine

- `predictFlow`: control % → flow (nq curve)
- `predictPower`: control % → power (np curve)
- `predictCtrl`: flow → control % (reversed nq curve)

## Boundary Behavior

- Below/above the curve's X range: flat extrapolation (clamped)
- Below/above the f-dimension range: clamped to the min/max pressure level

## Performance

- `y(x)`: O(log n), effectively O(1) for 5-10 data points
- `buildAllFxyCurves`: sub-10ms for typical curves
- Full caching of normalized curves, splines, and calculated curves
wiki/architecture/deployment-blueprint.md (new file, 278 lines)
@@ -0,0 +1,278 @@
---
title: EVOLV Deployment Blueprint
created: 2026-03-01
updated: 2026-04-07
status: evolving
tags: [deployment, docker, edge, site, central]
---

# EVOLV Deployment Blueprint

## Purpose

This document turns the current EVOLV architecture into a concrete deployment model.

It focuses on:

- target infrastructure layout
- container/service topology
- environment and secret boundaries
- rollout order from edge to site to central

It is the local source document behind the wiki deployment pages.

## 1. Deployment Principles

- edge-first operation: plant logic must continue when central is unavailable
- site mediation: site services protect field systems and absorb plant-specific complexity
- central governance: external APIs, analytics, IAM, CI/CD, and shared dashboards terminate centrally
- layered telemetry: InfluxDB exists where operationally justified at edge, site, and central
- configuration authority: `tagcodering` should become the source of truth for configuration
- secrets hygiene: tracked manifests contain variables only; secrets live in server-side env or secret stores

## 2. Layered Deployment Model

### 2.1 Edge node

Purpose:

- interface with PLCs and field assets
- execute local Node-RED logic
- retain local telemetry for resilience and digital-twin use cases

Recommended services:

- `evolv-edge-nodered`
- `evolv-edge-influxdb`
- optional `evolv-edge-grafana`
- optional `evolv-edge-broker`

Should not host:

- public API ingress
- central IAM
- source control or CI/CD

### 2.2 Site node

Purpose:

- aggregate one or more edge nodes
- host plant-local dashboards and engineering visibility
- mediate traffic between edge and central

Recommended services:

- `evolv-site-nodered` or `coresync-site`
- `evolv-site-influxdb`
- `evolv-site-grafana`
- optional `evolv-site-broker`

### 2.3 Central platform

Purpose:

- fleet-wide analytics
- API and integration ingress
- engineering lifecycle and releases
- identity and governance

Recommended services:

- reverse proxy / ingress
- API gateway
- IAM
- central InfluxDB
- central Grafana
- Gitea
- CI/CD runner/controller
- optional broker for asynchronous site/central workflows
- configuration services over `tagcodering`

## 3. Target Container Topology

### 3.1 Edge host

Minimum viable edge stack:
```text
edge-host-01
- Node-RED
- InfluxDB
- optional Grafana
```

Preferred production edge stack:

```text
edge-host-01
- Node-RED
- InfluxDB
- local health/export service
- optional local broker
- optional local dashboard service
```

### 3.2 Site host

Minimum viable site stack:

```text
site-host-01
- Site Node-RED / CoreSync
- Site InfluxDB
- Site Grafana
```

Preferred production site stack:

```text
site-host-01
- Site Node-RED / CoreSync
- Site InfluxDB
- Site Grafana
- API relay / sync service
- optional site broker
```

### 3.3 Central host group

Central should not be one giant undifferentiated host forever. It should trend toward at least these responsibility groups:

```text
central-ingress
- reverse proxy
- API gateway
- IAM

central-observability
- central InfluxDB
- Grafana

central-engineering
- Gitea
- CI/CD
- deployment orchestration

central-config
- tagcodering-backed config services
```

For early rollout these may be colocated, but the responsibility split should remain clear.
## 4. Compose Strategy

The current repository shows:

- `docker-compose.yml` as a development stack
- `temp/cloud.yml` as a broad central-stack example

For production, EVOLV should not rely on one flat compose file for every layer.

Recommended split:

- `compose.edge.yml`
- `compose.site.yml`
- `compose.central.yml`
- optional overlay files for site-specific differences

Benefits:

- clearer ownership per layer
- smaller blast radius during updates
- easier secret and env separation
- easier rollout per site
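As a sketch only, a layer-scoped edge manifest following the variables-only secrets rule could look like this. Service layout, image tags, and variable names are illustrative assumptions, not the repository's actual manifests:

```yaml
# compose.edge.yml — illustrative sketch, not the actual manifest
services:
  evolv-edge-nodered:
    image: nodered/node-red:latest
    restart: unless-stopped
    ports:
      - "1880:1880"
    volumes:
      - nodered-data:/data
    depends_on:
      - evolv-edge-influxdb

  evolv-edge-influxdb:
    image: influxdb:2
    restart: unless-stopped
    environment:
      DOCKER_INFLUXDB_INIT_USERNAME: ${INFLUXDB_USER}
      DOCKER_INFLUXDB_INIT_PASSWORD: ${INFLUXDB_PASSWORD}  # from env/edge.env, never committed
    volumes:
      - influxdb-data:/var/lib/influxdb2

volumes:
  nodered-data:
  influxdb-data:
```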
## 5. Environment And Secrets Strategy

### 5.1 Current baseline

`temp/cloud.yml` now uses environment variables instead of inline credentials. That is the minimum acceptable baseline.

### 5.2 Recommended production rule

- tracked compose files contain `${VARIABLE}` placeholders only
- real secrets live in server-local `.env` files or a managed secret store
- no shared default production passwords in git
- separate env files per layer and per environment

Suggested structure:

```text
/opt/evolv/
  compose.edge.yml
  compose.site.yml
  compose.central.yml
  env/
    edge.env
    site.env
    central.env
```

## 6. Recommended Network Flow

### 6.1 Northbound

- edge publishes or syncs upward to site
- site aggregates and forwards selected data to central
- central exposes APIs and dashboards to approved consumers

### 6.2 Southbound

- central issues advice, approved config, or mediated requests
- site validates and relays to edge where appropriate
- edge remains the execution point near PLCs

### 6.3 Forbidden direct path

- enterprise or internet clients should not directly query PLC-connected edge runtimes

## 7. Rollout Order

### Phase 1: Edge baseline

- deploy edge Node-RED
- deploy local InfluxDB
- validate PLC connectivity
- validate local telemetry and resilience

### Phase 2: Site mediation

- deploy site Node-RED / CoreSync
- connect one or more edge nodes
- validate site-local dashboards and outage behavior

### Phase 3: Central services

- deploy ingress, IAM, API, Grafana, central InfluxDB
- deploy Gitea and CI/CD services
- validate controlled northbound access

### Phase 4: Configuration backbone

- connect runtime layers to `tagcodering`
- reduce config duplication in flows
- formalize config promotion and rollback

### Phase 5: Smart telemetry policy

- classify signals
- define reconstruction rules
- define the authoritative layer per horizon
- validate analytics and auditability

## 8. Immediate Technical Recommendations

- treat `docker/settings.js` as development-only and create hardened production settings separately
- split deployment manifests by layer
- define env files per layer and environment
- formalize healthchecks and backup procedures for every persistent service
- define whether broker usage is required at edge, site, central, or only selectively

## 9. Next Technical Work Items

1. create draft `compose.edge.yml`, `compose.site.yml`, and `compose.central.yml`
2. define server directory layout and env-file conventions
3. define production Node-RED settings profile
4. define site-to-central sync path
5. define deployment and rollback runbook
wiki/architecture/group-optimization.md (new file, 45 lines)
@@ -0,0 +1,45 @@
---
title: Group Optimization Architecture
created: 2026-04-07
updated: 2026-04-07
status: proven
tags: [machineGroupControl, optimization, BEP-Gravitation]
sources: [nodes/machineGroupControl/src/specificClass.js]
---

# machineGroupControl Optimization

## Algorithm: BEP-Gravitation + Marginal-Cost Refinement

### Step 1 — Pressure Equalization

Sets all non-operational pumps to the group's max downstream / min upstream pressure, ensuring fair curve evaluation across combinations.

### Step 2 — Combination Enumeration

Generates all 2^n pump subsets (n = number of machines) and filters them by:

- Machine state (excludes off, cooling, stopping, emergency)
- Mode compatibility (`execsequence` allowed in auto)
- Flow bounds: `sumMinFlow ≤ Qd ≤ sumMaxFlow`
- Optional power cap
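The enumeration step can be sketched with a bitmask over the machine list. The shapes are hypothetical: `available`, `minFlow`, and `maxFlow` are illustrative field names, not the specificClass API:

```javascript
// Sketch of 2^n subset enumeration with state and flow-bound filtering.
function enumerateCombinations(machines, demandFlow) {
  const valid = [];
  for (let mask = 1; mask < (1 << machines.length); mask++) {
    const subset = machines.filter((_, i) => mask & (1 << i));
    // machine-state filter (stands in for off/cooling/stopping/emergency checks)
    if (subset.some((m) => !m.available)) continue;
    const minQ = subset.reduce((s, m) => s + m.minFlow, 0);
    const maxQ = subset.reduce((s, m) => s + m.maxFlow, 0);
    // flow bounds: sumMinFlow ≤ Qd ≤ sumMaxFlow
    if (demandFlow < minQ || demandFlow > maxQ) continue;
    valid.push(subset);
  }
  return valid;
}
```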
### Step 3 — BEP-Gravitation Distribution (per combination)

1. **BEP seed**: `estimatedBEP = minFlow + span * NCog` per pump
2. **Slope estimation**: samples dP/dQ at BEP ± delta (directional: slopeLeft, slopeRight)
3. **Slope redistribution**: iteratively shifts flow from steep to flat curves (weight = 1/slope)
4. **Marginal-cost refinement**: after slope redistribution, shifts flow from the highest actual dP/dQ to the lowest using real `inputFlowCalcPower` evaluations. Converges regardless of curve convexity. Max 50 iterations, typically 5-15.
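The refinement loop can be sketched as below. `powerAt` stands in for the real `inputFlowCalcPower` evaluation; the step size, field names, and convergence threshold are illustrative assumptions:

```javascript
// Marginal-cost refinement sketch: repeatedly move one flow step from the
// pump with the highest dP/dQ to the pump with the lowest, until stable.
function refineDistribution(pumps, flows, powerAt, step = 0.001, maxIter = 50) {
  for (let iter = 0; iter < maxIter; iter++) {
    const marginal = pumps.map((p, i) => ({
      i,
      // forward-difference marginal power at the current flow
      dPdQ: (powerAt(p, flows[i] + step) - powerAt(p, flows[i])) / step,
      canGive: flows[i] - step >= p.minFlow,
      canTake: flows[i] + step <= p.maxFlow,
    }));
    const givers = marginal.filter((m) => m.canGive).sort((a, b) => b.dPdQ - a.dPdQ);
    const takers = marginal.filter((m) => m.canTake).sort((a, b) => a.dPdQ - b.dPdQ);
    if (!givers.length || !takers.length) break;
    const from = givers[0], to = takers[0];
    if (from.i === to.i || from.dPdQ - to.dPdQ < 1e-9) break; // no further gain
    flows[from.i] -= step;
    flows[to.i] += step;
  }
  return flows;
}
```

With quadratic power curves this walks an unequal split toward the equal-marginal-cost optimum, which is the behavior the page describes.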
### Step 4 — Best Selection

Pick the combination with the lowest total power. Tiebreak by deviation from BEP.

### Step 5 — Execution

Start/stop pumps as needed and send `flowmovement` commands in output units via `_canonicalToOutputFlow()`.

## Three Control Modes

| Mode | Distribution | Combination Selection |
|------|-------------|----------------------|
| optimalControl | BEP-Gravitation + refinement | exhaustive 2^n |
| priorityControl | equal split, priority-ordered | sequential add/remove |
| priorityPercentageControl | percentage-based, normalized | count-based |

## Key Design Decision

The `flowmovement` command sends flow in the **machine's output units** (m3/h), not canonical units (m3/s). The `_canonicalToOutputFlow()` helper converts before sending. Without this conversion, every pump stays at minimum flow (the critical bug fixed on 2026-04-07).
wiki/architecture/node-architecture.md (new file, 426 lines)
@@ -0,0 +1,426 @@
---
title: EVOLV Architecture
created: 2026-03-01
updated: 2026-04-07
status: evolving
tags: [architecture, node-red, three-layer]
---

# EVOLV Architecture

## 1. System Overview

High-level view of how EVOLV fits into the wastewater treatment automation stack.

```mermaid
graph LR
    NR[Node-RED Runtime] <-->|msg objects| EVOLV[EVOLV Nodes]
    EVOLV -->|InfluxDB line protocol| INFLUX[(InfluxDB)]
    INFLUX -->|queries| GRAFANA[Grafana Dashboards]
    EVOLV -->|process output| NR
    EVOLV -->|parent output| NR

    style NR fill:#b22222,color:#fff
    style EVOLV fill:#0f52a5,color:#fff
    style INFLUX fill:#0c99d9,color:#fff
    style GRAFANA fill:#50a8d9,color:#fff
```

Each EVOLV node produces three outputs:

| Port | Name | Purpose |
|------|------|---------|
| 0 | process | Process data forwarded to downstream nodes |
| 1 | dbase | InfluxDB-formatted measurement data |
| 2 | parent | Control messages to parent nodes (e.g. registerChild) |

---

## 2. Node Architecture (Three-Layer Pattern)

Every node follows a consistent three-layer design that separates Node-RED wiring from domain logic.

```mermaid
graph TB
    subgraph "Node-RED Runtime"
        REG["RED.nodes.registerType()"]
    end

    subgraph "Layer 1 — Wrapper (valve.js)"
        W[wrapper.js]
        W -->|"new nodeClass(config, RED, this, name)"| NC
        W -->|MenuManager| MENU[HTTP /name/menu.js]
        W -->|configManager| CFG[HTTP /name/configData.js]
    end

    subgraph "Layer 2 — Node Adapter (src/nodeClass.js)"
        NC[nodeClass]
        NC -->|_loadConfig| CFGM[configManager]
        NC -->|_setupSpecificClass| SC
        NC -->|_attachInputHandler| INPUT[onInput routing]
        NC -->|_startTickLoop| TICK[1s tick loop]
        NC -->|_tick → outputUtils| OUT[formatMsg]
    end

    subgraph "Layer 3 — Domain Logic (src/specificClass.js)"
        SC[specificClass]
        SC -->|measurements| MC[MeasurementContainer]
        SC -->|state machine| ST[state]
        SC -->|hydraulics / biology| DOMAIN[domain models]
    end

    subgraph "generalFunctions"
        GF[shared library]
    end

    REG --> W
    GF -.->|logger, outputUtils, configManager,\nMeasurementContainer, validation, ...| NC
    GF -.->|MeasurementContainer, state,\nconvert, predict, ...| SC

    style W fill:#0f52a5,color:#fff
    style NC fill:#0c99d9,color:#fff
    style SC fill:#50a8d9,color:#fff
    style GF fill:#86bbdd,color:#000
```
---

## 3. generalFunctions Module Map

The shared library (`nodes/generalFunctions/`) provides all cross-cutting concerns.

```mermaid
graph TB
    GF[generalFunctions/index.js]

    subgraph "Core Helpers (src/helper/)"
        LOGGER[logger]
        OUTPUT[outputUtils]
        CHILD[childRegistrationUtils]
        CFGUTIL[configUtils]
        ASSERT[assertionUtils]
        VALID[validationUtils]
    end

    subgraph "Validators (src/helper/validators/)"
        TV[typeValidators]
        CV[collectionValidators]
        CURV[curveValidator]
    end

    subgraph "Domain Modules (src/)"
        MC[MeasurementContainer]
        CFGMGR[configManager]
        MENUMGR[MenuManager]
        STATE[state]
        CONVERT[convert / Fysics]
        PREDICT[predict / interpolation]
        NRMSE[nrmse / errorMetrics]
        COOLPROP[coolprop]
    end

    subgraph "Data (datasets/)"
        CURVES[assetData/curves]
        ASSETS[assetData/assetData.json]
        UNITS[unitData.json]
    end

    subgraph "Constants (src/constants/)"
        POS[POSITIONS / POSITION_VALUES]
    end

    GF --> LOGGER
    GF --> OUTPUT
    GF --> CHILD
    GF --> CFGUTIL
    GF --> ASSERT
    GF --> VALID
    VALID --> TV
    VALID --> CV
    VALID --> CURV
    GF --> MC
    GF --> CFGMGR
    GF --> MENUMGR
    GF --> STATE
    GF --> CONVERT
    GF --> PREDICT
    GF --> NRMSE
    GF --> COOLPROP
    GF --> CURVES
    GF --> POS

    style GF fill:#0f52a5,color:#fff
    style LOGGER fill:#86bbdd,color:#000
    style OUTPUT fill:#86bbdd,color:#000
    style VALID fill:#86bbdd,color:#000
    style MC fill:#50a8d9,color:#fff
    style CFGMGR fill:#50a8d9,color:#fff
    style MENUMGR fill:#50a8d9,color:#fff
```

---
## 4. Data Flow (Message Lifecycle)

Sequence diagram showing a typical input message and the periodic tick output cycle.

```mermaid
sequenceDiagram
    participant NR as Node-RED
    participant W as wrapper.js
    participant NC as nodeClass
    participant SC as specificClass
    participant OU as outputUtils

    Note over W: Node startup
    W->>NC: new nodeClass(config, RED, node, name)
    NC->>NC: _loadConfig (configManager.buildConfig)
    NC->>SC: new specificClass(config, stateConfig, options)
    NC->>NR: send([null, null, {topic: registerChild}])

    Note over NC: Every 1 second (tick loop)
    NC->>SC: getOutput()
    SC-->>NC: raw measurement data
    NC->>OU: formatMsg(raw, config, 'process')
    NC->>OU: formatMsg(raw, config, 'influxdb')
    NC->>NR: send([processMsg, influxMsg])

    Note over NR: Incoming control message
    NR->>W: msg {topic: 'execMovement', payload: {...}}
    W->>NC: onInput(msg)
    NC->>SC: handleInput(source, action, setpoint)
    SC->>SC: update state machine & measurements
```

---

## 5. Node Types

| Node | S88 Level | Purpose |
|------|-----------|---------|
| **measurement** | Control Module | Generic measurement point — reads, validates, and stores sensor values |
| **valve** | Control Module | Valve simulation with hydraulic model, position control, flow/pressure prediction |
| **rotatingMachine** | Control Module | Pumps, blowers, mixers — rotating equipment with speed control and efficiency curves |
| **diffuser** | Control Module | Aeration diffuser — models oxygen transfer and pressure drop |
| **settler** | Equipment | Sludge settler — models settling behavior and sludge blanket |
| **reactor** | Equipment | Hydraulic tank and biological process simulator (activated sludge, digestion) |
| **monster** | Equipment | MONitoring and STrEam Routing — complex measurement aggregation |
| **pumpingStation** | Unit | Coordinates multiple pumps as a pumping station |
| **valveGroupControl** | Unit | Manages multiple valves as a coordinated group — distributes flow, monitors pressure |
| **machineGroupControl** | Unit | Group control for rotating machines — load balancing and sequencing |
| **dashboardAPI** | Utility | Exposes data and unit conversion endpoints for external dashboards |
# EVOLV Architecture

## Node Hierarchy (S88)

EVOLV follows the ISA-88 (S88) batch control standard. Each node maps to an S88 level and uses a consistent color scheme in the Node-RED editor.

```mermaid
graph TD
    classDef area fill:#0f52a5,color:#fff,stroke:#0a3d7a
    classDef processCell fill:#0c99d9,color:#fff,stroke:#0977aa
    classDef unit fill:#50a8d9,color:#fff,stroke:#3d89b3
    classDef equipment fill:#86bbdd,color:#000,stroke:#6a9bb8
    classDef controlModule fill:#a9daee,color:#000,stroke:#87b8cc
    classDef standalone fill:#f0f0f0,color:#000,stroke:#999

    %% S88 Levels
    subgraph "S88: Area"
        PS[pumpingStation]
    end

    subgraph "S88: Equipment"
        MGC[machineGroupControl]
        VGC[valveGroupControl]
    end

    subgraph "S88: Control Module"
        RM[rotatingMachine]
        V[valve]
        M[measurement]
        R[reactor]
        S[settler]
    end

    subgraph "Standalone"
        MON[monster]
        DASH[dashboardAPI]
        DIFF[diffuser - not implemented]
    end

    %% Parent-child registration relationships
    PS -->|"accepts: measurement"| M
    PS -->|"accepts: machine"| RM
    PS -->|"accepts: machineGroup"| MGC
    PS -->|"accepts: pumpingStation"| PS2[pumpingStation]

    MGC -->|"accepts: machine"| RM

    RM -->|"accepts: measurement"| M2[measurement]
    RM -->|"accepts: reactor"| R

    VGC -->|"accepts: valve"| V
    VGC -->|"accepts: machine / rotatingmachine"| RM2[rotatingMachine]
    VGC -->|"accepts: machinegroup / machinegroupcontrol"| MGC2[machineGroupControl]
    VGC -->|"accepts: pumpingstation / valvegroupcontrol"| PS3["pumpingStation / valveGroupControl"]

    R -->|"accepts: measurement"| M3[measurement]
    R -->|"accepts: reactor"| R2[reactor]

    S -->|"accepts: measurement"| M4[measurement]
    S -->|"accepts: reactor"| R3[reactor]
    S -->|"accepts: machine"| RM3[rotatingMachine]

    %% Styling
    class PS,PS2,PS3 area
    class MGC,MGC2 equipment
    class VGC equipment
    class RM,RM2,RM3 controlModule
    class V controlModule
    class M,M2,M3,M4 controlModule
    class R,R2,R3 controlModule
    class S controlModule
    class MON,DASH,DIFF standalone
```

### Registration Summary
```mermaid
graph LR
    classDef parent fill:#0c99d9,color:#fff
    classDef child fill:#a9daee,color:#000

    PS[pumpingStation] -->|measurement| LEAF1((leaf))
    PS -->|machine| RM1[rotatingMachine]
    PS -->|machineGroup| MGC1[machineGroupControl]
    PS -->|pumpingStation| PS1[pumpingStation]

    MGC[machineGroupControl] -->|machine| RM2[rotatingMachine]

    VGC[valveGroupControl] -->|valve| V1[valve]
    VGC -->|source| SRC["machine, machinegroup,<br/>pumpingstation, valvegroupcontrol"]

    RM[rotatingMachine] -->|measurement| LEAF2((leaf))
    RM -->|reactor| R1[reactor]

    R[reactor] -->|measurement| LEAF3((leaf))
    R -->|reactor| R2[reactor]

    S[settler] -->|measurement| LEAF4((leaf))
    S -->|reactor| R3[reactor]
    S -->|machine| RM3[rotatingMachine]

    class PS,MGC,VGC,RM,R,S parent
    class LEAF1,LEAF2,LEAF3,LEAF4,RM1,RM2,RM3,MGC1,PS1,V1,SRC,R1,R2,R3 child
```
## Node Types

| Node | S88 Level | softwareType | role | Accepts Children | Outputs |
|------|-----------|--------------|------|------------------|---------|
| **pumpingStation** | Area | `pumpingstation` | StationController | measurement, machine (rotatingMachine), machineGroup, pumpingStation | [process, dbase, parent] |
| **machineGroupControl** | Equipment | `machinegroupcontrol` | GroupController | machine (rotatingMachine) | [process, dbase, parent] |
| **valveGroupControl** | Equipment | `valvegroupcontrol` | ValveGroupController | valve, machine, rotatingmachine, machinegroup, machinegroupcontrol, pumpingstation, valvegroupcontrol | [process, dbase, parent] |
| **rotatingMachine** | Control Module | `rotatingmachine` | RotationalDeviceController | measurement, reactor | [process, dbase, parent] |
| **valve** | Control Module | `valve` | controller | _(leaf node, no children)_ | [process, dbase, parent] |
| **measurement** | Control Module | `measurement` | Sensor | _(leaf node, no children)_ | [process, dbase, parent] |
| **reactor** | Control Module | `reactor` | Biological reactor | measurement, reactor (upstream chaining) | [process, dbase, parent] |
| **settler** | Control Module | `settler` | Secondary settler | measurement, reactor (upstream), machine (return pump) | [process, dbase, parent] |
| **monster** | Standalone | - | - | dual-parent, standalone | - |
| **dashboardAPI** | Standalone | - | - | accepts any child (Grafana integration) | - |
| **diffuser** | Standalone | - | - | _(not implemented)_ | - |

## Data Flow

### Measurement Data Flow (upstream to downstream)
```mermaid
sequenceDiagram
    participant Sensor as measurement (sensor)
    participant Machine as rotatingMachine
    participant Group as machineGroupControl
    participant Station as pumpingStation

    Note over Sensor: Sensor reads value<br/>(pressure, flow, level, temp)

    Sensor->>Sensor: measurements.type(t).variant("measured").position(p).value(v)
    Sensor->>Sensor: emitter.emit("type.measured.position", eventData)

    Sensor->>Machine: Event: "pressure.measured.upstream"
    Machine->>Machine: Store in own MeasurementContainer
    Machine->>Machine: getMeasuredPressure() -> calcFlow() -> calcPower()
    Machine->>Machine: emitter.emit("flow.predicted.downstream", eventData)

    Machine->>Group: Event: "flow.predicted.downstream"
    Group->>Group: handlePressureChange()
    Group->>Group: Aggregate flows across all machines
    Group->>Group: Calculate group totals and efficiency

    Machine->>Station: Event: "flow.predicted.downstream"
    Station->>Station: Store predicted flow in/out
    Station->>Station: _updateVolumePrediction()
    Station->>Station: _calcNetFlow(), _calcTimeRemaining()
```

### Control Command Flow (downstream to upstream)

```mermaid
sequenceDiagram
    participant Station as pumpingStation
    participant Group as machineGroupControl
    participant Machine as rotatingMachine
    participant Machine2 as rotatingMachine (2)

    Station->>Group: handleInput("parent", action, param)

    Group->>Group: Determine scaling strategy
    Group->>Group: Calculate setpoints per machine

    Group->>Machine: handleInput("parent", "execMovement", setpoint)
    Group->>Machine2: handleInput("parent", "execMovement", setpoint)

    Machine->>Machine: setpoint() -> state.moveTo(pos)
    Machine->>Machine: updatePosition() -> calcFlow(), calcPower()
    Machine->>Machine: emitter.emit("flow.predicted.downstream")

    Machine2->>Machine2: setpoint() -> state.moveTo(pos)
    Machine2->>Machine2: updatePosition() -> calcFlow(), calcPower()
    Machine2->>Machine2: emitter.emit("flow.predicted.downstream")
```
### Wastewater Treatment Process Flow

```mermaid
graph LR
    classDef process fill:#50a8d9,color:#fff
    classDef equipment fill:#86bbdd,color:#000

    PS_IN[pumpingStation<br/>Influent] -->|flow| R1[reactor<br/>Anoxic]
    R1 -->|effluent| R2[reactor<br/>Aerated]
    R2 -->|effluent| SET[settler]
    SET -->|effluent out| PS_OUT[pumpingStation<br/>Effluent]
    SET -->|sludge return| RM_RET[rotatingMachine<br/>Return pump]
    RM_RET -->|recirculation| R1

    PS_IN --- MGC_IN[machineGroupControl]
    MGC_IN --- RM_IN[rotatingMachine<br/>Influent pumps]

    class PS_IN,PS_OUT process
    class R1,R2,SET process
    class MGC_IN,RM_IN,RM_RET equipment
```
### Event-Driven Communication Pattern

All parent-child communication uses the Node.js `EventEmitter`:

1. **Registration**: The parent calls `childRegistrationUtils.registerChild(child, position)`, which stores the child and calls the parent's `registerChild(child, softwareType)` method.
2. **Event binding**: The parent's `registerChild()` subscribes to the child's `measurements.emitter` events (e.g., `"flow.predicted.downstream"`).
3. **Data propagation**: When a child updates a measurement, it emits an event. The parent's listener stores the value in its own `MeasurementContainer` and runs its domain logic.
4. **Three outputs**: Every node sends data to three Node-RED outputs, `[process, dbase, parent]`: process data for downstream nodes, InfluxDB data for persistence, and aggregation data for the parent.
### Position Convention

Children register with a position relative to their parent:

- `upstream` -- before the parent in the flow direction
- `downstream` -- after the parent in the flow direction
- `atEquipment` -- physically located at/on the parent equipment
wiki/architecture/platform-overview.md (new file, 158 lines)
@@ -0,0 +1,158 @@

---
title: EVOLV Platform Architecture
created: 2026-03-01
updated: 2026-04-07
status: evolving
tags: [architecture, platform, edge-first]
---

# EVOLV Platform Architecture

## At A Glance

EVOLV is not only a Node-RED package. It is a layered automation platform:

- edge for plant-side execution
- site for local aggregation and resilience
- central for coordination, analytics, APIs, and governance

```mermaid
flowchart LR
  subgraph EDGE["Edge"]
    PLC["PLC / IO"]
    ENR["Node-RED"]
    EDB["Local InfluxDB"]
    EUI["Local Monitoring"]
  end

  subgraph SITE["Site"]
    SNR["CoreSync / Site Node-RED"]
    SDB["Site InfluxDB"]
    SUI["Site Dashboards"]
  end

  subgraph CENTRAL["Central"]
    API["API Gateway"]
    CFG["Tagcodering"]
    CDB["Central InfluxDB"]
    CGR["Grafana"]
    INTEL["Overview Intelligence"]
    GIT["Gitea + CI/CD"]
  end

  PLC --> ENR
  ENR --> EDB
  ENR --> EUI
  ENR <--> SNR
  EDB <--> SDB
  SNR --> SUI
  SNR <--> API
  API <--> CFG
  API --> INTEL
  SDB <--> CDB
  CDB --> CGR
  GIT --> ENR
  GIT --> SNR
```

## Core Principles

### 1. Edge-first operation

The edge layer must remain useful and safe when central systems are down. That means:

- local logic remains operational
- local telemetry remains queryable
- local dashboards can keep working

### 2. Multi-level telemetry

InfluxDB is expected on multiple levels:

- local for resilience and digital-twin use
- site for plant diagnostics
- central for fleet analytics and advisory logic

### 3. Smart storage

Telemetry should not be stored only with naive deadband rules. The target model is signal-aware:

- preserve critical change points
- reduce low-information flat sections
- allow downstream reconstruction where justified

```mermaid
flowchart LR
  SIG["Process Signal"] --> EVAL["Slope / Event Evaluation"]
  EVAL --> KEEP["Keep critical points"]
  EVAL --> REDUCE["Reduce reconstructable points"]
  KEEP --> L0["Local InfluxDB"]
  REDUCE --> L0
  L0 --> L1["Site InfluxDB"]
  L1 --> L2["Central InfluxDB"]
```

### 4. Central is the safe entry point

External systems should enter through central APIs, not by directly calling field-edge systems.

```mermaid
flowchart TD
  EXT["External Request"] --> API["Central API Gateway"]
  API --> AUTH["Auth / Policy"]
  AUTH --> SITE["Site Layer"]
  SITE --> EDGE["Edge Layer"]
  EDGE --> PLC["Field Assets"]

  EXT -. blocked .-> EDGE
  EXT -. blocked .-> PLC
```
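
The boundary rule above can be expressed as a small routing policy. This is a hypothetical sketch -- the layer names and the `nextHop` function are illustrative, not part of EVOLV:

```javascript
// Hypothetical entry rule: external traffic terminates at central,
// and each downward hop may only reach the adjacent layer.
const LAYERS = ['central', 'site', 'edge', 'field'];

function nextHop(origin, target) {
  if (origin === 'external') {
    // External requests may only land on the central gateway.
    return target === 'central' ? 'central' : null;
  }
  const from = LAYERS.indexOf(origin);
  const to = LAYERS.indexOf(target);
  // Only one adjacent step down is allowed; anything else is blocked.
  return from !== -1 && to === from + 1 ? target : null;
}
```

Under this rule `nextHop('external', 'edge')` is blocked, while `nextHop('external', 'central')` and `nextHop('site', 'edge')` are allowed.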

### 5. Configuration belongs in `tagcodering`

The intended configuration source of truth is the database-backed `tagcodering` model:

- machine metadata
- asset configuration
- runtime-consumable configuration
- future central/site configuration services

This already exists partially but still needs more work before it fully serves that role.

## Layer Roles

### Edge

- PLC connectivity
- local logic
- protocol translation
- local telemetry buffering
- local monitoring and digital-twin support

### Site

- aggregation of edge systems
- local dashboards and diagnostics
- mediation between OT and central
- protected handoff for central requests

### Central

- enterprise/API gateway
- fleet dashboards
- analytics and intelligence
- source control and CI/CD
- configuration governance through `tagcodering`

## Why This Matters

This architecture gives EVOLV:

- better resilience
- safer external integration
- better data quality for analytics
- a path from Node-RED package to platform
632 wiki/architecture/stack-review.md Normal file
@@ -0,0 +1,632 @@

---
title: EVOLV Architecture Review
created: 2026-03-01
updated: 2026-04-07
status: evolving
tags: [architecture, stack, review]
---

# EVOLV Architecture Review

## Purpose

This document captures:

- the architecture implemented in this repository today
- the broader edge/site/central architecture shown in the drawings under `temp/`
- the key strengths and weaknesses of that direction
- the currently preferred target stack based on owner decisions from this review

It is the local staging document for a later wiki update.

## Evidence Used

Implemented stack evidence:

- `docker-compose.yml`
- `docker/settings.js`
- `docker/grafana/provisioning/datasources/influxdb.yaml`
- `package.json`
- `nodes/*`

Target-state evidence:

- `temp/fullStack.pdf`
- `temp/edge.pdf`
- `temp/CoreSync.drawio.pdf`
- `temp/cloud.yml`

Owner decisions from this review:

- local InfluxDB is required for operational resilience
- central acts as the advisory/intelligence and API-entry layer, not as a direct field caller
- the intended configuration authority is the database-backed `tagcodering` model
- architecture wiki pages should be visual, not text-only

## 1. What Exists Today

### 1.1 Product/runtime layer

The codebase is currently a modular Node-RED package for wastewater/process automation:

- EVOLV ships custom Node-RED nodes for plant assets and process logic
- nodes emit both process/control messages and telemetry-oriented outputs
- shared helper logic lives in `nodes/generalFunctions/`
- Grafana-facing integration exists through `dashboardAPI` and Influx-oriented outputs

### 1.2 Implemented development stack

The concrete development stack in this repository is:

- Node-RED
- InfluxDB 2.x
- Grafana

That gives a clear local flow:

1. EVOLV logic runs in Node-RED.
2. Telemetry is emitted in a time-series-oriented shape.
3. InfluxDB stores the telemetry.
4. Grafana renders operational dashboards.

### 1.3 Existing runtime pattern in the nodes

A recurring EVOLV pattern is:

- output 0: process/control message
- output 1: Influx/telemetry message
- output 2: registration/control plumbing where relevant

So even in its current implemented form, EVOLV is not only a Node-RED project. It is already a control-plus-observability platform, with Node-RED as orchestration/runtime and InfluxDB/Grafana as telemetry and visualization services.
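
The three-output pattern can be sketched as a message builder. The field names here are assumptions for illustration, not the exact EVOLV payload contract:

```javascript
// Illustrative three-output builder; real nodes assemble these messages
// via the shared helpers in nodes/generalFunctions/.
function buildOutputs(nodeName, value) {
  // Output 0: process/control message for downstream nodes.
  const processMsg = { topic: `${nodeName}/process`, payload: value };
  // Output 1: Influx/telemetry-shaped message.
  const influxMsg = {
    measurement: nodeName,
    fields: { value },
    timestamp: Date.now(),
  };
  // Output 2: registration/control plumbing where relevant.
  const registrationMsg = { topic: `${nodeName}/registration`, payload: { nodeName } };
  // Node-RED convention: one array slot per output port.
  return [processMsg, influxMsg, registrationMsg];
}

const outputs = buildOutputs('pump1', 3.2);
// Inside a Node-RED node this would be: node.send(outputs);
```

Returning `null` in a slot would suppress that output for a given message, which is the standard Node-RED way to send on only some ports.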

## 2. What The Drawings Describe

Across `temp/fullStack.pdf` and `temp/CoreSync.drawio.pdf`, the intended platform is broader and layered.

### 2.1 Edge / OT layer

The drawings consistently place these capabilities at the edge:

- PLC / OPC UA connectivity
- Node-RED container as protocol translator and logic runtime
- local broker in some variants
- local InfluxDB / Prometheus-style storage in some variants
- local Grafana/SCADA in some variants

This is the plant-side operational layer.

### 2.2 Site / local server layer

The CoreSync drawings also show a site aggregation layer:

- RWZI-local server
- Node-RED / CoreSync services
- site-local broker
- site-local database
- upward API-based synchronization

This layer decouples field assets from central services and absorbs plant-specific complexity.

### 2.3 Central / cloud layer

The broader stack drawings and `temp/cloud.yml` show a central platform layer with:

- Gitea
- Jenkins
- reverse proxy / ingress
- Grafana
- InfluxDB
- Node-RED
- RabbitMQ / messaging
- VPN / tunnel concepts
- Keycloak in the drawing
- Portainer in the drawing

This is a platform-services layer, not just an application runtime.

## 3. Architecture Decisions From This Review

These decisions now shape the preferred EVOLV target architecture.

### 3.1 Local telemetry is mandatory for resilience

Local InfluxDB is not optional. It is required so that:

- operations continue when central SCADA or central services are down
- local dashboards and advanced digital-twin workflows can still consume recent and relevant process history
- local edge/site layers can make smarter decisions without depending on round-trips to central

### 3.2 Multi-level InfluxDB is part of the architecture

InfluxDB should exist on multiple levels where it adds operational value:

- edge/local for resilience and near-real-time replay
- site for plant-level history, diagnostics, and resilience
- central for fleet-wide analytics, benchmarking, and advisory intelligence

This is not just copy-paste storage at each level. The design intent is event-driven and selective.

### 3.3 Storage should be smart, not only deadband-driven

The target is neither "store every point" nor a fixed deadband rule such as 1%.

The desired storage approach is:

- observe signal slope and change behavior
- preserve points where state is changing materially
- store fewer points where the signal can be reconstructed downstream with sufficient fidelity
- carry enough metadata or conventions so reconstruction quality is auditable

This implies EVOLV should evolve toward smart storage and signal-aware retention rather than naive event dumping.
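
As a minimal sketch of slope-aware retention under stated assumptions (the point shape, thresholds, and heartbeat rule are illustrative, not the EVOLV design):

```javascript
// Keep points where the signal changes materially; skip points on flat
// spans that can be reconstructed downstream. A periodic "heartbeat"
// point keeps long flat spans auditable.
function makeRetentionFilter({ slopeThreshold, maxSkipped }) {
  let lastStored = null;
  let skipped = 0;
  return function shouldStore(point) {
    if (lastStored === null) {
      lastStored = point;
      return true; // always keep the first point
    }
    const dt = point.t - lastStored.t;
    const slope = dt > 0 ? Math.abs(point.v - lastStored.v) / dt : Infinity;
    if (slope >= slopeThreshold || skipped >= maxSkipped) {
      lastStored = point;
      skipped = 0;
      return true; // critical change point (or heartbeat)
    }
    skipped += 1;
    return false; // reconstructable flat point
  };
}

const shouldStore = makeRetentionFilter({ slopeThreshold: 0.5, maxSkipped: 10 });
const signal = [
  { t: 0, v: 10 },   // first point: kept
  { t: 1, v: 10.1 }, // flat: skipped
  { t: 2, v: 10.1 }, // flat: skipped
  { t: 3, v: 14 },   // step change: kept
];
const stored = signal.filter(shouldStore);
```

Per-signal-class thresholds and the reconstruction metadata mentioned above would sit on top of a filter like this.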

### 3.4 Central is the intelligence and API-entry layer

Central may advise and coordinate edge/site layers, but external API requests should not hit field-edge systems directly.

The intended pattern is:

- external and enterprise integrations terminate centrally
- central evaluates, aggregates, authorizes, and advises
- site/edge layers receive mediated requests, policies, or setpoints
- field-edge remains protected behind an intermediate layer

This aligns with the stated security direction.

### 3.5 Configuration source of truth should be database-backed

The intended configuration authority is the database-backed `tagcodering` model, which already exists but is not yet complete enough to serve as the fully realized source of truth.

That means the architecture should assume:

- asset and machine metadata belong in `tagcodering`
- Node-RED flows should consume configuration rather than silently becoming the only configuration store
- more work is still needed before this behaves as the intended central configuration backbone

## 4. Visual Model

### 4.1 Platform topology

```mermaid
flowchart LR
  subgraph OT["OT / Field"]
    PLC["PLC / IO"]
    DEV["Sensors / Machines"]
  end

  subgraph EDGE["Edge Layer"]
    ENR["Edge Node-RED"]
    EDB["Local InfluxDB"]
    EUI["Local Grafana / Local Monitoring"]
    EBR["Optional Local Broker"]
  end

  subgraph SITE["Site Layer"]
    SNR["Site Node-RED / CoreSync"]
    SDB["Site InfluxDB"]
    SUI["Site Grafana / SCADA Support"]
    SBR["Site Broker"]
  end

  subgraph CENTRAL["Central Layer"]
    API["API / Integration Gateway"]
    INTEL["Overview Intelligence / Advisory Logic"]
    CDB["Central InfluxDB"]
    CGR["Central Grafana"]
    CFG["Tagcodering Config Model"]
    GIT["Gitea"]
    CI["CI/CD"]
    IAM["IAM / Keycloak"]
  end

  DEV --> PLC
  PLC --> ENR
  ENR --> EDB
  ENR --> EUI
  ENR --> EBR
  ENR <--> SNR
  EDB <--> SDB
  SNR --> SDB
  SNR --> SUI
  SNR --> SBR
  SNR <--> API
  API --> INTEL
  API <--> CFG
  SDB <--> CDB
  INTEL --> SNR
  CGR --> CDB
  CI --> GIT
  IAM --> API
  IAM --> CGR
```

### 4.2 Command and access boundary

```mermaid
flowchart TD
  EXT["External APIs / Enterprise Requests"] --> API["Central API Gateway"]
  API --> AUTH["AuthN/AuthZ / Policy Checks"]
  AUTH --> INTEL["Central Advisory / Decision Support"]
  INTEL --> SITE["Site Integration Layer"]
  SITE --> EDGE["Edge Runtime"]
  EDGE --> PLC["PLC / Field Assets"]

  EXT -. no direct access .-> EDGE
  EXT -. no direct access .-> PLC
```

### 4.3 Smart telemetry flow

```mermaid
flowchart LR
  RAW["Raw Signal"] --> EDGELOGIC["Edge Signal Evaluation"]
  EDGELOGIC --> KEEP["Keep Critical Change Points"]
  EDGELOGIC --> SKIP["Skip Reconstructable Flat Points"]
  KEEP --> LOCAL["Local InfluxDB"]
  LOCAL --> SITE["Site InfluxDB"]
  SITE --> CENTRAL["Central InfluxDB"]
  SKIP -. reconstruction assumptions / metadata .-> SITE
  CENTRAL --> DASH["Fleet Dashboards / Analytics"]
```

## 5. Upsides Of This Direction

### 5.1 Strong separation between control and observability

Node-RED for runtime/orchestration and InfluxDB/Grafana for telemetry is still the right structural split:

- control stays close to the process
- telemetry storage/querying stays in time-series-native tooling
- dashboards do not need to overload Node-RED itself

### 5.2 Edge-first matches operational reality

For wastewater/process systems, edge-first remains correct:

- lower latency
- better degraded-mode behavior
- less dependence on WAN or central platform uptime
- clearer OT trust boundary

### 5.3 Site mediation improves safety and security

Using central as the enterprise/API entry point and site as the mediator improves posture:

- field systems are less exposed
- policy decisions can be centralized
- external integrations do not probe the edge directly
- site can continue operating even when upstream is degraded

### 5.4 Multi-level storage enables better analytics

Multiple Influx layers can support:

- local resilience
- site diagnostics
- fleet benchmarking
- smarter retention and reconstruction strategies

That is substantially more capable than a single central historian model.

### 5.5 `tagcodering` is the right long-term direction

A database-backed configuration authority is stronger than embedding configuration only in flows because it supports:

- machine metadata management
- controlled rollout of configuration changes
- clearer versioning and provenance
- future API-driven configuration services

## 6. Downsides And Risks

### 6.1 Smart storage raises algorithmic and governance complexity

Signal-aware storage and reconstruction is promising, but it creates architectural obligations:

- reconstruction rules must be explicit
- acceptable reconstruction error must be defined per signal type
- operators must know whether they see raw or reconstructed history
- compliance-relevant data may need stricter retention than operational convenience data

Without those rules, smart storage can become opaque and hard to trust.

### 6.2 Multi-level databases can create ownership confusion

If edge, site, and central all store telemetry, you must define:

- which layer is authoritative for which time horizon
- when backfill is allowed
- when data is summarized vs copied
- how duplicates or gaps are detected

Otherwise operations will argue over which trend is "the real one."

### 6.3 Central intelligence must remain advisory-first

Central guidance can become valuable, but direct closed-loop dependency on central would be risky.

The architecture should therefore preserve:

- local control authority at edge/site
- bounded and explicit central advice
- safe behavior if central recommendations stop arriving

### 6.4 `tagcodering` is not yet complete enough to lean on blindly

It is the right target, but its current partial state means there is still architecture debt:

- incomplete config workflows
- likely mismatch between desired and implemented schema behavior
- temporary duplication between flows, node config, and database-held metadata

This should be treated as a core platform workstream, not a side issue.

### 6.5 Broker responsibilities are still not crisp enough

The materials still reference MQTT/AMQP/RabbitMQ/brokers without one stable responsibility split. That needs to be resolved before large-scale deployment.

Questions still open:

- command bus or event bus?
- site-only or cross-site?
- telemetry transport or only synchronization/eventing?
- durability expectations and replay behavior?

## 7. Security And Regulatory Positioning

### 7.1 Purdue-style layering is a good fit

EVOLV's preferred structure aligns well with a Purdue-style OT/IT layering approach:

- PLCs and field assets stay at the operational edge
- edge runtimes stay close to the process
- site systems mediate between OT and broader enterprise concerns
- central services host APIs, identity, analytics, and engineering workflows

That is important because it supports segmented trust boundaries instead of direct enterprise-to-field reach-through.

### 7.2 NIS2 alignment

Directive (EU) 2022/2555 (NIS2) requires cybersecurity risk-management measures, incident handling, and stronger governance for covered entities.

This architecture supports that by:

- limiting direct exposure of field systems
- separating operational layers
- enabling central policy and oversight
- preserving local operation during upstream failure

### 7.3 CER alignment

Directive (EU) 2022/2557 (Critical Entities Resilience Directive) focuses on the resilience of essential services.

The edge-plus-site approach supports that direction because:

- local/site layers can continue during central disruption
- essential service continuity does not depend on one central runtime
- degraded-mode behavior can be explicitly designed per layer

### 7.4 Cyber Resilience Act alignment

Regulation (EU) 2024/2847 (Cyber Resilience Act) creates cybersecurity requirements for products with digital elements.

For EVOLV, that means the platform should keep strengthening:

- secure configuration handling
- vulnerability and update management
- release traceability
- lifecycle ownership of components and dependencies

### 7.5 GDPR alignment where personal data is present

Regulation (EU) 2016/679 (GDPR) applies whenever EVOLV processes personal data.

The architecture helps by:

- centralizing ingress
- reducing unnecessary propagation of data to field layers
- making access, retention, and audit boundaries easier to define

### 7.6 What can and cannot be claimed

The defensible claim is that EVOLV can be deployed in a way that supports compliance with strict European cybersecurity and resilience expectations.

The non-defensible claim is that EVOLV is automatically compliant purely because of the architecture diagram.

Actual compliance still depends on implementation and operations, including:

- access control
- patch and vulnerability management
- incident response
- logging and audit evidence
- retention policy
- data classification

## 8. Recommended Ideal Stack

The ideal EVOLV stack should be layered around operational boundaries, not around tools.

### 8.1 Layer A: Edge execution

Purpose:

- connect to PLCs and field assets
- execute time-sensitive local logic
- preserve operation during WAN/central loss
- provide local telemetry access for resilience and digital-twin use cases

Recommended components:

- Node-RED runtime for EVOLV edge flows
- OPC UA and protocol adapters
- local InfluxDB
- optional local Grafana for local engineering/monitoring
- optional local broker only when multiple participants need decoupling

Principle:

- edge remains safe and useful when disconnected

### 8.2 Layer B: Site integration

Purpose:

- aggregate multiple edge systems at plant/site level
- host plant-local dashboards and diagnostics
- mediate between raw OT detail and central standardization
- serve as the protected step between field systems and central requests

Recommended components:

- site Node-RED / CoreSync services
- site InfluxDB
- site Grafana / SCADA-supporting dashboards
- site broker where asynchronous eventing is justified

Principle:

- site absorbs plant complexity and protects field assets

### 8.3 Layer C: Central platform

Purpose:

- fleet-wide analytics
- shared dashboards
- engineering lifecycle
- enterprise/API entry point
- overview intelligence and advisory logic

Recommended components:

- Gitea
- CI/CD
- central InfluxDB
- central Grafana
- API/integration gateway
- IAM
- VPN/private connectivity
- `tagcodering`-backed configuration services

Principle:

- central coordinates, advises, and governs; it is not the direct field caller

### 8.4 Cross-cutting platform services

These should be explicit architecture elements:

- secrets management
- certificate management
- backup/restore
- audit logging
- monitoring/alerting of the platform itself
- versioned configuration and schema management
- rollout/rollback strategy

## 9. Recommended Opinionated Choices

### 9.1 Keep Node-RED as the orchestration layer, not the whole platform

Node-RED should own:

- process orchestration
- protocol mediation
- edge/site logic
- KPI production

It should not become the sole owner of:

- identity
- long-term configuration authority
- secret management
- compliance/audit authority

### 9.2 Use InfluxDB by function and horizon

Recommended split:

- edge: resilience, local replay, digital-twin input
- site: plant diagnostics and local continuity
- central: fleet analytics, advisory intelligence, benchmarking, and long-term cross-site views

### 9.3 Prefer smart telemetry retention over naive point dumping

Recommended rule:

- keep information-rich points
- reduce information-poor flat spans
- document reconstruction assumptions
- define signal-class-specific fidelity expectations

This needs design discipline, but it is a real differentiator if executed well.

### 9.4 Put enterprise/API ingress at central, not at edge

This should become a hard architectural rule:

- external requests land centrally
- central authenticates and authorizes
- central or site mediates downward
- edge never becomes the exposed public integration surface

### 9.5 Make `tagcodering` the target configuration backbone

The architecture should be designed so that `tagcodering` can mature into:

- machine and asset registry
- configuration source of truth
- site/central configuration exchange point
- API-served configuration source for runtime layers

## 10. Suggested Phasing

### Phase 1: Stabilize contracts

- define topic and payload contracts
- define telemetry classes and reconstruction policy
- define the asset, machine, and site identity model
- define `tagcodering` scope and schema ownership

### Phase 2: Harden local/site resilience

- formalize edge and site runtime patterns
- define local telemetry retention and replay behavior
- define central-loss behavior
- define dashboard behavior during isolation

### Phase 3: Harden central platform

- IAM
- API gateway
- central observability
- CI/CD
- backup and disaster recovery
- config services over `tagcodering`

### Phase 4: Introduce selective synchronization and intelligence

- event-driven telemetry propagation rules
- smart-storage promotion/backfill policies
- advisory services from central
- auditability of downward recommendations and configuration changes

## 11. Immediate Open Questions Before Wiki Finalization

1. Which signals are allowed to use reconstruction-aware smart storage, and which must remain raw or near-raw for audit/compliance reasons?
2. How should `tagcodering` be exposed to runtime layers: direct database access, a dedicated API, or both?
3. What exact responsibility split should EVOLV use between API synchronization and broker-based eventing?

## 12. Recommended Wiki Structure

The wiki should not be one long page. It should be split into:

1. platform overview with the main topology diagram
2. edge-site-central runtime model
3. telemetry and smart storage model
4. security and access-boundary model
5. configuration architecture centered on `tagcodering`

## 13. Next Step

Use this document as the architecture baseline. The companion markdown page in `architecture/` can then be shaped into a wiki-ready visual overview page with Mermaid diagrams and shorter human-readable sections.
454 wiki/concepts/generalfunctions-api.md Normal file
@@ -0,0 +1,454 @@

---
title: generalFunctions API Reference
created: 2026-03-01
updated: 2026-04-07
status: evolving
tags: [api, generalFunctions, reference]
---

# generalFunctions API Reference

Shared library (`nodes/generalFunctions/`) used across all EVOLV Node-RED nodes.

```js
const { logger, outputUtils, MeasurementContainer, ... } = require('generalFunctions');
```

---

## Table of Contents

1. [Logger](#logger)
2. [OutputUtils](#outpututils)
3. [ValidationUtils](#validationutils)
4. [MeasurementContainer](#measurementcontainer)
5. [ConfigManager](#configmanager)
6. [ChildRegistrationUtils](#childregistrationutils)
7. [MenuUtils](#menuutils)
8. [EndpointUtils](#endpointutils)
9. [Positions](#positions)
10. [AssetLoader / loadCurve](#assetloader--loadcurve)

---

## Logger

Structured, level-filtered console logger.

**File:** `src/helper/logger.js`

### Constructor

```js
new Logger(logging = true, logLevel = 'debug', nameModule = 'N/A')
```

| Param | Type | Default | Description |
|---|---|---|---|
| `logging` | `boolean` | `true` | Enable/disable all output |
| `logLevel` | `string` | `'debug'` | Minimum severity: `'debug'` \| `'info'` \| `'warn'` \| `'error'` |
| `nameModule` | `string` | `'N/A'` | Label prefixed to every message |

### Methods

| Method | Signature | Description |
|---|---|---|
| `debug` | `(message: string): void` | Log at DEBUG level |
| `info` | `(message: string): void` | Log at INFO level |
| `warn` | `(message: string): void` | Log at WARN level |
| `error` | `(message: string): void` | Log at ERROR level |
| `setLogLevel` | `(level: string): void` | Change the minimum level at runtime |
| `toggleLogging` | `(): void` | Flip logging on/off |

### Example

```js
const Logger = require('generalFunctions').logger;
const log = new Logger(true, 'info', 'MyNode');

log.info('Node started');  // [INFO] -> MyNode: Node started
log.debug('ignored');      // silent (below 'info')
log.setLogLevel('debug');
log.debug('now visible');  // [DEBUG] -> MyNode: now visible
```

---

## OutputUtils
|
||||||
|
|
||||||
|
Tracks output state and formats messages for InfluxDB or process outputs. Only emits changed fields.
|
||||||
|
|
||||||
|
**File:** `src/helper/outputUtils.js`
|
||||||
|
|
||||||
|
### Constructor
|
||||||
|
|
||||||
|
```js
|
||||||
|
new OutputUtils() // no parameters
|
||||||
|
```
|
||||||
|
|
||||||
|
### Methods
|
||||||
|
|
||||||
|
| Method | Signature | Returns | Description |
|
||||||
|
|---|---|---|---|
|
||||||
|
| `formatMsg` | `(output, config, format)` | `object \| undefined` | Diff against last output; returns formatted msg or `undefined` if nothing changed |
|
||||||
|
| `checkForChanges` | `(output, format)` | `object` | Returns only the key/value pairs that changed since last call |
|
||||||
|
|
||||||
|
**`format`** must be `'influxdb'` or `'process'`.
|
||||||
|
|
||||||
|
### Example
|
||||||
|
|
||||||
|
```js
|
||||||
|
const out = new OutputUtils();
|
||||||
|
const msg = out.formatMsg(
|
||||||
|
{ temperature: 22.5, pressure: 1013 },
|
||||||
|
config,
|
||||||
|
'influxdb'
|
||||||
|
);
|
||||||
|
// msg = { topic: 'nodeName', payload: { measurement, fields, tags, timestamp } }
|
||||||
|
```
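The diffing idea behind `checkForChanges` can be sketched as a stand-alone class. This is a simplified model, not the actual implementation in `src/helper/outputUtils.js` (which also handles the InfluxDB/process formatting); the shallow key/value comparison against the last emitted state is an assumption based on the method description above.

```js
// Minimal sketch of last-value diffing (assumption: shallow comparison
// against the previously emitted state, as checkForChanges is described).
class ChangeTracker {
  constructor() {
    this.lastOutput = {};
  }

  // Return only the key/value pairs that differ from the previous call.
  checkForChanges(output) {
    const changed = {};
    for (const [key, value] of Object.entries(output)) {
      if (this.lastOutput[key] !== value) {
        changed[key] = value;
        this.lastOutput[key] = value;
      }
    }
    return changed;
  }
}

const tracker = new ChangeTracker();
tracker.checkForChanges({ temperature: 22.5, pressure: 1013 }); // both keys returned (first call)
tracker.checkForChanges({ temperature: 22.5, pressure: 1014 }); // → { pressure: 1014 }
```

This is why a node that publishes every tick produces sparse writes: unchanged fields are suppressed before they reach InfluxDB.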
---

## ValidationUtils

Schema-driven config validation with type coercion, range clamping, and nested object support.

**File:** `src/helper/validationUtils.js`

### Constructor

```js
new ValidationUtils(loggerEnabled = true, loggerLevel = 'warn')
```

### Methods

| Method | Signature | Returns | Description |
|---|---|---|---|
| `validateSchema` | `(config, schema, name)` | `object` | Walk the schema, validate every field, return a clean config. Unknown keys are stripped. Missing keys get their schema default. |
| `constrain` | `(value, min, max)` | `number` | Clamp a numeric value to `[min, max]` |
| `removeUnwantedKeys` | `(obj)` | `object` | Strip `rules`/`description` metadata, collapse `default` values |

**Supported `rules.type` values:** `number`, `integer`, `boolean`, `string`, `enum`, `array`, `set`, `object`, `curve`, `machineCurve`.

### Example

```js
const ValidationUtils = require('generalFunctions').validation;
const v = new ValidationUtils(true, 'warn');

const schema = {
  temperature: { default: 20, rules: { type: 'number', min: -40, max: 100 } },
  unit: { default: 'C', rules: { type: 'enum', values: [{ value: 'C' }, { value: 'F' }] } }
};

const validated = v.validateSchema({ temperature: 999 }, schema, 'myNode');
// validated.temperature === 100 (clamped)
// validated.unit === 'C' (default applied)
```

---

## MeasurementContainer

Chainable measurement storage organised by **type / variant / position**. Supports auto unit conversion, windowed statistics, events, and positional difference calculations.

**File:** `src/measurements/MeasurementContainer.js`

### Constructor

```js
new MeasurementContainer(options = {}, logger)
```

| Option | Type | Default | Description |
|---|---|---|---|
| `windowSize` | `number` | `10` | Rolling window for statistics |
| `defaultUnits` | `object` | `{ pressure:'mbar', flow:'m3/h', ... }` | Default unit per measurement type |
| `autoConvert` | `boolean` | `true` | Auto-convert values to target unit |
| `preferredUnits` | `object` | `{}` | Per-type unit overrides |

### Chainable Setters

All return `this` for chaining.

```js
container
  .type('pressure')
  .variant('static')
  .position('upstream')
  .distance(5)
  .unit('bar')
  .value(3.2, Date.now(), 'bar');
```

| Method | Signature | Description |
|---|---|---|
| `type` | `(typeName): this` | Set measurement type (e.g. `'pressure'`) |
| `variant` | `(variantName): this` | Set variant (e.g. `'static'`, `'differential'`) |
| `position` | `(positionValue): this` | Set position (e.g. `'upstream'`, `'downstream'`) |
| `distance` | `(distance): this` | Set physical distance from parent |
| `unit` | `(unitName): this` | Set unit on the underlying measurement |
| `value` | `(val, timestamp?, sourceUnit?): this` | Store a value; auto-converts if `sourceUnit` differs from target |

### Terminal / Query Methods

| Method | Signature | Returns | Description |
|---|---|---|---|
| `get` | `()` | `Measurement \| null` | Get the raw measurement object |
| `getCurrentValue` | `(requestedUnit?)` | `number \| null` | Latest value, optionally converted |
| `getAverage` | `(requestedUnit?)` | `number \| null` | Windowed average |
| `getMin` | `()` | `number \| null` | Window minimum |
| `getMax` | `()` | `number \| null` | Window maximum |
| `getAllValues` | `()` | `array \| null` | All stored samples |
| `getLaggedValue` | `(lag?, requestedUnit?)` | `number \| null` | Value from `lag` samples ago |
| `getLaggedSample` | `(lag?, requestedUnit?)` | `object \| null` | Full sample `{ value, timestamp, unit }` from `lag` samples ago |
| `exists` | `({ type?, variant?, position?, requireValues? })` | `boolean` | Check if a measurement series exists |
| `difference` | `({ from?, to?, unit? })` | `object \| null` | Compute `{ value, avgDiff, unit }` between two positions |

### Introspection / Lifecycle

| Method | Signature | Returns | Description |
|---|---|---|---|
| `getTypes` | `()` | `string[]` | All registered measurement types |
| `getVariants` | `()` | `string[]` | Variants under current type |
| `getPositions` | `()` | `string[]` | Positions under current type+variant |
| `getAvailableUnits` | `(measurementType?)` | `string[]` | Units available for a type |
| `getBestUnit` | `(excludeUnits?)` | `object \| null` | Best human-readable unit for current value |
| `setPreferredUnit` | `(type, unit)` | `this` | Override default unit for a type |
| `setChildId` | `(id)` | `this` | Tag container with a child node ID |
| `setChildName` | `(name)` | `this` | Tag container with a child node name |
| `setParentRef` | `(parent)` | `this` | Store reference to parent node |
| `clear` | `()` | `void` | Reset all measurements and chain state |

### Events

The internal `emitter` fires `"type.variant.position"` on every `value()` call with:

```js
{ value, originalValue, unit, sourceUnit, timestamp, position, distance, variant, type, childId, childName, parentRef }
```

### Example

```js
const { MeasurementContainer } = require('generalFunctions');
const mc = new MeasurementContainer({ windowSize: 5 });

mc.type('pressure').variant('static').position('upstream').value(3.2);
mc.type('pressure').variant('static').position('downstream').value(2.8);

const diff = mc.type('pressure').variant('static').difference();
// diff = { value: -0.4, avgDiff: -0.4, unit: 'mbar', from: 'downstream', to: 'upstream' }
```

---

## ConfigManager

Loads JSON config files from disk and builds merged runtime configs.

**File:** `src/configs/index.js`

### Constructor

```js
new ConfigManager(relPath = '.')
```

`relPath` is resolved relative to the configs directory.

### Methods

| Method | Signature | Returns | Description |
|---|---|---|---|
| `getConfig` | `(configName)` | `object` | Load and parse `<configName>.json` |
| `getAvailableConfigs` | `()` | `string[]` | List config names (without `.json`) |
| `hasConfig` | `(configName)` | `boolean` | Check existence |
| `getBaseConfig` | `()` | `object` | Shortcut for `getConfig('baseConfig')` |
| `buildConfig` | `(nodeName, uiConfig, nodeId, domainConfig?)` | `object` | Merge base schema + UI overrides into a runtime config |
| `createEndpoint` | `(nodeName)` | `string` | Generate browser JS that injects config into `window.EVOLV.nodes` |

### Example

```js
const { configManager } = require('generalFunctions');
const cfg = configManager.buildConfig('measurement', uiConfig, node.id, {
  scaling: { enabled: true, inputMin: 0, inputMax: 100 }
});
```

---

## ChildRegistrationUtils

Manages parent-child node relationships: registration, lookup, and structure storage.

**File:** `src/helper/childRegistrationUtils.js`

### Constructor

```js
new ChildRegistrationUtils(mainClass)
```

`mainClass` is the parent node instance (must expose `.logger` and optionally `.registerChild()`).

### Methods

| Method | Signature | Returns | Description |
|---|---|---|---|
| `registerChild` | `(child, positionVsParent, distance?)` | `Promise<any>` | Register a child node under the parent. Sets up parent refs, measurement context, and stores by softwareType/category. |
| `getChildrenOfType` | `(softwareType, category?)` | `array` | Get children filtered by software type and optional category |
| `getChildById` | `(childId)` | `object \| null` | Look up a single child by its ID |
| `getAllChildren` | `()` | `array` | All registered children |
| `logChildStructure` | `()` | `void` | Debug-print the full child tree |

### Example

```js
const { childRegistrationUtils: CRU } = require('generalFunctions');
const cru = new CRU(parentNode);
await cru.registerChild(sensorNode, 'upstream');
cru.getChildrenOfType('measurement'); // [sensorNode]
```

---

## MenuUtils

Browser-side UI helper for the Node-RED editor. Methods are mixed in from separate modules: toggles, data fetching, URL utils, dropdown population, and HTML generation.

**File:** `src/helper/menuUtils.js`

### Constructor

```js
new MenuUtils() // no parameters; sets isCloud=false, configData=null
```

### Key Methods

**Toggles** -- control UI element visibility:

| Method | Signature | Description |
|---|---|---|
| `initBasicToggles` | `(elements)` | Bind log-level row visibility to log checkbox |
| `initMeasurementToggles` | `(elements)` | Bind scaling input rows to scaling checkbox |
| `initTensionToggles` | `(elements, node)` | Show/hide tension row based on interpolation method |

**Data Fetching:**

| Method | Signature | Returns | Description |
|---|---|---|---|
| `fetchData` | `(url, fallbackUrl)` | `Promise<array>` | Fetch JSON from primary URL; fall back on failure |
| `fetchProjectData` | `(url)` | `Promise<object>` | Fetch project-level data |
| `apiCall` | `(node)` | `Promise<object>` | POST to asset-register API |

**URL Construction:**

| Method | Signature | Returns | Description |
|---|---|---|---|
| `getSpecificConfigUrl` | `(nodeName, cloudAPI)` | `{ cloudConfigURL, localConfigURL }` | Build cloud + local config URLs |
| `constructUrl` | `(base, ...paths)` | `string` | Join URL segments safely |
| `constructCloudURL` | `(base, ...paths)` | `string` | Same as `constructUrl`, for cloud endpoints |
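"Join URL segments safely" typically means tolerating stray leading/trailing slashes on each segment. A stand-alone approximation of that behavior (hypothetical; the real `constructUrl` lives in `src/helper/menuUtils.js` and may differ):

```js
// Hypothetical sketch of slash-safe URL joining: trim trailing slashes
// from the base, trim both ends of each segment, drop empty segments.
function joinUrl(base, ...paths) {
  return [base, ...paths]
    .map((part, i) =>
      i === 0
        ? String(part).replace(/\/+$/, '')        // keep scheme, trim trailing slashes
        : String(part).replace(/^\/+|\/+$/g, '')  // trim both ends of segments
    )
    .filter((part) => part.length > 0)
    .join('/');
}

joinUrl('http://127.0.0.1:1880/', '/valve/', 'resources', 'menuUtils.js');
// → 'http://127.0.0.1:1880/valve/resources/menuUtils.js'
```

This avoids the classic `base + '/' + path` bugs (double slashes, or a missing separator when the base has no trailing slash).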
**Dropdown Population:**

| Method | Signature | Description |
|---|---|---|
| `fetchAndPopulateDropdowns` | `(configUrls, elements, node)` | Cascading supplier > subType > model > unit dropdowns |
| `populateDropdown` | `(htmlElement, options, node, property, callback?)` | Fill a `<select>` with options and wire change events |
| `populateLogLevelOptions` | `(logLevelSelect, configData, node)` | Populate log-level dropdown from config |
| `populateSmoothingMethods` | `(configUrls, elements, node)` | Populate smoothing method dropdown |
| `populateInterpolationMethods` | `(configUrls, elements, node)` | Populate interpolation method dropdown |
| `generateHtml` | `(htmlElement, options, savedValue)` | Write `<option>` HTML into an element |

---

## EndpointUtils

Server-side helper that serves `MenuUtils` as browser JavaScript via Node-RED HTTP endpoints.

**File:** `src/helper/endpointUtils.js`

### Constructor

```js
new EndpointUtils({ MenuUtilsClass? })
```

| Param | Type | Default | Description |
|---|---|---|---|
| `MenuUtilsClass` | `class` | `MenuUtils` | The MenuUtils constructor to introspect |

### Methods

| Method | Signature | Returns | Description |
|---|---|---|---|
| `createMenuUtilsEndpoint` | `(RED, nodeName, customHelpers?)` | `void` | Register `GET /<nodeName>/resources/menuUtils.js` |
| `generateMenuUtilsCode` | `(nodeName, customHelpers?)` | `string` | Produce the browser JS string (introspects `MenuUtils.prototype`) |

### Example

```js
const EndpointUtils = require('generalFunctions/src/helper/endpointUtils');
const ep = new EndpointUtils();
ep.createMenuUtilsEndpoint(RED, 'valve');
// Browser can now load: GET /valve/resources/menuUtils.js
```

---

## Positions

Canonical constants for parent-child spatial relationships.

**File:** `src/constants/positions.js`

### Exports

```js
const { POSITIONS, POSITION_VALUES, isValidPosition } = require('generalFunctions');
```

| Export | Type | Value |
|---|---|---|
| `POSITIONS` | `object` | `{ UPSTREAM: 'upstream', DOWNSTREAM: 'downstream', AT_EQUIPMENT: 'atEquipment', DELTA: 'delta' }` |
| `POSITION_VALUES` | `string[]` | `['upstream', 'downstream', 'atEquipment', 'delta']` |
| `isValidPosition` | `(pos: string): boolean` | Returns `true` if `pos` is one of the four values |
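A minimal stand-alone re-implementation of these exports, equivalent in behavior to the table above (for illustration; the real module is `src/constants/positions.js`):

```js
// Canonical position constants (values taken from the exports table above).
const POSITIONS = {
  UPSTREAM: 'upstream',
  DOWNSTREAM: 'downstream',
  AT_EQUIPMENT: 'atEquipment',
  DELTA: 'delta',
};

// Derived list form, handy for dropdowns and validation.
const POSITION_VALUES = Object.values(POSITIONS);

// True only for one of the four canonical position strings.
function isValidPosition(pos) {
  return POSITION_VALUES.includes(pos);
}

isValidPosition('upstream'); // → true
isValidPosition('sideways'); // → false
```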
---

## AssetLoader / loadCurve

Loads JSON asset files (machine curves, etc.) from the datasets directory with LRU caching.

**File:** `datasets/assetData/curves/index.js`

### Singleton convenience functions

```js
const { loadCurve } = require('generalFunctions');
```

| Function | Signature | Returns | Description |
|---|---|---|---|
| `loadCurve` | `(curveType: string)` | `object \| null` | Load `<curveType>.json` from the curves directory |
| `loadAsset` | `(datasetType, assetId)` | `object \| null` | Load any JSON asset by dataset folder and ID |
| `getAvailableAssets` | `(datasetType)` | `string[]` | List asset IDs in a dataset folder |

### AssetLoader class

```js
new AssetLoader(maxCacheSize = 100)
```

Same methods as above (`loadCurve`, `loadAsset`, `getAvailableAssets`), plus `clearCache()`.

### Example

```js
const { loadCurve } = require('generalFunctions');
const curve = loadCurve('hidrostal-H05K-S03R');
// curve = { flow: [...], head: [...], ... } or null
```

wiki/findings/bep-gravitation-proof.md (new file, 38 lines)
@@ -0,0 +1,38 @@

---
title: BEP-Gravitation Optimality Proof
created: 2026-04-07
updated: 2026-04-07
status: proven
tags: [machineGroupControl, optimization, BEP, brute-force]
sources: [nodes/machineGroupControl/test/integration/distribution-power-table.integration.test.js]
---

# BEP-Gravitation vs Brute-Force Global Optimum

## Claim

The machineGroupControl BEP-Gravitation algorithm (with marginal-cost refinement) produces near-optimal flow distribution across a pump group.

## Method

Brute-force exhaustive search: 1000 steps per pump, all 2^n combinations, 0.05% flow tolerance. Station: 2x H05K-S03R + 1x C5-D03R-SHN1 @ ΔP=2000 mbar.

## Results

| Demand | Brute force | machineGroupControl | Gap |
|--------|------------|--------------------|----|
| 10% (71 m3/h) | 17.65 kW | 17.63 kW | -0.10% (MGC wins) |
| 25% (136 m3/h) | 34.33 kW | 34.33 kW | +0.01% |
| 50% (243 m3/h) | 61.62 kW | 61.62 kW | -0.00% |
| 75% (351 m3/h) | 96.01 kW | 96.10 kW | +0.08% |
| 90% (415 m3/h) | 122.17 kW | 122.26 kW | +0.07% |

Maximum deviation: **0.1%** from the proven global optimum.

## Why the Refinement Matters

Before the marginal-cost refinement loop, the gap at 50% demand was **2.12%**. The BEP-Gravitation slope estimate pushed 14.6 m3/h to C5 (costing 5.0 kW) when the optimum was 6.5 m3/h (0.59 kW). The refinement loop corrects this by shifting flow from the highest actual dP/dQ to the lowest until no improvement is possible.
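The refinement loop described above can be sketched as a greedy exchange. This is a simplified stand-alone model with made-up quadratic power functions, not the production code (which evaluates the real interpolated 3D pump curves):

```js
// Greedy marginal-cost refinement sketch: repeatedly move a small flow
// step from the pump with the highest actual dP/dQ to the pump with the
// lowest, stopping as soon as total power no longer improves.
// powerFns[i](q) returns pump i's power at flow q (illustrative models).
function refine(flows, powerFns, step = 0.5, maxIter = 1000) {
  const total = (f) => f.reduce((sum, q, i) => sum + powerFns[i](q), 0);
  for (let iter = 0; iter < maxIter; iter++) {
    // Actual marginal cost of one step at each pump's operating point.
    const marginal = flows.map((q, i) => powerFns[i](q + step) - powerFns[i](q));
    const hi = marginal.indexOf(Math.max(...marginal));
    const lo = marginal.indexOf(Math.min(...marginal));
    if (hi === lo || flows[hi] - step < 0) break;
    const candidate = flows.slice();
    candidate[hi] -= step; // take flow from the most expensive pump...
    candidate[lo] += step; // ...and give it to the cheapest one
    if (total(candidate) >= total(flows)) break; // no improvement left
    flows = candidate;
  }
  return flows;
}

// Example: two pumps with P1 = Q^2 and P2 = 2*Q^2 sharing 12 units of demand.
// The equal-marginal-cost optimum is [8, 4] (where 2*Q1 equals 4*Q2).
refine([6, 6], [q => q * q, q => 2 * q * q]); // → [8, 4]
```

Because the exchange uses actual power evaluations rather than a single-point slope, it is robust to the non-convex curve regions described in [[Pump Curve Non-Convexity]].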
## Stability

Sweep 5-95% in 2% steps: 1 switch (rising), 1 switch (falling), same transition point. No hysteresis. See [[Pump Switching Stability]].

## Computational Cost

0.027-0.153 ms median per optimization call (3 pumps, 6 combinations). Uses 0.015% of the 1000 ms tick budget.

wiki/findings/curve-non-convexity.md (new file, 34 lines)
@@ -0,0 +1,34 @@

---
title: Pump Curve Non-Convexity
created: 2026-04-07
updated: 2026-04-07
status: proven
tags: [curves, interpolation, C5, non-convex]
sources: [nodes/generalFunctions/datasets/assetData/curves/hidrostal-C5-D03R-SHN1.json]
---

# Pump Curve Non-Convexity from Sparse Data

## Finding

The C5-D03R-SHN1 pump's power curve is non-convex after spline interpolation. The marginal cost (dP/dQ) shows a spike-then-valley pattern:

```
C5 dP/dQ across flow range @ ΔP=2000 mbar:
  6.4 m3/h → 1,316,610 (high)
 10.2 m3/h → 2,199,349 (spikes UP)
 17.7 m3/h → 1,114,700 (dropping)
 21.5 m3/h →   453,316 (valley — cheapest)
 29.0 m3/h → 1,048,375 (rising again)
 44.1 m3/h → 1,107,708 (high)
```

## Root Cause

The C5 curve has only **5 raw data points** per pressure level. The monotonic cubic spline (Fritsch-Carlson) creates a smooth curve through all 5 points, but with such sparse data it introduces non-convex regions that don't match the physical convexity of a real pump.
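Non-convexity of a sampled power curve can be detected directly from the data: for a convex P(Q), the marginal cost dP/dQ must be non-decreasing in Q. A small stand-alone check over secant slopes (illustrative; not part of the codebase):

```js
// Check whether sampled (flow, power) points trace a convex curve:
// successive secant slopes (≈ dP/dQ) must be non-decreasing.
// Assumes flows are strictly increasing.
function isConvex(flows, powers) {
  let prevSlope = -Infinity;
  for (let i = 1; i < flows.length; i++) {
    const slope = (powers[i] - powers[i - 1]) / (flows[i] - flows[i - 1]);
    if (slope < prevSlope) return false; // marginal cost dropped → non-convex
    prevSlope = slope;
  }
  return true;
}

isConvex([0, 10, 20, 30], [0, 10, 30, 60]); // → true  (slopes 1, 2, 3)
isConvex([0, 10, 20, 30], [0, 20, 30, 60]); // → false (slopes 2, 1, 3)
```

Applied to the C5 dP/dQ samples above, the 2,199,349 → 453,316 drop fails exactly this test.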
## Impact

- The equal-marginal-cost theorem (KKT conditions) does not apply — it requires convexity
- The BEP-Gravitation slope estimate at a single point can be misleading in non-convex regions
- The marginal-cost refinement loop fixes this by using actual power evaluations instead of slope assumptions

## Recommendation

Add more data points (15-20 per pressure level) to the C5 curve. This would make the spline track the real convex physics more closely, eliminating the non-convex artifacts.

wiki/findings/ncog-behavior.md (new file, 42 lines)
@@ -0,0 +1,42 @@

---
title: NCog Behavior and Limitations
created: 2026-04-07
updated: 2026-04-07
status: evolving
tags: [rotatingMachine, NCog, BEP, efficiency]
sources: [nodes/rotatingMachine/src/specificClass.js]
---

# NCog — Normalized Center of Gravity

## What It Is

NCog is a 0-1 value indicating where on its flow range a pump operates most efficiently. Computed per tick from the current pressure slice of the 3D pump curve.

```
BEP_flow = minFlow + (maxFlow - minFlow) * NCog
```

## How It's Computed

1. Pressure sensors update → `getMeasuredPressure()` computes the differential
2. `fDimension` locks the 2D slice at the current system pressure
3. `calcCog()` computes Q/P (specific flow) across the curve
4. Peak Q/P index → `NCog = (flowAtPeak - flowMin) / (flowMax - flowMin)`
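Steps 3-4 can be sketched as a stand-alone function over one 2D (flow, power) slice. This is an illustrative model with made-up sample data; the real `calcCog()` in `rotatingMachine/src/specificClass.js` works on the interpolated 3D curve:

```js
// NCog sketch: find the flow with peak specific flow Q/P on the current
// pressure slice, then normalize that flow to the pump's flow range.
// Assumes flows are sorted ascending and powers are strictly positive.
function calcNCog(flows, powers) {
  let peakIdx = 0;
  let peakQperP = -Infinity;
  for (let i = 0; i < flows.length; i++) {
    const qPerP = flows[i] / powers[i]; // specific flow at this sample
    if (qPerP > peakQperP) {
      peakQperP = qPerP;
      peakIdx = i;
    }
  }
  const flowMin = flows[0];
  const flowMax = flows[flows.length - 1];
  return (flows[peakIdx] - flowMin) / (flowMax - flowMin);
}

// Peak Q/P at 40 on a 20-80 flow range → NCog = (40 - 20) / (80 - 20) ≈ 0.333
calcNCog([20, 40, 60, 80], [10, 12, 30, 60]);
// A monotonically decreasing Q/P puts the peak at index 0 → NCog = 0
calcNCog([20, 40, 60, 80], [10, 40, 90, 160]); // → 0
```

The second call shows the degenerate case discussed below: when Q/P decreases monotonically, the peak sits at minimum flow and NCog collapses to 0.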
## When NCog is Meaningful

NCog requires **differential pressure** (upstream + downstream). With only one pressure sensor, fDimension is the raw sensor value (too high), producing a monotonic Q/P curve and NCog = 0.

| Condition | NCog for H05K | NCog for C5 |
|-----------|--------------|-------------|
| ΔP = 400 mbar | 0.333 | 0.355 |
| ΔP = 1000 mbar | 0.000 | 0.000 |
| ΔP = 1500 mbar | 0.135 | 0.000 |
| ΔP = 2000 mbar | 0.351 | 0.000 |

## Why NCog = 0 Happens

For variable-speed centrifugal pumps, Q/P is monotonically decreasing when the affinity laws dominate (P ∝ Q³). At certain pressure levels, the spline interpolation preserves this monotonicity and the peak is always at index 0 (minimum flow).

## How machineGroupControl Uses NCog

The BEP-Gravitation algorithm seeds each pump at its BEP flow, then redistributes using slope-based weights plus marginal-cost refinement. Even when NCog = 0, the slope redistribution produces near-optimal results because it uses actual power evaluations.

> [!warning] Disproven: NCog as proportional weight
> Using NCog directly as a flow-distribution weight (`flow = NCog/totalNCog * Qd`) is wrong. It starves pumps with NCog = 0 and overloads high-NCog pumps. See `calcBestCombination` in machineGroupControl.

wiki/findings/open-issues-2026-03.md (new file, 88 lines)
@@ -0,0 +1,88 @@

---
title: Open Issues — EVOLV Codebase
created: 2026-03-01
updated: 2026-04-07
status: evolving
tags: [issues, backlog]
---

# Open Issues — EVOLV Codebase

Issues identified during a codebase scan (2026-03-12). Create these on Gitea when ready.

---

## Issue 1: Restore diffuser node implementation

**Labels:** `enhancement`, `node`
**Priority:** Medium

The `nodes/diffuser/` directory contains only `.git`, `LICENSE`, and `README.md` — no implementation. There was a previous experimental version. Needs:

- Retrieve original diffuser logic from user/backup
- Rebuild to the current three-layer architecture (wrapper `.js` + `src/nodeClass.js` + `src/specificClass.js`)
- Use `require('generalFunctions')` barrel imports
- Add config JSON in `generalFunctions/src/configs/diffuser.json`
- Register under category `'EVOLV'` with the appropriate S88 color
- Add tests

**Blocked on:** User providing original diffuser logic/requirements.

---

## Issue 2: Relocate prediction/ML modules to an external service

**Labels:** `enhancement`, `architecture`
**Priority:** Medium

TensorFlow-based influent prediction code was removed from the monster node (it was broken/incomplete). The prediction functionality needs a new home:

- LSTM model for 24-hour flow prediction based on precipitation data
- Standardization constants: hours `(mean=11.504, std=6.922)`, precipitation `(mean=0.090, std=0.439)`, response `(mean=1188.01, std=1024.19)`
- Model was served from `http://127.0.0.1:1880/generalFunctions/datasets/lstmData/tfjs_model/`
- Consider: separate microservice, Python-based inference, or ONNX runtime
- The monster node should accept predictions via the `model_prediction` message topic from the external service

**Related files removed:** `monster_class.js` methods `get_model_prediction()`, `model_loader()`

---

## Issue 3: Modernize monster node to three-layer architecture

**Labels:** `refactor`, `node`
**Priority:** Low

The monster node uses an old-style structure (`dependencies/monster/` instead of `src/`). It should be refactored:

- Move `dependencies/monster/monster_class.js` → `src/specificClass.js`
- Create `src/nodeClass.js` adapter (extract from `monster.js`)
- Slim down `monster.js` to the standard wrapper pattern
- Move `monsterConfig.json` → `generalFunctions/src/configs/monster.json`
- Remove `modelLoader.js` (TF dependency removed)
- Add unit tests

**Note:** `monster_class.js` is ~500 lines of domain logic. Keep `sampling_program()`, aggregation, and AQUON integration intact.

---

## Issue 4: Clean up inline test/demo code in specificClass files

**Labels:** `cleanup`
**Priority:** Low

Several specificClass files have test/demo code after `module.exports`:

- `pumpingStation/src/specificClass.js` (lines 478-697): demo code guarded with `require.main === module` — acceptable, but could move to `test/` or `examples/`
- `machineGroupControl/src/specificClass.js` (lines 969-1158): block-commented test code with `makeMachines()` — dead code; remove it or move it to a test file

---

## Issue 5: DashboardAPI node improvements

**Labels:** `enhancement`, `security`
**Priority:** Low

- Bearer token now relies on the `GRAFANA_TOKEN` env var (the hardcoded token was removed for security)
- Ensure deployment docs mention setting `GRAFANA_TOKEN`
- `dashboardapi_class.js` still has `console.log` calls (lines 154, 178) — should use the logger
- Node doesn't follow the three-layer architecture (older style)
34
wiki/findings/pump-switching-stability.md
Normal file
34
wiki/findings/pump-switching-stability.md
Normal file
@@ -0,0 +1,34 @@
---
title: Pump Switching Stability
created: 2026-04-07
updated: 2026-04-07
status: proven
tags: [machineGroupControl, stability, switching]
sources: [nodes/machineGroupControl/test/integration/ncog-distribution.integration.test.js]
---

# Pump Switching Stability

## Concern

Frequent pump on/off cycling causes mechanical wear, water hammer, and process disturbance.

## Test Method

Sweep demand from 5% to 95% in 2% steps and count combination changes. Repeat in reverse to check for hysteresis.

## Results — Mixed Station (2x H05K + 1x C5)

- Rising 5→95%: **1 switch** at 27% (H05K-1+C5 → all 3)
- Falling 95→5%: **1 switch** at 25% (all 3 → H05K-1+C5)

Same transition zone, no hysteresis.

## Results — Equal Station (3x H05K)

Rising 5→95%: **2 switches**

- 19%: 1 pump → 2 pumps
- 37%: 2 pumps → 3 pumps

Clean monotonic transitions, no flickering.

## Why It's Stable

The marginal-cost refinement only adjusts flow distribution WITHIN a combination — it never changes which pumps are selected. Combination selection is driven by total power comparison, which changes smoothly with demand.
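The sweep in the test method above can be sketched as a small script. A minimal sketch, assuming a hypothetical `selectCombination(demandPct)` stand-in for the node's actual combination selection (the thresholds here mimic the equal-station results, not real curve data):

```javascript
// Hypothetical stand-in for machineGroupControl's combination selection:
// maps a demand (% of station capacity) to a pump-combination key.
function selectCombination(demandPct) {
  if (demandPct < 19) return "1-pump";
  if (demandPct < 37) return "2-pumps";
  return "3-pumps";
}

// Count how often the selected combination changes across a demand sweep.
function countSwitches(demands) {
  let switches = 0;
  let prev = null;
  for (const d of demands) {
    const combo = selectCombination(d);
    if (prev !== null && combo !== prev) switches++;
    prev = combo;
  }
  return switches;
}

// Sweep 5% -> 95% in 2% steps, then reverse to check for hysteresis.
const rising = [];
for (let d = 5; d <= 95; d += 2) rising.push(d);
const falling = [...rising].reverse();

console.log(countSwitches(rising));  // 2
console.log(countSwitches(falling)); // 2 (same count both ways: no hysteresis)
```

A real test would drive the node itself instead of the stand-in; the sweep-and-count structure stays the same.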
wiki/index.md (new file, 57 lines)
@@ -0,0 +1,57 @@
---
title: Wiki Index
updated: 2026-04-13
---

# EVOLV Project Wiki Index

## Overview
- [Project Overview](overview.md) — what works, what doesn't, node inventory
- [Metrics Dashboard](metrics.md) — test counts, power comparison, performance
- [Knowledge Graph](knowledge-graph.yaml) — structured data, machine-queryable

## Architecture
- [Node Architecture](architecture/node-architecture.md) — three-layer pattern, ports, mermaid diagrams
- [3D Pump Curves](architecture/3d-pump-curves.md) — predict class, spline interpolation, unit chain
- [Group Optimization](architecture/group-optimization.md) — BEP-Gravitation, combination selection, marginal-cost refinement
- [Platform Overview](architecture/platform-overview.md) — edge/site/central layering, telemetry model
- [Deployment Blueprint](architecture/deployment-blueprint.md) — Docker topology, rollout order
- [Stack Review](architecture/stack-review.md) — full stack architecture assessment

## Core Concepts
- [generalFunctions API](concepts/generalfunctions-api.md) — logger, MeasurementContainer, configManager, etc.
- [Pump Affinity Laws](concepts/pump-affinity-laws.md) — Q ∝ N, H ∝ N², P ∝ N³
- [ASM Models](concepts/asm-models.md) — activated sludge model kinetics
- [PID Control Theory](concepts/pid-control-theory.md) — proportional-integral-derivative control
- [Settling Models](concepts/settling-models.md) — secondary clarifier sludge settling
- [Signal Processing for Sensors](concepts/signal-processing-sensors.md) — sensor conditioning
- [InfluxDB Schema Design](concepts/influxdb-schema-design.md) — telemetry data model
- [OT Security (IEC 62443)](concepts/ot-security-iec62443.md) — industrial security standard
- [Wastewater Compliance NL](concepts/wastewater-compliance-nl.md) — Dutch regulatory requirements

## Findings
- [BEP-Gravitation Proof](findings/bep-gravitation-proof.md) — within 0.1% of brute-force optimum (proven)
- [NCog Behavior](findings/ncog-behavior.md) — when NCog works, when it's zero, how it's used (evolving)
- [Curve Non-Convexity](findings/curve-non-convexity.md) — C5 sparse data artifacts (proven)
- [Pump Switching Stability](findings/pump-switching-stability.md) — 1-2 transitions, no hysteresis (proven)
- [Open Issues (2026-03)](findings/open-issues-2026-03.md) — diffuser, monster refactor, ML relocation, etc.

## Manuals
- [rotatingMachine User Manual](manuals/nodes/rotatingMachine.md) — inputs, outputs, state machine, examples
- [measurement User Manual](manuals/nodes/measurement.md) — analog + digital modes, smoothing, outlier filtering
- [FlowFuse Dashboard Layout](manuals/node-red/flowfuse-dashboard-layout-manual.md)
- [FlowFuse Widget Catalog](manuals/node-red/flowfuse-widgets-catalog.md)
- [Node-RED Function Patterns](manuals/node-red/function-node-patterns.md)
- [Node-RED Runtime](manuals/node-red/runtime-node-js.md)
- [Messages and Editor Structure](manuals/node-red/messages-and-editor-structure.md)

## Sessions
- [2026-04-07: Production Hardening](sessions/2026-04-07-production-hardening.md) — rotatingMachine + machineGroupControl
- [2026-04-13: rotatingMachine Trial-Ready](sessions/2026-04-13-rotatingMachine-trial-ready.md) — FSM interruptibility, config schema sync, UX polish, dual-curve tests
- [2026-04-13: measurement Digital Mode](sessions/2026-04-13-measurement-digital-mode.md) — silent dispatcher bug fix, 59 new tests, MQTT-style multi-channel input mode

## Other Documentation (outside wiki)
- `CLAUDE.md` — Claude Code project guide (root)
- `.agents/AGENTS.md` — agent routing table, orchestrator policy
- `.agents/` — skills, decisions, function-anchors, improvements
- `.claude/` — Claude Code agents and rules
wiki/knowledge-graph.yaml (new file, 161 lines)
@@ -0,0 +1,161 @@
# Knowledge Graph — structured data with provenance
# Every claim has: value, source (file/commit), date, status

# ── TESTS ──
tests:
  rotatingMachine:
    basic:
      count: 10
      passing: 10
      file: nodes/rotatingMachine/test/basic/
      date: 2026-04-07
    integration:
      count: 16
      passing: 16
      file: nodes/rotatingMachine/test/integration/
      date: 2026-04-07
    edge:
      count: 17
      passing: 17
      file: nodes/rotatingMachine/test/edge/
      date: 2026-04-07
  machineGroupControl:
    basic:
      count: 1
      passing: 1
      file: nodes/machineGroupControl/test/basic/
      date: 2026-04-07
    integration:
      count: 3
      passing: 3
      file: nodes/machineGroupControl/test/integration/
      date: 2026-04-07
    edge:
      count: 1
      passing: 1
      file: nodes/machineGroupControl/test/edge/
      date: 2026-04-07

# ── METRICS ──
metrics:
  optimization_gap_to_brute_force:
    value: "0.1% max"
    source: distribution-power-table.integration.test.js
    date: 2026-04-07
    conditions: "3 pumps, 1000-step brute force, 0.05% flow tolerance"
  optimization_time_median:
    value: "0.027-0.153ms"
    source: benchmark script
    date: 2026-04-07
    conditions: "3 pumps, 6 combinations, BEP-Gravitation + refinement"
  pump_switching_stability:
    value: "1-2 transitions across 5-95% demand"
    source: stability sweep
    date: 2026-04-07
    conditions: "2% demand steps, both ascending and descending"
  pump_curves:
    H05K-S03R:
      pressure_levels: 33
      pressure_range: "700-3900 mbar"
      flow_range: "28-227 m3/h (at 2000 mbar)"
      data_points_per_level: 5
      anomalies_fixed: 3
      date: 2026-04-07
    C5-D03R-SHN1:
      pressure_levels: 26
      pressure_range: "400-2900 mbar"
      flow_range: "6-53 m3/h"
      data_points_per_level: 5
      non_convex: true
      date: 2026-04-07

# ── DISPROVEN CLAIMS ──
disproven:
  ncog_proportional_weight:
    claimed: "Distributing flow proportional to NCog weights is optimal"
    claimed_date: 2026-04-07
    disproven_date: 2026-04-07
    evidence_for: "Simple implementation in calcBestCombination"
    evidence_against: "Starves small pumps (NCog=0 gets zero flow), overloads large pumps at high demand. BEP-target + scale is correct approach."
    root_cause: "NCog is a position indicator (0-1 on flow range), not a distribution weight"
  efficiency_rounding:
    claimed: "Math.round(flow/power * 100) / 100 preserves BEP signal"
    claimed_date: pre-2026-04-07
    disproven_date: 2026-04-07
    evidence_for: "Removes floating point noise"
    evidence_against: "In canonical units (m3/s and W), Q/P ratio is ~1e-6. Rounding to 2 decimals produces 0 for all points. NCog, cog, BEP all became 0."
    root_cause: "Canonical units make the ratio very small — rounding destroys the signal"
  equal_marginal_cost_optimal:
    claimed: "Equal dP/dQ across pumps guarantees global power minimum"
    claimed_date: 2026-04-07
    disproven_date: 2026-04-07
    evidence_for: "KKT conditions for convex functions"
    evidence_against: "C5 pump curve is non-convex (dP/dQ dips from 1.3M to 453K then rises). Sparse data (5 points) causes spline artifacts."
    root_cause: "Convexity assumption fails with interpolated curves from sparse data"

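The `efficiency_rounding` claim is easy to reproduce: in canonical units (m3/s, W) the Q/P ratio is on the order of 1e-6, so rounding to two decimals flattens every point to zero. A minimal sketch with illustrative numbers (not the actual curve data):

```javascript
// Canonical units: flow in m3/s, power in W (illustrative values only).
const points = [
  { flow: 0.02, power: 21000 },
  { flow: 0.04, power: 38000 },
  { flow: 0.06, power: 60000 },
];

// The disproven approach: round the efficiency ratio to 2 decimals.
// Q/P is ~1e-6 here, so every rounded value collapses to 0.
const rounded = points.map(p => Math.round(p.flow / p.power * 100) / 100);
console.log(rounded); // [ 0, 0, 0 ]

// The fix: keep the raw ratio; relative differences between points survive.
const raw = points.map(p => p.flow / p.power);
console.log(raw.every(r => r > 0)); // true
```

This is why NCog, cog, and BEP all became 0 until the rounding was removed.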
# ── PERFORMANCE ──
performance:
  mgc_optimization:
    median_ms: 0.09
    p99_ms: 0.5
    tick_budget_pct: 0.015
    source: benchmark script
    date: 2026-04-07
  predict_y_call:
    complexity: "O(log n), ~O(1) for 5-10 data points"
    source: predict_class.js

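The O(log n) figure for `predict_y_call` comes from bracketing the query point in a sorted data set before evaluating the curve. A minimal sketch, assuming a linear-interpolation stand-in (`predictY` here is hypothetical; the real predict class evaluates splines over the bracketed interval):

```javascript
// Sketch of an O(log n) curve lookup: binary-search the bracketing
// interval in sorted (x, y) points, then interpolate linearly.
function predictY(points, x) {
  let lo = 0, hi = points.length - 1;
  if (x <= points[0].x) return points[0].y;   // clamp below the data range
  if (x >= points[hi].x) return points[hi].y; // clamp above the data range
  while (hi - lo > 1) {                       // O(log n) bisection
    const mid = (lo + hi) >> 1;
    if (points[mid].x <= x) lo = mid; else hi = mid;
  }
  const a = points[lo], b = points[hi];
  return a.y + (b.y - a.y) * (x - a.x) / (b.x - a.x);
}

const curve = [
  { x: 0, y: 0 }, { x: 10, y: 5 }, { x: 20, y: 8 }, { x: 30, y: 9 },
];
console.log(predictY(curve, 15)); // 6.5
```

With 5-10 data points per pressure level, the bisection terminates in 2-3 steps, which is why the cost is effectively constant in practice.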
# ── ARCHITECTURE ──
architecture:
  canonical_units:
    pressure: Pa
    flow: "m3/s"
    power: W
    temperature: K
  output_units:
    pressure: mbar
    flow: "m3/h"
    power: kW
    temperature: C
  node_count: 13
  submodules: 12

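The two unit tables above imply fixed conversion factors, and getting them wrong is exactly what caused the `flowmovement_unit_mismatch` bug below. A minimal sketch (the `toOutput` helper is illustrative; the node itself uses `_canonicalToOutputFlow()` and friends):

```javascript
// Canonical -> output unit conversions implied by the tables above.
// (Illustrative helper, not the node's actual API.)
const toOutput = {
  pressure: pa => pa / 100,       // Pa -> mbar (1 mbar = 100 Pa)
  flow: m3s => m3s * 3600,        // m3/s -> m3/h
  power: w => w / 1000,           // W -> kW
  temperature: k => k - 273.15,   // K -> C
};

console.log(toOutput.flow(0.05));       // 180 (m3/h)
console.log(toOutput.pressure(200000)); // 2000 (mbar)
```

The factor-of-3600 gap between m3/s and m3/h explains the symptom of the bug: a demand sent in canonical units looked tiny in output units, so every pump stayed at minimum.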
# ── BUGS FIXED ──
bugs_fixed:
  flowmovement_unit_mismatch:
    severity: critical
    description: "machineGroupControl sent flow in canonical (m3/s) but rotatingMachine flowmovement expected output units (m3/h). Every pump stayed at minimum."
    fix: "_canonicalToOutputFlow() conversion before all flowmovement calls"
    commit: d55f401
    date: 2026-04-07
  emergencystop_case:
    severity: critical
    description: "specificClass called executeSequence('emergencyStop') but config key was 'emergencystop'"
    fix: "Lowercase to match config"
    commit: 07af7ce
    date: 2026-04-07
  curve_data_anomalies:
    severity: high
    description: "3 flow values leaked into power column in hidrostal-H05K-S03R.json at pressures 1600, 3200, 3300 mbar"
    fix: "Linearly interpolated correct values from adjacent levels"
    commit: 024db55
    date: 2026-04-07
  efficiency_rounding:
    severity: high
    description: "Math.round(Q/P * 100) / 100 destroyed all NCog/BEP calculations"
    fix: "Removed rounding, use raw ratio"
    commit: 07af7ce
    date: 2026-04-07
  absolute_scaling_bug:
    severity: high
    description: "handleInput compared demandQout (always 0) instead of demandQ for max cap"
    fix: "Reordered conditions, use demandQ throughout"
    commit: d55f401
    date: 2026-04-07

# ── TIMELINE ──
timeline:
  - {date: 2026-04-07, commit: 024db55, desc: "Fix 3 anomalous power values in hidrostal curve"}
  - {date: 2026-04-07, commit: 07af7ce, desc: "rotatingMachine production hardening: safety + prediction + 43 tests"}
  - {date: 2026-04-07, commit: d55f401, desc: "machineGroupControl: unit fix + refinement + stability tests"}
  - {date: 2026-04-07, commit: fd9d167, desc: "Update EVOLV submodule refs"}
wiki/log.md (new file, 11 lines)
@@ -0,0 +1,11 @@
---
title: Wiki Log
---

# Wiki Log

## [2026-04-07] Wiki initialized | Full codebase scan + session findings
- Created overview, metrics, knowledge graph from production hardening session
- Architecture pages: 3D pump curves, group optimization
- Findings: BEP-Gravitation proof, NCog behavior, curve non-convexity, switching stability
- Session log: 2026-04-07 production hardening
Some files were not shown because too many files have changed in this diff.