Relocate prediction/ML modules to external service #28

Open
opened 2026-03-12 08:36:54 +00:00 by vps1_gitea_admin · 0 comments

The TensorFlow-based influent prediction code was removed from the monster node because it was broken and incomplete. The prediction functionality needs a new home.

Details:

  • LSTM model for 24-hour flow prediction based on precipitation data
  • Standardization constants: hours (mean=11.504, std=6.922), precipitation (mean=0.090, std=0.439), response (mean=1188.01, std=1024.19)
  • Model was served from http://127.0.0.1:1880/generalFunctions/datasets/lstmData/tfjs_model/
  • Consider: separate microservice, Python-based inference, or ONNX runtime
  • Monster node should accept predictions via model_prediction message topic from external service

Related files removed: monster_class.js methods get_model_prediction(), model_loader()
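
A minimal sketch of how an external service could apply the standardization constants above and hand a prediction back to the monster node on the `model_prediction` topic. All function names and the message shape here are assumptions for illustration; they are not part of the removed code:

```javascript
// Standardization constants from this issue (z-score normalization).
const STATS = {
  hours:         { mean: 11.504,  std: 6.922 },
  precipitation: { mean: 0.090,   std: 0.439 },
  response:      { mean: 1188.01, std: 1024.19 },
};

// Standardize a raw feature value before feeding it to the model.
function standardize(feature, value) {
  const { mean, std } = STATS[feature];
  return (value - mean) / std;
}

// Invert standardization on the model's output to recover a flow value.
function destandardize(feature, z) {
  const { mean, std } = STATS[feature];
  return z * std + mean;
}

// Hypothetical message an external service might publish so the monster
// node can consume it via the `model_prediction` topic.
function buildPredictionMessage(standardizedOutputs) {
  return {
    topic: 'model_prediction',
    payload: standardizedOutputs.map((z) => destandardize('response', z)),
  };
}
```

With this split, the monster node only parses `model_prediction` messages and never loads TensorFlow itself, so the inference backend (TF.js microservice, Python, or ONNX runtime) can be swapped without touching `monster_class.js`.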
