During the panel discussion at Node-RED Con 2025 it was called out that a way to 'simulate' would be useful. That got me thinking about Steinberg's VST standard for MIDI / analogue device emulations.
'The Gist': a device-simulation interface for Node-RED, a standard "plugin" model (like VSTs for digital audio workstations) that lets the community and manufacturers simulate or emulate hardware devices for testing, demos, and digital twin workflows before connecting to the real device/software interface.
Napkin Pitch
Maybe Node-RED could treat virtual devices the same way digital audio workstations treat VST plugins.
You drop in a simulated sensor, PLC, or hardware controller, or even a third-party software emulator, instead of the 'real' one. The demo flow runs, tests pass, and manufacturers can demo what can be done with their device.
When the real hardware arrives, or you want to deploy against a real device, you just swap the node and go live. It's the same flow and the same logic; only the connection changes (in theory).
This feature (standard?) could open the door to a new ecosystem of virtual devices and digital twins for Node-RED.
Bonus: visually show the device so you can see what is triggering, which inputs/outputs and switches are active, and which LEDs are lit, and simulate 'connecting' the device to demonstrate don't-dos or alternative setups.
It would indeed be great to see manufacturers producing virtual devices. Though I'd go further and recommend that they produce Node.js-compatible virtual devices that can then be consumed by Node-RED. That would be more of a "sell" for them.
However, I don't think this really needs anything special on the Node-RED side? It is already capable of having nodes that provide inputs to flows, controlled either by a flow or by external events. So really, all this needs is for someone to start building the virtual devices.
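As a sketch of that point: `RED.nodes.createNode` and `RED.nodes.registerType` are Node-RED's standard node-authoring API, so a virtual-device node needs nothing special. The node name "sim-sensor" and its config options are hypothetical, and the tiny mock runtime at the bottom exists only so the snippet runs standalone, outside Node-RED.

```javascript
// Sketch of a virtual-device node using Node-RED's standard node-authoring
// API (RED.nodes.createNode / RED.nodes.registerType). The "sim-sensor"
// name and its min/max config options are hypothetical.
function simSensorModule(RED) {
  function SimSensorNode(config) {
    RED.nodes.createNode(this, config);
    const node = this;
    const min = config.min ?? 18;
    const max = config.max ?? 24;
    // Reply to any incoming message with a simulated reading, just as a
    // "real" sensor node would reply with a hardware reading.
    node.on('input', (msg) => {
      msg.payload = { temperature: min + Math.random() * (max - min) };
      node.send(msg);
    });
  }
  RED.nodes.registerType('sim-sensor', SimSensorNode);
}

// --- Tiny mock of the Node-RED runtime, only so this sketch runs standalone ---
const registry = {};
const RED = {
  nodes: {
    createNode(node /*, config */) {
      node.handlers = {};
      node.sent = [];
      node.on = (event, fn) => { node.handlers[event] = fn; };
      node.send = (msg) => node.sent.push(msg);
    },
    registerType(name, ctor) { registry[name] = ctor; },
  },
};

simSensorModule(RED);
const sensor = new registry['sim-sensor']({ min: 20, max: 21 });
sensor.handlers['input']({ payload: 'read' });
```

Swapping to real hardware would then mean registering a node with the same message contract, backed by the actual driver instead of `Math.random()`.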
So contrary to my previous statement, this probably would need something extra after all: namely, if you wanted to provide "real-time" control over the inputs, as opposed to the deploy-time control currently provided in the Editor.
Some kind of standard for building the controller interface, perhaps? At the moment we have generic UIs that can be created via UIBUILDER, Dashboard, or http-in/http-response node pairs, but nothing that would easily build and present a controller interface explicitly.
UIBUILDER would be the closest of the current offerings, since it allows single-page, multi-page, and multi-instance setups, each with its own comms interface and support for external libraries. It isn't focused specifically on controller interfaces, though it could certainly provide such things. UIBUILDER's low-code interface could facilitate conversion of a standardised schema for describing control UIs, and dedicated or generic uibuilder nodes could certainly be created to accommodate this. But work would still be needed to define the schema (unless someone knows of something that already exists?).
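To make the "standardised schema" idea concrete, here is one entirely hypothetical shape such a descriptor might take; none of these field names come from UIBUILDER, Dashboard, or any existing specification. The idea is that a generic renderer walks the descriptor and emits one widget per entry, while the comms layer subscribes to the listed topics.

```javascript
// An entirely hypothetical controller-interface descriptor: a declarative
// description of a device's controls and indicators that a generic UI
// builder could render. No field name here comes from an existing spec.
const controllerSchema = {
  device: 'sim-plc-01',
  controls: [
    { id: 'run', type: 'toggle', label: 'Run/Stop', topic: 'plc/run' },
    { id: 'setpoint', type: 'slider', label: 'Setpoint', min: 0, max: 100, topic: 'plc/setpoint' },
  ],
  indicators: [
    { id: 'fault', type: 'led', label: 'Fault', color: 'red', topic: 'plc/fault' },
  ],
};

// A renderer would emit one widget per entry; here we just collect the
// topics a comms layer would need to subscribe to.
const topics = [...controllerSchema.controls, ...controllerSchema.indicators]
  .map((item) => item.topic);
```

Defining and agreeing on something like this schema, rather than any runtime code, is probably where the real standardisation work would lie.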