Idea for FlowFuse + Node-RED + XR/AR glasses for Field Agents

Can FlowFuse create QR codes for registered devices? If it doesn't currently, maybe it's a feature worth considering soon, given a future where spatial computing is used to interact with edge devices.

Another feature I thought of is the ability to place virtual markers in the environment for each device, so users can quickly locate them in the factory or wider environment through the glasses.

Does this make sense? I used ChatGPT to help me make my idea more cohesive. I had this idea because I remember Salesforce created solutions for field agents using a mobile app. Maybe FlowFuse could similarly empower field agents using AR.

Although FlowFuse and Node-RED are built to enable remote maintenance of edge devices, there are many scenarios where on-site access is still necessary — whether due to connectivity issues, security restrictions, or the need for physical interaction with hardware. That’s where augmented reality (AR) can take edge management to the next level.

Imagine a field technician approaching an edge device. A QR code, generated by FlowFuse, is attached to the device. When scanned or simply recognized by the technician’s AR glasses, a floating prompt appears:
“Would you like to access this device?”

By saying “Yes” or tapping a virtual button, the technician instantly sees the Node-RED editor open in their field of view — right there on-site, hands-free, and context-aware. They can inspect, adjust, or deploy flows while standing next to the physical hardware.
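For anyone curious what the generation side could look like, here is a minimal sketch using the open-source qrcode npm package. As far as I know FlowFuse doesn't expose this today, so the device shape and the editor URL pattern below are hypothetical.

```ts
// Minimal sketch: print-ready QR codes for registered devices.
// Uses the open-source "qrcode" npm package; the URL pattern below is a
// hypothetical example, not a documented FlowFuse API.
import QRCode from 'qrcode';

interface RegisteredDevice {
  id: string;        // device identifier (assumed shape)
  editorUrl: string; // URL the AR client opens when the code is scanned
}

// Encode the device's editor URL into a printable QR code (PNG file).
async function generateDeviceQr(device: RegisteredDevice): Promise<void> {
  await QRCode.toFile(`qr-${device.id}.png`, device.editorUrl, {
    errorCorrectionLevel: 'H', // high redundancy survives scuffed labels
    margin: 2,
  });
}

generateDeviceQr({
  id: 'device-001',
  editorUrl: 'https://example.flowfuse.cloud/device/device-001/editor', // assumed
}).catch(console.error);
```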

Going further, the AR interface could support visualizing Dashboard v2 — allowing technicians to monitor live sensor data, device health, and performance metrics directly within their environment. Charts, logs, and alerts could float beside the equipment they relate to, creating an intuitive digital overlay of the device's internal state.
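As a rough illustration of how such an overlay could be built with web tech, here is a sketch using three.js that floats a live "device health" panel in space. It draws readings onto a CanvasTexture, since DOM or iframe overlays generally can't be composited inside an immersive WebXR session. The telemetry endpoint, payload shape, and anchor position are all assumptions.

```ts
// Sketch: a floating device-health panel drawn to a CanvasTexture so it can
// be rendered inside an immersive WebXR session. Endpoint and payload shape
// are assumed; AR session start-up (e.g. three.js's ARButton addon) is omitted.
import * as THREE from 'three';

const canvas = document.createElement('canvas');
canvas.width = 512;
canvas.height = 256;
const ctx = canvas.getContext('2d')!;
const texture = new THREE.CanvasTexture(canvas);

// A 0.5 m x 0.25 m panel at a hypothetical anchor beside the machine.
const panel = new THREE.Mesh(
  new THREE.PlaneGeometry(0.5, 0.25),
  new THREE.MeshBasicMaterial({ map: texture, transparent: true })
);
panel.position.set(0.6, 1.4, -1.0);

function drawReading(tempC: number): void {
  ctx.fillStyle = 'rgba(0, 0, 0, 0.7)';
  ctx.fillRect(0, 0, canvas.width, canvas.height);
  ctx.fillStyle = tempC > 80 ? '#ff5252' : '#69f0ae'; // red above 80 °C
  ctx.font = '48px sans-serif';
  ctx.fillText(`Temp: ${tempC.toFixed(1)} °C`, 24, 80);
  texture.needsUpdate = true; // re-upload the canvas to the GPU
}

// Hypothetical live feed, e.g. a websocket-out node in the device's flow.
const ws = new WebSocket('wss://example.flowfuse.cloud/device-001/telemetry');
ws.onmessage = (event) => drawReading(JSON.parse(event.data).temperature);

const scene = new THREE.Scene();
scene.add(panel);
const camera = new THREE.PerspectiveCamera(70, innerWidth / innerHeight, 0.01, 20);
const renderer = new THREE.WebGLRenderer({ antialias: true, alpha: true });
renderer.setSize(innerWidth, innerHeight);
renderer.xr.enabled = true; // let WebXR drive the camera pose
document.body.appendChild(renderer.domElement);
renderer.setAnimationLoop(() => renderer.render(scene, camera));
```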

Benefits:

  • Faster field diagnostics: Technicians can act immediately without needing to contact remote teams or open a laptop.

  • Enhanced situational awareness: Real-time data appears in spatial context, helping connect physical issues to digital flows.

  • Hands-free interaction: Voice commands and gaze-based UI reduce friction during repairs or inspections.

  • Support for restricted environments: Enables flow access in places without remote connectivity or where security limits external access.

  • Training and onboarding: AR can assist less experienced staff by visually guiding them through flows, errors, and possible fixes.

In short, while FlowFuse powers scalable, remote edge management, AR brings that power to the field — combining real-time context, flow editing, and live dashboards into a single immersive experience.

What glasses do you use? In the future I'd like to try building something based on three.js, but I can't get the latest version to work with Dashboard 2.0 (like https://aaqudemo1.bieda.it/).

I have a Pico 4, but its pass-through isn't suitable for industrial floors: it's blurry, has too much distortion, and doesn't work in low-light conditions. I was thinking about this feature for future smart glasses built for industrial applications with XR/AR/VR capabilities. I believe a future where field agents use smart glasses with XR capabilities is not distant, and FlowFuse could take advantage of it by building one of the first enterprise applications for smart factories.

A few years before I left IBM Research we did explore using AR and VR for field engineering diagnostics and interaction, much as you describe - to scan or recognise a piece of equipment and from that to offer up things like the manuals / common faults / safety status, so you could check current "live" status, flow rates, temperatures, etc. - and also to "see through walls" to trace pipes. Of course we did use Node-RED to create some of the interactions and "live" demonstrations - and some folk did look at exposing Node-RED inside things like the HoloLens - but we found that exposing the flow was not really useful: too complex, and it took up too much eye-space. Much better to just expose a suitable context-specific UI - especially if it could be anchored to the real world, i.e. overlay / highlight / point at the relevant real control button or switch.
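To make that last point concrete, here is a minimal three.js sketch of the "highlight the real control" idea. The anchor position is hard-coded as an assumption; a real system would get it from image or object tracking.

```ts
// Sketch: a pulsing ring overlaid on the physical button's (assumed) tracked
// position, instead of rendering the whole flow into the wearer's eye-space.
import * as THREE from 'three';

const ring = new THREE.Mesh(
  new THREE.RingGeometry(0.04, 0.05, 32),
  new THREE.MeshBasicMaterial({ color: 0xffc107, side: THREE.DoubleSide })
);
ring.position.set(0.2, 1.1, -0.8); // assumed position of the real switch

const clock = new THREE.Clock();
// Call from the render loop, e.g. renderer.setAnimationLoop(() => pulse());
function pulse(): void {
  // Oscillate between 1x and 1.5x scale to draw the technician's eye.
  const s = 1 + 0.25 * (1 + Math.sin(clock.getElapsedTime() * 4));
  ring.scale.setScalar(s);
}
```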

What about creating and exposing Dashboard XR widgets?

It would be so cool to control stuff without actually spending money on real-world parts. For example, instead of a physical display or buttons, they could be virtualized as Dashboard XR widgets, rendered once a person wearing XR/AR-capable smart glasses gets close to a device managed by FlowFuse (see the sketch below). I think virtualizing physical controls with XR/AR could cut costs when designing smart factories.
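A minimal sketch of that proximity gating, assuming three.js, a known world anchor for the device, and an arbitrary 2 m threshold:

```ts
// Sketch: show a virtual widget only when the wearer is near the device.
// The anchor position and the 2 m threshold are assumptions; the widget can
// be any THREE.Object3D (a panel, a button mesh, etc.).
import * as THREE from 'three';

const SHOW_WITHIN_METRES = 2;

function updateWidgetVisibility(
  widget: THREE.Object3D,
  deviceAnchor: THREE.Vector3,
  camera: THREE.Camera
): void {
  // In WebXR the camera pose tracks the headset, so its position ≈ the user.
  const userPos = new THREE.Vector3();
  camera.getWorldPosition(userPos);
  widget.visible = userPos.distanceTo(deviceAnchor) < SHOW_WITHIN_METRES;
}
// Call once per frame from the render loop for each managed device's widget.
```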

This is an example of virtualizing real-world parts. In this video the user doesn't need to buy any of those arcade machines to have a very similar "experience".

The idea is to achieve the same for industrial floors: virtualize things to reduce expenses and complexity.

What I'm trying to achieve is exactly what @AllanOricil demonstrated in the video.

I'm considering getting the XREAL Air 2 Ultra glasses. They supposedly have cameras and support for 6 degrees of freedom, which sounds promising—but I'm holding off for now, hoping some cheaper options will pop up locally.

Has anyone here actually tried them? Is it possible to use the XREAL NRSDK with these glasses to trigger actions based on the camera input?

These glasses look awesome, but they seem to target B2C. Maybe there is a company working on a B2B solution that could be cheaper to manufacture and satisfy worker-safety rules?

There is a mixed-reality control project for Home Assistant on GitHub that may be of some interest to you.

Just watch out at the cheaper end of the glasses market. We spotted that quite a few actually send rather too much info back to the vendor, which they then sell to trackers. It's been a problem for the NHS.
