Running Node-RED on a web server instead of a local computer

Hello everyone,
I've searched for this but only come up with solutions to other problems,
so sorry if this is a silly question.
Some here know I'm making a dashboard for running my brewing system.
When it's complete I have to work on the security side of it,
and that's what leads me to these questions.

  1. Can I run Node-RED somewhere in the cloud rather than on a local computer (so I can have logins and such)?
  2. Can such a Node-RED instance be joined with my local one, basically linked and mirroring each action or function?
    (Obviously crap goes down and the internet can drop, and I still need the local system to function without the internet.)
  3. I'm basically looking for a route like this so I don't have to have a bunch of port forwarding happening, leaving me exposed to the world.
    I have seen peeps refer to "Docker", "API" and of course "TLS".
    Is there a starting tutorial somewhere if this is possible, and/or am I overthinking my plans for now?

Yes, FlowForge is one such cloud provider (from the developers of Node-RED).
(Full disclosure, I work for FlowForge too.)

I don't fully understand the question, but there is no reason you could not send communication between them to cause the "same" actions to occur on both.


There is an article in the Node-RED documentation: Running on Amazon Web Services.
Also: Installing Node-RED in an always-free VM on Oracle Cloud.
Edit: the docs also describe MS Azure, IBM Cloud and of course FlowForge.

I've never tried either of these solutions.

I guess you are going to need a cloud-based MQTT broker (as well as a local one?) to exchange data. No doubt all of the suggested solutions could provide that.


A hosted Node-RED is one option you have... all the vendors mentioned above are valid options. I would choose FlowForge (from the Node-RED maintainers).

You also have the option to deploy a Cloudflare Tunnel: this way you expose your local Node-RED instance WITHOUT opening any ports on your home router. With the Cloudflare free tier you get a WAF, security controls, and they take care of renewing SSL certificates... pretty much all you would need.

This video shows you how to set it up for node-red (there's also a playlist with more examples on the channel):
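As a rough sketch of what the tunnel setup looks like (the tunnel ID, hostname and file paths below are placeholders, not values from this thread), a named tunnel is driven by a small config file on the machine running Node-RED:

```yaml
# ~/.cloudflared/config.yml — placeholder values throughout
tunnel: 00000000-0000-0000-0000-000000000000
credentials-file: /home/pi/.cloudflared/00000000-0000-0000-0000-000000000000.json
ingress:
  # Route the public hostname to the local Node-RED editor/dashboard
  - hostname: brewery.example.com
    service: http://localhost:1880
  # cloudflared requires a final catch-all rule
  - service: http_status:404
```

You would create the tunnel with `cloudflared tunnel create <name>`, point DNS at it with `cloudflared tunnel route dns <name> brewery.example.com`, and then start it with `cloudflared tunnel run <name>`.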


Yes, exactly that. Obviously the gauges in the Docker container (is that the right word?)
would have to get info from the local Pi.

Do you have some starter steps for how to get flowing?
The video further down is, I think, ahead of where I need to start.

You can have identical flows running on both systems pulling from the same data source. The point you'll need to work on is making it so that the same data source can be accessed by both systems: either it's hosted locally and the remote system pulls it, hosted remotely and the local system pulls it, or hosted remotely and both systems pull from the same source. If you have the same data for both systems, they'll respond identically, provided you have the same flows on both. I wouldn't worry so much about how you're running Node-RED as about where your data is coming from. The flows will take care of themselves in time.

Yes, the data will have to come from the local Pi and MQTT server.
It's now a matter of wrapping my head around "Docker" and figuring out where to start.
Right now I have to do this on a free budget, so I'm still lost as to which one to use.
Basically I'm wanting the web-based version to clone the local one, so I and others can access it without knowing my local IP, and I don't have to have a bunch of ports open on my network.

Do some research on how Node-Red stores flows. They're just flow files filled with JSON descriptions of the flow. That would be very easy to synchronize to a web drive where you're running Node-Red on the server. Then the flows stay synchronized and it's identical to your local system and nobody has access to your system except where needed for data.
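As a small illustration of that point (the node names and values below are invented, not from this thread), a flows file is just a JSON array of node objects, so the dashboard (`ui_*`) nodes are easy to pick out programmatically:

```python
import json

# A Node-RED flows file is a JSON array of node objects; each node has
# an "id", a "type", and (for flow nodes) a "z" pointing at its tab.
# The data below is a made-up miniature example of that structure.
flows = json.loads("""
[
  {"id": "t1", "type": "tab", "label": "Brewery"},
  {"id": "g1", "type": "ui_gauge", "z": "t1", "name": "Mash temp"},
  {"id": "m1", "type": "mqtt in", "z": "t1", "topic": "brewery/mash/temp"},
  {"id": "f1", "type": "function", "z": "t1", "name": "convert to F"}
]
""")

# Dashboard node types all start with "ui_", so filtering out the part
# you would mirror to a cloud instance is a one-liner:
dashboard_nodes = [n for n in flows if n["type"].startswith("ui_")]
print([n["name"] for n in dashboard_nodes])  # -> ['Mash temp']
```

In practice you'd read the flows file from your Node-RED user directory (typically `~/.node-red`) instead of an inline string.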


A good way to solve this may be to first isolate your dashboard nodes from the rest of the logic using MQTT as the interface. So all dashboard nodes receive data via MQTT which they display, and anything back from the dashboard goes into MQTT. The main flows then receive and send data via MQTT.

If you then set up a remote MQTT broker synchronised with your local one (that capability is built into Mosquitto), then on the remote machine you can run Node-RED, but you only need the dashboard nodes, not the rest of the flows.

Your local and remote dashboards will automatically remain in sync and a GUI user should not notice any difference using the dashboard on the local machine or on the remote (assuming a decent internet connection).
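For what it's worth, the Mosquitto bridging mentioned above is configured on the local broker with just a few lines; here is a minimal sketch, assuming a cloud broker at a placeholder hostname (none of these values come from the thread):

```conf
# Bridge section for the LOCAL broker's mosquitto.conf — hostname,
# credentials and the "brewery/#" topic are placeholders.
connection brewery-bridge
address broker.example.com:8883
bridge_cafile /etc/ssl/certs/ca-certificates.crt
remote_username brewery
remote_password changeme

# Mirror everything under brewery/ in both directions at QoS 1,
# so sensor topics flow up and dashboard control topics flow back down.
topic brewery/# both 1
```

With the bridge in place, the cloud Node-RED's mqtt nodes just point at the cloud broker and see the same topics as the Pi.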


My understanding:
all the nodes in my dashboard that display the temps, volumes and pressure,
any LED that says if something is on or off, and the switches I use to turn things on or off.
So in short:

I would put an MQTT-out node after the set-payload node, directed at the MQTT server, to go to the online one,
and then on the online one it would be a simple MQTT-in node feeding the same data to the gauge I use online.

What do you mean by 'online one'?

Exactly. That's the whole design behind online communications. If you process it once, it only needs to have the results relayed to the remaining clients. Hence why clients can be underpowered Raspberry Pis while servers are big hefty beasts. Process once, display everywhere.

But again, I would do some research on your flows. That way, if you adjust the flow on your local system, it can be synchronized to your remote system. I'm mainly talking about your actual dashboard, and possibly not the processing flows. That keeps your interaction pages identical for a smooth user experience.

@9toejack in case I wasn't clear, I was suggesting first restructuring your flows so that all dashboard flows are separated from your processing flows, with a local MQTT broker joining the processing flows to the dashboard flows. Once that is all working, set up an MQTT broker on your cloud server and configure the local broker to keep the cloud one synchronised. Then run Node-RED on the cloud server, but import just the dashboard flows, and it should all just work.

I think I follow you. When I get back home I'll put together something for one flow and share it.

I read that 5 times and tonight it sank in. So: do all the processing in the Pi part of my flows.
Instead of the dashboard sending its controls straight into the processing, and the processing sending temps straight to the dashboard,
everything from the dashboard telling my system what its parameters are should go via MQTT, and likewise the temperature, pressure and volume conversions should go from the processing to an MQTT message, and the dashboard then gets them from MQTT. That makes it easier to communicate between the 2 MQTT servers, and uses less bandwidth and server power, because the brains and brawn are on the Raspberry Pi side doing all the hard work.
I'm only using the bandwidth to send MQTT messages, and I'm not using much power at the server level.

