Deployment management

Hey all, I’m wondering what solutions people are using for managing deployments on multiple systems that need to run the same project. Ideally I’d like to be able to remotely deploy a complete configuration to a new machine without needing to configure Node Red and dependent modules. Or at minimum be able to deploy a new version of my NR code to an existing machine. I’ve seen the qbee.io software which looks good but possibly overkill for what I want and I thought there might be some open source tools to achieve a similar outcome.

I was about to start working on a set of scripts to do what I need but thought that someone must have built this wheel already. Thanks for any advice!

I keep the Node-RED flows, settings etc. in GitHub repositories and deploy using Ansible.
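For anyone curious what that looks like in practice, here is a hedged sketch using Ansible ad-hoc commands to pull a flows repository onto a group of Pis and rebuild the nodes. The inventory group name "pis", the repo URL, and the service name are my assumptions, not details from Colin's setup:

```shell
# Pull the flows repo into the Node-RED user directory on every Pi in the "pis" group
ansible pis -m git -a "repo=https://github.com/example/my-flows.git dest=/home/pi/.node-red version=main"

# Reinstall the extra nodes listed in package.json
ansible pis -m command -a "chdir=/home/pi/.node-red npm install"

# Restart the Node-RED service (become root on the targets)
ansible pis -b -m systemd -a "name=nodered state=restarted"
```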

Thanks Colin. I'm using Projects and a GitHub repository, just finding it cumbersome to configure all the necessary nodes and static dependencies for each deployment. I've started using a sledgehammer approach: back up the entire .node-red folder and copy it to a new RPi device. Any downside to this approach? I've had some problems rebuilding nodes like SQLite when copying between an RPi 3 and an RPi 4.

What exactly do you mean? Running npm install from the .node-red folder should do most of what you need, though I don't use Projects so maybe that is not right. I use git but not with the Projects feature. There ought to be some docs for the feature telling you how to move a project from one machine to another. Perhaps someone else could comment here, as my assumptions about what Projects can do may be wrong.

It may fail if the version of Node.js is not the same, or if anything about the hardware or OS environment is different. It is generally better not to copy node_modules but to rebuild it by running npm install from the .node-red folder.
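A minimal sketch of that rebuild, assuming the standard ~/.node-red user directory:

```shell
cd ~/.node-red
rm -rf node_modules     # discard binaries built for the old hardware
npm install             # reinstall everything declared in package.json
npm rebuild             # force native addons (e.g. sqlite3) to recompile
```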

A solution I tried: port my Node-RED into Docker and create my own fully customised image.
Then push it to a private Docker Hub repository.
In one of your application's flows, you run a repetitive check (see below), say once per day, against a remote site to see whether an update is ready.
If the flag is set to Yes, a script is triggered on each remote Pi to download your image and thus transparently update the machines' containers.
Be careful: you have to create an image per hardware architecture! But it's simple and fast.
The same machinery applies if you use Grafana, InfluxDB, Mosquitto, etc.
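The update script that flow would trigger could be as small as the following sketch; the image name, container name, and volume path are placeholders, not details from the original setup:

```shell
#!/bin/sh
# Pull the latest image and recreate the Node-RED container from it
IMAGE=registry.example.com/my-nodered:latest
docker pull "$IMAGE"
docker stop nodered && docker rm nodered
docker run -d --name nodered --restart unless-stopped \
  -p 1880:1880 \
  -v /home/pi/node-red-data:/data \
  "$IMAGE"
```

Mapping a host directory to /data is also what keeps flows and credentials outside the container, so an update doesn't lose them.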

There is also the "Watchtower" container, which watches the other containers on the Pi, checks whether more recent versions exist, and updates them by itself or after confirmation.
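If you want to try it, the usual way to run Watchtower is to hand it the Docker socket; the daily check interval here is just an example value:

```shell
docker run -d --name watchtower \
  -v /var/run/docker.sock:/var/run/docker.sock \
  containrrr/watchtower --interval 86400
```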

I wanted to go all the way with this idea, but physical devices connected via the USB port can malfunction when they are re-mapped into the containers. I've seen this especially with the serial node: I was getting erroneous values randomly, not often but enough to skew a graph.

As for the user's private data, if there is any, it is very simple to put it in a directory on the Pi and map that into the container. That way, updating the container does not lose the personal data.

In this scenario everything is free software, but obviously not as robust as a professional solution.

Hey @Colin, I've heard about Ansible. Is it paid software?
Is it some kind of container orchestrator?
I'm curious to know more about it.

Flow upgrade for example:

npm install doesn't seem to work properly. After running it I still get a notification in Node-RED that several nodes are not found. I tracked one problem down to SQLite not rebuilding properly.

I realise that this isn't enough information, so I'm going to run a few more tests on a clean install to see what exactly the problem is so I can report back.

Thanks Jean-Luc, I'll give that a try!
Red Hat Ansible looks good, but I think it's quite expensive.
qbee.io looks very good for device deployment and management, but it's also fairly expensive.

Is everything paid?
If you find elegant solutions, please share; it's an interesting topic.

Ansible is free. It is available in Ubuntu and Raspbian via the apt repositories. I guess it can be installed on Windows, but I don't use Windows. It only needs to be installed on the central machine controlling the deployment, it drives the machines receiving the deployment via ssh.
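Getting started is quick. A sketch for a Debian-based controlling machine, with an example Pi address:

```shell
sudo apt install ansible
echo "pi@192.168.1.50" > hosts    # one target per line; the address is an example
ansible all -i hosts -m ping      # confirms Ansible can reach the Pis over ssh
```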
https://www.guru99.com/ansible-tutorial.html

Thanks @Colin for the link and clarification.
I just skimmed the text.
I see that the SSH port 22 must be forwarded via NAT through the client's router, otherwise there is no remote management.
That is not the case with HTTP.
Ansible has no particular link with Docker containers; I thought they went together and that it was made for that. One can have preconceived ideas.

Sounds good. I didn't research properly to find the open-source version; I had only seen the Red Hat (paid) version. I'll give it a try.

That is one way of doing it, if you need remote management. There are others such as using a VPN, and it doesn't have to be port 22.

How do you do it with HTTP without opening a port?

I am not sure what you mean by that. What sort of link are you looking for? I use it for deploying containers without noticeable problems.

Port 80 (HTTP) is open by default in all routers/firewalls and consumer internet boxes.
This is what allows unrestricted web browsing.
It is obviously not advisable to do the same with the SSH port, which gives remote access to the OS over the internet, even if you change the port number.
On my side I was looking for a simple solution, and I found that the container update (docker pull) is done over HTTP (or HTTPS if encrypted).
This allows plug and play: the user receives his Pi configured by us and has nothing more to do.
It is the Pi, via the dedicated flow, that polls the central point, not the central point that pushes.
It was my idea, but I didn't go further; I tested the concept and that's all.
Obviously, a dedicated administration tool with dedicated infrastructure will do a lot more. But that requires other skills.
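The daily check itself could be a small poll like this sketch; the URL, version file, and update script name are invented for illustration:

```shell
#!/bin/sh
# Ask the central point (over HTTPS) whether a newer version is published
AVAILABLE=$(curl -fsS https://updates.example.com/nodered/version) || exit 0
CURRENT=$(cat /home/pi/.nodered-version 2>/dev/null || echo none)
if [ "$AVAILABLE" != "$CURRENT" ]; then
  # Run the container-update script, then record the new version on success
  /home/pi/update-container.sh && echo "$AVAILABLE" > /home/pi/.nodered-version
fi
```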

No ports will be open by default for incoming connections, which is what would be needed for remote deployment.


That's why I was thinking more about pulling the update from the Pi or other computer (by checking a URL to see whether an update is available), as opposed to pushing it from the central point.
