I have a Node-RED application that runs on a Raspberry Pi and controls an industrial device. The application is in fact a hybrid: a Python/Flask REST API plus a Node-RED package.
The current deployment flow is quite tedious. Whenever changes are made to the software (on a development machine), they are pushed to a repository using the Projects feature (great feature, by the way).
And that's when it starts getting ugly. To deploy the changes to a machine/SD card, the card is removed from the Pi (the dev machine) and cloned (an image is made), the image is reduced in size and transferred (we're talking about 7 GB right now) to another department, which takes the SD card out of the machine we want to upgrade, burns the new image onto it and re-inserts it.
I was wondering if someone has a similar application and has found a more elegant solution.
I've been contemplating remotely connecting to the machine that needs to be upgraded and transferring the .node-red directory over, but I feel this might be a source of other issues.
Can't you just pull from GitHub on the destination and restart Node-RED?
If additional nodes have been installed, then you would also have to run npm install before the restart.
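Something along these lines would do it (a rough sketch in Python, assuming the repo is checked out in ~/.node-red — with the Projects feature it may actually live under ~/.node-red/projects/&lt;name&gt; — and that Node-RED runs as the `nodered` systemd service created by the standard Pi install script; adjust paths and names to your setup):

```python
#!/usr/bin/env python3
"""Sketch of the "pull and restart" deploy on the destination Pi.
Assumes internet (or at least git-remote) access from that machine."""
import subprocess
from pathlib import Path

PROJECT_DIR = Path.home() / ".node-red"   # assumption: repo / package.json live here

def run(cmd, cwd=PROJECT_DIR):
    """Echo the command, run it, and stop on the first failure."""
    print("+", " ".join(cmd))
    subprocess.run(cmd, cwd=cwd, check=True)

run(["git", "pull", "--ff-only"])   # bring flows/settings up to date
run(["npm", "install"])             # only needed if extra nodes were added
run(["sudo", "systemctl", "restart", "nodered"], cwd=None)  # assumed service name
```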
Our development machines are connected to the internet, so that's possible on the dev side. However, the production machines are not (for now), for security reasons.
We use Ethernet cables to connect to the production machines (for monitoring, if needed), and Wi-Fi is deactivated.
So basically what you're saying is that I could transfer the .node-red directory over to that machine (using scp or another shell script), run npm install in that directory, and I should be good?
No, npm install needs internet access in order to fetch the nodes to install. However, if your development machine has the same processor architecture, toolset (Node.js version, etc.) and OS, then transferring the whole .node-red folder is all you need to do. The installed nodes live in the node_modules folder, so they will get transferred too.
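A minimal sketch of that offline approach (in Python, assuming you can ssh into the production Pi over the direct Ethernet link; the IP address and the `nodered` service name are made-up examples, so adjust to your setup):

```python
#!/usr/bin/env python3
"""Mirror the whole .node-red folder (node_modules included) from the
dev machine to the production Pi, then restart Node-RED there."""
import subprocess

PROD_HOST = "pi@192.168.10.50"   # hypothetical address on the service Ethernet link
SRC = "/home/pi/.node-red/"      # trailing slash: copy the contents, not the folder itself
DEST = f"{PROD_HOST}:/home/pi/.node-red/"

# --archive preserves permissions, --delete keeps the target in sync with dev
subprocess.run(["rsync", "--archive", "--delete", "--compress", SRC, DEST], check=True)

# restart Node-RED on the target so it picks up the new flows and nodes
subprocess.run(["ssh", PROD_HOST, "sudo systemctl restart nodered"], check=True)
```

You would probably want to stop Node-RED on the target before syncing (or at least restart it afterwards, as above) and keep a copy of the previous .node-red folder so you can roll back if the new version misbehaves.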