Help: Node-RED in Docker swarm

I have searched the forum and not found what I am looking for, so apologies if this has already been answered.
I have read the Docker help pages provided by Node-RED (not saying I completely understand everything).
If you ever get to the bottom of this essay you may say to yourself "why do this?"
Mostly for fun and learning. If Node-RED was down, would it really matter? Answer: of course not.

I'm new to Node-RED and Docker, so this may be a silly question.

I have Node-RED working fine in my Docker swarm.
I have installed extra nodes (influxdb, dashboard, email, ...) and they all work.
Now I want to make Node-RED resilient, so I tried using a bind mount to the Docker host.
That is all fine as long as the Node-RED container stays on that host; if it starts on another host, the flows and nodes are lost. (I have the flows backed up.)

"get to the question!" i hear you say

My other containers all use a bind mount to a CIFS-mounted folder on the three Docker nodes.
If a Docker node fails, the containers move to another node and continue to work with all data retained.
(Had some trouble, but I managed to sort it.)

So, if I do the same with the Node-RED container, I am unable to install any nodes (dashboard, email, ...) because the install fails with symbolic link errors. You can't use symbolic links on a mounted volume (I believe).

My idea is to create a compose file for Node-RED that installs my required nodes on creation, then use a bind mount to a mounted folder for the flows and settings.json.
That way, when Docker redeploys the service, all the necessary nodes are installed for the flows to work no matter which Docker node the container starts on.

(Question yeah!)
How do I add
npm install node-red-dashboard
npm install node-red-contrib-influxdb
npm install node-red-node-email
etc
to the entrypoint file of the Node-RED container?
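
Something like this small custom image is what I was imagining, rather than editing the entrypoint itself. It's only a rough, untested sketch: the node names are the ones listed above, the tag is just a guess, and I'm assuming the base image's working directory is the Node-RED install so a plain npm install ends up inside the image.

```dockerfile
# Sketch of a custom image that bakes the extra nodes in at build time
# (untested; tag and node list are only what I would try first)
FROM nodered/node-red:latest

# Installing here should make the nodes part of the image itself,
# so they exist on whichever swarm node the container lands on
RUN npm install node-red-dashboard \
                node-red-contrib-influxdb \
                node-red-node-email
```

I believe you would then have to push that image to a registry all the swarm nodes can pull from, since `docker stack deploy` doesn't build images for you.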

Or is there a better way to set up Node-RED to fit a Docker swarm?

Thank you

Resolved for now
So I think the symbolic link issues when installing Node-RED nodes (influxdb etc.) are due to the fact that the CIFS network share is to a Windows server.

I tried a few things tonight, but the best I came up with is this:

I set the "FLOWS" env to a new folder and file, and set up a bind mount to put that file into the container.
Then I created a bind mount for the /data folder, installed Node-RED, and manually installed the nodes (influx, email, dashboard, weather, ...).
Then I scp'd the bind-mount folder from the Docker node the Node-RED container was currently on to the other Docker nodes.
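
For reference, the service definition ended up roughly like the sketch below. This is from memory, so treat it as an illustration only: the /data path is the real one mentioned further down, but the flows folder, the FLOWS value, the port and the compose version are placeholders.

```yaml
version: "3.7"
services:
  nodered:
    image: nodered/node-red          # still the stock image
    environment:
      # point Node-RED at the separately mounted flow file (path is a placeholder)
      - FLOWS=/flows/flows.json
    ports:
      - "1880:1880"
    volumes:
      # /data holds settings.json and the manually installed nodes
      - /storage/docker/volume/nodered:/data
      # separate folder holding just the flows file
      - /storage/docker/volume/nodered-flows:/flows
    deploy:
      replicas: 1
```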

So all Docker nodes have the same files in /storage/docker/volume/nodered.
The container looks there for all its files except the flows.json (I wanted to keep that separate).
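
The copy itself was nothing clever, just something along these lines run from the node the container was on (the hostnames and user are made up):

```bash
# copy the bind-mount folder to the other two swarm nodes
scp -r /storage/docker/volume/nodered user@node2:/storage/docker/volume/
scp -r /storage/docker/volume/nodered user@node3:/storage/docker/volume/
```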

It still uses the default Node-RED Docker container.

tested by "draining" each node in the swarm and waiting for the container to move to a new node then web browse to nodered:1880 again and all flows and nodes are still there

I am not saying this is the best way, far from it, but it's the way I managed to get it to work with my limited knowledge.

Hope this helps someone else

P

