I have searched the forum and not found what I am looking for.
Apologies if this has already been answered.
I have read the Docker help pages provided by Node-RED
(not saying I completely understand everything).
If you ever get to the bottom of this essay you may say to yourself, "Why do this?"
Mostly for fun and learning. If Node-RED was down, would it really matter? Answer: of course not.
I'm new to Node-RED and Docker, so this may be a silly question.
I have Node-RED working fine in my Docker swarm.
I have installed extra nodes (InfluxDB, dashboard, email, ...).
They all work.
Now I want to make Node-RED resilient.
So I tried using a "bind mount" to the Docker host.
All fine as long as the Node-RED container stays on that host.
If it starts on another host, the flows and nodes are lost. (I have the flows backed up.)
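For context, the service definition looks roughly like this (the image tag and host path are placeholders, not my exact setup):

```yaml
version: "3.7"
services:
  nodered:
    image: nodered/node-red
    ports:
      - "1880:1880"
    volumes:
      # bind mount to a folder that only exists on this one docker host
      - /home/user/nodered-data:/data
```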
"get to the question!" i hear you say
My other containers all use a bind mount to a CIFS-mounted folder that is mounted on all three Docker nodes.
If a Docker node fails, the containers move to another node and continue to work with all data retained.
(Had some trouble, but managed to sort it.)
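For what it's worth, the share is mounted on each Docker node with an /etc/fstab entry along these lines (server name, share, and credentials file are placeholders):

```
//nas.local/dockerdata  /mnt/dockerdata  cifs  credentials=/root/.smbcredentials,uid=1000,gid=1000  0  0
```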
So if I do the same with the Node-RED container, I am unable to install any nodes (dashboard, email, ...) because the install fails with symbolic link issues. You can't use symbolic links on a CIFS-mounted volume (I believe).
My idea is to create a compose file for Node-RED that installs my required nodes on creation,
then use a bind mount to the CIFS-mounted folder for the flows and settings.json.
That way, when Docker redeploys the service, all the necessary nodes are already installed for the flows to work, no matter which Docker node the container starts on.
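To sketch what I mean, I was picturing building my own image with the nodes baked in, something like this (untested, so the exact npm flags may well be wrong):

```dockerfile
FROM nodered/node-red

# install the extra nodes into the image at build time,
# so they exist no matter which swarm node runs the container
RUN npm install --no-update-notifier --no-fund \
    node-red-dashboard \
    node-red-contrib-influxdb \
    node-red-node-email
```

The swarm service would then use that image, with just the flows and settings bind-mounted to the CIFS folder.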
(Question yeah!)
How do I add
npm install node-red-dashboard
npm install node-red-contrib-influxdb
npm install node-red-node-email
etc.
to the entrypoint file of the Node-RED container?
Or is there a better way to make Node-RED fit a Docker swarm?
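In case it helps show what I mean, this is the sort of thing I was picturing in the stack file, though I have no idea whether overriding the entrypoint like this is sane, or what the correct start command for the official image is (the host path and start command are guesses):

```yaml
version: "3.7"
services:
  nodered:
    image: nodered/node-red
    # guess: install the extra nodes when the container starts,
    # then hand over to Node-RED's normal start command
    entrypoint:
      - /bin/sh
      - -c
      - >
        npm install node-red-dashboard node-red-contrib-influxdb node-red-node-email
        && npm start -- --userDir /data
    volumes:
      # flows and settings live on the CIFS share; node_modules stays inside the container
      - /mnt/dockerdata/nodered:/data
```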
Thank you