Node-RED and Docker: how do I populate the nodes & flows that are in the Manage Palette menu?

Hi,

My idea of how I want this to work is:
If I am not at home and Node-RED becomes unresponsive, someone else can restart the whole of Node-RED with either a simple command or a click on an icon, and it starts with all the necessary nodes and flows in place.

In this case it would load my default Node-RED that I have set up as I want it.

  • I do have a volume set up that has the flows and nodes in a folder external to Docker (rough sketch of what I mean below).

  • I have tried and failed with a Dockerfile.

  • Is there some script I can create that would do what I want?
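For reference, what I am aiming at would look something like this in a docker-compose.yml (the host folder is just a placeholder, not my exact path):

services:
  nodered:
    image: nodered/node-red
    ports:
      - "1880:1880"
    volumes:
      - /home/me/node-red-data:/data   # external folder holding flows, settings and installed nodes
    restart: unless-stopped            # come back up automatically after a crash or reboot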

Thanks

I know not so many use Docker and I am a beginner, but I am totally stuck!

I have it all working, just some minor things like this are very frustrating.

Like you I had a look at Docker because I thought it would make some things easier.

I thought it would be easy to deploy and modify for my home automation project. Mostly it is easy, but when I have spent all of yesterday trying to get this to work, it is not.

Like you and many other people posting in this forum I discovered that Docker makes many simple things complicated.

I spent 10 hours yesterday trying to get this to work and for some reason now I cannot even create my folders in /data

I decided that for my home server, the disadvantages of Docker outweighed its benefits, so I got rid of it.
Perhaps the time has come for you to do the same?

Docker offers a lot of advantages, too. It all depends on the use case and how it is used.

My node-red-one/influx/grafana/mqtt/node-red-two/filebrowser/foshk/grab/signal/traefik/zigbee2mqtt/portainer-agent stack has been running for almost 4 years now, and I have moved it around with all its settings and data between a Mac, a Raspberry Pi, and virtualized ARM or AMD Debian. As long as Docker runs on these machines, it only takes 2 minutes to recover my stack.

But this is not what @questuk is asking for:

Do you often experience unresponsiveness of your Node-RED instance? And what exactly is unresponsive: your Docker stack, or the PC it is running on? If so, what do you do to get it up and running again? Or what would you do, in case it happens?

I don't mean to suggest that Docker has no advantages.

But I do think it's a relatively advanced system best avoided by beginners until they have got their various applications working and are familiar with them.


Hi,

Quick background ... I have been running Node-RED faultlessly for 5 years on an RPi with SQLite.
It did struggle to run when the database had tens of thousands of rows.

Lately I have needed to upgrade an Alexa node for it to continue to work, as Amazon change things a lot and the node needs to be updated often.

So on the RPi I found I needed to use a newer Node.js. I did this and the RPi no longer worked; I did get it back to how it was, but it took many hours!

This prompted me to get a more powerful PC to run Node-RED and my database on. I am now using an Intel NUC.

Then I thought this was the ideal time to use Docker, as I want to try different flows and nodes out before committing them to the host PC running my home automation. This has been done on a different PC and has been invaluable for testing flows etc. :grinning:

I am running Docker Desktop with external volumes.
I have had a problem creating a Dockerfile to work with Node-RED.
[Node-RED & dockerfile, can make folder, cannot add text file to it?]

At the moment I want to stay with Docker Desktop as I can see the advantages, but I do need to get it working as I want, with the Manage Palette pre-populated ... etc.

Thanks

jbudd

I am a beginner and while it is harder than I thought, I am getting there, but sometimes, like now, I need to come on here for some help and guidance :+1:

If I get it going I will gladly add my code here :grinning:

Do you run your Docker Desktop on Windows?

Hi,

I run it on Linux :grinning:

Just had some success in transferring a text file. It seems I cannot use /data to place files in, so I tried ...

# Create and switch to a new folder /gb inside the image
WORKDIR /gb
# Copy test.txt from the build context into the current folder (/gb)
COPY test.txt .
# Switch back to the Node-RED working directory
WORKDIR /usr/src/node-red

The text file is now in the folder gb :+1:

Later today I will try to copy the package.json and then npm install?

Thanks

My setup to quickly restore in case of problems is quite simple:

In my home folder on Linux I have a subfolder named services, and inside this services folder I have one subfolder for each of my services:

/home/x/services
/home/x/services/nodered
/home/x/services/zigbee2mqtt
etc.

Each service subfolder contains the files docker-compose.yml and .env as well as a subfolder data. The latter I use for service-specific volume definitions in docker-compose.yml, e.g.:

volumes:
  - ./data/config:/whatever/config

Every other night I stop each service, tar its directory and start it again. With the exception of influxdb this is only a matter of seconds per service, and at 3 a.m. I don't care :slight_smile: ... backup is done via a crontab script, run by root:

#!/bin/bash

cd /home/x/services

# Run through all subfolders in /home/x/services
for dir in */ ; do
  # Stop the service so the archive is consistent
  cd "$dir"
  docker compose down
  cd ..
  # Archive the whole service folder, date-stamped, to the backup mount
  tar -cvzf "/mnt/backup/${dir%/}-$(date '+%F').tar.gz" "$dir"
  # Start the service again
  cd "$dir"
  docker compose up -d
  cd ..
done

With this I have full and separate backups of all my services, each in a single file with a date tag and with all my settings and all my data.

In case I have to recover, I just download the required backups, untar them and docker compose up -d each service. Assuming Docker is running on the new host, Docker will automatically pull the required service image on each service's first start.
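To make that concrete, recovering nodered on a fresh host is roughly (the archive name is only an example, use whatever date your backup carries):

cd /home/x/services
tar -xvzf /mnt/backup/nodered-YYYY-MM-DD.tar.gz   # recreates the nodered/ folder with data, docker-compose.yml and .env
cd nodered
docker compose up -d                              # docker pulls the image and starts the service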

As for nodered and your example:

No need to deal with a Dockerfile, package.json, etc. It is only a matter of standard file handling. Any nodered-date.tar.gz not only contains package.json but all installed packages as well.

This does not explain how to achieve a one-button-restore feature. But it could very well lay the foundation. The rest depends on your particular setup.

Thanks Jodelkoenig

Very interesting, and that is similar to how I want my system. I did look at backup versions earlier in the week but have not gone any further just yet. It took all afternoon to work out how to do the backups with the date and version in the title. Where were you then :grinning:

I did not understand this part; could you explain a bit more, as I am very much a beginner? Especially the

.env

Thanks for your help :+1:

Well :smile:

My folder looks like this:

/home/x/services/nodered #service specific folder
/home/x/services/nodered/data/ #service specific data folder -> used for volumes
/home/x/services/nodered/docker-compose.yml #file
/home/x/services/nodered/.env #file

.env is a standard file which is used by docker compose in combination with docker-compose.yml. You can easily ignore it and bake everything directly into docker-compose.yml.

I use it, for example, as follows in docker-compose.yml:

 - "traefik.http.routers.servicexyz.rule=Host(`${SERVICE_URL}`)"

This specifies the URL for the service as a variable, ${SERVICE_URL}. The variable itself is defined in .env:

SERVICE_URL=servicexyz.yourdomain.xyz

Another example, a docker-compose.yml excerpt for traefik:

environment:
   - "NETCUP_CUSTOMER_NUMBER=$TRAEFIK_NETCUP_CUSTOMER_NUMBER"
   - "NETCUP_API_KEY=$TRAEFIK_NETCUP_API_KEY"
   - "NETCUP_API_PASSWORD=$TRAEFIK_NETCUP_API_PASSWORD"

With this I set the credentials for traefik and my DNS provider "netcup". The variables used are $TRAEFIK_NETCUP_... As you can see, I do not fill in any credentials here. These are present in .env:

TRAEFIK_NETCUP_CUSTOMER_NUMBER='123456'
TRAEFIK_NETCUP_API_KEY='1key2key3key'
TRAEFIK_NETCUP_API_PASSWORD='1password2password3'
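Applied to nodered, a minimal pair of files could look something like this (version, port and paths are just an example, not my actual stack):

docker-compose.yml:

services:
  nodered:
    image: nodered/node-red:${NODERED_VERSION}
    ports:
      - "1880:1880"
    volumes:
      - ./data:/data
    restart: unless-stopped

.env:

NODERED_VERSION=3.1.3

docker compose reads the .env file in the same folder automatically, so changing the version later is a one-line edit.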


Do yourself a favor and use ChatGPT for stuff like this. I just read your comment, opened ChatGPT and asked

how to tar a directory with time stamp in the archives filename

the immediate answer was essentially the tar ... $(date '+%F') construct I use in the script above.

I tend to forget that ChatGPT exists too from time to time. And it continues to surprise me how fast I am able to solve those little puzzles nowadays :slight_smile:

Hi,

Yes I have been using ChatGPT in the last few days and it did help me with some simple code.

The problem is you need to input the correct words for the solution, and sometimes I know what I need but not the correct terminology :upside_down_face:

Thanks


Hi all,

I am pleased to say that I got this working: I can now pre-populate the Node-RED Manage Palette and build my own local image. I still need to find out how to get all of my home automation to run with a button press or a few lines of code, but for now I have the basics that I need in place. :+1:

The code that is required in the Dockerfile is ...


# Start from the official Node-RED image
FROM nodered/node-red:3.1.3

# Switch to the directory that Node-RED uses, then copy in my package.json
# (note the trailing DOT, which means "place into the current directory")
WORKDIR /usr/src/node-red
COPY /files/Node-RED .

# Install everything listed in package.json, so all the nodes show up
# in the 'Manage palette' menu in Node-RED
RUN npm install

My package.json is the only file in a separate directory called Node-RED.
I could have had the package.json in the same folder as the Dockerfile, but I wanted it to be separate.
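For anyone following along, building and running the local image is then just a couple of commands (the image name, container name and host folder are my own choices, adjust to suit):

docker build -t my-node-red .
docker run -d -p 1880:1880 -v /home/me/node-red-data:/data --name mynodered my-node-red

And if it ever becomes unresponsive, a simple "docker restart mynodered" should bring it back with all the nodes and flows in place.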

Thanks to all for your help, as always this forum is so helpful.
