One GitHub repository for a docker-compose application consisting of a Node-RED application

The context:

I am developing a docker-compose application consisting of multiple services,
one (or more) of which is a Node-RED application with its specific Node-RED flows.

For development I am using 2 machines:

  1. laptop (MacBook) where I create/edit all the files of my docker-compose application, excluding the files specific to my Node-RED application.
  2. docker environment (Intel NUC) where my docker-compose application runs and where I edit the Node-RED flows of my Node-RED application using the Node-RED editor in a browser.

Currently I maintain the following two GitHub repositories for this:

  1. the docker-compose application (without the Node-RED flows)
  2. the Node-RED application flows

When deploying my docker-compose application to a new docker environment, I must also open the Node-RED editor and clone the matching Node-RED flows repository using the Node-RED projects feature.

The goal

  1. Have one single GitHub repository containing the complete docker-compose application, including the Node-RED flows.
  2. Using the Node-RED editor, it should also be possible to update the Node-RED flows and push those changes to this GitHub repository.
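For illustration, a single-repository layout might pair a compose file at the repo root with the flows in a subfolder. This is only a sketch: the service name, port mapping, and the `./node-red-service1` subfolder are assumptions, not taken from the thread; the official `nodered/node-red` image uses `/data` as its user directory.

```yaml
# Sketch: docker-compose.yml at the root of the single repository.
# The Node-RED flows would live in a subfolder of this same repo,
# e.g. ./node-red-service1/ (hypothetical name).
version: "3"
services:
  node-red:
    image: nodered/node-red
    ports:
      - "1880:1880"
    volumes:
      # /data is the Node-RED user directory inside the container,
      # where projects and .config.projects.json end up.
      - node_red_data:/data
volumes:
  node_red_data:
```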

I've tried doing something similar and failed so far...
I currently use your approach: separate repositories for each Node-RED project, plus a deploy repository containing the compose files.
Let me know if you find something out...


If I find some time, I will give it a shot.

I see that the locations of the flows.json file and package.json can be edited via the project settings.
So maybe these can be changed to a specific relative subfolder.
It is not clear to me where these project settings are stored and whether they can be configured (e.g. via the Node-RED settings.js file) without using the Node-RED editor, so that is the first thing I need to figure out.

I did some further testing (see janvda/docker-compose-with-node-red-flows) and selecting the right package.json (located in a subfolder) seems to be working for a very simple flow.

Instead of manually specifying the location of the package.json file using the Project settings tab above, is it also possible to configure this? The idea is that, when cloning this GitHub project, Node-RED would automatically look for the package.json file in the correct subfolder (= ./node-red-service1/). @knolleary

Hi @janvda

Each project's configuration is stored in `~/.node-red/.config.projects.json` - this includes the `credentialSecret` used to decrypt the credentials file and the `rootPath` used to specify where in the repo the Node-RED project is (if it isn't at the root of the repo).
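For reference, the file is a JSON document. Based on the fields mentioned above, its shape would be roughly as follows; the project name, secret, and path below are placeholders, and the exact schema may differ between Node-RED versions, so treat this as a sketch:

```json
{
    "activeProject": "my-project",
    "projects": {
        "my-project": {
            "credentialSecret": "replace-with-your-secret",
            "rootPath": "node-red-service1"
        }
    }
}
```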

Depending on what workflow you want to achieve, you could pre-create that file somehow with the right contents.


Thanks, it is indeed the `rootPath` parameter that needs to be set properly.

I can create a `.config.projects.json` with the proper contents, but when cloning the repository using the Node-RED editor, it overwrites these project settings.
I could also do the git clone in the Dockerfile, but that is going a bit too far.

... actually I think I can live with the fact that I need to select the package.json from the right folder after cloning the repository using the Node-RED editor 🙂

I found something:

So if you want to maintain in a "single" GitHub repository:

  1. a docker-compose application having a node-red application X as one of its services AND
  2. the node-red flows and flow credentials of the node-red application X

then I would suggest having a look at:

I performed some testing and it is working.

More details can be found in the readme.

Any feedback, comments or github issues are welcome.
Jan

So in that case there are still 2 cloned repositories on the target machine? The first clone gets your compose file, and then the manual clone puts the project in the projects folder.

There are indeed 2 cloned repositories, but in my case not on the same machine.

  1. The first clone (used to edit my compose file and everything else not related to the Node-RED flows) is made on my laptop (= my build machine).
  2. I deploy my docker-compose application to my target machine (nuc-xxx.local) by
    1. setting the DOCKER_HOST environment variable (`export DOCKER_HOST=ssh://root@nuc-xxx.local`) on my laptop,
    2. followed by the command `docker-compose up --build -d`.
  3. If I want to edit my flows, I open the Node-RED editor on my target machine and first clone the repository as outlined in the readme.

`DOCKER_HOST` is really an interesting feature. It allows you to make any changes on one machine (the build machine) and deploy them to another machine (the target machine). Your build machine doesn't need to be online all the time; it only needs to be online when you want to deploy your application changes.

I think it is also possible to do everything from one machine with a single clone.
For that, you must make the clone under the folder ...XXX/projects/, and in the folder ...XXX/ your docker-compose application must create a file `.config.projects.json` with the proper contents.
... of course, there are several other things you must figure out to get this bootstrapping working. I admit it is a bit like pulling yourself up by your shoelaces.
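As one way to approach that bootstrapping, the `.config.projects.json` could be generated by a small script run before Node-RED starts. This is a sketch under the assumptions discussed above: the function name is hypothetical, and the JSON keys (`activeProject`, `projects`, `credentialSecret`, `rootPath`) follow the fields mentioned earlier in this thread, so verify them against your Node-RED version before relying on this.

```python
import json
import os


def write_project_config(user_dir, project_name, root_path, credential_secret):
    """Pre-create .config.projects.json in the Node-RED user directory
    so a pre-cloned project is picked up with the right rootPath.

    Sketch only: the exact schema may differ between Node-RED versions.
    """
    config = {
        "activeProject": project_name,
        "projects": {
            project_name: {
                # Secret used to decrypt the flows credentials file.
                "credentialSecret": credential_secret,
                # Subfolder of the repo where the Node-RED project lives.
                "rootPath": root_path,
            }
        },
    }
    path = os.path.join(user_dir, ".config.projects.json")
    with open(path, "w") as f:
        json.dump(config, f, indent=4)
    return path
```

Running this against the user directory (e.g. `/data` in the official container) before the Node-RED process starts would be one way to avoid the editor overwriting the settings on clone.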
