I run NR under Docker and use a bind mount to map a local folder.
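For context, the container is started roughly like this (the host path and container name are just placeholders, not my exact setup):

```shell
# bind-mount a local folder as the container's /data (the Node-RED userDir)
docker run -d -p 1880:1880 \
  -v /home/me/node-red-data:/data \
  --name mynodered nodered/node-red
```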
My NR project has a remote repository, and since I also want the NR runtime defined, I keep a second repository for the settings.js, Dockerfile, Readme, etc.
As far as I understand, the main advantage of using NR projects is that you can switch between them, which is exactly what I want during development.
My main goal, though, is to create containers from the projects, but with the runtime defined in the other repository. This is where it gets tricky: I don't want projects enabled in the final container.
I don't know... there's something I can't wrap my head around. How would you do this?
Maybe I should forget NR projects altogether and store every project with a separate settings.js? But what if I then make changes?
Maybe I am overthinking this and there is an easy solution...
I'm interested in your ideas!
I use git for source control of my node-red systems, but without using projects. I simply commit the .node-red folder in each case, using a .gitignore file to leave out unnecessary files.
When deploying to a new system, it is just necessary (after installing node-red) to clone that repo and then run the install from that directory to pull in all the necessary nodes.
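Spelled out, the deploy steps would presumably look something like this (the repository URL and target directory are placeholders, and I'm assuming the install step is a plain npm install):

```shell
# clone the committed .node-red folder as the new user directory
git clone https://github.com/mygithubid/my-node-red-config.git ~/.node-red
cd ~/.node-red
# install all the extra nodes listed in the committed package.json
npm install
```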
To make git diffs more useful when looking at the flows file, I set flowFilePretty: true in settings.js (and deploy after making that change) and run this script before committing. That generates a more human-readable version of the flows file with function nodes expanded, so that diffs can be seen within functions.
# generates formatted versions of node red flow files matching flow*.json
# in files flow*.json.formatted
# NOTE must have flowFilePretty: true in settings.js
for f in flow*.json; do
    sed -r 's/\\n/\n/g' "$f" > "$f.formatted"
done
# remove any cred files converted
rm -f flow*_cred.json.formatted
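To illustrate what the sed step does: literal `\n` escape sequences inside the JSON strings become real newlines in the formatted copy, so multi-line function bodies diff line by line. A small self-contained example (the file name is invented):

```shell
# a one-line stand-in for a pretty-printed flow file containing a
# function node whose body has an escaped newline
printf '"func": "var x = 1;\\nreturn msg;"\n' > flow_demo.json

# same transformation as in the script above (GNU sed)
sed -r 's/\\n/\n/g' flow_demo.json > flow_demo.json.formatted

cat flow_demo.json.formatted
# "func": "var x = 1;
# return msg;"
```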
Of course, if you adjust your package.json file, you can use npm to install a clone direct from GitHub and get it to do the install automatically using the
postinstall script feature.
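As a sketch of that idea, the repo's package.json could carry a postinstall script, which npm runs automatically once the install finishes (the script body and the dashboard dependency here are only illustrative):

```json
{
  "dependencies": {
    "node-red-dashboard": "~3.0.0"
  },
  "scripts": {
    "postinstall": "echo 'clone installed, extra nodes pulled in'"
  }
}
```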
I will have to look into that. Thanks.
No problem, give me a shout if you need any help.
I can't immediately see how to use npm to clone the complete .node-red directory. Can you point me in the right direction?
Assuming your github repository contains everything including the package.json file that lives in your userDir, you should simply be able to:
npm install --prefix ./new_nr mygithubid/mygithubreponame
That should clone everything in the repo into the ./new_nr folder.
What it doesn't do is leave you with a git repository in the new folder; you would need to set that up separately if you wanted one.
The nice thing is though that you could update this instance from github just with
npm update --prefix ./new_nr mygithubid/mygithubreponame or add a script to the package.json:
"update": "npm install --prefix ./ mygithubid/mygithubreponame",
If you were using my alternate installer, you could remove my template
./data folder from the parent folder, then do the install as
npm install --prefix ./data mygithubid/mygithubreponame which would add your repo to the parent package.json and you could then update it simply with an
npm update I think.
npm is fun when you start exploring its capabilities.
PS: Should have warned you that I didn't actually TRY these commands! Sorry. But I think it should work.
My repo contains the .node-red folder contents, so the flows file, settings.js etc. After running that command all I have in new_nr is package.json (containing just a dependency of the github repo), package-lock.json and node_modules. The files that should be in the top level are buried away in node_modules.
Ah, I did wonder about that. I think there may be a way to overcome that but not sure. Sorry, going to be mostly offline for a week or two. Some googling may turn up more.
For me it isn't an issue anyway, as I use Ansible scripts for deploying and the script includes the
npm install after the clone. So the whole thing is automatic anyway.