I back up the .json files to another machine.
I'll maybe have to put some effort into backing up the entire card.
But it seems a lot of work for just the NR stuff.
So far so good.
Though this may be of concern.
Woo Hoo! Working.
So, stop it, copy the files back and start it again?
Please don't post screenshots. I can't easily show you the bits that matter.
I don't think you need worry about the rbe node warning at the moment (it has been renamed to Filter node but it is just a warning).
More important is the bit where it says it is creating a new flows file. The default flows file name has changed, so edit the settings.js file and put in the name of your flows file.
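If you installed with defaults, something like this in ~/.node-red/settings.js should do it (just a sketch; 'flows_yourhostname.json' is a placeholder for whatever your existing flows file is actually called):

module.exports = {
    // ...the rest of your existing settings...
    // Keep using the old flows file instead of the new default flows.json
    flowFile: 'flows_yourhostname.json',
}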
It depends on how serious it is if a card fails, and how long it would take you to get back up and running from scratch.
Sorry about the screen shots.
I should know better. I guess the excitement got the better of me.
I'll mark that as the solution.
All working now.
Now I hope I don't break it again trying to fix the original problem.
For what it's worth, I use a Node-red flow to back up Node-red. It uses tar, which is a pretty antiquated archiving utility but makes it easy to exclude the huge node_modules directory.
[{"id":"8dfc46c064c66e06","type":"tab","label":"Backup Node-red","disabled":false,"info":"","env":[]},{"id":"f8272cf6024174cd","type":"group","z":"8dfc46c064c66e06","name":"Backup Node-red","style":{"label":true},"nodes":["cf19faaabdfb2dbe","fb47ca1393ddc5aa","fcbfc58d8799f180","6218b2d2ddf29d4b","3c4d7bf52fe7f7f6"],"x":14,"y":19,"w":852,"h":142},{"id":"cf19faaabdfb2dbe","type":"exec","z":"8dfc46c064c66e06","g":"f8272cf6024174cd","command":"","addpay":"payload","append":"","useSpawn":"false","timer":"","winHide":false,"oldrc":false,"name":"","x":590,"y":80,"wires":[["fb47ca1393ddc5aa"],["3c4d7bf52fe7f7f6"],[]]},{"id":"fb47ca1393ddc5aa","type":"debug","z":"8dfc46c064c66e06","g":"f8272cf6024174cd","name":"Backup stats","active":true,"tosidebar":true,"console":false,"tostatus":true,"complete":"payload","targetType":"msg","statusVal":"payload","statusType":"auto","x":750,"y":60,"wires":[]},{"id":"fcbfc58d8799f180","type":"inject","z":"8dfc46c064c66e06","g":"f8272cf6024174cd","name":"Node-red location & backup target","props":[{"p":"nodereddirectory","v":".node-red","vt":"str"},{"p":"backupfilename","v":"$moment().format(\"YYYYMMDD.HHmm\") & \"nr.tar.gz\"","vt":"jsonata"},{"p":"backupfilename","v":"nodered.tar.gz","vt":"str"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","x":200,"y":80,"wires":[["6218b2d2ddf29d4b"]]},{"id":"6218b2d2ddf29d4b","type":"template","z":"8dfc46c064c66e06","g":"f8272cf6024174cd","name":"Backup script","field":"payload","fieldType":"msg","format":"handlebars","syntax":"mustache","template":"# To override defaults pass these in as msg properties NB Bash does not see these mustaches\nNODEREDDIR={{{nodereddirectory}}} \nBACKUPTO={{{backupfilename}}}\n\n# Default location\nif [ -z $NODEREDDIR ]\nthen\nNODEREDDIR=.node-red\nfi\n\n# Default backup file\nif [ -z $BACKUPTO ]\nthen\n BACKUPTO=nodered.tar.gz\nfi\n\n# Don't overwrite backupfile\n#if [ -s \"$BACKUPTO\" ]\n#then\n# echo \"Error: $BACKUPTO already exists\" >&2\n# exit 1\n#fi\n\nARCHIVER=\"tar -czf $BACKUPTO --numeric-owner --exclude=node_modules*\"\n$($ARCHIVER $NODEREDDIR) # Do the backup\n\nCOUNTFILES=\"$(tar -tvf $BACKUPTO | wc -l)\"\nFILESIZE=\"$(du -h $BACKUPTO | sed -e 's/\\s.*//')\"\nprintf \"%s files %s in %s\" $COUNTFILES $FILESIZE $BACKUPTO","output":"str","x":440,"y":80,"wires":[["cf19faaabdfb2dbe"]]},{"id":"3c4d7bf52fe7f7f6","type":"debug","z":"8dfc46c064c66e06","g":"f8272cf6024174cd","name":"Errors","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"payload","targetType":"msg","statusVal":"","statusType":"auto","x":730,"y":120,"wires":[]}]
After restoring a backup I have to cd .node-red; npm install to reinstall all the extra nodes.
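Roughly like this, for anyone following along (a sketch; it assumes the archive was created from the home directory as in the flow above, and that the node-red-restart command from the standard Pi install script is available):

# Unpack the archive over the home directory (node_modules was excluded at backup time)
cd ~
tar -xzf nodered.tar.gz

# Reinstall the extra nodes listed in package.json / package-lock.json
cd ~/.node-red
npm install

# Restart Node-RED so it picks up the restored flows
node-red-restart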
The current default NR behaviour is for the flows file to be called flows.json, not flows_<hostname>.json. Much simpler I think.
I have various scripts and utilities called from Node-red or the terminal. They all live in ~/bin so that's simple to back up.
It's the other services on my main Pi that are a pain to back up, notably MariaDB and Pi-hole.
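For the MariaDB side, a plain mysqldump piped through gzip is probably the simplest starting point (a sketch; the user, password handling and file name are assumptions):

# Dump all databases to a dated, compressed SQL file
mysqldump --all-databases --single-transaction -u root -p | gzip > "mariadb_$(date +%Y%m%d).sql.gz"

Pi-hole has its Teleporter export in the admin settings, as far as I know, which covers its configuration.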
Not to turn it into a pissing contest, but this is my effort at that:
#!/bin/bash
# ---------------------------------------
# Simple backup script v1.0
# ---------------------------------------
# Variables
myDate=`date "+%Y-%m-%d.%H.%M.%S"`
#backupFolderName="Backup_$myDate"
backupFolderName="$myDate"
backupSource="/home/me/.node-red"
backupDest="/home/me/Backups/NR/LOCAL"
backupFilter="*.j*"
backupExclude="lost+found"
# Tell the user what we're working with
echo "The myDate variable contains: $myDate"
echo "A backup of $backupSource/$backupFilter will be made and stored in $backupDest/$backupFolderName"
# Begin backup
rsync -avz --progress $backupSource/$backupFilter --exclude=$backupExclude $backupDest/$backupFolderName
RC=$?
# We're done.
echo "Done!"
exit $RC
I have used it to get back older versions.
But is it...... good enough at what it does?
There is a bit of flow around it, but I want to stick to the actual backing up here.
Indeed
I like your script too. There's always lots of ways in Linux to perform a task.
Actually it wouldn't work for me as-is, because I also need to back up the editor CSS file, a directory of static content, any filesystem context stores and (maybe) the cronplusdata directory.
I might adapt it for my setup though.
I wrote it like that so I get a dated backup in its own folder.
As you can see there are a lot of variables....
But they can be fun sometimes.
Then every now and then I mount the remote machine/s and copy their backups to another (central) machine.
And maybe even copy all those to an external USB drive.
But I haven't automated that part just now.
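If I ever do, a cron-able rsync pull over SSH would probably cover it (just a sketch; the host names and paths here are made up):

# Pull each remote machine's backup folder onto the central machine
rsync -avz pi@remote1:/home/me/Backups/NR/ /srv/central-backups/remote1/
rsync -avz pi@remote2:/home/me/Backups/NR/ /srv/central-backups/remote2/

# Then mirror the lot to the external USB drive when it is plugged in
rsync -avz --delete /srv/central-backups/ /media/usb/backups/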
If you back up the .node-red folder (except node_modules) to a private GitHub repository then you get dated backups as often as you like, with diffs too.
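A minimal sketch of that approach (the repository URL is a placeholder, and it assumes the private repo already exists):

cd ~/.node-red
echo "node_modules/" >> .gitignore          # never commit the huge dependency tree
git init
git add .
git commit -m "Node-RED backup $(date +%Y-%m-%d)"
git branch -M main                          # match the branch name the remote expects
git remote add origin git@github.com:youruser/nodered-backup.git
git push -u origin main

# Later backups are then just:
git add -A && git commit -m "backup $(date +%Y-%m-%d)" && git push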
Yeah, ok.... I have not gone down that path yet.
I keep things local.
All this started from me messing around with NR way back ....... 10 years ago?
And I know JS (not that one) about coding.
As I probably still do.
JS - Jack $hit.
It is good enough to give you a SINGLE backup. Personally, I have a week of dailies, a month of weeklies and a year of monthlies.
Actually, it IS NOT good enough. As pointed out, your filter is too restrictive. Simply exclude the node_modules folder instead; that will be much better, something like the sketch below.
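For example, keeping the same dated destination folder as the script above but copying the whole user directory minus node_modules (paths as in the original):

backupSource="/home/me/.node-red"
backupDest="/home/me/Backups/NR/LOCAL"
backupFolderName=$(date "+%Y-%m-%d.%H.%M.%S")

# Copy everything in .node-red except the (re-creatable) node_modules tree
rsync -avz --progress --exclude='node_modules' \
  "$backupSource/" "$backupDest/$backupFolderName/"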
They back up to the server but the key areas of the server are backed up to my NAS. Which, in turn, is backed up to the cloud.
For a Pi, a full card backup is probably worth doing. Otherwise, I simply keep a load of notes about what I've installed and how I've configured things. That's because it is often so long between re-installs that things have changed anyway and a fresh install is often better.
Just hope that your local storage isn't burned down, burgled, overrun by aliens, etc.
My cloud backups are encrypted at source.
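Something like this does the encryption before anything leaves the machine (a sketch using gpg in symmetric mode; the output file name is just an example):

# Archive the Node-RED user directory (minus node_modules) and encrypt it locally
tar -czf - --exclude='node_modules' -C ~ .node-red \
  | gpg --symmetric --cipher-algo AES256 -o "nr_$(date +%Y%m%d).tar.gz.gpg"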
Sorry, dumb question: why is this folder not advised to be included in a backup?
Edit: Gemini answered me, but I would still like it if you could validate its answer...
The advice against including the node_modules folder in backups, particularly in the context of Node-RED, stems from several key reasons related to efficiency, redundancy, and potential issues:
Redundancy:
The node_modules folder contains all the dependencies (libraries and packages) that your Node-RED flows rely on. These dependencies can be very numerous and consume a significant amount of disk space.
These dependencies are typically readily available from the npm (Node Package Manager) registry.
Instead of backing up the entire node_modules folder, you can back up the package.json and package-lock.json files. These files contain a list of all the dependencies and their specific versions.
When you restore your backup, you can simply run npm install in the Node-RED user directory, and npm will automatically download and install all the necessary dependencies based on the information in your package.json and package-lock.json files. This is much faster and more efficient than backing up and restoring the entire node_modules folder.
Size and Time:
The node_modules folder can be extremely large, especially if you have installed many Node-RED nodes. Backing up and restoring such a large folder can take a considerable amount of time and consume significant storage space.
Backing up only the package.json and package-lock.json results in a much smaller backup size, leading to faster backup and restore times.
Platform and Architecture Dependencies:
Some packages within node_modules contain platform-specific or architecture-specific compiled code (e.g., native modules).
If you back up node_modules on one platform (e.g., a Raspberry Pi) and restore it on a different platform (e.g., a Windows machine), the compiled code may not be compatible, leading to errors.
By using npm install on the target platform, you ensure that the correct versions of the packages are downloaded and compiled for that specific platform.
Corruption Risk:
Due to the sheer number of files within the node_modules folder, there is a higher risk of file corruption during the backup or restore process.
Version Control:
The package-lock.json file ensures that you install the exact same versions of the dependencies that were used when the backup was created. This helps to prevent issues caused by incompatible versions.
In summary, backing up package.json and package-lock.json provides a reliable and efficient way to restore your Node-RED environment without the overhead and potential issues associated with backing up the entire node_modules folder.
On my Pi ~/.node-red is 198MB, of which node_modules is 190MB.
And if you restore from a backup, npm install will recreate node_modules.
Gemini's 2nd point is the key one. It is massive, often with thousands of files. There is no point in backing it up when a simple npm install will fully repopulate it.
Also, if you restored the backup onto a new machine with a different major version of Node.js, upgraded Node.js on the original machine and later restored the backup, or restored the backup onto a machine running a different OS or OS version, then any nodes that involved a build step would generally no longer work, because the binary code in node_modules would not be compatible with the new system.