Trying to get my head around accessing a remote directory for file copying

:wink:

Yeah, confusion.

No problems.

Yes, I am on A. I am copying from B to A.

@dceejay's post is correct for the direction of copying.

I'm giving up on using the rsync node.

I can't get it working.

I can get rsync working from the CLI. Just got to tidy things up.


@dceejay

Could you please explain the command though?

I don't get why what you said should work doesn't, and why I needed to change the ssh part of the command.

I just want to make sure I am learning what is right.
I know, given my track record, that doesn't hold much water.
But they are all small steps and I would prefer they are in the right direction.

Is it that I have nutted out what the command is/should be?

Sorry, a little late to this discussion. But just a reminder that if you have a NAS, you may well have much easier-to-use backups available.

For example, the Synology NAS has a backup tool that will reach out to all manner of devices, including Pis and Windows desktops/servers.

Once you have a replica on the NAS, you can then back it up further to the cloud. File versioning helps take care of the fact that sync tools aren't really backups (a backup cannot be changed once taken; a sync can), and it lets you undo mistakes in a sync.

I use this: the NAS backs up key folders on my Pis and my Windows PCs, and then the NAS backs up to Amazon Prime storage.


Thanks Julian.

Yeah, a NAS would be nice.
Though I am kind of doing that.
But I don't have such a high-spec one.

And: How do you use rsync from your NAS? So it is the destination (not source)?

My take (at this time) is I have a 5TB USB drive.
It plugs into the NUC and I run a script/flow on the NUC and it sucks the stuff from the remote machines.

That is still way future work in progress. :frowning:
I'm still not 100% up to speed with the command.

@dceejay's post has really thrown a brick through the learning curve's window.

So, as the USB drive will be on this machine, I can write something to check it exists and if it does, then it looks for "who is online" and sucks their data and puts it on the drive.
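
Something like this rough sketch is what I have in mind (the mount point and host addresses are only placeholders):

BACKUP_DRIVE="/media/usb5tb"          # placeholder mount point for the 5TB drive
HOSTS="192.168.0.100 192.168.0.101"   # placeholder list of remote machines

# only proceed if the backup drive's directory is present
[ -d "$BACKUP_DRIVE" ] || { echo "USB drive not mounted"; exit 1; }

for h in $HOSTS; do
    # "who is online": pull only from machines that answer a ping
    if ping -c1 -W2 "$h" > /dev/null 2>&1; then
        rsync -avz -e ssh "pi@$h:.node-red" "$BACKUP_DRIVE/$h/"
    fi
done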

Alas, to complicate the equation, I have a backup script (posted in another thread). It is a basic one, but it suffices for my needs.

It only copies the flow (*.json) files in the node-red directory.
It makes a timestamped directory (elsewhere) and puts the files in there.

Copying those directories will be fun with their names. But that's another project.

Thanks.

Just for the sake of sharing this is my script.

It was written by someone who knows more than me.
I just tweaked it a bit here and there.

#!/bin/bash
# ---------------------------------------
# Simple backup script v1.0
# ---------------------------------------

# Variables
myDate=$(date "+%Y-%m-%d.%H.%M.%S")
#backupFolderName="Backup_$myDate"
backupFolderName="$myDate"
backupSource="/home/pi/.node-red"
backupDest="/home/pi/Backups/NR"
backupFilter="*.j*"        # matches the flow *.json files (and .json backups)
backupExclude="lost+found" # rsync excludes are globs, so no escaping needed

# Tell the user what we're working with
echo "The myDate variable contains: $myDate"
echo "A backup of $backupSource/$backupFilter will be made and stored in $backupDest/$backupFolderName"

# Begin backup
# $backupSource/$backupFilter is deliberately left unquoted so the shell
# expands the *.j* glob into the list of files to copy
rsync -avz --progress $backupSource/$backupFilter --exclude="$backupExclude" "$backupDest/$backupFolderName"

# We're done.
echo "Done!"
#return $?
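
If it ever needs to run unattended, a crontab entry along these lines would do it (the script path here is made up):

# run the backup every night at 03:00 (hypothetical script location)
0 3 * * * /home/pi/bin/nr-backup.sh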

I don't. :smiley_cat: I use the built-in software. However, I could, and I could configure rsync either way round: as a server that reaches out to the remote and triggers a sync, or with the remote reaching out to the server.

I don't remember rsync commands off the top of my head (too complex), but there are lots of nice tutorials and examples on the Internet to help.

Apologies, the correct command is:

rsync -avz -e ssh pi@192.168.0.100:.node-red mylocalBackupDir1/

with the extra -e
(but as we keep saying there are many rsync how-tos out there - at least for the CLI version - and indeed you have one that works already so great - all is well - carry on - sorry for the extra spanner)
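
For reference, that command broken down (the host and paths are just the example values from above):

rsync -avz -e ssh pi@192.168.0.100:.node-red mylocalBackupDir1/
# -a   archive mode (recursive; preserves permissions, times, links)
# -v   verbose output
# -z   compress data during the transfer
# -e ssh   use ssh as the remote shell (the transport)
# pi@192.168.0.100:.node-red   remote source, relative to pi's home directory
# mylocalBackupDir1/           local destination directory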

I'll add that an unintended, but potentially very desirable, side-effect of this pull-only backup strategy is that if one of the clients is in some way compromised (for example cryptolocked), that client has no write access to go out and mess up the backups of other systems. To further harden this, the backup server should not have execute permissions on its own local repository.
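
As a minimal sketch of that last point (assuming a hypothetical repository path of /srv/backups):

# strip the execute bit from regular files only; directories keep their
# x bit so they remain traversable
find /srv/backups -type f -exec chmod a-x {} +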


Even better if you can detect when large numbers of files unexpectedly change. That is a good indicator of a ransomware attack.
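
As a rough sketch of that idea, rsync's dry-run mode can count how many files would change before committing to a sync (the threshold of 500 and the paths are made-up values):

# dry-run first and count the files that would be transferred
changed=$(rsync -azn --out-format='%n' pi@192.168.0.100:.node-red mylocalBackupDir1/ | wc -l)
if [ "$changed" -gt 500 ]; then
    echo "WARNING: $changed files would change - possible ransomware, sync skipped"
else
    rsync -az pi@192.168.0.100:.node-red mylocalBackupDir1/
fi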


Indeed. I don't always get listened to, but at work I'm a big fan of carefully monitoring the rate of change on the SAN, both for capacity planning and as a warning sign of ransomware. Some of the newer malware is a bit more insidious, though, and starts encrypting slowly while still giving you access to the data for long enough that the typical backup retention policy would lapse. Once you hit 60/90/whatever days, boom, it deletes its keys and sends you the ransom notice.

Thanks again @dceejay, but again I am perplexed why the ssh is in the command, and why there is no leading / on the destination path.

I wasn't saying I know how to use rsync. I saw your example and it didn't work.
Through pure stubbornness I worked out the command which did.

Though I now would like to understand why/how that ssh gets into the equation.

Time to read the rsync manual then....
The -e ssh option tells it to use ssh as the remote shell for the connection and authentication - it is mostly optional, as most systems now default to ssh - but it doesn't hurt to make sure.
As ever, Google is your friend.
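
It is also where any extra ssh options would go (the port and key file here are invented values):

# non-standard ssh options ride along inside -e
rsync -avz -e "ssh -p 2222 -i ~/.ssh/backup_key" pi@192.168.0.100:.node-red mylocalBackupDir1/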


So the -e is associated with the ssh part which you originally posted.

That's all I want to check up on.

It is ok if you made a mistake. I should know how easy that is.

Hey TTL. I recently wrote the rsync node referenced. If you need the rsync to end up with the following:

rsync -avz -e ssh pi@192.168.0.100:.node-red mylocalBackupDir1/

Then you would put pi@192.168.0.100:.node-red in source, mylocalBackupDir1/ in destination, and -avz -e ssh in extra options. The last one is done using the JSONata format, setting it to the following:

[
    "-avz",
    "-e ssh"
]

As I mentioned, I've only recently put it up, so it's not very clever yet, but it should work. So far I've only used it with mounted remote folders (using autofs) because that suits me. If you're interested in trying it again, I'd like to hear your feedback.

Thanks.

I shall maybe look at it again.

The line:
rsync -avz -e ssh pi@192.168.0.100:.node-red mylocalBackupDir1/

The path looks wrong.
192.168.0.100:/home/pi/.node-red would be more correct - yes?

Where does 192.168.0.100:.node-red point?

Both would work. The rsync session starts in the remote user's home directory, so .node-red and /home/pi/.node-red are the same location when logged on as pi, assuming its home directory is /home/pi.
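
So, using the example host from earlier, these two commands fetch the same directory:

# a remote path with no leading / is relative to the login user's home
rsync -avz -e ssh pi@192.168.0.100:.node-red mylocalBackupDir1/
# the absolute form spells out the same location in full
rsync -avz -e ssh pi@192.168.0.100:/home/pi/.node-red mylocalBackupDir1/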

OK. Thanks.

Good to know.