Read out a log file and add new entries to an existing file

Hello Node Red community,

I would like to achieve the following and hope you can tell me whether it is possible with Node-RED.

I want to read a log file from a web server. It changes rarely; a new entry is added every few days.

A maximum of 100 entries is stored there. So that I don't lose any, I want to read the log file once a day and append only the new entries to a text file on my home server (CSV format).
Can this work?
Thanks for your input!

My Node-RED runs in Docker, image nodered/node-red:2.2.2-12.

Yes, you can do that as long as Node-RED has access to the log file. Personally I wouldn't do it that way, but I understand that it may be easier if you don't know any shell scripting.

To be honest though, if you don't want to lose log files, you should probably use a backup tool to keep as many versions as you need.

Just note though that Docker creates a virtual environment around whatever it is running. This means that Node-RED inside Docker is effectively running on a separate server. Working out the details of the required networking can sometimes be challenging.

Hello @TotallyInformation,
thank you for your feedback!
Can you give me a hint as to which nodes would be appropriate to compare "text a" (the log file) with "text b" (the backup) and append the new entries from "text a" to "text b"?
Thanks a lot for your effort!

Do you mean a line is dropped from the beginning of the logfile when line 101 is added?
Or, like most logfiles, it just keeps on growing (but slowly)?

If you are on Linux, the easiest way to compare two files is with an exec node calling the command line tool cmp newfile oldfile, which will return {code: 0} if they are the same and {code: 1} if they differ.

If the only possible change to the log file is that new entries are added (so the new file is always a superset of the old one), the easiest way to bring the old file up to date is another command line utility, mv newfile oldfile, which simply replaces the old file with the new one.
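
To make that concrete, here is a minimal sketch of the compare-and-replace step as it could run from an exec node. Both file paths are hypothetical examples:

#!/bin/sh
# Minimal sketch: compare the fresh download with the archived copy
# and replace the archive only when something changed.
# /data/newfile and /data/oldfile are hypothetical example paths.

if cmp -s /data/newfile /data/oldfile; then
    echo "no change"    # exit code 0: files are identical
else
    # exit code 1: files differ. Under the assumption that the log
    # only ever gains lines, replacing the old copy is equivalent
    # to appending the new entries.
    mv /data/newfile /data/oldfile
fi

In an exec node the exit code arrives on the third output as msg.payload.code, which is the {code: 0} / {code: 1} mentioned above.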

Depends on the log file. You will need to share a few lines of the log so that we can give better answers.

Hi @jbudd,
Exactly, line number 1 is dropped when line 101 gets added.

Here is an example of the CSV.

Date;Duration;Energy;Price;Costs;StartReading;UID;Username
2023-3-10 6:42:14;0:6:1;0,02;29,9;0,00598;1167,326;vehicle;vehicle

But the backup file should keep the entries from 1 to 101 and onwards (no dropping of lines). For example, once the device log holds entries 2 to 101, the backup should still contain entries 1 to 101.

Here is an example of how to do something similar in Node-RED.

[{"id":"5cd955ecde8bbb91","type":"inject","z":"65617ffeb779f51c","name":"","props":[{"p":"payload"},{"p":"topic","vt":"str"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"","payloadType":"date","x":160,"y":4320,"wires":[["967fbe6db440f583"]]},{"id":"967fbe6db440f583","type":"template","z":"65617ffeb779f51c","name":"simulate read file_a","field":"payload","fieldType":"msg","format":"handlebars","syntax":"mustache","template":"Date;Duration;Energy;Price;Costs;StartReading;UID;Username\n2023-3-10 6:42:14;0:6:1;0,02;29,9;0,00598;1167,326;vehicle;vehicle\n","output":"str","x":330,"y":4320,"wires":[["403ec7b80e39a85c"]]},{"id":"403ec7b80e39a85c","type":"change","z":"65617ffeb779f51c","name":"","rules":[{"t":"set","p":"original_file_a","pt":"msg","to":"payload","tot":"msg"}],"action":"","property":"","from":"","to":"","reg":false,"x":540,"y":4320,"wires":[["6a38f80097a06eba"]]},{"id":"6a38f80097a06eba","type":"csv","z":"65617ffeb779f51c","name":"","sep":";","hdrin":true,"hdrout":"none","multi":"mult","ret":"\\n","temp":"","skip":"0","strings":true,"include_empty_strings":"","include_null_values":"","x":710,"y":4320,"wires":[["9c360c9412158332"]]},{"id":"9c360c9412158332","type":"change","z":"65617ffeb779f51c","name":"","rules":[{"t":"move","p":"payload","pt":"msg","to":"file_a","tot":"msg"}],"action":"","property":"","from":"","to":"","reg":false,"x":870,"y":4320,"wires":[["3c5db59cc227b006"]]},{"id":"3c5db59cc227b006","type":"template","z":"65617ffeb779f51c","name":"simulate read file b","field":"payload","fieldType":"msg","format":"handlebars","syntax":"mustache","template":"Date;Duration;Energy;Price;Costs;StartReading;UID;Username\n2023-3-10 6:42:14;0:6:1;0,02;29,9;0,00598;1167,326;vehicle;vehicle\n2023-3-11 6:42:15;0:6:1;0,02;29,9;0,00598;1167,326;vehicle;vehicle\n","output":"str","x":230,"y":4380,"wires":[["e411da20cbae16be"]]},{"id":"e411da20cbae16be","type":"csv","z":"65617ffeb779f51c","name":"","sep":";","hdrin":true,"hdrout":"none","multi":"mult","ret":"\\n","temp":"","skip":"0","strings":true,"include_empty_strings":"","include_null_values":"","x":390,"y":4380,"wires":[["c6d133fbef5674ac"]]},{"id":"c6d133fbef5674ac","type":"change","z":"65617ffeb779f51c","name":"","rules":[{"t":"set","p":"payload","pt":"msg","to":"(\t  $dates_in_file_a := $$.file_a.Date;\t$append($$.file_a,$$.payload[$not($.Date in $dates_in_file_a)][])\t)","tot":"jsonata"}],"action":"","property":"","from":"","to":"","reg":false,"x":560,"y":4380,"wires":[["117ae6632c1bf80e"]]},{"id":"117ae6632c1bf80e","type":"csv","z":"65617ffeb779f51c","name":"","sep":";","hdrin":true,"hdrout":"all","multi":"mult","ret":"\\n","temp":"","skip":"0","strings":true,"include_empty_strings":"","include_null_values":"","x":710,"y":4380,"wires":[["f3d15ea8be9da0d8"]]},{"id":"f3d15ea8be9da0d8","type":"debug","z":"65617ffeb779f51c","name":"file_a contains og file object\\n payload contains new updated file_a\\n original_file_a contains og csv","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","statusVal":"","statusType":"auto","x":290,"y":4440,"wires":[]}]

I have simulated the reading of the files; in practice these would be read file nodes. The output can be saved using a file write node.
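
If you'd rather do the merge outside Node-RED, e.g. via an exec node, here is a rough shell equivalent of the JSONata in the change node above, which appends to file_a only those payload rows whose Date is not already present. It assumes semicolon-separated files with a header line, the Date column as the unique key, and hypothetical file names backup.csv (the archive, which must already exist) and download.csv (the fresh copy):

#!/bin/sh
# Append to backup.csv only those rows of download.csv whose first
# field (Date) is not already present in backup.csv.

awk -F';' '
    NR == FNR { seen[$1] = 1; next }  # pass 1: remember every Date in backup.csv
    FNR == 1  { next }                # pass 2: skip the header of download.csv
    !($1 in seen)                     # print rows whose Date was not seen
' backup.csv download.csv > /tmp/new_entries.csv

cat /tmp/new_entries.csv >> backup.csv
rm /tmp/new_entries.csv

Writing the new rows to a temporary file first avoids appending to backup.csv while awk is still working with it.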

Your example does not explain the differences between the files. It would be best to show an example of both files (obviously not 100 entries, maybe five in each), then explain which lines would need to be moved and how they are identified.

Hi @E1cid,

The problem is that the device's API call is currently not working, so I can't provide an example. According to the manufacturer, this API call should return a JSON file with content similar to the lines posted above.

At the moment I open the WebUI of the device every week or so and click "download log file as csv".

This CSV contains the header line posted above followed by 100 lines of data. When new data is generated, the oldest entry of the log file vanishes.
If possible, I would like to have this process automated.
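
Purely as a hypothetical sketch of what that automation could look like: assuming the device serves the CSV over HTTP at some URL (the one below is made up), a daily inject node feeding an http request node would do the download inside Node-RED, or an exec node could run something like:

#!/bin/sh
# Hypothetical daily download-and-merge, e.g. triggered once a day
# by an inject node -> exec node. The URL and both paths are made-up
# placeholders; check the device documentation for the real endpoint.

curl -fsS -o /data/download.csv "http://charger.local/log.csv" || exit 1
/data/merge.sh   # the dedupe/append step, e.g. the awk sketch above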

I think I will experiment with your example to get a feel for it. Thanks a lot!

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.