Newbie question: appending to a JSON file, best practice?

Thanks to the forum, I started with Node-RED some months ago.
I collect data from different "smart things"/IoT devices and write it to InfluxDB, and some data to files.

For easier handling I started writing small JSON objects like this; every time data is collected,
my flow appends one line to the file:

{ "timestamp" : "1679028000001", "power" : "23" }
{ "timestamp" : "1689028000001", "power" : "21" }
{ "timestamp" : "1779028000001", "power" : "27" }

Now the files contain some nice data lines.

To use them in some JS scripts (offloaded somewhere else), what would be best practice:

  • make the flow write correct JSON in the first place:
[ {....},{....},{....} ]
    (I could not find an easy way to write this with a flow?)
  • or should I just make my JS read the text file and transform it into a correct data array?

Thank you in advance for any tips or opinions.
Best regards

As each line is correct JSON, I would just read the file one line at a time and process it on the fly. If you make it into one large object, then (depending on size, of course) you may start to run into memory issues holding and manipulating it all at once.

Or, if it is a one-time thing, open the file in an editor, replace each newline "\n" with comma-newline ",\n", and finally add square brackets to the start and end of the file. Should be a one-minute effort.

If the data are already being written to InfluxDB, then perhaps you don't need the files at all. You could instead read the data back out of InfluxDB when you need it, which would be much easier and quicker than handling the text files yourself.


Thank you so much for all the input and ideas.
Due to a lot of sick leave in our office, I was not able to take the time to answer.
Sorry for that!

The point about memory is a big argument; I never thought about it before. In this case, though, I will keep the file size at about 200 lines per day for daily stats, and the lifetime log is one line per day, so it should be OK for the next 5-10 years, at least I guess. And in ten years even the "Raspis" will handle that.

In the end, the solution I used in JS was a regexp. I had to tweak it, but now it works
(I had to make sure the last newline is not replaced with a comma :-)).

I am not familiar with Influx right now; I collected too much data within the last 4-5 months
and had to move my Raspi to a VM. I am looking forward to doing great analytics and visualization ... some day.

The data from Node-RED is produced locally, not on the web.
I write the files for historical data safety. If I break anything, I at least still have my files 🙂
Also, changes within Node-RED or Influx should not result in web-project errors, or vice versa.
Additionally, I cannot access any ports of Influx or Node-RED from the web project where I "visualize" the data. So I copy the files via cron over several hops every hour to the website; this works fine for me and keeps both projects independent.

I read the file within JavaScript as a text file and then "handle" it:

// Strip trailing newline(s), drop blank lines, then join the remaining lines with commas:
let jsonArray = "[" + data.replace(/\n+$/, "").replace(/^\s*\n/gm, "").replace(/\n/g, ",") + "]";
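An alternative that sidesteps the trailing-newline tweaking entirely (just a sketch; `data` is assumed to be the file contents as a string, as above) is to split, filter, and parse each line, which also gives you a real array of objects instead of a JSON string:

```javascript
// Parse line-per-record JSON text into an array of objects,
// regardless of blank lines or a trailing newline.
function parseLines(data) {
  return data
    .split("\n")
    .filter(line => line.trim() !== "") // ignore empty lines anywhere
    .map(line => JSON.parse(line));     // each line is one JSON object
}

// Usage: const records = parseLines(data);
```

Parsing each line separately also means one malformed line throws a clear error for that line, rather than making the whole concatenated string unparseable.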

I really like this forum! Thank you once again for your time and input; it helped me a lot to make my decision.
Best regards


This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.