Hi,
thanks to the forum I started with Node-RED some months ago.
I collect data from different "smart things"/IoT devices and write it to InfluxDB, and some data to files.
For easier handling I started writing small JSON objects: every time data is collected,
my flow appends one line to the file, like this:
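The original sample isn't shown here, but such a file would look roughly like this; the field names and values below are made up, just to illustrate the one-JSON-object-per-line layout:

```
{"time":"2021-03-01T06:00:00Z","sensor":"livingroom","temperature":21.4}
{"time":"2021-03-01T06:05:00Z","sensor":"livingroom","temperature":21.6}
{"time":"2021-03-01T06:10:00Z","sensor":"livingroom","temperature":21.5}
```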
As each line is valid JSON, I would just read the file one line at a time and process it on the fly. If you turn it into one large object then (depending on size, of course) you may start to run into memory issues holding and manipulating it all at once.
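A minimal sketch of that idea in plain Node.js (the file name is made up, not from the original flow):

```
// Read the log file and process each JSON line on the fly,
// without ever building one large object in memory.
const fs = require('fs');
const readline = require('readline');

const rl = readline.createInterface({
  input: fs.createReadStream('sensor-log.json'),   // hypothetical file name
  crlfDelay: Infinity
});

rl.on('line', (line) => {
  if (!line.trim()) return;          // skip blank lines
  const record = JSON.parse(line);   // each line is a complete JSON object
  // ... aggregate or forward `record` here
});
```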
Or, if it is a one-time thing, open the file in an editor, replace newline "\n" with comma-newline ",\n", and finally add square brackets to the start and end of the file. Should be a one-minute effort.
If the data are already being written to InfluxDB then perhaps you don't need the files at all. You could instead read the data back out of InfluxDB when you need it, which would be much easier and quicker than handling the files yourself.
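For example, with node-red-contrib-influxdb a function node in front of an "influxdb in" node could set the query to run; this is only a sketch, assuming that node accepts msg.query, and the measurement and field names are placeholders:

```
// Function node: build a query for a following "influxdb in" node.
// "livingroom" and "temperature" are made-up names for illustration.
msg.query = 'SELECT "temperature" FROM "livingroom" WHERE time > now() - 7d';
return msg;
```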
Thank you so much for all the input and ideas;
due to a lot of sick leave in our office I was not able to take the time to answer.
Sorry for that!
@dceejay
The point about memory is a strong argument - I had never thought about it before. In this case, though, the daily stats file stays at around 200 lines per day and the lifetime log is one line per day, so it should be fine for the next 5-10 years, at least I guess. And in ten years even the Raspis will handle that.
@cameo69
That was the solution I used: in JS I did it with a regexp. I had to tweak it, but now it works
(I had to make sure the last newline is not replaced with a comma :-)).
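The original regexp isn't shown, but one way to do it in JS might look like this; the trailing-newline handling is the tricky bit mentioned above, and `text` is assumed to hold the whole file:

```
// text: one JSON object per line, possibly with a trailing newline.
const arrayJson =
  '[' +
  text
    .replace(/\r?\n+$/, '')       // drop the trailing newline(s) first ...
    .replace(/\r?\n/g, ',\n') +   // ... then turn the remaining newlines into ",\n"
  ']';
const records = JSON.parse(arrayJson);
```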
@Colin
I am not really familiar with Influx yet - I have collected a lot of data over the last 4-5 months
and had to move my Raspi to a VM. I am looking forward to doing great analytics and visualization ... some day.
@all
The data from Node-RED is produced locally and is not exposed on the web.
I write the files as a safety copy of the historical data. If I break anything, I at least still have my files.
Also, changes in Node-RED or Influx should not cause errors in the web project, or vice versa.
Additionally, I cannot access any ports of Influx or Node-RED from the web project where I "visualize" the data. So I copy the files via cron over several hops every hour to the website - this works fine for me and keeps both projects independent.
I read the file in JavaScript as a text file and then "handle" it.
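For completeness, a rough sketch of that last step, assuming the copied file is served next to the web page; the path is made up:

```
// Fetch the copied log file as plain text and parse it line by line.
fetch('data/daily-stats.json')                  // hypothetical path on the website
  .then((res) => res.text())
  .then((text) => {
    const records = text
      .split('\n')
      .filter((line) => line.trim() !== '')     // ignore blank/trailing lines
      .map((line) => JSON.parse(line));
    // ... feed `records` into the charts / visualization
    console.log(records.length, 'records loaded');
  });
```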