Hello there everyone,
I am rather new to Node-RED, so please bear with me.
What I have:
I've got a Node-RED flow running on a Unipi Neuron M503 that logs sensor values every second to an InfluxDB measurement. This works fine via the standard contrib-influxdb nodes.
What I want to do:
I want to extract the data as a CSV file at more or less recurring intervals. Triggering this manually with a button click would be fine, preferably by selecting a start time and extracting the data up to "now".
Problem:
When I use the standard contrib-influxdb node, all the data is held in RAM and my Unipi freezes.
Preferred solution:
As I have read in the InfluxDB documentation, there is an option to read the data in chunks of e.g. 20k points and write them to the CSV file. I have implemented this via a curl command and am able to create CSV files over an SSH connection.
This is an example curl command:
curl -G 'http://localhost:8086/query?u=admin&p=password' --data-urlencode "db=mydatabase" --data-urlencode "chunked=true" --data-urlencode "chunk_size=20000" --data-urlencode "q=SELECT * FROM rawvalues WHERE time > now() - 100d AND time <= now()" -H "Accept: application/csv" > /media/usb/unipi75-100d.txt
I would like to get this working with an HTTP Request node, but I am having trouble understanding the right syntax. I have been googling for several days, tried to understand some examples, and also looked at the Node-RED cookbook, but I am simply out of my depth.
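My rough idea so far is to put a Function node in front of the HTTP Request node and build the request there, roughly like the sketch below. The node setup (URL field left empty, method set to "- set by msg.method -") and the query-string handling are just my guess of how this should work, so please correct me if it is wrong:

```
// Function node placed in front of the HTTP Request node
// (HTTP Request node: URL field left empty, method set to "- set by msg.method -")

// Same parameters as in the curl command above
const query = "SELECT * FROM rawvalues WHERE time > now() - 100d AND time <= now()";

const params = [
    "u=admin",
    "p=password",
    "db=mydatabase",
    "chunked=true",
    "chunk_size=20000",
    "q=" + encodeURIComponent(query)
].join("&");

msg.method = "GET";
msg.url = "http://localhost:8086/query?" + params;

// Ask InfluxDB to return CSV, like the -H "Accept: application/csv" in curl
msg.headers = { "Accept": "application/csv" };

return msg;
```

I assume the response would then go from the HTTP Request node into a file node writing to /media/usb/unipi75-100d.txt. What I do not understand is whether the HTTP Request node actually handles the chunked response piece by piece or still buffers the whole thing in RAM, which would bring back the original problem.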
Any help would be greatly appreciated.
Thank you in advance,
Daniel