Read messages from a file and process them based on time

Hi there, I have understood and successfully tested how to store temperature sensor data in a file (just timestamp and temperature).

The issue is that the sensor sends data irregularly, only when the temperature changes. If it does not change, it does not send anything. But what I want is to send Telegram messages with the current temperature, let's say exactly once every 4 hours, and also to have historical data to fill a chart later.

So I have a file from which I need to read the last record/message (when the time comes) and send it. Later I would also need a function to read the value that was valid at a given time, let's say yesterday 12:00: I would need to find the last record before yesterday 12:00 and read that.

Any good examples of how to do this? (I guess just reading the last record is the easier part.)
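
Something like this is what I have in mind for a function node, assuming each line of the file is JSON like {"time":1602777398269,"temp":19.2} and the "file in" node delivers the whole file as one string (the property names and msg.cutoff are just my guesses):

```javascript
// Assumed input: msg.payload is the whole file as one string, one JSON
// record per line, e.g. {"time":1602777398269,"temp":19.2}.
const cutoff = msg.cutoff || Date.now(); // e.g. yesterday 12:00 as epoch ms

const records = msg.payload
    .split("\n")
    .filter(line => line.trim().length > 0)
    .map(line => JSON.parse(line));

// Records are appended in time order, so the last one at or before the
// cutoff is the value that was valid at that moment.
const before = records.filter(r => r.time <= cutoff);
msg.payload = before.length ? before[before.length - 1] : null;
return msg;
```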

When you write the last temperature change to the file, why not also set it in context storage? Then every 4 hours you just read the temperature from context.
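
Roughly like this; the context key "lastTemp" and the message text are just placeholders:

```javascript
// In the path of every sensor message: keep writing to the file as
// before, but also remember the latest reading in flow context.
flow.set("lastTemp", { time: Date.now(), temp: msg.payload });
return msg;
```

Then a second function node, fired by an inject node set to repeat every 4 hours, reads it back and passes it on towards the Telegram node:

```javascript
// Read the latest stored reading and turn it into a message.
const last = flow.get("lastTemp");
if (!last) return null; // nothing stored yet
msg.payload = "Current temperature: " + last.temp;
return msg;
```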

Storing your data in a database would be better, and easier to retrieve with SQL queries, whether time-based or for returning data for charts etc.

It is a possibility, but maybe I want to use the data later with other programs, and I also don't want to lose the data in case of a power failure. Do I really need a database and queries? Seems a bit overkill :slight_smile:

I can read from the file into objects (I found the file-in node), but it gives me all the objects on one trigger hit...

You still write to the db or file, just keep the last temp locally in context. Context storage can survive a power failure; just edit the settings in the settings.js file.
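
That is the standard contextStorage option in settings.js; with the built-in localfilesystem module the context is written to disk:

```javascript
// In settings.js: make the default context store file-based so that
// flow/global context survives a restart or power failure.
contextStorage: {
    default: {
        module: "localfilesystem"
    }
},
```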

Using a database has less overhead when reading data, as you do not have to read all the data into memory. Also, searching for date-specific data is easier, as the database does all the work. On a small scale this makes little difference.

You could probably search using JSONata and return the data you need from your file object.

The context/flow/global variables are the ones to use for processing data if you can't process it immediately when it is sent to you.

If you want to process data over time, you will likely want to use a database. InfluxDB is a timeseries database that is ideal for this and is widely used in the Node-RED community.

I don't know about good, and I don't know what your data structure looks like after you have read your file into an object.

In this simple example you would need to convert date-times to a timestamp; then, with a change node and a JSONata expression, you can filter your object for any entry whose timestamp falls within the time range.

e.g.

[{"id":"e17e1e74.22281","type":"inject","z":"8d22ae29.7df6d","name":"","props":[{"p":"payload"},{"p":"daterange","v":"[  1602777290000,1602777320000]","vt":"json"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"[{\"time\":1602777398269,\"temp\":19.75},{\"time\":1602777398269,\"temp\":19.03},{\"time\":1602777383869,\"temp\":18.82},{\"time\":1602777383869,\"temp\":18.97},{\"time\":1602777383869,\"temp\":19.68},{\"time\":1602777383869,\"temp\":18.7},{\"time\":1602777383869,\"temp\":19.43},{\"time\":1602777383869,\"temp\":19.48},{\"time\":1602777383869,\"temp\":19.17},{\"time\":1602777383869,\"temp\":19.2},{\"time\":1602777369469,\"temp\":18.44},{\"time\":1602777369469,\"temp\":17.95},{\"time\":1602777355069,\"temp\":17.16},{\"time\":1602777355069,\"temp\":16.55},{\"time\":1602777340669,\"temp\":17.33},{\"time\":1602777326269,\"temp\":17.12},{\"time\":1602777311869,\"temp\":17.57},{\"time\":1602777297469,\"temp\":17.63},{\"time\":1602777283069,\"temp\":18.27},{\"time\":1602777283069,\"temp\":18.56}]","payloadType":"json","x":110,"y":660,"wires":[["85fe24b1.2b4f9"]]},{"id":"85fe24b1.2b4f9","type":"change","z":"8d22ae29.7df6d","name":"","rules":[{"t":"set","p":"payload","pt":"msg","to":"$filter(\t   payload,\t   function($v, $i){$v.time > daterange[0] and $v.time < daterange[1]}\t)","tot":"jsonata"}],"action":"","property":"","from":"","to":"","reg":false,"x":310,"y":660,"wires":[["ca18810d.3ed83"]]},{"id":"ca18810d.3ed83","type":"debug","z":"8d22ae29.7df6d","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","statusVal":"","statusType":"auto","x":610,"y":600,"wires":[]}]

This injects an array of 20 entries and filters for the values that lie between the daterange set in the inject node.

Interesting, thank you all! The more I think about it, the more I think the database idea is better. I know SQL, and the database likely also handles the time conversion for me.

So I could just query something like "SELECT TOP 1 FROM ... WHERE Timestamp < (yesterday 12:00) ORDER BY Timestamp DESC" in InfluxDB? Or would I need something like SQLite?

Yes, similar to that.

Not sure about the timestamp format; it may be better to use the ISO standard.
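
In InfluxQL there is no TOP; the equivalent is ORDER BY time DESC LIMIT 1. Off the top of my head, something like this in a function node feeding the influxdb node (the measurement and field names are guesses, and mind the timezone, as toISOString() gives UTC):

```javascript
// Build "yesterday 12:00" and ask for the last reading at or before it.
const d = new Date();
d.setDate(d.getDate() - 1);
d.setHours(12, 0, 0, 0); // local time; toISOString() below converts to UTC

msg.query = "SELECT temp FROM temperature" +
            " WHERE time <= '" + d.toISOString() + "'" +
            " ORDER BY time DESC LIMIT 1";
return msg;
```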

If I understand correctly, I need to install it first (on the Raspberry Pi) to use the InfluxDB node, same as with other DBs. So what is the advantage of using exactly this over SQLite, MariaDB/MySQL etc.?

Read: Time series database (TSDB) explained | InfluxData


InfluxDB is designed specifically for timeseries data. While you can handle that kind of data in a SQL db, it is much harder to work with.

For example: my IoT sensors output about once every 50 seconds and I feed them directly into the db. InfluxDB adds the timestamp for me. But when I want to look at or monitor the data, I want those input times normalised; let's say I want to see a minute-by-minute chart of all of my temperature sensors. With InfluxDB this is simple and I don't really need to worry about timestamps. When I want to see the same data by hour, all I need to change is the timespan, and InfluxDB works everything out for me. In SQL I would have to do the calculations myself.
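
As a rough sketch of what I mean (the measurement and field names are made up), the only thing that changes between the minute and hour views is the GROUP BY interval:

```javascript
// Per-minute means for the last 24 hours; switch time(1m) to time(1h)
// and InfluxDB re-buckets everything itself.
msg.query = "SELECT mean(temp) FROM temperature" +
            " WHERE time > now() - 24h" +
            " GROUP BY time(1m) fill(previous)";
return msg;
```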

Also, Influx is very efficient at storing timeseries data, and it has some excellent features for trimming data and aggregating it for the long term. My per-minute data is kept for 7 days and automatically trimmed for me, but it is also automatically accumulated into per-hour data (with max, min and avg readings added for me) which is kept for several years. So I can easily have a chart that shows me, for example, the avg temperature over the last 3 years without it chewing up loads of CPU and I/O as would happen on a SQL DB.
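
That trimming and accumulating is done with retention policies plus a continuous query; roughly like this (all the names here are made up), run once, e.g. from the influx CLI or through a query node:

```javascript
// One-off setup: fold raw readings into hourly min/mean/max rows stored
// under a longer retention policy ("sensors", "long_term" etc. are
// placeholder names for this sketch).
msg.query =
    'CREATE CONTINUOUS QUERY "cq_temp_hourly" ON "sensors" BEGIN' +
    ' SELECT min(temp), mean(temp), max(temp)' +
    ' INTO "long_term"."temperature_hourly"' +
    ' FROM "temperature" GROUP BY time(1h)' +
    ' END';
return msg;
```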

Thanks, sounds good; now I understand it better. Will try it in a few days.
