Parsing repeated incoming data

I'm new to NR so this has probably been asked/answered many times. If so and there is a sticky I've missed, I apologize and hope someone will direct me to the right thread.

High level question:
If I have data coming into NR that is repeated, I don't want to store all of it and waste database space for no benefit. Is there a recommended method for only utilizing data that is at least 30 seconds old, for instance?

I have an RPi that I plan to use to receive data from various sensors transmitting on 433 MHz via rtl_433. The messages are received and decoded into meaningful data by rtl_433 and then sent to an MQTT broker as individual values for each data type (e.g. temperatureF=60, humidity=99, etc.). I have NR set up to take each message and write the value to an InfluxDB database. 433 MHz sensors generally transmit 3 copies of the same data in case of transmission error, so I see 3 temperature values recorded within 1 second. The next set comes ~30 seconds later. I don't need all 3; just the first one is sufficient.

So what I need help with is how to handle the extra data so I don't blow up my database to 3x the size and writes for no benefit. Later on I also need to figure out how to make a short-term database (last 7 days at 1-minute resolution) versus a long-term one (all data at 1-hour resolution), but I assume that would be further down the road in another topic.

I'm not really getting this part:

You don't want to store repeated data.
You want to filter data to reduce the size stored.
Yet you say: ... for only utilizing data that is at least 30 seconds old
All incoming data will become that.

Anyway, may I suggest the RBE node. It may go some way to doing what you really want to do.

I said 30 seconds as an example because most 433 sensors I am currently using or considering in the future transmit 1-2x per minute but send the same data several times. The ones I am using transmit with a gap of between 20 and 40 seconds and provide 3 redundant packets depending on sensor/type. The goal is to reduce storage of redundant data.

I'll check the RBE node. Thanks!

Based on your suggestion, I tried a few search terms in the palette manager beyond RBE. I located a few that look promising including one for rtl_433. I don't know which will work, but it's worth a try to install and check them out.

The standard delay node, set to rate limit each topic and placed after your mqtt node, should only allow 1 message per set time.


If I understand this node correctly, it takes data in and holds onto it as a stored value, outputting whatever is stored each time the 20-second interval is reached. If the data arrives at roughly 20-second intervals, this would allow exactly one instance per transmission even if a burst of 3+ messages comes in at the same time.

If that's true, then it's important to choose the output interval carefully so that the output isn't misrepresented by either duplicating or skipping results (assuming high fidelity is important).

If there is no method to only output the first message, this seems like a viable option and is easily incorporated!

It takes the first bit of data and passes it on; any other data that arrives within the 20 seconds is discarded. After the 20 seconds, the first bit of data that arrives is passed on, and so on.
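That rate-limit behavior can be sketched in plain JavaScript. This is only an illustration of the logic described above, not the delay node's actual implementation, and the 20-second interval is the example value from this thread:

```javascript
// Sketch of rate limiting with "drop intermediate messages":
// pass the first message, then drop anything until the interval elapses.
const INTERVAL_MS = 20000; // example interval from the discussion above
let lastEmitted = -Infinity;

function rateLimit(msg, now) {
  if (now - lastEmitted < INTERVAL_MS) return null; // inside window: drop
  lastEmitted = now;
  return msg; // first message after the window opens passes through
}

// Simulate a 3-packet burst, then the next burst ~30 seconds later:
const passed = [];
for (const t of [0, 300, 600, 30000, 30300]) {
  if (rateLimit({ payload: 60 }, t) !== null) passed.push(t);
}
console.log(passed); // → [ 0, 30000 ] — one message per burst
```

Note the window here is anchored to the last *emitted* message, so if the sensor's cadence drifts relative to the interval, a burst can land just inside the window and be dropped entirely, which is the fidelity concern raised above.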

From what I'm seeing, it appears that all 3 MQTT messages get processed (they show up in the debug window), and then the single filtered message is sent. The node starts a counter that increases until it reaches the specified limit and then sends the same message again.

I did some more searching based on these suggestions and think I found the right node. node-red-contrib-deduplicate seems to pass the first message and filter the duplicates, as defined by the expiry time. If I set this to 5 seconds, and no burst of packets lasts longer than 1 second, there is no way I should miss data. After quick testing, it does appear to be working.
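The deduplication behavior described here can be sketched as follows. This is an illustrative approximation, not the node's actual code, and the names (`dedupe`, `seen`) are made up for the example; the 5-second expiry matches the setting above:

```javascript
// Sketch of payload deduplication with an expiry window:
// drop a message if the same payload was seen within EXPIRY_MS.
const EXPIRY_MS = 5000; // expiry value used in the post above
const seen = new Map(); // payload (as JSON string) -> time last seen

function dedupe(msg, now) {
  const key = JSON.stringify(msg.payload);
  const prev = seen.get(key);
  seen.set(key, now); // refresh the timestamp for this payload
  if (prev !== undefined && now - prev < EXPIRY_MS) return null; // duplicate
  return msg; // first occurrence (or expired): pass through
}

// Simulate a burst of 3 identical packets, then the next cycle 30 s later:
const kept = [];
for (const t of [0, 400, 800, 30000]) {
  if (dedupe({ payload: { temperatureF: 60 } }, t) !== null) kept.push(t);
}
console.log(kept); // → [ 0, 30000 ] — duplicates within the burst dropped
```

Unlike a fixed-cadence rate limit, this keys on the payload itself, so a genuinely new reading arriving inside the window would still pass, which is why the approach fits bursty redundant transmissions well.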

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.