Use of the Smooth Node to process Array Output

I have temperature data in a database and I want to display this over differing time periods.

A query runs (on startup or manually) and the output is produced which feeds chart nodes with historic data. The query returns all the data for the last 24 hours and reduced data beyond that.

I am charting the 24 hour portion of the data, but the result is that 'jitter' shows up and makes the chart look noisy, owing to the data values changing by a small amount between readings. This is not noticeable where the y scale on the chart is large, but becomes apparent when the scale is reduced.

This chart shows the data from the database with the 'noise', and a subsequent real-time expansion of this data (in green) which was passed through a smooth node.

However, when I take the output of the array and feed this to the smooth node, the jitter is not filtered. I tried a payload to number conversion before the smooth node but that didn't work.

So I'd like to know how to smooth the output of this array and then send that to a chart node.
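(Editor's note for context: the smooth node filters successive *messages*, not the elements inside one array payload, which is why feeding it the whole query result has no effect. A minimal sketch of smoothing the array itself inside a single function node might look like this — `smoothArray` and the `alpha` value are illustrative assumptions, not part of any standard node:)

```javascript
// Exponential moving average over an array of readings.
// alpha in (0, 1]: smaller alpha = heavier smoothing.
function smoothArray(values, alpha) {
    let prev = values[0];
    return values.map(v => (prev = alpha * v + (1 - alpha) * prev));
}

// In a Node-RED function node this might be used as:
//   msg.payload = smoothArray(msg.payload, 0.5); return msg;
console.log(smoothArray([20, 21, 20, 22, 20], 0.5));
```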



Hi .. what if you set your ui_chart setting to


does that smooth the line ?

Thanks for the suggestion, but nope, that doesn't make an obvious difference.

I do know that I could smooth a copy of the incoming data, write that to another database table and use that for the charting. Ideally I'd like to just save the raw data and then process that when needed.

Can you share a small sample of the data (raw data from the query preferably)?

Strange that the interpolation didn't make a difference.
Have you refreshed the browser or restarted NR to clear the previous points ?

I see that you have 4936 points .. and that's for a 24h period ?


maybe you can reduce the frequency of your INSERTS to the db instead of trying to filter them out or smooth them later :wink:

Yep, I cleared the previous and re-injected the query.

Nope, the query is

SELECT unixtime, temperature from table Where ROWID % 10 = 0 or myDate > datetime('now','-24 hours');

This returns every record for the last 24 hours and 'sparse' data beyond that.
I don't think that the problem lies with the query.

SELECT unixtime, temperature from table;

gives the same problem with the chart but is slower.

I am capturing the data every 35s and writing it all to the database. That side of things seems to be fine, and the database limits records to 200 hours so is only 2.5MB.

There should be a way of smoothing data extracted from a database - I just need to get my head around how to do it!

It's not a problem over longer time periods, simply because the range of values on the y axis tends to be greater, so the effect of this 'jitter', although present, is not as readily visible.

apparently there is some problem with the query because it still produces too many points :wink:
which is the whole reason that you cleverly used that Where ROWID % 10 = 0 .. but that doesn't filter only the row IDs outside the last 24h .. it filters the IDs of all rows of the whole table, no ?

and why use OR .. shouldn't that be AND ?

possibly you should be doing like a nested sql query
first select the 24h
then filter that result further with ROWID % 10 = 0

ps. don't know how it's done

maybe like this ?
SELECT * FROM table WHERE myDate IN ( SELECT myDate FROM table WHERE myDate > datetime('now','-24 hours') ) AND ROWID % 10 = 0

SELECT t.* FROM ( SELECT * FROM table WHERE myDate > datetime('now','-24 hours') ) as t WHERE t.ROWID % 10 = 0

Don't know if it's the most efficient way, but smoothing as post-processing can be done:

[{"id":"8fc59c20967dec7b","type":"rbe","z":"afcfe6f7a144f0b4","name":"","func":"deadbandEq","gap":"0.5","start":"","inout":"out","septopics":false,"property":"payload.y","topi":"topic","x":240,"y":360,"wires":[["b6aefe5a1d3be161","00b92acddc7f09b1"]]},{"id":"1e23bfae06dabb9a","type":"inject","z":"afcfe6f7a144f0b4","name":"","props":[{"p":"payload"},{"p":"topic","vt":"str"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"test","payloadType":"date","x":140,"y":200,"wires":[["6ad8a978a5fe9a9d"]]},{"id":"6ad8a978a5fe9a9d","type":"function","z":"afcfe6f7a144f0b4","name":"data","func":"let data = []\nfor (let index = 0; index <20; index++) {\n    data.push({y:Math.random()*2,x:new Date().getTime()+index*1000})\n    \n}\nmsg.payload = data\nreturn msg;","outputs":1,"noerr":0,"initialize":"","finalize":"","libs":[],"x":150,"y":240,"wires":[["1e3f0bc130d29a94","2976984496eee764"]]},{"id":"1e3f0bc130d29a94","type":"debug","z":"afcfe6f7a144f0b4","name":"RAW","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"payload","targetType":"msg","statusVal":"","statusType":"auto","x":410,"y":240,"wires":[]},{"id":"ee71b5ec3bb79601","type":"debug","z":"afcfe6f7a144f0b4","name":"SMOOTH","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","statusVal":"","statusType":"auto","x":420,"y":400,"wires":[]},{"id":"2976984496eee764","type":"split","z":"afcfe6f7a144f0b4","name":"","splt":"","spltType":"str","arraySplt":1,"arraySpltType":"len","stream":false,"addname":"","x":170,"y":280,"wires":[["99203a7b3115cccd"]]},{"id":"b6aefe5a1d3be161","type":"join","z":"afcfe6f7a144f0b4","name":"","mode":"custom","build":"array","property":"payload","propertyType":"msg","key":"topic","joiner":"\\n","joinerType":"str","accumulate":false,"timeout":"","count":"","reduceRight":false,"reduceExp":"","reduceInit":"","reduceInitType":"num","reduceFixup":"","x":270,"y":400,"wires":[["ee71b5ec3bb79601"]]},{"id":"00b92acddc7f09b1","type":"debug","z":"afcfe6f7a144f0b4","name":"FILTER","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","statusVal":"","statusType":"auto","x":420,"y":360,"wires":[]},{"id":"99203a7b3115cccd","type":"function","z":"afcfe6f7a144f0b4","name":"add complete","func":"if (msg.parts.index == msg.parts.count - 1) {\n    msg.complete = true\n}\nreturn msg;","outputs":1,"noerr":0,"initialize":"","finalize":"","libs":[],"x":220,"y":320,"wires":[["8fc59c20967dec7b"]]}]
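That flow splits the array, runs each point through an rbe node in deadband mode (gap 0.5), and joins the survivors back into an array. The same deadband idea in plain JavaScript, for anyone who'd rather keep it in one function node (`deadband` is just an illustrative helper, not a built-in):

```javascript
// Drop points that differ from the last *kept* point by less
// than `gap` — the deadband behaviour of the rbe/filter node.
function deadband(values, gap) {
    const kept = [values[0]];
    for (const v of values.slice(1)) {
        if (Math.abs(v - kept[kept.length - 1]) >= gap) kept.push(v);
    }
    return kept;
}

console.log(deadband([20, 20.1, 20.2, 20.8, 20.9, 21.5], 0.5));
```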

The query works as intended. It pulls out every record for the last 24 hours, plus every record whose ROWID ends in '0' before that, and is used to pre-populate the temperature charts over different time periods. As long as the period is large, sampling the data like this doesn't really distort understanding. Equally, the data could be averaged down to a reduced number of points, but this approach is simple and quick, yielding data points at roughly 5-minute intervals.

Here's the results of the query at around 24 hours ago showing what I'm describing ...

How many records are these ?

and the result of that query doesn't seem to produce an evenly spread out result

Same minute

Same minute

What you're noticing is simply that the outstation sends data at just over 30s, so as time progresses this drift is seen in the stored data. I could tweak the outstation to exactly 30s but it doesn't really matter.

ok but you didn't tell us how many records there were for the 24 hour query you showed above
and how that shows on the chart .. is it jittery ?

i saw you edited your post :wink:
what's wrong with doing separate queries depending on the time period selected ?
if it's yearly, do a funky SQL query to average the months
if it's monthly, do an average of each day
(SQL grouping with AVG())

and a separate chart for the real-time data

Yes, sorry, I re-read your question and realised that I'd not answered what you actually asked!

Absolutely nothing. It makes no difference to the problem here which is the noisy nature of the data.

Here's a chart for 24 hours made with a 24 hour query. There are approximately 2360 records per 24 hours.

Can you copy the msg with the data using the Copy value button when you hover over the msg in the Debug window .. and paste it in a .txt file and upload it so we can have some data to test ?

You could possibly extract a rolling average from your database, something like this - average of 5 records around the selected ones:

SELECT a.unixtime, sum(b.temperature)/count(b.temperature) as avtemp FROM table a, table b WHERE ABS(b.ROWID - a.ROWID) < 3 AND (a.ROWID % 10 = 0 or a.myDate > datetime('now','-24 hours')) GROUP BY a.unixtime;

The AND/OR logic is surely wrong at this first attempt.
Not sure if you can use ROWID like that (trying it out in my MariaDB, I linked the tables with an actual auto-increment field).
Also round the result as appropriate.

PS I don't see anything wrong with your SQL, but if you have 200 hours of data at 35-second intervals there should be roughly 4300 records selected rather than 4900?
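For what it's worth, the arithmetic behind that estimate (assuming the 35 s capture interval and 200 hour retention stated above) works out like this:

```javascript
// Back-of-envelope row count for the sparse query:
// every row in the last 24 h, every 10th row before that.
const interval = 35;                                // seconds between readings
const total    = Math.floor(200 * 3600 / interval); // rows in 200 h of data
const last24h  = Math.floor(24 * 3600 / interval);  // rows kept in full
const older    = Math.floor((total - last24h) / 10);// every 10th older row

console.log(total, last24h, older, last24h + older);
```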

That's what I was just working on...!

select unixtime, avg(temperature) OVER (ORDER BY myDate ROWS BETWEEN 5 PRECEDING AND CURRENT ROW) as temperature from table WHERE myDate > datetime('now','-24 hours');

Which also has the advantage of off-loading the task from Node Red to sqlite.
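To see what that window frame does outside SQL, the same trailing average (each row averaged with up to the 5 preceding rows) can be sketched in plain JavaScript — `rollingAvg` is just an illustrative helper:

```javascript
// Trailing rolling average: each output value is the mean of the
// previous `window` readings, inclusive of the current one.
// window = 6 matches "ROWS BETWEEN 5 PRECEDING AND CURRENT ROW".
function rollingAvg(values, window) {
    return values.map((_, i) => {
        const slice = values.slice(Math.max(0, i - window + 1), i + 1);
        const sum = slice.reduce((a, b) => a + b, 0);
        return Math.round((sum / slice.length) * 100) / 100; // 2 dp
    });
}

console.log(rollingAvg([20, 22, 24, 26], 3));
```

Note that near the start of the data the window is necessarily shorter, which is also how the SQL frame behaves for the first few rows.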

I have been trying the approach outlined by hotNipi but I've not got this working yet. But I like the approach of doing this in the query.

And needs a round applied too ...

Good grief, is that actual SQLite syntax?
Thank goodness MySQL uses vaguely similar syntax to what I learned decades ago.

Well, it works ! :slight_smile:

For completeness, the solution I used was to do the work in the query.

select unixtime, round(avg(temperature) OVER (ORDER BY myDate ROWS BETWEEN 5 PRECEDING AND CURRENT ROW), 2) as temperature from table WHERE myDate > datetime('now','-24 hours');


Sorry about the delay - got sidetracked with the SQL approach ...

19JAN22_test_data.txt (29.8 KB)