Adding a "reduce" option to the Smooth node

It probably would, but as usual it's a choice between an extra specialist node or trying to do it with basic ones :slight_smile:

I went for the easy option on this one

Agreed. I was just building onto the original request with some similar issues I've seen...

As Simon mentions, I've also used node-red-contrib-aggregator to do this (until I wrote my own custom node). But I would still prefer a core node that could "pare down" my data, similar to what was proposed, but with a more generic field-based (or even JSONata expression) way to do the grouping...


... although now that I put it that way, perhaps it makes more sense for this "grouping" to be an option on the join node... hmmm

Of course, now that I look at it, the join node already has a "reduce sequence" feature, which will probably do what I want:

Reduce Sequence mode

When configured to join in reduce mode, an expression is applied to each message in a sequence and the result accumulated to produce a single message.
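As a worked example of reduce mode: to average the payloads in a sequence, the join node's reduce settings can be filled in roughly like this (`$A` is the accumulated value and `$N` the number of messages in the sequence):

```
Reduce expression:  $A + payload
Initial value:      0
Fix-up expression:  $A / $N
```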

I had already looked at node-red-contrib-aggregator, but it suffered from the same problem as @Colin's suggestion (using the Delay node with rate limiting): messages are averaged over time rather than by message count. I need to take N messages and get an average (or minimum, or maximum) of those, regardless of when they arrived, and I need that behaviour to be perfectly linear and predictable. The fact that the Smooth node already does 99% of this, and could quite easily be modified to add the missing piece without any chance of breaking existing flows, personally makes this seem like a good solution to me. But I'm probably an edge case - I'll see if I can figure out how to pull in this node independently from the other node-red-nodes from my own fork for now. Maybe the ultimate answer is to create a completely new "Reduce" function node for this particular behaviour.
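In the meantime, the count-based behaviour described above can be done in a Function node. This is a minimal sketch in plain JavaScript (the function, mode names, and batch size `N` are all hypothetical, not part of any existing node); in an actual Function node the buffer would be stored via `context.get()`/`context.set()` rather than a closure:

```javascript
// Reduce every N inputs into one output, by count rather than by time.
// mode: "mean", "min", or "max" (hypothetical option names).
function makeReducer(n, mode = "mean") {
  let buffer = [];
  return function onMessage(value) {
    buffer.push(value);
    if (buffer.length < n) return null; // keep accumulating, emit nothing
    let result;
    if (mode === "mean") {
      result = buffer.reduce((a, b) => a + b, 0) / buffer.length;
    } else if (mode === "min") {
      result = Math.min(...buffer);
    } else {
      result = Math.max(...buffer);
    }
    buffer = []; // start the next window of N messages
    return result; // one reduced value per N inputs
  };
}

// Example: average every 3 readings
const avg3 = makeReducer(3, "mean");
avg3(10); // null (accumulating)
avg3(20); // null (accumulating)
avg3(30); // 20
```

Because the window closes on message count alone, the output is linear and predictable regardless of arrival times, which is the behaviour the time-windowed aggregator can't give.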

I think it is as well


The Calculate node does what you want.

I use it.
