On-demand delay node

I have a sequence of task parameters.
For executing the task, I use a function node which caches the sequence in the node context and sends the first element of the cache on demand (via array.shift()).

So this function node works a bit like the delay node. I thought this could be an additional mode of the delay node. What do you think?


[{"id":"67fe4939cb96aaea","type":"function","z":"e6998ea910456df2","name":"cache","func":"let payload = msg.payload\n\n\nfunction send_element() {\n    let cache = context.get(\"cache\");\n    const topic = context.get(\"original_topic\") || msg.topic;\n    delete msg.original_topic\n    if (cache.length == 0) {\n        node.status({ fill: \"red\", shape: \"dot\", text: \"cache empty!\" });\n         context.set(\"original_topic\",undefined)\n        return\n    }\n    msg.payload = cache.shift()\n    msg.topic = topic\n    node.status({ fill: \"green\", shape: \"dot\", text: cache.length });\n    node.send(msg);\n}\n\n\n\nif (msg.topic==\"save\") {\n    context.set(\"cache\", payload);\n    context.set(\"original_topic\",msg.original_topic)\n    send_element()\n\n}\n\n\nif (msg.topic == \"get\") {\n    send_element()\n}\n\n\nif (msg.topic == \"delete\") {\n    context.set(\"cache\", []);\n    node.status({ fill: \"red\", shape: \"dot\", text: 0 });\n    return\n}\n\n\n\n","outputs":1,"timeout":0,"noerr":0,"initialize":"// Der Code hier wird ausgefĂĽhrt,\n// wenn der Node gestartet wird\nnode.status({ fill: \"red\", shape: \"dot\", text: \"cache empty!\" });","finalize":"","libs":[],"x":770,"y":4480,"wires":[["a45a8074af69afea"]]},{"id":"a45a8074af69afea","type":"delay","z":"e6998ea910456df2","name":"long_running_task","pauseType":"random","timeout":"5","timeoutUnits":"seconds","rate":"1","nbRateUnits":"1","rateUnits":"second","randomFirst":"1","randomLast":"5","randomUnits":"seconds","drop":false,"allowrate":false,"outputs":1,"x":630,"y":4600,"wires":[["59f3d2bd842b10f3","5b026e831436d2e0"]]},{"id":"59f3d2bd842b10f3","type":"change","z":"e6998ea910456df2","name":"get","rules":[{"t":"set","p":"topic","pt":"msg","to":"get","tot":"str"}],"action":"","property":"","from":"","to":"","reg":false,"x":550,"y":4480,"wires":[["67fe4939cb96aaea"]]},{"id":"67e03d32e8c4c72f","type":"inject","z":"e6998ea910456df2","name":"Add Tasks","props":[{"p":"payload"},{"p":"topic","vt":"str"},{"p":"original_topic","v":"original_topic","vt":"str"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"save","payload":"[1,2,3,4,5,6,7,8,9,10]","payloadType":"jsonata","x":540,"y":4380,"wires":[["67fe4939cb96aaea"]]},{"id":"5b026e831436d2e0","type":"debug","z":"e6998ea910456df2","name":"debug 382","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","statusVal":"","statusType":"auto","x":930,"y":4600,"wires":[]}]

Hi @kitori

Sorry, unless I'm misunderstanding something, this is already possible.
There are also Flows & Nodes developed to aid message control.

We always appreciate input, but unless I am missing something, the mechanics are already there for this, and ready to use?

According to my tinkering with this node, it is not possible to flush messages on a per-topic basis. I may have misunderstood the idea, though.


Do you want to create a queue? I cannot remember if Node-RED has a node for that, but the request is valid. I use a sort of queue for my messages.

Correct. The delay node can be used as a queue, but it only holds one array of messages, so you can’t flush on a per-topic basis (currently). That could be an enhancement, but I can’t think of how to implement it simply within the current node structure.

Would this work for you?

Or this one:

(Sorry)

Or this one which is a development of the second one I mentioned:

In the German translation, there is no hint that you can flush single messages >-<

So the delay node works perfectly fine if I set the delay high enough :slight_smile:


Assuming you are talking here about the delay node, the English help text is:
flush
If the received message has this property set to a numeric value then that many messages will be released immediately. If set to any other type (e.g. boolean), then all outstanding messages held by the node are sent immediately.

So messages can be flushed one at a time by setting msg.flush to 1.
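
For example, a Function node wired back into the delay node only needs to return a message with that property set (this is what the "Flush" node in the flow below does):

// Releases exactly one message held by the upstream Delay node.
// A non-numeric flush value (e.g. true) would release them all.
return { flush: 1 };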

This should do what you want

Open the Comment node to see more details of how to use it.

[{"id":"b6630ded2db7d680","type":"inject","z":"bdd7be38.d3b55","name":"","props":[{"p":"payload"},{"p":"topic","vt":"str"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"","payloadType":"date","x":140,"y":480,"wires":[["ed63ee4225312b40"]]},{"id":"ed63ee4225312b40","type":"delay","z":"bdd7be38.d3b55","name":"Queue","pauseType":"rate","timeout":"5","timeoutUnits":"seconds","rate":"1","nbRateUnits":"1","rateUnits":"minute","randomFirst":"1","randomLast":"5","randomUnits":"seconds","drop":false,"allowrate":false,"outputs":1,"x":310,"y":480,"wires":[["d4d479e614e82a49","7eb760e019b512dc"]]},{"id":"a82c03c3d34f683c","type":"delay","z":"bdd7be38.d3b55","name":"Some more stuff to do","pauseType":"delay","timeout":"5","timeoutUnits":"seconds","rate":"1","nbRateUnits":"1","rateUnits":"second","randomFirst":"1","randomLast":"5","randomUnits":"seconds","drop":false,"allowrate":false,"outputs":1,"x":800,"y":480,"wires":[["7c6253e5d34769ac","b23cea1074943d4d"]]},{"id":"2128a855234c1016","type":"link in","z":"bdd7be38.d3b55","name":"link in 1","links":["7c6253e5d34769ac"],"x":95,"y":560,"wires":[["3a9faf0a95b4a9bb"]]},{"id":"7c6253e5d34769ac","type":"link out","z":"bdd7be38.d3b55","name":"link out 1","mode":"link","links":["2128a855234c1016"],"x":665,"y":560,"wires":[]},{"id":"b23cea1074943d4d","type":"debug","z":"bdd7be38.d3b55","name":"OUT","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"payload","targetType":"msg","statusVal":"","statusType":"auto","x":670,"y":400,"wires":[]},{"id":"d4d479e614e82a49","type":"debug","z":"bdd7be38.d3b55","name":"IN","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"payload","targetType":"msg","statusVal":"","statusType":"auto","x":470,"y":400,"wires":[]},{"id":"3a9faf0a95b4a9bb","type":"function","z":"bdd7be38.d3b55","name":"Flush","func":"return {flush: 1}","outputs":1,"noerr":0,"initialize":"","finalize":"","libs":[],"x":190,"y":560,"wires":[["ed63ee4225312b40"]]},{"id":"7eb760e019b512dc","type":"function","z":"bdd7be38.d3b55","name":"Some functions to be performed","func":"\nreturn msg;","outputs":1,"noerr":0,"initialize":"","finalize":"","libs":[],"x":550,"y":480,"wires":[["a82c03c3d34f683c"]]},{"id":"e35f37deeae94860","type":"comment","z":"bdd7be38.d3b55","name":"Set the queue timeout to larger than you ever expect the process to take","info":"This is a simple flow which allows a sequence of nodes to be \nprotected so that only one message is allowed in at a time. \nIt uses a Delay node in Rate Limit mode to queue them, but \nreleases them, using the Flush mechanism, as soon as the \nprevious one is complete. Set the timeout in the delay node to \na value greater than the maximum time you expect it ever to take. \nIf for some reason the flow locks up (a message fails to indicate \ncompletion) then the next message will be released after that time.\n\nMake sure that you trap any errors and feed back to the Flush \nnode when you have handled the error. Also make sure only one \nmessage is fed back for each one in, even in the case of errors.","x":270,"y":360,"wires":[]}]

The German translation came up before, but I do not think an issue was raised: Queue-gate (q-gate) question - #11 by dceejay

@kitori, if the translation is not correct, please submit an issue at Issues · node-red/node-red · GitHub

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.