Split a message and process the individual messages sequentially, not in parallel

Hello dear forum members, I am struggling with a problem. I have a message array that I now want to process sequentially. An API is called for each message and the response is merged with the original message. In parallel this works with Split and Join, but with a large number of messages the rate of requests hitting the API is so high that the server runs into serious performance issues. I am now looking for a solution where the messages are processed sequentially: the next one only starts after the previous one has been processed successfully. My research so far has been unsuccessful. What approach would you choose?

Thanks and greetings

Stefan

Do you mean that you need to make sure one message goes through the flow before the next is allowed in? If so, this example flow uses a Delay node in a mode where it can be used to queue messages and release them when it is told the previous one has completed. In this flow the node named Process taking 5 seconds simulates some process that needs to complete before the next message can be allowed in. If you feed your Split node into this, it will handle the messages sequentially.

[{"id":"b6630ded2db7d680","type":"inject","z":"bdd7be38.d3b55","name":"","props":[{"p":"payload"},{"p":"topic","vt":"str"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"","payloadType":"date","x":120,"y":1700,"wires":[["ed63ee4225312b40"]]},{"id":"ed63ee4225312b40","type":"delay","z":"bdd7be38.d3b55","name":"Queue","pauseType":"rate","timeout":"5","timeoutUnits":"seconds","rate":"1","nbRateUnits":"1","rateUnits":"minute","randomFirst":"1","randomLast":"5","randomUnits":"seconds","drop":false,"allowrate":false,"outputs":1,"x":290,"y":1700,"wires":[["a82c03c3d34f683c","d4d479e614e82a49"]]},{"id":"a82c03c3d34f683c","type":"delay","z":"bdd7be38.d3b55","name":"Process taking 5 seconds","pauseType":"delay","timeout":"5","timeoutUnits":"seconds","rate":"1","nbRateUnits":"1","rateUnits":"second","randomFirst":"1","randomLast":"5","randomUnits":"seconds","drop":false,"allowrate":false,"outputs":1,"x":510,"y":1700,"wires":[["7c6253e5d34769ac","b23cea1074943d4d"]]},{"id":"2128a855234c1016","type":"link in","z":"bdd7be38.d3b55","name":"link in 1","links":["7c6253e5d34769ac"],"x":75,"y":1780,"wires":[["3a9faf0a95b4a9bb"]]},{"id":"7c6253e5d34769ac","type":"link out","z":"bdd7be38.d3b55","name":"link out 1","mode":"link","links":["2128a855234c1016"],"x":645,"y":1780,"wires":[]},{"id":"b23cea1074943d4d","type":"debug","z":"bdd7be38.d3b55","name":"OUT","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"payload","targetType":"msg","statusVal":"","statusType":"auto","x":650,"y":1620,"wires":[]},{"id":"d4d479e614e82a49","type":"debug","z":"bdd7be38.d3b55","name":"IN","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"payload","targetType":"msg","statusVal":"","statusType":"auto","x":450,"y":1620,"wires":[]},{"id":"3a9faf0a95b4a9bb","type":"function","z":"bdd7be38.d3b55","name":"Flush","func":"return {flush: 1}","outputs":1,"noerr":0,"initialize":"","finalize":"","libs":[],"x":170,"y":1780,"wires":[["ed63ee4225312b40"]]}]

Hi Colin,

Yes, that's what I need.

This could do the trick! Thanks for sharing this solution.

KR Stefan

Try to make sure that every message that goes into your protected flow generates one and only one flush message back to the Delay node. You may have to catch any errors that arise in the flow and use the error to release the next message, otherwise the queue will hang until the Delay node times out before releasing the next one. Set the timeout in the Delay node to a bit more than the maximum time you ever expect your flow to take. Also make sure that if there is an error you don't get two flush messages going back.
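As a minimal sketch of the error handling described above (assuming a Catch node scoped to the nodes in the protected flow, wired to its own Function node), the error path can release the queue the same way the success path does:

// Function node fed by a Catch node covering the protected flow.
// On an error, release the next queued message immediately rather
// than waiting for the Delay node's timeout to expire.
// Make sure the normal success path does not also send a flush for
// the same message, otherwise two messages would be released.
return { flush: 1 };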


Hi Colin, many thanks for these additional and useful hints. Much appreciated, this helps! KR Stefan

