Use-case: I am splitting an array of recordIds to make sequential API requests (YouTube endpoint) and then evaluating the JSON responses, specifically determining whether an array in each response has any items in it. If the array is empty, I "drop" the msg containing that response, since there's no further action my flow can perform on it.
Problem: as you can begin to intuit, dropping msgs becomes a problem when attempting to aggregate them again using the `join` node, which in this case is not propagating the msgs as expected because it sees that not all parts of the sequence are present.
Questions:
- I am considering using `flow.get()` state to track how many items are in the 'macro' array of channels I am querying, then decrementing that count for every msg that gets dropped, so I can determine which specific msg should have `msg.complete` assigned to it - am I overthinking this? (A rough sketch of this idea is below the list.)
- I'm aggregating these values so I can iterate through them and perform batch updates to my datastore. I wonder if I should just scrap the `join` node and fire the updates off one by one, but I can see that becoming a massive scaling/volume problem over time - are there any other patterns I should be considering? (One alternative I've been weighing is also sketched below.)
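For reference, here is roughly what I have in mind for the counter idea - a minimal sketch of a function node sitting between the HTTP request and a `join` node set to manual mode ("after a message with the msg.complete property set"). The context keys `expected` and `seen` are placeholder names, `msg.payload.items` stands in for whatever array I'm checking, and it assumes some earlier node ran `flow.set("expected", recordIds.length)` before the split:

```javascript
// Sits after the HTTP request node, before the join node.
// Assumes flow.set("expected", recordIds.length) ran before the split.
const items = msg.payload.items || [];

if (items.length === 0) {
    // Drop this msg, but shrink the expected count so the
    // join can still complete without it.
    flow.set("expected", flow.get("expected") - 1);
    return null; // returning null drops the message
}

// Count the msgs that survive the filter.
const seen = (flow.get("seen") || 0) + 1;
flow.set("seen", seen);

// When the last surviving msg arrives, tell the manual-mode
// join node to emit everything it has accumulated.
if (seen >= flow.get("expected")) {
    msg.complete = true;
    flow.set("seen", 0); // reset for the next batch
}
return msg;
```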
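The other pattern I keep circling back to is to not drop anything: let every msg through so `msg.parts` stays intact for an automatic-mode join, blank out the empty ones, and filter them from the combined array afterwards. Again just a sketch:

```javascript
// Before the join: replace an empty response with a null payload
// instead of dropping the msg, so msg.parts stays complete.
const items = msg.payload.items || [];
msg.payload = items.length > 0 ? msg.payload : null;
return msg;
```

and then a second function node after the `join`:

```javascript
// After the join: strip the null placeholders out of the array.
msg.payload = msg.payload.filter(p => p !== null);
return msg;
```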
Edit: I read this post from earlier this year and tried to follow @knolleary's suggestion of using a `switch` node with the *preserve sequence* checkbox - both checked and unchecked - but neither approach produced the outcome I'm looking for.