Flow to update mongo in batches

Hi, I'm brand new to Node-RED, but so far I love it! I need to (repeatedly, but not super quickly) read a couple thousand database rows (MongoDB), calculate a bunch of attributes, and then update the objects.

So far, I'm using:

  1. a MongoDB find.forEach (mongodb3 node)
  2. running through a few function blocks to do the calculations and set attributes
  3. using batch/join to create JSON lists of the modified objects so I can update in batches
  4. a MongoDB bulkWrite with an updateOne entry for each id in the list, updating from the corresponding object in my list (roughly as sketched below)
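For step 4, the payload handed to the bulkWrite ends up looking roughly like this. A minimal sketch of a function node that builds it from an array of already-recalculated documents; the attribute names (`score`, `updatedAt`) are placeholders for whatever your calculations actually produce, and exactly how you pass this to the mongodb3 node depends on how that node is configured:

```javascript
// Node-RED function node: turn an array of recalculated documents
// (msg.payload) into a MongoDB bulkWrite operations array.
// "score" / "updatedAt" are placeholder attribute names.
msg.payload = (msg.payload || []).map(doc => ({
    updateOne: {
        filter: { _id: doc._id },
        update: { $set: { score: doc.score, updatedAt: new Date() } }
    }
}));
return msg;
```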

It kind of works, but the batch node assumes an endless flow, which isn't great: if I end with a partial batch, those rows are never processed.

How can I get the batch functionality so that ALL my rows are processed? Any other suggestions?

If you have a large array of all the updates you want to make, then rather than using batch or join you can use the split node to split the array into smaller arrays of size n, and then use those as the batches. That also handles the small remainder at the end.
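For reference, here is a rough function-node equivalent of splitting an array into fixed-length chunks, just to show the shape of what comes out: each outgoing message carries up to n rows, and the last one carries whatever is left over. The batch size is an assumed value:

```javascript
// Rough function-node equivalent of splitting an array into
// fixed-length chunks: each outgoing message carries up to
// BATCH_SIZE rows, and the last one carries the remainder.
const BATCH_SIZE = 200;                     // assumed batch size
const rows = msg.payload || [];
const batches = [];
for (let i = 0; i < rows.length; i += BATCH_SIZE) {
    batches.push({ payload: rows.slice(i, i + BATCH_SIZE) });
}
// Wrapping in an outer array sends them all out of output 1 in sequence.
return [batches];
```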

I'm reading them from MongoDB with a find.forEach and passing each row through the calculations, so I need to "batch" them (I think) or somehow turn them into a group for batch updates. I could probably read them all at once for this project, but for future scalability I'd like a solution where I read/process them individually, batch them, and update them.

Open to all suggestions... I'm about 3 days in with Node-RED ;-).

thanks!
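If you want to keep processing rows one at a time and batch them yourself, one option (a sketch, not the only way) is a function node that buffers rows in context and emits a batch every n messages, flushing the leftover rows when it sees an end-of-run marker. Here `msg.complete` is an assumed marker you would send yourself (e.g. from an inject or trigger node once the query has finished), and the batch size is arbitrary:

```javascript
// Function-node sketch: accumulate per-row messages into batches of
// BATCH_SIZE, and flush whatever is left when a message arrives with
// msg.complete set (an assumed end-of-run marker you send yourself).
const BATCH_SIZE = 200;                       // assumed batch size
const buffer = context.get("buffer") || [];

if (!msg.complete) {
    buffer.push(msg.payload);                 // one calculated row per message
}

if (buffer.length >= BATCH_SIZE || (msg.complete && buffer.length > 0)) {
    context.set("buffer", []);
    return { payload: buffer };               // downstream node does the bulkWrite
}

context.set("buffer", buffer);
return null;                                  // keep accumulating
```

The join node's manual mode offers a similar escape hatch without code: it can be configured to send after a fixed number of messages and also to send whatever it has collected when a message arrives with msg.complete set, which is another way to pick up that final partial batch.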
