Hi, I've been using Node-RED for about a week, and I like it a lot. I do have what might be a newbie question, though. I've searched the forum but haven't found a topic that addresses this issue; maybe I'm missing something.
I'm using a few contributed nodes, specifically node-red-contrib-minio-all, node-red-contrib-bool-gate, node-red-contrib-csvtojson, and node-red-contrib-mongodb4. All of these nodes produce output that discards the incoming message, which makes it difficult to continue post-processing. I've also created a subflow, but these nodes drop _linkSource from the message, so I have to save and restore _linkSource for the flow to return to the caller (roughly as sketched below). They also overwrite the payload.
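Here's roughly what I'm doing now, with a pair of Function nodes wrapped around one of those contrib nodes. It's just a sketch; the key name saved_state is something I made up, and a single shared key is part of what worries me about concurrency (more on that below).

```
// Function node placed just BEFORE the contrib node:
// stash the parts of the message I still need afterwards.
flow.set("saved_state", {
    linkSource: msg._linkSource,
    payload: msg.payload
});
return msg;
```

```
// Function node placed just AFTER the contrib node:
// merge the stashed properties back onto the new message.
const saved = flow.get("saved_state") || {};
msg._linkSource = saved.linkSource;
msg.originalPayload = saved.payload;  // keep the contrib node's output in msg.payload
return msg;
```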
My use case is that, in response to a new file upload, I use node-red-contrib-minio-all to pull a CSV file from a MinIO S3 bucket, use node-red-contrib-csvtojson to transform the file to JSON, and then push the JSON back to a different S3 bucket, which triggers further processing. I also move the original object to an archive bucket, but since MinIO S3 doesn't have a "move" operation, I have to copy and then delete the object.
I could do these steps in sequence or use a status node to detect when a node completes and then trigger the next node. However, the minio, mongodb, and csvtojson node output doesn't contain my original payload, so I'm lost.
My approach may be wrong, so I'm open to suggestions/corrections. I did see that flow context can hold data across nodes (which is how I save and restore _linkSource now). Will that work when many messages may be in flight at the same time? What about with subflows? Right now I save _linkSource in the main flow but restore it inside a subflow. Might that cause problems when the subflow is shared? I can't tell from the docs whether it's safe to use flow context this way, and I'd rather not build it only to have data overwritten mid-processing.
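One variation I've considered is keying the flow-context entry per message so concurrent messages don't overwrite each other. This is only a sketch, and it assumes the contrib nodes preserve msg._msgid on their output, which I haven't verified:

```
// Before the contrib node: stash state under a per-message key.
// Assumes msg._msgid survives the contrib node, which needs checking.
flow.set("saved_state_" + msg._msgid, {
    linkSource: msg._linkSource,
    payload: msg.payload
});
return msg;
```

```
// After the contrib node: restore and clean up the per-message entry.
const key = "saved_state_" + msg._msgid;
const saved = flow.get(key) || {};
msg._linkSource = saved.linkSource;
msg.originalPayload = saved.payload;
flow.set(key, undefined);  // setting undefined removes the entry so context doesn't grow
return msg;
```

Is that a reasonable pattern, or is there a cleaner way to carry the original message across these nodes?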
Thanks for any useful direction or suggestions.