Best practice for fast writes to external data storage

I have a flow where I loop through large JSON objects and then feed the relevant data to a mysql-r2 node to write to a MySQL database. In order to ensure safe writing to the database, is there any point in limiting the message rate to the mysql node? Are there any factors here other than memory? I'm assuming that both a rate-limiting delay node and a mysql node that is not writing to disk fast enough keep their queue in memory. So what is safest: rate limiting, or feeding the loop straight into the mysql node?
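To make the setup concrete, the fan-out stage is conceptually a function node like the sketch below. The table name, column names, and the parameter-binding convention (query in `msg.topic`, bound values in `msg.payload`, as node-red-node-mysql uses) are illustrative assumptions — check them against whichever MySQL node you actually have installed.

```js
// Node-RED function node: emit one message per record for the downstream MySQL node.
// Assumes msg.payload is an array of objects such as { id, value } (hypothetical shape).
for (const rec of msg.payload) {
    node.send({
        // Query with named parameters, bound from the payload object.
        topic: "INSERT INTO records (id, value) VALUES (:id, :value)",
        payload: { id: rec.id, value: rec.value }
    });
}
return null; // all messages were already sent via node.send() above
```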

I would think that rate limiting will just retain the data in Node-RED's memory, so it won't help at all in terms of resilience.

If you are writing individual records to a DB, always use prepared statements, as these will be fastest. If you really need performance, then batch-writing multiple records will be best, but you are back to the same resilience question: how many records can you afford to lose in the case of a failure?
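As a rough illustration of that trade-off, here is a minimal sketch of batched inserts using the mysql2 npm package directly (outside Node-RED). The connection details, the `records` table, and its columns are assumptions for illustration; the batch size is the knob that trades throughput against how many rows are at risk if a single write fails.

```js
// Minimal sketch: batched inserts with mysql2 (npm install mysql2).
const mysql = require('mysql2/promise');

async function writeBatches(rows, batchSize = 500) {
    const pool = mysql.createPool({
        host: 'localhost',      // assumed connection details
        user: 'app',
        password: 'secret',
        database: 'mydb',
    });
    try {
        for (let i = 0; i < rows.length; i += batchSize) {
            const batch = rows.slice(i, i + batchSize);
            // Bulk VALUES expansion: each element of `batch` is an [id, value] array.
            // A failure here loses at most `batchSize` rows -- the resilience
            // trade-off described above.
            await pool.query('INSERT INTO records (id, value) VALUES ?', [batch]);
        }
    } finally {
        await pool.end();
    }
}

// Usage: writeBatches([[1, 'a'], [2, 'b'], [3, 'c']]).catch(console.error);
```

For single-row writes, `pool.execute('INSERT INTO records (id, value) VALUES (?, ?)', [id, value])` uses a true server-side prepared statement instead.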
