I have a music library that has a few thousand files in it. I am hoping to create a user interface to help me manage all of the data.
The music library exports its contents as XML, and I am trying to capture that data into a MySQL database. I convert the XML to an object, and then a function node loops through the object array and generates the SQL to insert the records into the database. Each iteration of the loop generates a single INSERT SQL string, and I then do a node.send(msg) to send the string to the MySQL node. In testing I would only process a few records, which worked great. When I tried to process the whole library, Node-RED would crash, so I put in a node.warn counter to see where it was failing. It would process 14 records and then crash with an out-of-memory error, and nothing got written to the database. Below is my flow and the code within the function node. I am uncertain where to go from here.
The msg coming into the function node will contain all the data.
Updating the topic and re-sending the same msg on each iteration sends all of the data every time. Additionally, without setting the 2nd parameter of node.send(), the msg will be cloned on each send.
Try sending a {new msg} instead - this will be far more lightweight.
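A minimal sketch of that pattern, assuming the parsed library sits in an array and has hypothetical `title`/`artist` fields: build a small new message per record instead of re-sending the original msg. (In the function node you would `node.send()` each one; here the messages are just collected so the shape is visible.)

```javascript
// Sketch only: the field names (title, artist) and the tracks array shape
// are assumptions about the parsed XML, not the actual library format.
function buildMessages(tracks) {
    return tracks.map(track => ({
        // Each outgoing message carries only the SQL string, not the library.
        // (String interpolation is for illustration; see the prepared-statement
        // discussion later in the thread for safer value handling.)
        topic: `INSERT INTO songs (title, artist) VALUES (${JSON.stringify(track.title)}, ${JSON.stringify(track.artist)})`
    }));
}

const msgs = buildMessages([{ title: "Song A", artist: "Band B" }]);
// In a Node-RED function node: msgs.forEach(m => node.send(m)); node.done();
```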
Thanks for the reply. That is exactly what was going on. I did it slightly differently and copied msg.payload into a local variable and then deleted msg.payload.
I also added a node.done() after the send. Based on my reading, I thought that might help too?
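For reference, the local-variable trick can be sketched like this (the payload shape is a guess for illustration): once the reference is moved aside and msg.payload deleted, any subsequent sends or clones of msg no longer drag the whole library along.

```javascript
// Stand-in for the message arriving at the function node; the payload shape
// is hypothetical.
const msg = { topic: "", payload: { tracks: [{ title: "Song A" }] } };

const tracks = msg.payload.tracks; // keep a local reference to the data
delete msg.payload;                // msg itself is now lightweight

// tracks is still fully usable for generating the INSERT statements,
// while every msg sent downstream stays small. In the function node,
// finish with node.done() once all sends are queued.
```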
Your database is being bombarded with open/insert/close requests; it won't like that. You could add a delay after each insert, but you will see much better performance if you insert multiple records in a single call (not thousands at once, but 100 should be fine).
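One way to sketch the batching idea, assuming each record reduces to a `[title, artist]` pair (hypothetical columns): chunk the records into groups of 100 and emit one multi-row INSERT per chunk.

```javascript
// Split an array into groups of `size` (100 per the suggestion above).
function chunk(arr, size) {
    const out = [];
    for (let i = 0; i < arr.length; i += size) {
        out.push(arr.slice(i, i + size));
    }
    return out;
}

// Build one INSERT with a VALUES tuple per row. Column names are assumed;
// JSON.stringify is a stand-in for real escaping (or use placeholders).
function buildBatchInsert(rows) {
    const values = rows
        .map(([title, artist]) => `(${JSON.stringify(title)}, ${JSON.stringify(artist)})`)
        .join(", ");
    return `INSERT INTO songs (title, artist) VALUES ${values}`;
}

const batches = chunk([["Song A", "Band B"], ["Song C", "Band D"]], 100)
    .map(buildBatchInsert);
// Each batch becomes one msg.topic sent to the MySQL node.
```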
@bakman2 I am sure it is!! There are 21,950 songs in my library, so that is how many records I am loading. As it is, it takes about 10 seconds to grab the XML and create the SQL statements, then about another 50 seconds for the db to load them. I will not have to do this too often, so based on this experience I am not too worried about the time it takes.
@TotallyInformation Thanks for the += info. I will surely use that in the future. However, I am not sure I understand the "And even better performance if you use a prepared statement and feed data into it." comment.
MySQL 8.0 provides support for server-side prepared statements. This support takes advantage of the efficient client/server binary protocol. Using prepared statements with placeholders for parameter values has the following benefits:
Less overhead for parsing the statement each time it is executed. Typically, database applications process large volumes of almost-identical statements, with only changes to literal or variable values in clauses such as WHERE for queries and deletes, SET for updates, and VALUES for inserts.
Protection against SQL injection attacks. The parameter values can contain unescaped SQL quote and delimiter characters.
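As a hedged sketch of what that looks like in practice (shown with the mysql2 npm package from a standalone script, since the Node-RED MySQL node may expose parameter binding differently), the statement text stays constant and only the bound values change per row:

```javascript
// One statement template with ? placeholders -- prepared once by the server,
// then executed repeatedly with different values. Table and column names are
// assumptions for illustration.
const template = "INSERT INTO songs (title, artist) VALUES (?, ?)";

// In a real script you would run something like:
//   const mysql = require("mysql2/promise");
//   const conn = await mysql.createConnection({ host: "...", user: "...", database: "..." });
//   for (const row of rows) {
//       await conn.execute(template, row); // execute() uses a prepared statement
//   }

// Here we only show how rows map onto the placeholders:
const rows = [["Song A", "Band B"], ["Song \"quoted\"", "Band'; DROP"]];
const bound = rows.map(params => ({ sql: template, params }));
// Values containing quotes or SQL fragments are safe -- they are never
// spliced into the SQL text itself.
```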
This import was really a trial-and-error process. Because it is XML-based, they kept adding fields.
I finally got it all imported, but it took about two dozen attempts: looking at the 'field missing' errors, adding the fields, and re-importing. Finally, all 21,950 imported.