Hi, I have an Excel file (1000 rows). I can use spreadsheet-in to convert the data into one whole JSON object. Is it feasible to insert all 1000 rows into a MySQL table at once?
If not, I am looking for a way to read the Excel file one row at a time, but so far no luck.
Also, is there a way to convert Excel to CSV? If so, I think I could read the CSV line by line and then insert each row into MySQL.
Maybe not the most efficient way, but yes: you could feed the converted data through a split node to generate individual messages, then through a function or template node to generate a SQL INSERT statement for each row, and send that (in msg.topic) to a mysql node.
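For example, the function node after the split could look something like this (untested sketch; `mytable` and the `name`/`age` columns are placeholders for your own schema):

```javascript
// Runs once per message coming out of the split node.
// msg.payload is assumed to be one row object from spreadsheet-in,
// e.g. { name: "Alice", age: 42 }.
const row = msg.payload;

// Naive quote escaping -- only safe for trusted spreadsheet data.
const name = String(row.name).replace(/'/g, "''");
const age = Number(row.age);

// The mysql node executes the query it finds in msg.topic.
msg.topic = `INSERT INTO mytable (name, age) VALUES ('${name}', ${age})`;
return msg;
```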
I have not done this myself, but I would expect it to be feasible to insert 1000 rows in a single MySQL statement.
To do so, I would direct the JSON output of node-red-contrib-spreadsheet-in to a change node with a JSONata expression that builds your multi-record SQL INSERT statement.
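As a rough, untested sketch of that JSONata expression (set msg.topic to it in the change node; `mytable` and the `name`/`age` fields are placeholders, and the quoting is naive):

```
/* Build one multi-row INSERT from an array of row objects in payload */
"INSERT INTO mytable (name, age) VALUES " &
$join(payload.("('" & name & "', " & $string(age) & ")"), ", ")
```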
Take care: node-red-contrib-spreadsheet-in might produce a "sparse array" if the Excel sheet has blank cells, and a sparse array is not proper JSON (so it can give unexpected results in a JSONata query). You can convert it to proper JSON by directing the output through two json nodes in series (object to string, then string back to object). For more details, see this topic: Weird behaviour when processing an array of arrays having null values
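To see why the round trip through the json nodes helps, this is roughly what happens to a sparse array (illustrated in function-node JavaScript):

```javascript
// A blank cell can leave a "hole" in the row array:
const row = ["a", "b"];
row[3] = "d";                 // row is now ["a", "b", <hole>, "d"]

// Serialising and re-parsing (what the two json nodes do) turns the
// hole into an explicit null, which JSONata handles predictably:
const fixed = JSON.parse(JSON.stringify(row));
// fixed is ["a", "b", null, "d"]
```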
Thanks, inserting 1000 rows into MySQL works.
I figured it out the easy way:
use spreadsheet-in to get the JSON, then a JSON-to-CSV converter node to save the data as CSV.
That makes it easier to clean up the data and set up the topic for the MySQL insert.
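For anyone else doing this, a function node fed one CSV line per message (via a split node) can build the topic roughly like this (`mytable` and the columns are placeholders for your own schema):

```javascript
// msg.payload is one CSV line, e.g. "Alice,42"
// (assumes no embedded commas in the values).
const [name, age] = msg.payload.split(",");

// Naive quote escaping -- fine for trusted spreadsheet data only.
msg.topic = `INSERT INTO mytable (name, age) VALUES ('${name.replace(/'/g, "''")}', ${Number(age)})`;
return msg;
```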
Thx