Excel file to MySQL

Hi, I have an Excel file (1000 rows). I can use spreadsheet-in to convert the data into one whole JSON object. Is it feasible to insert 1000 rows into a MySQL table?
If not, I am looking for a method to read the Excel file one row at a time, but no luck so far.
Also, is there a way to convert Excel to CSV? If so, I think I can read the CSV line by line and insert each row into MySQL.

Thanks.

Maybe not the most efficient way, but yes: you could feed the converted data through a split node to generate individual messages, then through a function or template node to generate a SQL INSERT statement, and then send that (in msg.topic) to a mysql node.
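A rough sketch of what the function node could do, assuming made-up table/column names (`mytable`, `name`, `age`) - each message coming out of the split node carries one row object in msg.payload:

```javascript
// Minimal escaping helper for string values (illustrative only; a real flow
// should prefer parameterized queries to avoid SQL injection).
function escapeValue(v) {
    if (v === null || v === undefined) return "NULL";
    if (typeof v === "number") return String(v);
    return "'" + String(v).replace(/'/g, "''") + "'";
}

// Build one INSERT statement from a single row object.
function buildInsert(row) {
    const cols = Object.keys(row);
    const vals = cols.map(c => escapeValue(row[c]));
    return `INSERT INTO mytable (${cols.join(", ")}) VALUES (${vals.join(", ")})`;
}

// Inside the function node this would be:
// msg.topic = buildInsert(msg.payload);
// return msg;
```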

I have not yet done so, but I would expect it is feasible to insert 1000 rows in a single MySQL statement.

To do so, I would direct the JSON output of node-red-contrib-spreadsheet-in to a change node containing a JSONata query that creates your multi-record SQL INSERT statement.
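If you prefer a function node over the JSONata change node, the same multi-record statement can be built in JavaScript - a sketch with a hypothetical table name, taking the whole array of row objects at once (again, values should really be escaped or parameterized properly in a real flow):

```javascript
// Illustrative escaping only - do not rely on this against untrusted input.
function escapeValue(v) {
    if (v === null || v === undefined) return "NULL";
    if (typeof v === "number") return String(v);
    return "'" + String(v).replace(/'/g, "''") + "'";
}

// Build one INSERT covering all rows: INSERT INTO t (...) VALUES (...), (...), ...
function buildBulkInsert(table, rows) {
    const cols = Object.keys(rows[0]);
    const tuples = rows.map(
        r => "(" + cols.map(c => escapeValue(r[c])).join(", ") + ")"
    );
    return `INSERT INTO ${table} (${cols.join(", ")}) VALUES ${tuples.join(", ")}`;
}

// In a function node: msg.topic = buildBulkInsert("mytable", msg.payload);
```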

Take care that node-red-contrib-spreadsheet-in might produce a "sparse array" when the Excel sheet has blank cells. A sparse array is not proper JSON, so it may give unexpected results in a JSONata query. You can convert it to proper JSON by directing the output through two json nodes in series. For more details see this topic: Weird behaviour when processing an array of arrays having null values
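To see why sparse arrays bite, here is a small standalone demo: array holes are skipped by iteration methods, while round-tripping through JSON (what the two json nodes in series effectively do) materialises them as explicit nulls:

```javascript
// A hole at index 1, as you might get from a blank spreadsheet cell.
const sparse = [1, , 3];

// forEach (and map, filter, ...) silently skips holes.
const visited = [];
sparse.forEach(v => visited.push(v)); // visited ends up as [1, 3]

// JSON round-trip turns the hole into a real null value.
const dense = JSON.parse(JSON.stringify(sparse)); // [1, null, 3]
```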

Much better to feed a bulk insert if you can. If not, use a prepared statement, as that will be vastly more efficient.
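For the prepared-statement route: some MySQL nodes accept an array of bind values in msg.payload for "?" placeholders in msg.topic (check your node's docs - this is an assumption, not guaranteed for every node). A sketch, with hypothetical column names:

```javascript
// Build a parameterized message for one row. The mysql node (if it supports
// binding) substitutes each "?" with the corresponding payload entry,
// handling escaping for you.
function prepareRow(row) {
    return {
        topic: "INSERT INTO mytable (name, age) VALUES (?, ?)",
        payload: [row.name, row.age]
    };
}

// In a function node, per split message:
// return prepareRow(msg.payload);
```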

Agreed - but I am not familiar with the capabilities of the MySQL node.

If it offers bulk insert or stored proc + params then definitely go with that.

My offering was an "off the top of my head - this should work" kinda lazy solution :slight_smile:

Thx, inserting 1000 rows into MySQL works.
I have figured out an easy way:
use spreadsheet-in to get the JSON, then use a json-to-csv converter node to save the data as CSV.
It is easier to clean up the data and set up the topic for the MySQL insert that way.
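For reference, the json-to-csv step amounts to something like this sketch (hypothetical data; a real converter node also handles more quoting edge cases):

```javascript
// Convert an array of row objects into CSV text: a header line from the
// first row's keys, then one quoted-as-needed line per row.
function toCsv(rows) {
    const cols = Object.keys(rows[0]);
    // Quote a field only if it contains a comma, quote, or newline.
    const escape = v => /[",\n]/.test(String(v))
        ? '"' + String(v).replace(/"/g, '""') + '"'
        : String(v);
    const lines = [cols.join(",")];
    for (const r of rows) {
        lines.push(cols.map(c => escape(r[c])).join(","));
    }
    return lines.join("\n");
}
```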
Thx