Node jsonSQL Issues

I am new to Node-RED, so forgive me if I ask stupid questions.

I have been trying to get the jsonSQL node to work, but it seems it is not great with large files - it works well with some small sample data, but the file I really need it for is approx 10 MB and has 1000's of records. I'm afraid it just locks the system and I have to restart Node-RED.

I have tried running it on a massive Google Cloud server to make sure it wasn't just my puny laptop causing the issue, but it's the same.

Has anyone got any ideas? - Maybe a way to 'fix' it or a good alternative?

Thanks

ETD

Welcome to the forum.

In what way does Node-RED lock up? To clarify: I assume you are running a query through the jsonSQL node (which node is it, specifically?).

If you do a select * from .... and just output it to a debug node, then I can imagine Node-RED locking up if it has to display 1000's of records.

What are you trying to do with the data?
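
If the debug sidebar does turn out to be the culprit, a quick test is a function node in front of it that forwards only a small slice of the result - a minimal sketch, assuming msg.payload is an array of records:

```javascript
// Function node placed before the debug node: forward only the first 10
// records so the sidebar doesn't have to render thousands of entries.
if (Array.isArray(msg.payload)) {
    msg.payload = msg.payload.slice(0, 10);
}
return msg;
```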

A much better way to deal with bulk SQL updates is to use the native tools - many SQL databases can load CSV directly. You can use Node-RED to orchestrate the batch jobs.
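
For example (a sketch only - it assumes node-red-node-mysql, which takes its query from msg.topic, and the file path and table name are placeholders), a function node could hand the whole bulk load off to MySQL itself:

```javascript
// Function node feeding a MySQL node (node-red-node-mysql reads the query
// from msg.topic). LOAD DATA lets the database ingest the CSV in one go
// instead of inserting row by row. Path and table name are made up here.
msg.topic = "LOAD DATA LOCAL INFILE '/tmp/export.csv' " +
            "INTO TABLE readings " +
            "FIELDS TERMINATED BY ',' IGNORE 1 LINES";
return msg;
```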

The screen greys out and stays that way - I have to restart Node-RED.

I thought maybe it was the debug output as well - I'm afraid not, though.

The main thing I am trying to achieve is simply to populate a UI dashboard from queries over the JSON data.

EDIT: This is the node in question - https://flows.nodered.org/node/node-red-contrib-query-json-with-sql

I pull the raw JSON file from an HTTP request so this seemed a neat way of dealing with it.

I think you are correct, though, and I will have to convert to CSV, push it to a MySQL DB, and then query it with a MySQL node and use those results in the UI.

I assume that there is a node that converts JSON to CSV.

Thanks Guys!..

The CSV node :-)....
If it's pulling back 1000's of records then, unless it's a chart, it's going to be hard to fit on a UI... if so, could the query return a reduced amount of data in the first place?
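
The CSV node works best with an array of flat objects, so something like this function node in front of it could flatten (and trim) the records first - just a sketch, with made-up field names:

```javascript
// Function node placed before the CSV node: flatten one level of nesting so
// each record becomes a flat object, and drop fields the dashboard doesn't
// need. The field names (id, name, data.value) are placeholders.
msg.payload = msg.payload.map(rec => ({
    id: rec.id,
    name: rec.name,
    value: rec.data && rec.data.value   // hoist a nested field to the top level
}));
return msg;
```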

I am guessing the lock-up is caused by this node using all the CPU power for an extended period, due to the big data set and the actual queries you are running.

You could consider parsing the JSON object directly with a JSONata query in a change node.
Although JSONata can be slow (depending on the query and the size of the input JSON object), in this case it might still be faster than node-red-contrib-query-json-with-sql.
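
For a simple filter, the function-node equivalent is only a couple of lines (the JSONata version would go in a change node instead) - the field name and threshold below are just placeholders:

```javascript
// Plain-JavaScript equivalent of a JSONata filter such as:
//   payload[value > 100]
// "value" and the threshold stand in for whatever you actually query on.
msg.payload = msg.payload.filter(rec => rec.value > 100);
return msg;
```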

If you only need to load the data once every hour or once a day, then waiting a couple of minutes for the queries to complete might be acceptable (you will temporarily see a Node-RED UI screen with "connection lost", but it will automatically restore/repopulate once the queries have finished).

Just remember that a general-purpose, low-code development environment will not always be the best answer to every problem! :wink:

Maybe consider using Node-RED to prototype the process but then look to improve performance by setting up a dedicated set of code (and maybe a shell script to set it off) that you can set running from Node-RED but that runs in a separate process. For example, there are plenty of libraries for Node.js that will convert to/from CSV and plenty work with MySQL.

That way, you could do things like use streaming to read the data in logical chunks and pass it to a pre-defined insert query.
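
As a rough illustration (not a finished tool - it assumes the csv-parse and mysql2 npm packages, and the connection details and `readings` table are made up), a standalone script along these lines could stream the CSV into MySQL in batches:

```javascript
const fs = require('fs');
const { parse } = require('csv-parse');
const mysql = require('mysql2/promise');

async function loadCsv(path) {
  // Connection details are placeholders - use your own.
  const db = await mysql.createConnection({
    host: 'localhost', user: 'nodered', password: 'secret', database: 'demo'
  });

  // columns: true -> each record is an object keyed by the CSV header row.
  const parser = fs.createReadStream(path).pipe(parse({ columns: true }));

  let batch = [];
  for await (const row of parser) {
    batch.push([row.id, row.value]);
    if (batch.length >= 500) {   // insert in chunks of 500 rows
      await db.query('INSERT INTO readings (id, value) VALUES ?', [batch]);
      batch = [];
    }
  }
  if (batch.length) {
    await db.query('INSERT INTO readings (id, value) VALUES ?', [batch]);
  }
  await db.end();
}

loadCsv(process.argv[2]).catch(err => { console.error(err); process.exit(1); });
```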

You could also then make use of other languages if you are more familiar with them - whether Python, PowerShell (which is very good at piping data from one interface to another), or even C/C++, or maybe Rust or Go (which tend to be very fast).
