Uibuilder ping timeout - possibly related to MySQL taking over 1 minute to return a large dataset

Hi all,
I have a problem with the browser connection to a uibuilder web page when I have to do some "very long and heavy" work on the server side (Node-RED).

For example:

  • start with a client request on the uibuilder web page (e.g. a button click)
  • this request produces a message to Node-RED, which retrieves a lot of data with the mysql node
  • finally, after 1 minute or more (it depends), the retrieved data comes back to the uibuilder node in another message (sketched below)
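
Roughly, the round trip on the client side looks like this (a simplified sketch using the standard uibuilderfe calls, not my exact code; element id and topic names are invented):

```javascript
// index.js served by uibuilder - simplified sketch, not the exact code
uibuilder.start()

// 1. a button click sends a request message into the Node-RED flow
document.getElementById('loadBtn').addEventListener('click', function () {
    uibuilder.send({ topic: 'get-history', payload: 'all' })
})

// 2. ...the flow runs the mysql query, which can take over a minute...

// 3. the result should come back here, but by then the socket has reconnected
uibuilder.onChange('msg', function (msg) {
    if (msg.topic === 'get-history') {
        console.log('received ' + msg.payload.length + ' rows')
    }
})
```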

During this long wait, the browser disconnects the socket (ping timeout), so I'm not able to handle the returned data in my uibuilder JS code (I think because _socketId has changed).

If I simulate the long wait with a delay node (e.g. 1 minute), it works without problems (no disconnection from the browser).

I would expect uibuilder to keep the connection up during the server-side job (long or short, it shouldn't matter).
Perhaps the mysql query uses all available resources and uibuilder isn't able to respond to the ping?
And when I use the delay node there is no resource consumption, so uibuilder can reply?

I think I'm doing something wrong but I don't understand what...

Please @TotallyInformation could you help me?

Thanks in advance

Node-RED is built on Node.js. Node.js is single threaded.

So with that in mind, while your MySQL request is taking 1 minute, it is possible it is holding up the Node.js thread (though I am surprised it is not asynchronous). What MySQL node are you using? (full name, e.g. node-red-xxxxx-xxxxx)

Regarding your SQL

  • how many rows are you requesting & how many columns are in one row?
    • do you need all the columns (could you slim down the returned data by specifying only the necessary columns)?
    • is it not possible to use a WHERE clause to minimise the number of rows? (a sketch follows below)
  • does the table have indexes to speed this up? (1 minute is a LOOONG time for a DB query)
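
For example, a function node in front of your mysql node can set a much narrower query instead of a bare SELECT * (assuming a typical mysql node that takes the query text in msg.topic; the table and column names below are invented purely for illustration):

```javascript
// Function node: build a narrower query rather than SELECT * on the whole table.
// Table and column names here are made up for illustration only.
msg.topic = `
    SELECT ts, sensor_id, value
    FROM   readings
    WHERE  ts >= NOW() - INTERVAL 1 MONTH
    ORDER  BY ts`;
return msg;
```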

No, this should not happen. It is not a consequence of a wait of any length from Node-RED. The websocket connection maintains itself using ping and pong messages that you don't normally see, though you can expose them if you want to.

Load the page, open dev tools, go to the Network tab and then reload the page. Now monitor the network activity. You could also use the new Protocol Monitor tab. With that, you will regularly see messages going to/from Node-RED and the browser.

If you are getting websocket disconnections, that is a different problem, generally at a network layer.

Yes, if the websocket connection fails, Socket.IO creates a new connection automatically and that gets a new socketid. That is a function of Socket.IO.

If you need a more reliable method of being able to communicate with an individual browser tab, keep track of the connections and disconnections using the control output in Node-RED.
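
A minimal sketch of that idea, assuming the control messages carry uibuilderCtrl ('client connect' / 'client disconnect') and _socketId properties (check the control output of your uibuilder version for the exact names):

```javascript
// Function node wired to the uibuilder control output.
// Keeps a simple map of currently connected socket ids in flow context.
const clients = flow.get('clients') || {}

if (msg.uibuilderCtrl === 'client connect') {
    clients[msg._socketId] = { connectedAt: Date.now() }
} else if (msg.uibuilderCtrl === 'client disconnect') {
    delete clients[msg._socketId]
}

flow.set('clients', clients)
return null   // nothing to forward, this node only tracks state
```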

I am looking at the possibility of adding some optional additional data to the uibuilder communications to potentially make it easier to track an individual physical connection and/or a logical connection (e.g. a "user" across browser sessions).

It does. The problem is not one of uibuilder's making.

I suppose it is possible that Node-RED itself might not be able to respond, but that would be an error in the mysql node or one of its dependencies. Even though Node.js is single threaded, no code should block the thread and there is certainly no reason why MySQL interactions should do so; normally it would be a direct, synchronous Node.js file system call that might cause a block. MySQL is a server, so any calls to it should be non-blocking.

If a MySQL call is blocking the loop in Node.js, it would be causing more issues than just uibuilder's sockets. You would see the same effect with Dashboard's websocket connection. Something you could also check.

I commented out the ping/pong audit traces in uibuilder. However, if you don't mind a quick hack, you can put them back in quite easily. If you look at line 630 in ~/.node-red/node_modules/node-red-contrib-uibuilder/nodes/uibuilder.js, you will see two socket.on functions inside a block comment. Copy or move them out of the block comment, change the log level in your settings.js file to include audit logging, then restart Node-RED. You could change the log type if you wanted to avoid changing log levels. You will then see a log output for every ping and pong message.
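
The two handlers are just audit-level log calls, something along these lines (a sketch of the idea only, not the exact commented-out code; use whatever logger is in scope at that point in uibuilder.js):

```javascript
// Sketch of the re-enabled audit traces inside the Socket.IO connection handler
socket.on('ping', function () {
    log.audit('[uibuilder] ping received from socket ' + socket.id)
})
socket.on('pong', function () {
    log.audit('[uibuilder] pong received from socket ' + socket.id)
})
```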

Additionally, you can prove that it isn't a uibuilder issue by disconnecting your MySQL call and instead putting in an extreme delay using the delay node or a setTimeout in a function node. Both should be async and therefore non-blocking. You should see that the websocket connections continue as expected.
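
For example, a function node like this fakes a 60 second "query" without ever blocking the event loop (asynchronous send works in Node-RED 1.x and later):

```javascript
// Function node: simulate a 60 second asynchronous "query"
setTimeout(function () {
    msg.payload = { fake: 'result delivered after 60s' }
    node.send(msg)
    node.done()            // tell the runtime this message is fully handled
}, 60000)

return null                // nothing to send immediately
```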

I often leave uibuilder delivered pages open for days or even weeks at a time and never have any issues. Connections are dropped when the client device goes to sleep but otherwise, they stay alive the whole time.

Also check the performance of your server to see if you are getting issues such as sudden swap usage due to a lack of contiguous memory. On a Pi at least, that can absolutely stop the entire device while large chunks of memory are paged out to the SD card and back in again.

That should certainly never happen, no matter how long the query takes to run. The exception is the memory issue already mentioned, but that would be stopping the whole machine, not just the Node-RED loop.

Thanks all for the clear and fast responses.

I agree with @TotallyInformation that this problem isn't related to the uibuilder node.
In fact, during the mysql node's query all other nodes (uibuilder and others) are down, so it seems that mysql uses all of Node-RED's resources (in some loop) and blocks the execution of the other nodes.

My query is very simple: I take all rows from two tables (40k rows in each table) and it takes 2 s as a direct MySQL query (against the MySQL server or via a front-end like DBeaver).
This also seems to point to a mysql node problem (it must be looping on something to justify 1 minute of work).

I'm using node-red-node-mysql (version 0.1.1).
I'm passing some extra info to the mysql node (e.g. msg.xxx) that I use in the following nodes.

I'll do some investigation and let you know what I find about the mysql node query.

Thanks a lot in advance

  • Are you processing these 40K rows in node-red?
  • Are you passing the mySQL result to a debug node?

As a test,

  1. please disconnect the debug nodes - does the "freeze" stop happening while the mySQL query is running?
  2. disconnect the output of the mySQL node from everything else (leave the outputs disconnected) - does the "freeze" stop happening while the mySQL query is running?

If 1 and 2 make no difference - I would suggest the issue (if any) is in node-red-node-mysql


other questions...

  • Do you need to request 40k rows?
  • What do you do with these rows - e.g. do you use all of them or do you filter or summarise them?
    The reason I ask is that you may be able to do much of the processing in the database (where it belongs)

Thanks @Steve-Mcl ,
I tried some test.

I removed all debug nodes: the problem still remains.

I disconnected the output of the mySQL node: the problem still remains (but the time Node-RED spends at 100% is less than before, so the output connection probably makes the problem worse).
I'll follow your suggestion and raise the question with node-red-node-mysql.

My idea is not to process these 40k rows in Node-RED, so as not to load my machine with this work. I'm passing this info (raw DB rows) to the client browser via the uibuilder node and doing the processing on the client side (in JS).

Unfortunately I need all the table rows (40k or more, it depends) and I have a somewhat complicated process to apply, so I prefer to do this job in JS on the client side.

I'm trying to get around this problem using exec node.

The query is executed by the shell (mysql command) and everything works well, with a very low response time and without blocking Node-RED.

Now my (new) problem is the limit on the exec node's stdout buffer size (it caps the size of the query result, so the number of rows is limited).
Is it possible to set it to infinite or to a very large size?
I saw something about this in another topic, but I'm not able to do the same on this node:

  • I didn't find this parameter in the node's settings
  • I tried to put { maxBuffer : Infinity } in the extra input parameters but it doesn't work (the underlying Node.js option is sketched below)
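
From what I understand, the limit comes from Node.js's child_process.exec(), which the exec node wraps; in plain Node.js the option would look something like this (just an illustration of the underlying setting, with a placeholder command - whether and how the exec node exposes it may depend on its version):

```javascript
// Plain Node.js illustration of the stdout buffer cap the exec node inherits
const { exec } = require('child_process')

exec('mysql mydb -e "SELECT ..."', {
    maxBuffer: 64 * 1024 * 1024   // raise the stdout/stderr limit to 64 MB
}, function (err, stdout, stderr) {
    if (err) return console.error(err)
    console.log('got ' + stdout.length + ' characters of output')
})
```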

I'm sorry for asking this here - otherwise I could open another topic.

Thanks in advance

@GPG you don't explain why you need 40,000+ rows dragged from the database and then presented to a web page.

I will guess you are analysing the values in a web page.

Why not use WHERE clauses to return only values of interest?

Are you proficient with Databases?

Databases are very powerful at filtering and performing summation and calculations in the DB - before sending (small) results back to the client.

I need 40,000+ rows because I'm retrieving historical data for a long period (e.g. 1 year) and I need to concatenate them and present them in a web page.

All values are of interest. I could do some of the work in MySQL (I'm a little bit proficient) but the algorithm isn't easy (and not very portable to MySQL) and I prefer to run this work in the PC web browser (client side) instead of on the embedded device (server side).

In any case I couldn't reduce the amount of data: I use the post-processing to concatenate and associate information from different tables with an algorithm, not to filter rows or do simple calculations, so at the end the number of rows is still big (40k).

What does the MySQL log say when you run the query?

Hi, reading your other thread, you state that you are trying to do multiple queries?!

I'm not sure if the mySql node allows that by default.
Read this article about the library that node is based on.

You might be better off writing the 40k rows to a file if you need to process it client side. It is possible that, for example, serving up a CSV file might be better. Hard to know without seeing the data and how you are trying to process it.

Truthfully though, a large table is often better processed using tools more suited to the job. Such as Python for example or even Excel.

If you do continue to use uibuilder, I would recommend breaking the data into chunks to send to the uibuilder node. Socket.IO handles the communications between Node-RED and the front-end and I really haven't tested how it behaves with really large datasets. But sending too small a dataset (e.g. 40k times 1 record) might also be slow.
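
Something like this in a function node, purely as a sketch (the chunk size is arbitrary and worth experimenting with):

```javascript
// Function node: split a large result array into smaller messages for uibuilder
const rows = msg.payload                 // the full array of rows from the query
const chunkSize = 1000                   // arbitrary - tune to taste
const totalChunks = Math.ceil(rows.length / chunkSize)

for (let i = 0; i < totalChunks; i++) {
    node.send({
        topic: msg.topic,
        payload: rows.slice(i * chunkSize, (i + 1) * chunkSize),
        chunk: i + 1,                    // lets the client reassemble in order
        totalChunks: totalChunks
    })
}
node.done()
return null
```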

Either way, HTTP will likely handle things a lot better than websockets, I would imagine, as the server should chunk the data efficiently. It would probably be even more efficient to serve a large file directly from something like NGINX than to use Node-RED's ExpressJS server.

Ah, you are half-way to my suggestion already.

It is best to do large queries using prepared statements.
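
To illustrate the idea (this sketch uses the mysql2 module directly; whether the Node-RED mysql node exposes prepared statements depends on the node, and the table/column names are invented):

```javascript
// Plain Node.js sketch: a prepared/parameterised query with the mysql2 module
const mysql = require('mysql2')
const conn = mysql.createConnection({ host: 'localhost', user: 'me', database: 'mydb' })

// execute() prepares the statement once and binds the values server-side
conn.execute(
    'SELECT ts, sensor_id, value FROM readings WHERE ts >= ?',
    ['2022-01-01'],
    function (err, rows) {
        if (err) throw err
        console.log('fetched ' + rows.length + ' rows')
        conn.end()
    }
)
```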

If you are doing two independent SELECT queries, perhaps it would be better to do them separately and send the two result sets to the browser separately. That would halve the amount of data in each one.

Thanks @UnborN for your suggestion.
To enable multiple queries, would I have to modify the code directly inside the contrib node - is that correct?
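
From what I can see, the underlying mysql library itself has a connection option for this - whether it can be reached without patching the contrib node I don't know (the snippet below only shows the library option, not the node's settings):

```javascript
// mysqljs/mysql: multiple statements have to be enabled explicitly
const mysql = require('mysql')
const conn = mysql.createConnection({
    host: 'localhost',
    user: 'me',
    database: 'mydb',
    multipleStatements: true   // allows "SELECT ...; SELECT ...;" in one call
})

conn.query('SELECT * FROM table1; SELECT * FROM table2;', function (err, results) {
    if (err) throw err
    // results[0] holds the rows of table1, results[1] the rows of table2
    conn.end()
})
```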

Thanks @TotallyInformation for your suggestion.
I already tried to retrieve the data with Python code (pythonshell node) but I still have the same problem (worse: the Python script execution also consumes a lot of RAM).
I already tried saving the data to a file and reading it back (file-in node), but I still have the same starvation problem: reading this big file blocks Node-RED (although the problem only becomes visible for bigger queries than when using mysql directly).
I'll follow your indications and try to break the data into chunks (there is also a hint in Exec and daemon nodes break up a data stream), using the exec node and stdout, or the mysql node with single queries (I have to try).

Thanks @Colin for your suggestion.
Sorry for the question, but where can I find the MySQL log?

I just sent that article because I wasn't sure if you were doing a multiple query.
I haven't worked with mySql, so I don't know how the node behaves when you do send it one.

I think it's better to do two separate requests, as @Colin said.
One after the other - the reply of the first somehow triggering the request for the next.

As a test, if you send separately
SELECT * FROM table1
and then
SELECT * FROM table2
do you get a reply from both?

I think that the underlying node.js module is the constraint. Really, it needs a streaming interface. Unfortunately, Node-RED isn't quite so good at that right now.
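
For reference, the underlying library can stream rows as events instead of buffering one giant array; something like this sketch with mysqljs/mysql (getting that cleanly into a Node-RED flow is the awkward part):

```javascript
// mysqljs/mysql row-streaming sketch: rows arrive one event at a time
const mysql = require('mysql')
const conn = mysql.createConnection({ host: 'localhost', user: 'me', database: 'mydb' })

conn.query('SELECT * FROM table1')
    .on('result', function (row) {
        // handle (or batch up) a single row here instead of the whole result set
    })
    .on('error', function (err) {
        console.error(err)
    })
    .on('end', function () {
        conn.end()
    })
```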

Weirdly, the thing that is very good at that is PowerShell - but let's not go down that rabbit hole!
