Node-RED-Dashboard freezes when loading MySQL database

Hello people,

I created a dashboard with which you can take measurements on a Raspberry Pi. The results are displayed in a table and saved in a MySQL database. The user is also able to download the database as CSV.

I had almost finished the project and ran a stress test: one measurement every 200 ms across 50 measuring instruments for one hour. As expected I got around 900,000 rows and a total size of 50 MB. Since then, whenever I try to load the dashboard it freezes and nothing works. The CPU sometimes goes up to 200-300%. I've disabled the table so no data is loaded from the database.

Without the table it works fine again, but when I try to download the CSV it freezes again. So I logged the MySQL slow queries to a file and got this:

/usr/sbin/mysqld, Version: 10.3.34-MariaDB-0+deb10u1-log (Raspbian 10). started with:
Tcp port: 0  Unix socket: /run/mysqld/mysqld.sock
Time		    Id Command	Argument
# Time: 220923 15:22:27
# User@Host: nodered[nodered] @ localhost [127.0.0.1]
# Thread_id: 36  Schema: devices  QC_hit: No
# Query_time: 2.656102  Lock_time: 0.001574  Rows_sent: 879100  Rows_examined: 879100
# Rows_affected: 0  Bytes_sent: 19673746
use devices;
SET timestamp=1663939347;
SELECT * FROM Messungen;

So I tried to query the database manually from the console... with success.

Is there anything I can set up in Node-RED? Why does Node-RED freeze all the time (I mean, 50 MB is nothing)?

Thanks in advance

Justin

What exactly do you mean by download the CSV?

And what are you doing with it?
Reading it all in one chunk? Reading it one line at a time? Feeding it to a chart in one chunk, or point by point? Are you feeding it to a chart? How many points across is it? Have you limited the number you can display? Etc.

@Colin I'm sending a query to the MySQL node to convert the data to CSV and export it to a folder from which the user downloads it.

@dceejay Trying to convert the whole database to CSV or JSON. The table is generated from the JSON and the user can download the database as CSV.

So it's all in memory then... Hopefully you have at least 8 GB of RAM, or at least 4 GB and have edited the service file to set the max RAM to 3 GB - otherwise it will be paging out to disk.

Also you can try breaking the flow at various points after the SQL node to find exactly where the problem is.

Seems like you are right. The browser error message showed an out-of-memory error. I tried to increase the RAM given to Node-RED with node --max_old_space_size=3072 /usr/local/bin/node-red but it didn't work. @dceejay what service file are you talking about?

I've got the 4 GB version of the Raspberry Pi.

@Colin That's what I did. In the first case the only node after the SQL node is the table; in the other case there is no node after it. The SQL node should convert the database to CSV and export the file into the /tmp folder:

(SELECT 'Register', 'Druck', 'Zeitstempel') UNION (SELECT Register, Druck, Zeitstempel FROM Messungen INTO OUTFILE '/tmp/Messungen.csv' FIELDS ENCLOSED BY '\"' TERMINATED BY ';' ESCAPED BY '\"' LINES TERMINATED BY '\\r\\n');

Hi,
the service file is usually /lib/systemd/system/nodered.service - edit the max_old_space setting in there, then use node-red-start to start the service (or node-red-reload to stop and start it). But I suspect you may still be out of luck: if the data is 2 GB and it ever gets modified, you will end up with both the original and the "copied" version in memory at the same time = 4 GB... but you may be lucky.

Hi, thanks for your answer. There is actually no nodered.service inside my /lib/systemd/system. I also thought about splitting the SQL request, because why should I load 900,000 rows into the table at once? When it crashes, Chrome uses something like 6 GB of RAM :face_with_peeking_eye: But I don't really know how to do this yet.

Presumably you didn't install using the recommended script in the Node-RED docs.

What are you going to do with a 900,000-row CSV file?

You might be better off running the SQL command using an exec node; then it will run outside the Node-RED environment.
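Something like this, for example (untested - the credentials are placeholders, and putting the password on the command line is just for illustration):

mysql -u nodered -p'your-password' devices -e "SELECT Register, Druck, Zeitstempel FROM Messungen INTO OUTFILE '/tmp/Messungen.csv' FIELDS TERMINATED BY ';' LINES TERMINATED BY '\r\n'"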

You should probably change your query to pull a certain range of records and have your server (Pi?) piece together the results into one file, if that's all you're doing. You can pull several queries relatively quickly in small chunks and then append each chunk to the file you're trying to create, since appending takes very little extra RAM or disk access. Without knowing what SQL you're using (MySQL, Oracle...), look up the LIMIT keyword and go from there. That should get you a range of results you can then start appending with. The only other thing you would need to know at that point is the size of your table, which you can also get very easily.
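To sketch the idea (an untested Node.js sketch using the mysql2 package; the table and column names come from the slow-query log above, and the connection details are placeholders):

const fs = require('fs')
const mysql = require('mysql2/promise')

const CHUNK = 10000 // rows per query - tune for your RAM

async function exportCsv() {
  const db = await mysql.createConnection({
    host: 'localhost',
    user: 'nodered',
    password: 'your-password', // placeholder
    database: 'devices',
  })
  // write the header once, then append one chunk at a time
  fs.writeFileSync('/tmp/Messungen.csv', 'Register;Druck;Zeitstempel\r\n')
  for (let offset = 0; ; offset += CHUNK) {
    const [rows] = await db.query(
      'SELECT Register, Druck, Zeitstempel FROM Messungen LIMIT ? OFFSET ?',
      [CHUNK, offset]
    )
    if (rows.length === 0) break
    // only CHUNK rows are ever held in memory at once
    const lines = rows
      .map((r) => `${r.Register};${r.Druck};"${r.Zeitstempel}"`)
      .join('\r\n')
    fs.appendFileSync('/tmp/Messungen.csv', lines + '\r\n')
  }
  await db.end()
}

exportCsv().catch(console.error)

(One caveat: with a large OFFSET, MySQL still scans the skipped rows, so a keyset-style WHERE clause on an indexed column would scale better - but the idea is the same.)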


These are measurements that will be compared and visualized. Do you think Node-RED can't handle such requests?

Why do you need to generate a CSV file to compare and visualise? Can't you do that directly from the database?

@Colin No, because the users access the dashboard from their own devices. They don't have access to the database/Pi. The download is working fine for now; the only thing I have to fix right now is the table. I just wrote a script so the database should be available to the AJAX function in Tabulator, but the table won't show up. Is Node-RED maybe blocking the communication?

@Colin The table is displayed now. It's trying to load the data but nothing is shown. I think there's something wrong with my syntax. I've never worked that much with JSON; it would be nice if you could give me a hint.

The content of the AJAX response looks like this:

{"payload":[{"Register":1,"Druck":59,"Zeitstempel":"13:37:41:967"},{"Register":2,"Druck":59,"Zeitstempel":"13:37:41:967"},{"Register":3,"Druck":59,"Zeitstempel":"13:37:41:967"},{"Register":4,"Druck":59,"Zeitstempel":"13:37:41:967"},{"Register":5,"Druck":59,"Zeitstempel":"13:37:41:967"},{"Register":6,"Druck":59,"Zeitstempel":"13:37:41:967"},{"Register":7,"Druck":59,"Zeitstempel":"13:37:41:967"},{"Register":8,"Druck":59,"Zeitstempel":"13:37:41:967"},{"Register":9,"Druck":59,"Zeitstempel":"13:37:41:967"},{"Register":10,"Druck":59,"Zeitstempel":"13:37:41:967"},{"Register":11,"Druck":59,"Zeitstempel":"13:37:41:967"},{"Register":12,"Druck":59,"Zeitstempel":"13:37:41:967"},{"Register":13,"Druck":59,"Zeitstempel":"13:37:41:967"},{"Register":14,"Druck":59,"Zeitstempel":"13:37:41:967"},{"Register":15,"Druck":59,"Zeitstempel":"13:37:41:967"},{"Register":16,"Druck":59,"Zeitstempel":"13:37:41:967"}]}

And the settings of the Tabulator table inside the dashboard template node look like this:

var tableM = new Tabulator("#table-messungen", {
        height:"1000px",
        placeholder:"Keine Daten verfügbar!",
        layout:"fitColumns",
        //groupBy:"Zeitstempel",
        //groupToggleElement:"header",
        ajaxURL:"http://10.207.48.36:5555/fetch-messungen",
        progressiveLoad:"load",
        progressiveLoadDelay:200,
        /*
        initialSort:[
            {column:"Zeitstempel", dir:"asc"},
        ],
         */
        columns:[
            {title:"Register", field:"payload[0].Register", width:150},
            {title:"Druck (Pascal)", field:"payload[0].Druck"},
            {title:"Zeitstempel", field:"payload[0].Zeitstempel"},
        ],
       
    });

The problem is that I'm a bit confused by the JSON we are getting from AJAX. I don't understand how to access the "inner JSON" which is inside the payload array. I tried some things but the content won't show up.

It can handle small amounts, yes. But Node-RED is a scripted environment, not natively compiled code optimized for the hardware/OS involved. If you were running this same level of pull on something like a desktop, or even a server, you probably wouldn't notice the overhead the scripting layer adds. Can the Pi handle large queries and big lists of stuff? Sure, especially in native code. Can it handle that while dynamically RENDERING a table the size of the data you're trying to pull, all in a scripted, interpret-on-demand language? Well, I think you can answer that. I've tried to have a set of small graphs monitoring small datasets (maybe 1,000 time/data pairs each at most) and my Pi started bogging down at about 30 graphs. It takes a lot to render something while pulling data at the same time.

I see nothing wrong with your Tabulator setup. But then again, I don't use the template node to render my tables (I'll use table or etable for that, just for simplicity), so I won't be able to spot obvious errors. It sounds to me like your problem isn't actually pulling the data, since that appears to be functioning normally; it's the rendering. I would try exporting this flow from your Pi, or whatever is hosting your dashboard, to a decently powerful desktop/laptop and see if you still get the same effect. If access is limited or you don't want to export the node, you can try modifying your query to pull only the first 10,000 or so rows of your database and seeing if that succeeds. If so, increase the pull until it breaks. That should also give you information on how much CPU and memory go into rendering your request.
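For example (MySQL/MariaDB syntax, using the table name from the log above; increase the count as you test):

SELECT Register, Druck, Zeitstempel FROM Messungen LIMIT 10000;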

One final thought: why are you doing everything in JSON? There are some really good SQL nodes out there that keep everything in simple data types, so you don't need to parse, interpret, or otherwise process your data. I'm not saying what you're doing is wrong, by any means. Just wondering if there's a simpler way of doing it that circumvents a lot of the issues you're having.


Thanks for your reply! I'm trying to do as much as possible on my own because I realized that performance suffers with most pre-made nodes when it comes to bigger data flows. I actually found out that the Tabulator progressive-loading function only accepts an array, so that's why it's not displaying anything.

I got some code from here to make the database available to AJAX. I think I have to change something in the following code snippet to retrieve an array instead of the wrapped JSON:

var express = require('express')
var path = require('path')
var createError = require('http-errors')
var cors = require('cors')
var bodyParser = require('body-parser')
var app = express()

var dbMySQLNode = require('./database')
// view engine setup
app.set('views', path.join(__dirname, '/'))
app.set('view engine', 'ejs')
app.use(bodyParser.json())
app.use(
  bodyParser.urlencoded({
    extended: true,
  }),
)

app.use(cors())
app.get('/', (req, res) => {
  res.render('index')
})
app.get('/fetch-countries', function (req, res) {
  dbMySQLNode.query('SELECT * FROM Country ORDER BY id desc', function (
    error,
    response,
  ) {
    if (error) {
      res.json({
        msg: error,
      })
    } else {
      res.json({
        msg: 'Data successfully fetched',
        country: response,
      })
    }
  })
})
app.listen(5555, function () {
  console.log('Node app is being served on port: 5555')
})
module.exports = app

But I've actually never worked with Express or body-parser...
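My best guess is to change the route so it sends the bare array, something like this (an untested sketch - /fetch-messungen is the route my Tabulator ajaxURL above points at):

// untested guess: return the rows array directly instead of wrapping it
// in an object, since Tabulator seems to want a plain array
app.get('/fetch-messungen', function (req, res) {
  dbMySQLNode.query('SELECT Register, Druck, Zeitstempel FROM Messungen', function (
    error,
    response,
  ) {
    if (error) {
      res.status(500).json({ msg: error })
    } else {
      res.json(response) // bare array of row objects
    }
  })
})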

Turns out this is the same question cross posted... over here -->

so closing this thread to avoid duplicate effort.
