Problem with data displayed in UI-Table when switching Tabs

I am trying to use the UI-Table node and I see a curious thing:

  • 1 If I add information to the table by sending the complete array, like this:

        msg.payload = [
            { "textValue": "Line #1", "numberValue": 123.12 },
            { "textValue": "Line #2", "numberValue": 100 }
        ];

The data is retained when changing tabs.

  • 2 If I add information by means of a command, for example:

        msg.payload = {
            command: "updateOrAddData",
            arguments: [
                [
                    { "textValue": "Line #1", "numberValue": 123.12 }
                ]
            ]
        };
        return msg;

The data disappears when switching Tabs.

Do you know of a way to make the data remain when switching Tabs?


If you are using commands to alter your table, you have to keep a copy of your table data and replay it when a client connects or a tab is switched. So you should use a ui-control node to catch the "connect" and "change" events and replay your data (either as a complete array (fastest) or as a sequence of addData commands).
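As a minimal sketch of that replay step (assuming the flow caches its rows in flow context under the key `tabledata`, as the demo flow further down does), the function node wired behind ui-control can be this small:

```javascript
// Replay function for a function node fed by ui-control.
// Assumption: rows are cached in flow context under 'tabledata'.
// flow.get/flow.set are stubbed here so the sketch runs standalone;
// inside a real Node-RED function node the runtime provides them.
const flowStore = new Map();
const flow = { get: (k) => flowStore.get(k), set: (k, v) => flowStore.set(k, v) };

function replayTable(msg) {
  // Send the cached rows as one complete array -- the fastest way
  // to rebuild the table after a connect or tab change.
  msg.payload = flow.get('tabledata') || [];
  return msg;
}
```

Sending an empty array simply clears the table, so the function is safe to run before any data has arrived.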

If you store your data in local non-volatile storage (e.g. the file system), your table survives Node-RED restarts too.
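In Node-RED terms that means enabling a file-backed context store in settings.js. A sketch using the documented "localfilesystem" context storage module (store names and layout are up to you):

```javascript
// settings.js excerpt (sketch): persist flow/global context to disk so
// the cached table data survives Node-RED restarts. With this as the
// default store, flow.set('tabledata', ...) is written to the file system.
module.exports = {
  contextStorage: {
    default: { module: "localfilesystem" }
  }
};
```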

As noted in the documentation:

important notices
Data which is sent to ui-table through commands is not cached by ui-table! The flow has to take care to update the table for new clients connection or dashboard tab changes!

Thanks Christian
I see the problem.
The part about using a ui-control node is clear, but the part about saving the data is not so clear (for my programming level).
Maybe the solution is to write in a database.
Can you suggest a database?

It really depends on your data. I use the context storage for all my tables; a database is perhaps overkill. Are you only adding lines, or do you plan to update values? In the latter case you will need to define an index column.
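For the update case, here is a minimal sketch of the command message (ui-table/Tabulator matches rows on the index field, which is "id" by default; the helper name is just for illustration):

```javascript
// Build an updateOrAddData command message for ui-table.
// Rows whose index field ("id" by default) already exist in the table
// are updated in place; unknown ids are appended as new rows.
function buildUpdateMsg(rows) {
  return {
    payload: {
      command: 'updateOrAddData',
      arguments: [rows]  // first Tabulator argument: an array of row objects
    }
  };
}
```

Usage: `return buildUpdateMsg([{ id: 7, Status: 'OK' }]);` in a function node wired to the ui-table node.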

In my case, I only have to add lines, basically with the following fields:

  • Timestamp.
  • Event.
  • Status.

@Christian-Me I just saw your UI-Table Handler development, great.
But it is too much for a newbie like me.
First I'm going to study the subflow documentation and try to use it for my case. If I run into doubts or difficulties, may I send you a message (maybe it's a good idea to close this topic and open a specific one for questions)?

So something like a logfile ....

That example uses a somewhat more complex (but universal) subflow I use for all my tables.

But you can do it more simply too:

A) a function node that gets an array (of lines) from the flow context and pushes the new line onto the end
B) a second function node, triggered by ui-control, that gets the same context array and sends it to ui-table

I split it up into different nodes to make it easier to understand and debug. The tricky part is scrolling the table when you insert your rows at the end of the table.
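The add-and-scroll part of the flow below boils down to this sketch (assuming each row carries an `id` field, as the demo data does):

```javascript
// "addData" function node sketch: append one row, then scroll to it.
// Returns an array of two messages that are sent in sequence from output 1.
function addAndScroll(msg) {
  const add = {
    payload: {
      command: 'addData',
      arguments: [[msg.payload], false]  // false = add at the bottom
    }
  };
  const scroll = {
    payload: {
      command: 'scrollToRow',
      arguments: [msg.payload.id, 'top', true]  // row id, position, ifVisible
    }
  };
  return [[add, scroll]];
}
```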

[{"id":"5157b913.480ca8","type":"inject","z":"d1227ddb.952f7","name":"","props":[{"p":"timestamp","v":"","vt":"date"}],"repeat":"10","crontab":"","once":true,"onceDelay":0.1,"topic":"","x":135,"y":578,"wires":[["13f98b83.2e2974"]]},{"id":"13f98b83.2e2974","type":"function","z":"d1227ddb.952f7","name":"demo data","func":"let counter = context.get('counter') || 0;\ncounter++;\ncontext.set('counter',counter);\n\nvar output = {\n    payload:{\n        id : counter,\n        timestamp : msg.timestamp,\n        text : `Line #${counter}`,\n        value : counter\n    }\n}\n\nreturn output;","outputs":1,"noerr":0,"initialize":"","finalize":"","libs":[],"x":308,"y":578,"wires":[["81a5a8e6.5418f8"]]},{"id":"81a5a8e6.5418f8","type":"function","z":"d1227ddb.952f7","name":"store data","func":"let tabledata = flow.get('tabledata') || [];\ntabledata.push(msg.payload);\nflow.set('tabledata',tabledata);\n\nreturn msg;","outputs":1,"noerr":0,"initialize":"","finalize":"","libs":[],"x":485,"y":578,"wires":[["d6ec8129.da035"]],"icon":"node-red/db.svg"},{"id":"d6ec8129.da035","type":"function","z":"d1227ddb.952f7","name":"addData","func":"//\nvar output={\n    payload:{\n        command: \"addData\",\n        arguments: [\n            [ msg.payload ],\n            false\n        ]\n    }\n  }\n\n//\nvar scroll={\n    payload:{\n        command: \"scrollToRow\",\n        arguments: [\n            msg.payload.id,\n            \"top\",\n            true\n        ]\n    }  \n}\nreturn [[output,scroll]];","outputs":1,"noerr":0,"initialize":"","finalize":"","libs":[],"x":655,"y":578,"wires":[["82bc171.d9d5de8","1c88a7d4.6e7b88"]],"icon":"font-awesome/fa-table"},{"id":"f9eb75ea.c53888","type":"ui_ui_control","z":"d1227ddb.952f7","name":"","events":"all","x":315,"y":646,"wires":[["75d7057b.e0eeec"]]},{"id":"75d7057b.e0eeec","type":"function","z":"d1227ddb.952f7","name":"get data","func":"msg.payload = flow.get('tabledata') || [];\nreturn msg;","outputs":1,"noerr":0,"initialize":"","finalize":"","libs":[],"x":485,"y":646,"wires":[["82bc171.d9d5de8"]],"icon":"node-red/db.svg"},{"id":"82bc171.d9d5de8","type":"ui_table","z":"d1227ddb.952f7","group":"5bf56b63.75b254","name":"","order":0,"width":"6","height":"7","columns":[],"outputs":0,"cts":false,"x":833,"y":612,"wires":[]},{"id":"1c88a7d4.6e7b88","type":"debug","z":"d1227ddb.952f7","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","x":832,"y":578,"wires":[]},{"id":"5bf56b63.75b254","type":"ui_group","name":"Logfile","tab":"bae8e16f.a9faa","order":2,"disp":true,"width":"6","collapse":true},{"id":"bae8e16f.a9faa","type":"ui_tab","name":"ui-table","icon":"dashboard","order":1,"disabled":false,"hidden":false}]

If you plan to dive into the ui-table handler subflow, then please open a new question to keep topics sorted.

Attention: the demo flow (above) will overflow sooner or later, so after a while it would be good to delete the oldest entries.

Updated store data function:

const maxRows = 100;

let tabledata = flow.get('tabledata') || [];
tabledata.push(msg.payload);
while (tabledata.length > maxRows) tabledata.shift();
flow.set('tabledata', tabledata);

return msg;

The table on an open tab can/will still contain more than 100 rows, but when you switch tabs only the maximum number of rows is replayed, so Node-RED will not overflow over time.

You can also store the events in a CSV file and inject them with ui-control or with a button on the dashboard:

Here is an example with the columns Timestamp, Event and Status:

[{"id":"404a5cd2ecc70b5b","type":"csv","z":"cffff426.560868","name":"","sep":",","hdrin":"","hdrout":"all","multi":"one","ret":"\\n","temp":"Timestamp,Event,Status","skip":"0","strings":true,"include_empty_strings":false,"include_null_values":false,"x":450,"y":2060,"wires":[["0bb0dbffa697caaf"]]},{"id":"0bb0dbffa697caaf","type":"file","z":"cffff426.560868","name":"","filename":"datalogger/events.csv","appendNewline":false,"createDir":true,"overwriteFile":"false","encoding":"none","x":660,"y":2060,"wires":[[]]},{"id":"a5e4cc0923b7c711","type":"file in","z":"cffff426.560868","name":"","filename":"datalogger/events.csv","format":"utf8","chunk":false,"sendError":false,"encoding":"none","allProps":false,"x":400,"y":2220,"wires":[["939f1773869c3421"]]},{"id":"939f1773869c3421","type":"csv","z":"cffff426.560868","name":"","sep":",","hdrin":true,"hdrout":"none","multi":"mult","ret":"\\r\\n","temp":"Timestamp,Event,Status","skip":"0","strings":true,"include_empty_strings":false,"include_null_values":false,"x":610,"y":2220,"wires":[["67db5a640db1e1a9"]]},{"id":"67db5a640db1e1a9","type":"ui_table","z":"cffff426.560868","group":"160e81fb.f1c86e","name":"Table","order":11,"width":"18","height":"10","columns":[{"field":"Timestamp","title":"Timestamp","width":"","align":"left","formatter":"plaintext","formatterParams":{"target":"_blank"}},{"field":"Event","title":"Event","width":"","align":"left","formatter":"plaintext","formatterParams":{"target":"_blank"}},{"field":"Status","title":"Status","width":"","align":"left","formatter":"plaintext","formatterParams":{"target":"_blank"}}],"outputs":0,"cts":false,"x":810,"y":2220,"wires":[]},{"id":"966a1cd6a231afaf","type":"comment","z":"cffff426.560868","name":"SAVE EVENTS IN CSV","info":"","x":230,"y":1980,"wires":[]},{"id":"3f849cc34cddf4d2","type":"comment","z":"cffff426.560868","name":"LOAD EVENTS FROM CSV","info":"","x":260,"y":2160,"wires":[]},{"id":"89e707d30ffc3d45","type":"inject","z":"cffff426.560868","name":"YOUR DATA","props":[{"p":"payload"},{"p":"topic","vt":"str"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payloadType":"date","x":210,"y":2060,"wires":[["404a5cd2ecc70b5b"]]},{"id":"4a791bcf00947094","type":"ui_ui_control","z":"cffff426.560868","name":"","events":"all","x":210,"y":2220,"wires":[["a5e4cc0923b7c711"]]},{"id":"160e81fb.f1c86e","type":"ui_group","name":"File Browser","tab":"b63d1f91.68095","order":1,"disp":true,"width":"18","collapse":false},{"id":"b63d1f91.68095","type":"ui_tab","name":"LPR ⭐","icon":"dashboard","order":19,"disabled":false,"hidden":false}]

And to further optimize the system you can create weekly packages and then delete the files every 2 months or however you want.
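One way to cut those weekly packages is to derive an ISO week number for the file name in the function node that feeds the file node. This is a hypothetical helper (`weeklyFilename` and the file-name pattern are not part of any node, just an illustration):

```javascript
// Derive a per-week CSV file name such as "events-2021-W01.csv" so that
// old weekly files can later be deleted or archived in bulk.
function weeklyFilename(date) {
  const d = new Date(Date.UTC(date.getUTCFullYear(), date.getUTCMonth(), date.getUTCDate()));
  // ISO 8601 week: the Thursday of the current week decides year and week number.
  d.setUTCDate(d.getUTCDate() + 4 - (d.getUTCDay() || 7));
  const yearStart = new Date(Date.UTC(d.getUTCFullYear(), 0, 1));
  const week = Math.ceil(((d - yearStart) / 86400000 + 1) / 7);
  return `events-${d.getUTCFullYear()}-W${String(week).padStart(2, '0')}.csv`;
}
```

In the flow above you would set `msg.filename` from this helper instead of using the fixed `datalogger/events.csv` path.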

I have been doing tests, and the most I have been able to load into a table without saturating the dashboard is 1,350,000 events.

And since the CSV files are stored locally, you can make backup copies, for example emailing them every month.

Achraf B.

@achrafb Thank you, I must try this new idea, it opens up a new possibility for me.



Your idea is nice, but it seems to have the disadvantage of reading, writing and transferring the whole table on each update, and of pulling data by a button click, whereas the addData command only transfers the data needed (the whole table on connect, and only one line when a new value arrives).
The context store comes with a nice cache so disk operations are reduced.

But if you need a CSV you can combine both flows.

I can confirm that ui-table stays responsive with a lot of rows, until the browser quits with an out-of-memory error :wink:

@achrafb : No need for a ui-control node in your solution, as ui-table replays the last complete table array on every reconnect or tab change automatically. You only have to do that in your flow if you send single rows or cells via commands. Perhaps an inject node set to fire once on startup can be used to fill the table with historic data after a flow restart.

@carflomi : If you are planning to store long-term logs I would recommend a database like InfluxDB, as it was created especially for time-based data collection (if your data is time/value based).

Hi @Christian-Me ,

Thanks for the suggestion! I will test the context storage in future projects.
Currently I have a polling and event control system for some security systems, with a SQL database only as a backup; I work with the CSV to store the daily events and load them into the table without problems:


Achraf B.


Don’t get me wrong, your solution is fine.

But writing and parsing a CSV on each update, the need to poll your data, and not having an always up-to-date state on your dashboard should be taken into account.

The big advantage of your solution is that no function node is necessary, so perfect if someone is not familiar with coding.

I use ui-table with status tables updated at cell level via the updateOrAddData command, to get updates without blanking the table or losing the scroll position. The collapse state is preserved too.

The main difference is not the CSV part; it is more whether you prefer using commands to alter your table, or replacing and restarting the table on each update. (Under the hood: if you send a new array, the Tabulator instance gets destroyed and a new one is created.)


Hi Christian

Indeed, the CSV solution is simpler and more functional for Carlos's request; if more control over rows is needed, it is highly recommended to work with a database, as you recommended.

And for the issue of row updates there is no problem: in my case I simply load the CSV to see previous records. When new events arrive, the table is not altered or destroyed; they are appended to the CSV and in parallel written to the database and the table:


Achraf B.

Many roads lead to Rome :wink:

Indeed, the Tabulator instance is always destroyed when a new array arrives. Thanks to modern browsers you don't see it, because the browser is smart and our CPUs are fast and have a ton of memory (I know the source code very well):

    if ($scope.table !== undefined) {
        $scope.table = new Tabulator(basediv, opts);

For me, having my roots in computers with 32 KB of RAM clocked at 1 MHz, it doesn't feel right to encode the data, send it to a file, allocate memory, read it back, decode it again, send it to a "display", and then have garbage collection get rid of the unused memory when only a row or cell changed. But this is my preference.

Carlos asked how data updated by a command survives tab changes ... Working with complete arrays always survives tab changes without any effort.

I only use InfluxDB for my long-term data logging, together with Grafana. JavaScript is quite effective as a simple database, whereas SQL is again a lot of overhead (this is why NoSQL databases are popular). For me it is convenient to quickly inspect my data directly in Node-RED:

But now we are getting totally off topic, sorry.
BTW: Nice table design, I like it. :+1:

It is very interesting what you both say; I will try to take the best of both points of view.
It's been a good debate, thanks @Christian-Me and @achrafb


This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.