FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory

Hi all,

I have 8 flows, each with about 60 buttons/inputs on the dashboard (the standard Node-RED dashboard, not UIBuilder). If I only enable 2 or 3 of the flows, everything is fine when deploying Node-RED after making changes. But if I try to deploy all 8 flows, Node-RED dies and I get this error in node-red-log:

FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
nodered.service: Main process exited, code=killed, status=6/ABRT

This error occurs under the section marked:

<--- JS stacktrace --->
==== JS stack trace =========================================

I am on Node-RED v1.0.2 and Node.js v10.17.0, running on a RasPi 4 (4GB) with Buster.

When I search this error online I mostly see advice about upgrading Node.js versions, and I did not find anything specifically related to Node-RED unfortunately.

thanks in advance for any help!

Did you install using the recommended install script for the Pi? If so, you will have a systemd service script at /lib/systemd/system/nodered.service. In there is the line
Environment="NODE_OPTIONS=--max_old_space_size=256"
I am not entirely sure exactly what that does, but I believe it defines how much memory Node.js is allowed to use. That figure is OK for a Pi 3 or Zero, but on a Pi 4 it can be increased if you have the 2GB or greater version. Try changing it to 1000 (i.e. 1GB), restart the service and see if that helps. However, I am not an expert on this so I might be barking up the wrong tree.
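Something like this, assuming the standard Pi install locations (a rough sketch, not verified on every image):

# edit the service file (or add an override with: sudo systemctl edit nodered.service)
sudo nano /lib/systemd/system/nodered.service

# change the line to, for example:
Environment="NODE_OPTIONS=--max_old_space_size=1000"

# pick up the change and restart Node-RED
sudo systemctl daemon-reload
sudo systemctl restart nodered.service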

Not sure that is entirely true. I think it defines when node.js will try to do a garbage collection to recover memory previously used. It is a V8 option and so not really documented in the Node.js docs. However, node.js does come with a library that lets you get and set V8 options. With it you can, for example, examine the state of the heap.

https://nodejs.org/dist/latest-v6.x/docs/api/v8.html#v8_v8_getheapspacestatistics

const v8 = require('v8');

// total_available_size is how much memory V8 can still allocate before hitting the heap limit
const totalHeapSize = v8.getHeapStatistics().total_available_size;
const totalHeapSizeInMB = (totalHeapSize / 1024 / 1024).toFixed(2);

console.log("V8 Total Heap Size", totalHeapSizeInMB, "MB");
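If you want to see how close you are to the limit, used_heap_size and heap_size_limit from the same call are useful (a rough sketch using the same v8 module):

const v8 = require('v8');

// Compare current heap usage against the configured heap limit
const stats = v8.getHeapStatistics();
const usedMB = (stats.used_heap_size / 1024 / 1024).toFixed(2);
const limitMB = (stats.heap_size_limit / 1024 / 1024).toFixed(2);

console.log(`Heap used ${usedMB} MB of a ${limitMB} MB limit`);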

However, I really don't think you want to be messing with all of that. node.js will allocate nearly 2GB of memory on its own without changing any settings, so if you have that setting in your startup, try removing it. To me it sounds like something else is leaking all of your available heap, and I'm not convinced that adjusting the memory limit is really going to fix that.

Or to put it another way, not true at all, as I can confirm having done a bit more research. I blame faulty medium term memory devices between the ears.


I was trying to be nice :smiley:

Actually, there is a lot of misinformation around about that setting.

Hi @cruxcrider, were you able to resolve this issue? This has become a recurring problem for us in a less involved flow that splits an array of objects into distinct msgs.
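For context, the split is roughly this kind of thing in a function node (a simplified sketch of our flow, names made up):

// Fan out each element of an (assumed) array payload as its own message.
// With a large array this can queue thousands of msgs from one trigger,
// which is where we suspect the memory pressure comes from.
const items = msg.payload || [];
for (const item of items) {
    node.send({ payload: item });
}
return null;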

In terms of Node-RED instance config, there are only about 3 flow tabs of low-to-moderate complexity.

Any other thoughts on what it might be?

@utmostGrandPoobah I do not have any really insightful info, unfortunately. I think ultimately I had some sort of buggy Node-RED flow. I have the standard install and did not end up changing anything about Node-RED memory usage. I still have the default ...--max_old_space_size=256


On the most recent Pi with 1GB of RAM you should be able to set it to 512MB, no problem.


@cruxcrider, weird. We were able to resolve our issue using this memory queue custom node from the community: https://flows.nodered.org/node/node-red-contrib-memory-queue

Once we added that to the flow, the Node-RED server stopped crashing and the flow carried on.

Sounds like you were getting some kind of race condition when sending to some node that couldn't quite cope with the speed/size of the data.

The queue node seems to use RxJS to manage a queue of events, so it likely eliminates the race condition.

Just guessing of course.
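For anyone who wants the same effect without the contrib node, a crude version is a function node that buffers msgs and drains them on a timer (just a sketch of the idea, not how node-red-contrib-memory-queue is actually implemented):

// Buffer incoming msgs in node context and release one every 50 ms
// so downstream nodes aren't flooded all at once.
const queue = context.get('queue') || [];
queue.push(msg);
context.set('queue', queue);

if (!context.get('draining')) {
    context.set('draining', true);
    const timer = setInterval(() => {
        const q = context.get('queue') || [];
        const next = q.shift();
        context.set('queue', q);
        if (next) {
            node.send(next);
        } else {
            // queue empty: stop the timer until the next msg arrives
            clearInterval(timer);
            context.set('draining', false);
        }
    }, 50);
}
return null;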
