We have a Node-RED process that tops out at 2GB of RAM, and we would like to allow it to use more memory to meet demand. The details and actions taken so far are below.
Amazon EC2 t3.xlarge
OS: Ubuntu 20.04.3 LTS
Node.js version: 14.18.1 x64
Node-RED version: 3.0.1
Executed command to start the process:
sudo pm2 start "node-red -p 1894 -u app_name --max-old-space-size=6144" -n app_name --max-memory-restart 6144M
When we run sudo pm2 monit and watch the process, the memory line turns red once it reaches 2GB, and at some point the application restarts, reporting a lack of memory.
Does anyone know how we can adjust things so the process can use up to 6GB?
You can certainly attempt to steer the V8 engine to use more memory but it doesn't quite do what most people think.
It mostly steers the garbage-collection routines. While this might end up allowing more memory to be allocated, that isn't guaranteed as far as I can tell. Although most articles tell you the flag increases memory allocation, all it really does is instruct V8's GC routine not to try to reclaim memory so quickly, and it only applies to the heap anyway. Even if it does result in more memory being allocated, it can have a nasty side effect: GC runs may take longer, and your entire Node.js event loop pauses while they happen.
The most effective way to get V8/Node.js to allocate more memory appears to be to make sure the process has more memory available to it in the first place. Newer versions of V8/Node.js are generally pretty good at making use of the memory that is available.
Also worth noting that it is possible to run out of heap anyway if you have very large objects that are regularly being extended in large chunks. That's because, I believe, the heap can become fragmented, leaving no single contiguous space big enough for a new allocation. GC normally takes care of this as well, but occasionally it can't do anything about it.
One thing to do is to make sure you are using modern JS techniques throughout, such as const/let rather than var, since this gives the engine better control over when to release memory allocations. Of course, this can be tricky with something like Node-RED, where you do not have access to all of the code.
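As a small illustration of the scoping difference (the function name and data here are made up):

```javascript
// Sketch: block-scoped declarations let V8 release memory sooner.
function sumChunkSizes(chunks) {
  let total = 0;
  for (const chunk of chunks) {
    // `chunk` goes out of scope at the end of each iteration,
    // so each buffer can be reclaimed before the next is processed.
    total += chunk.length;
  }
  // With `var chunk`, the variable would be function-scoped and the
  // last buffer would stay reachable until the function returned.
  return total;
}

console.log(sumChunkSizes([Buffer.alloc(10), Buffer.alloc(20)])); // 30
```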
Here is the initial allocation of memory on my 32GB desktop running Windows 10 with no steering:
0|Node-RED | V8 Total Heap Size: 4122.75 MB
0|Node-RED | Initial Memory Use (MB): RSS=56.72. Heap: Used=20.70, Tot=29.93. Ext C++=3.17
And this from within the running node-red in a function node:
What is it that you are doing that needs such vast amounts of memory?