Is there a way to increase the amount of memory Node-RED uses? I am processing a rather large number of files and I am getting a "JavaScript heap out of memory" error.
This is a Node.js issue rather than Node-RED as such.
This may help if you have sufficient RAM in your device.
What hardware are you running on?
I saw similar information in other blogs and I tried the following with no luck:
node-red --max-old-space-size=4096
Any ideas on how I would apply this when starting up Node-RED?
MacBook Pro
How do you start Node-RED on your system? Does it start automatically on boot?
What OS are you running? Or is that a silly question on a MacBook, I don't know.
Hi @rmckinnon
By default, node.js does not start garbage collecting unused memory until it hits 2GB. The max-old-space-size setting allows you to set the point at which GC should kick in. It is not a hard limit.
If you are running out of memory on a Mac, then you will almost certainly have over 2GB of free memory, so the Node GC will have already kicked in. Changing the point at which it kicks in is unlikely to have any meaningful effect.
This suggests your flows do have a genuine memory leak in them. Leaks can be hard to track down. In general terms, what is your flow doing? Is it handling large messages?
@knolleary, to help understand this issue I wrote a simple node I call "memory" that outputs the Node.js process memory stats (see the Process page in the Node.js documentation). When I run a simple sequence "inject -> memory -> debug", I am seeing a total memory size of about 54MB.
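For anyone reading along, here is a minimal sketch of how such a node could be built as a standard Function node. This is my guess at an equivalent, not the poster's actual implementation; process.memoryUsage() is the documented Node.js API it would wrap.

// Hypothetical "memory" node as a Node-RED Function node: report the
// Node.js process memory statistics on each incoming message.
const usage = process.memoryUsage();
msg.payload = {
    rss: usage.rss,             // resident set size: total memory held by the process
    heapTotal: usage.heapTotal, // memory V8 has allocated for the JS heap
    heapUsed: usage.heapUsed,   // portion of the heap actually in use
    external: usage.external    // memory used by C++ objects tied to JS objects
};
return msg;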
The sequence where I get the "JavaScript heap out of memory" issue is the one where I am trying to process thousands of files using nodes from the "node-red-contrib-fs-ops" palette. My thought is that if I could raise the 54MB of memory to 4GB, I should be good.
I start it from the command line using the node-red script. This script is created when you install Node-RED using the npm tool that comes with Node.js.
It runs macOS High Sierra.
You don't allocate memory to Node. Node asks the OS for memory when it needs it. You hit out-of-memory when the OS is unable to allocate any more.
If you are trying to process thousands of files using NR then it sounds like you're trying to do too much at once. How does the flow operate? Have you built any throttling in, or is your flow trying to handle the thousands of files in one go?
It may not be relevant to the problem here, but I recently ran into a problem when I was using the exec node to carry out operations on a bunch of files. I passed a sequence of messages to the exec node, not realising that each time it received a new message it started the command running even if the previous one had not completed (even though I was running in exec mode, not spawn mode). The result was that I had thousands of processes all running at once. I didn't run out of memory, but it certainly clogged up my system for a while. I solved it by using node-red-contrib-semaphore to ensure that only one would run at a time.
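To illustrate the point, here is a hedged sketch in plain Node.js (outside Node-RED, with "gzip" as a stand-in command) of the difference between firing off every command at once and serialising them the way a semaphore of size one does:

const { execFile } = require('child_process');
const { promisify } = require('util');
const run = promisify(execFile);

// Unthrottled: exec-style calls return immediately, so this loop starts
// one child process per file, all at the same time.
// files.forEach(file => run('gzip', [file]));

// Throttled: await each command before starting the next, so only one
// child process runs at a time.
async function processSequentially(files) {
    for (const file of files) {
        await run('gzip', [file]);
    }
}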
I have a Java background, so I was assuming it worked similarly.
For clarity I am adding a picture of the sequence:
I think I am hearing that Node-RED depends on the OS to allocate its memory. I am assuming that if I can give the OS more memory then, theoretically, Node-RED should be able to ask for more memory.
As for the issue with my sequence: I am sure I can solve it by breaking things up, but I was looking to see if we could brute-force a solution.
@Colin - Thanks for the tip, I will check out node-red-contrib-semaphore.
I made some headway with the brute-force approach. When I started Node-RED with the following command, I was able to complete the sequence without a "JavaScript heap out of memory" error.
node --max_old_space_size=8192 /Users/rob/.nvm/versions/node/v10.11.0/bin/node-red
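(V8 accepts both the --max_old_space_size and --max-old-space-size spellings of that flag.) For anyone who wants to confirm the flag actually took effect, here is a small sketch of my own, not from the thread, that prints V8's heap ceiling; run it with the same flag and you should see a limit of roughly 8192MB:

// check-heap.js — verify the configured heap limit took effect.
// Run with: node --max_old_space_size=8192 check-heap.js
const v8 = require('v8');

// heap_size_limit reflects max-old-space-size plus a little V8 overhead,
// so expect a value slightly above the number you passed.
const limitMB = v8.getHeapStatistics().heap_size_limit / (1024 * 1024);
console.log(`V8 heap size limit: ${Math.round(limitMB)} MB`);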