I am running Node-RED in a Docker container on Ubuntu. When Node-RED starts it consumes about 2.6GB of memory, then in less than 24 hours it is up to 5.95GB (which is near the maximum available). I think there is a memory leak. How can I troubleshoot the issue?
Hi and welcome 
Can we try and simplify the problem?
Does it happen with a completely empty flow or with a set of them?
Can you (temporarily if needed) do a standard (non-Docker) install - does it behave the same or not?
If you can, post your flow here and maybe someone can test it out on their machine using Docker.
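It is also worth watching where the growth actually is - Node's JS heap or native Buffers - rather than just the container total. A minimal sketch (the interval and log wording are just examples) for the body of a function node, fed by an inject node set to repeat every minute or so:

```javascript
// Body of a Node-RED function node, wired to an inject node that repeats
// every minute or so. Each run logs the Node.js memory figures so you can
// see whether heapUsed keeps climbing (a JS leak) or whether the growth is
// in "external" (typically Buffers, e.g. binary payloads).
const mu = process.memoryUsage();
const mb = (n) => (n / 1024 / 1024).toFixed(1) + " MB";

node.warn(
    `rss=${mb(mu.rss)} heapUsed=${mb(mu.heapUsed)} ` +
    `heapTotal=${mb(mu.heapTotal)} external=${mb(mu.external)}`
);

return msg;
```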
Just a shot in the dark, but I had a similar issue when I first started using Node-RED.
It turned out that if you use an rbe node on binary data it's pretty much inevitable.
Here is the follow-up thread to the original where I stumbled onto the solution:
https://discourse.nodered.org/t/memory-leak-what-am-i-doing-wrong/627/23
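From what I understand, the rbe node has to keep a copy of the previous payload to compare against, which gets expensive with large binary Buffers. One way around it (just a sketch of the general idea, not necessarily what the linked thread recommends - it assumes crypto has been exposed to function nodes via functionGlobalContext in settings.js) is to do the "has it changed?" check yourself on a small hash, so the binary data never goes through rbe:

```javascript
// Body of a Node-RED function node placed where the rbe node used to be.
// Assumes settings.js contains:
//   functionGlobalContext: { crypto: require('crypto') }
// Instead of letting rbe hold on to the previous binary payload, keep only
// a short hash of it in node context and drop unchanged messages.

const crypto = global.get("crypto");

const hash = crypto.createHash("sha1")
    .update(msg.payload)               // msg.payload assumed to be a Buffer
    .digest("hex");

if (hash === context.get("lastHash")) {
    return null;                       // unchanged -> drop, like rbe would
}

context.set("lastHash", hash);         // only a 40-character string is kept
return msg;                            // changed -> pass the message on
```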
My non-docker install of node-red on Ubuntu says it is using 96MiB.
70MiB on my Windows 10 test setup which includes quite a number of uibuilder, moment, fs and other test flows.
I tried a non-Docker install and it only uses 1.53GB, and it stays there.
So, I think that I will continue with the non-Docker install.
That is still ten or twenty times what is usual. You need to understand why. Are you handling big video files or something?
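If it is not obvious from the flows, a heap snapshot will usually show what is holding the memory. A rough sketch (it assumes the built-in v8 module has been exposed via functionGlobalContext, and that the path is writable - /data is just the usual Docker data volume, adjust as needed); open the resulting .heapsnapshot file in Chrome DevTools' Memory tab:

```javascript
// Body of a Node-RED function node, triggered manually by an inject node.
// Assumes settings.js contains:
//   functionGlobalContext: { v8: require('v8') }
// Writing a snapshot pauses Node briefly and the file can be large, so
// only trigger this by hand, not on a timer.

const v8 = global.get("v8");

// Write to the data directory (adjust the path for your setup).
const file = v8.writeHeapSnapshot(`/data/heap-${Date.now()}.heapsnapshot`);

node.warn(`Heap snapshot written to ${file}`);
return msg;
```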
Maybe they don't.
Maybe they are quite happy that it works.
Unless there is a good and understood reason for it using so much memory, there is a strong possibility that the fact that it appears to work correctly is just an illusion.