It mostly steers the garbage collection routines. While it might end up allocating more memory, that isn't guaranteed as far as I can tell. Most articles tell you it increases memory allocation, but all it really does is tell the V8 engine's GC not to try to recover memory so aggressively, and it only applies to the heap anyway. Even if it does result in more memory being allocated, it can have a nasty side-effect: GC may take longer to run, and the entire Node.js event loop pauses while that happens.
The most effective way to get V8/Node.js to allocate more memory appears to be to make sure the process has more memory available to it in the first place. Recent versions of V8/Node.js are generally pretty good at making use of whatever memory is available.
It is also worth noting that you can run out of heap anyway if you have very large objects that are regularly being extended in large chunks. That is because, I believe, the heap can become fragmented so that there is no single contiguous space large enough to allocate a new section. Normally GC takes care of this too, but occasionally there is nothing it can do.
One thing you can do is make sure you are using modern JS techniques throughout, such as const/let rather than var, since block scoping gives the engine better control over when memory allocations can be released. Of course, this is tricky with something like Node-RED where you do not have access to all of the code.
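To illustrate the const/let point, here is a minimal sketch (my own example, not from any particular flow): a buffer declared with const inside a block becomes unreachable as soon as the block exits, whereas a var declared in the same spot would be hoisted to function scope and stay reachable until the whole function returns.

```javascript
// Sketch: block scoping lets the engine release memory sooner.
function processChunk() {
    let result;
    {
        // 10 MB scratch buffer, scoped to this inner block only.
        const bigBuf = Buffer.alloc(10 * 1024 * 1024);
        result = bigBuf.length;
    } // bigBuf is out of scope here and eligible for collection

    // Long-running work can continue without the 10 MB being pinned.
    return result;
}

console.log(processChunk()); // logs the buffer size in bytes
```

Had `bigBuf` been declared with `var`, it would remain reachable for the lifetime of `processChunk`, so the GC could not reclaim it until the function returned.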
Here is the initial allocation of memory on my 32GB desktop running Windows 10 with no steering:
0|Node-RED | V8 Total Heap Size: 4122.75 MB
0|Node-RED | Initial Memory Use (MB): RSS=56.72. Heap: Used=20.70, Tot=29.93. Ext C++=3.17
And this is from within the running Node-RED instance, in a function node: