To ship internal custom nodes to our users faster and with a smaller install size, we are trying out pnpm as a replacement for npm in our NodeJS environment for Node-RED.
Although pnpm seems to beat npm on installation speed and size for NodeJS in general, we are not sure how that will turn out with Node-RED.
For one, I assume that pnpm will at least have to co-exist with npm, because community nodes are still installed using npm through the Node-RED workspace, which cannot be replaced without hacking NR internals. Is that correct?
The scenarios where we want pnpm as a drop-in replacement for npm are roughly:
pnpm install -g node-red
under ~/.node-red, pnpm install /path/to/my_custom_node
under /path/to/my_custom_node: pnpm install
Will NR keep working if we make such a drop-in change?
First time I've heard of pnpm - so you are blazing the trail. You will have to rely on your own testing to ensure it is working as you need. Your feedback would be useful. (I can't see why it shouldn't work - just I have never seen/tested it)
I'm pretty sure this will break uibuilder. It relies heavily on npm and needs to know where modules actually live in some cases in order to be able to make them available as web endpoints. That is hard enough just with npm itself but it looks like pnpm uses an entirely different structure.
Also, uibuilder, like Node-RED itself, assumes that npm will always be available. Since npm does not provide a stable API, you have to call it from an exec.
Have you already benchmarked these aspects?
I would assume the difference is negligible - if you intend to ship a single node as an independent unit:
Each node needs its environment to run, thus all dependencies have to be fulfilled - consequently made available either online or offline. This is independent of the package manager you use...
Once a package is installed, npm doesn't install it a second time. The check for availability consumes almost no time...
If you ship all your nodes in a combined package, the situation is similar - and you can only save some bytes if you make your dependencies available offline.
One idea I've had for the future - since uibuilder does have quite a few dependencies - is to find a way to reduce those to dev dependencies and have some kind of build process so that I don't need "live" dependencies at all. I'm not quite clever/knowledgeable enough at present to have worked out exactly how to achieve that. But if it could be done, that would potentially be another way out of the issue.
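One possible shape for that idea (entirely hypothetical - the bundler choice and paths are assumptions, not a worked-out plan): keep the front-end libraries in devDependencies and bundle them at publish time, so the published package carries no runtime dependencies at all.

```json
{
  "devDependencies": {
    "esbuild": "^0.19.0"
  },
  "scripts": {
    "build": "esbuild src/index.js --bundle --outfile=dist/index.js"
  },
  "dependencies": {}
}
```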
Either way, the use of pnpm is going to need very extensive testing. Unfortunately, not something I can take on right now.
Do you mean that they share actual dependencies, such as modules published on npm, or that they duplicate similar JS functions inside each node?
If you find yourself writing the same code in many different places, then put that into its own module and publish it. The nodes can then require it as a dependency and you can reduce some of the duplication.
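A minimal sketch of that factoring-out (the package name and helper function are made up for illustration):

```javascript
// Hypothetical shared package, e.g. "our-node-helpers", published once to an
// internal registry instead of being copy-pasted into every node.
function normalisePayload(msg) {
  // Trim the payload while leaving the rest of the message untouched.
  return Object.assign({}, msg, { payload: String(msg.payload).trim() });
}

module.exports = { normalisePayload };

// Each custom node then lists "our-node-helpers" in its dependencies and does:
//   const { normalisePayload } = require('our-node-helpers');
```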
If the dependencies are 3rd-party modules that you don't own, then as long as you are properly versioning the dependencies with ranges, there should not be multiple copies. Just don't be too strict by pinning an exact version in each of your nodes.
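For example (hypothetical package names), two nodes declaring compatible semver ranges can share one installed copy:

```json
{
  "name": "my-custom-node-a",
  "version": "1.0.0",
  "dependencies": {
    "shared-helper": "^1.2.0"
  }
}
```

If another node asks for `"shared-helper": "^1.4.0"`, npm can hoist a single 1.4.x copy that satisfies both ranges; pinning an exact `"1.2.3"` in one node would force a duplicate instead.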
Very soon we'll find ourselves working with lots of node_modules containing duplicates.
Worse, our internal network basically can't npm-install reliably from the internet for security reasons. So we have to ship these node_modules to our users, leading to a huge installer for every little increment.
One strategy we are currently trying is to install everything into ~/.node-red/node_modules. But this seems to force us to use absolute paths when we require() dependencies. Not impossible, but very annoying, considering that some transitive dependencies expect to require() things from their usual location, such as a global installation.
BTW, we've tried to hack module.paths with a custom location, but that seems to bring more trouble than benefit.
The problem though with the alternatives is that they aren't universally used. So no matter what, npm has to be taken into account.
AFAIK, npm is supposed to flatten everything except where it causes a version conflict. So if two modules need different versions of the same dependent module, the first installation should be flat but the 2nd will install relative to the parent. That certainly seems to generally be the case.
Have you tried using the peerDependencies option in package.json? Use that together with peerDependenciesMeta to mark a peer dependency as optional. If your nodes are private and only being used for your application, then you can easily know what the dependencies are, install them directly, and make them available for your nodes to use.
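A sketch of that package.json shape (hypothetical names):

```json
{
  "name": "my-custom-node-b",
  "version": "1.0.0",
  "peerDependencies": {
    "shared-helper": "^1.0.0"
  },
  "peerDependenciesMeta": {
    "shared-helper": {
      "optional": true
    }
  }
}
```

The node then relies on shared-helper being installed once at the top level (e.g. in ~/.node-red) instead of carrying its own copy.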
I now have a workaround: just don't npm-install but instead only npm-link MyPackage into ~/.node-red. This way MyPackage is not listed as a dependency in ~/.node-red/package.json and my node runs fine on both Windows and macOS. But I'm not sure if this is best practice. It feels like over-engineering to me.
I hope that upgrading to the recommended NodeJS/NPM versions will improve things a bit.
I'll report back when that happens. But any advice is welcome at this point!