More VS Code + GitHub Copilot fun - use current documentation

Hi all node devs. Here is today's discovery.

If you are using LLMs for coding assistance you may be aware of the latest thing: MCP servers. These are API interfaces that allow the AI to easily get hold of more up-to-date or private data.
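For the curious, an MCP server is just a small program that exposes "tools" over a standard protocol. Here is a hedged sketch using the official `@modelcontextprotocol/sdk` - the `get_docs` tool and its behaviour are invented for illustration, and the SDK's API may have moved on, so check its README:

```js
// Sketch of a minimal MCP server (ESM; needs "type": "module" in package.json).
// The "get_docs" tool is a made-up example, not a real Context7 tool.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "docs-demo", version: "1.0.0" });

server.tool("get_docs", { topic: z.string() }, async ({ topic }) => ({
  content: [{ type: "text", text: `Current docs for ${topic} would go here` }],
}));

// Talk to the editor/agent over stdio
await server.connect(new StdioServerTransport());
```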

I discovered a free public MCP server called Context7 (you can install it locally instead if you prefer). It gives LLMs access to current documentation - AIs often lag behind on this because their training data is only updated occasionally.

It seems that someone in this community already uses it because I was pleasantly surprised to discover that UIBUILDER has already been added to their catalogue. :smiley:

Various Node-RED related documentation is also there.

I've now also added the documentation for my web components library as well (see "Context7 - Up-to-date documentation for LLMs and AI code editors").

Context7 seems to work with most AI-enhanced code editors, but VS Code has a dedicated page for installing MCP servers, which is very handy.
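If you'd rather configure it by hand, a minimal sketch of a `.vscode/mcp.json` entry follows. The `@upstash/context7-mcp` package name comes from Context7's own install instructions, but do check their current docs before relying on it:

```json
{
  "servers": {
    "context7": {
      "command": "npx",
      "args": ["-y", "@upstash/context7-mcp"]
    }
  }
}
```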

Copilot's LLM has been trained on GitHub.

Node-RED has a repo on GitHub.

Using an LLM for Node-RED will probably not work as intended.

Since you are wiring a program out of task events, you are already doing cascading architecture: your sequence of operations is already subdivided into elementary tasks in a deterministic way.
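To make that concrete, here is a purely conceptual sketch in plain JavaScript (not Node-RED's actual runtime API; the function names are made up) of what a wired flow amounts to - a deterministic composition of elementary message transforms:

```js
// Conceptual only: a Node-RED wire chain behaves like composing small,
// deterministic functions over a msg object.
const parse  = (msg) => ({ ...msg, payload: JSON.parse(msg.payload) });
const enrich = (msg) => ({ ...msg, topic: "sensor/reading" });

// inject -> parse -> enrich -> debug, as elementary steps in a fixed order
const flow = (msg) => enrich(parse(msg));

console.log(flow({ payload: '{"temp": 21}' }));
```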

"Change module" and JSONata diagnosis

The Change node is, and has always been, a little too powerful in Node-RED for its own good -> prod issues usually come from it.
You can generate a very large set of test cases to check your change, and that has been a great help in discovering loopholes.
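As a hedged illustration of that testing approach (the expression and sample payloads here are invented), you can evaluate a Change node's JSONata expression outside Node-RED with the `jsonata` npm package and run it against a batch of edge cases:

```js
// Sketch: exercising a JSONata expression (as a Change node would use it)
// against several sample messages, using the `jsonata` npm package.
const jsonata = require("jsonata");

const expr = jsonata("payload.items[price > 10].name"); // hypothetical rule

const samples = [
  { payload: { items: [{ name: "a", price: 5 }, { name: "b", price: 15 }] } },
  { payload: { items: [] } }, // edge case: empty list
  { payload: {} },            // edge case: property missing entirely
];

(async () => {
  for (const msg of samples) {
    // evaluate() returns a Promise in jsonata 2.x
    console.log(await expr.evaluate(msg));
  }
})();
```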

PROD grade node-red package without docker/podman/forge

I don't know if it's widely used by Node-RED users, but you might have noticed that you can actually use Node-RED with WASM.

You might want to add the WASM part to it, as it is probably the most powerful combination of production tooling for Node-RED projects.
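For context, a minimal Node.js sketch of the WASM side (the `transform.wasm` file and its `add` export are hypothetical) shows how little glue is involved:

```js
// Minimal sketch: loading and calling a WASM module from Node.js.
// "transform.wasm" and its exported "add" function are hypothetical.
const fs = require("node:fs");

async function main() {
  const buf = fs.readFileSync("./transform.wasm");
  const { instance } = await WebAssembly.instantiate(buf);
  console.log(instance.exports.add(2, 3)); // call an exported function
}

main();
```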

It is more about documentation.