Organizing flows into multiple files


#1

Hi,

I'm trying to find a good pattern to share flows across multiple projects.

As I understand it, libraries are for JS code only, and subflows are stored as part of the normal user flow file.

I would like to create one or more repos that provide reusable flows, which can then be incorporated into bigger projects.

Looking forward to any suggestion. Thanks a lot...


#2

Hi @TheLongAndOnly

the built-in library is not just for JavaScript Functions; you can add flows to it:

  • select the nodes to save to the library
  • pick the Export->Library menu option
  • give it a name and you're done

This lets you copy flows between projects. However, once you copy a flow in, the link is broken: if you update the flow in the library, the projects you previously copied that flow into are not updated automatically. We don't have a good solution for that particular aspect today.
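For anyone wondering what that produces on disk: the built-in library saves each entry as a plain JSON file under the runtime's user directory, by default something like `~/.node-red/lib/flows/<name>.json`, containing the exported node array. The ids, node types and names below are purely illustrative of the usual shape:

```json
[
    {
        "id": "a1b2c3.d4e5f6",
        "type": "inject",
        "name": "tick",
        "wires": [["f6e5d4.c3b2a1"]]
    },
    {
        "id": "f6e5d4.c3b2a1",
        "type": "debug",
        "name": "log",
        "wires": []
    }
]
```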

One of the main outstanding items on our roadmap to 1.0 is an overhaul of the library user experience in the editor. Thinking about how to make library elements more shareable, reusable and updatable is all part of the plan.


#3

Thanks a lot for the clarification. I hadn't realized the library option existed. Nevertheless, it won't fully suit my needs, as that update capability is exactly what I would require.

One thing came to mind: subflows seem quite close to what I need: multiple instances of the same flow, but only one source. What about an option to store subflows in separate flow files? Combined with a configurable list of search folders for subflows, that would come close.


#4

If you put your lib folder under git control, then you can check it out on other systems as you need it.


#5

I was just about to post a question along these lines. Do you actually do this? What has your experience been? I haven't tried it yet, because I'm not sure whether on balance it's a good idea.

The Library is already shared among projects on the same host, so I think it answers the OP's question. Git would allow sharing among hosts, but I can see some issues. Do you push commits every time a new flow is saved? I would think the common Library could get large if you do, but synchronization could get to be a problem if you don't. What if you modify a library flow on one host and want an instance running on another host to change? or not? Do you use sub-folders? I already do on some of my machines, but do you organize by topic? by host? Etc.


#6

One of the scenarios we need to support is where you have a team of developers collaborating. They may not be working on the same physical instance of Node-RED, but they want to be able to easily share assets (i.e. flows/subflows) between themselves.

Putting the library under git control is certainly one way of doing this - and something I'm looking at for the library overhaul in the roadmap.

You can certainly do this manually today. But what you don't get is any automatic updating of what your deployed flows are using. As I said in my original reply, once you copy something from the library into your deployed flow, you lose that link. You still have to update your deployed flows manually to incorporate any subsequent changes.


#7

No, as I don't have the need that is being talked about here. Just suggesting one possibility.


#8

That may, or may not, be what is wanted.


#9

In theory, you should be able to "snip" out the part of the flow file you want using a script. It is just JSON, after all. You wouldn't get any credentials, of course, so you might still have issues.

Not pretty, and probably rather fragile, but it might be possible?