I want to make simple HTTP calls from within a function node. Ideally so I can do:
a = get_value(b);
from my code. The function needs to call an external REST API to get the value.
The function node needs to make multiple such calls, which are not known beforehand, and accumulate all the results before proceeding. Hence it would not be easy to use an HTTP Request node for this.
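Roughly the calling shape I am after in the function node (all the names here are placeholders, and I don't mind if it ends up needing an await, as long as each call finishes before I decide the next one):

```javascript
// Hypothetical sketch: make an unknown number of lookups and accumulate
// the results before carrying on.
const results = {};
for (const key of msg.payload.keys) {      // which keys are needed varies per message
    results[key] = await get_value(key);   // each call must complete before the next
}
msg.payload = results;
return msg;
```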
I have tried with axios, got, http, https... and ended up not seeing the wood for the trees, because they all work asynchronously (obviously) and I tied myself in knots with all the async/await stuff and/or promises.
Ultimately I would like the definition of "get_value" to be reusable in many function nodes, so it will likely end up as an NPM module.
Can anyone please help me get off the ground with this?
Using HTTP Request nodes would be unwieldy in this case, as I may be making 10-20 such calls. For each call a Function node would be required to construct the URL; that would chain to an HTTP Request node, then another Function node to park the response data somewhere and prepare the next URL... And the exact set of calls to be made cannot be determined up front, so that flow would be horrendously complex and a bugger to understand and maintain later.
Steve has already pointed out how to chain HTTP Request nodes and collect all the data by putting in intermediate Change nodes.
But doing this in a function node is not much harder either. The trick is not to use the default return msg; at the end of the function. Remove that and use node.send() inside your promise handling. Then you can chain function nodes without having to worry about the complexities of nested promises.
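As a minimal sketch of that pattern (assuming axios has been made available to the Function node, and with a made-up URL):

```javascript
// No `return msg;` at the end - the message is sent from inside the
// promise handling instead.
axios.get("http://frontend.local:1880/api/values/" + msg.topic)
    .then(res => {
        msg.payload = res.data;   // attach the API result
        node.send(msg);           // forward the message asynchronously
        node.done();              // tell the runtime this message is finished
    })
    .catch(err => node.error(err, msg));

return null;                      // nothing returned synchronously
```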
Thanks... I am fully aware of how to chain nodes together, I have been doing this for several years... I said already that the logic in the function node is complex and will require multiple outbound calls in a pattern that is determined by the input message. I am looking for a generic solution here, so it can be reused in multiple nodes, flows and instances. The possible routes through the code are essentially infinite. HTTP Request and chains of function nodes are not a viable solution.
Perhaps a bit of background might help: I have a home automation setup running on NR with many integrations. It is structured like an IoT system (that's my day job), including an in-memory database of both configuration info and live values, all hung together with MQTT. The solution has grown to a large number of flows and I now want to split the processing across at least two NR instances, initially one for front-end processing (local and external integrations with heating/lighting/Google/Zigbee/Bluetooth etc.) and one for backend logic (nobody at home => turn heating off, that kind of thing).

But now I have an architectural dilemma, as everything is built to assume simple access to the in-memory database (held in the global context). I have made a REST API to return the live data and configuration data, which will run in the frontend instance. Now I want to refactor the code in the backend to get the values it needs from a function call (wrapping the REST call) instead of directly accessing the data in the context.
Yes, but that wouldn't solve the problem. I would then need an implementation of my "get_value()" function to query the database, and that would have to be async too.
The problem is not in the easy bit (the frontend where the processing is essentially linear). It's in the backend, where I might be doing "if Colin arrives home, and the light in the kitchen is OFF, then disarm the alarm". My arrival would be the trigger for the flow, and during the processing I do a lookup on the state of the kitchen light. But all this logic is not written in JS or any other programming language - it is configured declaratively in JSON files which are interpreted by the code in the function nodes. That's why I cannot predict which values I will need (and hence which REST calls will be needed). All I want is an implementation of "get_value()" that "appears" to be synchronous so I get the result of one call before I make the next move.
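For illustration, something along these lines is what I mean; the URL is made up and axios is just a stand-in for whichever HTTP library I end up with:

```javascript
// Hypothetical wrapper around the REST API in the frontend instance.
async function get_value(name) {
    const res = await axios.get(
        "http://frontend.local:1880/api/values/" + encodeURIComponent(name)
    );
    return res.data;
}

// The interpreter can then wait for each lookup before taking the next step.
// arrivedHome is a placeholder for the trigger condition.
const kitchenLight = await get_value("kitchen/light");
if (arrivedHome && kitchenLight === "OFF") {
    // disarm the alarm...
}
```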
If I'm being honest, for something like that I would do it differently. I would almost certainly create either my own module that I loaded into Node-RED via settings globals or into a function node via setup/modules, or I would create a custom node.
By offloading the complex processing into a module or node, I would be able to split down all of the various functions needed into logical components without creating the complexity you refer to in the flow.
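As a rough illustration of the settings route (the module name "home-api-client" is invented), the module can be exposed to every Function node from settings.js:

```javascript
// settings.js (fragment): expose a hypothetical module to all Function
// nodes via the global context.
functionGlobalContext: {
    homeApi: require('home-api-client')
},
```

A Function node then picks it up with const homeApi = global.get('homeApi'). Alternatively, with functionExternalModules enabled in settings.js, the module can be added on the Function node's Setup tab and used directly as a variable in the node.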
As an example, this is what I did with the Drayton Wiser smart heating functions. I initially created a module that I loaded into Node-RED and called via a collection of simple function nodes. Later on, I put everything together into a set of custom nodes so that it was easier to build logical flows out of.
Indeed, the Wiser module and nodes make use of another feature that would almost certainly greatly simplify your processing: events. Using a custom event emitter (the EventEmitter is native to Node.js, though I used the smarter eventemitter2 package for the custom node as it gives wildcard subscriptions), you can disaggregate the async processing quite easily.
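A minimal sketch of that idea, assuming the eventemitter2 package (the event names here are invented):

```javascript
const { EventEmitter2 } = require('eventemitter2');

// One shared emitter with wildcard subscriptions enabled.
const events = new EventEmitter2({ wildcard: true });

// A consumer subscribes to a whole family of events...
events.on('wiser.roomStat.*', (data) => {
    // react to the update whenever it arrives, regardless of call order
    console.log(data);
});

// ...and the async API handling simply emits results as they come in.
events.emit('wiser.roomStat.kitchen', { temperature: 21.5 });
```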
The Wiser module and nodes are actually quite a similar use case to your own, in that the biggest challenge was handling the async nature of the API calls. You can find both on my GitHub.
Yes, I am definitely planning to put all the REST-handling stuff into a module. I just thought it would be easier to get it working in a function node first. Looks like I was wrong and I am going to have to do some refactoring.