Multiplexing HTTP Requests

Hi all,

I'm working on a flow that interrogates a REST API (the EVE Online API) over multiple steps. First, it retrieves a list of entities by ID ( ). Then the list is parsed and split into further request objects to obtain more information on each entity (e.g. ). This stage already generates a huge number of TCP connections and puts a heavy load on the Node-RED server. Next, a further step makes requests for a list of sub-entities, and this is where the server finally fails to cope with the load: either the sockets all fail for the requests, or Node-RED gives up the ghost and crashes entirely.

The array returned from the initial request is in the region of 32,000 entries, to give you an idea.

Is there a way I can pass a list of URLs to the HTTP Request node (or another node I can use for this purpose) that would let me use HTTP/2's multiplexing capability to issue a huge number of requests to the server at once without everything collapsing on me?

Alternatively, what would be the best way of managing the load on the HTTP Request node? I've looked into the delay node's rate limiting and the batch node, but neither has worked so far.
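One thing I've considered: if the endpoint accepts several IDs per call (an assumption about the API, not something I've confirmed), the 32,000-entry array could be split into fixed-size batches in a function node so each batch becomes one request instead of one request per ID. A minimal sketch of the chunking:

```javascript
// Split a large array of IDs into fixed-size batches, so each batch can
// be sent as a single request rather than one request per entry.
function chunk(items, size) {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// e.g. 32,000 IDs in batches of 1,000 -> 32 requests instead of 32,000.
const ids = Array.from({ length: 32000 }, (_, i) => i + 1);
const batches = chunk(ids, 1000);
console.log(`${batches.length} batches`);
```

In a function node each batch would then go out as its own `msg`, which the delay node can rate-limit downstream.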


One possibility would be to use node-red-contrib-semaphore to ensure that the flow only handles a defined number of requests at a time.
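The idea behind the semaphore approach can be sketched in plain JavaScript: cap how many requests are in flight at once, and start the next one only when a slot frees up. `mapWithLimit` is a hypothetical helper name, and the worker here is a stand-in for the actual HTTP call.

```javascript
// Run `worker` over `items` with at most `limit` tasks in flight at once:
// the same back-pressure a semaphore node provides inside a flow.
async function mapWithLimit(items, limit, worker) {
  const results = new Array(items.length);
  let next = 0;

  // Each "lane" repeatedly claims the next unprocessed index. Since JS is
  // single-threaded, the next++ claim is safe without locking.
  async function lane() {
    while (next < items.length) {
      const i = next++;
      results[i] = await worker(items[i]);
    }
  }

  await Promise.all(
    Array.from({ length: Math.min(limit, items.length) }, lane)
  );
  return results;
}

// Usage: at most 10 concurrent "requests" (stubbed here), never 32,000.
mapWithLimit([1, 2, 3], 10, async (id) => id * 2)
  .then((out) => console.log(out));
```

Results come back in input order regardless of which lane finished first, which keeps downstream processing simple.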