Accepting more HTTP requests simultaneously?

So I am using HTTP nodes to create a webhook endpoint for a particular service. The service sends me data, which I format for InfluxDB; I send a response back to the original webhook request (200 or an error) and then push the data to the InfluxDB node.

Initially this is a lot of data (around 100k requests in an hour), all of which reaches the flow (I counted them), but the webhook service reports a timeout for about 60% of the messages because it took longer than 30 seconds to receive a response. The function nodes don't produce a single error.
So the question is: how can I accept more HTTP requests simultaneously? There isn't much room left for improving the flow itself. I am running this on DigitalOcean as an App (hosted Docker image) on the lowest tier.
I looked in settings.js and haven't noticed any configuration for this, so I wanted to double-check before upgrading my plan.
Also, the endpoint is secured in settings.js via basic authentication (can this cause any lag?).
One more thing: all loops in the Function nodes break as soon as possible.
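For context, this is roughly the settings.js fragment in question (user and hash are placeholders, not my real values); httpNodeAuth is what secures the HTTP In endpoints:

```javascript
// settings.js - basic auth for all HTTP In endpoints (placeholder values)
httpNodeAuth: {
    user: "webhook_user",
    pass: "$2a$08$..."   // bcrypt hash, e.g. generated with node-red admin hash-pw
},
```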
Here is the image of the flow, and here is the function temperature_event_v4 (all of them are essentially the same):

// Keep a copy of the incoming message so the HTTP request/response
// objects survive for the HTTP Response node
var msg_inc = RED.util.cloneMessage(msg);

const event_obj = msg.payload.event;
const label_obj = msg.payload.labels;

var name = event_obj.targetName.split('/')[3];
var label_name = 'targetName';
if (Object.keys(label_obj).length > 0) {
    // compare the type, not the string "undefined"
    if (typeof label_obj.name !== 'undefined' && label_obj.name.length > 0) {
        name = label_obj.name;
        label_name = 'name';
    }
}

function check_for_samples() {
    if (typeof event_obj.data.temperature.samples == 'undefined') {   // CHANGE ME DEPENDING ON EVENT TYPE
        return false;
    }
    if (event_obj.data.temperature.samples.length != 0) {             // CHANGE ME
        for (var i of event_obj.data.temperature.samples) {           // CHANGE ME
            format_send_message(i.value, i.sampleTime);               // CHANGE ME
        }
        return true;
    }
    return false;
}

function get_old_value() {
    format_send_message(event_obj.data.temperature.value, event_obj.data.temperature.updateTime);   // CHANGE ME
}

function format_send_message(value, time) {
    var out = {
        measurement: "/DT/temp_sensor/" + name.replace(/\s/g, '_'),   // CHANGE ME
        payload: [
            { value: value,
              time: new Date(time) },
            { event_type: "temperature",                              // CHANGE ME
              label_name: label_name,
              label_value: name }
        ]
    };
    node.send([out, null]);   // data goes out of the first output only
}

if (check_for_samples() == false) {
    get_old_value();
}
// Always answer the webhook on the second output so the caller never times out
return [null, { req: msg_inc.req, res: msg_inc.res }];

Best to ditch any security in Node-RED and move it to a reverse proxy, which will perform a lot better. Let it do the auth and the TLS termination.

After that, to improve performance you would probably need to scale horizontally, which means creating multiple instances of Node-RED and using one or more proxies to handle load sharing - maybe as simply as round-robin routing.
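As a rough sketch of that setup (assuming two Node-RED containers on ports 1880 and 1881, and hypothetical hostname, certificate, and htpasswd paths), an nginx front end handling the basic auth, TLS termination, and round-robin load sharing could look like:

```nginx
# nginx round-robins across the listed servers by default
upstream nodered_backend {
    server 127.0.0.1:1880;   # Node-RED instance 1 (assumed port)
    server 127.0.0.1:1881;   # Node-RED instance 2 (assumed port)
}

server {
    listen 443 ssl;
    server_name example.com;                              # placeholder

    ssl_certificate     /etc/nginx/certs/fullchain.pem;   # your cert paths
    ssl_certificate_key /etc/nginx/certs/privkey.pem;

    location /webhook {
        auth_basic           "Webhook";
        auth_basic_user_file /etc/nginx/.htpasswd;        # created with htpasswd

        proxy_pass http://nodered_backend;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

With this in place you would remove httpNodeAuth from settings.js so Node-RED itself no longer spends time on authentication.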
