Can a Node-RED flow utilize multiple CPU cores?

Hi,
The Node-RED framework runs on Node.js, which by default runs in a single process and uses a single CPU core.
Can Node-RED run a flow utilizing multiple CPU cores on a machine to speed up execution? Please advise.

Thanks
Surya

Hi. What are your flows doing that you feel needs speeding up?

Are you processing large data (big arrays or image processing etc)?

Are you creating blocking code in a large function node looping through lots of data?

It may be your design that is the problem rather than the single-threaded nature of Node.js.

More precisely, the main JavaScript code runs on a single thread (I wouldn't say a single CPU core). That doesn't mean all the work is single-threaded.

For example, take the node that writes to a file: while the actual file is written by the OS, the event loop continues to run other work, since it's an async call and the thread that runs the JavaScript code is different from the thread that asks the OS to write the file.
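A minimal plain-Node.js sketch of that behaviour (outside Node-RED; the path and payload are just placeholders):

```javascript
const fs = require('fs');

// The write is handed off to the OS (via libuv's thread pool); the JS thread
// moves straight on, and the callback runs later when the write completes.
fs.writeFile('/tmp/out.txt', 'some payload', (err) => {
  if (err) throw err;
  console.log('write finished'); // logged second
});

console.log('event loop keeps going'); // logged first
```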

3 Likes

There is also some good work going on by @TJKoury looking at clustering
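For reference, the generic Node.js approach behind clustering is the built-in cluster module, which forks one worker process per core. A minimal sketch (not a ready-made Node-RED setup):

```javascript
const cluster = require('cluster');
const os = require('os');

if (cluster.isMaster) {
  // Fork one worker process per CPU core; each is a separate OS process.
  for (let i = 0; i < os.cpus().length; i++) {
    cluster.fork();
  }
  cluster.on('exit', (worker) => {
    console.log(`worker ${worker.process.pid} exited`);
  });
} else {
  // Each worker runs this branch and could host its own workload/server.
  console.log(`worker ${process.pid} started`);
}
```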

What actions are you doing that involve heavy processor utilisation? It may be possible to move them out into separate processes.

What OS and hardware are you running on?

Hi,

Thank you all for the response.

Following is a summary of the workflow sequence we are trying to scale:

At scheduled intervals (ideally 5 min or less):

Step 1. Loop over the list of data sources.
Step 2. Generate an auth token through an API invocation, cache the token details, and generate the respective API request to fetch data corresponding to each source.
Step 3. Loop over each record/JSON per data source from the API result pages: retrieve each record, compute a hash of the record, then perform a JSON transformation.
Step 4. Queue each record, compare the cached and incoming hash values to identify duplicates, and perform another API invocation to process each transformed non-duplicate JSON record (a sketch of the hash/dedupe part follows below).
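For illustration, a rough sketch of the hash/dedupe part (Steps 3-4) as it might look in a function node; the seenHashes context key is a hypothetical name, and crypto is assumed to be exposed to the function node via functionGlobalContext:

```javascript
// Hypothetical function node body: hash each record and drop duplicates.
const crypto = global.get('crypto'); // assumed exposed via functionGlobalContext

const seen = flow.get('seenHashes') || {}; // hashes cached from earlier runs
const out = [];

for (const record of msg.payload) { // msg.payload: array of records (Step 3)
    const hash = crypto.createHash('sha256')
        .update(JSON.stringify(record))
        .digest('hex');
    if (!seen[hash]) { // Step 4: skip records we have already processed
        seen[hash] = true;
        out.push(record); // a JSON transformation would go here
    }
}

flow.set('seenHashes', seen);
msg.payload = out;
return msg;
```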

We are looking for multi-core options in Node-RED as we want to scale our application to process a large number of records per unit time. At the same time we wish to keep the number of parallel deployments of the flow to a minimum, to reduce configuration overhead whenever a code/configuration change comes in.

As of now we do not have tasks on the scale of image processing, but we may have big arrays, as we expect more data sources in Step 1 and more records to fetch and process per data source in Step 3. Also, if in future we need to include any computationally intensive tasks in Step 3, we could integrate them easily by adopting a scalable architecture.

Any inputs/architecture suggestions on achieving the above in Node-RED would be helpful.

Regards
Surya

Please find below the environment details:
We are running our Node-RED service as a Docker container on an Ubuntu 18.04 EC2 instance with 16 GB RAM and 4 vCPUs (up to 3.0 GHz Intel Scalable Processors).

Regards
Surya

Do you mean multiple node-red servers?

Yes, true

No doubt you were planning to do this already, but I suggest first profiling the system to find out where the most processor-intensive tasks are. Then you will know the areas that need to be addressed. You may well be able to find ways to move them out to different processes. JavaScript is not the most efficient language for complex processing, so shifting those bits out to separate processes coded in C++ (for example) may give a very significant saving.
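As a sketch of that idea, Node's child_process module can hand a job to an external binary while the event loop stays free. The ./heavy-task program and its flags here are hypothetical; within Node-RED itself the built-in exec node wraps the same pattern:

```javascript
const { execFile } = require('child_process');

// Run a hypothetical compiled program asynchronously; the event loop is not
// blocked while it crunches, and the callback fires when it exits.
execFile('./heavy-task', ['--input', '/tmp/batch.json'], (err, stdout, stderr) => {
  if (err) {
    console.error('heavy-task failed:', stderr);
    return;
  }
  console.log('result:', stdout.trim());
});
```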

Thanks for the inputs.

Quick addition. It is now possible to use worker threads in Node.js as well.

Worker threads | Node.js v12.22.7 Documentation (nodejs.org)

The worker_threads module enables the use of threads that execute JavaScript in parallel.

Getting Started with Node.js Worker Thread | Engineering Education (EngEd) Program | Section

Just in case you want to do some major task on one server.
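A minimal single-file sketch of worker_threads (the hashing workload is just an example):

```javascript
const { Worker, isMainThread, parentPort, workerData } = require('worker_threads');
const crypto = require('crypto');

if (isMainThread) {
  // Main thread: hand a batch of records to a worker and keep the event loop free.
  const records = [{ id: 1, payload: 'a' }, { id: 2, payload: 'b' }];
  const worker = new Worker(__filename, { workerData: records });
  worker.on('message', (hashes) => console.log('hashes:', hashes));
  worker.on('error', (err) => console.error(err));
} else {
  // Worker thread: CPU-bound hashing runs here, in parallel with the main thread.
  const hashes = workerData.map((rec) =>
    crypto.createHash('sha256').update(JSON.stringify(rec)).digest('hex')
  );
  parentPort.postMessage(hashes);
}
```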

1 Like

The link above has been changed to point at another site!

Thanks for spotting that. I have suspended the user for 2 weeks in case it was a compromised account or PC with a warning that should it happen again, they will be permanently banned.

Hi, I am looking for experiences on running an API based on Node-Red and the performance/reliability when multiple users are concurrently accessing the API.
I am running Node-Red on a Google Cloud Debian VM.
Have you got some practical experience with that scenario?
Thanks

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.