I have a couple of flows with around 300 MQTT nodes, all using one single MQTT client.
Now,
- 40 MQTT nodes feed into one function node,
- 40 into another function node,
- 60 into another,
- 60 into another,
- 20 into another,
- 30 into another, and
- 50 into individual functions.
So basically the nodes are grouped, and each group calls one function to handle the data received from its topics.
I don't know how the function is called internally. Here is my assumption:
When a message is received for a particular topic, the MQTT node will get its wires array and call those functions. If there are around 300 MQTT nodes, there will be an array with information for 300 nodes, and the node for which data was received will be found either using JavaScript bracket notation (a dynamic key lookup) or simply with if/else conditions.
I have read a few posts saying that JavaScript will not always store objects as hash tables, so there could be a performance hit from dynamic key lookups.
Suppose we consider the V8 JavaScript engine, which Node.js also uses. Data should ideally be stored using hidden classes, but that only happens when the structure of the objects does not change. If the topics carry different data types, that could result in multiple hidden classes, or the objects could simply be stored as dictionaries. In that case, too, there would be a performance hit.
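To illustrate the hidden-class point (this is a general sketch of V8 behaviour, not Node-RED code): objects created with the same properties in the same order can share one hidden class, while structural changes force extra class transitions or dictionary-mode lookups.

```javascript
// Same shape every time -> V8 can reuse one hidden class, fast property access.
function makeReading(value, quality, timestamp) {
    return { v: value, q: quality, t: timestamp };
}

const a = makeReading(1, 4, 2333442555);
const b = makeReading(2, 4, 2333442556);
// a and b share the same hidden class because their properties were
// added with the same keys in the same order.

// Different shapes -> multiple hidden classes, or a fallback to
// slower dictionary-mode storage.
const c = { v: 1, q: 4 };
c.t = 2333442557;                         // shape transition after creation
const d = { t: 2333442558, v: 2, q: 4 }; // different property order
```

The takeaway is that keeping message objects structurally identical (same keys, same order) gives the engine the best chance to optimise.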
Alternatively, if eval is used to get the node information, eval itself has a performance cost.
So, are my assumptions correct, or is the processing done differently in a way that does not affect performance?
I am using Node-RED to capture events like "X event started", "X event ended", or "X event updated" using multiple topics. The topics come from OPC through MQTT into Node-RED.
One example: there is a process with 40 burners through which 3 types of gas are blown. For each gas there is a separate tag saying which gas is being blown, so that is 40 × 3 = 120 tags. In the same way there are many other properties, like the temperatures of zones in the process.
On event start, event end, and sometimes in between, I need to calculate the min and max number of burners, the type of burners, the duration of usage for each event, etc.
And I need to call multiple functions to handle the data according to tag type.
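For example, the kind of per-event calculation I mean could be sketched like this (the field names such as `activeBurners` and `gas` are purely illustrative, not real tag names):

```javascript
// Hypothetical sketch: compute per-event burner statistics from a list of
// samples captured between an event's start and end.
function burnerStats(samples) {
    // samples: [{ time, activeBurners: ['b1', 'b7', ...], gas: 'NG' }, ...]
    const counts = samples.map(s => s.activeBurners.length);
    return {
        minBurners: Math.min(...counts),
        maxBurners: Math.max(...counts),
        gasTypes: [...new Set(samples.map(s => s.gas))],   // distinct gases seen
        durationMs: samples[samples.length - 1].time - samples[0].time
    };
}
```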
Whenever a message arrives at the MQTT client, it has to find all of the nodes whose topics match. We can't use any sort of hashing to short-cut the lookup because a node can subscribe with wildcards, so we have to compare each node's topic with the received topic, taking any wildcards into account.
The more nodes you have, the more work it takes to find the right nodes to pass the message to.
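A minimal sketch of that per-node wildcard comparison (illustrative only, not the actual Node-RED implementation):

```javascript
// Match an MQTT topic against a subscription filter that may contain
// '+' (single-level) and '#' (multi-level) wildcards.
function topicMatches(subscription, topic) {
    const subParts = subscription.split('/');
    const topicParts = topic.split('/');
    for (let i = 0; i < subParts.length; i++) {
        if (subParts[i] === '#') {
            return true;                 // '#' matches everything from here down
        }
        if (i >= topicParts.length) {
            return false;                // topic is shorter than the filter
        }
        if (subParts[i] !== '+' && subParts[i] !== topicParts[i]) {
            return false;                // literal segment mismatch
        }
    }
    return subParts.length === topicParts.length;
}

// With N subscribed nodes, every one has to be checked: O(N) per message.
function findMatchingNodes(nodes, topic) {
    return nodes.filter(n => topicMatches(n.topic, topic));
}
```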
But I'm not sure what conclusion you want to draw from this. There could be optimisations to make in how it finds the right nodes to pass messages to, but it isn't something we have listed on any backlog. If someone wanted to spend their time looking at that, they would be welcome to, but we'd probably want to be certain it's an actual performance issue and not a theoretical one.
@knolleary Thank you for the valuable input. I am planning to run the flows in production, so I just needed input on this topic.
Maybe I will use multiple clients, which would reduce the lookup.
Basically, I have been working with Node.js and JavaScript recently. I am learning a lot about how JavaScript engines work: how they store, access, and search JS objects. One of my key questions was how arrays and objects are searched compared to languages like C# or Java, which are tightly coupled to data types, have built-in hash classes, and use offsets for lookups.
@ghayne Right now I am only analysing data using multiple tags and capturing events, deviations, alerts, etc. Process control is not on the roadmap currently, but we will consider it when there is a need for it.
Right now most process control is done using SAP or some other L2/L3 systems (not sure).
I am also looking into Spark streaming, but I am not sure about migrating the data into the objects I will be using for processing.
Yes, it is possible in a few cases where data is processed in batches or arrives at low frequency, using persistent context in Node-RED.
I think it will not be suitable in my case:
I will be checking multiple tags on each tag change, so using a database would not be a good solution; the reads and writes would cause a very high performance hit. I have a few tags that send data every 500 ms.
I also want to capture events in real time, so I feel in-memory execution is right for this scenario.
OK, then you will have to scale gradually as suggested earlier by @cymplecy. I thought you might also be interested in historical data, which is why I suggested a database. Good luck!
As you posted, I feel you can optimize the incoming messages from the burners, which would reduce the computation time. For example, you could send the complete set of data for one burner in a single message, such as { Start, End, Update, Temperature, Gas }, and so on.
Later you can process the messages accordingly to get what you need.
Can you post your complete requirements so that some ideas can be shared?
@pandeyprakhar00 The Start, End, and Update events are not received from MQTT. They are calculated in Node-RED using the raw data received from PLCs, IoT devices, DCS systems, etc., through OPC and MQTT.
It's basically determining the state of a process or machine using the raw data from the tags. For example, for event X to start, tag A should be true and tag B should be false; for it to end, tag A should be false and tag B true.
Since MQTT data is received sequentially, one message at a time, I need to receive it with no latency to determine event start and end in real time. Therefore I can't batch data over a time window and then process it.
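For example, the start/end detection could be sketched like this (tag names `a` and `b` and the conditions are hypothetical, just to show the idea of evaluating the latest tag values on every change):

```javascript
// Latest known value per tag, updated on every incoming message.
const tagState = {};

// Apply one tag update and report any event transition it causes.
function onTagUpdate(tag, value) {
    tagState[tag] = value;
    if (tagState.a === true && tagState.b === false) return 'x-started';
    if (tagState.a === false && tagState.b === true) return 'x-ended';
    return null;  // no event transition on this update
}
```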
Coming to my requirement: I just needed an opinion on how the MQTT node works and whether there would be a performance issue. I wanted to know whether it is just a key lookup or something else that doesn't do a key lookup.
And as @knolleary suggested, there could be some optimizations to the node search. It would be nice if someone showed a way of doing it. Until then I will be looking into whether the search of JS objects and arrays can be sped up.
Moreover, I just wanted to say that I haven't found any other tool that lets us convert data into classes. I am defining a model for each process, converting the MQTT data to those objects, and then processing them. My MQTT messages all have the same structure, like {v:1,q:4,t:2333442555}. I recently saw that Spark has Structured Streaming; I don't know whether it has these schema-mapping capabilities. I have to explore it.
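The model mapping I describe looks roughly like this (the `Reading` class and its field names are my own illustration, built around the {v,q,t} payload structure above):

```javascript
// Hypothetical model class: map the raw MQTT payload { v, q, t }
// onto a typed object with meaningful field names.
class Reading {
    constructor({ v, q, t }) {
        this.value = v;              // measured value
        this.quality = q;            // quality code from OPC
        this.timestamp = new Date(t); // epoch milliseconds -> Date
    }
}

const reading = new Reading({ v: 1, q: 4, t: 2333442555 });
```

Because every instance is constructed from the same shape, this also plays well with the hidden-class behaviour discussed earlier in the thread.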