Reading & writing a global variable at the same time

Need help in understanding the best way to use global.set & global.get

I'm reading OMRON PLC D tags using the FINS Read Multiple node, with the steps below:

Injecting the code below using a function node:

var tag_1 = 'D1,D2,D3 .............. D50';
var tag_2 = 'D51,D52,D53 .............. D100';
.........................
var tag_n = ............;

var msg1 = { payload : tag_1 };
var msg2 = { payload : tag_2 };
........................
var msgn = { payload : tag_n };


setTimeout(function(){node.send(msg1);},100); 
setTimeout(function(){node.send(msg2);},300);
.........................
setTimeout(function(){node.send(msgn);},n);

I'm using this method because there's always a timeout error if I read all tags at once, so I read at most 50 tags at a time.

Then I join the output using a join node, assign it with global.set("values", msg.payload), and repeat again at (n+200).
At the same time I'm reading via global.get("values"). Now the problem I'm facing is that some values are constant, but when reading them they are not.
For example, four tags hold the time values 1300 and 1330, but I get values like:

╔════════╦════════╗
║ time_1 ║ time_2 ║
╠════════╬════════╣
║ 30     ║        ║
╠════════╬════════╣
║ 100    ║ 130    ║
╠════════╬════════╣
║ 1300   ║ 1330   ║
╠════════╬════════╣
║ 1330   ║        ║
╠════════╬════════╣
║ 300    ║ 330    ║
╚════════╩════════╝
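To illustrate the context part of the question, here is a minimal sketch (outside Node-RED, with the context store stubbed so it runs standalone; the tag names and values are illustrative) of why the joined result should be built completely first and then written with a single global.set call:

```javascript
// Stub of the Node-RED context store; in a real function node `global`
// is provided by the runtime and this stub is not needed.
const store = {};
const global = {
    set: (key, value) => { store[key] = value; },
    get: (key) => store[key],
};

// Pretend this is the joined output of all tag blocks.
const msg = { payload: { D1: "1300", D2: "1330" } };

// Build the complete snapshot first, then assign it in ONE global.set call,
// so a reader doing global.get("values") sees either the old object or the
// new one, never a partially filled one.
const snapshot = { ...msg.payload };
global.set("values", snapshot);

console.log(global.get("values").D1);   // "1300"
```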

Also, while reading the PLC I get timeout errors for some parts of the msg.

Is it this timeout error that is causing it?

Can you share the flow for this?

There are several things you appear to be doing that are completely unnecessary (like setTimeout in a function node, using Read Multiple, etc.).

My flow:
flows.json (42.0 KB)

Also, to parse I do it like this:

msg.TIME_1 =
    Buffer.from(parseInt(msg.payload.D2788).toString(16), 'hex').toString().split("").reverse().join("") +
    Buffer.from(parseInt(msg.payload.D2789).toString(16), 'hex').toString().split("").reverse().join("");

msg.TIME_2 =
    Buffer.from(parseInt(msg.payload.D2792).toString(16), 'hex').toString().split("").reverse().join("") +
    Buffer.from(parseInt(msg.payload.D2793).toString(16), 'hex').toString().split("").reverse().join("");

Hi @sahsha !
After a short check of your flow, I'd initially propose you read the chapter discussing the output of multiple messages in the Node-RED documentation. As Steve already indicated, you're doing things that make your code complicated and lengthy, without any need.

It might be lengthy but not complicated.

It just delays msg2.payload by 200 ms so that msg1.payload is executed in FINS and gives its output first.

The reason for doing this was to avoid this error :point_down: when reading all tags at once:
(screenshot of the timeout error)
So I divided the payload into parts with a delay, then joined them later.

In fact, I referred to it when creating the code in the function node. There is just a small (200 millisecond) delay between each payload:

var msg1 = { payload:"first out of output 1" };
var msg2 = { payload:"second out of output 1" };
var msg3 = { payload:"third out of output 1" };
var msg4 = { payload:"only message from output 2" };

setTimeout(function () { node.send([msg1,null]); }, 100);
setTimeout(function () { node.send([msg2,null]); }, 300);
setTimeout(function () { node.send([msg3,null]); }, 500);
setTimeout(function () { node.send([null,msg4]); }, 700);

If there is a better way to do it, then I'm unaware of it due to my limited/low coding skills :neutral_face:

The result of what you're doing is quite unpredictable ... based on my experience.
I would create a msg package, send this via the multiple message functionality (as one batch) & put a delay node afterwards.

The delay node is able to rate limit the messages ... what seems to be the effect you try to achieve.

As the documentation says:

A function can return multiple messages on an output by returning an array of messages within the returned array.

let msg = [ [ msg1, msg2, msg3, ... ] ];
return msg;

then configure the delay node like this (Attention: This is wrong. Check the correct setting below!):
(screenshot of the delay node settings)
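As an illustration of that pattern (the tag ranges and block size here are made up, not taken from the actual flow), the whole setTimeout chain could collapse into one function node that emits the blocks as a single batch, with a rate-limiting delay node downstream:

```javascript
// Build one message per block of 50 D tags. In a function node you would
// end with `return [ blocks ];` so all messages go out on output 1 as one
// batch, and a delay node in "Rate Limit" mode spaces them out.
function buildBlocks(first, last, blockSize) {
    const blocks = [];
    for (let start = first; start <= last; start += blockSize) {
        const tags = [];
        for (let d = start; d <= Math.min(start + blockSize - 1, last); d++) {
            tags.push("D" + d);
        }
        blocks.push({ payload: tags.join(",") });   // one read request per block
    }
    return blocks;
}

const blocks = buildBlocks(1, 200, 50);
console.log(blocks.length);                  // 4 messages
console.log(blocks[0].payload.slice(0, 8));  // "D1,D2,D3"
```

This keeps all timing concerns in the delay node, where they are visible and adjustable, instead of hard-coded setTimeout values.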


Where to start :thinking:

Firstly,
According to this manual (this is a guess at the manual, since you did not mention the PLC model), the max request size is 2012 bytes, or, in FINS 16-bit word terms, 1006 WORDS.

My first recommendation would be to put all PLC data that are related to one another into memory blocks smaller than the read limit, then read those addresses in one go to ENSURE data consistency.

Additionally, I would separate the reads for live data (production status, production count), semi-static data (break times etc.) and config data (cycle time settings etc.)

Lastly, I recommend the Buffer Parser over function nodes to make this more graphical, and to avoid hacks like this:

Manual conversions, string splits, joins, parsing, buffer creation: not pretty (or necessary).

msg.TIME_1 =
    Buffer.from(parseInt(msg.payload.D2788).toString(16), 'hex').toString().split("").reverse().join("") +
    Buffer.from(parseInt(msg.payload.D2789).toString(16), 'hex').toString().split("").reverse().join("");


That won't work; it will just send every message at once after 200 ms. It should be in "Rate Limit" mode.

Also, instead of

let msg = [ [ msg1, msg2, msg3, ..., msg50 ] ];
return msg;

it is nicer to send an array in the payload and then use the split node (this gives a graphical element to what is happening).
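A sketch of that variant (with the same illustrative tag ranges as before): the function node returns one message whose payload is an array, and a split node downstream turns each element into its own message:

```javascript
// One message, array payload: a split node after this function node emits
// one message per element, making the fan-out visible in the flow editor.
// (In a real function node `msg` already exists; it is created here so the
// sketch runs standalone.)
const msg = { payload: [] };
for (let start = 1; start <= 200; start += 50) {
    const tags = Array.from({ length: 50 }, (_, i) => "D" + (start + i));
    msg.payload.push(tags.join(","));   // e.g. "D51,D52,...,D100"
}
// In the function node: return msg;  // the split node does the rest
console.log(msg.payload.length);   // 4 elements -> 4 messages after the split node
```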

If I have payloads, e.g. msg1.payload.D1, msg2.payload.D2, msg3.payload.D3, and after joining I get

{D1 : "some_value_1", D2 : "some_value_2", D3 : "some_value_3"}

And if msg1.payload.D1 stays empty or any sort of error occurs, can this happen?

{D1 : "some_value_2", D2 : " some_va", D3 : "lue_3 "}

It is NX series (NX102-9020 DB)

You're fully right. This is the correct setting:
(screenshot of the delay node in "Rate Limit" mode)

Update:
I changed the PLC ladder so that tags on which no input was fed were given the default value '####'. This solved the problem for me.
