Trying to do something I thought would be really simple ... basically I'm looking to have a counter that resides on a Pi running Raspbian (Linux) and that can be cleared, set, run, and monitored from a Node-RED client. This timer is needed to control a relay: on while counting, off when the count hits 0. This is easy enough to do if the client is the keeper of the counter ... except that if the client disconnects, the counter and relay are left in limbo. The trick would be to start a counter that runs entirely on the Pi, with the ability to view the current count from any client.
Started out with Python, thinking (incorrectly) that if I defined an include with a global, kept one program running in a loop that used that global, and then ran another instance referencing the same global, the two would somehow be connected ... STRIKE 1
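For anyone else who hits this wall: the failure mode is easy to demonstrate in a few lines. Two separate Python processes each get their own fresh copy of a module-level global, so nothing is shared between them. (The inline script below is just an illustration, not the original code.)

```python
import subprocess, sys, textwrap

# The same "program" run twice: a module-level global that gets incremented.
prog = textwrap.dedent("""
    counter = 0      # module-level global, fresh in every process
    counter += 1
    print(counter)
""")

# Two separate interpreter instances -> two separate copies of the global.
out1 = subprocess.run([sys.executable, "-c", prog],
                      capture_output=True, text=True).stdout.strip()
out2 = subprocess.run([sys.executable, "-c", prog],
                      capture_output=True, text=True).stdout.strip()

print(out1, out2)  # 1 1 -- the second run never sees the first's increment
```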
Decided to attempt to use an ENV var from Python ... STRIKE 2
Then proceeded to try using a Shell Script ... STRIKE 3
OLD FORGOTTEN FACTOID: ENV vars, even when exported, only affect the current "COPY" of the environment held by that instance of the shell (and anything it launches) ... they never propagate back up or across to other shells
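That copy semantics is easy to see from Python as well: a child process inherits a copy of the parent's environment, and nothing the child changes ever flows back up. (Illustrative snippet, not from the original attempt.)

```python
import os, subprocess, sys

os.environ["COUNTER"] = "5"   # set in this (parent) process

# The child starts with a COPY of the parent's environment; its change
# to COUNTER dies when the child exits.
child = subprocess.run(
    [sys.executable, "-c",
     "import os; os.environ['COUNTER'] = '99'; print(os.environ['COUNTER'])"],
    capture_output=True, text=True)

print(child.stdout.strip())    # 99 inside the child
print(os.environ["COUNTER"])   # still 5 back in the parent
```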
Brain damage from old age? ... after re-deriving this recollection from another lifetime, AND, more specifically, recalling that there is a way to force a script to run in the "root" shell ( . //.sh ) ... while this does work locally, I can't seem to get it to work from a Node-RED exec node
Took this one step further and created an executable script that contained just the explicit command by itself:
. //.sh $1
But when called from Node-RED this still appears to run in another shell, so it fails ... I now think the "." doesn't force the script into some root shell at all; it sources the script into whichever shell happens to be calling it
As the Node-RED exec node has both a command field and a params/append field, I have tried using "." as the command, and also the whole . //.sh as the command ... neither works
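A quick way to convince yourself of that: sourcing with "." only affects whichever shell does the sourcing. When the exec node spawns a throwaway shell, the variable lands in that shell and evaporates with it. Sketch below, with a temp script standing in for the real one (names are made up):

```python
import os, subprocess, tempfile

os.environ.pop("COUNTER", None)  # make sure we start clean

# Stand-in for the one-line wrapper script from the post: COUNTER=$1
with tempfile.NamedTemporaryFile("w", suffix=".sh", delete=False) as f:
    f.write("COUNTER=$1\n")
    path = f.name

# "." sources the script into WHICHEVER shell runs it -- here a throwaway
# bash child, much like the one a Node-RED exec node spawns.
out = subprocess.run(
    ["bash", "-c", f". {path} 7; echo inner=$COUNTER"],
    capture_output=True, text=True).stdout.strip()

print(out)                        # inner=7 -- but only inside that child
print(os.environ.get("COUNTER"))  # None -- it never reached this process
os.unlink(path)
```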
Is there some other way, short of using a file and/or an SQL record, to maintain a variable that won't lose scope and/or die?
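One direction that might fit, offered purely as a hedged sketch (the port, command names, and `send` helper are all made up for illustration): let a small long-running daemon on the Pi own the counter, and talk to it over a local TCP socket. Any client (a Node-RED exec or TCP node, nc, another script) can set, clear, or read it, and the count survives every client disconnect because the daemon never depends on them. The relay GPIO handling is only indicated by a comment.

```python
import socket, threading, time

HOST, PORT = "127.0.0.1", 5005   # assumed values -- pick any free local port
counter = 0
lock = threading.Lock()

def tick():
    """Count down once per second while counter > 0 (relay ON during count)."""
    global counter
    while True:
        time.sleep(1)
        with lock:
            if counter > 0:
                counter -= 1
                # drive the relay GPIO here; switch it OFF when counter hits 0

def serve():
    """Accept 'SET <n>', 'CLEAR', or 'GET'; always reply with the current count."""
    global counter
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((HOST, PORT))
    srv.listen()
    while True:
        conn, _ = srv.accept()
        with conn:
            cmd = conn.recv(64).decode().split()
            with lock:
                if cmd and cmd[0] == "SET" and len(cmd) > 1:
                    counter = int(cmd[1])
                elif cmd and cmd[0] == "CLEAR":
                    counter = 0
                conn.sendall(str(counter).encode())

threading.Thread(target=serve, daemon=True).start()
threading.Thread(target=tick, daemon=True).start()
time.sleep(0.2)  # give the server a moment to bind

def send(msg):
    """Tiny demo client -- anything that can open a TCP socket works the same."""
    with socket.create_connection((HOST, PORT)) as c:
        c.sendall(msg.encode())
        return c.recv(64).decode()

print(send("SET 30"))   # 30
print(send("GET"))      # 30 (counts down once per second from here)
```

Run something like this under systemd so it comes up with the Pi; from Node-RED, either a TCP request node or an exec node along the lines of `echo GET | nc 127.0.0.1 5005` could then set and watch the count.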