Restoring global state from settings.js

Using NR 2.1.1 on a Raspberry Pi (Raspbian Buster).
My NR process correctly writes a backup file, bu.json, every hour. It is ~500 kB and contains all global objects.

After a brownout/restart, bu.json is imported in settings.js and merged into functionGlobalContext:

```javascript
const lkg = require('/path/to/bu.json'); // last known good backup
const fGC = {
    os: require('os'), // note: node function references are not saved in globalContext.json
};
Object.keys(lkg).forEach(key => fGC[key] = lkg[key]); // restore globals

module.exports = {
    functionGlobalContext: fGC,
    contextStorage: {
        default: {
            module: 'memory'
        }
    }
};
```

  1. If I start up a new NR installation (never run before) using that settings.js, then the global environment is complete and exactly as expected when NR processing starts.
  2. If I start up an installation that has run previously (e.g. the system that made the backup), then the result is indeterminate: some objects may be missing entirely and/or some arrays may be missing elements.
    2a. If, in case 2, after a failed restore, I use a function node to import bu.json "manually", then global state is restored properly.
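The "manual" restore in 2a can be sketched as follows. It is written here as a plain Node script so the merge logic can be tested outside Node-RED; in a real function node the setter would be `global.set(key, value)` and the backup would come from reading bu.json. The sample data is illustrative.

```javascript
// Merge every top-level key of a backup object into global context.
// `setter` stands in for Node-RED's global.set(); in a function node:
//   restoreGlobals(backup, (k, v) => global.set(k, v));
function restoreGlobals(backup, setter) {
    Object.keys(backup).forEach(key => setter(key, backup[key]));
}

// Stand-in for global context, plus illustrative backup data.
// In the flow, `backup` would be JSON.parse of the bu.json contents.
const store = {};
const backup = { counter: 42, readings: [1, 2, 3] };
restoreGlobals(backup, (k, v) => { store[k] = v; });
```

Going through the setter means each value is written via the context API rather than by mutating a raw object, which is the distinction the original poster suspects matters.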

Notes: no error messages are logged during startup, either by NR or by the system kernel, and there is plenty of free memory at all times.
I'm guessing something happens after settings.js is evaluated that I don't know about. It "kinda smells" as if something is trying to apply vestiges of a previous global context, even though contextStorage is set to memory and any old store should be ignored.

Any insights gratefully received!

Why not make the file a JavaScript file rather than JSON? That way you export the whole global context as a single object from that file, and all you need in settings.js is:

    functionGlobalContext: require('/path/to/bu'),

No need to mess about converting back from JSON. In any case, not everything you can put in the global context is directly serialisable to JSON, so you would need to do extra work to serialise/deserialise the data.

That's a thought, thanks. (Apologies for delay in replying.)

I'm replying to myself to warn anyone interested NOT to try this approach. Setting globals in settings.js this way does not play nicely with global.get/set in NR function nodes. You think everything is fine, but unpredictable things happen at run time. (It seems I'm setting the raw values of objects rather than going through setters.)

??? I use this regularly and haven't had any issues.

Obviously, anything allocated via settings.js is read-only; it cannot be written to. However, it can reliably be used to share functions. In flows, trying to use flow/global context to share functions won't always work: it depends on the underlying storage method. That is because some storage methods require the data to be serialised to JSON and back, and you can't necessarily do that with functions.

So as long as you are keeping the names straight and keeping them separate, there should be no problems.

Yup, me too in other projects. I'm slightly surprised it didn't work reliably this time; the only difference is that the amount of data in arrays etc. is much larger. I'll get back to it one day.

Thanks for the support!

Maybe a memory issue then? If the data really is that large and complex, I'd probably recommend switching to a database anyway, and then perhaps using separate flows to create some APIs. That way, your main business logic remains independent of the underlying data store, and the APIs make access easily reusable.

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.