I am working on a project where I receive data from an external database and store it in various global/local variables. As I don't want to bombard the database with requests, I pull the data every x minutes/hours depending on the need.
Then I use alasql (fantastic stuff) to query the datasets directly from the variables.
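To make the setup concrete, here is a minimal sketch of that polling/cache pattern. The names (`fetchFromDatabase`, `REFRESH_MS`) and the 10-minute interval are illustrative, not from the original post; in Node-RED the cached array would normally live in flow/global context rather than a plain object.

```javascript
// Stand-in for flow/global context; in Node-RED you would use
// flow.set()/flow.get() or global.set()/global.get() instead.
const cache = { data: null, fetchedAt: 0 };
const REFRESH_MS = 10 * 60 * 1000; // hypothetical: refresh every 10 minutes

// Hypothetical fetch; in a real flow this would be a database
// query node or an async call from a function node.
function fetchFromDatabase() {
  return [{ id: 1, qty: 5 }, { id: 2, qty: 12 }];
}

function getData(now = Date.now()) {
  // Only hit the database when the cached copy is stale.
  if (!cache.data || now - cache.fetchedAt > REFRESH_MS) {
    cache.data = fetchFromDatabase();
    cache.fetchedAt = now;
  }
  return cache.data;
}

// alasql can then query the cached array directly, e.g.:
//   const rows = alasql('SELECT * FROM ? WHERE qty > 10', [getData()]);
```

Every query between refreshes then runs against the in-memory copy, so the database only sees one request per interval.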
My question is: what is the cost of doing this (i.e. keeping large datasets in variables, such as arrays of 100k+ objects)? Are there things I need to keep in mind in terms of performance?
alasql is fantastic if you know SQL; however, I found (same as with JSONata) that it stops being usable once you get into arrays of the size you are talking about. It gets pretty slow.
Storing the data in context is not really an issue, but computing anything from large arrays is better (read that as MUCH faster) done in a function node.
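As a hedged illustration of what "do it in the function node" means: the same aggregation that alasql would express as SQL can be written with native array methods, which skips the SQL parsing and query-engine overhead on every run. The field names and the 100k-row dataset here are invented for the example.

```javascript
// Illustrative 100k-row dataset (field names are made up).
const rows = Array.from({ length: 100000 }, (_, i) => ({
  id: i,
  value: i % 100,
}));

// Native equivalent of: SELECT SUM(value) FROM ? WHERE value > 50
const total = rows
  .filter(r => r.value > 50)
  .reduce((sum, r) => sum + r.value, 0);

// A single reduce avoids building the intermediate filtered array,
// which matters more as the arrays grow:
const total2 = rows.reduce(
  (sum, r) => (r.value > 50 ? sum + r.value : sum),
  0
);
```

In a function node the result would typically go out on `msg.payload`; the point is simply that plain JavaScript over an in-memory array avoids re-parsing and re-planning a query for every message.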
You can easily see the effects of JSONata and alasql for yourself by adding a flowtimer before and after the alasql/JSONata node.
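If you'd rather not add extra nodes, a rough sketch of the same measurement can be done by timestamping inside a function node. The filter below just stands in for whatever the alasql/JSONata step does; the dataset and field name are invented for the example.

```javascript
// Illustrative dataset (field name is made up).
const data = Array.from({ length: 100000 }, (_, i) => ({ v: i }));

// Timestamp before and after the work you want to measure.
const start = process.hrtime.bigint();
const result = data.filter(r => r.v % 2 === 0); // stand-in for the query
const elapsedMs = Number(process.hrtime.bigint() - start) / 1e6;

// In a function node you might report this with node.warn(elapsedMs)
// or attach it to the message, e.g. msg.elapsedMs = elapsedMs.
console.log(`query took ${elapsedMs.toFixed(2)} ms, ${result.length} rows`);
```

Running the same flow once with alasql/JSONata and once with a native-JavaScript version makes the difference visible directly in the debug sidebar.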
This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.