Global Storage Reference versus Cache

I've got a process that loads a reference list from a SQL table into a global variable as an object, which I then search through.

I've not run into any issues, but I'm up to some 7,000 items.

Are there limits to using a global in this fashion?
Would it be faster to use some sort of cache or a Redis store?

As an example of my flow...

Given ABC123, return a JSON object containing { ID1, ID2, Customer1, App1, metadata }.
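The flow above can be sketched as a one-time load into an in-memory lookup table, followed by constant-time gets. Inside Node-RED the map would live in global context (`global.set`/`global.get`); here it's a plain `Map` so the example is self-contained, and all field names and sample values (`id1`, `customer1`, `app1`, ...) are illustrative, not from the original post.

```javascript
// Simulated rows as they might come back from the SQL table (hypothetical data).
const rows = [
  { ref: "ABC123", id1: 17, id2: 42, customer1: "Acme", app1: "Billing", metadata: {} },
  { ref: "XYZ789", id1: 3,  id2: 8,  customer1: "Globex", app1: "Inventory", metadata: {} },
];

// Build the lookup table once, keyed by the reference string.
// In Node-RED this is where you would call global.set("refList", lookup).
const lookup = new Map(rows.map((r) => [r.ref, r]));

// Given a reference like "ABC123", return the matching record (or undefined).
function findRef(ref) {
  return lookup.get(ref);
}
```

A `Map` keyed by the reference string avoids scanning the whole list on every message, so lookup cost stays flat as the list grows.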

The limitation is memory: global and other context variables have to live in memory. I have at least one global with well over 10,000 entries, though each entry only has a couple of properties.

That wouldn't help. The only thing that would help would be to query the DB each time with a filter that reduces the number of entries returned, i.e. making the DB do the work, which is typically more efficient.
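Pushing the filter into the database might look like this in a function node feeding a SQL node. Node-RED database nodes such as node-red-node-mysql typically take the query in `msg.topic` with bind parameters in `msg.payload`; the table and column names here are made up for illustration.

```javascript
// Build a message that asks the DB for just the one matching row,
// instead of loading the whole reference list into a global.
// "reference_list" and the column names are hypothetical.
function buildLookupMsg(ref) {
  return {
    topic: "SELECT id1, id2, customer1, app1, metadata FROM reference_list WHERE ref = ?",
    payload: [ref], // bind parameter, so the value is escaped by the driver
  };
}

const msg = buildLookupMsg("ABC123");
```

Using a bind parameter (`?`) rather than string concatenation also avoids SQL injection if the reference ever comes from user input.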

But at that size you will probably be fine. Just be careful not to dump the whole thing to the debug sidebar, as that can be slow.

Thank you @TotallyInformation.

I've thought about making each lookup query the DB, and then caching only valid results locally, kind of like an intermediate store with a fallback to the database.
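That cache-with-fallback idea is a read-through cache, which can be sketched as below. `queryDb` is a stand-in for the real (async) database call, and the fake table data is purely illustrative; the key property is that only lookups that actually returned a row get cached.

```javascript
// Local cache of previously seen, valid references.
const cache = new Map();

// Stand-in for the real database query; in practice this would be
// a parameterized SELECT against the reference table.
async function queryDb(ref) {
  const fakeTable = { ABC123: { id1: 17, customer1: "Acme" } }; // hypothetical data
  return fakeTable[ref]; // undefined on a miss, like an empty result set
}

// Read-through lookup: cache hit short-circuits, miss falls back to the DB,
// and only valid (non-undefined) results are stored in the cache.
async function lookup(ref) {
  if (cache.has(ref)) return cache.get(ref); // hit: no DB round trip
  const row = await queryDb(ref);
  if (row !== undefined) cache.set(ref, row);
  return row;
}
```

One design note: with this pattern the cache only ever holds entries you have actually used, so memory grows with the working set rather than the whole table, but stale entries will persist until you add some invalidation or TTL policy.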

