>110,000 Data Points (JSON Objects) causing JSONata error "Invalid JSONata expression: Maximum call stack size exceeded"

Hello all,

I need help resolving a problem I'm facing when I pass large amounts of data (>110,000 JSON objects) to a change node in my flow.

I get the error "Invalid JSONata expression: Maximum call stack size exceeded". Within the change node, I take the incoming data and format it using JSONata to match the data format required by HighCharts.

For data sets of fewer than 110,000 JSON objects, my flow is able to process the data and display a plot using the zoom-enabled HighCharts.

Here is the part of my flow that is messing up when data gets too large.

[{"id":"99305dcc.69bf6","type":"inject","z":"8b707301.89767","name":"trigger read file","topic":"on","payload":"","payloadType":"date","repeat":"","crontab":"","once":false,"onceDelay":0.1,"x":140,"y":260,"wires":[["7aecc62a.d5d908"]]},{"id":"7aecc62a.d5d908","type":"file in","z":"8b707301.89767","name":"open http response file","filename":"C:\\Retrieved_PI_data_via_Node_Red_flow\\Temp_Retrieved_data\\http_query_response_150k_JSON.json","format":"utf8","chunk":false,"sendError":false,"encoding":"none","x":280,"y":220,"wires":[["df4d6279.43c37"]]},{"id":"df4d6279.43c37","type":"json","z":"8b707301.89767","name":"","property":"payload","action":"obj","pretty":false,"x":330,"y":260,"wires":[["f652ba75.5c7378","13b6dd6d.1aa0d3"]]},{"id":"13b6dd6d.1aa0d3","type":"debug","z":"8b707301.89767","name":"file in output","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"payload","targetType":"msg","x":490,"y":260,"wires":[]},{"id":"f652ba75.5c7378","type":"function","z":"8b707301.89767","name":"Data Size [array]","func":"var data_points = msg.payload;\n\nvar dp_al = data_points.length;\n\nmsg.al = dp_al;\n\nreturn msg;","outputs":1,"noerr":0,"x":150,"y":300,"wires":[["150f7fe2.c1c4e","b281afa3.ed77f"]]},{"id":"150f7fe2.c1c4e","type":"debug","z":"8b707301.89767","name":"array length","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"al","targetType":"msg","x":150,"y":340,"wires":[]},{"id":"b281afa3.ed77f","type":"switch","z":"8b707301.89767","name":"if data >10k plot data using HighChart","property":"al","propertyType":"msg","rules":[{"t":"lt","v":"10000","vt":"num"},{"t":"gte","v":"10000","vt":"num"}],"checkall":"true","repair":false,"outputs":2,"x":430,"y":300,"wires":[[],["c55e1ccb.a1674"]]},{"id":"c55e1ccb.a1674","type":"switch","z":"8b707301.89767","name":"Route msg based on Value data type Obj or 
other","property":"payload[10].Value","propertyType":"msg","rules":[{"t":"istype","v":"number","vt":"number"},{"t":"istype","v":"object","vt":"object"}],"checkall":"true","repair":false,"outputs":2,"x":470,"y":360,"wires":[["f32549c0.278878","5104f50d.1b028c"],["80faae92.2d8c1","422ce3ad.aa53bc"]]},{"id":"5104f50d.1b028c","type":"change","z":"8b707301.89767","name":"Format http output payload data for HighChart","rules":[{"t":"set","p":"payload","pt":"msg","to":"[$$.payload.\t    [\t        $toMillis(Timestamp),\t        Value\t    ]\t]","tot":"jsonata"}],"action":"","property":"","from":"","to":"","reg":false,"x":880,"y":340,"wires":[[]]},{"id":"422ce3ad.aa53bc","type":"change","z":"8b707301.89767","name":"Format http output payload data obj for HighChart","rules":[{"t":"set","p":"payload","pt":"msg","to":"[$$.payload.\t    [\t        $toMillis(Timestamp),\t        Value.Value\t    ]\t]","tot":"jsonata"}],"action":"","property":"","from":"","to":"","reg":false,"x":890,"y":380,"wires":[[]]},{"id":"f32549c0.278878","type":"debug","z":"8b707301.89767","name":"out val","active":true,"tosidebar":true,"console":false,"tostatus":true,"complete":"payload","targetType":"msg","x":750,"y":280,"wires":[]},{"id":"80faae92.2d8c1","type":"debug","z":"8b707301.89767","name":"out obj ","active":true,"tosidebar":true,"console":false,"tostatus":true,"complete":"payload","targetType":"msg","x":750,"y":420,"wires":[]}]

BTW, the data is retrieved via a Web API query to a remote database server within a company network, and the maximum number of JSON objects that can be retrieved per query is 150k. For the purposes of this forum, I have saved a retrieved data set (dummy data, not real operational data) containing 150,000 JSON objects for your reference. Please follow the link below, as the file is too big to upload here. Please scan it for malware or viruses if you're in doubt.

link to the large JSON data

The path I referenced for the file in my flow is at
"C:\Retrieved_PI_data_via_Node_Red_flow\Temp_Retrieved_data\http_query_response_150k_JSON.json". So, you'd need to put it there in order for my flow to work on your Node-Red setup.

I'm using Node-RED 1.0.4 and Node.js v12.16.1 on Windows 10, with 8 GB RAM and an i7 CPU. This is what you'll get when you import my flow into your Node-RED.

I really look forward to hearing from the Node-RED experts in here.

Does it work ok if you do the same transformation in a function node instead of using JSONata?

Thanks for your response Colin. How do I do that in the function node? I mean run JSONata in a function node?

Or do you mean that I need to create an if...else or for loop to change the format to HighChart within a function node?

Yes, I meant that. That will prove that the data is ok and that it is a limitation, or bug, with the JSONata implementation. It may just be that the way it is implemented you end up running out of stack.
Possibly another way would be to increase the stack size, if that is possible. You can probably pass something to nodejs when you start node red, but you would have to research that, unless @dceejay looks in and suggests how.
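For reference, Node.js exposes V8's stack size via the --stack-size flag (the value is in KB; the default is roughly 984). A sketch of how one might launch Node-RED with a larger stack — the path to red.js here is an assumption, so adjust it to your actual install location:

```shell
# Increase V8's stack from the default (~984 KB) to 8 MB before starting Node-RED.
# The red.js path is illustrative - adjust it to where node-red is installed.
node --stack-size=8192 node_modules/node-red/red.js
```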
By the way, when posting flows if you select Compact form it makes it easier to mark and copy it from the forum.

Thanks Colin. I'll try to do the transformation in function instead.

Also, thanks for the tip on the code sharing. I'll change it...

There is a button on the Export page. I have it set to expanded in settings.js so git diff makes a better job of it, but I click the button when I am exporting.

Thanks for the tip Colin. I made use of a for loop to transform my msg.payload into the format required by HighCharts. I also added moment.js to my node_modules and called it via global.get("moment") in a function node to convert my date strings into Unix timestamps (epoch time) for the HighCharts x-axis.

All seems to be working fine without running into the "Maximum call stack size exceeded" problem. All 150k data objects could be handled by the modified flow (using a function node instead of JSONata) and plotted in HighCharts.

I'll share the modified flow later.


You shouldn't need to go to that extreme unless your input data is in a strange format or TZ. Just remember that a UNIX timestamp is a JavaScript timestamp divided by 1000: a UNIX timestamp is seconds since 1970, and a JavaScript timestamp is milliseconds.
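To illustrate the distinction with a trivial sketch:

```javascript
// A JavaScript timestamp is milliseconds since 1970-01-01T00:00:00Z;
// a classic UNIX timestamp is seconds, i.e. the same value divided by 1000.
const jsTs = Date.parse("2020-05-01T00:00:10Z"); // 1588291210000 (milliseconds)
const unixTs = Math.floor(jsTs / 1000);          // 1588291210 (seconds)
console.log(jsTs === unixTs * 1000);             // true (no sub-second part here)
```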


Thanks for pointing that out TotallyInformation.

Initially I only wanted to convert a standard ISO8601 date to an epoch timestamp (in milliseconds), and I thought that moment(insert_ISO8601_Date_here).format('x') would be the quickest option. Hence my decision to add the moment.js module to Node-RED and call it via global.get("moment").

Only later did I find out about a native JavaScript option: Date.parse(insert_ISO8601_Date_here) does the same thing, and faster too. :sweat_smile:
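For ISO8601 input the native route really is a one-liner. One detail worth noting (a small sketch, not from the thread): moment's .format('x') returns a string, while Date.parse returns a number, so the types differ even when the values match:

```javascript
// Native parse of an ISO8601 date string to epoch milliseconds.
const iso = "2020-05-14T08:30:00Z";
const epochMs = Date.parse(iso); // 1589445000000, as a number
console.log(typeof epochMs);     // "number" (moment's .format('x') gives a string)
```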

But then I realised: what if the date format that comes back from my API query is not ISO? So I think I'll stick with the moment.js solution, in case the date served is not in ISO8601 format.

As promised, here is the updated flow after changing my approach to transforming my Web API query response into HighCharts format, using the function node instead of JSONata expressions in the change node.

[{"id":"a2d31267.75aa8","type":"inject","z":"8b707301.89767","name":"150k obj sample","topic":"on","payload":"","payloadType":"date","repeat":"","crontab":"","once":false,"onceDelay":0.1,"x":140,"y":900,"wires":[["5cef91c7.a9ff1"]]},{"id":"5cef91c7.a9ff1","type":"file in","z":"8b707301.89767","name":"open http response file","filename":"C:\\Retrieved_PI_data_via_Node_Red_flow\\Temp_Retrieved_data\\http_query_response_150k_JSON.json","format":"utf8","chunk":false,"sendError":false,"encoding":"none","x":340,"y":900,"wires":[["534fed5b.92d3f4"]]},{"id":"534fed5b.92d3f4","type":"json","z":"8b707301.89767","name":"","property":"payload","action":"obj","pretty":false,"x":530,"y":900,"wires":[["7087612.49121a","e9e7397b.5728e8"]]},{"id":"7087612.49121a","type":"function","z":"8b707301.89767","name":"Data Size [array]","func":"var data_points = msg.payload;\n\nvar dp_al = data_points.length;\n\nmsg.al = dp_al;\n\nreturn msg;","outputs":1,"noerr":0,"x":150,"y":940,"wires":[["396213af.e0f4fc","7f75c560.6199bc"]]},{"id":"e9e7397b.5728e8","type":"debug","z":"8b707301.89767","name":"file in output","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"payload","targetType":"msg","x":690,"y":900,"wires":[]},{"id":"396213af.e0f4fc","type":"debug","z":"8b707301.89767","name":"array length","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"al","targetType":"msg","x":130,"y":980,"wires":[]},{"id":"7f75c560.6199bc","type":"switch","z":"8b707301.89767","name":"if data >10k plot data using HighChart","property":"al","propertyType":"msg","rules":[{"t":"lt","v":"10000","vt":"num"},{"t":"gte","v":"10000","vt":"num"}],"checkall":"true","repair":false,"outputs":2,"x":430,"y":940,"wires":[[],["5dca5838.5d9148"]]},{"id":"5dca5838.5d9148","type":"switch","z":"8b707301.89767","name":"Route msg based on Value data type Obj or 
other","property":"payload[10].Value","propertyType":"msg","rules":[{"t":"istype","v":"number","vt":"number"},{"t":"istype","v":"object","vt":"object"}],"checkall":"true","repair":false,"outputs":2,"x":570,"y":1000,"wires":[[],["a67b2509.e2d728"]]},{"id":"a67b2509.e2d728","type":"function","z":"8b707301.89767","name":"Format http output payload data obj for HighChart","func":"//var moment = global.get(\"moment\");\n\nvar data = msg.payload;\nvar data_plot_hc = \"\";\nvar i;\nfor (i = 0; i < data.length; i++) {\n//  data_plot_hc += \"[\" + moment(data[i].Timestamp).format('x') + \",\" + data[i].Value.Value +\"]\"+ \",\";\n  data_plot_hc += \"[\" + Date.parse(data[i].Timestamp) + \",\" + data[i].Value.Value +\"]\"+ \",\";\n\n}\n\n//moment(date).format('X')\n// Date.parse() \n\nmsg.payload = \"[\" + data_plot_hc.slice(0, -1) + \"]\";\n\nreturn msg;","outputs":1,"noerr":0,"x":390,"y":1060,"wires":[["542b135f.dffc0c"]]},{"id":"542b135f.dffc0c","type":"debug","z":"8b707301.89767","name":"obj out function","active":true,"tosidebar":true,"console":false,"tostatus":true,"complete":"payload","targetType":"msg","x":680,"y":1060,"wires":[]}]

The following is the for loop I used to transform the query response:

var data = msg.payload;
var data_plot_hc = "";

for (var i = 0; i < data.length; i++) {
    data_plot_hc += "[" + Date.parse(data[i].Timestamp) + "," + data[i].Value.Value + "],";
}

// Remove the trailing comma and wrap "[ ]" around the whole string.
msg.payload = "[" + data_plot_hc.slice(0, -1) + "]";

return msg;
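As an aside, the same transformation can be written without manual string concatenation, by building a real array of [timestamp, value] pairs and serialising it once with JSON.stringify. This is only a sketch, assuming the same { Timestamp, Value: { Value } } input shape as above, and wrapped in a plain function so it can run outside Node-RED:

```javascript
// Alternative to the string-building loop: map to [epochMs, value] pairs,
// then let JSON.stringify produce the HighCharts-ready string in one go.
function formatForHighcharts(msg) {
    msg.payload = JSON.stringify(
        msg.payload.map(dp => [Date.parse(dp.Timestamp), dp.Value.Value])
    );
    return msg;
}
```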

Here is the moment.js version of the same for loop. But REMEMBER, you must have already added moment.js to your node_modules directory AND updated your settings.js file with functionGlobalContext: { moment: require('moment') } for this function node to work.

var moment = global.get("moment");

var data = msg.payload;
var data_plot_hc = "";

for (var i = 0; i < data.length; i++) {
    data_plot_hc += "[" + moment(data[i].Timestamp).format('x') + "," + data[i].Value.Value + "],";
}

// Remove the trailing comma and wrap "[ ]" around the whole string.
msg.payload = "[" + data_plot_hc.slice(0, -1) + "]";

return msg;

Just FYI, my previous expression to do the same thing in JSONata (in my change node) looked like this:

[$$.payload.
    [
        $toMillis(Timestamp),
        Value.Value
    ]
]

It looks a lot simpler, but alas, I got hit by the "Invalid JSONata expression: Maximum call stack size exceeded" error when my web API query returned more than 110k JSON data objects.

Yes, JSONata is great for some things but not yet optimised enough for everything, I'm afraid. It keeps improving, though. But like most high-level languages, you get to a point where you fall off a cliff.

This is a bug in JSONata caused by the use of Array.prototype.push.apply to join arrays.
Related to https://github.com/nodejs/node/issues/27732
Please could you raise a bug on https://github.com/jsonata-js/jsonata/issues/new and I'll recode this.
Thanks!
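For anyone curious, this failure mode is easy to reproduce outside JSONata: fn.apply(target, array) passes every array element as a separate call argument, so a large enough array overflows the stack. The exact threshold depends on the engine and stack size; this sketch shows the effect on V8:

```javascript
// push.apply spreads every element of `big` onto the call stack as arguments,
// which throws RangeError ("Maximum call stack size exceeded") for large arrays.
const big = new Array(1e6).fill(0);
try {
    Array.prototype.push.apply([], big);
    console.log("no overflow on this engine/stack size");
} catch (e) {
    console.log(e instanceof RangeError); // true on V8 with the default stack
}
```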


This has now been fixed in JSONata v1.8.3


Awesome andrew-coleman :+1: :+1: :+1: :+1:

Noted on the bug fix. Would this be updated in node red 1.0.5?

1.0.5 is already out... but 1.0.6 yes probably.


This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.