Wrong json-node conversion of bigint

Hello Node-Red-Pros....

I have the following problem with "large" numbers.

I get the following data as a string from an MQTT broker:


The json node returns this:


I know it has to do with the maximum size of numbers in javascript, but trying to use node-json-bigint or json-bigint failed.

I'm using Node-Red v0.20.7 with Node.js v11.4.0.

Is there still a way to convert these values correctly without extensive string manipulation before the json node?

BTW: I also don't understand whether v0.20.7 is the latest version and what v1.2.9 is about. Is it newer? A different branch? Should I update to fix my problem?

I am grateful for any advice.

Best regards

I don't know about the big numbers, but you can see the Node-RED versions here. 0.20.7 was superseded 18 months ago, so you need to upgrade.

Node.js 11.x is not one of the recommended versions, so you need to upgrade that to 12 or 14. I suggest 14 in case the feature you are looking for is a recent addition to Node.js.
Upgrade Node.js before Node-RED, as you have to re-install Node-RED after upgrading Node.js anyway. You will also need to go into your .node-red folder and run npm rebuild after upgrading Node.js.

@Colin: Thanks for the version hint! I was a little confused, but now it's clear(er).

I have updated Node.js to v14.16.0 (latest stable) and Node-RED is now v1.2.9.

But the issue with bigint (4128196907966543929 vs 4128196907966544000) is still there!

Any suggestions?

Thanks in advance

In order to use 64-bit bigint values you need to jump through a few hoops, as JSON doesn't support the bigint type and there are a fair number of differences when working with bigint: BigInt - JavaScript | MDN

Keep the value as a string and use the bigint functions to parse the value.
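A minimal sketch of that approach, assuming the big value arrives as a quoted string in the JSON (the key name "id" here is made up for illustration, not taken from the original payload):

```javascript
// If the value is a string in the JSON, JSON.parse leaves it untouched
// and BigInt() can then convert it without any precision loss.
const payload = JSON.parse('{"id":"4128196907966543929"}');

const big = BigInt(payload.id);    // exact: 4128196907966543929n
const lossy = Number(payload.id);  // rounded: 4128196907966544000

console.log(big.toString());       // "4128196907966543929"
console.log(lossy);                // 4128196907966544000
```

The comparison with `Number()` shows exactly the rounding reported in the question: the default Number path silently loses the last digits.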

Question: what do these values represent? Where do they come from? If possible, change the source data to something manageable.

Thanks for the answer @Steve-Mcl.
I use the json node in Node-RED to convert the incoming string into a valid JSON object.

The value encodes a string: it is converted to hex with value.toString(16), and each hex byte pair is then converted to an ASCII character.

4128196907966543929 = 394A 4D37 3636 3839 = 9JM76689

I also don't think JSON itself is the real problem, but rather the JavaScript functions used to serialise and deserialise it (see here).

The source data cannot be changed because other clients (C#, Java) with "correctly" working de/serialisers can process the data.

In my opinion, JSON is primarily a data format, independent of the content; the correct interpretation makes it a defined data type. I'm only surprised that JavaScript doesn't process 8-byte numbers correctly, since 64-bit didn't just become standard yesterday.

I think I will eventually have to implement your suggestion so that the data is converted appropriately BEFORE the JSON node.

Thanks for the advice.

I am aware that JSON isn't the issue when talking numbers. The issue is that, implicitly, numbers in JS are treated as the default Number type. The largest exact integral value of this type is Number.MAX_SAFE_INTEGER, which is 2^53 - 1, i.e. +/- 9,007,199,254,740,991

So to maintain accuracy beyond MAX_SAFE_INTEGER you would need to use true 64-bit bigint numbers (as described in your title: bigint), which are represented in JS as a number with a trailing n, e.g. 4128196907966543929n. BUT the JSON spec and the JSON functions do not handle bigint...


... so a workaround is required. Store the value as a string in your JSON and use a little coding to convert it back to bigint - but, as I said before, working with bigint is a whole world of hurt (see the MDN docs).

So that's why I asked what the value was: for example, if it were some kind of timestamp and the lost digits meant you lose a handful of µs, it might not be of consequence or worth the trouble of using bigints.

There is, however, a solution to JSON not handling bigints: you could import an alternative JSON lib into Node-RED, e.g. json-bigint - npm

This topic was automatically closed 14 days after the last reply. New replies are no longer allowed.