Flow to decode binary data and a walk-through on how to install the npm module


I just wanted to share a simple way to parse and decode a binary stream of data using the npm module binary-parser-encoder. This is an alternative to the node-red-contrib-binary node.

1- Install the module, in my case on Windows, by typing:

npm install binary-parser

2- Amend the settings.js config file by adding the binary_parser property as shown below:

 functionGlobalContext: {
     binary_parser: require('binary-parser').Parser
 },

Note that the module name uses a hyphen, whereas the property name in functionGlobalContext uses an underscore.

3- Start (or restart) Node-RED. Hopefully there will be no errors; otherwise the initialization will not proceed (in which case you need to fall back and analyse).

Using the module.

As usual in Node-RED, you use global.get to retrieve the module. See below what the function node looks like. This will be the core of the flow. I am using it here to decode a proprietary packet format for a Lidar scanner (but with random test data).

var Packet = global.get('binary_parser');
var buf = msg.payload;

// Build the parser by chaining field definitions, e.g.
// .endianess("little").uint16("magic") ... (full chain in the flow export below)
var typea = new Packet();
msg.payload = typea.parse(buf);
return msg;


[{"id":"16dd9094.027e5f","type":"tab","label":"Flow 1","disabled":false,"info":""},{"id":"c899e7c.87f7618","type":"inject","z":"16dd9094.027e5f","name":"Go","topic":"","payload":"","payloadType":"str","repeat":"","crontab":"","once":false,"onceDelay":0.1,"x":150,"y":180,"wires":[["1ccf13aa.f5370c"]]},{"id":"1ccf13aa.f5370c","type":"function","z":"16dd9094.027e5f","name":"Dataset Buffer","func":"msg.payload =  Buffer.from([0xa2, 0x5c, 0x01, 0x80, 0x02, 0x10, 0x02, 0x10, 0x02, 0x10, 0x10, 0x02, 0x10, 0x03, 0x80, 0x02, 0x10, 0x02, 0x10, 0x02, 0x15, 0x95, 0x02, 0x10, 0x02, 0x10, 0x02, 0x10, 0x10, 0x02, 0x10, 0x03, 0x01, 0x80, 0x02, 0x10, 0x02, 0x10, 0x02, 0x10, 0x10, 0x02, 0x10, 0x03, 0x80, 0x02, 0x10, 0x02, 0x10, 0x02, 0x15, 0x95, 0x02, 0x10, 0x02, 0x10, 0x02, 0x10, 0x10, 0x02, 0x10, 0x03, 0x03, 0x80, 0x02, 0x10, 0x02, 0x10, 0x02, 0x15, 0x95, 0x02, 0x10, 0x02, 0x10, 0x02, 0x10, 0x10, 0x02, 0x10, 0x03, 0x01, 0x80, 0x02, 0x10, 0x02, 0x10, 0x02, 0x10, 0x10, 0x02, 0x10, 0x03, 0x80, 0x02, 0x10, 0x02, 0x10, 0x02, 0x15, 0x95, 0x02, 0x10, 0x02, 0x10, 0x02, 0x10, 0x10, 0x02, 0x10, 0x03]);\nreturn msg;","outputs":1,"noerr":0,"x":320,"y":180,"wires":[["7df2c612.8102a8"]]},{"id":"7df2c612.8102a8","type":"function","z":"16dd9094.027e5f","name":"binary_parser","func":"var Packet = global.get('binary_parser');\nvar buf = msg.payload;\n\nvar typea = new Packet()\n  .endianess(\"little\")\n  .uint16(\"magic\")\n  .uint16(\"packet_type\")\n  .uint32(\"packet_size\")\n  .uint16(\"header_size\")\n  .uint16(\"scan_number\")\n  .uint16(\"packet_number\")\n  .double(\"timestamp_raw\")\n  .double(\"timestamp_syncw\")\n  .uint32(\"status_flags\")\n  .uint32(\"scan_frequencye\")\n  .uint16(\"num_points_scan\")\n  .uint16(\"num_points_packet\")\n  .uint16(\"first_index\")\n  .uint32(\"first_angle\")\n  .uint32(\"angular_increment\")\n  .uint32(\"iq_input\")\n  .uint32(\"iq_overload\")\n  .double(\"iq_timestamp_raw\")\n  .double(\"iq_timestamp_sync\")\n  .uint8(\"header_padding\");\n  
\nmsg.payload = typea.parse(buf);\nreturn msg;\n ","outputs":1,"noerr":0,"x":520,"y":180,"wires":[["92b3fe6d.84a35"]]},{"id":"92b3fe6d.84a35","type":"debug","z":"16dd9094.027e5f","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","x":690,"y":180,"wires":[]}]

As the binary data structure is fixed, the flow is quite simple.
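For anyone who wants to see what the parser is doing without installing anything, here is a minimal standalone sketch (plain Node.js, run outside Node-RED, my own illustration) that decodes the first few header fields of the test buffer above using only the built-in Buffer methods. The field names and little-endian layout follow the flow export; the fixed offsets are implied by the field sizes.

```javascript
// Standalone sketch of what the parser does for the first few header
// fields, using only Node's built-in Buffer methods (no npm module).
// Layout assumed little-endian, as in the flow above.
const buf = Buffer.from([0xa2, 0x5c, 0x01, 0x80, 0x02, 0x10, 0x02, 0x10]);

const decoded = {
  magic: buf.readUInt16LE(0),        // bytes a2 5c -> 0x5ca2
  packet_type: buf.readUInt16LE(2),  // bytes 01 80 -> 0x8001
  packet_size: buf.readUInt32LE(4),  // next four bytes as a uint32
};

console.log(decoded);
```

The binary-parser chain in the flow does exactly this kind of offset bookkeeping for you, which is why the declarative field list is so much easier to maintain.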



What is the advantage over using the contrib node?



Hi @Colin,

I found that the size of the Pattern field in the config dialog of node-red-contrib-binary is a major constraint. The expressions required to parse binary data are normally quite large, and it is cumbersome to be limited to working on one single line. I prefer the larger editing space of the function node.

Also, the binary-parser-encoder module seems to be updated more frequently and has more contributors to the code. Finally, I like the chainable syntax better. The syntax used by node-red-contrib-binary is somewhat obscure (or, more likely, I did not spend sufficient time working with it to get acquainted).




Always happy to look at a Pull Request that changes that field into a larger edit area, if you like...



Hi Andrei,

Thanks for the info! I haven't used the node-red-contrib-binary before.
Perhaps my questions are a bit off-topic, but now you have triggered my brain...

When users get an image from their IP camera, they have to set the output of the http-request node to 'buffer' instead of 'utf8 string', otherwise the image becomes corrupt. But do you perhaps have an understanding of what is really going wrong in that case? In other words, what does 'corrupt' mean?

And a second question. Suppose they did have an image as a utf8 string (which will hurt performance anyway, due to the string conversion): can we convert the string back to a valid image buffer afterwards? E.g. using node-red-contrib-binary. Or is the image data corrupt for eternity?

Thanks !



Corrupt means that it converted what was a binary payload into a UTF-8 string, which expands many higher-numbered characters into two bytes. It may be possible to un-mangle it, but I wouldn't rely on it. Rather than try to guess whether it is text or binary, it's safer for the http-request node to be explicit.
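A quick standalone sketch (plain Node.js, my own illustration) of that mangling: decoding raw bytes as UTF-8 replaces invalid sequences and expands others, so encoding the string back does not reproduce the original buffer.

```javascript
// Why reading binary data as a UTF-8 string corrupts it: bytes that do
// not form valid UTF-8 sequences are replaced with U+FFFD (3 bytes each)
// on decode, so the round trip changes both the length and the content.
const original = Buffer.from([0x48, 0x69, 0xa2, 0xff]); // "Hi" + 2 raw bytes

const asText = original.toString('utf8');      // invalid bytes -> U+FFFD
const roundTrip = Buffer.from(asText, 'utf8');

console.log(original.length, roundTrip.length); // 4 vs 8: data expanded
console.log(original.equals(roundTrip));        // false: data mangled
```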

The contrib-binary node is more for dissecting packets, e.g. unpacking C-struct-like objects that many low-level protocols used in the days before we stopped caring about bandwidth and accepted the verbosity of JSON, XML, etc. :slight_smile:

The binary-parser module may indeed be a better choice as it does at least seem to be maintained, even if the syntax is slightly different... (of course). May be worth a breaking-change PR if the owner can be persuaded. Failing that, a new node that did this would be useful (especially for all those playing with Modbus, PLCs, etc.).



Hi Bart,

Dave already replied to question #1 so let me just add my view on question #2.

I strongly believe that the round-trip conversion binary → UTF-8 → binary is not possible at all. If you have a look at the character set for UTF-8, you will see that many different bytes would be converted to the very same character (mainly in the range 0 to 31) on the trip back, so a round-trip conversion is not possible. Secondly, as you are working with image data, it may contain any possible byte (from 0 to 255). Note that many byte sequences are invalid in UTF-8, so they either break the decoding or, most likely, the decoder will convert those bytes to the replacement character.
I am afraid the answer is that the image, if converted that way, is lost forever. RIP data.

Edit: another issue is the one already pointed out by Dave: the conversion from binary to UTF-8 will generate extra bytes, since UTF-8 is a variable-length encoding.



I'm working on a similar job and having a lot of trouble. I'm not able to get an array of the measured points, which in my case is 90. Do you have any idea how to do this? Here is how my data is returning and the line I've added to your code to get the point cloud. I think I need an array of 90 positions, but when I tried it an error occurred in the function. I also could not write a "for" loop to fill the array. Anyway, I am not very experienced in JavaScript, so I am having these difficulties.
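Not knowing the exact Lidar point format, here is a minimal self-contained sketch (plain Node.js, with a made-up 2-bytes-per-point layout) of filling an array of measured points from a buffer with a for loop. For 90 points you would set NUM_POINTS to 90 and use the correct offset and size for your format.

```javascript
// Sketch: read a fixed-length array of points from a buffer with a for
// loop. Each point is assumed to be a little-endian uint16 (2 bytes);
// NUM_POINTS would be 90 for the scan described above.
const NUM_POINTS = 4;
const buf = Buffer.from([0x02, 0x10, 0x03, 0x10, 0x04, 0x10, 0x05, 0x10]);

const points = [];
for (let i = 0; i < NUM_POINTS; i++) {
  points.push(buf.readUInt16LE(i * 2)); // offset advances 2 bytes per point
}

console.log(points); // [4098, 4099, 4100, 4101]
```

The binary-parser module can also express this declaratively with its .array() field type (giving it a type and a length), which avoids the manual loop entirely.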