I have a situation where I need to convert a decimal number into hex and then separate it into its individual bytes so they can be placed into a Buffer separately. Example: I need to take an input of 300 and split the 0x12C into 0x2c, 0x01. Or take 250 and create 0xFA and 0x00. These will be placed into a Buffer like:
msg.payload = new Buffer([0x70,0x01,0x06,0x07,0x2c,0x01,0x00,0x00]);
I have been messing around with this, but am not getting to where I need to be.
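For reference, a minimal sketch of the kind of byte split being described, assuming the low byte goes first as in the example above (the variable names are just illustrative):

var num = 300;                      // 300 decimal = 0x12C
var lower = num & 0xff;             // 0x2c - least significant byte
var higher = (num >> 8) & 0xff;     // 0x01 - next byte up
msg.payload = Buffer.from([0x70, 0x01, 0x06, 0x07, lower, higher, 0x00, 0x00]);
return msg;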
One way to work out what is going wrong is to liberally distribute node.warn statements through the code; then you can see where it is failing. The node.warn output is displayed in the debug tab. So, for example, you might start with
var num = (250).toString(2);
node.warn("num = " + num);
var lower = (num & 0x3f);
node.warn("lower = " + lower);
var higher = (num & 0xff) >> 8;
node.warn("higher = " + higher);
and so on.
Why are you anding with 0x3f for lower, not 0xff?
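To see the difference that mask makes (a small worked example, not from the thread): 0x3f keeps only the low six bits, so any byte with either of its top two bits set gets mangled, whereas 0xff keeps the whole byte.

var n = 250;              // 0xFA = 0b11111010
node.warn(n & 0x3f);      // 58  (0x3A) - top two bits lost
node.warn(n & 0xff);      // 250 (0xFA) - full low byte kept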
When looking at the output of the function with a debug node, don't forget to set it to show the complete message object, as you are setting the data in properties like msg.bitnum rather than msg.payload.bitnum.
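As a hypothetical illustration of that point: with the debug node's default setting only msg.payload is displayed, so a value attached elsewhere on the message is easy to miss.

var lower = 300 & 0xff;           // example value, 0x2c
msg.bitnum = lower;               // top-level property: only visible when the debug
                                  // node is set to show the complete msg object
msg.payload = { bitnum: lower };  // under msg.payload: visible with the default setting
return msg;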
Thanks for the suggestions, but I am still struggling with this. If I "and" 100101100 with 0xFF I would expect to get 00101100, yet I am getting 11101100. Can you explain to me why that might be happening?
Oh! Just realised you have var num = (250).toString(2); at the start, which converts the value to a binary string. You must not do that; you need to keep it as a number so the bitwise operations work on the value you expect.
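That also explains the 11101100 seen above: the bitwise AND coerces the string "100101100" to the decimal number 100,101,100 (not binary 300), and the low byte of that number is 236, i.e. 0b11101100. Putting the corrections together, a sketch of a working version might look like the following (note that the high byte also needs the shift applied before the 0xff mask, which the thread did not get to, and Buffer.from is used rather than the deprecated new Buffer constructor):

var num = 250;                    // keep it as a number - no toString()
var lower = num & 0xff;           // 0xFA
var higher = (num >> 8) & 0xff;   // 0x00 - shift first, then mask
node.warn("lower = " + lower);
node.warn("higher = " + higher);
msg.payload = Buffer.from([0x70, 0x01, 0x06, 0x07, lower, higher, 0x00, 0x00]);
return msg;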