Object Detection using node-red-contrib-tfjs-coco-ssd

Required nodes in addition to standard nodes (I use the latest current versions):

  • node-red-contrib-image-tools
  • node-red-contrib-telegrambot
  • node-red-contrib-tfjs-coco-ssd
  • node-red-dashboard
  • node-red-node-base64
[{"id":"70f9130a.1f5c0c","type":"base64","z":"2d489fd9.1eedd","name":"","action":"str","property":"image","x":880,"y":530,"wires":[["af833380.f5ea9"]]},{"id":"420b8a89.423b94","type":"http request","z":"2d489fd9.1eedd","name":"","method":"GET","ret":"bin","paytoqs":false,"url":"https://loremflickr.com/320/240/sport","tls":"","persist":false,"proxy":"","authType":"","x":410,"y":430,"wires":[["62e85150.02f7b"]]},{"id":"77fd828a.a6073c","type":"ui_button","z":"2d489fd9.1eedd","name":"","group":"92518872.4d2bc8","order":3,"width":2,"height":1,"passthru":true,"label":"New Picture","tooltip":"","color":"","bgcolor":"","icon":"","payload":"","payloadType":"str","topic":"","x":260,"y":430,"wires":[["420b8a89.423b94"]]},{"id":"18eefa5b.18e276","type":"comment","z":"2d489fd9.1eedd","name":"========== ========== TFJS COCO SSD ========== ==========","info":"\n    ","x":340,"y":490,"wires":[]},{"id":"7c8df2eb.c7408c","type":"inject","z":"2d489fd9.1eedd","name":"test","topic":"","payload":"","payloadType":"date","repeat":"","crontab":"","once":false,"onceDelay":0.1,"x":130,"y":430,"wires":[["77fd828a.a6073c"]]},{"id":"d0f460b0.4bdfd","type":"change","z":"2d489fd9.1eedd","name":"pay to detect","rules":[{"t":"move","p":"payload","pt":"msg","to":"detect","tot":"msg"}],"action":"","property":"","from":"","to":"","reg":false,"x":440,"y":530,"wires":[["d6827244.41d64","7a793272.3245fc"]]},{"id":"506351f8.e4c8","type":"link in","z":"2d489fd9.1eedd","name":"coco","links":["23ae07fd.88e498","c710c20f.8d8cb","ed0b6b69.870e78","3d40617c.70e06e"],"x":195,"y":530,"wires":[["e75695c9.dca4e8"]]},{"id":"3d40617c.70e06e","type":"link out","z":"2d489fd9.1eedd","name":"","links":["506351f8.e4c8"],"x":715,"y":430,"wires":[]},{"id":"3ef708a8.692668","type":"comment","z":"2d489fd9.1eedd","name":"coco","info":"","x":130,"y":530,"wires":[]},{"id":"e75695c9.dca4e8","type":"tensorflowCoco","z":"2d489fd9.1eedd","name":"","scoreThreshold":"","passthru":true,"x":290,"y":530,"wires":[["d0f460b0.4bdfd"]]},{"id":"d6827244.41d64","type":"function","z":"2d489fd9.1eedd","name":"Map Objects Boxes get.Picture","func":"//========== retrieves and transmits the image ==========\n//var pictureBuffer = flow.get('pictureBuffer')|| [];\n//msg.pictureBuffer = pictureBuffer;  // send the picture\n\n//========== Detect Empty Objects ==========\nif(empty(msg.classes)){\nmsg.payload = \"no object detected\"  //prepare for the google translate\nmsg.class=\"no object detected\"      //prepare for display in Template Dashboard\nmsg.detect=[]                       //\nreturn msg\n} //no found object \nelse {\n// prepare the color and the thickness of the line in fct of the image size\nmsg.boxcolor = \"yellow\";\nmsg.textcolor = \"yellow\";\n \nif (msg.shape[1] < 300)                             {msg.textfontsize =\"10px\";msg.boxstroke =1;msg.textstroke = \"3px\";}\nelse if (msg.shape[1] >= 300 && msg.shape[1] < 500) {msg.textfontsize =\"15px\";msg.boxstroke =2;msg.textstroke = \"3px\";}\nelse if (msg.shape[1] >= 500 && msg.shape[1] < 900) {msg.textfontsize =\"20px\";msg.boxstroke =2;msg.textstroke = \"5px\";}\nelse if (msg.shape[1] >= 900 && msg.shape[1] < 2000){msg.textfontsize =\"50px\";msg.boxstroke =5;msg.textstroke = \"10px\";}\nelse if (msg.shape[1] >= 2000)                      {msg.textfontsize =\"80px\";msg.boxstroke =10;msg.textstroke = \"20px\";}\n\n//msg.textfontsize =(msg.shape[1] > 600) ? \"70px\" : \"10px\";\n//msg.boxstroke = (msg.shape[1] > 600) ? 5 : 2;\n//msg.textstroke = (msg.shape[1] > 600) ? 
\"10px\" : \"2px\";\n\n//========== indicates the type of objects and the quantity ==========\n// Get the array of names\nvar names = Object.keys(msg.classes);\nvar firstCount;\n\n// For each name, map it to \"n name\"\nvar parts = names.map((n,i) => {\n    var count = msg.classes[n];\n    if (i === 0) {\n        //Remember the first count to get the \"is/are\" right later\n        firstCount = count;\n    }\n    // Return \"n name\" and get the pluralisation right\n    return count+\" \"+n+(count>1?\"s\":\"\")\n})\n// If there is more than one name, pop off the last one for later\nvar lastName;\nif (parts.length > 1) {\n    lastName = parts.pop();\n}\n// Build up the response getting \"is/are\" right for the first count and joining\n// the array of names with a comma\nmsg.payload = \"There \"+(firstCount === 1 ? \"is\":\"are\")+\" \"+parts.join(\", \")\n// If there was a last name, add that on the end with an 'and', not a comma\nif (lastName) {\n    msg.payload += \" and \"+lastName;\n}\nreturn msg;\n}//fin de si detecte\n\n//========== EMPTY FONCTION ==========\n/*\nHere's a simpler(short) solution to check for empty variables. \nThis function checks if a variable is empty. \nThe variable provided may contain mixed values (null, undefined, array, object, string, integer, function).\n*/\nfunction empty(mixed_var) {\n if (!mixed_var || mixed_var == '0') {  return true; }\n if (typeof mixed_var == 'object') {  \n    for (var k in mixed_var) {   return false;  }\n    return true; \n }\nreturn false;\n}//EMPTY FONCTION END\n","outputs":1,"noerr":0,"x":680,"y":530,"wires":[["70f9130a.1f5c0c"]]},{"id":"62e85150.02f7b","type":"change","z":"2d489fd9.1eedd","name":"","rules":[{"t":"set","p":"topic","pt":"msg","to":"testDetection","tot":"str"}],"action":"","property":"","from":"","to":"","reg":false,"x":570,"y":430,"wires":[["3d40617c.70e06e"]]},{"id":"c241e6ec.94fda8","type":"comment","z":"2d489fd9.1eedd","name":"coco","info":"","x":780,"y":430,"wires":[]},{"id":"7defeef9.85baf","type":"comment","z":"2d489fd9.1eedd","name":"========== UPLOAD picture ==========","info":"new picture :320x240\n\n0: object\n    bbox: array[4]\n        0: 127\n        1: 86\n        2: 29\n        3: 90\n    class: \"person\"\n    score: 0.9015087080001831\n    \n    x=127 y=86 width =29  height=90\n    ","x":250,"y":380,"wires":[]},{"id":"af833380.f5ea9","type":"ui_template","z":"2d489fd9.1eedd","group":"92518872.4d2bc8","name":"D3 templatev3 TextStroke","order":1,"width":7,"height":5,"format":"<head>\n    <style>\n        :root {\n        --boxcolor: {{msg.boxcolor}};\n        --boxstroke: {{msg.boxstroke}};\n        --textcolor: {{msg.textcolor}};\n        --textfontsize: {{msg.textfontsize}};\n        --textstroke: {{msg.textstroke}};\n        }\n        .imag {\n            width: 100%;\n            height: 100%;\n        }    \n        #svgimage text {\n            font-family: Arial;\n            font-size: var(--textfontsize, 18px);\n            fill       : var(--textcolor, yellow);\n            paint-order: stroke;\n            stroke: black;/*#ffffff;*//*yellow;*/\n            stroke-width:  var(--textstroke, 3px);/*3px;*/\n            font-weight: 600;\n        }\n        rect {\n            fill: blue;\n            fill-opacity: 0;\n            stroke: var(--boxcolor, yellow);\n            stroke-width: var(--boxstroke, 1);\n        }\n        #svgimage {\n            /*background-color: #cccccc;  Used if the image is unavailable */\n            background-repeat: no-repeat;\n            background-size: 
cover;\n        }\n    </style>\n</head>\n<body>\n    <svg preserveAspectRatio=\"xMidYMid meet\" id=\"svgimage\"  style=\"width:100%\" viewBox=\"0 0 {{msg.shape[1]}} {{msg.shape[0]}}\">\n        <image  class=\"imag\"  href=\"data:image/jpg;base64,{{msg.image}}\"/>\n    </svg>\n    <!-- \n    <svg preserveAspectRatio=\"xMidYMid meet\" id=\"svgimage\"  style=\"width:100%\" viewBox=\"0 0 {{msg.shape[1]}} {{msg.shape[0]}}\">\n        <image  class=\"imag\"  href=\"data:image/jpg;base64,{{msg.pictureBuffer}}\"/>\n    </svg>    \n    \n    -->\n    <div>{{msg.payload}}</div>\n    <script>\n        (function (scope) {\n            scope.$watch('msg', function (msg) {\n                if (msg && msg.detect) {\n                    var svg = d3.select(\"#svgimage\");\n\n                    var box = svg.selectAll(\"rect\").data(msg.detect)\n                        .attr(\"x\", function (d) { return d.bbox[0]; })\n                        .attr(\"y\", function (d) { return d.bbox[1]; })\n                        .attr(\"width\", function (d) { return d.bbox[2]; })\n                        .attr(\"height\", function (d) { return d.bbox[3]; });\n                    box.enter()\n                        .append(\"rect\")\n                        .attr(\"x\", function (d) { return d.bbox[0]; })\n                        .attr(\"y\", function (d) { return d.bbox[1]; })\n                        .attr(\"width\", function (d) { return d.bbox[2]; })\n                        .attr(\"height\", function (d) { return d.bbox[3]; });\n                    box.exit().remove();\n                    \n                    var text = svg.selectAll(\"text\").data(msg.detect)\n                        .text(function (d) { return d.class; })\n                        .attr(\"x\", function (d) { return d.bbox[0]; })\n                        .attr(\"y\", function (d) { return 10 + d.bbox[1]; });\n                    text.enter()\n                        .append(\"text\")\n                        .text(function (d) { return d.class; })\n                        .attr(\"x\", function (d) { return d.bbox[0]; })\n                        .attr(\"y\", function (d) { return 10 + d.bbox[1]; });\n                    text.exit().remove();\n                }\n            });\n        })(scope);\n    </script>\n</body>","storeOutMessages":true,"fwdInMessages":true,"templateScope":"local","x":1060,"y":530,"wires":[["6620cc3f.200064"]]},{"id":"7a793272.3245fc","type":"link out","z":"2d489fd9.1eedd","name":"","links":["892922d6.71f3a","d4e37092.e9a51"],"x":565,"y":580,"wires":[]},{"id":"86e7511.53999b","type":"jimp-image","z":"2d489fd9.1eedd","name":"","data":"image","dataType":"msg","ret":"img","parameter1":"","parameter1Type":"msg","parameter2":"","parameter2Type":"msg","parameter3":"","parameter3Type":"msg","parameter4":"","parameter4Type":"msg","parameter5":"","parameter5Type":"msg","parameter6":"","parameter6Type":"msg","parameter7":"","parameter7Type":"msg","parameter8":"","parameter8Type":"msg","parameterCount":0,"jimpFunction":"none","selectedJimpFunction":{"name":"none","fn":"none","description":"Just loads the image.","parameters":[]},"x":980,"y":660,"wires":[["3797d0ec.4ce87"]]},{"id":"bdfde2a1.a7e7d","type":"ui_template","z":"2d489fd9.1eedd","group":"7b6a751b.c5eb9c","name":"Canvas Selector","order":5,"width":1,"height":1,"format":"    <script>\n    (function(scope) {\n        scope.$watch('msg', function(msg) {\n            if (msg) {\n                // Do something when msg arrives\n                let dataURL = 
document.querySelector(\"#canvasImage\").toDataURL('image/jpeg');\n                scope.send({payload: dataURL, w: msg.w, h: msg.h});\n            }\n        });\n    })(scope);\n    </script>\n","storeOutMessages":false,"fwdInMessages":false,"templateScope":"local","x":290,"y":780,"wires":[["7643a96c.565538"]]},{"id":"7643a96c.565538","type":"switch","z":"2d489fd9.1eedd","name":"data:image","property":"payload","propertyType":"msg","rules":[{"t":"cont","v":"data:image","vt":"str"}],"checkall":"true","repair":false,"outputs":1,"x":290,"y":840,"wires":[["e60214af.6cc108"]]},{"id":"d4e37092.e9a51","type":"link in","z":"2d489fd9.1eedd","name":"","links":["7a793272.3245fc"],"x":115,"y":660,"wires":[["97bf55b7.ec4198"]]},{"id":"c06dcfae.24d1b","type":"ui_template","z":"2d489fd9.1eedd","group":"7b6a751b.c5eb9c","name":"Canvas Template","order":7,"width":1,"height":1,"format":"<body>\n<canvas id=\"canvasImage\" width=\"1280\" height=\"720\">\n    Your browser does not support the canvas element.\n</canvas>\n\n<script>\n    (function(scope) {\n        scope.$watch('msg', function (msg) {\n            if (msg) {\n                // Do something when msg arrives\n                function squares(canvasid, squarelist, myimag) {\n                    let canvas = document.querySelector(canvasid);\n                    // hide the canvas\n                    canvas.style.display=\"none\";\n                    canvas.width = msg.shape[1];\n                    canvas.height = msg.shape[0];\n                    let ctx = canvas.getContext('2d');\n                    ctx.scale(1, 1);\n                    let image = new Image;\n                    image.src = \"data:image/jpg;base64,\"+myimag;\n                    image.onload = function () {\n                        ctx.drawImage(image, 0, 0, msg.shape[1], msg.shape[0]);\n                        for (const square of squarelist) {\n                            ctx.lineWidth = 2;\n                            ctx.strokeStyle = 'yellow';\n                            ctx.strokeRect(square.bbox[0], square.bbox[1], square.bbox[2], square.bbox[3]);\n                            ctx.font = \"10px Arial\";\n                            ctx.fillStyle = 'yellow';\n                            let width = ctx.measureText(square.class).width;\n                            ctx.fillRect(square.bbox[0], square.bbox[1]+square.bbox[3], square.bbox[2], 12);\n                            ctx.fillStyle = 'red';\n                            ctx.fillText(square.class, square.bbox[0]+square.bbox[2]/2-width/2, square.bbox[1]+square.bbox[3]+8);\n                        }\n                    };\n                }\n\n                let canvasid1 = \"#canvasImage\";\n                let detect = msg.detect;\n                let imag1  = msg.image;\n                scope.send({payload: detect});\n                squares(canvasid1, detect, imag1);\n                scope.send({payload: 'go', w: msg.shape[1], h: msg.shape[0]});\n            }\n        });\n})(scope);\n</script>\n\n</body>","storeOutMessages":false,"fwdInMessages":false,"templateScope":"local","x":290,"y":720,"wires":[["bdfde2a1.a7e7d"]]},{"id":"97bf55b7.ec4198","type":"base64","z":"2d489fd9.1eedd","name":"","action":"str","property":"image","x":290,"y":660,"wires":[["c06dcfae.24d1b","c445e6a1.625ed8"]]},{"id":"892922d6.71f3a","type":"link in","z":"2d489fd9.1eedd","name":"","links":["7a793272.3245fc"],"x":825,"y":660,"wires":[["86e7511.53999b"]]},{"id":"b5d5d39e.401d9","type":"mqtt 
in","z":"2d489fd9.1eedd","name":"","topic":"imagetf","qos":"2","datatype":"auto","broker":"2a019090.5ba4d","x":490,"y":380,"wires":[["2dd4729f.4e239e"]]},{"id":"2dd4729f.4e239e","type":"switch","z":"2d489fd9.1eedd","name":"","property":"payload","propertyType":"msg","rules":[{"t":"istype","v":"buffer","vt":"buffer"}],"checkall":"true","repair":false,"outputs":1,"x":600,"y":380,"wires":[["3d40617c.70e06e"]]},{"id":"e60214af.6cc108","type":"ui_template","z":"2d489fd9.1eedd","group":"7b6a751b.c5eb9c","name":"SVG Template","order":2,"width":12,"height":9,"format":"<head>\n<style>\n    .imag {\n        width: 100%;\n        height: 100%;\n    }    \n</style>\n</head>\n<body>\n    <svg preserveAspectRatio=\"xMidYMid meet\" id=\"svgimage\"  style=\"width:100%\" viewBox=\"0 0 {{msg.w}} {{msg.h}}\">\n        <image  class=\"imag\"  href=\"{{msg.payload}}\"/>\n    </svg>\n<script>\n(function(scope) {\n    scope.$watch('msg', function(msg) {\n        if (msg) {\n            // Do something when msg arrives\n            //scope.send({payload: dataURL});\n            //alert(msg.w);\n        }\n    });\n})(scope);\n</script>\n</body>\n","storeOutMessages":true,"fwdInMessages":true,"templateScope":"local","x":290,"y":900,"wires":[["7aac40f.94f5bc"]]},{"id":"3797d0ec.4ce87","type":"function","z":"2d489fd9.1eedd","name":"draw Rects","func":"const LINE_WIDTH = 2;\nconst LINE_COLOR = 0xED143DFF;\n\nvar jimpImage = msg.payload;\nlet imgW = jimpImage.bitmap.width;\nlet imgH = jimpImage.bitmap.height;\n\n//create clamping functions to avoid printing outside of image\nlet clampX = (val) => clamp(val,0,imgW);\nlet clampY = (val) => clamp(val,0,imgH);\n\n//add imgBatchOps to msg for drawing text in next node \nmsg.imgBatchOps = [];\n\n\nif (msg.detect.length>0) {\n    drawBoxes(jimpImage, msg.detect);\n}\n\nfunction drawBox(img, box) {\n    //get box ccords\n    let x = box.bbox[0];\n    let y = box.bbox[1];\n    let w = box.bbox[2];\n    let h = box.bbox[3];\n    \n    scanHLine(img, x, y+h-14, w, 14, 0xfa758e22); //draw box for text\n    scanRectangle(img, x, y, w, h, LINE_WIDTH, LINE_COLOR);//draw outer rect\n    \n    //built batch operations for next image node to draw text\n    msg.imgBatchOps.push(  \n        {\n            \"name\": \"print\",\n            \"parameters\": [\n                \"FONT_SANS_10_BLACK\",\n                clampX(x+2),\n                clampY(y+h-16),\n                box.class\n            ]\n        }\n        \n    );  \n}\n\nfunction drawBoxes(img, detections) {\n    detections.forEach(element => drawBox(img, element))\n}\n\n\nfunction makeColorIterator(color) {\n  return function (x, y, offset) {\n    this.bitmap.data.writeUInt32BE(color, offset, true);\n  }\n}\n\nfunction scanRectangle(image,x,y,w,h,linePX,color){\n    let iterator = makeColorIterator(color);\n   \n    let xw, yh;\n    x = clampX(x-(linePX/2));\n    y = clampY(y-(linePX/2));\n    w = clampX(w);\n    h = clampY(h);\n    xw = clampX(x+w);\n    yh = clampY(y+h);\n\n    \n    if(y+linePX <= imgH){\n        image.scan(x, y, w, linePX,  iterator);// scan linePX height line - TOP\n    }\n    if(yh+linePX <= imgH){\n        image.scan(x, yh, w, linePX, iterator);// scan linePX height line - BOTTOM\n    }\n    if(x+linePX <= imgW){\n        image.scan(x, y, linePX, h,  iterator);// scan linePX width line - LEFT\n    }\n    if(xw+linePX <= imgW){\n        image.scan(xw, y, linePX, h, iterator);// scan linePX width line - RIGHT\n    }\n    \n}\n\n\nfunction scanHLine(image,x,y,length,linePX,color){\n    let iterator = 
makeColorIterator(color);\n    x = clampX(x);\n    y = clampY(y);\n    length = clampX(length);\n    if(y+linePX <= imgH){\n        image.scan(x, y, length, linePX,  iterator);// \n    }\n}\n\nfunction scanVLine(image,x,y,height,linePX,color){\n    let iterator = makeColorIterator(color);\n    x = clampX(x);\n    y = clampY(y);\n    height = clampY(y+height);\n    if(x+linePX <= imgW){\n        image.scan(x, y, linePX, height,  iterator);// \n    }\n}\n\n\n//scanCircle(jimpImage, 92, 170, 43, 0x000000FF);\n//scanCircle(jimpImage, 92, 170, 42, 0xFFCC00FF);\n\nfunction clamp(value, min, max) {\n    return Math.min(Math.max(value, min), max);\n}\n\nfunction scanCircle(image, x, y, radius, color) {\n    let iterator = makeColorIterator(color);\n    return image.scan(x - radius, y - radius, radius*2, radius*2, function (pixX, pixY, idx) {\n        if (Math.pow(pixX - x, 2) + Math.pow(pixY - y, 2) < radius*radius) {\n            iterator.call(this, pixX, pixY, idx);\n        }\n    });\n}\n\n \n\nreturn msg;","outputs":1,"noerr":0,"x":980,"y":720,"wires":[["caf53b8c.5f6138"]]},{"id":"caf53b8c.5f6138","type":"jimp-image","z":"2d489fd9.1eedd","name":"add text","data":"payload","dataType":"msg","ret":"img","parameter1":"imgBatchOps","parameter1Type":"msg","parameter2":"","parameter2Type":"msg","parameter3":"","parameter3Type":"msg","parameter4":"","parameter4Type":"msg","parameter5":"","parameter5Type":"msg","parameter6":"","parameter6Type":"msg","parameter7":"","parameter7Type":"msg","parameter8":"","parameter8Type":"msg","parameterCount":1,"jimpFunction":"batch","selectedJimpFunction":{"name":"batch","fn":"batch","description":"apply one or more functions","parameters":[{"name":"options","type":"json","required":true,"hint":"an object or an array of objects containing {\"name\" : \"function_name\", \"parameters\" : [x,y,z]}.  
Refer to info on side panel}"}]},"x":980,"y":780,"wires":[["fd02639.baffca","526acc16.b9dfe4"]]},{"id":"5da5b688.b89a88","type":"telegram sender","z":"2d489fd9.1eedd","name":"","bot":"7e82026b.614f0c","x":980,"y":1200,"wires":[[]]},{"id":"c4973d1a.24498","type":"function","z":"2d489fd9.1eedd","name":"","func":"if ('person' in msg.classes){\n    var m = { content:'', type:'', chatId:''};\n    m.content = msg.payload;\n    m.type = 'photo';\n    m.chatId = 1234567890;\n    msg.payload = m;\n    return msg;    \n}\n\n","outputs":1,"noerr":0,"x":980,"y":1140,"wires":[["5da5b688.b89a88"]]},{"id":"da054679.713928","type":"base64","z":"2d489fd9.1eedd","name":"","action":"","property":"payload","x":290,"y":1020,"wires":[[]]},{"id":"fd02639.baffca","type":"jimp-image","z":"2d489fd9.1eedd","name":"","data":"payload","dataType":"msg","ret":"b64","parameter1":"","parameter1Type":"msg","parameter2":"","parameter2Type":"msg","parameter3":"","parameter3Type":"msg","parameter4":"","parameter4Type":"msg","parameter5":"","parameter5Type":"msg","parameter6":"","parameter6Type":"msg","parameter7":"","parameter7Type":"msg","parameter8":"","parameter8Type":"msg","parameterCount":0,"jimpFunction":"none","selectedJimpFunction":{"name":"none","fn":"none","description":"Just loads the image.","parameters":[]},"x":980,"y":840,"wires":[["21c0651b.0bc42a"]]},{"id":"21c0651b.0bc42a","type":"ui_template","z":"2d489fd9.1eedd","group":"7b6a751b.c5eb9c","name":"SVG Template","order":1,"width":12,"height":9,"format":"<head>\n    <style>\n        .imag {\n            width: 100%;\n            height: 100%;\n        }    \n   </style>\n</head>\n<body>\n    <svg preserveAspectRatio=\"xMidYMid meet\" id=\"svgimage\"  style=\"width:100%\" viewBox=\"0 0 {{msg.shape[1]}} {{msg.shape[0]}}\">\n        <image  class=\"imag\"  href=\"{{msg.payload}}\"/>\n    </svg>\n</body>","storeOutMessages":true,"fwdInMessages":true,"templateScope":"local","x":980,"y":900,"wires":[["81c1a54d.5c19a8"]]},{"id":"4b8734ac.3cb2ac","type":"base64","z":"2d489fd9.1eedd","name":"","action":"","property":"payload","x":980,"y":1080,"wires":[["c4973d1a.24498"]]},{"id":"81c1a54d.5c19a8","type":"split","z":"2d489fd9.1eedd","name":"","splt":",","spltType":"str","arraySplt":1,"arraySpltType":"len","stream":false,"addname":"payload","x":980,"y":960,"wires":[["d309fc02.ddea1"]]},{"id":"d309fc02.ddea1","type":"switch","z":"2d489fd9.1eedd","name":"","property":"payload","propertyType":"msg","rules":[{"t":"neq","v":"data:image","vt":"str"}],"checkall":"true","repair":false,"outputs":1,"x":980,"y":1020,"wires":[["4b8734ac.3cb2ac"]]},{"id":"1fd51178.32a59f","type":"image viewer","z":"2d489fd9.1eedd","name":"","width":"160","data":"payload","dataType":"msg","x":510,"y":960,"wires":[[]]},{"id":"526acc16.b9dfe4","type":"image viewer","z":"2d489fd9.1eedd","name":"","width":160,"data":"payload","dataType":"msg","x":1200,"y":780,"wires":[[]]},{"id":"c445e6a1.625ed8","type":"image viewer","z":"2d489fd9.1eedd","name":"","width":160,"data":"image","dataType":"msg","x":510,"y":660,"wires":[[]]},{"id":"6620cc3f.200064","type":"image viewer","z":"2d489fd9.1eedd","name":"","width":160,"data":"image","dataType":"msg","x":1230,"y":530,"wires":[[]]},{"id":"7aac40f.94f5bc","type":"function","z":"2d489fd9.1eedd","name":"","func":"msg.payload = msg.payload.split(',')[1];\nreturn msg;","outputs":1,"noerr":0,"x":290,"y":960,"wires":[["da054679.713928","1fd51178.32a59f"]]},{"id":"92518872.4d2bc8","type":"ui_group","z":"","name":"Video 
Capture","tab":"17522d42.149913","disp":false,"width":"7","collapse":false},{"id":"7b6a751b.c5eb9c","type":"ui_group","z":"","name":"NR Image tests","tab":"a0ac7d1b.a925d","disp":false,"width":24,"collapse":false},{"id":"2a019090.5ba4d","type":"mqtt-broker","z":"","name":"","broker":"192.168.0.240","port":"1883","clientid":"","usetls":false,"compatmode":true,"keepalive":"60","cleansession":true,"birthTopic":"","birthQos":"0","birthPayload":"","closeTopic":"","closePayload":"","willTopic":"","willQos":"0","willPayload":""},{"id":"7e82026b.614f0c","type":"telegram bot","z":"","botname":"Dummy","usernames":"","chatids":"","baseapiurl":"","updatemode":"polling","pollinterval":"300","usesocks":false,"sockshost":"","socksport":"6667","socksusername":"anonymous","sockspassword":"","bothost":"","localbotport":"8443","publicbotport":"8443","privatekey":"","certificate":"","useselfsignedcertificate":false,"sslterminated":false,"verboselogging":false},{"id":"17522d42.149913","type":"ui_tab","z":"","name":"Video Capture","icon":"dashboard","order":3,"disabled":false,"hidden":false},{"id":"a0ac7d1b.a925d","type":"ui_tab","z":"","name":"XXX","icon":"dashboard","order":1,"disabled":false,"hidden":false}]

As an example, these images are handled excellently by the coco ssd analysis. Some other analyzers I have tested had problems detecting me walking there. The camera is looking down from the roof, so there are no real frontal views, which would have been ideal.
[image]
[image]


Walter, this is awesome! And your flow is very compact.
When I started with Node-RED a couple of years ago, this was the kind of stuff that I wanted to achieve.
Finally we are getting closer to a pure Node-RED based video surveillance solution.
Just love it :+1: :+1: :+1: :+1: :+1: :+1:


A little bug that won't happen often. If we compare it to my flow:


[image]

The problem is the very loose calculation of the positions of the lines and of the rectangle for the text underneath.

We don't have the luxury of a client-side DOM with a plethora of elements and functions - on the server side, we have a bitmap of pixels and we have to calculate which pixels to set.

With a bit more work, we could easily determine, from the rectangle's coordinates in relation to the bitmap extent, where to place the string appropriately.
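For illustration, a small sketch of that idea (a hypothetical helper, not the flow's actual code): given a bounding box [x, y, w, h] and the bitmap size, choose a label position that stays inside the image.

function labelPosition(bbox, imgW, imgH, textH = 14) {
    const [x, y, w, h] = bbox;
    // Prefer printing the label below the box; fall back to above it, otherwise inside it.
    let ty;
    if (y + h + textH <= imgH) {
        ty = y + h;
    } else if (y - textH >= 0) {
        ty = y - textH;
    } else {
        ty = y;
    }
    // Clamp x so the label never starts outside the bitmap.
    const tx = Math.min(Math.max(x, 0), Math.max(imgW - 1, 0));
    return { x: tx, y: ty };
}

// e.g. labelPosition([127, 86, 29, 90], 320, 240) -> { x: 127, y: 176 }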

Edit...
On the lighter side, we (humans) can see it's a person haha.

(the annotation is really only for visual debugging - I think :slight_smile: ).

I just have to tell you that I have finally got something working on my NVIDIA Jetson Nanos (arm64) as well!!! So happy!

I raised an issue earlier regarding the missing support for tfjs-node on arm64 platforms.

Luckily for me, @yhwang kindly reached out and provided a solution.


Maybe this is useful for RPi4 users as well?

Anyway, now I can run Node-RED on my Jetson Nanos with tfjs-coco-ssd detection built-in!


Can you share this flow please?

Of course, right away...

I found out that some additional nodes are also needed. I installed them via the palette manager before I imported the flow (image-tools you have already, I think :wink: )

Best regards, Walter

"dependencies": {
"@tensorflow/tfjs-node": "1.4.0",
"node-red": "^1.0.3",
"node-red-contrib-browser-utils": "0.0.9",
"node-red-contrib-image-tools": "^0.2.5",
"node-red-contrib-post-object-detection": "^0.1.2",
"node-red-contrib-tf-function": "^0.1.0",
"node-red-contrib-tf-model": "^0.1.6"

[{"id":"1545e935.999977","type":"jimp-image","z":"bfd6731.db3089","name":"load the image","data":"payload","dataType":"msg","ret":"buf","parameter1":"","parameter1Type":"msg","parameter2":"","parameter2Type":"msg","parameter3":"","parameter3Type":"msg","parameter4":"","parameter4Type":"msg","parameter5":"","parameter5Type":"msg","parameter6":"","parameter6Type":"msg","parameter7":"","parameter7Type":"msg","parameter8":"","parameter8Type":"msg","parameterCount":0,"jimpFunction":"none","selectedJimpFunction":{"name":"none","fn":"none","description":"Just loads the image.","parameters":[]},"x":360,"y":140,"wires":[["8721e9e0.e949e8","d775e36d.cc825","11e3829b.dec12d"]]},{"id":"8721e9e0.e949e8","type":"image viewer","z":"bfd6731.db3089","name":"Original Image viewer","width":"320","data":"payload","dataType":"msg","x":940,"y":140,"wires":[[]]},{"id":"5606b167.a042b","type":"image viewer","z":"bfd6731.db3089","name":"With bounding boxes","width":"320","data":"payload","dataType":"msg","x":660,"y":570,"wires":[[]]},{"id":"7ceb8aa5.98cbe4","type":"bbox-image","z":"bfd6731.db3089","name":"bounding-box","x":570,"y":500,"wires":[["5606b167.a042b"]]},{"id":"dd52b9b3.e03978","type":"change","z":"bfd6731.db3089","name":"objects","rules":[{"t":"set","p":"complete","pt":"msg","to":"true","tot":"bool"}],"action":"","property":"","from":"","to":"","reg":false,"x":360,"y":430,"wires":[["11e3829b.dec12d"]]},{"id":"bcf6963a.e2f4e8","type":"post-object-detection","z":"bfd6731.db3089","classesURL":"https://s3.sjc.us.cloud-object-storage.appdomain.cloud/tfjs-cos/cocossd/classes.json","iou":"0.5","minScore":"0.5","name":"post-processing","x":360,"y":360,"wires":[["dd52b9b3.e03978"]]},{"id":"d775e36d.cc825","type":"tf-function","z":"bfd6731.db3089","name":"pre-processing","func":"const image = tf.tidy(() => {\n  return tf.node.decodeImage(msg.payload, 3).expandDims(0);\n});\n\nreturn {payload: { image_tensor: image } };","outputs":1,"noerr":0,"x":360,"y":210,"wires":[["8bee1dc1.4ecca"]]},{"id":"8bee1dc1.4ecca","type":"tf-model","z":"bfd6731.db3089","modelURL":"https://storage.googleapis.com/tfjs-models/savedmodel/ssdlite_mobilenet_v2/model.json","outputNode":"","name":"COCO SSD","x":350,"y":280,"wires":[["bcf6963a.e2f4e8"]]},{"id":"11e3829b.dec12d","type":"function","z":"bfd6731.db3089","name":"","func":"let queue = flow.get('queue');\nif (queue === undefined) {\n    queue = [];\n    flow.set('queue', queue);\n}\n\nif (msg.complete === undefined) {\n    queue.push(msg.payload);\n    node.done();\n} else {\n    const image = queue.shift();\n    node.send(\n        {\n            payload: {\n                objects: msg.payload,\n                image: image\n            }\n        });\n}","outputs":1,"noerr":0,"x":565,"y":430,"wires":[["7ceb8aa5.98cbe4"]],"icon":"node-red/join.svg","l":false},{"id":"f8135785.7c4808","type":"mqtt in","z":"bfd6731.db3089","name":"","topic":"epic","qos":"2","datatype":"auto","broker":"a5576de4.82b","x":140,"y":140,"wires":[["1545e935.999977"]]},{"id":"ff74da07.dd21f8","type":"fileinject","z":"bfd6731.db3089","name":"","x":150,"y":100,"wires":[["1545e935.999977"]]},{"id":"a5576de4.82b","type":"mqtt-broker","z":"","name":"","broker":"192.168.0.240","port":"1883","clientid":"","usetls":false,"compatmode":true,"keepalive":"60","cleansession":true,"birthTopic":"","birthQos":"0","birthPayload":"","closeTopic":"","closePayload":"","willTopic":"","willQos":"0","willPayload":""}]

Ah, now I understand. It uses node-canvas for the drawing, so it inherits the canvas capabilities - a good workaround for the limited drawing capabilities of jimp.

Do you mean you now use the tf-model node instead of the coco-ssd node? And is the Jetson Nano a better choice (instead of an RPi 4) for this kind of stuff?

The analysis is not always correct.
E.g. the analysis of this picture results in a motorcycle instead of a car:

I have told my wife so many times that she should be careful when parking our Volkswagen :thinking:

Sorry for this amateurish interruption of a professional discussion ...

In addition, it was an almost new car! :laughing:

Yes, I too see the same problem: plants are detected as people and create false alerts through the loudspeaker of the house at 2:00 am: "There is a person in the garden" :woozy_face:
While researching other example flows, I found this one:

https://miro.medium.com/max/1400/1*h6Vz9fIQJRoASHOK0uKHGg.gif

It is based on Teachable Machine by Google. Using a webcam, we capture the object from various angles, backgrounds and lighting conditions... To improve the model, we can add more photos.
The result is a TensorFlow.js model, whose URL must be entered in the detection node, or which can be saved locally. So simple.
To test ...
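For anyone who wants to experiment outside the dedicated detection node, loading such a model in a function node could look roughly like this (a sketch only; it assumes tfjs-node is exposed to function nodes via functionGlobalContext as 'tf', and the model URL is just a placeholder):

const tf = global.get('tf');
const MODEL_URL = 'https://teachablemachine.withgoogle.com/models/<your-model>/model.json'; // placeholder

// Load the model once and cache the promise in the node context.
let ready = context.get('modelPromise');
if (!ready) {
    ready = tf.loadLayersModel(MODEL_URL);
    context.set('modelPromise', ready);
}

ready.then(model => {
    // Decode the incoming JPEG buffer and resize it to the model input size.
    // The exact preprocessing depends on how the model was exported.
    const input = tf.tidy(() => {
        const img = tf.node.decodeImage(msg.payload, 3).toFloat();
        return tf.image.resizeBilinear(img, [224, 224]).div(255).expandDims(0);
    });
    const scores = model.predict(input);
    scores.data().then(values => {
        msg.payload = Array.from(values);
        node.send(msg);
        tf.dispose([input, scores]);
    });
});
return null;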


Yes, currently on the Jetson Nano I use the tf-model node, and it is stated that the GPU is used. To make the coco-ssd node work, I suspect it has to use the updated @tensorflow/tfjs-node, but I do not know how to make that happen. I have some ideas to experiment with, like running "npm install" in the node-red-contrib-tfjs-coco-ssd directory to rebuild the bindings, but I don't know.

EDIT: I do not know, but I do suspect that it is problematic to use the node-red-contrib-tfjs-coco-ssd node on an RPi4. Maybe someone has already tried?

Is the Jetson Nano a better choice than the RPi4 for this kind of thing? Well, I don't know. What I know is that the Nano can do impressive real-time video analysis at high speed when using other tools & libraries written specifically for it. But I'm not sure its power is really utilized when running everything through NR.

So very true :smiley:
On the other hand, my wife once worked with a company handling customer fleets (car leasing) and you cannot believe how many calls started "My wife...". In reality, it was not the wife...

Very interesting!!!



This decrease in inferencing time brings the Raspberry Pi 4 directly into competition with both the NVIDIA Jetson Nano and the Movidius-based hardware from Intel

For the most curious, the link to the comparison (very instructive). The difference between TensorFlow and TensorFlow Lite is obvious:


So I managed to make the coco-ssd-node work on Jetson Nano!!! This is how it worked for me:

Install both nodes:

In the folder "/home/user/.node-red/node_modules/node-red-contrib-tfjs-coco-ssd/node_modules/@tensorflow" delete the following folders

  • tfjs-node
  • tfjs-converter

and then replace with the same folders from the folder "/home/user/.node-red/node_modules/@tensorflow"

After this the coco-ssd node also works on arm64!
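For reference, a hypothetical little Node.js script that does the same folder swap (paths are the ones above; it needs a recent Node.js for fs.rmSync and fs.cpSync):

const fs = require('fs');
const path = require('path');

// Source: the @tensorflow packages installed directly under .node-red
const src = '/home/user/.node-red/node_modules/@tensorflow';
// Destination: the copies bundled inside node-red-contrib-tfjs-coco-ssd
const dst = '/home/user/.node-red/node_modules/node-red-contrib-tfjs-coco-ssd/node_modules/@tensorflow';

for (const dir of ['tfjs-node', 'tfjs-converter']) {
    fs.rmSync(path.join(dst, dir), { recursive: true, force: true });         // delete the bundled copy
    fs.cpSync(path.join(src, dir), path.join(dst, dir), { recursive: true }); // copy the arm64 build
}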

[image]


I was able to test the detection of objects, but on the Pi 3 memory consumption keeps increasing without ever going down.
After about 10 detections it blocks, and it is necessary to restart NR.
Was this already mentioned or resolved in another post? If so, sorry :face_with_hand_over_mouth:
If not, do you have ideas to solve this problem?
I attach a CPU & memory visualization for each detection.

[image: mémoire-cpu]

[attachment: NR-LOG]

I'm having the exact same issue on an RPi 4.

I had to write a flow that restarts Node-RED once it hits 60% of RAM consumed.
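Roughly, the check in that flow can look like this (a sketch only; it assumes the os module is exposed via functionGlobalContext and that a downstream exec node runs something like node-red-restart):

// Function node, triggered e.g. by an inject node every minute.
const os = global.get('os');
const used = 1 - os.freemem() / os.totalmem();

if (used > 0.6) {
    // Let a downstream exec node restart Node-RED.
    return { payload: used };
}
return null;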

Which version of the node are you using? Can you try shrinking the images before sending them to the node? (While it shouldn't matter, the models only need about 320 x 240 px.)

Here I have programmed a resize of the images captured by the cameras to 320x240. For the last 2 hours, I have not exceeded 90% of memory use (1 GB of RAM). I will let it run for a few days to see...
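For anyone who wants to do the resize in a function node instead of with the jimp-image node, a sketch (assuming the jimp library is exposed via functionGlobalContext as 'jimp'):

const Jimp = global.get('jimp');

// Shrink the camera frame to 320x240 before it reaches the detection node.
Jimp.read(msg.payload)                        // msg.payload: image buffer
    .then(img => img
        .resize(320, 240)                     // or (320, Jimp.AUTO) to keep the aspect ratio
        .getBufferAsync(Jimp.MIME_JPEG))
    .then(buf => {
        msg.payload = buf;
        node.send(msg);
    })
    .catch(err => node.error(err, msg));

return null;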

On the other hand, more and more people are complaining about memory leaks when using tfjs:



It seems that using the tf.tidy() method and the tf.disposeVariables() function solves the problem.
@dceejay does that mean anything to you? Can it be useful to solve our problem?
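For illustration, the pattern I have in mind looks roughly like this (a sketch only, not the node's actual code; tf.tidy() cannot wrap the async detect() call, so the input tensor is disposed manually):

const tf = require('@tensorflow/tfjs-node');
const cocoSsd = require('@tensorflow-models/coco-ssd');

async function detectOnce(model, jpegBuffer) {
    const input = tf.node.decodeImage(jpegBuffer, 3);   // Tensor3D
    try {
        return await model.detect(input);
    } finally {
        input.dispose();          // release the input tensor
        tf.disposeVariables();    // drop any stray variables left behind
    }
}

// model would be loaded once at startup: const model = await cocoSsd.load();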

Worth a try - pushed version 0.4.1.
Not sure if I've implemented it correctly - so any help / advice gratefully received...