Object Detection using node-red-contrib-tfjs-coco-ssd

@zenofmud while not a node - you may be able to try this on a Pi0 and then call it via exec etc ? https://medium.com/@haraldfernengel/compiling-tensorflow-lite-for-a-raspberry-pi-786b1b98e646

I'll give it a try for the sake of trying it, but it does say it takes about 5 seconds to run and by that time someone (little kids) might be into the rest of the house. So for the "Wash Your Hands" coronavirus project, I'm going to use a Pi3.

I thought I'd have time today until the 3 yr old granddaughter arrived and - being her toy - I got nothing done. And I just found out our daughter has decided to pull her boys (5 and 7) out of school for a bit. Oh, did I mention they live 3 houses up the street!

I need a secret lair!

Good decision, hope you and your relatives are all well! It is a bit scary at the moment, like running in a minefield, suddenly....boom


Here in France that's it: we are closing all schools, universities ... until further notice. The children are at home for a month!! :open_mouth:


Python and OpenCV to the rescue...

I decided to give it a try. My goal was to draw the boxes of all detected & filtered objects (persons in this case) on the image and then send it via Telegram to my iPhone. Well, it works great!!!

Since I'm more used to programming in Python, I decided to make a first try that way. However, OpenCV is also available for JS, and most NR users are familiar with JS, so a new challenge would be to figure out how this could be solved using opencv4nodejs in a function node or in the ui_template.

The principle of my prototype:

  1. An image is sent in parallel to a Python script and to NR via MQTT
  2. In NR the 'node-red-contrib-tfjs-coco-ssd' node analyzes the image for objects
  3. The resulting boxes, if any, are then sent from NR back to the same Python script, also via MQTT (see the small sample after this list for their shape)
  4. The Python script uses OpenCV to draw the boxes on the image, together with other textual info like class and score
  5. Finally, if boxes were drawn, it sends the image to my phone and saves it to my server disk for future investigation, should that ever be required
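
For reference, the detection messages in step 3 are simply what the coco-ssd node outputs: an array of objects, each with a bounding box, a class name and a confidence score. A minimal sample (the numbers are taken from the flow comment further down in this thread):

// One entry per detected object: bbox is [x, y, width, height] in pixels,
// class is the predicted label and score the confidence (0..1)
msg.payload = [
    { bbox: [127, 86, 29, 90], class: "person", score: 0.90 }
];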

Come on, now ready for next image!


Just a teaser, now using opencv4nodejs


sorry @krambriw - what is opencv adding here vs your original example?

No Python script required anymore, everything is in JS. In this simpler case OpenCV is just drawing all the boxes on top of the image, but with OpenCV available in NR much more can be done in terms of image manipulation.

Requires the installation of opencv4nodejs (on a Pi3 I did it like this, from the /home/pi folder):

  • Increase the CONF_SWAPSIZE from 100 to 2048:
    sudo nano /etc/dphys-swapfile
    Reboot
  • Run the installation:
    npm install --force --save opencv4nodejs
  • Set the CONF_SWAPSIZE back to 100:
    sudo nano /etc/dphys-swapfile
    Reboot

In my settings.js I have added:
cv2:require('opencv4nodejs'),
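
To be precise, that line sits in the functionGlobalContext section of settings.js, which is what makes the module reachable from function nodes via global.get('cv2'). Roughly, with an otherwise default settings file:

module.exports = {
    // ...other Node-RED settings...
    functionGlobalContext: {
        // available in function nodes as global.get('cv2')
        cv2: require('opencv4nodejs')
    }
    // ...other Node-RED settings...
};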

This is the code in the function node:

var image = context.get('mat')||undefined;
var cv = global.get('cv2');
var detections = [];

const rows = 240; // image height (not used further below, kept for reference)
const cols = 320; // image width (not used further below, kept for reference)

if(typeof(msg.payload)==="string"){
    const buffer = Buffer.from(msg.payload,'base64');
    image = cv.imdecode(buffer); //Image is now represented as Mat
    context.set('mat', image);
}else{
    detections = msg.payload;
    showBoxes();
    //cv.imwrite('/home/pi/Pictures/img.jpg', image);
    // convert Mat to base64 encoded jpg image
    const outBase64 =  cv.imencode('.jpg', image).toString('base64'); // Perform base64 encoding
    msg.payload = outBase64;
    node.send(msg);
}

function writeBox(element) {
    let box = element;
    let x = parseInt(box.bbox[0]);
    let y = parseInt(box.bbox[1]);
    let width = parseInt(box.bbox[2]);
    let height= parseInt(box.bbox[3]);
    let theClass = box["class"];
    let theScore = Math.round(box["score"]*100); // percentage, rounded to avoid float artifacts like 85.00000000000001
    //node.warn(theScore);
    const blue = new cv.Vec3(255, 0, 0);
    const font = cv.FONT_HERSHEY_SIMPLEX;
    const green = new cv.Vec3(0, 255, 0);
    image.drawRectangle(new cv.Point2(x, y),new cv.Point2(x+width, y+height),blue,2);
    image.putText(theClass+' '+theScore+'%', new cv.Point2(x, y-5), font, 0.5, green, 2);
}

function showBoxes() {
    detections.forEach(element => writeBox(element))
}


This is the flow:

[{"id":"379c521a.e9d72e","type":"comment","z":"2d489fd9.1eedd","name":"NR Object detection","info":"","x":170,"y":40,"wires":[]},{"id":"8a69170a.2a6d78","type":"ui_template","z":"2d489fd9.1eedd","group":"31b5112.2d60fee","name":"highlightPict","order":2,"width":"24","height":"14","format":"<!DOCTYPE html>\n<html>\n<style type=\"text/css\"> \n.wrapper {\n    position:absolute;\n    width:320px;\n    height:240px;\n    border:2px solid grey;\n}\n</style> \n\n<script type=\"text/javascript\">\nscope.$watch('msg', function(msg) {\n    inMessage(msg);\n});\n\nfunction inMessage(event) {\n    if (event.payload) {\n        document.getElementById('cX').src = \"data:image/jpg;base64,\"+event.payload;\n    }\n}\n\n</script>\n<body>\n<div class=\"wrapper\">\n     <img src=\"\" id=\"cX\" />\n</div>\n</body>\n</html>\n\n\n\n","storeOutMessages":true,"fwdInMessages":true,"templateScope":"local","x":1070,"y":80,"wires":[[]]},{"id":"2fccc750.eba658","type":"tensorflowCoco","z":"2d489fd9.1eedd","name":"","model":"","scoreThreshold":"","x":610,"y":260,"wires":[["f37f4cb9.3def6","9a7b8548.3747c8","f31c5417.7fee68"]]},{"id":"2085ac81.9563d4","type":"base64","z":"2d489fd9.1eedd","name":"","action":"str","property":"payload","x":610,"y":80,"wires":[["b681c523.3bd768"]]},{"id":"f37f4cb9.3def6","type":"ui_table","z":"2d489fd9.1eedd","group":"31b5112.2d60fee","name":"table detection","order":8,"width":10,"height":"12","columns":[],"outputs":0,"cts":false,"x":830,"y":260,"wires":[]},{"id":"2ac71df3.5060f2","type":"http request","z":"2d489fd9.1eedd","name":"","method":"GET","ret":"bin","paytoqs":false,"url":"https://loremflickr.com/320/240/sport","tls":"","persist":false,"proxy":"","authType":"","x":360,"y":170,"wires":[["2fccc750.eba658","2085ac81.9563d4","9a7b8548.3747c8"]]},{"id":"dfc95a0d.0305c8","type":"ui_button","z":"2d489fd9.1eedd","name":"","group":"31b5112.2d60fee","order":6,"width":1,"height":1,"passthru":true,"label":"New Picture","tooltip":"","color":"","bgcolor":"","icon":"","payload":"","payloadType":"str","topic":"","x":150,"y":170,"wires":[["2ac71df3.5060f2"]]},{"id":"fba73baf.e43ef8","type":"inject","z":"2d489fd9.1eedd","name":"test","topic":"","payload":"","payloadType":"date","repeat":"","crontab":"","once":false,"onceDelay":0.1,"x":130,"y":80,"wires":[["dfc95a0d.0305c8"]]},{"id":"9a7b8548.3747c8","type":"function","z":"2d489fd9.1eedd","name":"","func":"oldT = context.get('prev')||0;\nvar time = new Date().getTime();\nif(oldT === 0){\n    context.set('prev', time);\n}\nif(oldT>0){\n    node.warn(time-oldT);\n    context.set('prev', 0);\n}\n","outputs":0,"noerr":0,"x":610,"y":170,"wires":[]},{"id":"3bf60ae8.caf4f6","type":"mqtt in","z":"2d489fd9.1eedd","name":"","topic":"image/#","qos":"2","datatype":"auto","broker":"2a019090.5ba4d","x":140,"y":260,"wires":[["adb4835c.d0472"]]},{"id":"adb4835c.d0472","type":"switch","z":"2d489fd9.1eedd","name":"","property":"payload","propertyType":"msg","rules":[{"t":"istype","v":"buffer","vt":"buffer"}],"checkall":"true","repair":false,"outputs":1,"x":360,"y":260,"wires":[["2085ac81.9563d4","9a7b8548.3747c8","2fccc750.eba658"]]},{"id":"f31c5417.7fee68","type":"switch","z":"2d489fd9.1eedd","name":"","property":"classes","propertyType":"msg","rules":[{"t":"hask","v":"person","vt":"str"}],"checkall":"true","repair":false,"outputs":1,"x":830,"y":170,"wires":[["b681c523.3bd768"]]},{"id":"b681c523.3bd768","type":"function","z":"2d489fd9.1eedd","name":"","func":"var image = context.get('mat')||undefined;\nvar cv = global.get('cv2');\nvar detections = [];\n\nconst rows 
= 240; // height\nconst cols = 320; // width\n\nif(typeof(msg.payload)===\"string\"){\n    const buffer = Buffer.from(msg.payload,'base64');\n    image = cv.imdecode(buffer); //Image is now represented as Mat\n    context.set('mat', image);\n}else{\n    detections = msg.payload;\n    showBoxes();\n    //cv.imwrite('/home/pi/Pictures/img.jpg', image);\n    // convert Mat to base64 encoded jpg image\n    const outBase64 =  cv.imencode('.jpg', image).toString('base64'); // Perform base64 encoding\n    msg.payload = outBase64;\n    node.send(msg);\n}\n\nfunction writeBox(element) {\n    let box = element;\n    let x = parseInt(box.bbox[0]);\n    let y = parseInt(box.bbox[1]);\n    let width = parseInt(box.bbox[2]);\n    let height= parseInt(box.bbox[3]);\n    let theClass = box[\"class\"];\n    let theScore = box[\"score\"].toFixed(2)*100;\n    //node.warn(theScore);\n    const blue = new cv.Vec3(255, 0, 0);\n    const font = cv.FONT_HERSHEY_SIMPLEX;\n    const green = new cv.Vec3(0, 255, 0);\n    image.drawRectangle(new cv.Point2(x, y),new cv.Point2(x+width, y+height),blue,2);\n    image.putText(theClass+' '+theScore+'%', new cv.Point2(x, y-5), font, 0.5, green, 2);\n}\n\nfunction showBoxes() {\n    detections.forEach(element => writeBox(element))\n}\n\n","outputs":1,"noerr":0,"x":830,"y":80,"wires":[["d4e78cd3.ba85c","8a69170a.2a6d78","831edb10.ef55f8"]]},{"id":"31b5112.2d60fee","type":"ui_group","z":"","name":"detection","tab":"da82a0d.14db76","order":2,"disp":true,"width":"24","collapse":true},{"id":"2a019090.5ba4d","type":"mqtt-broker","z":"","name":"","broker":"192.168.0.240","port":"1883","clientid":"","usetls":false,"compatmode":true,"keepalive":"60","cleansession":true,"birthTopic":"","birthQos":"0","birthPayload":"","closeTopic":"","closePayload":"","willTopic":"","willQos":"0","willPayload":""},{"id":"da82a0d.14db76","type":"ui_tab","z":"","name":"NR Object detection","icon":"dashboard","order":2,"disabled":false,"hidden":false}]

@krambriw I just installed Ubuntu 18.04.4 on a PC to test tfjs-coco-ssd, but I also get an error:

node-pre-gyp info This Node instance does not support builds for N-API version 5
node-pre-gyp info This Node instance does not support builds for N-API version 5
22 Mar 13:57:14 - [error] [tensorflowCoco: 74c4cd06.2447b4] Error: Cannot find module '/home/chris/.node-red/node_modules/@tensorflow/tfjs-node/lib/napi-v4/tfjs_binding.node'

I'm really unlucky, it only works on Rpi for me :frowning:

How did you install TFJS on your Debian?

@SuperNinja
I actually did it straight from the NR palette manager and it worked fine. I must have been lucky that the platform in the laptop was "old enough" to be supported. I have some arm64 based devices I would love to get working, but they are not supported. I was given a hint that I could build them myself, but the process is very complicated and time consuming, so I just could not find the right motivation for now...

Since the node depends on @tensorflow/tfjs-node, I found some related info when googling "N-API version 5".
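
A quick way to see what your own Node.js runtime offers (just a suggestion, nothing specific to tfjs): print the N-API version it supports and compare it with the napi-vX folder the error message is looking for:

// Save as check-napi.js and run with: node check-napi.js
// Prints the Node.js version and the latest N-API version this build supports;
// the error above looks for a prebuilt binding in a napi-v4 folder and mentions N-API 5.
console.log('Node.js version:', process.version);
console.log('N-API version  :', process.versions.napi);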

Maybe it could be something mentioned here:

Thanks for your links. I tried them but without success. I will have to resign myself to using only the RPi3B unless I buy an RPi4 ... 4GB RAM!
I think there is a version conflict between node.js and other stuff ...

In the meantime I worked on object detection using only Node-RED, and here is the result:
[image: detection result]
[EDIT] Since v0.2.0, tfjs-coco-ssd has an option to pass the image through the node as msg.image.

[{"id":"5617fe56.98f7b","type":"debug","z":"a8b1b208.7036f","name":"","active":false,"tosidebar":true,"console":false,"tostatus":false,"complete":"payload","targetType":"msg","x":460,"y":2640,"wires":[]},{"id":"83b9c0e7.1436b","type":"debug","z":"a8b1b208.7036f","name":"","active":false,"tosidebar":true,"console":false,"tostatus":false,"complete":"classes","targetType":"msg","x":460,"y":2610,"wires":[]},{"id":"9fec0294.e4c1a","type":"base64","z":"a8b1b208.7036f","name":"","action":"str","property":"image","x":890,"y":2580,"wires":[["4cbfe86c.b89a18"]]},{"id":"e62c51ce.092fd","type":"http request","z":"a8b1b208.7036f","name":"","method":"GET","ret":"bin","paytoqs":false,"url":"https://loremflickr.com/320/240/sport","tls":"","persist":false,"proxy":"","authType":"","x":430,"y":2430,"wires":[["cd835566.3e1698"]]},{"id":"1581af1.85bcf51","type":"ui_button","z":"a8b1b208.7036f","name":"","group":"7c26c8d0.c1e658","order":3,"width":2,"height":1,"passthru":true,"label":"New Picture","tooltip":"","color":"","bgcolor":"","icon":"","payload":"","payloadType":"str","topic":"","x":280,"y":2430,"wires":[["e62c51ce.092fd"]]},{"id":"a295be8c.20765","type":"comment","z":"a8b1b208.7036f","name":"========== ========== TFJS COCO SSD ========== ==========","info":"\n    ","x":330,"y":2540,"wires":[]},{"id":"7dbdcf58.cfb77","type":"debug","z":"a8b1b208.7036f","name":"coco","active":false,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","x":440,"y":2670,"wires":[]},{"id":"9d67d128.0c0d6","type":"image","z":"a8b1b208.7036f","name":"","width":"400","data":"payload","dataType":"msg","thumbnail":false,"active":false,"x":290,"y":2730,"wires":[]},{"id":"44dc9948.c91258","type":"inject","z":"a8b1b208.7036f","name":"test","topic":"","payload":"","payloadType":"date","repeat":"","crontab":"","once":false,"onceDelay":0.1,"x":150,"y":2430,"wires":[["1581af1.85bcf51"]]},{"id":"8e1912af.b653","type":"change","z":"a8b1b208.7036f","name":"pay to detect","rules":[{"t":"move","p":"payload","pt":"msg","to":"detect","tot":"msg"}],"action":"","property":"","from":"","to":"","reg":false,"x":460,"y":2580,"wires":[["94e6d3d.a32373"]]},{"id":"ccde240c.6ab728","type":"link in","z":"a8b1b208.7036f","name":"coco","links":["23ae07fd.88e498","c710c20f.8d8cb","ed0b6b69.870e78","d137d328.1051d"],"x":185,"y":2580,"wires":[["6a563402.3008bc"]]},{"id":"d137d328.1051d","type":"link out","z":"a8b1b208.7036f","name":"","links":["ccde240c.6ab728"],"x":725,"y":2480,"wires":[]},{"id":"ba873bc5.14dff8","type":"comment","z":"a8b1b208.7036f","name":"coco","info":"","x":120,"y":2580,"wires":[]},{"id":"6a563402.3008bc","type":"tensorflowCoco","z":"a8b1b208.7036f","name":"","model":"","scoreThreshold":"","passthru":true,"x":280,"y":2580,"wires":[["5617fe56.98f7b","83b9c0e7.1436b","8e1912af.b653","7dbdcf58.cfb77","e260cf07.f7485"]]},{"id":"ef49f5fa.acfcc8","type":"fileinject","z":"a8b1b208.7036f","name":"","x":160,"y":2480,"wires":[["cd835566.3e1698"]]},{"id":"94e6d3d.a32373","type":"function","z":"a8b1b208.7036f","name":"Map Objects Boxes get.Picture","func":"//========== retrieves and transmits the image ==========\n//var pictureBuffer = flow.get('pictureBuffer')|| [];\n//msg.pictureBuffer = pictureBuffer;  // send the picture\n\n//========== Detect Empty Objects ==========\nif(empty(msg.classes)){\nmsg.payload = \"no object detected\"  //prepare for the google translate\nmsg.class=\"no object detected\"      //prepare for display in Template Dashboard\nmsg.detect=[]                       //\nreturn msg\n} //no found 
object \nelse {\n// prepare the color and the thickness of the line in fct of the image size\nmsg.boxcolor = \"yellow\";\nmsg.textcolor = \"yellow\";\n \nif (msg.shape[1] < 300)                             {msg.textfontsize =\"10px\";msg.boxstroke =1;msg.textstroke = \"3px\";}\nelse if (msg.shape[1] >= 300 && msg.shape[1] < 500) {msg.textfontsize =\"15px\";msg.boxstroke =2;msg.textstroke = \"3px\";}\nelse if (msg.shape[1] >= 500 && msg.shape[1] < 900) {msg.textfontsize =\"20px\";msg.boxstroke =2;msg.textstroke = \"5px\";}\nelse if (msg.shape[1] >= 900 && msg.shape[1] < 2000){msg.textfontsize =\"50px\";msg.boxstroke =5;msg.textstroke = \"10px\";}\nelse if (msg.shape[1] >= 2000)                      {msg.textfontsize =\"80px\";msg.boxstroke =10;msg.textstroke = \"20px\";}\n\n//msg.textfontsize =(msg.shape[1] > 600) ? \"70px\" : \"10px\";\n//msg.boxstroke = (msg.shape[1] > 600) ? 5 : 2;\n//msg.textstroke = (msg.shape[1] > 600) ? \"10px\" : \"2px\";\n\n//========== indicates the type of objects and the quantity ==========\n// Get the array of names\nvar names = Object.keys(msg.classes);\nvar firstCount;\n\n// For each name, map it to \"n name\"\nvar parts = names.map((n,i) => {\n    var count = msg.classes[n];\n    if (i === 0) {\n        //Remember the first count to get the \"is/are\" right later\n        firstCount = count;\n    }\n    // Return \"n name\" and get the pluralisation right\n    return count+\" \"+n+(count>1?\"s\":\"\")\n})\n// If there is more than one name, pop off the last one for later\nvar lastName;\nif (parts.length > 1) {\n    lastName = parts.pop();\n}\n// Build up the response getting \"is/are\" right for the first count and joining\n// the array of names with a comma\nmsg.payload = \"There \"+(firstCount === 1 ? \"is\":\"are\")+\" \"+parts.join(\", \")\n// If there was a last name, add that on the end with an 'and', not a comma\nif (lastName) {\n    msg.payload += \" and \"+lastName;\n}\nreturn msg;\n}//fin de si detecte\n\n//========== EMPTY FONCTION ==========\n/*\nHere's a simpler(short) solution to check for empty variables. \nThis function checks if a variable is empty. 
\nThe variable provided may contain mixed values (null, undefined, array, object, string, integer, function).\n*/\nfunction empty(mixed_var) {\n if (!mixed_var || mixed_var == '0') {  return true; }\n if (typeof mixed_var == 'object') {  \n    for (var k in mixed_var) {   return false;  }\n    return true; \n }\nreturn false;\n}//EMPTY FONCTION END\n","outputs":1,"noerr":0,"x":680,"y":2580,"wires":[["9fec0294.e4c1a"]]},{"id":"cd835566.3e1698","type":"change","z":"a8b1b208.7036f","name":"","rules":[{"t":"set","p":"topic","pt":"msg","to":"testDetection","tot":"str"}],"action":"","property":"","from":"","to":"","reg":false,"x":600,"y":2480,"wires":[["d137d328.1051d"]]},{"id":"b4c46bd1.78b018","type":"comment","z":"a8b1b208.7036f","name":"coco","info":"","x":790,"y":2480,"wires":[]},{"id":"b51df799.00cd48","type":"comment","z":"a8b1b208.7036f","name":"========== UPLOAD picture ==========","info":"new picture :320x240\n\n0: object\n    bbox: array[4]\n        0: 127\n        1: 86\n        2: 29\n        3: 90\n    class: \"person\"\n    score: 0.9015087080001831\n    \n    x=127 y=86 width =29  height=90\n    ","x":240,"y":2390,"wires":[]},{"id":"4cbfe86c.b89a18","type":"ui_template","z":"a8b1b208.7036f","group":"7c26c8d0.c1e658","name":"D3 templatev3 TextStroke","order":1,"width":"7","height":"5","format":"<head>\n    <style>\n        :root {\n        --boxcolor: {{msg.boxcolor}};\n        --boxstroke: {{msg.boxstroke}};\n        --textcolor: {{msg.textcolor}};\n        --textfontsize: {{msg.textfontsize}};\n        --textstroke: {{msg.textstroke}};\n        }\n        .imag {\n            width: 100%;\n            height: 100%;\n        }    \n        #svgimage text {\n            font-family: Arial;\n            font-size: var(--textfontsize, 18px);\n            fill       : var(--textcolor, yellow);\n            paint-order: stroke;\n            stroke: black;/*#ffffff;*//*yellow;*/\n            stroke-width:  var(--textstroke, 3px);/*3px;*/\n            font-weight: 600;\n        }\n        rect {\n            fill: blue;\n            fill-opacity: 0;\n            stroke: var(--boxcolor, yellow);\n            stroke-width: var(--boxstroke, 1);\n        }\n        #svgimage {\n            /*background-color: #cccccc;  Used if the image is unavailable */\n            background-repeat: no-repeat;\n            background-size: cover;\n        }\n    </style>\n</head>\n<body>\n    <svg preserveAspectRatio=\"xMidYMid meet\" id=\"svgimage\"  style=\"width:100%\" viewBox=\"0 0 {{msg.shape[1]}} {{msg.shape[0]}}\">\n        <image  class=\"imag\"  href=\"data:image/jpg;base64,{{msg.image}}\"/>\n    </svg>\n    <!-- \n    <svg preserveAspectRatio=\"xMidYMid meet\" id=\"svgimage\"  style=\"width:100%\" viewBox=\"0 0 {{msg.shape[1]}} {{msg.shape[0]}}\">\n        <image  class=\"imag\"  href=\"data:image/jpg;base64,{{msg.pictureBuffer}}\"/>\n    </svg>    \n    \n    -->\n    <div>{{msg.payload}}</div>\n    <script>\n        (function (scope) {\n            scope.$watch('msg', function (msg) {\n                if (msg && msg.detect) {\n                    var svg = d3.select(\"#svgimage\");\n\n                    var box = svg.selectAll(\"rect\").data(msg.detect)\n                        .attr(\"x\", function (d) { return d.bbox[0]; })\n                        .attr(\"y\", function (d) { return d.bbox[1]; })\n                        .attr(\"width\", function (d) { return d.bbox[2]; })\n                        .attr(\"height\", function (d) { return d.bbox[3]; });\n                    box.enter()\n     
                   .append(\"rect\")\n                        .attr(\"x\", function (d) { return d.bbox[0]; })\n                        .attr(\"y\", function (d) { return d.bbox[1]; })\n                        .attr(\"width\", function (d) { return d.bbox[2]; })\n                        .attr(\"height\", function (d) { return d.bbox[3]; });\n                    box.exit().remove();\n                    \n                    var text = svg.selectAll(\"text\").data(msg.detect)\n                        .text(function (d) { return d.class; })\n                        .attr(\"x\", function (d) { return d.bbox[0]; })\n                        .attr(\"y\", function (d) { return 10 + d.bbox[1]; });\n                    text.enter()\n                        .append(\"text\")\n                        .text(function (d) { return d.class; })\n                        .attr(\"x\", function (d) { return d.bbox[0]; })\n                        .attr(\"y\", function (d) { return 10 + d.bbox[1]; });\n                    text.exit().remove();\n                }\n            });\n        })(scope);\n    </script>\n</body>","storeOutMessages":true,"fwdInMessages":true,"templateScope":"local","x":1080,"y":2580,"wires":[[]]},{"id":"e260cf07.f7485","type":"change","z":"a8b1b208.7036f","name":"image to pay","rules":[{"t":"move","p":"image","pt":"msg","to":"payload","tot":"msg"}],"action":"","property":"","from":"","to":"","reg":false,"x":280,"y":2690,"wires":[["9d67d128.0c0d6"]]},{"id":"7c26c8d0.c1e658","type":"ui_group","z":"","name":"detection","tab":"b123e6c8.d3e948","order":2,"disp":true,"width":"10","collapse":true},{"id":"b123e6c8.d3e948","type":"ui_tab","z":"","name":"Slide","icon":"dashboard","order":2,"disabled":false,"hidden":false}]

What do you think about @dceejay ?

Special Thanks to

  • @Andrei for the CSS help in the Template node
  • And @knolleary for helping me put the object list into sentences

Very nicely done!!!

Thank you @krambriw
There are certainly possible optimizations in the Function and Template nodes, but it works as it is :smiley:

@dceejay thinks about lots of things ! (or I like to think I do)... but yes - very nice - if you want to add it as an example by a Pull request to the node - please feel free to do so.

Evening @SuperNinja, @dceejay,
I had already prepared a pull request locally to add some extra information and simple example flows to the readme page of the coco-ssd node. The reason for my pull request: a couple of weeks ago I thought this kind of AI stuff would be way too difficult for me to configure, but afterwards it turned out that the coco-ssd node is VERY easy to use. By adding some basic info to the readme page, I would like to make this powerful node a bit more inviting for users like me ...
Is it ok if I create a pull request with some basic stuff, so you can add your nice advanced example to it afterwards? Just to avoid us doing the same work twice ...
Thanks!

:+1: agree !

I saw that your request to pass the image through the node has been implemented. I'm modifying the example above; as you said, it saves having to fetch the image from memory.

Update here.
Do you want to put examples of simple uses in this topic? Then we can sort out what is duplicated or not.

That is indeed something we should mention! Good that you thought about it...

I have already created a pull request.
On my fork you can see what the updated readme looks like.
That way you can get a first impression ...


readme looks good to me.


@SuperNinja
A small suggestion to add a little feature for using other services: sending the analyzed image to your smartphone or to other UI solutions.

With my solution using opencv4nodejs, I could easily just forward the analyzed image with the boxes drawn to a Telegram node
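
For example (a rough sketch, assuming node-red-contrib-telegrambot as the Telegram node and that the function node above delivers the drawn image as a base64 string in msg.payload; the chat id is a placeholder):

// Function node placed in front of the Telegram sender node:
// turn the base64 jpg back into a Buffer and build a photo message
const imageBuffer = Buffer.from(msg.payload, 'base64');
msg.payload = {
    chatId: 123456789,     // placeholder - use your own chat id
    type: 'photo',
    content: imageBuffer   // the jpg with the boxes drawn on it
};
return msg;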

Could you look, if possible, at forwarding the analyzed image to the output of the ui_template node (instead of just passing through the unprocessed image)?