[Announce] node-red-contrib-facial-recognition

Well, Xcode was already installed and so was Python 2, so I updated the node.

First test using the last example: facial recognition works in the example. Using an image of my own also worked. I tried a second image of my own and got a dump in the log:

12 Dec 18:36:44 - [info] Started modified flows
(node:38881) UnhandledPromiseRejectionWarning: Error: Size(1048576) must match the product of shape 512,512,3
    at inferFromImplicitShape (/Users/Paul/.node-red/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:884:19)
    at forward (/Users/Paul/.node-red/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:7427:17)
    at /Users/Paul/.node-red/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:3480:55
    at /Users/Paul/.node-red/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:3319:22
    at Engine.scopedRun (/Users/Paul/.node-red/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:3329:23)
    at Engine.tidy (/Users/Paul/.node-red/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:3318:21)
    at kernelFunc (/Users/Paul/.node-red/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:3480:29)
    at /Users/Paul/.node-red/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:3492:27
    at Engine.scopedRun (/Users/Paul/.node-red/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:3329:23)
    at Engine.runKernelFunc (/Users/Paul/.node-red/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:3490:14)
    at reshape_ (/Users/Paul/.node-red/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:7432:19)
    at reshape__op (/Users/Paul/.node-red/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:4258:29)
    at Tensor.as3D (/Users/Paul/.node-red/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:29472:12)
    at /Users/Paul/.node-red/node_modules/@vladmandic/face-api/dist/face-api.node.js:8:34933
    at Array.map (<anonymous>)
    at /Users/Paul/.node-red/node_modules/@vladmandic/face-api/dist/face-api.node.js:8:34594
    at /Users/Paul/.node-red/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:3319:22
    at Engine.scopedRun (/Users/Paul/.node-red/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:3329:23)
    at Engine.tidy (/Users/Paul/.node-red/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:3318:21)
    at Object.tidy (/Users/Paul/.node-red/node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:8898:19)
    at NetInput.toBatchTensor (/Users/Paul/.node-red/node_modules/@vladmandic/face-api/dist/face-api.node.js:8:34541)
    at /Users/Paul/.node-red/node_modules/@vladmandic/face-api/dist/face-api.node.js:8:100674
(node:38881) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 2)

@meeki007 Aha! I may have discovered part of the problem. On the Mac, every folder has a .DS_Store file added to it, and the code is trying to process that file and chokes. You need to make sure to ignore any hidden files when processing the images.

Edit: however, this is not the issue with the second image I use; it still creates the dump shown in the previous message.


I've done this in the past with Linux. However, I went back to my bookmarks and found this:

list = list.filter(item => !(/(^|\/)\.[^\/\.]/g).test(item));

However, it fails on

item = '/.DS_Store';

So I'll just make a note in the documentation to make dang sure you don't have any hidden files in the folder that contains the images.

Good find, though! However, if it failed on .DS_Store it should have thrown an error in the node,

and if it sees it as a hidden directory inside the img folder it should ignore it, as the code is only looking for files in the dir.

This is buried so deep in the face-api that I can't write a catch for it.

However, I put a dent in the loading times. I thought that if I got my lazy code out of the eval I could get at the UnhandledPromiseRejectionWarning... but no. Some nice optimizations came out of it, though.

I'm giving up on making it deal with bad files (for the face-api) submitted by users.
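If you want to guard against this on your side in the meantime, here is a minimal sketch (not part of the node; plain Node.js, and the folder path is just an example) of listing only visible, regular image files before pointing the node at a folder:

const fs = require('fs');
const path = require('path');

// List only visible, regular image files in a labeled-faces folder,
// skipping hidden entries such as .DS_Store and any sub-directories.
function visibleImageFiles(dir) {
  return fs.readdirSync(dir)
    .filter(name => !name.startsWith('.'))                        // drop hidden entries
    .filter(name => fs.statSync(path.join(dir, name)).isFile())   // keep regular files only
    .filter(name => /\.(jpe?g|png)$/i.test(name));                // keep common image types
}

console.log(visibleImageFiles('/home/pi/facial/labeled_faces/person_name')); // example path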

Changes to the code can be found here --> https://github.com/meeki007/node-red-contrib-facial-recognition/pull/2/files

@zenofmud submitted a fix for this. I can't verify it, as I don't own a Mac, but it looks good and did not break my Linux install.

New release pushed.

Change to Code ---> https://github.com/meeki007/node-red-contrib-facial-recognition/pull/3/files

@BartButenaers

I finished the solution for persisting data so load times can be lower.

The data/work from running through all the images in a user's KnownFacesPath folder needs to be stored before the faceMatcher is built, so the distanceThreshold can still be set on the fly.

I found helpers for JSON.stringify in the face-api, called toJSON and fromJSON, for serializing the data.
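Roughly, the round trip looks like this (a sketch only; it assumes the toJSON()/fromJSON() helpers live on LabeledFaceDescriptors in @vladmandic/face-api, and the variable names are illustrative):

const faceapi = require('@vladmandic/face-api');

// labeledFaceDescriptors: the array produced by scanning the KnownFacesPath folder.
// Save: turn each LabeledFaceDescriptors entry into plain JSON.
const serialized = JSON.stringify(labeledFaceDescriptors.map(ld => ld.toJSON()));

// Restore: rebuild the class instances from the saved JSON.
const restored = JSON.parse(serialized)
  .map(obj => faceapi.LabeledFaceDescriptors.fromJSON(obj));

// The restored descriptors can feed a FaceMatcher with any distanceThreshold,
// without re-scanning the KnownFacesPath images.
const matcher = new faceapi.FaceMatcher(restored, 0.6);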

The new solution:
An output for the user to save the value of payload.labeledFaceDescriptors.
The user can save it however they like. I just copied the value into a function node and created a msg for it.

Then the user sets
settings.FaceRecognition.enabled.ReInitializeFaceMatcher = false
and
settings.FaceRecognition.enabled.labeledFaceDescriptors = user saved labeledFaceDescriptors
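Concretely, the function node that sits in front of the facial-recognition node ends up looking something like this (a sketch; it assumes the saved descriptors arrive in msg.payload):

// Function node: reuse saved descriptors instead of re-reading KnownFacesPath.
var labeledFaceDescriptors = msg.payload;   // the value saved from the node's output
msg.settings = {
  FaceRecognition: {
    enabled: {
      ReInitializeFaceMatcher: false,
      labeledFaceDescriptors: labeledFaceDescriptors
    }
  }
};
return msg;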

I'm creating an example flow now to show how to use it.

I'll post here when I push the update for this feature.

Can't wait to try this.
I have been trying to integrate Amazon Rekognition or Microsoft Azure for facial rec, but this looks like it will fit the bill nicely.
Cheers

FaceRecognition_Persistant_labeledFaceDescriptors

Make sure you update to ver. 0.28.95.

Note: the FaceDetector minConfidence property affects the labeledFaceDescriptors. If you have a minConfidence of 0.9, you may miss a bunch of faces when building your labeledFaceDescriptors. After you run the node once, or supply it labeledFaceDescriptors from persistent storage, you can set the value to any level you wish for screening the input image you send.

You can save the FaceRecognition labeledFaceDescriptors to a persistent storage solution so you don't have to load all the images in your KnownFacesPath folder every time you deploy Node-RED, or if Node-RED/your device restarts.

This could be a HUGE time savings if you have thousands of people in your KnownFacesPath folder.

Also, this shows how msg.settings is used. Just un-comment anything you wish to override from the node's Properties menu.
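For reference, the FaceDetector part of that override looks like this once un-commented (a function-node fragment; the values are just the example defaults, and as noted above minConfidence also affects how the labeledFaceDescriptors get built):

// Function-node fragment: override the detector settings via msg.settings.
msg.settings = {
  FaceDetector: {
    SsdMobilenetv1: {
      maxResults: 4,       // how many faces to report
      minConfidence: 0.6   // use a lower value while building labeledFaceDescriptors
    }
  }
};
return msg;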

example flow found here ---> https://github.com/meeki007/node-red-contrib-facial-recognition#FaceRecognition_Persistant_labeledFaceDescriptors

Windows users: Run the commands in the Windows troubleshooting guide from within your Node-RED user directory - typically ~/.node-red

The first example doesn't work. In the change node "set msg.payload to msg.payload.labeledFaceDescriptors" there is no msg.payload.labeledFaceDescriptors in the msg coming from the facial-recognition node.


Well, I imported the flow and it works just fine for me... hmmm, going to have to spin up a fresh copy of Node-RED and try it fresh to verify.

@zenofmud

have you updated to ver.0.28.95 >>>>?????<<<<<

BTW, that's @dceejay's go-to response for when something (a new feature) does not work for a user in the dashboard. Figured it must work or he would not use it.

Arrugh (head slap), I missed that - with the update it is working. On to more testing.

Wow amazing speed improvement with the persistence!


Yep, I do that all the time. Mistakes happen; the more I make, the more I learn. Last night I was drinking a beer, just reading through the code with no pressure, and noticed that the GPU option will not work because I missed adding a require for it. I'm going to work on that. Then I'm going to spin up Docker and see if it installs there. I've been getting Windows users with issues; I think they may have to go the Docker route and I want to make sure it works.

Yep, but if you go the file route some extra logic needs to happen. I wanted the example to be as simple as possible. You will notice it still writes a file every time you trigger it from the persistent inject.

You can get rid of the json -> file-out and the file-in -> json by replacing them with a change node that writes to a flow variable. If you have it set up so your context variables are written to storage, the flow is smaller and the 'labeledFaceDescriptors' data will survive Node-RED stops/restarts.
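In function-node terms that boils down to something like this (a sketch; the change-node version does the same thing, and it assumes a file-backed context store is configured in settings.js so the data survives restarts):

// After the facial-recognition node: stash the descriptors in flow context.
flow.set('labeledFaceDescriptors', msg.payload.labeledFaceDescriptors);

// Before the facial-recognition node: read them back to build msg.settings.
var saved = flow.get('labeledFaceDescriptors');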

I'd also add a dummy image and have an inject run at startup/deploy time to prep the 'labeledFaceDescriptors', so you would always send the real images from another starting point.

Good point. I should create a prepping example flow so users get the idea. You and I get these ideas, but a basic user might not see it out of the gate.

I'm in the process of doing that; give me a bit and I'll post it.

Here is my take on using persistent storage.

[{"id":"a820369.a74ddc8","type":"group","z":"6fb28ca.5920374","style":{"stroke":"#999999","stroke-opacity":"1","fill":"none","fill-opacity":"1","label":true,"label-position":"nw","color":"#a4a4a4"},"nodes":["366d25af.834d4a","fba109c3.4a3a7","7bacc5f5.f6f874","93f27667.3f2e28","cf779b78.0b7448","9c32a61b.b55b78","bb16f2af.7bf1f8","d76e9005.11f698","afe8866c.746ca8"],"x":54,"y":219,"w":532,"h":302},{"id":"366d25af.834d4a","type":"comment","z":"6fb28ca.5920374","g":"a820369.a74ddc8","name":"This is where images are sent into the flow","info":"","x":240,"y":260,"wires":[]},{"id":"fba109c3.4a3a7","type":"inject","z":"6fb28ca.5920374","g":"a820369.a74ddc8","name":"sample (6).jpg","props":[{"p":"payload"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"sample (6).jpg","payloadType":"str","x":170,"y":340,"wires":[["d76e9005.11f698","afe8866c.746ca8"]]},{"id":"7bacc5f5.f6f874","type":"function","z":"6fb28ca.5920374","g":"a820369.a74ddc8","name":"msg.settings","func":"var labeledFaceDescriptors = msg.payload;\nmsg.settings = {\n  //FaceDetector :\n  //{\n  //  SsdMobilenetv1 :\n  //  {\n  //    maxResults : 4,\n  //    minConfidence : 0.6\n  //  }\n  //},\n  //Tasks :\n  //{\n  //  detectAllFaces :\n  //  {\n  //    withFaceLandmarks : true,\n  //    withFaceExpressions : true,\n  //    withAgeAndGender : true,\n  //    withFaceDescriptors : true\n  //  }\n  //},\n  FaceRecognition :\n  {\n    enabled :\n    {\n      //KnownFacesPath : \"/example/labeled_face\",\n      //distanceThreshold : 0.6,\n      ReInitializeFaceMatcher : false,\n      labeledFaceDescriptors : labeledFaceDescriptors\n    }\n  }            \n};\n\nreturn msg;","outputs":1,"noerr":0,"initialize":"","finalize":"","x":390,"y":380,"wires":[["bb16f2af.7bf1f8"]]},{"id":"93f27667.3f2e28","type":"comment","z":"6fb28ca.5920374","g":"a820369.a74ddc8","name":"images to process","info":"","x":170,"y":300,"wires":[]},{"id":"cf779b78.0b7448","type":"change","z":"6fb28ca.5920374","g":"a820369.a74ddc8","name":"get labeledFaceDescriptors","rules":[{"t":"set","p":"payload","pt":"msg","to":"labeledFaceDescriptors","tot":"flow"}],"action":"","property":"","from":"","to":"","reg":false,"x":440,"y":340,"wires":[["7bacc5f5.f6f874"]]},{"id":"9c32a61b.b55b78","type":"inject","z":"6fb28ca.5920374","g":"a820369.a74ddc8","name":"tracy.jpg","props":[{"p":"payload"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"tracy.jpg","payloadType":"str","x":160,"y":380,"wires":[["d76e9005.11f698","afe8866c.746ca8"]]},{"id":"bb16f2af.7bf1f8","type":"link out","z":"6fb28ca.5920374","g":"a820369.a74ddc8","name":"to Get Image","links":["8f2be870.db157"],"x":390,"y":420,"wires":[],"l":true},{"id":"d76e9005.11f698","type":"change","z":"6fb28ca.5920374","g":"a820369.a74ddc8","name":"","rules":[{"t":"set","p":"filename","pt":"msg","to":"$flowContext('basepath')&payload\t","tot":"jsonata"}],"action":"","property":"","from":"","to":"","reg":false,"x":410,"y":300,"wires":[["cf779b78.0b7448"]]},{"id":"afe8866c.746ca8","type":"debug","z":"6fb28ca.5920374","g":"a820369.a74ddc8","name":"show start 
time","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"payload","targetType":"msg","statusVal":"","statusType":"auto","x":280,"y":480,"wires":[]},{"id":"bd7929d8.398cb","type":"group","z":"6fb28ca.5920374","style":{"stroke":"#999999","stroke-opacity":"1","fill":"none","fill-opacity":"1","label":true,"label-position":"nw","color":"#a4a4a4"},"nodes":["c3bd1e6e.bf8798","cf8e3656.7677e8","4ecdca5c.5158dc","cd260561.83b6e8","fecb8468.46893","815dd78d.0fcad","8f2be870.db157","c72e589.6a1fb28"],"x":634,"y":239,"w":572,"h":282},{"id":"c3bd1e6e.bf8798","type":"file in","z":"6fb28ca.5920374","g":"bd7929d8.398cb","name":"Get the image file","filename":"","format":"","chunk":false,"sendError":false,"encoding":"none","x":750,"y":360,"wires":[["cf8e3656.7677e8"]]},{"id":"cf8e3656.7677e8","type":"facial-recognition","z":"6fb28ca.5920374","g":"bd7929d8.398cb","image":"payload","settings":"settings","name":"","bindings":"CPU","FaceDetector":"SsdMobilenetv1","FaceDetector_SsdMobilenetv1_maxResults":"4","FaceDetector_SsdMobilenetv1_minConfidence":"0.4","FaceDetector_tinyFaceDetector_inputSize":"416","FaceDetector_tinyFaceDetector_scoreThreshold":".4","Tasks":"detectAllFaces","FaceLandmarks":true,"FaceExpressions":true,"AgeAndGender":true,"FaceDescriptors":true,"Face_Recognition":"Face_Recognition_enabled","Face_Recognition_enabled_path":"/Users/Paul/facial/labeled_faces","Face_Recognition_distanceThreshold":"0.7","x":750,"y":400,"wires":[["cd260561.83b6e8","fecb8468.46893","c72e589.6a1fb28"]]},{"id":"4ecdca5c.5158dc","type":"comment","z":"6fb28ca.5920374","g":"bd7929d8.398cb","name":"This is where the image is processed","info":"","x":810,"y":280,"wires":[]},{"id":"cd260561.83b6e8","type":"debug","z":"6fb28ca.5920374","g":"bd7929d8.398cb","name":"Full results of the processing","active":false,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","statusVal":"","statusType":"auto","x":1040,"y":400,"wires":[]},{"id":"fecb8468.46893","type":"change","z":"6fb28ca.5920374","g":"bd7929d8.398cb","name":"copy msg.payload.labeledFaceDescriptors to context","rules":[{"t":"set","p":"labeledFaceDescriptors","pt":"flow","to":"payload.labeledFaceDescriptors","tot":"msg"}],"action":"","property":"","from":"","to":"","reg":false,"x":860,"y":440,"wires":[["815dd78d.0fcad"]]},{"id":"815dd78d.0fcad","type":"debug","z":"6fb28ca.5920374","g":"bd7929d8.398cb","name":"","active":false,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","statusVal":"","statusType":"auto","x":710,"y":480,"wires":[]},{"id":"8f2be870.db157","type":"link in","z":"6fb28ca.5920374","g":"bd7929d8.398cb","name":"Get Image","links":["f612d4f8.680d38","bb16f2af.7bf1f8"],"x":720,"y":320,"wires":[["c3bd1e6e.bf8798"]],"l":true},{"id":"c72e589.6a1fb28","type":"debug","z":"6fb28ca.5920374","g":"bd7929d8.398cb","name":"Matched 
name","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"payload.Result[0].match._label","targetType":"msg","statusVal":"","statusType":"auto","x":1000,"y":360,"wires":[]},{"id":"d32a1e0a.86271","type":"group","z":"6fb28ca.5920374","style":{"stroke":"#999999","stroke-opacity":"1","fill":"none","fill-opacity":"1","label":true,"label-position":"nw","color":"#a4a4a4"},"nodes":["e797356c.e82b88","10d2815f.6a92a7","e6d749ed.c9d9d","fd5dfb5d.d1a7a8","2f9ad081.3d18a","40124966.875ea8","275d04cc.d65674","4d719cc.3f574e4","f612d4f8.680d38"],"x":54,"y":15,"w":992,"h":166},{"id":"e797356c.e82b88","type":"comment","z":"6fb28ca.5920374","g":"d32a1e0a.86271","name":"in case of errors ","info":"","x":940,"y":60,"wires":[]},{"id":"10d2815f.6a92a7","type":"catch","z":"6fb28ca.5920374","g":"d32a1e0a.86271","name":"","scope":null,"uncaught":false,"x":940,"y":97,"wires":[["e6d749ed.c9d9d"]]},{"id":"e6d749ed.c9d9d","type":"debug","z":"6fb28ca.5920374","g":"d32a1e0a.86271","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","statusVal":"","statusType":"auto","x":930,"y":140,"wires":[]},{"id":"fd5dfb5d.d1a7a8","type":"inject","z":"6fb28ca.5920374","g":"d32a1e0a.86271","name":"dummy image name will be in here","props":[{"p":"payload"}],"repeat":"","crontab":"","once":true,"onceDelay":0.1,"topic":"","payload":"tracy.jpg","payloadType":"str","x":240,"y":140,"wires":[["4d719cc.3f574e4"]]},{"id":"2f9ad081.3d18a","type":"inject","z":"6fb28ca.5920374","g":"d32a1e0a.86271","name":"Put your base filepath in msg.payload","props":[{"p":"payload"}],"repeat":"","crontab":"","once":true,"onceDelay":0.1,"topic":"","payload":"/Users/Paul/","payloadType":"str","x":250,"y":100,"wires":[["40124966.875ea8"]]},{"id":"40124966.875ea8","type":"change","z":"6fb28ca.5920374","g":"d32a1e0a.86271","name":"","rules":[{"t":"set","p":"basepath","pt":"flow","to":"payload","tot":"msg"}],"action":"","property":"","from":"","to":"","reg":false,"x":530,"y":100,"wires":[[]]},{"id":"275d04cc.d65674","type":"comment","z":"6fb28ca.5920374","g":"d32a1e0a.86271","name":"Enter your base path here - this will vary based on your OS - see examples","info":"Rpi - '$HOME/pi/'\nmacOS - '/Users/your_user_name/'\nWindows - 'C:/...' needa windows person to fill this in","x":360,"y":56,"wires":[]},{"id":"4d719cc.3f574e4","type":"change","z":"6fb28ca.5920374","g":"d32a1e0a.86271","name":"","rules":[{"t":"set","p":"filename","pt":"msg","to":"$flowContext('basepath')&payload\t","tot":"jsonata"}],"action":"","property":"","from":"","to":"","reg":false,"x":530,"y":140,"wires":[["f612d4f8.680d38"]]},{"id":"f612d4f8.680d38","type":"link out","z":"6fb28ca.5920374","g":"d32a1e0a.86271","name":"to Get Image","links":["8f2be870.db157"],"x":730,"y":140,"wires":[],"l":true}]


I am getting this same message after changing the example folder /example/labeled_face to my own set-of-faces folder. I checked for other kinds of files, but it only contains the jpg images. I am using the same folder structure as the example: /[base_folder]/[person_name]/pics.jpg
The error is thrown on the same file/line:
at inferFromImplicitShape (.../node_modules/@tensorflow/tfjs-core/dist/tf-core.node.js:884:19)

An additional question: is it possible to locate this folder outside of /node_modules?

Absolutely! Make sure to change the path to the labeled faces in the node itself. On my Pi, I created a folder in the Pi's home directory called 'facial'. In that I put the 'labeled_faces' folder, and then I edited the facial-recognition node and changed the 'KnownFacesPath' like this:
[Screenshot: the facial-recognition node's edit dialog with KnownFacesPath pointing at the Pi's facial/labeled_faces folder]

For a similar setup on my Mac, I used '/Users/Paul/facial/labeled_faces'

I've found it is best to crop the images in the labeled_faces folder to just show the head, and the file size should be under 100k. On a Pi, an input image that was too big (1.8M) caused a 'Resource exhausted: OOM' (Out Of Memory) error, so smaller is better.
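One way to bulk-shrink a labeled_faces folder beforehand (just a suggestion; this uses the sharp npm package, which is not part of the node, and the folder path is an example):

const fs = require('fs');
const path = require('path');
const sharp = require('sharp');

const dir = '/home/pi/facial/labeled_faces/person_name'; // example path

for (const name of fs.readdirSync(dir)) {
  if (!/\.jpe?g$/i.test(name)) continue;                // only touch jpgs
  sharp(path.join(dir, name))
    .resize({ width: 640, withoutEnlargement: true })   // cap the width
    .jpeg({ quality: 80 })                              // recompress
    .toFile(path.join(dir, 'small_' + name))            // write a shrunken copy
    .catch(err => console.error(name, err));
}

Either way, smaller labeled images keep memory use down on a Pi.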
