How to use external NVR (Frigate, BlueIris, ...) with Node-RED

No worries @BartButenaers, we are all your fans!!

4 Likes

Aah, I was only joking (sarcasm) - I think you started from the wrong place with Frigate. You would be far better off with the Google TPU on USB to ensure CPU utilisation stays low.

The main thing is to make sure that the detection is done on the lower frame rate.
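A minimal sketch of what that looks like in a Frigate `config.yml` - detect on the low-resolution substream at a few fps while recording the full-resolution main stream. The camera name, credentials and RTSP paths below are placeholders; check them against your own camera:

```yaml
cameras:
  driveway:                                            # placeholder camera name
    ffmpeg:
      inputs:
        - path: rtsp://user:pass@192.168.1.10:554/main # high-res main stream
          roles:
            - record
        - path: rtsp://user:pass@192.168.1.10:554/sub  # low-res substream
          roles:
            - detect
    detect:
      width: 640      # match the substream resolution
      height: 360
      fps: 5          # keep the detection frame rate low
```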

Remember as well - this is all in Docker - so it is very easy to blow it away and start again with a new pull and a new YAML config.

Craig

Thank you, Bart, for this thread. I instantly bought a Coral TPU and am very satisfied with the Frigate solution (in my case under UnRAID), in contrast to my former solution with Motion.
There is much to optimize, but it works so far.

4 Likes

If you don't have success with your current approach give my version2 system a look:
https://github.com/wb666greene/AI-Person-Detector-with-YOLO-verification-Version-2/tree/main
My goal is to get the false positives on "person detection" as close to zero as possible. My collection of false positives contains 17 images, mostly of bugs in front of the camera and my neighbor's cat, which is still occasionally falsely detected as a "person" with high confidence. Most of these are from when the system was running YOLO4 for the final verification; upgrading to YOLO8 made quite a difference, and I'm working on upgrading to YOLO11. It has been in operation for about 18 months with the YOLO verification.

NVR/DVR is not a part of it. I've had Lorex/Amcrest DVRs running since circa 2015 and rarely look at the live or recorded video - mostly for grins, like when I saw a raccoon climb up a tree, when a car crashed into a light pole across the street, and when a large city truck crashed into a tree limb. My system started as an add-on to the Lorex DVR, which is "good enough" for my limited 24/7/365 recording needs.

It is all python code with a simple node-red controller for sending the alerts and showing the most recently detected "person". Your superior skills could build on this starting point for a node-red interface that makes you happy. Regard my system as a black box to feed person detection images to node-red via MQTT. I use node-red exec nodes to launch scripts to start and stop the system, enable/disable the push alerts (detected person images are always saved) and remove images older than X days so the storage doesn't fill up.

The basic strategy is: I run MobilenetSSD_v2 on all images. When a person is detected I "zoom in", i.e. crop out the region of the detected person, and rerun the detection with a slightly higher detection threshold. If that passes, I hand the cropped image to a separate thread which runs the YOLO8 model, and if that is also a positive detection it is sent via MQTT to node-red for notification.
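The cascade described above could be sketched roughly like this - the detector callables and the thresholds are hypothetical stand-ins for the actual MobilenetSSD_v2 / YOLO8 models and tuning, not the author's real code:

```python
def crop(frame, box):
    """Cut the detected region out of a frame stored as rows of pixels."""
    x1, y1, x2, y2 = box
    return [row[x1:x2] for row in frame[y1:y2]]

def two_stage_detect(frame, detect, verify, first_thresh=0.5, recheck_thresh=0.6):
    """Fast detector on the full frame, re-check on the crop with a higher
    threshold, then a heavier verifier before anything gets published."""
    for label, score, box in detect(frame):
        if label != "person" or score < first_thresh:
            continue
        zoomed = crop(frame, box)                       # "zoom in" on the hit
        rescores = [s for l, s, _ in detect(zoomed) if l == "person"]
        if not rescores or max(rescores) < recheck_thresh:
            continue                                    # crop failed the re-check
        if verify(zoomed):                              # e.g. YOLO verification
            return zoomed                               # would go out via MQTT
    return None
```

With stub detectors you can see the gating behaviour: a crop only survives if the fast detector still agrees at the higher threshold *and* the verifier confirms it.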

With two TPUs a Pi5 should support three or four 4K cameras acceptably; a Pi4 might be usable. The cheapest way is to buy a refurbished i3 or better laptop with an integrated Intel UHD graphics chip (8th gen or better, see the OpenVINO docs) and a ~$60 USB3 TPU (or remove the laptop WiFi module and use an A+E key M.2 TPU, ~$25). The MobilenetSSD runs on the TPU, the YOLO8 runs on the integrated GPU. These systems are available "refurbished" for $120-180, with the display and RAM being the main price factors. Your kid's old "gaming computer" might be perfect if it has a CUDA-capable Nvidia GPU. A bottom-of-the-line Lenovo 11th-generation i3 "Ideapad" can do four 4K cameras at 3 fps each, using CPU AI for the MobilenetSSD_v2 detections and the iGPU for YOLO8 verification. Other test system results are given in the github README.

If you want to try the Pi4, you can start with one TPU and not enable the YOLO8 verification, which is fine if you are more tolerant of false alerts than I am. Raise issues on the github and I'll try to help out and improve my installation instructions. It might work on Windows if you have the "right" TPU drivers - I gave up Windows a long time ago - and since everything is python, node-red and OpenVINO (if using the iGPU), all of which are available for Windows, the main porting work is that the control scripts will need to be redone.

1 Like

As a result of this thread, I decided to 'bite the bullet' and buy a RPi5 and Coral TPU to go with my recently bought Reolink Duo 2 POE.

My first few attempts at using Docker and Frigate ended with me starting from scratch several times after finding Frigate not working. It seems that Frigate does not like a badly formed config.yml file; in my case, it just seemed to obstinately refuse to load.

Once I had got that sorted out, I then tried to install the Coral USB TPU. I couldn't get the `lsusb` output to change from `Global Unichip Corp.` to `Google Inc.`, with the added complication of the version of Python and using `pyenv`, and got frustrated with the software telling me I had the wrong version of Python (PEBKAC).

Eventually decided to get Coral TPU working first and then go with the Docker and then Frigate installs. Cracked it!!

Still getting some of the YAML config wrong, but just revert the changes and all returns to working. Still fine tuning, but getting places now, just need to learn how to display the MQTT snapshot in Node-RED!

Thank you all for your input!

4 Likes

@mudwalker - Nice job! For the snapshot, check the traffic-monitor ui-monitoring flow: I use the Frigate HTTP API to grab the snapshot frame from the MQTT event type='end', using the event id. This displays the last 5 events on the UI dashboard with template code:

[{"id":"e5391514131a2b5a","type":"function","z":"28627559bebdc324","g":"9b9e4fc50b744fc3","name":"store last N radar events","func":"//events_recent_radar should show the last N events where payload.entered_zones CONTAINS zone_radar\n\nvar myEvents = flow.get(\"events_recent_radar\") || [];\n//Keep only last X elements\nvar myArrLength = 5;\n\n//invert array so OLDEST IS FIRST, for functions\nmyEvents.reverse();\n\nif (myEvents.length >= myArrLength) {\n    //remove first element (oldest event)\n    myEvents.shift();\n}\n\n//add current event to end of array\nmyEvents.push(msg.payload);\n\n\n//invert array so NEWEST IS FIRST, for display\nmyEvents.reverse();\n\nflow.set(\"events_recent_radar\", myEvents);\n\nmsg.payload = myEvents;\n\nreturn msg;","outputs":1,"timeout":0,"noerr":0,"initialize":"// Code added here will be run once\n// whenever the node is started.\n\nif (flow.get(\"events_recent_radar\") === undefined) {\n    flow.set(\"events_recent_radar\", [])\n}\n","finalize":"","libs":[],"x":760,"y":1900,"wires":[["c36c71706e494b5e","3214e8cd3ea79630"]]},{"id":"c36c71706e494b5e","type":"ui_template","z":"28627559bebdc324","g":"9b9e4fc50b744fc3","group":"7866c2aa313ab8b9","name":"last N radar events","order":2,"width":0,"height":0,"format":"<p class=\"label nr-dashboard-chart-title nr-dashboard-chart-titlel\">last N zone_radar events for radar_camera</p>\n<table class=\"table\">\n    <tr ng-repeat=\"payload in msg.payload\">\n        <td><img src=\"data:image/jpg;base64, {{payload.thumbnail_base64jpg}}\" alt=\"thumbnail\" /></td>\n        <td>\n            id: {{payload.id}} <br /> \n            label: {{payload.label}} <br />\n            top_score: {{payload.top_score}} <br />\n            frame_time: {{payload.frame_time_datestring}} <br />\n            direction: {{payload.direction_calc}} <br />\n            speed: {{payload.speed_calc}}\n        </td>\n    
</tr>\n</table>","storeOutMessages":true,"fwdInMessages":true,"resendOnRefresh":true,"templateScope":"local","className":"","x":1060,"y":1900,"wires":[[]]},{"id":"0f65ceafe3f7c069","type":"function","z":"28627559bebdc324","g":"9b9e4fc50b744fc3","name":"format last N radar event for output","func":"\n\nconst newMsg = {};\nnewMsg.payload = {}; // contain everything going into a DB record, single record\n\nnewMsg.payload.id = msg.frigate_event.id;\nnewMsg.payload.camera = msg.frigate_event.camera;\nnewMsg.payload.label = msg.frigate_event.label;\nnewMsg.payload.sub_label = msg.frigate_event.sub_label;\nnewMsg.payload.top_score = msg.frigate_event.top_score;\n\nnewMsg.payload.frame_time_datestring = new Date(msg.frigate_event.frame_time * 1000).toLocaleString();\nnewMsg.payload.frame_time = msg.frigate_event.frame_time;\nnewMsg.payload.start_time = msg.frigate_event.start_time;\nnewMsg.payload.end_time = msg.frigate_event.end_time;\n\nnewMsg.payload.entered_zones = msg.frigate_event.entered_zones; //array\nnewMsg.payload.direction_calc = msg.frigate_event.direction_calc;\nnewMsg.payload.speed_calc = msg.frigate_event.speed_calc;\n\nnewMsg.payload.thumbnail_base64jpg = msg.frigate_event_api_thumbnail;\n\nnewMsg.payload.location = msg.frigate_event.location;\n\nreturn newMsg;\n\n\nreturn msg;","outputs":1,"timeout":0,"noerr":0,"initialize":"","finalize":"","libs":[],"x":470,"y":1900,"wires":[["e5391514131a2b5a","d66541089b01306e"]]},{"id":"3214e8cd3ea79630","type":"debug","z":"28627559bebdc324","g":"9b9e4fc50b744fc3","name":"debug: last N radar events","active":false,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","statusVal":"","statusType":"auto","x":1080,"y":1840,"wires":[]},{"id":"f5d2d92d50d18ce2","type":"group","z":"28627559bebdc324","g":"9b9e4fc50b744fc3","name":"frigate event api for 
thumbnail","style":{"label":true},"nodes":["09f99c931e5535bf","08121a31488784bc","9226865e3385eb56","aa0afb5180297e2b"],"x":304,"y":1739,"w":612,"h":122},{"id":"09f99c931e5535bf","type":"http request","z":"28627559bebdc324","g":"f5d2d92d50d18ce2","name":"frigate http api for thumbnail","method":"GET","ret":"bin","paytoqs":"ignore","url":"http://localhost:5000//api/events/{{{frigate_event.id}}}/thumbnail.jpg","tls":"","persist":false,"proxy":"","insecureHTTPParser":false,"authType":"","senderr":false,"headers":[],"x":450,"y":1780,"wires":[["9226865e3385eb56","aa0afb5180297e2b"]]},{"id":"08121a31488784bc","type":"change","z":"28627559bebdc324","g":"f5d2d92d50d18ce2","name":"set frigate_event_api_thumbnail","rules":[{"t":"set","p":"frigate_event_api_thumbnail","pt":"msg","to":"payload","tot":"msg"}],"action":"","property":"","from":"","to":"","reg":false,"x":760,"y":1820,"wires":[["0f65ceafe3f7c069"]]},{"id":"9226865e3385eb56","type":"image","z":"28627559bebdc324","g":"f5d2d92d50d18ce2","name":"","width":"175","data":"payload","dataType":"msg","thumbnail":false,"active":false,"pass":false,"outputs":0,"x":710,"y":1780,"wires":[]},{"id":"aa0afb5180297e2b","type":"base64","z":"28627559bebdc324","g":"f5d2d92d50d18ce2","name":"","action":"","property":"payload","x":550,"y":1820,"wires":[["08121a31488784bc"]]},{"id":"7866c2aa313ab8b9","type":"ui_group","name":"Live updates","tab":"665d947f9a312a33","order":2,"disp":true,"width":12,"collapse":false,"className":""},{"id":"665d947f9a312a33","type":"ui_tab","name":"Monitoring Tab","icon":"dashboard","order":1,"disabled":false,"hidden":false}]
2 Likes

Thank you for this! I will try it out and learn.

I am heading towards the @krambriw school of thought of just recording events. I do have a flow that uses ONVIF to send events from the camera to Telegram - thanks to @BartButenaers. But I see Frigate being more accurate in its detection/inferencing, and more flexible in its use.

Frigate in a single dashboard, possibly for cameras of different manufacturers, with good communication/control from Node-RED, is what I am after.

Just to help you avoid re-inventing the wheel (forgive my Frigate advertisement): Frigate also has a good Event Review UI and other controls such as automated PTZ camera control, so I lean on that app to do what it does rather than building a new UI myself.

The power of Node-RED for this setup comes in when doing something new with the Frigate events via MQTT messages and/or the HTTP API, as you mention with the Telegram message example. You can even set up zones in Frigate and parse who/what stopped in or passed through an area, model inference stats, additional attributes, etc. E.g. I attach radar-detected speeds to events that pass through a zone I have set up on a street, using Node-RED and the Frigate event.
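As a rough sketch of that kind of event handling: the field names below follow Frigate's `frigate/events` MQTT payload (`type`, `after.label`, `after.entered_zones`), but the zone name and the set of fields returned are purely illustrative - adapt them to whatever your flow needs:

```python
import json

def event_in_zone(raw_payload, zone, label="person"):
    """Return the interesting bits of a finished Frigate event whose object
    entered the given zone, or None if the message doesn't match."""
    evt = json.loads(raw_payload)
    if evt.get("type") != "end":
        return None                      # only act when the event has finished
    after = evt.get("after", {})
    if after.get("label") != label:
        return None
    if zone not in after.get("entered_zones", []):
        return None
    # the fields a dashboard/alert flow typically wants
    return {
        "id": after.get("id"),
        "camera": after.get("camera"),
        "top_score": after.get("top_score"),
    }
```

In Node-RED the same filtering would live in a function node fed by an MQTT-in node subscribed to `frigate/events`; the returned `id` is what you use against the HTTP API for the thumbnail, as in the flow above.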

1 Like

Thank you again.

I am still familiarising myself with some of the more basic stuff, but I see that there is also the ONVIF protocol available to be used. I have also found the Review page with the timeline/Filter/Events etc. - very nice.

In browsing through the Frigate reference and the examples I have come across, I thought I could record with audio from the Reolink main stream and detect on the substream, but that lost me the moving streams of any event - you live and learn, but I will have another go!!

Basically I am familiarising myself with Frigate and its MQTT options; the Node-RED side I can handle - MQTT and generating my Telegram messages from that. I have now downloaded @Steve-Mcl's node-red-contrib-image-tools, which should do all I need, but that can wait a few days!! I learn by doing though, which can lead me up the garden path.

Thank you for the encouragement.

1 Like

Also, if you are having problems with YAML formatting (between it and Python, I do not know which is more stupid!!):

I have found that a cut and paste of the YAML into ChatGPT, asking for proper formatting, works in 99% of cases.

Just make sure you have your passwords obfuscated (or referenced elsewhere).

Craig

1 Like

Hi guys,

Apologies for not responding earlier!!!
Towards the end of the year, I got very negative feedback (via various channels) on my developments (dashboard D2 related). I was completely fed up with open-source stuff, so I decided to stay away from Discourse for some time, until my mindset was a bit OK again. Anyway, I do not want to discuss this further, and hopefully the Node-RED fun factor comes back...

Last weekend I have been playing a bit with Frigate. Just love it. A quick summary of what I did:

  • I have installed an aluminium plate on both my driveways to alert people that I have video surveillance.
    Before I install extra cams in the public area of my garden, I will need to register them online with the government and create a logbook of my recordings. This is all legally required in Belgium; otherwise you can get sued, and you cannot use your images in case of a burglary.
  • My Frigate crashed continuously the entire weekend, every time I switched in the web ui to the live camera view. After a LOT of headache about that memory segmentation fault, I found that it had already been fixed two days ago, but the fix is not released yet. After I applied the fix to the python code manually inside my docker container, it is now running stable :champagne:
  • You guys are soooo right that I should not try to show the video in my Node-RED dashboard. The Frigate web interface is very powerful, and there is no point putting a huge amount of time into desperately trying to mimic that in our dashboard. Thanks for convincing me about that :+1:
  • However that means that I need to have the Frigate web ui available on my smartphone, so that I can still have a look at my cameras even when I am not at home. Therefore I wanted to make their web ui available via my Tailscale tailnet network:
    sudo tailscale serve --https=443 --bg --set-path /frigate http://localhost:5000/frigate
    
    That works. But it seems that Frigate does not allow sub-paths (e.g. /frigate). You can only achieve that by adding an http header X-Ingress-Path containing the sub-path. However the reverse proxy of my Tailscale agent doesn't support that, so I needed to fix the Frigate Nginx config file inside the Docker container manually (using this workaround). That is of course not a future-proof solution, but at least it works now, and I can access the Frigate web ui via https://<my-rpi-virtual-device>.<my-tailnet>.ts.net/frigate nicely with LetsEncrypt certificates :champagne:
  • Finally I wanted to create a shortcut icon on my Android phone, to access the Frigate web ui easily. But when I ask Chrome to add it to my home screen (both as a PWA and as a plain url shortcut), Chrome always removes the sub-path /frigate, so I still cannot access it via an icon :frowning_face:. That seems to be a known issue with their PWA manifest file, so I have responded to that and hopefully they can fix it.

About MQTT. Forgive me if I overlooked it somewhere, but:

  1. Can somebody share a simple flow to allow Node-RED to get a message when a person is detected on some camera?
  2. Does anybody know whether it is possible to turn object detection for cameras ON and OFF via MQTT? Currently Frigate is doing object detection and recording a lot while I am working in the garden, which is useless (eating CPU and disk space) and pollutes my Event View, seeing me walking by everywhere. I only want to do object detection and (as a result of those detection events) recording when I turn it ON via my dashboard D2.

Thanks!!!

@wb666greene Very impressive stuff you have made!! Unfortunately I won't ever find the time in my life to spend on it. For me it is now Frigate or nothing, I am afraid; otherwise it will never happen for me...

I plan to use node-red-contrib-facial-recognition

@smcgann99
Seems that last week the Frigate guys created a draft pull request (see here) for their next major 0.16 release. It is very early days, but I see that they have already added some interesting keywords to their documentation: facenet and alpr. Spoiler alert :wink: Not sure if that is going to happen (because I don't see it on their roadmap), but if my assumption is correct they will implement facial recognition and license plate recognition, which would be awesome to have for my home automation...

I will need to have a look how I can buy these guys some coffees for their hard labour...

1 Like

Hi Bart,

Nice to see you back here :slight_smile:

Change of plan, I'm now working on forking these nodes to make them more user friendly.
The detection accuracy seems to be better than the original node I tried.

1 Like

That looks interesting!!

However - only for the people that use Frigate of course - it would be very interesting if Frigate could do that by itself, because they already have the decoded raw image buffers in memory in their Python code, which means they could do those extra calculations with as little overhead as possible.

I have not been following this thread, but can you not use
sudo tailscale serve --https=443 --bg --set-path /frigate http://localhost:5000/
or am I missing something?

jardin being the name of one of my cameras

Sends a debug message (or telegram or whatever...) if a person is detected by jardin.

Sending ON/OFF to detect/set of a particular camera will switch detection on/off.

Typically, in my case, when the doors to the garden are open (so my wife, or I are out), I disable detection.

The full list of MQTT commands is described here: MQTT | Frigate
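For reference, the toggle topics follow the pattern `frigate/<camera>/<feature>/set` with an `ON`/`OFF` payload. A tiny sketch of a helper that builds the message pair (the camera name below is just the example from this thread; publish the result with an MQTT-out node, mosquitto_pub, or any MQTT client):

```python
def frigate_toggle(camera, feature="detect", on=True, prefix="frigate"):
    """Build the (topic, payload) pair for Frigate's MQTT set topics."""
    # Features documented in Frigate's MQTT reference; anything else is a typo.
    if feature not in {"detect", "recordings", "snapshots", "motion"}:
        raise ValueError(f"unknown Frigate toggle: {feature}")
    return f"{prefix}/{camera}/{feature}/set", "ON" if on else "OFF"
```

E.g. `frigate_toggle("jardin", on=False)` yields `("frigate/jardin/detect/set", "OFF")` - exactly the message you would wire to the "doors open" trigger described above.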

2 Likes

Thanks @greengolfer
Really appreciated!! Going to try that tonight!

Colin,
Such a rule for the reverse proxy would behave like this:

  1. In your browser you navigate to https://<your-raspberry-virtual>.<your-tailnet>.ts.net/frigate.
  2. The tailscale agent reverse proxy will forward that - based on your rule - to localhost port 5000, where Frigate is listening.
  3. Frigate returns its html page. So far so good...
  4. That page contains links to css, js, ... resources (relative links) which your browser will try to fetch.
  5. The http requests for those resources will arrive at the reverse proxy (of your tailscale agent). However their urls don't contain the sub-path prefix /frigate/..., so the reverse proxy won't know that those http requests need to be forwarded to Frigate. Since the reverse proxy cannot find the specified resources anywhere, it will return an http error saying the specified resource cannot be found.
  6. Since no resources can be loaded, you will end up with a blank white window...

So you need to be able to specify in Frigate that it should use /frigate as base path, similar to how you use httpAdminRoot to access your Node-RED editor.

However Frigate unfortunately does not allow you to specify such a base path. You can only pass the base path as an http header X-Ingress-Path, but the Tailscale reverse proxy does not support setting that. In that case you would need to install an Nginx or something like that, which is complete overkill for this. So I need to have another look at Frigate, to see whether I can create a pull request...
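For illustration, if one did put an Nginx in front, the whole trick would be a single header. This is only a sketch of the idea (the paths and port are the ones from this thread; it has not been tested against Frigate's bundled Nginx):

```nginx
location /frigate/ {
    # strip the sub-path before forwarding to Frigate...
    proxy_pass http://localhost:5000/;
    # ...and tell Frigate which base path to use in the pages it serves
    proxy_set_header X-Ingress-Path /frigate;
}
```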

But yes good question!!

There is something I don't understand here. Not having Frigate, I experimented with the Node-RED routes. If in Tailscale I proxy /db to ..:1880/dashboard, that works fine through the tailscale domain/db. As I expected, relative routes in the dashboard are requested by the browser relative to the url that it knows, which is /db.

However, if I set httpAdminRoot to /flow-editor and then proxy /editor to ...:1880/flow-editor, then relative routes in the editor do not work. I can only get the editor to work if I proxy /flow-editor to /flow-editor, which is what you found with Frigate, I think. Can you explain that?

@greengolfer
Thanks to your feedback I was able to setup two-way communication (via MQTT) between Node-RED and Frigate in no time :heart_eyes: This is really powerful stuff.

That is a very good question. I need to digest it. Please send me a reminder in a couple of days, when I will have forgotten about it. Unless somebody else can explain it in the meantime, of course.

1 Like

@mudwalker
Yes, I am also struggling with the yamls. Of course the yaml format is very simple, but somehow my brain seems able only to mess up the content over and over again. I really hope some contributors pop up in the near future, to make it all adjustable via the Frigate user interface...

Thanks for the pointer about the MQTT snapshot! I wasn't even aware that the snapshot images were automatically sent via MQTT, but indeed there it is:

And using the node-red-contrib-image-output node it is very easy to preview the snapshot images inside a Node-RED flow:

BTW the snapshot images seem to contain the bounding boxes and score (percentage). However it seems not to be possible to have those inside the recordings, since that would be tremendously slow. Which is quite obvious of course...

2 Likes

@wb666greene
You are way ahead of me, with your professional setup.

At the time I was creating my own implementation of object detection on a Coral TPU stick in Javascript via tfjs. Lots of playing with tensor objects. One of my biggest problems was that the scores of real humans were very close to those of false positives. E.g. a bush moving in the wind got a score of 57%, and me walking through the garden got 62%, which was way too close. I never found what caused that...

Anyway, it looks like the Frigate team did a much better job. Every time a family member passed by a camera this weekend, a person was detected. And I got - until now - no false positives. Of course I am sure that will happen soon... But so far it looks OK to me, although I need to do some extra tests (e.g. running, night, ...).

But I was wondering how you collect those false positive statistics? Do you simply write them down and count them once in a while? Or do you have some overengineered algorithm that does all this out of the box for you?

1 Like