How to use external NVR (Frigate, BlueIris, ...) with Node-RED

Bart, what is considered motion by Frigate? Also sudden changes in background light conditions that occur without any other "obvious" motion might be recognized as motion, for example if some of your cameras "see" parts of a partly cloudy sky?

(And sudden shadows are exactly that)

Hey Walter,
That is a good question. I have not really found a compact summary of how motion works. I only found some details that might give you an idea:

  1. It seems they have implemented background subtraction. An old acquaintance from our community discusses it here.
  2. Via the Frigate config you can configure a series of motion-related settings. For example the threshold value is a critical parameter in motion detection, determining how much change in a pixel's luminance is necessary for it to be recognized as motion. Here they explain that you need to adjust the sensitivity to avoid motion being triggered by moving grass, lighting changes, and so on (a minimal sketch of these settings follows after this list).
  3. Once motion is detected, Frigate tries to group nearby areas of motion together, in the hope of identifying a rectangle in the image that captures the area worth inspecting. These are the red motion boxes you see in the debug viewer.
  4. Motion masks prevent motion from being detected. But they don't prevent objects from being detected in that area, because the object detection might have been started by motion in unmasked areas.
  5. Motion masks should not be used to avoid detecting objects in specific areas. Instead use a required zone. See more details in this article.
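For completeness, here is a minimal sketch of the motion settings I mean, as far as I understand the Frigate config reference. The values are just examples (and the mask coordinates are made up), so don't copy them blindly:

motion:
  # Minimum change in a pixel's luminance before it counts as motion
  threshold: 30
  # Minimum size (in pixels) of a changed area before it counts as motion
  contour_area: 10
  # Helps motion detection in low-light scenes
  improve_contrast: true
  # Area(s) where motion is ignored, e.g. a timestamp overlay or a waving tree
  mask:
    - 0.00,0.00,0.30,0.00,0.30,0.05,0.00,0.05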

Based on their tip in the last point, I changed my driveway setup tonight:

In my AI recordings today I saw a delivery van on my driveway: it was suddenly there, but you can't see it arriving. I "think" that is because I had put a mask on the public street, so the van wasn't detected there at all. Which means the van's object lifecycle only starts as soon as it is on my driveway. I don't know for sure; I will keep the new setup running for a few days.

Although it might use more CPU that way, because motion on the street is now also detected. Again, I don't know yet.

Bart,

I was looking at your image and wondered if that hedge should be included?

If someone stands close to it, would you miss detecting them?

BTW the black area in my last screenshots is the privacy mask that I have applied via the user interface of my Reolink camera, to make sure I respect the privacy of my neighbours. Unfortunately Reolink cams don't offer a polygon mask; instead you can add up to a maximum of 4 filled rectangles. Which not only looks ugly, but is also very inefficient, as explained below.

Last night I had used larger rectangles to also mask the public street, because I am not interested at all in what other people are doing. Moreover, in Belgium it is legally not allowed to film public areas, which I agree with. However, as a result of the large privacy mask, you cannot even record (higher) vehicles on your own property. For example the delivery van that arrived today was not fully visible:

You've got a point. I need to test that as well, but I don't think it will be a problem, because the bounding box is a bit larger than the person (i.e. there is some margin around it). And in the zone docs you can find this info:

Presence in a zone is evaluated based on the bottom center of the bounding box for the object. It does not matter how much of the bounding box overlaps with the zone.

Since the bottom center is inside the zone, the person should (theoretically) be detected:


Although you might be completely right. I don't think the COCO SSD model has been trained on stick figures. My god, if these guys attack my house in huge numbers, I won't even detect them :cold_face:

Image from alarmy


Of course you can read 100 books about it, but the proof of the pudding is still in the eating:


You should be OK as you are a bit taller than a stick figure :rofl: :facepunch:


That is actually not very flexible, but if you have to use it, you have to. Isn't it instead better/possible to let Frigate apply a privacy mask that you have drawn with a more suitable tool?

I know it doesn't help, but I like the way AXIS have implemented privacy masking in their cameras, where they blur detected people (I assume this is an approved solution that fulfils the regulations). With this technique you still get a well-preserved overall overview.

Check out this live camera in Hungary that I found (without knowing for sure, I assume it is possible to define zones where blurring shall occur and where not):

http://mail.bekescsaba.hu:8080/camera/index.html?imagePath=/mjpg/video.mjpg&size=1#/video

Regarding dynamic motion masks, I did have those activated earlier, but I could not make them work well in the system I use. On a windy day, you could walk in without being detected. Maybe I did not configure things correctly; anyway, I do not use them anymore.

Yes, that would have been my favorite solution, because then all the masks are applied in one central component. However, as you can see here on Reddit, Frigate does not support privacy mask drawing, because then all the frames would need to be decoded to draw the privacy mask, and encoded again.

As you can see in that same discussion, it can be done with the provided ffmpeg command, by re-encoding the rtsp stream and applying a filter overlay mask. But I assume that will consume too many resources on my Raspberry Pi.
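Purely as a hypothetical sketch (I have not tried this myself, and the exact keys, default segment arguments and filter coordinates are assumptions on my part, so verify them against the Frigate and ffmpeg docs): the idea from that discussion would be to override the record output args so the stream is re-encoded with a black box burned over the area to hide, instead of the default stream copy:

cameras:
  cam_oprit_garage:
    ffmpeg:
      output_args:
        # Re-encode the recording and draw a filled black rectangle as a
        # privacy mask. Decoding and encoding every frame is CPU heavy,
        # which is exactly why I don't want this on my Raspberry Pi.
        record: >-
          -f segment -segment_time 10 -segment_format mp4
          -reset_timestamps 1 -strftime 1
          -vf drawbox=x=0:y=0:w=600:h=250:color=black:t=fill
          -c:v libx264 -preset ultrafast -an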

It seems I overlooked a few things:

  • It is not enough to draw a zone. You also need to specify in the config (of this particular camera) which alerts/detections are limited to that zone (see docs, and the snippet after this list). For example, otherwise this car is detected while it is completely outside the area of my driveway:

  • It seems you need to take some margin into account between the zone and the area of no interest (in my case the public street), especially when viewing at an angle. For example, this car is completely outside my driveway, but the bottom center of its bounding box is briefly within my driveway zone:

In both cases a false positive alert occurred.
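For reference, this is the part of the per-camera config I mean (a stripped-down version of the full config that I post further down in this thread):

cameras:
  cam_oprit_garage:
    review:
      alerts:
        # Only raise an alert when the object is inside the driveway zone
        required_zones:
          - Driveway
      detections:
        # Same restriction for the lower-severity detections
        required_zones:
          - Driveway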

I did something wrong somewhere, because when I now want to play back the video of my events, the progress bar keeps spinning :frowning_face:

In the developer tools I see that - when I click on a review item in my user interface - it cannot be found anymore for some reason:

Although I like Frigate a lot, I find it very hard to set up this way...
I probably did something wrong in my yaml, but I have no idea what.
So unfortunately I again need to start wasting my precious free time digging into the details :frowning_face:

I thought that it was enough to tell Frigate which objects it needs to detect, and for which of those objects you want to get an alert (i.e. a review item, or an event as it was called in Frigate versions 0.13 and below). But it seems there are 3 things you need to configure. I "think" it is correct to visualize it like this:

What happens:

  1. By default the COCO SSD model can detect 90 different object types in your images. If you use a paid Frigate+ model, it will only detect object types relevant for video surveillance, and it will detect those types better (with higher confidence scores). But let's continue here with the default model.
  2. Since most of us are not interested in our house being attacked by a group of snowboards or bananas, we need to tell the object tracking which objects need to be tracked. By default that is person and car only. All other types are ignored from here on.
  3. Specify for which object types you want to get an alert/event.
  4. If there are other object types detected in the image (which have not triggered an alert/event), these will be labelled as a "detection". But you can again filter which object types are allowed to pass.
  5. Both for alerts and detections you can specify whether a zone is required. We only continue in our diagram if no zone is required, or if the (bottom center of the) bounding box is inside that zone.
  6. In both cases a review item will be created (a minimal config covering these layers is sketched after this list).
    Note: the former "event" would result in 2 separate events, which meant you had to search through multiple videos. A review item, in contrast, combines everything.
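Put into config terms, my current understanding is roughly the sketch below. The labels and the zone name are simply the ones from my own setup, so treat it as an illustration rather than a recommended config:

objects:
  # (2) which of the 90 COCO object types are tracked at all
  track:
    - person
    - car
    - dog
cameras:
  cam_oprit_garage:
    review:
      alerts:
        # (3) which tracked objects qualify as an alert ...
        labels:
          - person
          - car
        # (5) ... and only when they are inside this zone
        required_zones:
          - Driveway
      detections:
        # (4) the remaining tracked objects become lower-severity detections
        labels:
          - dog
        required_zones:
          - Driveway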

If anybody knows the difference between a "detection" and an "alert", please let me know. It is not clear to me yet, because both result in a review item...

Now off to the daily job...

I hear ya! I had my Frigate working reasonably well although I hadn't tuned it much, preferring to get my Pi-cams working first, and now it has stopped detecting anything.

One thing I haven't tried yet is using an AI to troubleshoot. I've tried several and they aren't very good at this sort of thing, but Claude has helped with other config issues, even though it seemed to be guessing at every step. Still, its guesses were more efficient than relying on my own. You might see how you fare with it. YMMV

EDIT: I just looked at the Frigate docker logs and found...
Detection appears to be stuck. Restarting detection process

Now to find out why. :man_detective:

I did that already a couple of times. Sometimes it helps, and sometimes it doesn't. In the latter case it generated non-existing but logical-looking configs, or obsolete ones (based on config parameters that have been deprecated in the past). But it can be a good way to get started...

I learned a few things from this Reddit discussion.
While both "detections" and "alerts" are detected objects (with a bounding box, score percentage, label, ...) that trigger a review item (formerly called an "event"), there are some subtle differences:

  • Alerts have a higher severity level. For example:
    • if a person walks by, it is just a detection (i.e. less important).
    • if a person walks on your driveway, it becomes an alert (i.e. more important).
  • You can specify a different retention period, because typically you want to store videos of alerts for a longer time (see the snippet after this list).
  • It is used for categorization in the Frigate UI. For example, the small thumbnail images at the top of the Live dashboard are only shown for recent alerts (not for detections).
  • The severity is also communicated over MQTT on the /reviews topic, so third-party services can use it. Some services use this to send notifications, and you could for example make sure that only alerts result in a notification.
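The different retention period is what the record section of my own config (further down) is doing; a minimal version looks like this (the number of days is of course up to you):

record:
  alerts:
    retain:
      # Keep the more important alert clips for a month
      days: 30
  detections:
    retain:
      # Lower-severity detections can be purged sooner
      days: 7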

My custom base-path-related changes in my local Frigate codebase are still running fine:

  1. I now only needed to add an environment variable once to my Docker compose file:
    services:
      frigate:
         image: ghcr.io/blakeblackshear/frigate:stable-standard-arm64
         environment:
           - BASE_PATH=frigate
    
  2. Then I needed to add a forwarding rule once to the reverse proxy of my Tailscale daemon:
    sudo tailscale serve --https=443 --bg --set-path /frigate http://localhost:5000/frigate
    
  3. After that Frigate is available within my tailnet (VPN mesh network) via https://<my-raspberry-virtual-hostname>.<my-tailnet>.ts.net/frigate (again with LetsEncrypt certificates out of the box).
    So no need to remember port numbers anymore, like with all of my other web applications.
  4. The base path is now also automatically applied to the Frigate webmanifest file, so in Chrome on Android you just need to click "Add to home screen" and choose PWA (Progressive Web App). Which means I can now watch my video surveillance on my phone when not at home.

I only need to get my changes approved by the Frigate team, so I have kindly asked them again (see here).

Something strange happens. I have a zone drawn on top of my driveway, and I "think" my config requires that I only get alerts when there is an intersection between detected objects and my driveway zone:

record:
  enabled: true
  retain:
    # Don't store the continuous video stream (i.e. no 24/7 live recording)
    days: 0
    mode: all
  alerts:
    retain:
      days: 30
    pre_capture: 10
    post_capture: 10
  detections:
    retain:
      days: 30
    pre_capture: 10
    post_capture: 10
objects:
  track:
    - person
    - dog
cameras:
  cam_oprit_garage:
    enabled: true
    live:
      stream_name: rtsp_cam_oprit_garage_sub
    ffmpeg:
      inputs:
        - path: rtsp://127.0.0.1:8554/rtsp_cam_oprit_garage_sub
          input_args: preset-rtsp-restream
          # Use the cam's low-resolution sub stream for motion/object detection
          roles:
            - detect
        - path: rtsp://127.0.0.1:8554/rtsp_cam_oprit_garage_main
          input_args: preset-rtsp-restream
          # Use the cam's high-resolution main stream for recording
          roles:
            - record
    motion: {}
    review:
      alerts:
        # Only alert when an object enters the driveway (zone)
        required_zones:
          - Driveway
      detections:
        # Only create a detection when an object enters the driveway (zone)
        required_zones:
          - Driveway
    objects:
      track:
        - person
        - dog
        - car
    zones:
      Driveway:
        coordinates: 
          ...
        loitering_time: 0
        inertia: 3

Note (see the reference config) that by default there are alerts for person and car:

  alerts:
    # Optional: labels that qualify as an alert (default: shown below)
    labels:
      - car
      - person

And normal detections (with lower priority) for all other object types. But in my config I only track dogs, cars and persons. Which means that on my driveway, cars and persons would become alerts, while dogs would be normal detections (with lower importance).

So I would expect that I only get an alert when a car or person intersects with my driveway zone. However, cars way outside of my zone are also detected, and a review item is created:

I cannot find the root cause, and ChatGPT cannot explain it either :frowning_face:
So hopefully somebody else sees a stupid mistake in my config :folded_hands:

BTW I also see no recorded movie for these review items, which I also cannot explain at all...

Closing the loop here. Nothing wrong with my Coral; the problem is the PCIe card I added to an older machine that only has USB-A 2.x ports. The idea was that USB-C is faster with more power available, but not if the card is bad :wink: My Coral is working well, albeit slower, now that it's plugged into the native USB.

It also could be that it's buggy. It is beta software, and we know the docs are not 100% accurate or up to date. Not a knock on the devs, as most software is like this even later in life.

Bart, maybe you can tell me why my picam feed has about 1 s of latency when viewed with VLC, but a full 25 s when viewed on the Frigate dashboard.

I also have some high latency. Not sure when you experience yours, but in my case it is when I want to play back a recorded video of a review item. Since quite a long chain is traversed, lots of bottlenecks can cause this (SSD disk I/O, ffmpeg, CPU, ...). I have not had time yet to analyze why I sometimes see a large latency, but I think it won't be easy to find the root cause.