How to use external NVR (Frigate, BlueIris, ...) with Node-RED

Enabling snapshot images is also very easy in the config.yaml file:

snapshots:
  # Store snapshot jpg images (for 30 days) in /media/frigate/clips for each object that is detected
  enabled: True
  retain:
    default: 30

The snapshot images are automatically stored on your filesystem, in my case a Samsung SSD T5 USB disk (attached to my Raspberry Pi 4). For each snapshot image it seems to store a jpg (which contains the bounding boxes) and a png (which contains the original image):

And those snapshot images can easily be viewed in the "Explore" tab of the web interface, when you click one of the events:

This is particularly useful in case of false positives. Today I had my first one, because the AI thought it had seen a "dog". After watching the recorded video I saw no dog at all. So I looked at the snapshot image of that event, and it seems one of the bushes in my garden looked like a dog to the AI:

But now at least I know what the AI did wrong, so I can fine-tune the detection: by experimenting with thresholds, by applying motion/detection masks, by applying max/min limits to the bounding box sizes, and so on...


Yes Bart, I do this as well. Finding settings that will filter out detections so that only those that "most likely" match your targeted objects remain (in my case "persons"; everything else is irrelevant to me)

Threshold or confidence level is a given one to experiment with. Two others that have proven very useful for me are bbox ratio and area. I calculate the ratio as bbox width/height and the area as width*height

In the analyser I have per-camera tables of settings for confidence level, max ratio, min area and max area. In Python code the evaluation looks like this:

if probability >= conf_lev[cam] and ratio < ratios[cam] and area > min_area[cam] and area < max_area[cam]:

So in my case I filter out detections that don't match those criteria. This means you hopefully catch only "realistic" objects that are more likely to "fit inside" the bbox, while filtering out those where the bbox is unrealistically small or large, or has the wrong geometry
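Putting the per-camera settings table and those criteria together, a minimal self-contained Python sketch could look like this (the camera name and all threshold values below are illustrative placeholders, not my actual tuned settings):

```python
# Sketch of the per-camera post-filter described above.
# All numbers here are made-up examples; tune them against logged detections.

# Per-camera settings: confidence level, max bbox ratio (w/h),
# and min/max bbox area (w*h) in pixels.
SETTINGS = {
    "cam1": {"conf_lev": 0.6, "max_ratio": 0.85,
             "min_area": 4000, "max_area": 90000},
}

def passes_filters(cam, probability, w, h):
    """Return True if a detection with bbox width w and height h survives."""
    s = SETTINGS[cam]
    ratio = w / h   # persons are taller than wide, so a ratio < 1 is expected
    area = w * h    # rough proxy for distance: far objects give small boxes
    return (probability >= s["conf_lev"]
            and ratio < s["max_ratio"]
            and s["min_area"] < area < s["max_area"])
```

For example, a 60x150 px box at probability 0.8 passes (ratio 0.4, area 9000), while a 200x100 px box is rejected because its ratio of 2.0 does not fit a standing person.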

I'm not sure you are able to do this kind of "user scripting" in Frigate, but this is how I have added rules on top of the basic detection, a kind of post-analysis


Just to add something about the post-analysis I do: the normal object detection runs on 2D images (x, y) and takes no account of the distance (z) to the objects, regardless of whether it is MobilenetSSD, YOLO, etc. So no 3D analysis is made. I felt I wanted to try to improve the analysis by also considering depth. If you look at a 2D image you are most likely not interested in objects that are too far away, and maybe also not in objects that are too close. By calculating the size of the bbox I should be able to set a reasonable starting and ending point on the z-axis, basically cutting a "slice" with a defined thickness out of the "infinite" depth
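Under a simple pinhole-camera assumption, the bbox height maps roughly to distance, which is why area limits behave like a depth slice. A rough sketch (the focal length in pixels and the person height are made-up illustrative values, not measured ones):

```python
# Pinhole-camera approximation: the pixel height of an object scales as
#   h_px ~ f_px * H_real / z
# so min/max bbox sizes implicitly select a z-range (the "slice").

def approx_distance(f_px, real_height_m, bbox_height_px):
    """Estimate distance z (metres) of an object from its bbox height."""
    return f_px * real_height_m / bbox_height_px

# Example: with an assumed f_px = 800 and a 1.8 m tall person,
# a 180 px tall bbox corresponds to roughly 8 m away,
# and a 48 px tall bbox to roughly 30 m away.
near = approx_distance(800, 1.8, 180)   # -> 8.0
far = approx_distance(800, 1.8, 48)     # -> 30.0
```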

That sounds reasonable, and most systems do allow for some kind of min/max object size.

But what if you have a very big man far away and a very small one nearby? :rofl:

It takes a period of recording data to build up the know-how of where to set the boundaries. In my case I started logging those calculated values, comparing them with the actual detected objects both near and far, and finally found settings that give the expected result. I did this per camera. Some settings happened to be the same for all cameras, since they have a full view with a similar monitored depth. The ratio is one of those: in my case it has to be less than 0.85 to qualify as a person object. But the min and max areas differ more, for the reason you just mentioned: objects are expected to be detected either closer to or more distant from the camera's physical location

You can stream the video element as a link right to the Node-RED worldmap. I just use the most recent image from Frigate, as I don't need to see a live view when I click the camera on the map.

Here are two flow tabs to add a camera to the map using a frigate link.
/cameramap is the link you send users. I disabled a lot of settings so they can't do anything but click a camera.
/cameramapadmin is the link for adding cameras to the map. I keep users from accessing this url. I use ssh port forwarding to get to this link as it is only accessible from localhost :slight_smile: but you can hide it any way ya like.

http://10.x.x.1/frigate3/#YourCameraHere

You feed /cameramapadmin the above URL and it does all the other work in code to add and remove from the map. It should work for any Frigate URL, as it only cares about the camera name and the complete URL to the camera.
So if you fed it http://127.0.0.1:8971/#YourCameraHere it should work. I've not tested that, as my Frigate server is behind nginx with a different URL.
How it parses the URL/camera name:

var cameraName = frigateCamURL.split('#')[1];
var baseURL = frigateCamURL.split('#')[0];
var imageURL = baseURL + 'api/' + cameraName + '/latest.webp?height=275';

^^^ yes I know I could save the split as an array and then call them from the array but when I come back 1 year later the above is easy on the brain/eyes

Well, you do have to refresh (F5) the browser window every time you add a camera for the first time. I know how to fix this but F5 is quick and easy :slight_smile:

/cameramap

[
    {
        "id": "fd4afd8865bbfe5e",
        "type": "tab",
        "label": "/cameramap",
        "disabled": false,
        "info": "",
        "env": []
    },
    {
        "id": "05c71746369cd16b",
        "type": "worldmap in",
        "z": "fd4afd8865bbfe5e",
        "name": "",
        "path": "/cameramap",
        "events": "connect,disconnect,point,draw,layer,bounds,files,other",
        "x": 150,
        "y": 160,
        "wires": [
            [
                "3a3e9f781bfd22c4"
            ]
        ]
    },
    {
        "id": "a4d6ba7a0784034b",
        "type": "worldmap",
        "z": "fd4afd8865bbfe5e",
        "name": "",
        "lat": "38",
        "lon": "-100",
        "zoom": "18",
        "layer": "EsriS",
        "cluster": "",
        "maxage": "",
        "usermenu": "hide",
        "layers": "show",
        "panit": "false",
        "panlock": "false",
        "zoomlock": "false",
        "hiderightclick": "true",
        "coords": "none",
        "showgrid": "false",
        "showruler": "false",
        "allowFileDrop": "false",
        "path": "/cameramap",
        "overlist": "",
        "maplist": "OSMG,OSMC,EsriC,EsriS",
        "mapname": "",
        "mapurl": "",
        "mapopt": "",
        "mapwms": false,
        "x": 1070,
        "y": 160,
        "wires": []
    },
    {
        "id": "ae0cf05b120281d5",
        "type": "function",
        "z": "fd4afd8865bbfe5e",
        "name": "add known cameras to map",
        "func": "const camLat = Number(msg.payload.lat);\nconst camLon = Number(msg.payload.lon);\nconst cameraName = msg.payload.name;\nconst imageURL = msg.payload.imageURL;\nconst reviewURL = msg.payload.reviewURL;\n\nmsg.payload = {\n    \"name\":cameraName,\n    \"layer\":\"cameras\",\n    \"lat\":camLat,\n    \"lon\":camLon,\n    \"icon\":\"fa-video-camera\",\n    \"iconColor\":\"forestgreen\",\n    \"photoUrl\": imageURL,\n    \"weblink\":[\n        {\n            \"name\":\"Review\",\n            \"url\": reviewURL,\n            \"target\":\"_new\"\n        }]\n};\n\n\nreturn msg;\n\n",
        "outputs": 1,
        "timeout": "",
        "noerr": 0,
        "initialize": "",
        "finalize": "",
        "libs": [],
        "x": 860,
        "y": 160,
        "wires": [
            [
                "a4d6ba7a0784034b"
            ]
        ]
    },
    {
        "id": "3a3e9f781bfd22c4",
        "type": "switch",
        "z": "fd4afd8865bbfe5e",
        "name": "",
        "property": "payload.action",
        "propertyType": "msg",
        "rules": [
            {
                "t": "eq",
                "v": "connected",
                "vt": "str"
            }
        ],
        "checkall": "true",
        "repair": false,
        "outputs": 1,
        "x": 310,
        "y": 160,
        "wires": [
            [
                "bca2390ed4de4437"
            ]
        ]
    },
    {
        "id": "bca2390ed4de4437",
        "type": "change",
        "z": "fd4afd8865bbfe5e",
        "name": "",
        "rules": [
            {
                "t": "set",
                "p": "payload",
                "pt": "msg",
                "to": "cameras",
                "tot": "global"
            }
        ],
        "action": "",
        "property": "",
        "from": "",
        "to": "",
        "reg": false,
        "x": 480,
        "y": 160,
        "wires": [
            [
                "2ae2510bfd834c47"
            ]
        ]
    },
    {
        "id": "2ae2510bfd834c47",
        "type": "split",
        "z": "fd4afd8865bbfe5e",
        "name": "",
        "splt": "\\n",
        "spltType": "str",
        "arraySplt": 1,
        "arraySpltType": "len",
        "stream": false,
        "addname": "",
        "property": "payload",
        "x": 650,
        "y": 160,
        "wires": [
            [
                "ae0cf05b120281d5"
            ]
        ]
    }
]

/cameramapadmin

[
    {
        "id": "b28addf86596775b",
        "type": "tab",
        "label": "/cameramapadmin",
        "disabled": false,
        "info": "",
        "env": []
    },
    {
        "id": "22262ff764863a7d",
        "type": "worldmap",
        "z": "b28addf86596775b",
        "name": "",
        "lat": "30",
        "lon": "-100",
        "zoom": "18",
        "layer": "EsriS",
        "cluster": "",
        "maxage": "",
        "usermenu": "show",
        "layers": "show",
        "panit": "false",
        "panlock": "false",
        "zoomlock": "false",
        "hiderightclick": "false",
        "coords": "deg",
        "showgrid": "false",
        "showruler": "false",
        "allowFileDrop": "false",
        "path": "/cameramapadmin",
        "overlist": "",
        "maplist": "OSMG,OSMC,EsriC,EsriS",
        "mapname": "",
        "mapurl": "",
        "mapopt": "",
        "mapwms": false,
        "x": 1030,
        "y": 280,
        "wires": []
    },
    {
        "id": "eba074a4bdf03c43",
        "type": "worldmap in",
        "z": "b28addf86596775b",
        "name": "",
        "path": "/cameramapadmin",
        "events": "connect,disconnect,point,draw,layer,bounds,files,other",
        "x": 110,
        "y": 280,
        "wires": [
            [
                "94bd3683073ee7a9",
                "fe5418fc957d97ee",
                "03a28bda6eafa07a",
                "c1b245853ab11ad5"
            ]
        ]
    },
    {
        "id": "94bd3683073ee7a9",
        "type": "function",
        "z": "b28addf86596775b",
        "name": "contextmenu Add Camera",
        "func": "var menu ='<center><br/>';\n//menu += '';\nmenu += 'Frigate Camera URL<br/><input type=\"text\" id=\"frigateCamURL\" placeholder=\"http://x.x.x.x/frigateX/#CameraName\"><br/><br/>';\nmenu += '<input type=\"button\" value=\"Submit\" onclick=';\nmenu += '\\'feedback(\"myform\",{';\nmenu += '\"frigateCamURL\":document.getElementById(\"frigateCamURL\").value,';\nmenu += '\"lat\":rclk.lat.toFixed(12),';\nmenu += '\"lon\":rclk.lng.toFixed(12),';\nmenu += '},\"formAction\",true)\\' > <br/><br/> ';\n\nmsg.payload = { command: { \"contextmenu\":menu } }\n\nreturn msg;\n\n",
        "outputs": 1,
        "timeout": "",
        "noerr": 0,
        "initialize": "",
        "finalize": "",
        "libs": [],
        "x": 410,
        "y": 280,
        "wires": [
            [
                "22262ff764863a7d"
            ]
        ]
    },
    {
        "id": "fe5418fc957d97ee",
        "type": "switch",
        "z": "b28addf86596775b",
        "name": "",
        "property": "payload.value.frigateCamURL",
        "propertyType": "msg",
        "rules": [
            {
                "t": "cont",
                "v": "#",
                "vt": "str"
            }
        ],
        "checkall": "true",
        "repair": false,
        "outputs": 1,
        "x": 110,
        "y": 420,
        "wires": [
            [
                "b450c167f679e3fe"
            ]
        ]
    },
    {
        "id": "6a8244dbffa52d45",
        "type": "debug",
        "z": "b28addf86596775b",
        "name": "",
        "active": true,
        "tosidebar": true,
        "console": false,
        "tostatus": false,
        "complete": "true",
        "targetType": "full",
        "statusVal": "",
        "statusType": "auto",
        "x": 1130,
        "y": 420,
        "wires": []
    },
    {
        "id": "b450c167f679e3fe",
        "type": "function",
        "z": "b28addf86596775b",
        "name": "Add Camera",
        "func": "const frigateCamURL = msg.payload.value.frigateCamURL;\nconst lat = msg.payload.value.lat;\nconst lon = msg.payload.value.lon;\nvar camerasGlobalObject = global.get('cameras');\nvar cameraName = frigateCamURL.split('#')[1];\nvar baseURL = frigateCamURL.split('#')[0];\nvar imageURL = baseURL + 'api/' + cameraName + '/latest.webp?height=275';\n\n// add camera\ncamerasGlobalObject[cameraName] = {\n    name: cameraName,\n    baseURL: baseURL,\n    imageURL: imageURL,\n    reviewURL: frigateCamURL,\n    lat: lat,\n    lon: lon,\n};\nglobal.set('cameras', camerasGlobalObject);\n\nmsg.payload = camerasGlobalObject;\nreturn msg;",
        "outputs": 1,
        "timeout": 0,
        "noerr": 0,
        "initialize": "",
        "finalize": "",
        "libs": [],
        "x": 370,
        "y": 420,
        "wires": [
            [
                "dd67295e522719f0"
            ]
        ]
    },
    {
        "id": "cfa75bd2365b2435",
        "type": "file",
        "z": "b28addf86596775b",
        "name": "",
        "filename": "cameras.json",
        "filenameType": "str",
        "appendNewline": false,
        "createDir": false,
        "overwriteFile": "true",
        "encoding": "utf8",
        "x": 690,
        "y": 420,
        "wires": [
            [
                "6a8244dbffa52d45"
            ]
        ]
    },
    {
        "id": "dd67295e522719f0",
        "type": "json",
        "z": "b28addf86596775b",
        "name": "",
        "property": "payload",
        "action": "str",
        "pretty": false,
        "x": 510,
        "y": 420,
        "wires": [
            [
                "cfa75bd2365b2435"
            ]
        ]
    },
    {
        "id": "aa79ac3a6f2d5edd",
        "type": "file in",
        "z": "b28addf86596775b",
        "name": "",
        "filename": "cameras.json",
        "filenameType": "str",
        "format": "utf8",
        "chunk": false,
        "sendError": false,
        "encoding": "utf8",
        "allProps": false,
        "x": 330,
        "y": 40,
        "wires": [
            [
                "52e8689070516784"
            ]
        ]
    },
    {
        "id": "123f14f2cbf5ad41",
        "type": "inject",
        "z": "b28addf86596775b",
        "name": "",
        "props": [
            {
                "p": "payload"
            },
            {
                "p": "topic",
                "vt": "str"
            }
        ],
        "repeat": "",
        "crontab": "",
        "once": true,
        "onceDelay": 0.1,
        "topic": "",
        "payload": "",
        "payloadType": "date",
        "x": 110,
        "y": 40,
        "wires": [
            [
                "aa79ac3a6f2d5edd"
            ]
        ]
    },
    {
        "id": "05e15798bc3d36c5",
        "type": "debug",
        "z": "b28addf86596775b",
        "name": "",
        "active": true,
        "tosidebar": true,
        "console": false,
        "tostatus": false,
        "complete": "true",
        "targetType": "full",
        "statusVal": "",
        "statusType": "auto",
        "x": 1090,
        "y": 40,
        "wires": []
    },
    {
        "id": "685aaa075a4f1c8d",
        "type": "catch",
        "z": "b28addf86596775b",
        "name": "",
        "scope": [
            "aa79ac3a6f2d5edd"
        ],
        "uncaught": false,
        "x": 110,
        "y": 100,
        "wires": [
            [
                "a9c723185eaafc45"
            ]
        ]
    },
    {
        "id": "a9c723185eaafc45",
        "type": "switch",
        "z": "b28addf86596775b",
        "name": "cameras.json contains: no such file or directory, else",
        "property": "error.message",
        "propertyType": "msg",
        "rules": [
            {
                "t": "cont",
                "v": "no such file or directory",
                "vt": "str"
            },
            {
                "t": "else"
            }
        ],
        "checkall": "true",
        "repair": false,
        "outputs": 2,
        "x": 400,
        "y": 100,
        "wires": [
            [
                "a17f5349df83f8ae"
            ],
            [
                "238d661f47a22fdd"
            ]
        ]
    },
    {
        "id": "238d661f47a22fdd",
        "type": "debug",
        "z": "b28addf86596775b",
        "name": "",
        "active": true,
        "tosidebar": true,
        "console": false,
        "tostatus": false,
        "complete": "true",
        "targetType": "full",
        "statusVal": "",
        "statusType": "auto",
        "x": 790,
        "y": 120,
        "wires": []
    },
    {
        "id": "a17f5349df83f8ae",
        "type": "function",
        "z": "b28addf86596775b",
        "name": "",
        "func": "msg.payload = {};\nreturn msg;",
        "outputs": 1,
        "timeout": 0,
        "noerr": 0,
        "initialize": "",
        "finalize": "",
        "libs": [],
        "x": 720,
        "y": 80,
        "wires": [
            [
                "fe1e4011a8e5bb7f"
            ]
        ]
    },
    {
        "id": "3b05e7442f14ca73",
        "type": "file",
        "z": "b28addf86596775b",
        "name": "",
        "filename": "cameras.json",
        "filenameType": "str",
        "appendNewline": false,
        "createDir": false,
        "overwriteFile": "false",
        "encoding": "utf8",
        "x": 1070,
        "y": 80,
        "wires": [
            [
                "52e8689070516784"
            ]
        ]
    },
    {
        "id": "fe1e4011a8e5bb7f",
        "type": "json",
        "z": "b28addf86596775b",
        "name": "",
        "property": "payload",
        "action": "str",
        "pretty": false,
        "x": 890,
        "y": 80,
        "wires": [
            [
                "3b05e7442f14ca73"
            ]
        ]
    },
    {
        "id": "52e8689070516784",
        "type": "json",
        "z": "b28addf86596775b",
        "name": "",
        "property": "payload",
        "action": "obj",
        "pretty": false,
        "x": 530,
        "y": 40,
        "wires": [
            [
                "f7f25076860735dc"
            ]
        ]
    },
    {
        "id": "f7f25076860735dc",
        "type": "change",
        "z": "b28addf86596775b",
        "name": "",
        "rules": [
            {
                "t": "set",
                "p": "cameras",
                "pt": "global",
                "to": "payload",
                "tot": "msg"
            }
        ],
        "action": "",
        "property": "",
        "from": "",
        "to": "",
        "reg": false,
        "x": 750,
        "y": 40,
        "wires": [
            [
                "05e15798bc3d36c5"
            ]
        ]
    },
    {
        "id": "521fd84220a23967",
        "type": "function",
        "z": "b28addf86596775b",
        "name": "add known cameras to map",
        "func": "const camLat = Number(msg.payload.lat);\nconst camLon = Number(msg.payload.lon);\nconst cameraName = msg.payload.name;\nconst imageURL = msg.payload.imageURL;\nconst reviewURL = msg.payload.reviewURL;\n\nmsg.payload = {\n    \"name\":cameraName,\n    \"layer\":\"cameras\",\n    \"lat\":camLat,\n    \"lon\":camLon,\n    \"icon\":\"fa-video-camera\",\n    \"iconColor\":\"forestgreen\",\n    \"photoUrl\": imageURL,\n    \"weblink\":[\n        {\n            \"name\":\"Review\",\n            \"url\": reviewURL,\n            \"target\":\"_new\"\n        }]\n};\n\n\nreturn msg;\n\n",
        "outputs": 1,
        "timeout": "",
        "noerr": 0,
        "initialize": "",
        "finalize": "",
        "libs": [],
        "x": 740,
        "y": 340,
        "wires": [
            [
                "22262ff764863a7d"
            ]
        ]
    },
    {
        "id": "03a28bda6eafa07a",
        "type": "switch",
        "z": "b28addf86596775b",
        "name": "",
        "property": "payload.action",
        "propertyType": "msg",
        "rules": [
            {
                "t": "eq",
                "v": "connected",
                "vt": "str"
            }
        ],
        "checkall": "true",
        "repair": false,
        "outputs": 1,
        "x": 110,
        "y": 340,
        "wires": [
            [
                "21e766a363bbc0a0"
            ]
        ]
    },
    {
        "id": "21e766a363bbc0a0",
        "type": "change",
        "z": "b28addf86596775b",
        "name": "",
        "rules": [
            {
                "t": "set",
                "p": "payload",
                "pt": "msg",
                "to": "cameras",
                "tot": "global"
            }
        ],
        "action": "",
        "property": "",
        "from": "",
        "to": "",
        "reg": false,
        "x": 380,
        "y": 340,
        "wires": [
            [
                "b3f8a6ea6a91af25"
            ]
        ]
    },
    {
        "id": "b3f8a6ea6a91af25",
        "type": "split",
        "z": "b28addf86596775b",
        "name": "",
        "splt": "\\n",
        "spltType": "str",
        "arraySplt": 1,
        "arraySpltType": "len",
        "stream": false,
        "addname": "",
        "property": "payload",
        "x": 530,
        "y": 340,
        "wires": [
            [
                "521fd84220a23967"
            ]
        ]
    },
    {
        "id": "c1b245853ab11ad5",
        "type": "switch",
        "z": "b28addf86596775b",
        "name": "action: delete?",
        "property": "payload.action",
        "propertyType": "msg",
        "rules": [
            {
                "t": "eq",
                "v": "delete",
                "vt": "str"
            }
        ],
        "checkall": "true",
        "repair": false,
        "outputs": 1,
        "x": 140,
        "y": 380,
        "wires": [
            [
                "1490cf6df21b98ca"
            ]
        ]
    },
    {
        "id": "a9f194a666ada1ba",
        "type": "function",
        "z": "b28addf86596775b",
        "name": "Delete Camera",
        "func": "\nvar camerasGlobalObject = msg.camerasGlobalObject;\ndelete camerasGlobalObject[msg.payload.name];\n\nglobal.set('cameras', camerasGlobalObject);\n\nmsg.payload = camerasGlobalObject;\nreturn msg;",
        "outputs": 1,
        "timeout": 0,
        "noerr": 0,
        "initialize": "",
        "finalize": "",
        "libs": [],
        "x": 700,
        "y": 380,
        "wires": [
            [
                "e486d90a2a40b564"
            ]
        ]
    },
    {
        "id": "1490cf6df21b98ca",
        "type": "change",
        "z": "b28addf86596775b",
        "name": "",
        "rules": [
            {
                "t": "set",
                "p": "camerasGlobalObject",
                "pt": "msg",
                "to": "cameras",
                "tot": "global"
            }
        ],
        "action": "",
        "property": "",
        "from": "",
        "to": "",
        "reg": false,
        "x": 430,
        "y": 380,
        "wires": [
            [
                "a9f194a666ada1ba"
            ]
        ]
    },
    {
        "id": "b3721a931a153081",
        "type": "debug",
        "z": "b28addf86596775b",
        "name": "",
        "active": true,
        "tosidebar": true,
        "console": false,
        "tostatus": false,
        "complete": "true",
        "targetType": "full",
        "statusVal": "",
        "statusType": "auto",
        "x": 1150,
        "y": 380,
        "wires": []
    },
    {
        "id": "849f434fc6e87b95",
        "type": "file",
        "z": "b28addf86596775b",
        "name": "",
        "filename": "cameras.json",
        "filenameType": "str",
        "appendNewline": false,
        "createDir": false,
        "overwriteFile": "true",
        "encoding": "utf8",
        "x": 1010,
        "y": 380,
        "wires": [
            [
                "b3721a931a153081"
            ]
        ]
    },
    {
        "id": "e486d90a2a40b564",
        "type": "json",
        "z": "b28addf86596775b",
        "name": "",
        "property": "payload",
        "action": "str",
        "pretty": false,
        "x": 850,
        "y": 380,
        "wires": [
            [
                "849f434fc6e87b95"
            ]
        ]
    }
]

I'm a node-red worldmap fanatic. I've got 11 small approved pull requests into this project by @dceejay; half of them are just documentation.


At the moment I am not having much success with Frigate. I have a lot of false positives with high scores and real humans with low scores (see issue), so I can't filter away the false positives. Since others don't have this kind of problem, I assume I am doing something completely wrong. I am getting nowhere with my own home automation setup this way :frowning_face:

I was getting quite a few false positives from bushes moving in the wind - but managed to mask out the critical area over time. The key thing I found was that the "active point" that has to be inside the area is the middle of the bottom edge of the suspect area - so I only needed to mask out the bottom part of some of the areas / bushes that were causing me a problem - not the whole area.
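In Frigate that kind of masking goes into the config as an object filter mask; a rough sketch of the shape it takes (the camera name and the pixel coordinates here are placeholders, not a working example):

```yaml
# Sketch only: camera name and polygon coordinates are placeholders.
# An object filter mask ignores any "person" whose bounding box has its
# bottom-centre point inside the polygon, so masking just the base of a
# problematic bush is enough.
cameras:
  front_yard:
    objects:
      filters:
        person:
          mask:
            - "0,900,300,900,300,1080,0,1080"
```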


@meeki007
Thanks for sharing the implementation details!

That is a brilliant idea @dceejay
I hadn't thought about that myself...
I will definitely need that in the near future. Thanks for sharing!!

I got very useful feedback from the Frigate contributors yesterday.
Since the scores of my false positives were very close to (read: overlapping with) the scores of real humans, I needed to filter away objects with an area above a max_area. That problem is solved now.
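For reference, that kind of filtering lives under `objects.filters` in the Frigate config; a sketch with illustrative values (not my actual numbers):

```yaml
# Illustrative values only; tune against your own logged detections.
objects:
  filters:
    person:
      min_area: 5000      # drop unrealistically small boxes (too far away)
      max_area: 100000    # drop unrealistically large boxes (my fix here)
      threshold: 0.7      # minimum score to count as a true positive
```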

I also learned from that discussion that their standard coco model can offer only a maximum score of 84% for real humans, and I am indeed very near to that in my setup. If you pay 50 euro per year for Frigate+, you get access to better models that can reach a recognition rate of near 100% (because those models have been trained on images dedicated to video surveillance). And then you can also upload your own images to train the model even better, and remove false positives from your own cameras. For now I am going to try to get it running with their free version.

Now I would really like to be able to run Frigate at some specified base path, to make sure it fits nicely into my new Tailscale setup. The problem is that Frigate does all of the routing via an Nginx reverse proxy inside their Docker container. I have already fixed some stuff in my local container, but there is still a lot of work to do. A couple of things I found already:

  • Even when you set the base path via an X-Ingress-Path HTTP header, not everything works well. But they haven't responded to my discussion about that yet.
  • I found a way to let Nginx patch the base path into the webmanifest file, to allow Frigate to be installed as a PWA app on Android via the base path. But currently it is still a manual fix in the Nginx config file...

So I will "try" to refactor their Nginx config myself. However my Nginx knowledge is very limited, so it will become a very painful process unfortunately...


It seems that only the thumbnail images in my Explore tab did not load correctly.
I have added a fix for that in the discussion, so hopefully they like/approve it.
Now I need to be able to specify the sub-path /frigate as an environment variable in my docker compose file. Since Nginx does not support environment variables, it seems I need to use envsubst. But that is for the next time...

Not wanting to kill their income stream, but I wonder if it's possible to substitute your own / other models?


Main issue is that better models generally need more resources. My system, which I've mentioned in "share your projects", does an initial full-frame detection using MobilenetSSDv2_coco (which can give 100% for person), followed by a digital zoom on the detection box and another inference, and then verification via a yolo8 inference on the zoomed detection. On an IoT-class computer it needs two Coral TPUs to get a usable frame rate for more than a camera or two.


Yes, I assume that is possible. But then you again need to gain expertise in that area, like @wb666greene and @krambriw ... In case you don't have time to become an expert, it is possible to join such a group of people that contribute to a big dataset. I'm not saying I am going down that road in the near future. Don't know yet. Just sharing here that it is one of the available options...


I have spent my evenings figuring out how some of the Frigate stuff works under the covers, and I have now managed to:

  1. Allow the base path /frigate to be set as a simple environment variable in the Docker compose file, so that I can use Frigate very easily/nicely with Tailscale.
  2. Allow Chrome on my Android phone to create an icon, to run Frigate as a PWA app (via that same base path).

That is working fine now. So far so good...

However I am not sure whether my changes are going to be accepted by the Frigate team. For some reason they don't seem very enthusiastic about such a feature. I don't really understand why, because every decent application with a web interface (which Frigate is, btw) offers that. Their advice is to put an extra reverse proxy between my Tailscale agent and Frigate (to set the base path via an HTTP header), or to create a custom Nginx config file that I would need to rework at every Frigate upgrade. Both will increase the complexity of my setup, and make it impossible for my wife and kids to maintain (when I am not around here anymore...).

Let's see and wait...

Another noob question.
In my Amcrest cams I have a high-resolution main stream which I use for recording, and a low-resolution sub stream which I use for object detection. Both streams have a different aspect ratio:

To reduce CPU usage on my Raspberry Pi, I don't change the resolution of my streams (which I could have done, like in this example):

go2rtc:
  streams:
    rtsp_cam_achter_garage_sub:
      - rtsp://<username>:<password>@<ip address>/cam/realmonitor?channel=1&subtype=1
      - ffmpeg://rtsp_cam_achter_garage_sub#video=copy
    rtsp_cam_achter_garage_main:
      - rtsp://<username>:<password>@<ip address>/cam/realmonitor?channel=1&subtype=0
      - ffmpeg:rtsp_cam_achter_garage_main#video=copy

I'm not sure whether I should show the main or the sub stream in my "Live" view. When showing all cams in a grid, the low-resolution sub stream is enough (especially on a smartphone with a limited 4G connection), but when opening a single cam in full screen it would be nice to have high resolution. This is how I decide which of the two go2rtc streams to show in the Live view:

cameras:
  cam_achter_garage:
    enabled: true
    live:
      stream_name: rtsp_cam_achter_garage_sub
  1. For the 4:3 sub stream I get black vertical borders on the left and right:

  2. For the 16:9 main stream it stretches and a lot of distortions appear:

I found this information:

Frigate 0.14's new Live view (whether the default "All cameras" grid or any draggable camera-group grid) uses cells with a 16:9 aspect ratio. So for cameras with different aspect ratios, black bars around the camera images are expected and normal.

That explains the black borders for my 4:3 sub stream. But then I would expect no distortions on my 16:9 main stream. I don't understand it :exploding_head:

Is there anybody that has any tips I could try?

I have the impression that if I use my sub streams for the Live view of all cameras, the distortions disappear (i.e. that the distortions occurred on the live view of the main streams). I'll need to check again tomorrow, because right now I only have night view.

Cannot explain why. That stuff is above my paygrade...

BTW somebody has shared today his own visual config editor web interface, which can connect to Frigate (see here). It is for people - like me - that don't like to dig into the full config reference to figure out which lines need to be added to the config yaml file.

Note that this is only a temporary solution/workaround. In the first comment of that discussion you will see the feedback from one of the core developers:

Hopefully the configuration editor via the standard Frigate web interface arrives in the near future...


This may or may not be what you are seeing...

I believe that the distortions come from the lens, and are possibly reduced slightly by the corrected form factor of the change in aspect ratio (stretching and compressing width and height), depending upon how the correction is done.

Most of these cameras have a wide field of view and this will lead to distortion of the picture - uprights will appear bowed (either that, or you need to straighten your walls!! :rofl: ). Telephoto will show no apparent distortion - think of the often-shown photo of a man down a manhole with a truck seemingly so close that he appears to be about to get hit by it.

This might help explain more.

Hi guys,
Thought everything was (more or less) working fine, but yet another problem.
In my Live view not all streams are live. Some are, but others just show an old image. When I refresh the screen, sometimes other cams are live. And sometimes all cams are live, but then some of them stop live viewing. Sometimes a cam stops live viewing, halts, and a bit later starts again.

I don't see any pattern in this behaviour unfortunately. And I also don't have a plan of how to troubleshoot this:

  • In my Chrome developer tools network tab, I see a lot of things going on but not sure what I should expect to see here.
  • In the go2rtc web interface I can start the streams, but that is as far as my knowledge goes.
  • The Frigate logs also don't tell me much.
  • The Docker container logs move too fast, so I have no idea what to look for.
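
One thing that might make the logs more useful: Frigate's config supports raising the log level per module via the `logger` section. A sketch under assumptions (the module names below are just examples; I haven't verified which module covers the live-view websockets):

```yaml
logger:
  # Keep everything else at the normal level...
  default: info
  # ...but ask for more detail from specific modules (example names)
  logs:
    frigate.app: debug
    frigate.record: debug
```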

Since version 0.14 they have removed the dropdown in the Frigate user interface where you could select a stream type (MSE, WebRTC, jsmpeg). Instead Frigate now automatically selects a stream type. In this Reddit discussion, one of their core developers explains it a bit:

MSE is used by default if a user has go2rtc set up, jsmpeg is used exclusively otherwise. If MSE is slow to connect or bandwidth is low, it falls back to jsmpeg. If you are in a two-way talk session and have enabled the microphone, WebRTC is used. If there are MSE decoding errors, it may default to WebRTC or jsmpeg, depending on the browser. There are other scenarios where different modes may be auto-selected by Frigate, but those are the big ones.
Technically you could use your browser's devtools to examine the type of data streaming over the websocket.

From that last sentence it looks to me like the data for all 3 connection types is being pushed via websockets? If I follow their advice and look at the Network tab in the Chrome developer tools, it seems that the stream type is MSE for the ws (websocket) channel of my cam "cam_koer_portaal". You can see that by looking at the first messages pushed through that websocket channel:

In this case the cam is showing live data, which is also visible in the websocket, where data messages are continuously being pushed by the Frigate container:


And indeed, for the cams that have no live view, no data messages are being pushed (anymore) through their websocket channel. But I don't know how to find the root cause of that.

And there is something else I don't understand. When I apply a filter to only see websocket connections, it seems also events are being pushed to update the web interface. So far so good. But I also see different websocket connections for the same camera. I first thought that perhaps - based on the above Reddit info - the Frigate user interface had decided to switch to another stream type, but it uses the same stream parameters:

It would be nice if somebody could enlighten me!

Ah I should have read the docs of the Live view a bit better:

Frigate intelligently displays your camera streams on the Live view dashboard. Your camera images update once per minute when no detectable activity is occurring to conserve bandwidth and resources. As soon as any motion is detected, cameras seamlessly switch to a live stream.

And that indeed seems to happen:

  • If there is not much motion, the image updates once every minute.
  • If e.g. a car passes by, the live stream automatically starts and ends a bit afterwards.

So a live stream is only live when there is enough motion...

But then I have two new doubts which I need to figure out tonight:

  • Why do I have live streaming all the time on some cams, although I "think" there is not much motion?
  • Why does it detect motion when a car passes by, although I have a motion mask on the area of the public street? I assume that the shadow of the car causes a lot of pixel changes on my driveway (which is interpreted as motion).
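
If the shadow theory is right, the motion detection can probably be tuned per camera in the config. A hedged sketch based on Frigate's motion settings (the camera name, numbers and mask coordinates below are placeholders, not my real config):

```yaml
cameras:
  cam_voorkant:          # placeholder camera name
    motion:
      # Higher threshold = a pixel must change more before it counts as motion,
      # which should make soft shadows less likely to trigger (default is 30)
      threshold: 40
      # Higher contour_area = ignore smaller clusters of changed pixels (default is 10)
      contour_area: 30
      # Polygon mask over the public street, as a comma-separated coordinate list
      mask: 0.0,0.0,1.0,0.0,1.0,0.2,0.0,0.2
```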