How to build a video surveillance system from scratch?

Splendid noob-level info. Just what I needed to get started step by step and gain understanding...
Will get back to you later on. It is my little boy's birthday today. And he has planned a lot of activities :wink:

2 Likes

As you can see in the screenshot above about my camera configuration, I have 20 fps and an I-frame interval of 50. So I expect every segment to contain 2.5 seconds of video footage? But when I use a node-red-contrib-msg-speed node, I see that I have on average 4 messages per 5 seconds:

(screenshot)

Based on your explanation I would have expected only two messages to arrive every 5 seconds...
Do you have an idea what I am doing wrong?

Is there an easy way to display those segments e.g. using a node-red-contrib-image-output node (as jpeg images...)? Just to make sure that my segments contain uncorrupted images, and to be able to determine whether the quality is good (e.g. while experimenting to find an acceptable bitrate).

Message !== segment

Remember, you are piping a buffer from an external piece of software into node-red. The pipe buffer limit on a pi is about 65k. Your segments are bigger than that, which means you will get more messages than segments, since each segment is broken into chunks of buffer.
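If you want to see this for yourself, here is a minimal function-node sketch (the property names are mine, not part of any node's API) that you can wire between the ffmpeg output and a debug node; it just reports the size of each chunk that arrives:

// Report the byte length of each buffer chunk coming from ffmpeg,
// plus a running total kept in the node's context.
const len = Buffer.isBuffer(msg.payload) ? msg.payload.length : 0;
const total = (context.get('totalBytes') || 0) + len;
context.set('totalBytes', total);

node.status({ fill: 'blue', shape: 'dot', text: `chunk: ${len} B, total: ${total} B` });

return { payload: { chunkBytes: len, totalBytes: total } };

A segment of, say, 300 kB will then show up as roughly five chunks of up to ~65 kB each, which is why the message rate is higher than the segment rate.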

1 Like

That makes sense!
Not sure how to continue from here on...
My messages contain a chunk of a segment, which contains both audio and video. So I cannot determine whether the content is correct. It would be nice if I could extract the images at this point (and display them in a node-red-contrib-image-output node), to test whether my segments are ok. Not sure whether something like that is possible?

I believe the first chunk of each segment should be playable, but you can't play the second unless you join it to the first.

Wow, when I feed it to your mp4frag nodes, I can see the stream in my dashboard :champagne:
That is good news already...

And it would still be nice if I could extract the images, e.g. for testing, or for example to do license plate recognition, or whatever other image processing...

Hey Colin,
I see buffers arriving in my messages, but I have no idea what the content is (format, ...). So it would be nice to have some insight into what kind of data is running through my Node-RED wires.

Now that we know you can connect to the cam and stream copy the rtsp content into an mp4 container, we can move on to also creating jpegs. The same situation will occur when outputting other video/images: if the size is larger than the system's pipe buffer, the content will come out in chunks. So, if you want to output a jpeg without catching the chunks and re-assembling them, then you would have to lower the jpeg quality or resolution so that it stays within 65k. That would also require tweaking the settings to get right, or you can simply use node-red-contrib-pipe2jpeg to catch the chunks and ensure they are complete.

If you are using the exec node, then you will either have to stop outputting mp4 on pipe:1 (so that pipe:1 can carry the jpegs) or output the jpegs on pipe:2, but then you will have to tell ffmpeg not to do any error logging, which normally goes to that pipe. If using node-red-contrib-ffmpeg-spawn, then you can output as much as you want by selecting more pipes than the standard stdio[1] and stdio[2].

silencing logging to be able to use the stderr output for jpegs:
ffmpeg -loglevel quiet -f rtsp -rtsp_transport tcp -i rtsp://your.cameras.url -f mp4 -c:v copy -c:a aac -movflags +frag_keyframe+empty_moov+default_base_moof pipe:1 -f image2pipe -c mjpeg -vf fps=fps=1 pipe:2

or only outputting jpegs:
ffmpeg -f rtsp -rtsp_transport tcp -i rtsp://your.cameras.url -f image2pipe -c mjpeg -vf fps=fps=1 pipe:1

The -vf option defines a video filter and can also be used to change the resolution (w x h) of the output.

Because you will be decoding h264 and encoding jpeg, this will cause high cpu load. Depending on your system and version of ffmpeg and also the input video's resolution, you may be able to decode the h264 encoded video using hardware to reduce some cpu load.
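As a rough illustration of what that could look like (untested, and only if your ffmpeg build includes the h264_v4l2m2m decoder - recent Raspberry Pi OS builds usually do, older 32-bit builds exposed h264_mmal instead), the hardware decoder is selected as an input option, i.e. before -i:

[
    "-loglevel",
    "quiet",
    "-rtsp_transport",
    "tcp",
    "-c:v",
    "h264_v4l2m2m",
    "-i",
    "rtsp://your.cameras.url",
    "-f",
    "image2pipe",
    "-c",
    "mjpeg",
    "-vf",
    "fps=fps=1",
    "pipe:1"
]

Whether this actually lowers the cpu load depends on the resolution and on how well the hardware decoder copes with your camera's stream.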

1 Like

So if you need to have separate images in your Node-RED flow, then you would create an (extra) rtsp stream to get those images directly from the camera. I had expected you would propose converting the segments (that are currently running through my Node-RED wires) to images via an extra Exec node in my flow (which uses ffmpeg).

But that would perhaps be a bit too heavy for a Raspberry?

[EDIT] I think I am talking nonsense now... Your first command gets both the segments AND the separate images? Or not?

Yes, but you will face limits using the exec node instead of the experimental node-red-contrib-ffmpeg-spawn node. If you can confirm which node, exec or ffmpeg-spawn, then I can give a better answer on how to proceed.

It depends on many factors. If you can take advantage of the sub streams from your cam and use one of those as the source for generating a jpeg, while also using hardware accelerated h264 decoding, then you might be able to get away with minimal cpu load. Personally, I have 28 ffmpeg instances running on a pi4 4gb with node-red v1.3 and it runs pretty well, but I am not often creating jpegs, so there is very little decoding/encoding, just stream copying.

That would still give a high cpu load by re-encoding, plus the extra overhead of another ffmpeg instance running. I suspect that you want a fairly high fps for your jpeg stream, or is it just 1 jpeg per 5 minute period?

Since I want to support your developments a bit, I am using the node-red-contrib-ffmpeg-spawn node.
But when I use your command in that node:

[
    "-loglevel",
    "quiet",
    "-rtsp_transport",
    "tcp",
    "-i",
    "rtsp:// ...",
    "-f",
    "image2pipe",
    "-c",
    "mjpeg",
    "vf=fps=1",
    "pipe:1"
]

Then I get this error: File 'vf=fps=1' already exists. Overwrite ? [y/N] ...

Until now I needed as many jpegs per second as possible, to have a fluent stream. But with your nodes, all my live views and recordings will be done with mp4 fragments. That would be really nice!

From now on I only need jpegs at much lower rates than before, because they will only be used for image processing. For example, a cam on my driveway, for which I need jpeg images:

  1. The node-red-contrib-tfjs-coco-ssd node counts the number of cars in the image.
  2. Only proceed if the number of cars has changed: if it increased then a car has arrived, if it decreased then a car has departed. Of course only proceed when the number of cars is > 0 (see the sketch further below).
  3. Then send only those images to my node-red-contrib-plate-recognizer node, for which I have about 1500 recognitions for free every month.
  4. Look up the license plate in a list of known license plates.
  5. Trigger different actions in the flow, depending on whether the license plate is known or not.

So one jpeg every 30 seconds might be sufficient for this kind of use case...
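As referenced in step 2 above, here is a minimal function-node sketch of the "only proceed when the count changed" logic. Note that msg.carCount and msg.event are hypothetical property names, so adapt them to whatever your object-detection node actually outputs:

// Pass the message on only when the number of detected cars has changed.
// msg.carCount is assumed to have been set by a previous node.
const previous = context.get('carCount');
const current = msg.carCount;

context.set('carCount', current);

// Ignore unchanged counts and frames without any cars.
if (current === previous || current === 0) {
    return null;
}

msg.event = current > (previous || 0) ? 'car arrived' : 'car departed';
return msg;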

There was a typo in the command I posted. The vf part should be -vf fps=fps=1 to get 1 jpeg per second. If you want to change the rate, you can use fractions for even fewer, such as fps=fps=1/2 for 1 jpeg every 2 seconds.

[
    "-loglevel",
    "quiet",
    "-rtsp_transport",
    "tcp",
    "-i",
    "rtsp:// ...",
    "-f",
    "image2pipe",
    "-c",
    "mjpeg",
    "-vf",
    "fps=fps=1/2",
    "pipe:1"
]

If you only want to create 1 jpeg every 30 seconds or so, it would be best if we use mp4frag to occasionally output its buffer and pass that to another ffmpeg-spawn node to create a single jpeg. I actually do that once per 10 minutes on one of my video streams.
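Stripped down to the essentials (the flow below also adds a drawtext overlay and some logging options), the 2nd ffmpeg-spawn simply reads the mp4 buffer on stdin and writes a single jpeg to stdout:

[
    "-f",
    "mp4",
    "-i",
    "pipe:0",
    "-c",
    "mjpeg",
    "-f",
    "image2pipe",
    "-vframes",
    "1",
    "pipe:1"
]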

this flow might help:

[{"id":"1a29fa75.ef7866","type":"subflow","name":"progress","info":"","category":"","in":[{"x":60,"y":100,"wires":[{"id":"1370adb2.08e592"}]}],"out":[{"x":340,"y":100,"wires":[{"id":"1370adb2.08e592","port":1}]},{"x":340,"y":160,"wires":[{"id":"1370adb2.08e592","port":2}]}],"env":[],"color":"#DDAA99","status":{"x":340,"y":40,"wires":[{"id":"1370adb2.08e592","port":0}]}},{"id":"1370adb2.08e592","type":"function","z":"1a29fa75.ef7866","name":"progress","func":"const props = msg.payload.toString().split('\\n');\n\nconst progress = {};\n\nprops.forEach(item => {\n    \n    const [name, value] = item.split('=');\n    \n    if (name && value) {\n    \n        progress[name] = value;\n\n    }\n    \n});\n\n//node.warn(props);\n\nconst fps = progress['fps'] || '0';\n\nconst bitrate = progress['bitrate'] || '0';\n\nconst kbps = bitrate.replace('kbits/s', '');\n\nconst color = progress['progress'] === 'continue' ? 'green' : 'red';\n\nconst text = `fps: ${fps}, kbps: ${kbps}`;\n\nnode.send([{ payload: { fill: color, shape: 'dot', text } }, { payload: fps }, { payload: kbps } ]);","outputs":3,"noerr":0,"initialize":"","finalize":"","x":200,"y":100,"wires":[[],[],[]]},{"id":"32d61e0f.0490b2","type":"subflow","name":"stderr","info":"","category":"","in":[{"x":60,"y":80,"wires":[{"id":"211d46bf.a79c7a"}]}],"out":[],"env":[],"color":"#DDAA99","status":{"x":320,"y":120,"wires":[{"id":"211d46bf.a79c7a","port":1}]}},{"id":"211d46bf.a79c7a","type":"function","z":"32d61e0f.0490b2","name":"stderr","func":"const stderr = msg.payload.toString().split('\\n');\n\nnode.send([{ stderr }, { payload: {fill: 'red', text: `${new Date().toString()}` } } ]);","outputs":2,"noerr":0,"initialize":"","finalize":"","x":190,"y":73,"wires":[["c6e5408.f19c4c"],[]]},{"id":"c6e5408.f19c4c","type":"debug","z":"32d61e0f.0490b2","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"stderr","targetType":"msg","statusVal":"","statusType":"auto","x":370,"y":60,"wires":[]},{"id":"a21c0feb.7430b","type":"inject","z":"6b2d261e.1e2d48","name":"start","props":[{"p":"action","v":"{\"command\":\"start\"}","vt":"json"}],"repeat":"","crontab":"","once":true,"onceDelay":"5","topic":"","payloadType":"str","x":110,"y":200,"wires":[["2b5e30c9.9fbf6"]]},{"id":"2b5e30c9.9fbf6","type":"ffmpeg-spawn","z":"6b2d261e.1e2d48","name":"","outputs":4,"cmdPath":"","cmdArgs":"[\"-loglevel\",\"+level+fatal\",\"-nostats\",\"-rtsp_transport\",\"tcp\",\"-i\",\"rtsp://url\",\"-f\",\"mp4\",\"-c:v\",\"copy\",\"-c:a\",\"copy\",\"-movflags\",\"+frag_keyframe+empty_moov+default_base_moof\",\"-metadata\",\"title=hello bart\",\"pipe:1\",\"-progress\",\"pipe:3\"]","cmdOutputs":3,"killSignal":"SIGTERM","x":340,"y":240,"wires":[["18df6234.18f3ae"],["18df6234.18f3ae"],["7045d885.95ea38"],["bfaa7dae.a23c1"]],"info":"ffmpeg -loglevel quiet -rtsp_transport tcp -i rtsp://192.168.1.18:554/user=admin&password=pass&channel=2&stream=1.sdp -reset_timestamps 1 -an -c:v copy -f mp4 -movflags +frag_keyframe+empty_moov+default_base_moof pipe:1\n\n```\n[\n    \"-loglevel\",\n    \"error\",\n    \"-nostats\",\n    \"-rtsp_transport\",\n    \"+tcp+http+udp+udp_multicast\",\n    \"-rtsp_flags\",\n    \"+prefer_tcp\",\n    \"-i\",\n    \"rtsp://192.168.1.18:554/user=admin&password=pass&channel=1&stream=0.sdp\",\n    \"-reset_timestamps\",\n    \"1\",\n    \"-muxdelay\",\n    \"0.1\",\n    \"-an\",\n    \"-c:v\",\n    \"copy\",\n    \"-f\",\n    \"mp4\",\n    \"-movflags\",\n    \"+frag_every_frame+empty_moov+default_base_moof\",\n    \"-min_frag_duration\",\n    
\"500000\",\n    \"-metadata\",\n    \"title=garage 1 main\",\n    \"-reset_timestamps\",\n    \"1\",\n    \"-vsync\",\n    \"1\",\n    \"pipe:1\"\n]\n```\n\n```\n[\n    \"-loglevel\",\n    \"fatal\",\n    \"-nostats\",\n    \"-stimeout\",\n    \"20000000\",\n    \"-rtsp_transport\",\n    \"tcp\",\n    \"-i\",\n    \"rtsp://admin:Purple@2026@192.168.1.174:554/cam/realmonitor?channel=1&subtype=0\",\n    \"-reset_timestamps\",\n    \"1\",\n    \"-muxdelay\",\n    \"0.1\",\n    \"-c:a\",\n    \"copy\",\n    \"-c:v\",\n    \"copy\",\n    \"-f\",\n    \"mp4\",\n    \"-movflags\",\n    \"+frag_every_frame+empty_moov+default_base_moof\",\n    \"-min_frag_duration\",\n    \"500000\",\n    \"-metadata\",\n    \"title=front corner main\",\n    \"-reset_timestamps\",\n    \"1\",\n    \"-vsync\",\n    \"1\",\n    \"pipe:1\",\n    \"-progress\",\n    \"pipe:3\"\n]\n```\n\n/mnt/surveillance1/mp4frag/%Y/%m/%d/\n\n\"-segment_format_options\",\n    \"movflags=+faststart\",\n    \n    movflags=+faststart:\n    \n    \"-c:v\",\n    \"copy\",\n    \"-c:a\",\n    \"copy\",\n    \"-strftime\",\n    \"1\",\n    \"-strftime_mkdir\",\n    \"1\",\n    \"-segment_time\",\n    \"30\",\n    \"-segment_atclocktime\",\n    \"1\",\n    \"-reset_timestamps\",\n    \"1\",\n    \"-f\",\n    \"segment\",\n    \"-segment_format\",\n    \"hls\",\n    \"-segment_format_options\",\n    \"movflags=+faststart\",\n    \"-metadata\",\n    \"title=front corner main recording\",\n    \"/mnt/surveillance1/mp4frag_%Y_%m_%d_file-%Y%m%d-%s.mp4\",\n    \"-y\"\n    \n    \n    \n    +default_base_moof\n    \n    -----------------------\n    \n    [\n    \"-y\",\n    \"-use_wallclock_as_timestamps\",\n    \"1\",\n    \"-loglevel\",\n    \"+level+fatal\",\n    \"-nostats\",\n    \"-stimeout\",\n    \"20000000\",\n    \"-rtsp_transport\",\n    \"tcp\",\n    \"-err_detect\",\n    \"ignore_err\",\n    \"-re\",\n    \"-i\",\n    \"rtsp://admin:Purple@2026@192.168.1.32:554/cam/realmonitor?channel=1&subtype=0\",\n    \"-c:a\",\n    \"copy\",\n    \"-c:v\",\n    \"copy\",\n    \"-f\",\n    \"mp4\",\n    \"-movflags\",\n    \"+frag_keyframe+default_base_moof\",\n    \"-metadata\",\n    \"title=front corner main live\",\n    \"pipe:1\",\n    \"-progress\",\n    \"pipe:3\",\n    \"-c:v\",\n    \"copy\",\n    \"-c:a\",\n    \"copy\",\n    \"-f\",\n    \"ssegment\",\n    \"-copytb\",\n    \"1\",\n    \"-reset_timestamps\",\n    \"1\",\n    \"-segment_format\",\n    \"mp4\",\n    \"-segment_atclocktime\",\n    \"1\",\n    \"-segment_time\",\n    \"900\",\n    \"-segment_list\",\n    \"pipe:4\",\n    \"-segment_list_type\",\n    \"flat\",\n    \"-segment_list_entry_prefix\",\n    \"/media/pi/surveillance1/mp4frag/\",\n    \"-segment_list_size\",\n    \"1\",\n    \"-segment_format_options\",\n    \"movflags=+frag_keyframe+empty_moov+default_base_moof\",\n    \"-metadata\",\n    \"title=front corner main recording\",\n    \"-strftime\",\n    \"1\",\n    \"/media/pi/surveillance1/mp4frag/front_corner_main~%Y~%m~%d~%Hh.%Mm.%Ss.mp4\",\n    \"-f\",\n    \"mp4\",\n    \"-vn\",\n    \"-c:a\",\n    \"copy\",\n    \"-movflags\",\n    \"+frag_keyframe+empty_moov+default_base_moof\",\n    \"-metadata\",\n    \"title=front corner main live audio\",\n    \"-muxdelay\",\n    \"0.1\",\n    \"-max_delay\",\n    \"0.1\",\n    \"-frag_duration\",\n    \"2000000\",\n    
\"pipe:5\"\n]"},{"id":"ed279648.408e78","type":"inject","z":"6b2d261e.1e2d48","name":"stop","props":[{"p":"action","v":"{\"command\":\"stop\"}","vt":"json"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payloadType":"str","x":110,"y":280,"wires":[["2b5e30c9.9fbf6"]]},{"id":"18df6234.18f3ae","type":"mp4frag","z":"6b2d261e.1e2d48","name":"","hlsPlaylistSize":"10","hlsPlaylistExtra":"5","basePath":"id","repeated":"false","timeLimit":"5000","preBuffer":"1","autoStart":"false","x":640,"y":140,"wires":[[],["a0699adf.6c46b8"]]},{"id":"7045d885.95ea38","type":"subflow:32d61e0f.0490b2","z":"6b2d261e.1e2d48","name":"","env":[],"x":570,"y":220,"wires":[]},{"id":"bfaa7dae.a23c1","type":"subflow:1a29fa75.ef7866","z":"6b2d261e.1e2d48","name":"progress","env":[],"x":580,"y":300,"wires":[["54d8fa93.685b54"],["19fbd271.47958e"]]},{"id":"2a19bbd9.96e314","type":"inject","z":"6b2d261e.1e2d48","name":"write start","props":[{"p":"action","v":"{\"command\":\"start\",\"subject\":\"write\",\"preBuffer\":1,\"timeLimit\":3000,\"repeated\":false}","vt":"json"}],"repeat":"30","crontab":"","once":true,"onceDelay":"30","topic":"","payloadType":"str","x":350,"y":80,"wires":[["18df6234.18f3ae"]]},{"id":"a0699adf.6c46b8","type":"ffmpeg-spawn","z":"6b2d261e.1e2d48","name":"","outputs":3,"cmdPath":"","cmdArgs":"[\"-loglevel\",\"+level+fatal\",\"-nostats\",\"-f\",\"mp4\",\"-i\",\"pipe:0\",\"-vf\",\"drawtext=text='%{localtime\\\\:%a %b %d %Y %H꞉%M꞉%S}':x=(w-text_w)/2:y=(h-text_h)/2:fontsize=120:fontcolor=black:box=1:boxborderw=10:boxcolor=white@0.5\",\"-c\",\"mjpeg\",\"-f\",\"image2pipe\",\"-vframes\",\"1\",\"pipe:1\"]","cmdOutputs":2,"killSignal":"SIGTERM","x":940,"y":140,"wires":[[],["2f843587.558ada"],["4caf5ea7.8cbcd"]],"info":"[ffmpeg drawtext](https://ffmpeg.org/ffmpeg-filters.html#drawtext)\n\n[drawtext tutorial](https://ottverse.com/ffmpeg-drawtext-filter-dynamic-overlays-timecode-scrolling-text-credits/)\n\n\n[\n    \"-f\",\n    \"mp4\",\n    \"-i\",\n    \"pipe:0\",\n    \"-vf\",\n    \"drawtext=text='video playback ready':x=(w-text_w)/2:y=(h-text_h)/2:fontsize=120:fontcolor=black:box=1:boxborderw=10:boxcolor=white@0.5\",\n    \"-c\",\n    \"mjpeg\",\n    \"-f\",\n    \"image2pipe\",\n    \"-vframes\",\n    \"1\",\n    \"pipe:1\"\n]"},{"id":"54d8fa93.685b54","type":"ui_text","z":"6b2d261e.1e2d48","group":"","order":4,"width":"2","height":"1","name":"fps","label":"fps","format":"{{msg.payload}}","layout":"col-center","x":790,"y":260,"wires":[]},{"id":"19fbd271.47958e","type":"ui_text","z":"6b2d261e.1e2d48","group":"","order":3,"width":"2","height":"1","name":"kbps","label":"kbps","format":"{{msg.payload}}","layout":"col-center","x":790,"y":340,"wires":[]},{"id":"2f843587.558ada","type":"pipe2jpeg","z":"6b2d261e.1e2d48","name":"","x":1180,"y":100,"wires":[["6ef1021c.45c03c"]]},{"id":"4caf5ea7.8cbcd","type":"subflow:32d61e0f.0490b2","z":"6b2d261e.1e2d48","name":"","env":[],"x":1170,"y":180,"wires":[]},{"id":"6ef1021c.45c03c","type":"ui_template","z":"6b2d261e.1e2d48","group":"","name":"","order":5,"width":"4","height":"3","format":"<img ng-src=\"{{src}}\" ng-on-load=\"onLoad()\"/>\n\n<script>\n\n((scope) => {\n\n    scope.$watch('msg', (msg) => {\n\n        if (msg && msg.payload instanceof ArrayBuffer) {\n\n            const arrayBufferView = new Uint8Array(msg.payload);\n    \n            const blob = new window.Blob([arrayBufferView], { type: 'image/jpeg' });\n    \n            const urlCreator = window.URL || window.webkitURL;\n    \n            const objectURL = 
urlCreator.createObjectURL(blob);\n\n            scope.src = objectURL;\n\n            scope.onLoad = () => {\n\n                urlCreator.revokeObjectURL(objectURL);\n\n            }\n\n        }\n\n    });\n\n})(scope);\n</script>\n","storeOutMessages":true,"fwdInMessages":true,"resendOnRefresh":true,"templateScope":"local","x":1400,"y":100,"wires":[[]]}]

So the "-vf" defines a filtergraph, which is a chain of connected filters. And in this case it contains "fps" twice, because "fps" is also the name of an option of the "fps" filter:

-vf fps=fps=1/60
     ↑   ↑   ↑
     |   |   |
     |   |   |__ value
     |   |______ option
     |__________ filter

It is starting to make sense in my head.

But I cannot get it working. The node-red-contrib-image-output node shows me a camera image from time to time, but most of the time I unfortunately get an "undefined". Even if I ...

(screenshot: ffmpeg_framerate)

When I look at the buffer sizes that arrive, there are a lot of size differences:

(screenshot)

I would expect the size to be rather constant, because all the recorded images show more or less the same scene?

When I only display buffers with length > 20000 with this flow:

[
    {
        "id": "1c9e5395db440d48",
        "type": "inject",
        "z": "09a003ecac576bb5",
        "name": "start default",
        "props": [
            {
                "p": "action",
                "v": "{\"command\":\"start\"}",
                "vt": "json"
            }
        ],
        "repeat": "",
        "crontab": "",
        "once": false,
        "onceDelay": "1",
        "topic": "",
        "x": 370,
        "y": 1580,
        "wires": [
            [
                "3533dab8e87d80f0"
            ]
        ]
    },
    {
        "id": "3533dab8e87d80f0",
        "type": "ffmpeg-spawn",
        "z": "09a003ecac576bb5",
        "name": "Rtsp stream cam hikvision",
        "outputs": 2,
        "cmdPath": "ffmpeg",
        "cmdArgs": "[\"-loglevel\",\"quiet\",\"-rtsp_transport\",\"tcp\",\"-i\",\"<my_rtsp_url>\",\"-f\",\"image2pipe\",\"-c\",\"mjpeg\",\"-vf\",\"fps=fps=4\",\"pipe:1\"]",
        "cmdOutputs": 1,
        "killSignal": "SIGTERM",
        "x": 620,
        "y": 1600,
        "wires": [
            [],
            [
                "ccc17d59ca4f7d9e"
            ]
        ],
        "info": "ORIGINAL Kevin ffmpeg:\n------------------------\n[\n    \"-loglevel\",\n    \"error\",\n    \"-nostats\",\n    \"-f\",\n    \"hls\",\n    \"-http_multiple\",\n    \"1\",\n    \"-re\",\n    \"-i\",\n    \"https://weather-lh.akamaihd.net/i/twc_1@92006/index_1200_av-p.m3u8?sd=10&rebase=on\",\n    \"-c:v\",\n    \"copy\",\n    \"-c:a\",\n    \"aac\",\n    \"-f\",\n    \"mp4\",\n    \"-movflags\",\n    \"+frag_keyframe+empty_moov+default_base_moof\",\n    \"pipe:1\",\n    \"-progress\",\n    \"pipe:3\",\n    \"-f\",\n    \"image2pipe\",\n    \"-vf\",\n    \"select='eq(pict_type,PICT_TYPE_I)',scale=trunc(iw/4):-2\",\n    \"-vsync\",\n    \"vfr\",\n    \"pipe:4\"\n]"
    },
    {
        "id": "dc99fd8dde2910c4",
        "type": "inject",
        "z": "09a003ecac576bb5",
        "name": "stop default",
        "props": [
            {
                "p": "action",
                "v": "{\"command\":\"stop\"}",
                "vt": "json"
            }
        ],
        "repeat": "",
        "crontab": "",
        "once": false,
        "onceDelay": 0.1,
        "topic": "",
        "payloadType": "str",
        "x": 370,
        "y": 1640,
        "wires": [
            [
                "3533dab8e87d80f0"
            ]
        ]
    },
    {
        "id": "8c77927894271e70",
        "type": "image",
        "z": "09a003ecac576bb5",
        "name": "",
        "width": "320",
        "data": "payload",
        "dataType": "msg",
        "thumbnail": false,
        "active": true,
        "pass": false,
        "outputs": 0,
        "x": 1340,
        "y": 1600,
        "wires": []
    },
    {
        "id": "ccc17d59ca4f7d9e",
        "type": "function",
        "z": "09a003ecac576bb5",
        "name": "Get buffer size",
        "func": "return [msg, {payload: msg.payload.length}];",
        "outputs": 2,
        "noerr": 0,
        "initialize": "",
        "finalize": "",
        "libs": [],
        "x": 900,
        "y": 1600,
        "wires": [
            [
                "2fce4b4ac529e197"
            ],
            [
                "faa00215c4bcc48a"
            ]
        ],
        "outputLabels": [
            "Original msg",
            "Buffer length"
        ]
    },
    {
        "id": "faa00215c4bcc48a",
        "type": "debug",
        "z": "09a003ecac576bb5",
        "name": "Buffer length",
        "active": true,
        "tosidebar": true,
        "console": false,
        "tostatus": false,
        "complete": "payload",
        "targetType": "msg",
        "statusVal": "",
        "statusType": "auto",
        "x": 1090,
        "y": 1680,
        "wires": []
    },
    {
        "id": "2fce4b4ac529e197",
        "type": "switch",
        "z": "09a003ecac576bb5",
        "name": "Buffer size",
        "property": "payload.length",
        "propertyType": "msg",
        "rules": [
            {
                "t": "gt",
                "v": "20000",
                "vt": "num"
            },
            {
                "t": "else"
            }
        ],
        "checkall": "true",
        "repair": false,
        "outputs": 2,
        "x": 1110,
        "y": 1600,
        "wires": [
            [
                "8c77927894271e70"
            ],
            []
        ]
    }
]

Then the resulting effect is even more strange:

(screenshot: ffmpeg_larger_buffers)

Do you have any idea what could cause this kind of behaviour?

About the incomplete images received via rtsp: I found a solution/workaround here, posted by a rather clever dude :wink:

When I lower the resolution in my camera's web interface:

(screenshot)

Then the problem is solved, and I get complete images...

It is a pity that it is not possible to get a full resolution image this way...
And I was wondering: is it required to change the resolution in the web interface, or can I also pass the resolution in my ffmpeg rtsp command?

Perhaps then this isn't the best way to capture snapshot images.

Yes, the jpeg files are not complete. Message !== jpeg, either. Jpegs that are bigger than the pipe's buffer limit will be broken into chunks, the same as the mp4 segments. You have to catch the pieces and re-assemble them.

Nah, we don't have to settle for that. That is why pipe2jpeg was created, and years later node-red-contrib-pipe2jpeg. It can handle all sizes of jpegs. Just plug that node after the ffmpeg node's jpeg buffer output.
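Conceptually, the re-assembly comes down to collecting chunks until a complete jpeg (starting with the FF D8 marker and ending with FF D9) has been received. A deliberately simplified function-node sketch, just to illustrate the idea - the real node handles split markers and back-to-back images far more robustly:

// Accumulate incoming chunks and emit them as one buffer once a complete
// jpeg (SOI marker at the start, EOI marker at the end) has arrived.
const chunks = context.get('chunks') || [];
chunks.push(msg.payload);

const joined = Buffer.concat(chunks);
const complete = joined.length > 3 &&
    joined[0] === 0xff && joined[1] === 0xd8 &&
    joined[joined.length - 2] === 0xff && joined[joined.length - 1] === 0xd9;

if (complete) {
    context.set('chunks', []);
    return { payload: joined };
}

context.set('chunks', chunks);
return null;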

Both are acceptable, but you will have to experiment to see what works best for your system.

Personally, I use both video streams coming from each of my cams, main and sub. I would likely choose the sub stream and use that for creating jpegs. For example, if you have an input with a resolution of 1080p and want to reduce it to 1/4 of the size for a jpeg, then there will be an extra cost for the rescaling.

For scaling with ffmpeg, we can extend the -vf filter as in the following examples.

dynamic scaling to 75 percent of the original input width, keeping the width and height divisible by 2 and preserving the aspect ratio:
-vf scale=trunc(iw*0.75/2)*2:-2

fixed size scaling:
-vf scale=400:300
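Filters can also be chained in a single -vf, separated by commas. For example (values picked arbitrarily), one jpeg every 2 seconds scaled to 75 percent of the input width:
-vf fps=fps=1/2,scale=trunc(iw*0.75/2)*2:-2

In the ffmpeg-spawn cmdArgs the whole chain goes in as one string after "-vf".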

try this flow to create a jpeg once per 30 seconds using mp4frag and a 2nd ffmpeg-spawn:

You can see in the screenshot that there have been 412 segments with durations of approximately 1 second each, so 412/30 ~ 14 jpegs created. This uses a lot less cpu than having the 1st ffmpeg output a jpeg once per 30 seconds.

[{"id":"1a29fa75.ef7866","type":"subflow","name":"progress","info":"","category":"","in":[{"x":60,"y":100,"wires":[{"id":"1370adb2.08e592"}]}],"out":[{"x":340,"y":100,"wires":[{"id":"1370adb2.08e592","port":1}]},{"x":340,"y":160,"wires":[{"id":"1370adb2.08e592","port":2}]}],"env":[],"color":"#DDAA99","status":{"x":340,"y":40,"wires":[{"id":"1370adb2.08e592","port":0}]}},{"id":"1370adb2.08e592","type":"function","z":"1a29fa75.ef7866","name":"progress","func":"const props = msg.payload.toString().split('\\n');\n\nconst progress = {};\n\nprops.forEach(item => {\n    \n    const [name, value] = item.split('=');\n    \n    if (name && value) {\n    \n        progress[name] = value;\n\n    }\n    \n});\n\n//node.warn(props);\n\nconst fps = progress['fps'] || '0';\n\nconst bitrate = progress['bitrate'] || '0';\n\nconst kbps = bitrate.replace('kbits/s', '');\n\nconst color = progress['progress'] === 'continue' ? 'green' : 'red';\n\nconst text = `fps: ${fps}, kbps: ${kbps}`;\n\nnode.send([{ payload: { fill: color, shape: 'dot', text } }, { payload: fps }, { payload: kbps } ]);","outputs":3,"noerr":0,"initialize":"","finalize":"","x":200,"y":100,"wires":[[],[],[]]},{"id":"32d61e0f.0490b2","type":"subflow","name":"stderr","info":"","category":"","in":[{"x":60,"y":80,"wires":[{"id":"211d46bf.a79c7a"}]}],"out":[],"env":[],"color":"#DDAA99","status":{"x":320,"y":120,"wires":[{"id":"211d46bf.a79c7a","port":1}]}},{"id":"211d46bf.a79c7a","type":"function","z":"32d61e0f.0490b2","name":"stderr","func":"const stderr = msg.payload.toString().split('\\n');\n\nnode.send([{ stderr }, { payload: {fill: 'red', text: `${new Date().toString()}` } } ]);","outputs":2,"noerr":0,"initialize":"","finalize":"","x":190,"y":73,"wires":[["c6e5408.f19c4c"],[]]},{"id":"c6e5408.f19c4c","type":"debug","z":"32d61e0f.0490b2","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"stderr","targetType":"msg","statusVal":"","statusType":"auto","x":370,"y":60,"wires":[]},{"id":"b4ccc97e.e6a588","type":"inject","z":"4d9df903.2567d8","name":"start","props":[{"p":"action","v":"{\"command\":\"start\"}","vt":"json"}],"repeat":"","crontab":"","once":true,"onceDelay":"5","topic":"","payloadType":"str","x":150,"y":280,"wires":[["dc48904a.5770d"]]},{"id":"dc48904a.5770d","type":"ffmpeg-spawn","z":"4d9df903.2567d8","name":"","outputs":4,"cmdPath":"","cmdArgs":"[\"-loglevel\",\"+level+fatal\",\"-nostats\",\"-rtsp_transport\",\"tcp\",\"-i\",\"rtsp://...\",\"-f\",\"mp4\",\"-c:v\",\"copy\",\"-c:a\",\"copy\",\"-movflags\",\"+frag_keyframe+empty_moov+default_base_moof\",\"-metadata\",\"title=hello bart\",\"pipe:1\",\"-progress\",\"pipe:3\"]","cmdOutputs":3,"killSignal":"SIGTERM","x":380,"y":320,"wires":[["b4b70cb6.e18c6"],["b4b70cb6.e18c6"],["3e8cc973.e00ce6"],["38130165.18f05e"]],"info":"ffmpeg -loglevel quiet -rtsp_transport tcp -i rtsp://192.168.1.18:554/user=admin&password=pass&channel=2&stream=1.sdp -reset_timestamps 1 -an -c:v copy -f mp4 -movflags +frag_keyframe+empty_moov+default_base_moof pipe:1\n\n```\n[\n    \"-loglevel\",\n    \"error\",\n    \"-nostats\",\n    \"-rtsp_transport\",\n    \"+tcp+http+udp+udp_multicast\",\n    \"-rtsp_flags\",\n    \"+prefer_tcp\",\n    \"-i\",\n    \"rtsp://192.168.1.18:554/user=admin&password=pass&channel=1&stream=0.sdp\",\n    \"-reset_timestamps\",\n    \"1\",\n    \"-muxdelay\",\n    \"0.1\",\n    \"-an\",\n    \"-c:v\",\n    \"copy\",\n    \"-f\",\n    \"mp4\",\n    \"-movflags\",\n    \"+frag_every_frame+empty_moov+default_base_moof\",\n    \"-min_frag_duration\",\n    
\"500000\",\n    \"-metadata\",\n    \"title=garage 1 main\",\n    \"-reset_timestamps\",\n    \"1\",\n    \"-vsync\",\n    \"1\",\n    \"pipe:1\"\n]\n```\n\n```\n[\n    \"-loglevel\",\n    \"fatal\",\n    \"-nostats\",\n    \"-stimeout\",\n    \"20000000\",\n    \"-rtsp_transport\",\n    \"tcp\",\n    \"-i\",\n    \"rtsp://admin:Purple@2026@192.168.1.174:554/cam/realmonitor?channel=1&subtype=0\",\n    \"-reset_timestamps\",\n    \"1\",\n    \"-muxdelay\",\n    \"0.1\",\n    \"-c:a\",\n    \"copy\",\n    \"-c:v\",\n    \"copy\",\n    \"-f\",\n    \"mp4\",\n    \"-movflags\",\n    \"+frag_every_frame+empty_moov+default_base_moof\",\n    \"-min_frag_duration\",\n    \"500000\",\n    \"-metadata\",\n    \"title=front corner main\",\n    \"-reset_timestamps\",\n    \"1\",\n    \"-vsync\",\n    \"1\",\n    \"pipe:1\",\n    \"-progress\",\n    \"pipe:3\"\n]\n```\n\n/mnt/surveillance1/mp4frag/%Y/%m/%d/\n\n\"-segment_format_options\",\n    \"movflags=+faststart\",\n    \n    movflags=+faststart:\n    \n    \"-c:v\",\n    \"copy\",\n    \"-c:a\",\n    \"copy\",\n    \"-strftime\",\n    \"1\",\n    \"-strftime_mkdir\",\n    \"1\",\n    \"-segment_time\",\n    \"30\",\n    \"-segment_atclocktime\",\n    \"1\",\n    \"-reset_timestamps\",\n    \"1\",\n    \"-f\",\n    \"segment\",\n    \"-segment_format\",\n    \"hls\",\n    \"-segment_format_options\",\n    \"movflags=+faststart\",\n    \"-metadata\",\n    \"title=front corner main recording\",\n    \"/mnt/surveillance1/mp4frag_%Y_%m_%d_file-%Y%m%d-%s.mp4\",\n    \"-y\"\n    \n    \n    \n    +default_base_moof\n    \n    -----------------------\n    \n    [\n    \"-y\",\n    \"-use_wallclock_as_timestamps\",\n    \"1\",\n    \"-loglevel\",\n    \"+level+fatal\",\n    \"-nostats\",\n    \"-stimeout\",\n    \"20000000\",\n    \"-rtsp_transport\",\n    \"tcp\",\n    \"-err_detect\",\n    \"ignore_err\",\n    \"-re\",\n    \"-i\",\n    \"rtsp://admin:Purple@2026@192.168.1.32:554/cam/realmonitor?channel=1&subtype=0\",\n    \"-c:a\",\n    \"copy\",\n    \"-c:v\",\n    \"copy\",\n    \"-f\",\n    \"mp4\",\n    \"-movflags\",\n    \"+frag_keyframe+default_base_moof\",\n    \"-metadata\",\n    \"title=front corner main live\",\n    \"pipe:1\",\n    \"-progress\",\n    \"pipe:3\",\n    \"-c:v\",\n    \"copy\",\n    \"-c:a\",\n    \"copy\",\n    \"-f\",\n    \"ssegment\",\n    \"-copytb\",\n    \"1\",\n    \"-reset_timestamps\",\n    \"1\",\n    \"-segment_format\",\n    \"mp4\",\n    \"-segment_atclocktime\",\n    \"1\",\n    \"-segment_time\",\n    \"900\",\n    \"-segment_list\",\n    \"pipe:4\",\n    \"-segment_list_type\",\n    \"flat\",\n    \"-segment_list_entry_prefix\",\n    \"/media/pi/surveillance1/mp4frag/\",\n    \"-segment_list_size\",\n    \"1\",\n    \"-segment_format_options\",\n    \"movflags=+frag_keyframe+empty_moov+default_base_moof\",\n    \"-metadata\",\n    \"title=front corner main recording\",\n    \"-strftime\",\n    \"1\",\n    \"/media/pi/surveillance1/mp4frag/front_corner_main~%Y~%m~%d~%Hh.%Mm.%Ss.mp4\",\n    \"-f\",\n    \"mp4\",\n    \"-vn\",\n    \"-c:a\",\n    \"copy\",\n    \"-movflags\",\n    \"+frag_keyframe+empty_moov+default_base_moof\",\n    \"-metadata\",\n    \"title=front corner main live audio\",\n    \"-muxdelay\",\n    \"0.1\",\n    \"-max_delay\",\n    \"0.1\",\n    \"-frag_duration\",\n    \"2000000\",\n    
\"pipe:5\"\n]"},{"id":"b4b70cb6.e18c6","type":"mp4frag","z":"4d9df903.2567d8","name":"","hlsPlaylistSize":"10","hlsPlaylistExtra":"5","basePath":"id","repeated":"false","timeLimit":"3000","preBuffer":"1","autoStart":"true","x":680,"y":220,"wires":[[],["522f5cb.734b5a4"]]},{"id":"3e8cc973.e00ce6","type":"subflow:32d61e0f.0490b2","z":"4d9df903.2567d8","name":"","env":[],"x":610,"y":300,"wires":[]},{"id":"38130165.18f05e","type":"subflow:1a29fa75.ef7866","z":"4d9df903.2567d8","name":"progress","env":[],"x":620,"y":380,"wires":[["21b001fa.1ca22e"],["5065e4ea.9d9f4c"]]},{"id":"33b4cab7.2d9b16","type":"inject","z":"4d9df903.2567d8","name":"stop","props":[{"p":"action","v":"{\"command\":\"stop\"}","vt":"json"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payloadType":"str","x":150,"y":360,"wires":[["dc48904a.5770d"]]},{"id":"522f5cb.734b5a4","type":"ffmpeg-spawn","z":"4d9df903.2567d8","name":"","outputs":3,"cmdPath":"","cmdArgs":"[\"-loglevel\",\"+level+fatal\",\"-nostats\",\"-f\",\"mp4\",\"-i\",\"pipe:0\",\"-c\",\"mjpeg\",\"-f\",\"image2pipe\",\"-vframes\",\"1\",\"-vf\",\"scale=trunc(iw*0.75/2)*2:-2\",\"pipe:1\"]","cmdOutputs":2,"killSignal":"SIGTERM","x":980,"y":220,"wires":[[],["cd27717a.1831b"],["4811dbab.5def74"]],"info":"[ffmpeg drawtext](https://ffmpeg.org/ffmpeg-filters.html#drawtext)\n\n[drawtext tutorial](https://ottverse.com/ffmpeg-drawtext-filter-dynamic-overlays-timecode-scrolling-text-credits/)\n\n\n[\n    \"-f\",\n    \"mp4\",\n    \"-i\",\n    \"pipe:0\",\n    \"-vf\",\n    \"drawtext=text='video playback ready':x=(w-text_w)/2:y=(h-text_h)/2:fontsize=120:fontcolor=black:box=1:boxborderw=10:boxcolor=white@0.5\",\n    \"-c\",\n    \"mjpeg\",\n    \"-f\",\n    \"image2pipe\",\n    \"-vframes\",\n    \"1\",\n    \"pipe:1\"\n]"},{"id":"d80db3bd.4ce42","type":"inject","z":"4d9df903.2567d8","name":"write start","props":[{"p":"action","v":"{\"command\":\"start\",\"subject\":\"write\",\"preBuffer\":1,\"timeLimit\":3000,\"repeated\":false}","vt":"json"}],"repeat":"30","crontab":"","once":false,"onceDelay":"30","topic":"","payloadType":"str","x":390,"y":160,"wires":[["b4b70cb6.e18c6"]]},{"id":"21b001fa.1ca22e","type":"ui_text","z":"4d9df903.2567d8","group":"","order":4,"width":"2","height":"1","name":"fps","label":"fps","format":"{{msg.payload}}","layout":"col-center","x":830,"y":340,"wires":[]},{"id":"5065e4ea.9d9f4c","type":"ui_text","z":"4d9df903.2567d8","group":"","order":3,"width":"2","height":"1","name":"kbps","label":"kbps","format":"{{msg.payload}}","layout":"col-center","x":830,"y":420,"wires":[]},{"id":"cd27717a.1831b","type":"pipe2jpeg","z":"4d9df903.2567d8","name":"","x":1220,"y":180,"wires":[["4b40a99d.a142d8"]]},{"id":"4811dbab.5def74","type":"subflow:32d61e0f.0490b2","z":"4d9df903.2567d8","name":"","env":[],"x":1210,"y":260,"wires":[]},{"id":"4b40a99d.a142d8","type":"ui_template","z":"4d9df903.2567d8","group":"","name":"","order":5,"width":"4","height":"3","format":"<img ng-src=\"{{src}}\" ng-on-load=\"onLoad()\"/>\n\n<script>\n\n((scope) => {\n\n    scope.$watch('msg', (msg) => {\n\n        if (msg && msg.payload instanceof ArrayBuffer) {\n\n            const arrayBufferView = new Uint8Array(msg.payload);\n    \n            const blob = new window.Blob([arrayBufferView], { type: 'image/jpeg' });\n    \n            const urlCreator = window.URL || window.webkitURL;\n    \n            const objectURL = urlCreator.createObjectURL(blob);\n\n            scope.src = objectURL;\n\n            scope.onLoad = () => {\n\n                
urlCreator.revokeObjectURL(objectURL);\n\n            }\n\n        }\n\n    });\n\n})(scope);\n</script>\n","storeOutMessages":true,"fwdInMessages":true,"resendOnRefresh":true,"templateScope":"local","x":1440,"y":180,"wires":[[]]}]

p.s. I think I will build the interval output option into mp4frag to remove the need for the extra inject node. Not sure when that will happen.

1 Like

Awesome! That indeed works fine even at the highest resolution:

[
    {
        "id": "1c9e5395db440d48",
        "type": "inject",
        "z": "09a003ecac576bb5",
        "name": "start default",
        "props": [
            {
                "p": "action",
                "v": "{\"command\":\"start\"}",
                "vt": "json"
            }
        ],
        "repeat": "",
        "crontab": "",
        "once": false,
        "onceDelay": "1",
        "topic": "",
        "x": 370,
        "y": 1580,
        "wires": [
            [
                "3533dab8e87d80f0"
            ]
        ]
    },
    {
        "id": "3533dab8e87d80f0",
        "type": "ffmpeg-spawn",
        "z": "09a003ecac576bb5",
        "name": "Rtsp stream cam hikvision",
        "outputs": 2,
        "cmdPath": "ffmpeg",
        "cmdArgs": "[\"-loglevel\",\"quiet\",\"-rtsp_transport\",\"tcp\",\"-i\",\"<your_rtsp_url>\",\"-f\",\"image2pipe\",\"-c\",\"mjpeg\",\"-vf\",\"fps=fps=4\",\"pipe:1\"]",
        "cmdOutputs": 1,
        "killSignal": "SIGTERM",
        "x": 620,
        "y": 1600,
        "wires": [
            [],
            [
                "f5d45781ec2de11f"
            ]
        ],
        "info": "ORIGINAL Kevin ffmpeg:\n------------------------\n[\n    \"-loglevel\",\n    \"error\",\n    \"-nostats\",\n    \"-f\",\n    \"hls\",\n    \"-http_multiple\",\n    \"1\",\n    \"-re\",\n    \"-i\",\n    \"https://weather-lh.akamaihd.net/i/twc_1@92006/index_1200_av-p.m3u8?sd=10&rebase=on\",\n    \"-c:v\",\n    \"copy\",\n    \"-c:a\",\n    \"aac\",\n    \"-f\",\n    \"mp4\",\n    \"-movflags\",\n    \"+frag_keyframe+empty_moov+default_base_moof\",\n    \"pipe:1\",\n    \"-progress\",\n    \"pipe:3\",\n    \"-f\",\n    \"image2pipe\",\n    \"-vf\",\n    \"select='eq(pict_type,PICT_TYPE_I)',scale=trunc(iw/4):-2\",\n    \"-vsync\",\n    \"vfr\",\n    \"pipe:4\"\n]"
    },
    {
        "id": "dc99fd8dde2910c4",
        "type": "inject",
        "z": "09a003ecac576bb5",
        "name": "stop default",
        "props": [
            {
                "p": "action",
                "v": "{\"command\":\"stop\"}",
                "vt": "json"
            }
        ],
        "repeat": "",
        "crontab": "",
        "once": false,
        "onceDelay": 0.1,
        "topic": "",
        "payloadType": "str",
        "x": 370,
        "y": 1640,
        "wires": [
            [
                "3533dab8e87d80f0"
            ]
        ]
    },
    {
        "id": "8c77927894271e70",
        "type": "image",
        "z": "09a003ecac576bb5",
        "name": "",
        "width": "320",
        "data": "payload",
        "dataType": "msg",
        "thumbnail": false,
        "active": true,
        "pass": false,
        "outputs": 0,
        "x": 1060,
        "y": 1600,
        "wires": []
    },
    {
        "id": "f5d45781ec2de11f",
        "type": "pipe2jpeg",
        "z": "09a003ecac576bb5",
        "name": "",
        "x": 860,
        "y": 1600,
        "wires": [
            [
                "8c77927894271e70"
            ]
        ]
    }
]

I have some questions about this flow:

  1. The first ffmpeg command contains "-f mp4": is that because the camera uses H264? I think I'm mixing up things now ...

  2. The first ffmpeg command also contains "pipe:1" and "pipe:3" but not "pipe:2". Is "pipe:2" stderr, and is that why it isn't used?

  3. It would be nice if you could explain in a bit more detail what the parameters of the mp4frag node (playlist size, pre-buffer and time limit) are used for, to get the (technical) picture. And whether there are best practices for those values.

  4. So this flow creates a jpeg once every 30 seconds. I assume the "-vframes 1" does the magic, by specifying that we want a single output image per segment? But does this mean that the segments have a length of 30 seconds?
    [EDIT] Now I see that the Inject node sends an input message every 30 seconds, and that input message contains a time limit of 3000. I will have to wait for your answer on question 3, and then hopefully I can understand why the value of 3000 is specified...

  5. In the second ffmpeg node there is a "-c mjpeg", which means the MJPEG codec. Is that required to get separate images?

I'm feeling bad having to ask you all those questions...
But others might have similar doubts, and can learn from this discussion.

1 Like

The stream of individual images is VERY useful for image processing (face recognition, license plate recognition). But of course when you need only a snapshot image from time to time (e.g. with 20 seconds in between), a simple snapshot image (via an HTTP GET) could also do the job:

(screenshot)

Although my snapshot image contains lots of blocks:

(screenshot)

I will need to play with the snapshot settings. I had a quick look and I see in the Hikvision documentation that "HTTP commands get stream only be available under Sub stream", so I assume I need to play with the sub-stream settings... That is quite different from my old Panasonic cameras, where I could specify the settings (resolution ...) inside the snapshot url.

I am now wondering what is best: capturing images from an rtsp stream, or streaming images via an MJPEG stream (via ffmpeg or my node-red-contrib-multipart-stream-decoder). I assume that the network bandwidth of mjpeg will be higher compared to the rtsp stream, but that the Node-RED server will consume more cpu for rtsp. Although when transcoding can be skipped, that might perhaps not be true...

We are taking your h264 encoded video that is inside the rtsp stream and muxing it into an mp4 container so that it is playable in the browser.

ffmpeg does its logging to stderr (stdio[2]). You can use pipe:2 to output video or images, but you must first tell ffmpeg not to output any logging by setting -loglevel quiet.

-vframes 1 tells ffmpeg to exit after it outputs 1 frame. The 3000 milliseconds is to make sure that the ffmpeg process has enough video to process to be able to create the jpeg.

We are using that to be explicit in setting the codec when using the image2pipe muxer. It defaults to mjpeg if not specified, but you could also use other codecs for other image types.
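For example (not something used in the flows above), swapping the codec while keeping the image2pipe muxer would give you a png instead of a jpeg, at the cost of a larger buffer:

[
    "-f",
    "mp4",
    "-i",
    "pipe:0",
    "-c:v",
    "png",
    "-f",
    "image2pipe",
    "-vframes",
    "1",
    "pipe:1"
]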

Bart,

I think you can specify the stream with the HTTP GET command; the first digit is the camera, the second is the stream, e.g. 1 or 2.
I also plan to use this for alarm monitoring from my DVR -
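For what it is worth: on many Hikvision models the snapshot URL follows that pattern through ISAPI, e.g. http://<camera-ip>/ISAPI/Streaming/channels/102/picture for camera 1, sub stream. Treat that as a hint rather than gospel and check your model's ISAPI documentation, since the channel numbering can differ per firmware.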

Hi @smcgann99,
That is indeed a very nice set of nodes!
But I already have a few different brands of cameras now, and as you can see above there are already some good experiences with Amcrest cams. So I will need to expand my Onvif node set a bit more in the near future to make them a bit more user friendly. By using Onvif, my PTZ and event streams and ... should work on all cameras using identical Node-RED flows.

But this discussion is more focussed on using the mp4 related nodes from @kevinGodell. I re-read the storing-video-as-mp4 discussion last week, but need to let it digest slowly. It seems there are different ways to store video, but I need a fresh brain to get the entire 'picture'. I would like to create an overview, with pros and cons of each way of working...

1 Like
