Video flickering when showing two mp4frag streams

Yes indeed, imho it is a bit ridiculous to send such high-def images to the browser, since you watch them on a very small screen. Moreover, things like the timestamp are MUCH more clearly readable with the lowest resolution.

So based on this, I would like to show the low-resolution stream in my browser, especially if I want to show a grid of NxM cameras on my screen.

Which means, like you already said before, using a sub-stream with lower resolution for viewing, and a main-stream with high resolution for recording.
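Just to make that two-stream idea concrete, this is roughly what I have in mind (a minimal sketch in plain Node.js/TypeScript, outside Node-RED; the RTSP URLs are just placeholders for my camera's main and sub stream, and the ffmpeg flags are the usual fragmented-MP4 remux options, not something you prescribed):

```ts
// Minimal sketch: one ffmpeg process per camera stream, both remuxed (no
// re-encoding) to fragmented MP4, which is what an mp4frag consumer expects.
import { spawn, ChildProcess } from "node:child_process";

const MAIN_URL = "rtsp://cam.local:554/stream1"; // placeholder: high-res main stream (recording)
const SUB_URL  = "rtsp://cam.local:554/stream2"; // placeholder: low-res sub-stream (viewing)

function startFfmpeg(rtspUrl: string): ChildProcess {
  const args = [
    "-rtsp_transport", "tcp",
    "-i", rtspUrl,
    "-c", "copy",                 // copy the codec data, so CPU load stays minimal
    "-f", "mp4",
    "-movflags", "+frag_keyframe+empty_moov+default_base_moof",
    "pipe:1",                     // fragmented MP4 on stdout
  ];
  return spawn("ffmpeg", args, { stdio: ["ignore", "pipe", "ignore"] });
}

const mainProc = startFfmpeg(MAIN_URL); // stdout would feed the "recording" branch
const subProc  = startFfmpeg(SUB_URL);  // stdout would feed the "live viewing" branch
```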

But then I must have misinterpreted something in our other discussion last week. You said somewhere that the buffer output had been added to your mp4frag node, in case you want to do "extra" stuff with your segments (besides live viewing). But typically for recording you want high-resolution and high-frame-rate images, which is not the case for live viewing.

So you can't use 1 rtsp stream and 1 mp4frag node for both recording and live viewing, which means you need 1 mp4frag node for viewing and one for recording. Is that correct?

If so, that would be a pity, because I already use the sub-stream for mjpeg streaming, which allows me to get the separate images on my Raspberry Pi (without cpu-intensive decoding). Then I would need to choose between:

  • A high-res rtsp stream (for recording) and a low-res rtsp stream (for viewing)
  • A high-res rtsp stream (for recording) and a low-res mjpeg stream (for object detection)

The reason for the mjpeg stream is that I wasn't able to extract the I-frames from the rtsp stream, like you advised some time ago. I never got rid of that damn high cpu usage while doing that (see here).
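For completeness, this is roughly the kind of I-frame extraction I understood from that advice (a sketch with a placeholder RTSP URL; I may well have had different flags). As far as I understand, even with `-skip_frame nokey` the keyframes still get decoded and re-encoded to JPEG, so some CPU usage is unavoidable:

```ts
// Sketch: grab only the I-frames from an RTSP stream as JPEG images.
// "-skip_frame nokey" tells the decoder to skip everything except keyframes,
// but those keyframes are still decoded and re-encoded to mjpeg.
import { spawn } from "node:child_process";

const RTSP_URL = "rtsp://cam.local:554/stream1"; // placeholder

const iframeProc = spawn("ffmpeg", [
  "-skip_frame", "nokey",        // decode only keyframes (I-frames)
  "-rtsp_transport", "tcp",
  "-i", RTSP_URL,
  "-vsync", "vfr",               // keep only the frames that were actually decoded
  "-c:v", "mjpeg",
  "-q:v", "5",
  "-f", "image2pipe",
  "pipe:1",                      // one JPEG per I-frame on stdout
], { stdio: ["ignore", "pipe", "ignore"] });

iframeProc.stdout.on("data", (chunk: Buffer) => {
  // chunk contains (parts of) JPEG images; a real consumer would split on the
  // JPEG SOI/EOI markers before handing frames to object detection
});
```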

Ok, so due to the high resolution, the hls.js buffer fills up with only a few large segments, while the hls.js player needs a longer series of smaller segments in its buffer to play without gaps. Is that correct?
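To make my question a bit more concrete, these are the hls.js buffer settings I would be looking at (a sketch only: the playlist path and element id are placeholders, and the values are purely illustrative, not recommendations):

```ts
// Sketch: hls.js buffers forward media up to maxBufferLength seconds, but also
// caps the buffer at maxBufferSize bytes, so large high-res segments hit the
// byte cap after only a few segments.
import Hls from "hls.js";

const video = document.getElementById("cam1") as HTMLVideoElement; // placeholder element

if (Hls.isSupported()) {
  const hls = new Hls({
    maxBufferLength: 10,             // target forward buffer, in seconds of media
    maxBufferSize: 20 * 1000 * 1000, // ...but capped in bytes
    liveSyncDurationCount: 3,        // how many segments behind the live edge to play
  });
  hls.loadSource("/mp4frag/cam1/hls.m3u8"); // placeholder playlist path
  hls.attachMedia(video);
}
```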