Video flickering when showing two mp4frag streams

Hi folks,

Started experimenting with the mp4frag nodes from our video guru @kevinGodell.

I did some successful tests:

  1. Starting a single rtsp stream from camera 1 works fine, with low CPU usage on my Raspberry Pi.
  2. Live viewing and recording such a stream works fine.
  3. Starting a second rtsp stream from camera 2 works fine. Running two streams simultaneously and still not much cpu usage. Awesome.
  4. Live viewing of this camera still works fine.
  5. However, when I view the two live streams simultaneously (with the "hls.js preferred" option in the ui_mp4frag node), both streams start flickering:
    [animated screenshot of the flickering]

Even when I stop the second stream, the first stream keeps flickering until I redeploy my flow.

Does anybody have some tips on how to troubleshoot this? Looking at the mp4frag cheat sheet, I am not sure where to start searching. I had enabled the hls.js debug:true setting, but the console log then fills up with messages that are cryptic (for me at least).
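
For reference, here is a minimal sketch of where that debug flag lives in plain hls.js. The playlist path is a made-up example, and ui_mp4frag normally wires all of this up for you:

```javascript
// Minimal hls.js setup with verbose logging, assuming a <video id="cam1">
// element and a hypothetical playlist URL served by the mp4frag node.
if (Hls.isSupported()) {
  const hls = new Hls({ debug: true });      // logs every loader/buffer event
  hls.loadSource('/mp4frag/cam1/hls.m3u8');  // example path, adjust to your flow
  hls.attachMedia(document.getElementById('cam1'));
}
```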

Not sure if it is due to my resolution or fps, which might be too much for hls.js?

But I need them that high because I use the same segments for recording...

Thanks!

Bart


When I set the main stream on both cameras to the lowest resolution, the images are still flickering. So that is not the root cause.

What I also find weird:

  1. The flickering continues even if I stop the stream. It only stops when I restart my flow (by deploying something).

  2. The timestamp is very hard to read when I have two streams active in my dashboard, while it is much clearer when I have only one stream active.

At the moment I don't have any clue what could cause this.

The flickering seems to happen only in Chrome (on my Windows 10 laptop). If I use Edge on that same laptop to view my dashboard (with the minimum cam resolution), then NO flickering...

So I have set both my cameras back to the maximum resolution. The flickering still doesn't occur in Edge, but every few seconds a round progress bar appears, and meanwhile the stream visually halts for about a second.

So that is another issue...

I have seen that flickering before. I can't remember when, or which browser or operating system. Realistically, these high-def videos are meant to be played only one at a time, but we abuse our systems and try to have the browser decode multiple streams using the gpu or cpu. I can't remember what to do with the info right now, but there are the super secret chrome urls such as chrome://gpu/ that allow you to get a sneak peek at debugging info.

Also, good luck with a bitrate of 8192. That is a lot to try to get to your browser. My high-def cams are set to about 3000kb/s and work pretty well. You will possibly have to lower the bitrate to something your server and/or browser can handle.

If using mp4frag, you can try to increase the size and extra settings, which may help hlsjs by making more media segments available. Also, read up on the settings of hlsjs; there is probably something that you can change, increase, or decrease to help it.
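
As a hedged starting point: the option names below are real hls.js config keys, but the values are guesses to experiment with, not known-good numbers for this setup:

```javascript
// Possible hls.js knobs for a live stream that keeps stalling or flickering.
const hls = new Hls({
  liveSyncDurationCount: 3,        // how many segments to stay behind the live edge
  liveMaxLatencyDurationCount: 6,  // tolerated drift before hls.js seeks forward
  maxBufferLength: 10,             // target forward buffer, in seconds
  maxBufferSize: 30 * 1000 * 1000  // cap on forward buffer memory, in bytes
});
```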

That is normal and part of the native video player in the browser. Usually, it means the player was starved and wasn't fed enough video buffer to keep playing. That makes sense if you are using a high bitrate and perhaps playing back on a less powerful system. Real world for me: my mac struggles to play more than one video at a time, but my wife can load all 11 cams (main) on the screen at a single time and they all play perfectly. The spinning icon can be hidden if you choose to hide the native video player controls.
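
Hiding the controls is just the standard HTMLMediaElement attribute. A sketch, assuming you have direct access to the video element (ui_mp4frag gives you a choice for this in its config):

```javascript
// Remove the native controls, which also hides the built-in loading spinner.
const video = document.querySelector('video');
video.removeAttribute('controls'); // or simply omit the attribute in the markup
```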

Yes indeed, imho it is a bit ridiculous to send such high-def images to the browser, since you watch them on a very small screen. Moreover, things (like e.g. the timestamp) are MUCH more clearly readable at the lowest resolution.

So based on this, I would like to show the low-resolution stream in my browser, especially if I want to show a grid of NxM cameras on my screen.

Which means - like you already said before - using a sub-stream with lower resolution for viewing, and a main-stream with high resolution for recording.

But then I misinterpreted something in our other discussion last week. You said somewhere that the buffer output had been added to your mp4frag node, in case you want to do "extra" stuff with your segments (besides live viewing). But typically for recording you want high resolution and high frame rate, which is not what you need for live viewing.

So you can't use one rtsp stream and one mp4frag node for both recording and live viewing, which means you need one mp4frag node for viewing and one for recording. Is that correct?

If so, that would be a pity, because I already use the sub-stream for mjpeg streams, which allows me to get the separate images on my Raspberry Pi (without cpu-intensive decoding). Then I would need to decide between:

  • A high-res rtsp stream (for recording) and a low-res rtsp stream (for viewing)
  • A high-res rtsp stream (for recording) and a low-res mjpeg stream (for object detection)

The reason for the mjpeg stream is that I wasn't able to extract the I-frames from the rtsp stream, like you advised some time ago. I never got rid of that damn high cpu usage while doing that (see here).

Ok, so due to the high-resolution images, the hls.js buffer is filled by just a few large segments, while the hls.js player needs a longer series of smaller segments in its buffer to play without gaps. Is that correct?

No, you must adjust your settings so that they can be handled by your system. I am able to view my main stream and record it at the same time. But I have my bitrate set to 3000, so it is manageable. Just keeping 11 cams recording 24/7 on file for 2 weeks uses up ~4TB of disk space. I suppose if I used a higher bitrate, I would need much more space. The bitrate really affects quality. You should lower it a bit until you reach your least tolerable level and see how it works for you. Or get a bigger server.
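
For a rough sense of how bitrate drives that number, here is a back-of-the-envelope calculation. It assumes the configured 3000 kbit/s is sustained constantly, which real encoders rarely do, so it lands a bit above the ~4TB observed:

```javascript
// Storage estimate for continuous recording: bitrate * time * camera count.
const kbitPerSec = 3000; // configured video bitrate per camera
const cams = 11;
const days = 14;

const bytesPerSec = (kbitPerSec * 1000) / 8;          // 375,000 bytes/s per camera
const totalBytes = bytesPerSec * 86400 * days * cams; // 86400 seconds per day
console.log((totalBytes / 1e12).toFixed(1) + ' TB');  // ≈ 5.0 TB
```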

I use my sub stream to view live and also to generate jpegs (for video thumbnails for ui-mp4frag) and create an mjpeg stream. The mp4 video is decoded using the hardware acceleration of the pi, and the cpu is not very high per ffmpeg process. I know you have your heart set on using the node you created to do the mjpeg stuff, but you are limiting your options due to there being only 2 streams.

Yes, and don't forget that hlsjs makes very many http requests, constantly. That can impact performance on the express server. That's one of the reasons I made the socketio video streaming option.

I will give you some further details of my setup on a pi 4 8gb so you can get a better idea of what can be done: 11 cams, each using main and sub streams; 22 ffmpegs running; 22 mp4frags; 24/7 recording of all 11 main streams, which can be viewed live while you can also watch the recordings as they are being recorded; 11 sub streams generating jpegs, viewable as live mp4, jpeg snapshots, or mjpeg streams; a 6TB external drive keeping 14 days of video that automatically deletes old video nightly to keep it only 75% full. Previously, I was on a pi4 4gb and it was all working ok, too.
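
As an aside, the nightly cleanup can be as simple as an age-based sweep. Here is a minimal sketch in Node.js, where the recordings directory and the 14-day threshold are assumptions, not Kevin's actual setup (his rule also watches the 75% disk usage, which this sketch does not):

```javascript
// Delete recordings older than 14 days (age-based sweep only).
const fs = require('fs');
const path = require('path');

const RECORDINGS_DIR = '/mnt/usb/recordings'; // assumed mount point
const MAX_AGE_DAYS = 14;
const cutoff = Date.now() - MAX_AGE_DAYS * 24 * 60 * 60 * 1000;

for (const name of fs.readdirSync(RECORDINGS_DIR)) {
  const file = path.join(RECORDINGS_DIR, name);
  const stats = fs.statSync(file);
  if (stats.isFile() && stats.mtimeMs < cutoff) {
    fs.unlinkSync(file); // expired recording
  }
}
```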

If you do want to take the route of using hardware acceleration to decode some mp4, then you may need to give more memory to the gpu; otherwise you will get ffmpeg errors that are not very helpful.
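
On a Raspberry Pi that typically means raising gpu_mem in /boot/config.txt; the value below is an assumed example, not a figure from this thread:

```
# /boot/config.txt — reserve more memory for the gpu (example value)
gpu_mem=256
```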

Also, it seems to perform much better with swap off. That may not be recommended, but it made my system run way better. I always ended up with swap being used even though I had plenty of extra memory; never got to the bottom of that. Also, I think I allocated more memory to node-red... ah, yes, I just checked and see that I set up an environment file for node-red and added NODE_OPTIONS='--max-old-space-size=4096'. It never uses that much, but since I have much to spare, you know I couldn't resist experimenting with it.

