[beta testing] nodes for live streaming mp4

For those of you interested in re-streaming your ip cams or other video sources as HLS, playable in the browser or an app, I have some nodes available to test.

npm install kevinGodell/node-red-contrib-mp4frag
npm install kevinGodell/node-red-contrib-ui-mp4frag

node-red-contrib-mp4frag is a wrapper around an existing lib, mp4frag. It goes a little further and sets up http routes for serving the files to any player capable of consuming HLS video, such as VLC, HLS.js in most modern browsers, or native HLS on iPhone using Safari.
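As a quick sanity check from outside of Node-RED, you can point any HLS player at the http route. The path below follows the pattern from my own setup, where the last part is whatever you set as the playlist url in the mp4frag node's config, so treat it as an example and verify against the node's readme:

    vlc http://raspberrypi.local:1880/mp4frag/front_door/hls.m3u8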

node-red-contrib-ui-mp4frag is an implementation of HLS.js, but can fall back to native HLS on devices that support it. It still needs much work, especially for handling the many errors that can occur with video streaming.
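If you are curious what that fallback decision looks like in the browser, it is roughly the standard hls.js recipe (a sketch, not the node's exact source; the playlist url is a placeholder):

    // rough sketch of the hls.js / native HLS fallback pattern
    const video = document.querySelector('video');
    const playlist = 'http://localhost:1880/mp4frag/front_door/hls.m3u8'; // example url
    if (Hls.isSupported()) {
      // MediaSource-based playback (most modern desktop/android browsers)
      const hls = new Hls();
      hls.loadSource(playlist);
      hls.attachMedia(video);
    } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
      // native HLS (e.g. Safari on iPhone)
      video.src = playlist;
    }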

The 2 nodes may seem tightly coupled, but are not intended to be. Just as you can view the HLS stream created by mp4frag from outside of Node-RED, you can also give ui_mp4frag an HLS playlist to play directly, without needing mp4frag to create it.

As for myself, I currently have 14 ip cams set up and I can play them all live on any of my laptops, desktops, or iPhones. Of course, I am using the sub stream of my cams so that I don't overload my browser when asking it to decode so many large videos simultaneously.

Future plans: I would like to set up sockets for feeding the HLS video files from server to client. HLS.js does not seem like it was created for live streaming and tends to lag far behind realtime. My old video player using socket.io could keep up near realtime, delayed only by the duration of a single segment; the tradeoff was video and audio much less smooth than with HLS.js.
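The general shape of that socket approach is to push the mp4 initialization fragment and then each media segment to the browser and append them to a MediaSource buffer. A rough sketch of the client side (the event names and codec string here are made up and would have to match what the server actually sends):

    // client-side sketch: socket.io + MediaSource for near-realtime fragmented mp4
    const socket = io('/mp4video'); // hypothetical namespace
    const video = document.querySelector('video');
    const mse = new MediaSource();
    video.src = URL.createObjectURL(mse);
    mse.addEventListener('sourceopen', () => {
      // the codec string must match the actual stream
      const sb = mse.addSourceBuffer('video/mp4; codecs="avc1.42E01E, mp4a.40.2"');
      socket.on('init', (buf) => sb.appendBuffer(new Uint8Array(buf))); // moov first
      socket.on('segment', (buf) => {
        // real code needs a queue; appendBuffer throws while sb.updating is true
        if (!sb.updating) sb.appendBuffer(new Uint8Array(buf));
      });
    });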

Edit: forgot to include the flow example code.

For trying out just the ui_mp4frag:

[{"id":"80969b76.764fc8","type":"tab","label":"Video Test 2","disabled":false,"info":""},{"id":"42030e.9afbb4f4","type":"inject","z":"80969b76.764fc8","name":"Sintel","props":[{"p":"payload"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"https://commondatastorage.googleapis.com/gtv-videos-bucket/CastVideos/hls/Sintel.m3u8","payloadType":"str","x":136,"y":157,"wires":[["c3f14086.b7964"]]},{"id":"c3f14086.b7964","type":"ui_mp4frag","z":"80969b76.764fc8","name":"","group":"c1a8b6f8.c0022","order":3,"width":"6","height":4,"readyPoster":"https://raw.githubusercontent.com/kevinGodell/node-red-contrib-ui-mp4frag/master/video_playback_ready.png","errorPoster":"https://raw.githubusercontent.com/kevinGodell/node-red-contrib-ui-mp4frag/master/video_playback_error.png","hlsJsConfig":"{\"liveDurationInfinity\":true,\"liveBackBufferLength\":0,\"maxBufferLength\":5,\"manifestLoadingTimeOut\":1000,\"manifestLoadingMaxRetry\":10,\"manifestLoadingRetryDelay\":500}","restart":"true","autoplay":"true","x":481,"y":156,"wires":[[]]},{"id":"2a1133d2.816f2c","type":"inject","z":"80969b76.764fc8","name":"Elephants Dream","props":[{"p":"payload"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"https://commondatastorage.googleapis.com/gtv-videos-bucket/CastVideos/hls/ElephantsDream.m3u8","payloadType":"str","x":165,"y":216,"wires":[["c3f14086.b7964"]]},{"id":"bd240e19.706d48","type":"inject","z":"80969b76.764fc8","name":"Big Buck Bunny","props":[{"p":"payload"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"https://commondatastorage.googleapis.com/gtv-videos-bucket/CastVideos/hls/BigBuckBunny.m3u8","payloadType":"str","x":166,"y":97,"wires":[["c3f14086.b7964"]]},{"id":"c1a8b6f8.c0022","type":"ui_group","z":"","name":"Video Test 2","tab":"2418e319.003404","order":1,"disp":true,"width":"6","collapse":false},{"id":"2418e319.003404","type":"ui_tab","z":"","name":"Video Test 2","icon":"dashboard","disabled":false,"hidden":false}]

For trying out the mp4frag and ui_mp4frag:

[{"id":"198dbe2b.3d8d9a","type":"tab","label":"Video Test 1","disabled":false,"info":""},{"id":"5c1f54b6.454f7c","type":"inject","z":"198dbe2b.3d8d9a","name":"Start stream","props":[{"p":"payload"}],"repeat":"","crontab":"","once":true,"onceDelay":"1","topic":"","payload":"true","payloadType":"bool","x":190,"y":100,"wires":[["372f3933.65a7a6"]]},{"id":"d52ddf1.b4a51a","type":"inject","z":"198dbe2b.3d8d9a","name":"Stop stream","props":[{"p":"payload"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"false","payloadType":"bool","x":190,"y":146,"wires":[["372f3933.65a7a6"]]},{"id":"372f3933.65a7a6","type":"switch","z":"198dbe2b.3d8d9a","name":"","property":"payload","propertyType":"msg","rules":[{"t":"true"},{"t":"false"}],"checkall":"true","repair":false,"outputs":2,"x":341,"y":100,"wires":[["6f7b48af.2862f"],["3f385b73.54d284"]]},{"id":"3f385b73.54d284","type":"function","z":"198dbe2b.3d8d9a","name":"stop","func":"msg = {\n    kill:'SIGHUP',\n    payload : 'SIGHUP'  \n}\n\nreturn msg;","outputs":1,"noerr":0,"initialize":"","finalize":"","x":361,"y":149,"wires":[["6f7b48af.2862f"]]},{"id":"6f7b48af.2862f","type":"exec","z":"198dbe2b.3d8d9a","command":"ffmpeg -loglevel quiet -f mp4 -re -i https://commondatastorage.googleapis.com/gtv-videos-bucket/sample/ElephantsDream.mp4 -c:a copy -c:v copy -f mp4 -movflags +frag_keyframe+empty_moov+default_base_moof pipe:1","addpay":false,"append":"","useSpawn":"true","timer":"","oldrc":false,"name":"elephants dream ffmpeg","x":572,"y":120,"wires":[["609e41bb.4ac748"],[],["609e41bb.4ac748"]]},{"id":"5aedd6d7.d727d","type":"ui_mp4frag","z":"198dbe2b.3d8d9a","name":"elephants dream ui_mp4frag","group":"28171ec9.c62efa","order":0,"width":"5","height":"4","readyPoster":"https://raw.githubusercontent.com/kevinGodell/node-red-contrib-ui-mp4frag/master/video_playback_ready.png","errorPoster":"https://raw.githubusercontent.com/kevinGodell/node-red-contrib-ui-mp4frag/master/video_playback_error.png","hlsJsConfig":"{\"liveDurationInfinity\":true,\"liveBackBufferLength\":0,\"maxBufferLength\":5,\"manifestLoadingTimeOut\":1000,\"manifestLoadingMaxRetry\":10,\"manifestLoadingRetryDelay\":500}","restart":"true","autoplay":"true","x":1103,"y":117,"wires":[[]]},{"id":"609e41bb.4ac748","type":"mp4frag","z":"198dbe2b.3d8d9a","name":"elephants dream mp4frag","hlsPlaylistSize":"10","hlsPlaylistExtra":"5","hlsPlaylistUrl":"609e41bb.4ac748","x":832,"y":117,"wires":[["5aedd6d7.d727d"]]},{"id":"5c018bd6.6c9e54","type":"inject","z":"198dbe2b.3d8d9a","name":"Start stream","props":[{"p":"payload"}],"repeat":"","crontab":"","once":true,"onceDelay":"1","topic":"","payload":"true","payloadType":"bool","x":190,"y":220,"wires":[["72de9e54.1c0068"]]},{"id":"2a3b4b00.fa1354","type":"inject","z":"198dbe2b.3d8d9a","name":"Stop stream","props":[{"p":"payload"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"false","payloadType":"bool","x":190,"y":266,"wires":[["72de9e54.1c0068"]]},{"id":"72de9e54.1c0068","type":"switch","z":"198dbe2b.3d8d9a","name":"","property":"payload","propertyType":"msg","rules":[{"t":"true"},{"t":"false"}],"checkall":"true","repair":false,"outputs":2,"x":341,"y":220,"wires":[["9f93c481.f530c8"],["c9f285e2.5f9bb"]]},{"id":"c9f285e2.5f9bb","type":"function","z":"198dbe2b.3d8d9a","name":"stop","func":"msg = {\n    kill:'SIGHUP',\n    payload : 'SIGHUP'  \n}\n\nreturn 
msg;","outputs":1,"noerr":0,"initialize":"","finalize":"","x":361,"y":269,"wires":[["9f93c481.f530c8"]]},{"id":"9f93c481.f530c8","type":"exec","z":"198dbe2b.3d8d9a","command":"ffmpeg -loglevel quiet -f mp4 -re -i https://commondatastorage.googleapis.com/gtv-videos-bucket/sample/Sintel.mp4 -c:a copy -c:v copy -f mp4 -movflags +frag_keyframe+empty_moov+default_base_moof pipe:1","addpay":false,"append":"","useSpawn":"true","timer":"","oldrc":false,"name":"sintel ffmpeg","x":569,"y":240,"wires":[["47dbee0c.608cc8"],[],["47dbee0c.608cc8"]]},{"id":"a5e6df5b.fe7c28","type":"ui_mp4frag","z":"198dbe2b.3d8d9a","name":"sintel ui_mp4frag","group":"28171ec9.c62efa","order":0,"width":"5","height":"4","readyPoster":"https://raw.githubusercontent.com/kevinGodell/node-red-contrib-ui-mp4frag/master/video_playback_ready.png","errorPoster":"https://raw.githubusercontent.com/kevinGodell/node-red-contrib-ui-mp4frag/master/video_playback_error.png","hlsJsConfig":"{\"liveDurationInfinity\":true,\"liveBackBufferLength\":0,\"maxBufferLength\":5,\"manifestLoadingTimeOut\":1000,\"manifestLoadingMaxRetry\":10,\"manifestLoadingRetryDelay\":500}","restart":"true","autoplay":"true","x":1130,"y":237,"wires":[[]]},{"id":"47dbee0c.608cc8","type":"mp4frag","z":"198dbe2b.3d8d9a","name":"sintel mp4frag","hlsPlaylistSize":"10","hlsPlaylistExtra":"5","hlsPlaylistUrl":"47dbee0c.608cc8","x":828,"y":237,"wires":[["a5e6df5b.fe7c28"]]},{"id":"d22a455c.0e5c48","type":"inject","z":"198dbe2b.3d8d9a","name":"Start stream","props":[{"p":"payload"}],"repeat":"","crontab":"","once":true,"onceDelay":"1","topic":"","payload":"true","payloadType":"bool","x":190,"y":340,"wires":[["ff047477.259928"]]},{"id":"7f59ae27.f6ab3","type":"inject","z":"198dbe2b.3d8d9a","name":"Stop stream","props":[{"p":"payload"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"false","payloadType":"bool","x":190,"y":386,"wires":[["ff047477.259928"]]},{"id":"ff047477.259928","type":"switch","z":"198dbe2b.3d8d9a","name":"","property":"payload","propertyType":"msg","rules":[{"t":"true"},{"t":"false"}],"checkall":"true","repair":false,"outputs":2,"x":341,"y":340,"wires":[["27f81ab5.717a9e"],["6af91471.f4237c"]]},{"id":"6af91471.f4237c","type":"function","z":"198dbe2b.3d8d9a","name":"stop","func":"msg = {\n    kill:'SIGHUP',\n    payload : 'SIGHUP'  \n}\n\nreturn msg;","outputs":1,"noerr":0,"initialize":"","finalize":"","x":361,"y":389,"wires":[["27f81ab5.717a9e"]]},{"id":"27f81ab5.717a9e","type":"exec","z":"198dbe2b.3d8d9a","command":"ffmpeg -loglevel quiet -f mp4 -re -i https://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4 -c:a copy -c:v copy -f mp4 -movflags +frag_keyframe+empty_moov+default_base_moof pipe:1","addpay":false,"append":"","useSpawn":"true","timer":"","oldrc":false,"name":"big buck bunny ffmpeg","x":576,"y":360,"wires":[["7c65c57a.0c4c2c"],[],["7c65c57a.0c4c2c"]]},{"id":"22fedad.e9cdaa6","type":"ui_mp4frag","z":"198dbe2b.3d8d9a","name":"big buck bunny 
ui_mp4frag","group":"28171ec9.c62efa","order":0,"width":"5","height":"4","readyPoster":"https://raw.githubusercontent.com/kevinGodell/node-red-contrib-ui-mp4frag/master/video_playback_ready.png","errorPoster":"https://raw.githubusercontent.com/kevinGodell/node-red-contrib-ui-mp4frag/master/video_playback_error.png","hlsJsConfig":"{\"liveDurationInfinity\":true,\"liveBackBufferLength\":0,\"maxBufferLength\":5,\"manifestLoadingTimeOut\":1000,\"manifestLoadingMaxRetry\":10,\"manifestLoadingRetryDelay\":500}","restart":"true","autoplay":"true","x":1101,"y":357,"wires":[[]]},{"id":"7c65c57a.0c4c2c","type":"mp4frag","z":"198dbe2b.3d8d9a","name":"big buck bunny mp4frag","hlsPlaylistSize":"10","hlsPlaylistExtra":"5","hlsPlaylistUrl":"7c65c57a.0c4c2c","x":832,"y":357,"wires":[["22fedad.e9cdaa6"]]},{"id":"28171ec9.c62efa","type":"ui_group","z":"","name":"Video Test 1","tab":"4e54a2ca.371c7c","order":15,"disp":true,"width":"15","collapse":true},{"id":"4e54a2ca.371c7c","type":"ui_tab","z":"","name":"Video Test 1","icon":"dashboard","disabled":false,"hidden":false}]

And some screenshots:

[screenshots]

11 Likes

Hey Kevin,
this looks like some pretty nice work!!! Now to find some time to install ffmpeg and test it...

Question about the node-red-contrib-ui-mp4frag. Is that a replacement for my node-red-contrib-ui-mp4-player, or did you fork it to merge back afterwards? Because in our other discussion - where I presented my player node - I thought that we would build 1 player UI node that can in the future support all kinds of video sources (mp4, mjpeg, snapshots via web push) to support all kinds of users. We also have a lot of less technically skilled users who won't be using non-mp4 sources, and we should also give those users the possibility to view their cameras in their dashboard.
And supporting multiple player nodes is (imho) quite consuming of (free) time, especially if we add e.g. an SVG overlay layer and other camera viewing features. And it would be a pity if one of the players offered those ui features while the other ui node did not have the same features ...
So I still hope you welcome the idea of building one single UI node for camera viewing ...

Hi @BartButenaers,

Those are great points. At this time, I am simply trying to make live streaming of fragmented mp4 work efficiently and cross-browser. That is no easy task to accomplish, as working with hls.js has reminded me.

The jpeg stuff is super easy and will require a small effort, since I have already made jpeg snapshot, mjpeg, and jpeg over socket.io streaming options in the past (never got around to trying regular websockets). Which reminds me, I will be building out pipe2jpeg to also serve the jpegs via http/sockets.
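To give an idea of the direction for that, a sketch only (I am recalling pipe2jpeg's event name from memory, so double check it against the README):

    // sketch: parse jpegs from ffmpeg's stdout with pipe2jpeg, serve the latest one over http
    const { spawn } = require('child_process');
    const http = require('http');
    const P2J = require('pipe2jpeg');

    const p2j = new P2J();
    let latestJpeg = null;
    p2j.on('jpeg', (jpeg) => { latestJpeg = jpeg; }); // one 'jpeg' event per complete image

    // example args only; tune the input url, fps, and quality for your cam
    const ffmpeg = spawn('ffmpeg', ['-rtsp_transport', 'tcp',
      '-i', 'rtsp://user:pass@192.168.1.4:554/stream2',
      '-an', '-c:v', 'mjpeg', '-q:v', '7', '-vf', 'fps=2', '-f', 'image2pipe', 'pipe:1']);
    ffmpeg.stdout.pipe(p2j);

    http.createServer((req, res) => {
      // snapshot route; an mjpeg route would keep writing parts separated by a boundary
      res.writeHead(200, { 'Content-Type': 'image/jpeg' });
      res.end(latestJpeg);
    }).listen(8080);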

For the video player, since I am a very vanilla person, my code is usually very vanilla. I wear a plain colored shirt with no logo, and similarly I envision a very plain video player. I think it would be difficult for us both to commit to the same repo due to our style differences? I don't see why there can't be multiple video player nodes, each dedicated to a specific video type, and one big one that does it all. Any development that I do will be open source and readily available to go into your "plays everything" video player.

1 Like

I appreciate your honesty :rofl:
But yeah, I know what you mean. I'm also a bit less vanilla ...
And I have T-shirts in all kind of colors :wink:

I currently have to work lots of hours at my daily job, so unfortunately very little time to test.
But there are some other folks in this community who are very experienced with cameras in Node-RED (like @SuperNinja, @krambriw, @wb666greene ... to name only a few of them). Don't know if they all use the Node-RED dashboard, but perhaps some of them might be interested in testing your new nodes.

2 Likes

+1
I already use the pipe2jpeg node for my dashboard, so why not these as well?

2 Likes

I have several netcams with various degrees of "Onvif" compatibility. Currently the only part of Onvif I really use is the "discovery" process to get the streaming and snapshot URLs.

If you post a test flow I can try it on several of my cameras and get back to you. I'm really short of free time at the moment, so much of a "learning curve" is a real impediment for probably the rest of this lousy year.

2 Likes

Kevin, this works very well!
I have tried your flows on an "old" RPi3B+ and I must say I'm impressed with what I see:

  • the quality of the videos is really superb
  • the total load on the CPUs in the RPi is just around 5-8% when all three movies are shown in my browser (and I have other stuff running there as well) and the cpu temperature is just 46 degrees Celsius
  • natively it worked instantly when I tried on my Mac, iPad and iPhone, with Chrome as well as Safari
  • nice controls to pause/play the videos and speaker mute on/off

So some general comments, just my personal ones:

I think having a single ui node that could support all formats would be the ultimate goal. Having multiple nodes for specific source formats (mp4, h264, rtsp, http, ts, etc.) is not a problem, I think. You pick the node that fits the type of source, but it would be fantastic if you did not have to worry about which ui node you should use. In the ui node, additional controls could maybe be added, like start recording, take snapshots, etc.

It would also be great if the "source nodes" themselves could "embed" the ffmpeg command with parameters, just to get rid of the exec nodes and the confusion these would give many users.

So as an example: we have a number of sources, cameras providing rtsp and http, video sources like mp4 (I have a number of .ts, transport stream), and it would be elegant to choose the correct nodes for the sources and just link them to the ui node to get everything positioned in a designated area in the dashboard.

Configuring the ffmpeg params is a rather complex story and for most users simply impossible. For instance, what should those parameters look like if you got the idea to stream rtsp from a camera to your nice ui_mp4frag node?

Best regards, Walter

3 Likes

Just to add, I think a good working ui node I use in my dashboard is node-red-contrib-ui-media. Before finding that I also used, and still use, the ui_template node/widget for http streams.

1 Like

Thanks for recruiting everybody.

Did you get a chance to try it?

Was this using mp4frag with exec, or just the other example with only ui_mp4frag? If using just the ui_mp4frag, then all the work is being done by your browser consuming the HLS video from that remote location. If you tried the example that used ffmpeg in exec, then that was connecting to a remotely hosted mp4 and re-muxing it to fragmented mp4, at which point my lib takes over, creates the HLS playlist parts, and serves them via http.

You give me too much credit. Those are the built-in controls created by your browser. I simply show or hide them based on whether there is a video source loaded. I do have plans to create my own controls and hide the native ones. The thing missing from the native player is a stop button. On the client side, once the hls.js or native hls source is loaded, it continues to download in the background even when you are paused, which is considered a feature by the author of hls.js. I don't like that default behavior, as it could be a large usage of bandwidth without the user being aware.
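The pieces for a real stop do exist in the apis, something like this (a sketch, not the node's current code):

    // sketch of a true stop, as opposed to pause, which keeps downloading
    function stopVideo(video, hls) {
      video.pause();
      if (hls) {
        hls.stopLoad(); // hls.js: stop fetching the playlist and segments
        hls.destroy();  // or keep the instance around and only stopLoad()
      } else {
        video.removeAttribute('src'); // native HLS: detach the source
        video.load();                 // makes the element drop the stream
      }
    }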

I also have plans to make an ffmpeg wrapper node based on a lib I made a while back, meant for keeping ffmpeg alive and restarting it if it crashed, but not exactly as a daemon. I monitor progress to detect if video stream sources have stalled, kill the ffmpeg process, restart it after some delay settings, and retry x number of times.

Also, I have to pipe more than just stdio 0, 1, and 2, which the current exec node is limited to. I reserve those 3 pipes for internal use and pipe out video starting at stdio[3] and up. Personally, with my own processes at my home cctv system, I pipe out mp4, jpeg, and pam images using stdio 3, 4, and 5. The nice thing about nodejs is that it makes it really easy to pipe and set up more fd(s) than just the basic stdin, stdout, and stderr to move data around. I tried to do this with golang a while back, but it was not supported on windows at the time, and I really feel that any software I make must be cross-platform compatible, so I stuck with nodejs.
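In plain nodejs, the extra pipes look something like this (the ffmpeg outputs and url here are just examples):

    // sketch: spawn ffmpeg with output pipes beyond stdin/stdout/stderr
    const { spawn } = require('child_process');

    const ffmpeg = spawn('ffmpeg', [
      '-i', 'rtsp://user:pass@192.168.1.4:554/stream0',
      // example output 1: fragmented mp4 to fd 3
      '-c:v', 'copy', '-an', '-f', 'mp4',
      '-movflags', '+frag_keyframe+empty_moov+default_base_moof', 'pipe:3',
      // example output 2: jpegs to fd 4 (this one has to re-encode)
      '-c:v', 'mjpeg', '-q:v', '7', '-f', 'image2pipe', 'pipe:4'
    ], {
      // 'pipe' entries past index 2 show up as ffmpeg.stdio[3], ffmpeg.stdio[4], ...
      stdio: ['ignore', 'ignore', 'ignore', 'pipe', 'pipe']
    });

    ffmpeg.stdio[3].on('data', (chunk) => { /* fragmented mp4, e.g. into mp4frag */ });
    ffmpeg.stdio[4].on('data', (chunk) => { /* jpeg stream, e.g. into pipe2jpeg */ });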

The pam image mentioned is just an uncompressed array of pixels and is used for my crude motion detection setup. I pass the pam buffer from ffmpeg to pipe2pam, which passes that to pam-diff, which uses polygon-points and pixel-change (written in c++ as a nodejs addon) to report back how many pixels have changed, grouped together in a way that I use to trigger that motion has occurred. At that point, I can save the mp4 buffer being held inside mp4frag so that I will have a little bit of video from right before the motion occurred.
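For reference, the chain wires together roughly like this (the constructor options and event names are from memory of those libs' READMEs, so verify them before relying on this):

    // sketch: ffmpeg pam output -> pipe2pam -> pam-diff for pixel-change motion detection
    const { spawn } = require('child_process');
    const Pipe2Pam = require('pipe2pam');
    const PamDiff = require('pam-diff');

    const pipe2pam = new Pipe2Pam();
    const pamDiff = new PamDiff({ difference: 5, percent: 5 }); // sensitivity settings

    pamDiff.on('diff', (data) => {
      // data describes the regions of changed pixels; trigger recording here
      console.log('motion detected', data.trigger);
    });

    // scaled-down, low-fps pam output keeps the per-pixel comparison cheap
    const ffmpeg = spawn('ffmpeg', ['-i', 'rtsp://user:pass@192.168.1.4:554/stream1',
      '-an', '-f', 'image2pipe', '-vcodec', 'pam', '-pix_fmt', 'rgb24',
      '-vf', 'fps=2,scale=320:180', 'pipe:1']);
    ffmpeg.stdout.pipe(pipe2pam).pipe(pamDiff);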

For now, I am just trying to make basic mp4 streaming work in the browser. I haven't quite grasped the server-to-client life-cycle of ui nodes in node-red, and it has slowed me down a bit. I had never touched anything angular before, so the learning curve for me has been very steep. I don't know what I don't know, so I am just doing much debugging to figure out the possibilities. It seems that every ui node that I have picked through is built from the same original example.

1 Like

Hi Kevin, I went for this while testing. Is this flow off-loading the RPi in my case? If your server is being used, I guess that is not what you want forever?

Interesting what you mention. I've been running Motion for years now, mainly for its ability to do exactly that: counting pixel differences to react on motion and saving video before and after the event. The negative thing with Motion is the low fps when no motion is detected. In my case I currently just have usb cameras, and my main usage is anyway to do AI on frames when motion is detected. Live viewing is not such a big concern for me, but it would be nice to have a little bit more than the 1 fps that the http streams provide. I read Motion can also pipe, but I haven't tried it and I don't know if it will then be at the same low rate. I also tried source (the original video without Motion processing) but it was the same.

Cheers, Walter

2 Likes

Thanks for confirming that you used both the server and client nodes and things ran ok for you.

Hi Kevin,
yes, this is my result:

Very good job !! I can even watch the news at the same time ! :rofl:
[screenshot]

[screenshot]
I do like this: I inject anything into your node and the streaming stops, and I have no more bandwidth consumption.

What I need now is to be able to view my RTSP stream. What should I do?

3 Likes

Hey guys (@SuperNinja, @krambriw, @wb666greene),
Thanks for helping Kevin with his developments! I appreciate a lot that you have responded to my call for assistance, since I'm aware that you don't have much free time (like most of us...).
If it works fine in the end for the three of you, then I'm pretty sure his nodes can survive a nuclear blast :wink:

3 Likes

Hi Bart,
It's such an incredibly nice and friendly place to be, with so many interesting discussions and contributions from you and all the brilliant users; this is the way things move forward!!!!

I really enjoy being a small part of this community, contributing what I have and know best, which is by far not much compared to what you and others have achieved.
Best regards, Walter

4 Likes

That should be fairly straightforward. First, copy the flow example that also includes exec and mp4frag. We will then edit the exec node to point the input at the rtsp url of the ip cam and change a couple of params specific to rtsp input. The output params may stay the same or change depending on whether you have audio in the cam. When I get to my computer, I will give some example ffmpeg commands that might work for you.

edit:
Some of my working ffmpeg commands in the exec node:

  • this one removes audio from output
    ffmpeg -loglevel quiet -rtsp_transport tcp -i rtsp://192.168.1.4:554/user=admin_password=pass_channel=0_stream=0.sdp?real_stream -an -c:v copy -f mp4 -movflags +frag_keyframe+empty_moov+default_base_moof pipe:1

  • this one encodes audio
    ffmpeg -loglevel quiet -rtsp_transport tcp -i rtsp://admin:pass@192.168.1.14:554/stream2 -c:a aac -c:v copy -f mp4 -movflags +frag_keyframe+empty_moov+default_base_moof pipe:1

Try this. Amazing!! The cpu load on the RPi3 is just 2-5% at full streaming; the resolution is 1280x720.

What I have seen earlier with other implementations is that streaming rtsp normally loads the cpu pretty heavily. What is the magic with this solution??

[{"id":"17b65c21.7ccc04","type":"inject","z":"252b9369.44a80c","name":"Start stream","props":[{"p":"payload"}],"repeat":"","crontab":"","once":false,"onceDelay":"1","topic":"","payload":"true","payloadType":"bool","x":200,"y":2110,"wires":[["f5b5c38a.da9e1"]]},{"id":"276bae27.561f22","type":"inject","z":"252b9369.44a80c","name":"Stop stream","props":[{"p":"payload"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"false","payloadType":"bool","x":200,"y":2156,"wires":[["f5b5c38a.da9e1"]]},{"id":"f5b5c38a.da9e1","type":"switch","z":"252b9369.44a80c","name":"","property":"payload","propertyType":"msg","rules":[{"t":"true"},{"t":"false"}],"checkall":"true","repair":false,"outputs":2,"x":351,"y":2110,"wires":[["ff51760d.697c18"],["c50afbe.1105d08"]]},{"id":"c50afbe.1105d08","type":"function","z":"252b9369.44a80c","name":"stop","func":"msg = {\n    kill:'SIGHUP',\n    payload : 'SIGHUP'  \n}\n\nreturn msg;","outputs":1,"noerr":0,"initialize":"","finalize":"","x":371,"y":2159,"wires":[["ff51760d.697c18"]]},{"id":"ff51760d.697c18","type":"exec","z":"252b9369.44a80c","command":"ffmpeg -loglevel quiet -rtsp_transport tcp -i rtsp://freja.hiof.no:1935/rtplive/_definst_/hessdalen03.stream -an -c:v copy -f mp4  -movflags +frag_keyframe+empty_moov+default_base_moof pipe:1","addpay":false,"append":"","useSpawn":"true","timer":"","oldrc":false,"name":"Camera in Norway","x":570,"y":2130,"wires":[["2ae7c8d7.f9d5d8","e2faba94.e42518"],[],["2ae7c8d7.f9d5d8"]]},{"id":"be24c09a.8c9c6","type":"ui_mp4frag","z":"252b9369.44a80c","name":"Camera in Norway ui_mp4frag","group":"4246a880.b5ddb8","order":0,"width":"10","height":"6","readyPoster":"https://raw.githubusercontent.com/kevinGodell/node-red-contrib-ui-mp4frag/master/video_playback_ready.png","errorPoster":"https://raw.githubusercontent.com/kevinGodell/node-red-contrib-ui-mp4frag/master/video_playback_error.png","hlsJsConfig":"{\"liveDurationInfinity\":true,\"liveBackBufferLength\":0,\"maxBufferLength\":5,\"manifestLoadingTimeOut\":1000,\"manifestLoadingMaxRetry\":10,\"manifestLoadingRetryDelay\":500}","restart":"true","autoplay":"true","x":1130,"y":2130,"wires":[[]]},{"id":"2ae7c8d7.f9d5d8","type":"mp4frag","z":"252b9369.44a80c","name":"Camera in Norway mp4frag","hlsPlaylistSize":"10","hlsPlaylistExtra":"5","hlsPlaylistUrl":"norway","x":850,"y":2130,"wires":[["be24c09a.8c9c6"]]},{"id":"e2faba94.e42518","type":"msg-speed","z":"252b9369.44a80c","name":"","frequency":"sec","estimation":true,"ignore":false,"x":800,"y":2190,"wires":[[],[]]},{"id":"4246a880.b5ddb8","type":"ui_group","z":"","name":"Video Test 1","tab":"e338d88b.0fb2b8","order":15,"disp":true,"width":"15","collapse":true},{"id":"e338d88b.0fb2b8","type":"ui_tab","z":"","name":"Video Test 1","icon":"dashboard","disabled":false,"hidden":false}]
2 Likes

Where did you find that video feed? That is a good test source.

When we use -c:v copy, we are telling ffmpeg to keep the original source and only repackage it into a new container, which is called muxing. We are simply moving the content from rtsp to mp4 without needing to decode --> encode it.

If you need to decode --> encode, because you might be trying to output jpegs from rtsp, then ffmpeg will first have to decode the video using the h264 codec, then encode the jpegs using the mjpeg codec.

If the input source has a high FPS and large dimensions, then decoding will beat up your cpu and might max it out, unless you have a gpu accelerated version of the decoder, such as I am lucky to have on the pi 4. I add -hwaccel rpi -c:v h264_mmal in front of the input section of the ffmpeg command, and this reduces my decoding load, but it still takes some hit because the jpegs have to be encoded.
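For example, a jpeg-output command in that style might look something like this (the url is a placeholder, and those rpi hardware decoding flags require an ffmpeg build that includes them):

    ffmpeg -loglevel quiet -hwaccel rpi -c:v h264_mmal -rtsp_transport tcp -i rtsp://user:pass@192.168.1.4:554/stream1 -an -c:v mjpeg -q:v 7 -vf fps=4 -f image2pipe pipe:1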

For the client side load, it's best to use an HLS source that uses fragmented mp4 vs mpegts. This is because with mpegts the client side lib, hls.js, will have to remux the content to mp4 before it can feed the video to the player, giving some extra burden to your browser.

Is that video feed open access? That would be another good source for testing purposes.

3 Likes

It was @BartButenaers who found it earlier; I just had it saved :wink:
The frame rate is so fast I can hardly read the update!

Thanks for the explanations!! It runs very smoothly and well, no load at all, impressive!

Best regards, Walter

I have no more words to show you my admiration :star_struck:, the camera works wonderfully in FHD without burning the processor of the little RPi 3B! :partying_face:

I would like to understand the mp4frag node, and this url:


What address should I put in to run several cameras? Because it tells me that it is already used?

[EDIT]: the second ffmpeg command, with sound encoded, is working well too! HAPPY! :hugs:

It is a pity that I have no time to test this now. But looking at the reactions, we are - thanks to Kevin - heading in the right direction for video surveillance in Node-RED.

@kevinGodell: hopefully not off-topic, but not sure... We have already been discussing in the past (but without result :roll_eyes: ) storing video footage on disc, to allow us to view the stored footage afterwards (e.g. in case of a burglary). Does that require a completely separate development, or can (larger) mp4 fragments also be used for that purpose? If this has nothing to do with mp4 fragments, just let me know and I will create a separate discussion...

When reading this discussion, I feel like a little boy in a candy store ...

3 Likes