[beta testing] nodes for live streaming mp4

I have node-red-dashboard installed. Now I have the following msg in Node-RED:

Camera and PC are connected via Ethernet.

In VLC the image is clear, but in the dashboard I get a lot of breaks (see the next image for example).

However, in both cases I have a delay on the image (1 or 2 seconds). I don't know if it's normal.

To fix the video smearing, add another value to the command to force ffmpeg to connect using tcp instead of the default udp. Of course, this will cause slightly more delay with video, but will ensure that the data is complete.

ffmpeg -f rtsp -rtsp_transport tcp -i rtsp://andre:913558828Aa@ -f image2pipe -q 2 -vf fps=fps=2 pipe:1

Another option, in case your camera does not support tcp, would be to give a list of preferred connection types in order: -rtsp_transport +tcp+http+udp+udp_multicast -rtsp_flags +prefer_tcp

~2 second latency in an rtsp stream is normal. I've never found a way to reduce it, although some that were ~4 seconds could be reduced to about 2 seconds by reducing the I-frame interval.

In this moment i have this:

ffmpeg -f rtsp -rtsp_transport tcp -i rtsp://andre:913558828Aa@ -f image2pipe -q 2 -vf fps=fps=2 pipe:1

I have a good quality image and the stream needs +/- 2 seconds to start.
The only problem is that I don't have a fluid image. The image has a lot of lag. How can I solve this?

Edit: The option with a list of preferred connection types did not work.

Maybe @kevinGodell is the expert to explain these kinds of things, but I thought it could be good to do a small dive into the technology behind it.

First of all: the lag. See this interesting article describing and demonstrating why there will always be a lag when using IP cameras and RTSP compared with a live view (or with what a non-IP camera will deliver):

Fetching the RTSP stream from a security camera or recorder involves transcoding the native stream. This transcoding not only has a CPU overhead on the device you are fetching the stream from, it introduces a delay or lag in the video stream. While IP cameras are not lag-free when compared to real-time live action, RTSP increases that lag. To demonstrate what to expect from RTSP streaming, we made the below video comparing the RTSP stream fetched from one of our 4K security cameras in 12MP mode to real live action, and to direct streaming from the camera's web service

Secondly, my understanding of streaming mp4: the frames transferred in an mp4 stream are not complete images like in mjpeg or TS (Transport Stream). Most of them only include the "changes" compared with the previous frame. So if you just capture single frames, they will not be complete. That's why, when you stream mp4, you need "some software" like VLC that assembles those frames or fragments into a complete picture. As long as you use pipe2jpeg, I'm not sure how it handles this, i.e. whether there is any kind of "merging of frames" built in. If not, that could explain what you get and what you see.

You saw the errors you got when you tried to install the mp4frag nodes (missing dependencies)? It makes me wonder how you have installed stuff on your system.

That is because you are only sending 2 frames per second. That is your tradeoff for having a lower cpu load and less bandwidth usage. Increase the fps= value until you find an acceptable balance between (simulated) video smoothness and cpu load.

The lag is normal. The video is being consumed by ffmpeg in nodered and then sent to you. There is always some time to do these things as it decodes the rtsp video and encodes the jpeg frames.

The only way that I know of would be to directly connect to your camera with something like VLC, skipping the nodered middleman.

Interesting article. That particular line does not seem entirely accurate. If we are converting the rtsp into mp4 without changing the h264 encoded video contained within, then we are simply transmuxing with no need for transcoding. Perhaps they were using the term loosely? Ultimately, the video has to be decoded by whatever video player you use, such as your web browser or VLC, and then it paints the frames to the screen.

Hello everyone,

I have this flow in this moment:

If my streaming is running and I press the "Stop Stream" button on the dashboard (PC), the image disappears on the PC but not on the phone. If I press the "Stop Stream" button on the dashboard (phone), the image disappears on the phone but not on the PC.
Does anyone know why?

The msg from the ui has a socketid so that any reply goes back to that ui. Delete that from the msg and it will go to all.

How can I do that? I only saw the socketid in the HTML link of the page...
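A minimal function-node sketch of the earlier advice: node-red-dashboard attaches msg.socketid to messages that originate from a ui widget, and any reply carrying it is routed back to that one client only. Deleting it makes the message broadcast to all connected dashboards. (The wrapper function is only for standalone illustration; in a function node you would just use the two lines inside it.)

```javascript
// Strip the dashboard's per-client routing id so downstream ui widgets
// update every connected client, not just the one that sent the message.
function stripSocketId(msg) {
    delete msg.socketid;  // remove the per-client routing id
    return msg;
}

// Example: a "Stop Stream" press coming from one client's ui_button
const out = stripSocketId({ payload: true, topic: "stop", socketid: "abc123" });
console.log("socketid" in out);  // false: message now goes to all clients
```

Wire this as a function node between the dashboard button and the ui node that renders the stream.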

Hi all,
I'm trying to add realtime statistics to the flow like: current bitrate (Mbps), Resolution, FPS & Bandwidth estimate (Mbps). The "current bitrate" is the most important one I guess.

I've used the excellent node-red-contrib-msg-size and node-red-contrib-msg-speed nodes for that, but I'm getting different results comparing to FFMPEG, as shown below:

[{"id":"ecc625cc.78b4b8","type":"exec","z":"ef7323f9.ad195","command":"","addpay":true,"append":"","useSpawn":"false","timer":"","oldrc":false,"name":"BITRATE","x":360,"y":200,"wires":[["66230546.d4bb2c"],[],["21767a4.918bd86"]]},{"id":"e9ffed02.839e2","type":"inject","z":"ef7323f9.ad195","name":"Stop stream","props":[{"p":"payload"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"true","payloadType":"bool","x":115,"y":200,"wires":[["cc54df93.0dd71"]],"l":false},{"id":"cc54df93.0dd71","type":"function","z":"ef7323f9.ad195","name":"stop","func":"msg = {\n    kill:'SIGHUP',\n    payload : 'SIGHUP'  \n}\n\nreturn msg;","outputs":1,"noerr":0,"initialize":"","finalize":"","x":210,"y":200,"wires":[["ecc625cc.78b4b8"]]},{"id":"59b33cf8.d685d4","type":"inject","z":"ef7323f9.ad195","name":"Stream1","props":[{"p":"payload"},{"p":"topic","vt":"str"}],"repeat":"","crontab":"","once":false,"onceDelay":"1","topic":"2>&1 | grep bitrate | sed 's/bitrate: \\(.*\\), kb/\\1/g'","payload":"https://file-examples-com.github.io/uploads/2017/04/file_example_MP4_480_1_5MG.mp4","payloadType":"str","x":80,"y":80,"wires":[["cdaba4b0.ca6ec8"]]},{"id":"cdaba4b0.ca6ec8","type":"change","z":"ef7323f9.ad195","name":"set ffmpeg","rules":[{"t":"set","p":"payload","pt":"msg","to":"\"ffmpeg -i \" & msg.payload & \" \" &msg.topic","tot":"jsonata"}],"action":"","property":"","from":"","to":"","reg":false,"x":230,"y":160,"wires":[["ecc625cc.78b4b8"]]},{"id":"c95eeaba.c312a8","type":"inject","z":"ef7323f9.ad195","name":"Stream2","props":[{"p":"payload"},{"p":"topic","vt":"str"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"2>&1 | grep bitrate | sed 's/bitrate: \\(.*\\), kb/\\1/g'","payload":"https://multiplatform-f.akamaihd.net/i/multi/april11/sintel/sintel-hd_,512x288_450_b,640x360_700_b,768x432_1000_b,1024x576_1400_m,.mp4.csmil/master.m3u8","payloadType":"str","x":80,"y":120,"wires":[["cdaba4b0.ca6ec8"]]},{"id":"2be7b45e.fa087c","type":"split","z":"ef7323f9.ad195","name":"","splt":", ","spltType":"str","arraySplt":1,"arraySpltType":"len","stream":false,"addname":"","x":210,"y":260,"wires":[["4e3df559.bb77dc"]]},{"id":"4e3df559.bb77dc","type":"join","z":"ef7323f9.ad195","name":"","mode":"custom","build":"array","property":"payload","propertyType":"msg","key":"topic","joiner":"\\n","joinerType":"str","accumulate":false,"timeout":"","count":"","reduceRight":false,"reduceExp":"","reduceInit":"","reduceInitType":"","reduceFixup":"","x":350,"y":260,"wires":[["41661916.c5cb18"]]},{"id":"845811c3.806c6","type":"debug","z":"ef7323f9.ad195","name":"","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"payload","targetType":"msg","statusVal":"","statusType":"auto","x":575,"y":260,"wires":[],"l":false},{"id":"41661916.c5cb18","type":"change","z":"ef7323f9.ad195","name":"parse","rules":[{"t":"set","p":"payload","pt":"msg","to":"payload[2]","tot":"msg"},{"t":"set","p":"payload","pt":"msg","to":"$number( $replace(msg.payload, /([^\\d])+/, \"\") )","tot":"jsonata"},{"t":"set","p":"topic","pt":"msg","to":"single_bitrate","tot":"str"}],"action":"","property":"","from":"","to":"","reg":false,"x":480,"y":260,"wires":[["845811c3.806c6"]]},{"id":"66230546.d4bb2c","type":"switch","z":"ef7323f9.ad195","name":"multi-bitrate?","property":"payload","propertyType":"msg","rules":[{"t":"cont","v":"variant_bitrate","vt":"str"},{"t":"else"}],"checkall":"true","repair":false,"outputs":2,"x":520,"y":180,"wires":[["1cd9f2b8.9912ed"],["2be7b45e.fa087c"]]},{"id":"2120220d.a93b9e","type":"inject","z":"ef7323f9.ad195","name":"Stream3","props":[{"p":"payload"},{"p":"topic","vt":"str"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"2>&1 | grep bitrate | sed 's/bitrate: \\(.*\\), kb/\\1/g'","payload":"https://multiplatform-f.akamaihd.net/i/multi/april11/hdworld/hdworld_,512x288_450_b,640x360_700_b,768x432_1000_b,1024x576_1400_m,.mp4.csmil/master.m3u8","payloadType":"str","x":80,"y":160,"wires":[["cdaba4b0.ca6ec8"]]},{"id":"1cd9f2b8.9912ed","type":"split","z":"ef7323f9.ad195","name":"","splt":"\\n","spltType":"str","arraySplt":1,"arraySpltType":"len","stream":false,"addname":"","x":670,"y":180,"wires":[["1fee7710.8f2289"]]},{"id":"eab37bae.aab858","type":"join","z":"ef7323f9.ad195","name":"","mode":"reduce","build":"string","property":"payload","propertyType":"msg","key":"topic","joiner":",","joinerType":"str","accumulate":false,"timeout":"","count":"2","reduceRight":false,"reduceExp":"$append($A,[payload])","reduceInit":"[]","reduceInitType":"json","reduceFixup":"","x":910,"y":180,"wires":[["54091b45.73c984"]]},{"id":"54091b45.73c984","type":"change","z":"ef7323f9.ad195","name":"distinct","rules":[{"t":"set","p":"payload","pt":"msg","to":"$distinct(msg.payload)","tot":"jsonata"},{"t":"set","p":"topic","pt":"msg","to":"multi_bitrate","tot":"str"}],"action":"","property":"","from":"","to":"","reg":false,"x":920,"y":220,"wires":[["cc30a8e5.48fdd8"]]},{"id":"cc30a8e5.48fdd8","type":"debug","z":"ef7323f9.ad195","name":"multi_bitrate","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"payload","targetType":"msg","statusVal":"","statusType":"auto","x":1015,"y":220,"wires":[],"l":false},{"id":"1fee7710.8f2289","type":"change","z":"ef7323f9.ad195","name":"trim","rules":[{"t":"set","p":"payload","pt":"msg","to":"$trim(msg.payload)","tot":"jsonata"}],"action":"","property":"","from":"","to":"","reg":false,"x":790,"y":180,"wires":[["eab37bae.aab858"]]},{"id":"21767a4.918bd86","type":"debug","z":"ef7323f9.ad195","name":"","active":false,"tosidebar":true,"console":false,"tostatus":false,"complete":"payload","targetType":"msg","statusVal":"","statusType":"auto","x":655,"y":220,"wires":[],"l":false}]

Any idea how to output the current bitrate from the exec-node or from the amazing mp4frag node? Or maybe change the FFMPEG command to make it show the current bitrate in spawn mode? Thanks!

Using the exec node in spawn mode, quiet the logs that normally are piped to stderr and add the progress flag to your regular command:
ffmpeg -loglevel quiet -progress pipe:2

This will redirect the progress to the stderr output as key=value lines (frame=, fps=, bitrate=, out_time=, speed=, progress=, etc.), repeated once per update interval.
You can split it by newlines, then further split it by "=".
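A sketch of that parsing step, assuming the exec node delivers the stderr chunks as strings or Buffers: split on newlines, then split each line at the first "=".

```javascript
// Parse an ffmpeg -progress chunk (key=value lines) into a plain object.
function parseProgress(chunk) {
    const stats = {};
    for (const line of chunk.toString().split("\n")) {
        const idx = line.indexOf("=");
        if (idx > 0) {
            // split at the first "=" only, since values may contain "="
            stats[line.slice(0, idx).trim()] = line.slice(idx + 1).trim();
        }
    }
    return stats;
}

// Illustrative input in the key=value shape that -progress emits
// (the values here are made up, not captured output):
const stats = parseProgress("frame=120\nfps=25.0\nbitrate=1187.2kbits/s\nprogress=continue\n");
console.log(stats.bitrate);  // "1187.2kbits/s"
```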

And if I remember correctly, if you have multiple video outputs from ffmpeg, then some of these values will only show for the 1st output and NOT the additional. For example, outputting an mp4 and flv file at the same time.


Thanks Kevin, I'm struggling to pull it together, but I got the picture.
Hopefully someday we'll have these statistics within your mp4frag node :wink:


Outputting additional statistics would be very easy to implement. But...

Since I am still very new to the node-red community, I am not sure what would be the expectation for receiving the data.

  • Add an additional output to the node that outputs statistical data only, such as segment size, segment number, etc. so that you can calculate the bitrate being passed through?

  • Or, push all data out a single output and expect the user to first check what type of data it is so that it can be passed to your correct node, for example, is it a playlist to send to the ui_mp4frag or just statistics for you to parse and calculate the bitrate?

Personally, I would like to use a separate output so that the end user knows exactly what type of data is expected. But this can lead to this node having too many outputs for every added feature. I think in the end, the node may end up having 4 outputs and I am not sure if this will have any negative impact.
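For what it's worth, once segment size and duration are available, the downstream bitrate calculation itself is trivial. A hypothetical sketch (the input names are assumptions for illustration, not fields the mp4frag node actually emits today):

```javascript
// Hypothetical: compute Mbps from a segment's byte size and duration.
// "segmentBytes" and "segmentSeconds" are assumed inputs, not mp4frag's API.
function segmentBitrateMbps(segmentBytes, segmentSeconds) {
    return (segmentBytes * 8) / segmentSeconds / 1e6;  // bits/s -> Mbit/s
}

// e.g. a 500 kB segment covering 2 seconds of video
console.log(segmentBitrateMbps(500000, 2));  // 2
```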

I guess I am struggling with the overall design. Perhaps I should make a poll asking some questions.

Just my personal view: I think what you showed in the previous post, how to add params to ffmpeg, is already ok. Things like bitrate are so specialized, and maybe useful for debugging, but not in normal operation or in a normal dashboard setup. Once fixed and configured, you should not need that data. It's also very simple: you can have all that data available on output 2 of the exec node and then push whatever is needed to whatever GUI you prefer.

If you think it is valuable, for any reason, to present any of the data in the NR editor, you could eventually put that just below the node itself. For instance, if you would like to present the bitrate or the fps, you could put that as info just below the ui_mp4frag node. Again, I personally do not see any "normal" need for this, except maybe that it could be justified to show the actual fps. If so, you would have to do the data type check internally in the ui_mp4frag node. But could all this have a negative impact on the streaming performance, sending a mixture of data types to be filtered and sorted by the node itself?

Nodes with multiple outputs??? I personally do not like them at all if they have more than 2 outputs; they tend to look ugly in the design. But that's again my personal feeling.

I tend to agree. While interesting for debug it's not something you should need all the time, and as long as there is a way to get that information (preferably documented or by including an example) then there should be less/no need to build it in.

Secondly, remember that everything sent to the UI, whether as status under the node or to a dashboard widget, will take some bandwidth and handshaking etc., so if it is sent, it should probably be limited to updating, say, once per second or less.

Finally, don't forget that while it's always tempting to add more features, it is really hard to remove and/or maintain them later, so it's usually best to evolve slowly; sometimes a pause lets the real solution present itself.

I agree with that.

Yes, I have been deliberately slow and spent much time thinking about the solution instead of just writing code.

I will give a longer response later. Have to leave for work now. Please keep the feedback coming. It definitely helps me.


I think that I must stop dev on the mp4frag nodes for now. I have to get started on a node dedicated to ffmpeg. The exec node was good enough for now, but I must cater to some special things I would like to do with managing ffmpeg.

Once I have a good working ffmpeg node and some other things, I can make sure all of the cctv nodes are working together and then try to get these things cleaned up and get them published to npmjs for easier installation.


That's a good decision. In addition, we should then (later) also try to make a flow sample involving all those nodes, making a great dashboard, to make it simple and easy for users getting started, with views for live streams, recordings, etc.


Your excellent mp4frag nodes, which I use every day, are a huge step forward for Node-RED, and I've been waiting for this for so long :pray:

An additional request, OFF TOPIC (if necessary, I will make a new topic for it), but since you are in the process of developing the node :yum:
Almost all of my cameras are PTZ. Is there a way to click on a part of the image to make the camera go to that "target"? (as I can do with the manufacturer's app).
Today, I know how to move my cameras with arrows on the dashboard. But as your ui_mp4frag node is a video player, I have no way of recovering the position of the click in the image to deduce a direction, a rotation time, or anything else.


I recall that @BartButenaers had been using an overlay on his video player implementation. If we overlay the video player, it will cause the built-in controls provided by your browser to be un-clickable, which means that they would have to be replaced externally. I am not saying this is impossible, but would be very difficult for me since I struggle with front end development. And I don't have any ptz cams to use for testing...

I wonder, if you already have the controls (arrows on your dashboard) to do the ptz, is there no way to convert those buttons to be transparent and clickable and place them over the ui_mp4frag player? Also, when you say "dashboard," do you mean they are on the admin editor screen or the ui? I always get confused with the naming of node-red things.

Either way, it seems like that should be possible with CSS on your html buttons. I think I use a standard id for the video player, which should make it easier to target the video element from an external ui node. I had always envisioned a separate node meant for adding a ui layer to the video player.
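To sketch what "recovering the position of the click" could look like: given a click event on the video element, you can convert it to normalized 0..1 coordinates that a PTZ function node could map to pan/tilt. The element lookup and any PTZ message format are assumptions here, not part of ui_mp4frag's documented API; only the coordinate math is shown runnable.

```javascript
// Convert absolute click coordinates into normalized 0..1 positions
// relative to the element's bounding box { left, top, width, height }.
function normalizeClick(clickX, clickY, rect) {
    return {
        x: (clickX - rect.left) / rect.width,   // 0 = left edge, 1 = right edge
        y: (clickY - rect.top) / rect.height    // 0 = top edge, 1 = bottom edge
    };
}

// In the browser this would be wired up roughly as (ids are hypothetical):
//   const video = document.getElementById("my-mp4frag-video");
//   video.addEventListener("click", e =>
//       sendToPtz(normalizeClick(e.clientX, e.clientY, video.getBoundingClientRect())));

// Standalone example: a click at (320, 180) on a 640x360 video at the origin
const pos = normalizeClick(320, 180, { left: 0, top: 0, width: 640, height: 360 });
console.log(pos);  // { x: 0.5, y: 0.5 }
```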