Media Source Extensions video player for live streaming mp4

Also, we will need a way for the back end to trigger the front end to load and start playing when the m3u8 playlist becomes available. I could probably send some message like {cmd: 'start', type: 'hls', payload: '/uri_to_playlist.m3u8'} so that the player does not need to know the url ahead of time and instead receives it over the wire.

That is indeed one of the many todos for this viewer. Also, when you want to show - at regular intervals - video from other cameras on the same widget, we need those input messages.
Will add that tonight.

I found some of my old code from when I was playing with hls.js. At the time, I was trying to make it as close to real-time as possible compared to my socket.io mp4 player. Warning: this code is about 3 years old and will need tweaking.

const hls = new window.Hls({
    liveDurationInfinity: true,       // report the duration of the live stream as Infinity
    manifestLoadingTimeOut: 1000,     // give up on a manifest request after 1 second...
    manifestLoadingMaxRetry: 30,      // ...but retry it up to 30 times
    manifestLoadingRetryDelay: 500    // with 500 ms between retries
});

I am adding a little code to send a message when initialization has happened and the m3u8 playlist first becomes available (at that moment only the init fragment exists). Obviously, this is not set in stone. Whatever you think is appropriate for node-red naming conventions, etc., since you have many years of experience as a node-red tinkerer.

this.send({cmd: 'start', type: 'hls', payload: `/${uniqueName}.m3u8`});

Kevin,
I only had very little time for Node-RED tonight...

Seems Chrome gives errors when setting a new source (while the other source is still playing), but luckily it is a known issue and the hls.js folks advise creating a new hls instance every time.
And that indeed seems to work: the version on Github now allows you to set a new source via input messages.

Example flow:


[{"id":"c40eaa38.06e108","type":"ui_mp4_player","z":"97e84354.4bf71","group":"27b65cf1.24c8e4","order":1,"width":"12","height":"7","name":"","sourceType":"url","sourceValue":"","aspectratio":"crop","autoplay":false,"x":1490,"y":240,"wires":[[]]},{"id":"c13e5f0c.ee30c","type":"inject","z":"97e84354.4bf71","name":"Source \"Tears of steal\"","props":[{"p":"payload"},{"p":"topic","vt":"str"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"set_source","payload":"http://demo.unified-streaming.com/video/tears-of-steel/tears-of-steel.ism/.m3u8","payloadType":"str","x":1220,"y":200,"wires":[["c40eaa38.06e108"]]},{"id":"957d021d.e9c86","type":"inject","z":"97e84354.4bf71","name":"Source \"Big Buck Bunny\"","props":[{"p":"payload"},{"p":"topic","vt":"str"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"set_source","payload":"https://test-streams.mux.dev/x36xhzz/x36xhzz.m3u8","payloadType":"str","x":1230,"y":280,"wires":[["c40eaa38.06e108"]]},{"id":"64e741e3.46ead","type":"inject","z":"97e84354.4bf71","name":"Source \"Sintel\"","props":[{"p":"payload"},{"p":"topic","vt":"str"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"set_source","payload":"https://multiplatform-f.akamaihd.net/i/multi/april11/sintel/sintel-hd_,512x288_450_b,640x360_700_b,768x432_1000_b,1024x576_1400_m,.mp4.csmil/master.m3u8","payloadType":"str","x":1200,"y":240,"wires":[["c40eaa38.06e108"]]},{"id":"27b65cf1.24c8e4","type":"ui_group","z":"","name":"HLS demo","tab":"e6e99c09.01466","order":1,"disp":true,"width":"12","collapse":false},{"id":"e6e99c09.01466","type":"ui_tab","z":0,"name":"Home","icon":"dashboard"}]

As you can see in the dashboard, a new movie is played every time I switch between the 3 sources:

[animation: mp4_player_set_source]

You can choose whatever message structure you like. But I have used msg.topic="set_source" and msg.payload="some url", since that is a commonly used way to do it ...

Time is up again for today ...

BTW it would be nice if - in the near future - we could also push snapshot images and pull mjpeg streams, because then the UI player node would be usable for many more (simple) use cases. But then the name node-red-contrib-ui-mp4-player wouldn't be correct anymore...

I will test this out right now and implement my node to match your front end messaging format.

In response to the mjpeg stream: I would personally avoid a persistent http connection like that. For one, as you probably already know, browsers have different limits on the number of persistent connections: Chrome is fixed at 6, Firefox might be set to 6 but can be changed, and Safari seems to have a high limit, if any. Maybe I implemented it wrong and did not wrap my code well enough, but the browser seemed to get really laggy or even unresponsive at times. I have had much better luck viewing my 14 cams as jpegs via socket.io on my iPhone without any trouble. Plus, the sockets don't run into the connection limit that seems inconsistent across browsers.
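
For comparison, the socket.io jpeg approach I mean looks roughly like this on the client (just a sketch; the namespace, event name and element id are made up):

// Receive jpeg frames pushed over socket.io and show them in an <img> element.
const socket = io('/cams');
const img = document.getElementById('cam1');
let previousUrl = null;

socket.on('image', (jpegBuffer) => {
    // Wrap the binary frame in a Blob and point the <img> at it.
    const url = URL.createObjectURL(new Blob([jpegBuffer], { type: 'image/jpeg' }));
    img.src = url;
    // Release the previous object URL so memory doesn't grow frame after frame.
    if (previousUrl) {
        URL.revokeObjectURL(previousUrl);
    }
    previousUrl = url;
});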

As you say, the name would not make sense anymore. We could split the two into different nodes, one for mp4 and the other for jpeg. Difficult to pick a path right now. Maybe we fully build it out to play hls via http, then make an mp4 socket.io version, and go from there.

Or it could be an all-in-one player that dynamically checks whether hls (Media Source Extensions) is supported, then falls back to native hls if that can play (Safari mobile), and finally falls back to some type of jpeg stream, which will be supported 99% of the time.

In my tests, I added a little code to your player to stop and/or destroy the existing hls instance when creating a new one. Also, if the source url is empty, that should be treated as a message indicating that the hls instance must be destroyed.

// TODO: if there is a previous hls instance, should we clean it up?
if ($scope.hls && $scope.hls.destroy) {
    $scope.hls.destroy();
    // An empty source url just means "destroy"; don't set up a new source.
    if (!sourceValue) {
        return;
    }
}

And I was experimenting with getting the player to fall back to native hls when Media Source Extensions are not supported (as on iOS Safari), using some of the following code taken from the hls.js example readme:

if (video.canPlayType('application/vnd.apple.mpegurl')) {
    // Native hls support (e.g. Safari): point the video element straight at the m3u8.
    video.src = videoSrc;
    video.addEventListener('loadedmetadata', function() {
        video.play();
    });
}

Can you work this into your code so that we can get mobile safari m3u8 playing?

We have to find the right combo of video attributes that will allow autoplay. It seems muted and playsinline were the bare minimum needed, and with those it started playing for me on my iPhone with the previous js shown.
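
Roughly what worked for me, as a sketch (the element lookup is just for illustration):

// muted + playsinline were the bare minimum for autoplay; play() is called once
// metadata is available, and its promise tells us whether autoplay was blocked.
const video = document.getElementById('mp4_player_video'); // hypothetical id
video.muted = true;                    // most browsers require muted for autoplay
video.setAttribute('playsinline', ''); // mobile Safari: play inline instead of fullscreen
video.addEventListener('loadedmetadata', () => {
    video.play().catch((err) => console.log('autoplay blocked:', err));
});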

Hi guys,
Thanks for such a brilliant node.
I've got it working but have no sound - is that correct or is it me??
I've just built a new PC so it may be my setup that's not quite right.
Regards, David.

Yes will do it tonight. Seems there is a big time difference between my hobby hours and yours...

No, I didn't know that. Useful info! Does this mean that the number of ...m3u8 connections (like the urls we currently use in the ui node) is also limited? If so, then indeed socket.io would be the way to go...

I think I would prefer to have a single node that supports both, where you e.g. select in a dropdown which kind of input you expect to arrive in the input messages. Because when we add features afterwards (e.g. to draw stuff in the SVG overlay layer), we would otherwise have to implement all of that twice.

I had to add the muted attribute to start playing on Chrome (Windows 10)...

David, the node was developed in at most one hour, just to allow Kevin to test his backend mp4 nodes. There will be all kinds of things that are incorrect or not working... As you can read above, I had to add 'muted' to be able to start playing on Chrome. People don't like it when websites open and immediately start playing sound without any user interaction, so most modern browsers don't allow autoplay with sound. We need to find a solution for that. All ideas are welcome ...

Thanks for the feedback Bart - I'll be patient and wait.
Great node by-the-way.


Hi David. Currently, Bart has an svg overlay which prevents you from interacting with the video player. Normally, you could right-click the player to show the controls and unmute it. Also, I think it will only play audio that is encoded with aac or some variant of that. Some ip cams use an audio codec that is not compatible, but the good news is that you can stream copy the video while re-encoding only the audio, without taking a major hit on cpu load.
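
As a rough example of what I mean (treat this as a sketch; the rtsp url is made up and the exact ffmpeg flags will vary per camera):

const { spawn } = require('child_process');

// Copy the h.264 video untouched and only re-encode the audio track to aac,
// writing fragmented mp4 to stdout.
const ffmpeg = spawn('ffmpeg', [
    '-i', 'rtsp://user:pass@192.168.1.10:554/stream1', // hypothetical camera url
    '-c:v', 'copy',                                    // no video re-encoding -> low cpu load
    '-c:a', 'aac',                                     // re-encode the incompatible audio
    '-f', 'mp4',
    '-movflags', '+frag_keyframe+empty_moov+default_base_moof',
    'pipe:1'
]);

ffmpeg.stdout.on('data', (chunk) => {
    // feed the fragmented mp4 into the mp4 parsing / player pipeline
});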

While we are testing, can you temporarily hide the svg overlay so that I can interact with the video element? Or maybe just change its z-layer for now to push the video player to the top? Thanks.

Fortunately, this limitation does not exist for m3u8. hls.js and native hls on Safari mobile use fetch to make http requests. They constantly request the m3u8 playlist to check it for updates, then parse it to extract the segment file names, then fetch those. So it is just tons of short-lived http requests, nothing persistent. Still, comparing fetch requests to sockets, sockets may be more efficient for both server and client.
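
Very roughly, the request pattern is something like this (an over-simplified sketch of the idea, not hls.js's actual code; the playlist url is made up):

// Poll the live playlist, pull out the segment uris, and fetch any new ones.
const playlistUrl = '/hls/cam1.m3u8'; // hypothetical
const alreadyFetched = new Set();

setInterval(async () => {
    const playlist = await (await fetch(playlistUrl)).text();
    // Segment uris are the non-comment lines of the playlist.
    const segmentUris = playlist.split('\n').filter((line) => line && !line.startsWith('#'));
    for (const uri of segmentUris) {
        if (!alreadyFetched.has(uri)) {
            alreadyFetched.add(uri);
            const segment = await (await fetch(uri)).arrayBuffer();
            // ...append the segment to the MediaSource source buffer
        }
    }
}, 2000); // re-check the playlist every couple of seconds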


I think all modern browsers will forbid autoplay without muted. The muted and playsinline attributes, along with the js code that responds to the loadedmetadata event and calls video.play(), are specific to mobile Safari to allow autoplay. The playsinline attribute can probably be kept on the video element without affecting desktop browsers.


The version on Github now contains following features:

  • Native hls support (Safari): have not tested this because I have no apples at home ...

  • Hls instance is destroyed before creating a new one.

  • I have kept the SVG overlay (for me :wink:) but I have added pointer-events:none on it, to pass the mouse events to the underlying video element. The right click context menu should now be available.

  • The playsinline attribute has been added to the video element.

  • There is now a dropdown that allows you to select a "Resizing" option (based on the CSS object-fit property):


    Some resizing examples: [screenshots]
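
    For reference, the option boils down to setting the CSS object-fit style on the video element, roughly like this (the element lookup and variable name are just for illustration):

    // Apply the selected "Resizing" option to the video element.
    // object-fit values: 'fill' stretches, 'contain' letterboxes, 'cover' crops,
    // 'none' keeps the native size, 'scale-down' picks the smaller of none/contain.
    var video = document.getElementById('mp4_player_video');
    video.style.objectFit = selectedResizing;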

BTW there seems to be a cross-browser autoplay solution. I have not tested it yet. But I'm not sure whether it just exploits a bug, because that would mean it won't keep working in the future. Not sure ...


I have tested it like this:

  1. I added the silence.mp3 file to a "resources" subfolder in our project.

  2. The endpoint has been pimped to serve that mp3 file:

     RED.httpNode.get(uiPath, function(req, res) {
         var fullPath;

         switch(req.params.resource) {
             case "hls.js":
                 if (hlsJsPath) {
                     fullPath = hlsJsPath;
                 }
                 break;
             case "silence.mp3":
                 fullPath = path.join(__dirname, "resources", "silence.mp3");
                 break;
             default:
                 console.log("Unknown mp4 player resource requested.");
         }

         // Send the resolved file, or a 404 when the requested resource is unknown
         if (fullPath) {
             res.sendFile(fullPath);
         } else {
             res.status(404).send("Unknown resource");
         }
     });
    

  3. This mp3 was then loaded by the UI node on the client side (a sketch of the idea is below).

  4. And I removed the muted attribute from the video element.
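
Combining steps 3 and 4, the idea is roughly this (just a sketch; the endpoint path and element id are made up):

// Play the served silence.mp3 first, hoping the browser then considers the page
// allowed to play audio, and only then start the video unmuted.
var silence = new Audio('ui_mp4_player/silence.mp3');
var video = document.getElementById('mp4_player_video');

silence.play().then(function() {
    video.muted = false;
    return video.play();
}).catch(function(error) {
    // Autoplay with sound is still blocked by the browser.
    console.log('Autoplay with audio refused: ' + error);
});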

And that works fine: the mp3 is loaded. But I still get an error when auto-playing the video with audio.


So I don't think that hack works very well ...

I guess we have to decide what's important and what the goal of the video player is. Is it mainly for playing movies such as Big Buck Bunny, or is it for live streaming video coming from a cctv camera? If it is mainly a cctv solution, then typically audio is not used. And can you imagine having multiple cams that all have audio? It seems like the audio should be muted by default, with individual options to unmute on user interaction. But of course, I am more biased toward a cctv solution than entertainment. Just my opinion.

By the way, I just got home from work and I will try out your updates. Thanks.


Just caught a bug on my backend. Apparently, adding routes via RED.httpNode.get() does not automatically remove them when a node is closed and re-initialized. The routes were building up every time I restarted the flows, causing the video player to get 404s on the stale routes. https://github.com/kevinGodell/node-red-contrib-mp4frag/blob/master/mp4frag.js#L100
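
For reference, the workaround boils down to this pattern (roughly what the core http in node does on close; it relies on express router internals, and routePath stands for whatever path the node registered):

// On node close, splice the registered route back out of the express router stack,
// so redeploying the flow does not leave stale handlers behind.
this.on('close', () => {
    const stack = RED.httpNode._router.stack;
    for (let i = stack.length - 1; i >= 0; i--) {
        if (stack[i].route && stack[i].route.path === routePath) {
            stack.splice(i, 1);
        }
    }
});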

@BartButenaers I just finished testing your player against my fixed mp4frag node and it is getting better each day. A couple of things can be improved right now, and I know it will be tricky since you don't have Apple products to test with.

I would make a pull request with the fixes, but my style is different than yours and it would probably change every line of code with my linter configuration.

  • Hls.isSupported() should be moved inside setupHls(); Hls.isSupported() is poorly named and doesn't actually check if native hls is supported, only Media Source Extensions.
  • perhaps add a user option to pick which method to try first (hls.js or native hls)
  • we probably have to keep track of which video type was actually selected, because closing a native hls player is not affected by hls.destroy(). I experimented a little with calling video.pause(), setting video.src = '', and calling video.load() (see the teardown sketch after the code below). I ran out of time to finalize this.
// the order of trying the different supported types may be user configurable
if (Hls.isSupported()) {
    // run hls.js setup code
} else if (video.canPlayType('application/vnd.apple.mpegurl')) {
    // run native hls setup code
} else {
    // sorry, maybe try some jpegs
}
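
For the teardown side of that, something along these lines is what I was experimenting with (a rough, untested sketch; the playerType flag and function name are made up):

// Tear down whichever player type was actually selected.
function destroyPlayer($scope, video) {
    if ($scope.playerType === 'hlsjs' && $scope.hls) {
        $scope.hls.destroy();   // hls.js detaches its MediaSource itself
        $scope.hls = null;
    } else if ($scope.playerType === 'native') {
        video.pause();          // native hls: stop playback,
        video.src = '';         // drop the source,
        video.load();           // and reset the media element
    }
    $scope.playerType = null;
}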

That is absolutely true. Then we could probably use a single (existing) audio-out node that plays the audio of a selected source...

Again I agree with you. I will add to the readme page that for non-cctv videos people might better use the node-red-contrib-ui-media UI node...
That is why I called my previous node "node-red-contrib-ui-camera-viewer". I think our node name should reflect what we want to do with it ...

Sorry, I didn't have a chance yet to look at your code. Will review it this weekend! On the link you provided, I see that you use RED.httpNode.get, which should be used only for UI nodes! For a non-UI node you should instead use RED.httpAdmin.get! I assume that will be the cause of some of your troubles ...

Then indeed we wouldn't become best friends forever :wink:
I will update the code somewhere today...

Yep. I was a little confused about that. At first, I didn't know that I could even create routes from within a node. That is why I first added the context option so that I could use the http in node and have it access my flow or global context object.

That seemed a little too complicated for an end user to set up, so I looked again into adding routes dynamically. I picked apart some public code, looked into the aforementioned http in node, and mimicked what it does to make the mp4 stream reachable by a ui node at the /ui/ path. From what I have seen, I think RED.httpNode.get is the appropriate api for serving /ui/ content.

I suppose my node is already doing too much by parsing the mp4 and also serving the files at /ui/? I had thought about breaking the code up into separate nodes, but I already had my existing mp4frag lib, which is battle tested, currently at 794k downloads from npm, and has 0 open issues. Plus, I have little spare time and am very lazy :yawning_face:.

Now you got me thinking. Perhaps I can get you interested in a linter, and in moving away from var to const and let? Maybe I will make that pull request :rofl:.

Ah ok, now I get it. Normally a UI node uses RED.httpNode.get to serve /ui content, and a non-UI node uses RED.httpAdmin.get. But now your non-UI node uses RED.httpNode.get to make the content available for my UI node. Now my head starts exploding :exploding_head:

I need to have a look at your node tonight before I can give any advice on this. Unless somebody else wants to join the discussion. Normally you would pass info to my node (via a message), which I can use (on my server side) to make the data available (to my client side).

I didn't have time yet to see how your node (and hls in general) works, so give me some time to digest it.

That is ok for me. But as for the linter: I use the same format for all my other nodes, so I would like to keep it like it is...