Did you read all of the link provided by @realjax?
Are you saying you don't WANT to "install a web server to serve the files", or that you need direction on how to do that? Because that's what you need to do...
I need directions on how to do that.
That is the first step.
Then maybe another step will be about node usage, but that is for later.
You're gonna get push back because "how do I set up a web server" isn't really a node-red question, and there are probably hundreds of "how to install nginx" (assuming Linux) pages out there that are a single Google away. Depending on your setup (i.e., not using Docker, running Linux) that might just be "sudo apt-get install nginx" or "sudo yum install nginx" and then having ffmpeg dump its m3u8 files in the directory nginx serves by default...
There are better ways to set it up, but that would be the basic starting place.
Thank you i8beef. Well, the thing is, I was hoping to be able to do all of this directly in node-red, and only in node-red.
So to summarise:
- I will need my current ffmpeg script
- I will need to install a webserver
- in node-red I will need to 1/ have an exec node to run ffmpeg and 2/ feed a cast node with the URL.
I will work on that over the weekend.
You can actually set it up so NGINX does the whole thing, so that the URL is just always available. It's more reliable that way. Do a Google search for 'nginx rtsp to hls' and you'll come across a bunch of different people doing it using nginx-rtmp and ffmpeg pulls.
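For reference, a minimal sketch of that nginx-rtmp approach, assuming the nginx-rtmp module is compiled in; the application name, paths, and segment timings here are placeholders, not a tested config:

```nginx
# nginx.conf fragment (requires the nginx-rtmp module)
rtmp {
    server {
        listen 1935;
        application live {
            live on;
            # repackage the incoming RTMP feed into HLS segments on disk
            hls on;
            hls_path /var/www/html/cam;
            hls_fragment 4s;
            hls_playlist_length 12s;
        }
    }
}
```

ffmpeg then pulls the camera and pushes into nginx, something like `ffmpeg -rtsp_transport tcp -i rtsp://<camera> -c copy -f flv rtmp://localhost/live/cam`, and the HLS URL is always available as long as nginx and the ffmpeg pull are running.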
Note that this transcoding to HLS will ALWAYS incur a delay on the feed of at least keyframe interval * number of playlist segments (e.g., 1-second keyframes * a 4-segment playlist = 4-second delay MINIMUM, and often more than that).
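The latency arithmetic above can be sketched as follows; this is a rough lower-bound model, not an exact figure for any particular player:

```python
# Rough model of the MINIMUM end-to-end HLS delay: segments can't be
# shorter than a keyframe interval, and the player typically buffers
# the whole playlist before it starts playing.
def min_hls_delay(keyframe_seconds: float, playlist_segments: int) -> float:
    """Lower bound on delay: keyframe interval times playlist size."""
    return keyframe_seconds * playlist_segments

print(min_hls_delay(1.0, 4))  # 4.0 -- the 4-second minimum from the example
```

Real-world delay is usually higher because of encoding time, network transfer, and player buffering heuristics.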
Are there not node-red nodes that will stream files? I have to admit that there doesn't seem to be anything relevant in the node red flows library, but presumably nodejs has some streaming modules available.
The HLS protocol is completely HTTP-based, so there's no real "streaming" going on, from my understanding. The m3u8 file is really a "playlist" file that points at smaller segment files (typically .ts) as they are generated, and the browser downloads and plays them one after the other. It's really just instructions for downloading and stitching multiple files together so it APPEARS to be a stream.
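For illustration, a short HLS playlist looks something like this (segment names and durations are made up); the player fetches the listed segment files over plain HTTP, one after the other, and re-fetches the playlist as new segments appear:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:4
#EXT-X-MEDIA-SEQUENCE:37
#EXTINF:4.000,
cam37.ts
#EXTINF:4.000,
cam38.ts
#EXTINF:4.000,
cam39.ts
```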
Really, it'd be much better if the cast protocol just supported RTSP/RTMP directly as they are better protocols for this...
When you cast something, you aren't actually sending the video to the cast target; you're sending it a command to open a WEB BROWSER and load a web page. (When you send a media item, the cast device opens a default web page it ships with, called DefaultMediaReceiver, which just has a media player on it, and passes the URL of your media into that page's media player.) Cast devices are more like an extremely limited web browser with a remote control.
Ergo, a "streaming" node doesn't really make sense in this context. First, he has an RTSP stream that needs transcoding, and I would totally not do that in node, or in the context of node-red, where such a heavy processing operation would likely have a lot of implications. Second, the generated HLS files are just static files, so you could theoretically serve them out of the node-red HTTP nodes, but there would be so many concurrent requests that, again, I think a dedicated web server would be FAR better for serving them. It's just not the right tool for the job...
There IS a static node-red folder somewhere serving things like images / icons locally that you could theoretically put those HLS files in and reference as well, but I'm pretty sure that would be ill advised...
All of this is part of the reason I tell most people to run an NGINX proxy server in front of node-red, and provide all the things node-red really isn't meant for from there: static file serving, SSL, basic auth where it makes sense (e.g., IFTTT public URLs), etc. Node-red is awesome for what it is, but it's not a panacea, and once you get to these special cases, using the right tool for the job makes a lot more sense than trying to shove it all into a single node app server.
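A minimal sketch of that layout, assuming node-red is on its default port 1880 and the HLS files land under /var/www/html; all names and paths here are placeholders:

```nginx
server {
    listen 80;

    # static HLS files served directly by nginx
    location /cam/ {
        root /var/www/html;
    }

    # everything else proxied through to node-red
    location / {
        proxy_pass http://127.0.0.1:1880;
        proxy_http_version 1.1;
        # needed for node-red's websocket-based editor
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```

SSL termination and basic auth would also go in this server block, so node-red itself never has to deal with them.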
I remember setting up a similar streaming solution back then, using the combination of ffmpeg and ffserver. Nowadays ffserver has been removed from the installation package, but I think it can still be found and downloaded. Unless there are better-suited web servers for this purpose.
Thanks @i8beef that makes a lot of sense.
1/ I set up ffmpeg to create an HLS stream:
```shell
ffmpeg -rtsp_transport tcp -i rtsp://admin:firstname.lastname@example.org/live/ch1 \
  -acodec copy \
  -vcodec copy \
  -hls_wrap 40 \
  -flags -global_header \
  /var/www/html/cam/cam.m3u8
```
2/ I installed nginx and made sure the service was always on.
3/ I checked with VLC.
VLC can read http://[Pi address]/cam/cam.m3u8
(although the stream is not as good as the original; that is another problem, I guess)
4/ In node-red I have not managed to play this stream, neither with the castv2 node to the Chromecast, nor with the cast node.
I tried this one too.
I would appreciate an example of flow that works.
Maybe it is mediaType I need to adapt but I tried many with no success so far.
I set up the original stream as H.264 720p 12fps, so according to this (supported media for google cast) it should be supported on that side.
The error message is "error load media".
For the other solution (nginx doing the whole thing) I will look at that later.
Definitely the wrong content type. You're telling it to load an MP4 but sending it an M3U8. First, if you are using a supported content type (you are), you don't have to set the content type at all; but if you DO, you need to set the right one (for m3u8 that's "application/x-mpegURL").
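For example, with node-red-contrib-castv2 the message payload would look roughly like the following; the exact shape depends on which cast node you're using, so treat this as a sketch, and the IP address is a placeholder:

```json
{
    "app": "DefaultMediaReceiver",
    "type": "MEDIA",
    "media": {
        "url": "http://192.168.1.50/cam/cam.m3u8",
        "contentType": "application/x-mpegURL"
    }
}
```

The key point is the contentType matching the m3u8 playlist rather than an MP4 type.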
VLC will play just about anything, while Chromecast is much more exacting... it's good you can at least get it to play there so you know your hosting is working, but that won't mean much for whether Chromecast will be happy with it, unfortunately. If it's not the content type issue, it might be that you need different values for ffmpeg. If I read that right, you're just copying the same encodings as the original stream... maybe one of the codecs of that original RTSP stream isn't actually supported and you need to transcode. I unfortunately can't help there as I don't know much about ffmpeg. Maybe someone else here can help you identify the codecs in it and check them against those supported by Google in the HLS container?
Well, I reworked the ffmpeg part quite a lot, with many trials, but didn't succeed so far.
The result is H.264 with AAC, 25fps, 720p.
Normally all of this is supported.
```shell
sudo ffmpeg -rtsp_transport tcp -i rtsp://admin:email@example.com:554/live \
  -copyts \
  -c:v h264 -b:v 4M \
  -crf 16 -quality realtime -cpu-used 8 \
  -c:a aac \
  -hls_wrap 40 \
  -flags -global_header \
  -t 20 \
  /var/www/html/cam/cam.m3u8
```
This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.