Storing video as mp4

Hi folks,

Today a colleague of mine asked me if it is already possible to store cam footage using only Node-RED, so no third-party software. There have been a lot of interesting discussions going on about the mp4 nodes from @kevinGodell. I'm not sure at all whether Kevin's nodes can already be used for that purpose, or perhaps they are not useful for storing video for some reason?

Would appreciate it if somebody could give me some context, to enlighten my shrinking brain :wink:

Thanks a lot!!
Bart

Continuous recording or based on a trigger?

If continuous, an exec node using ffmpeg can stay connected to a video stream, make mp4, and write it to disk on a continuous basis, which would not need the mp4frag nodes.
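For the continuous case, the exec node's command field might look something like this (an untested sketch: the RTSP URL, credentials, and output path are placeholders, and your camera may need different options). The segment muxer cuts the stream into hour-long mp4 files named by timestamp, with no re-encoding:

```shell
ffmpeg -rtsp_transport tcp -i rtsp://user:pass@192.168.1.10/stream \
       -c copy -f segment -segment_time 3600 -segment_format mp4 \
       -reset_timestamps 1 -strftime 1 /home/pi/recordings/%Y%m%d_%H%M%S.mp4
```

Older files could then be pruned on a schedule to keep disk usage bounded.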

If based on some event or trigger, such as motion detection or a timer, you can have a 2nd exec node tell ffmpeg to read the video stream from node-red-contrib-mp4frag via the http api and make an mp4 video from that. Currently, there is no way of getting the video from node-red-contrib-mp4frag directly from within the node. See: Show ip camera in dashboard and store prefixed buffer video

That is indeed a good question.
I 'assume' at the end he will need to have both. Just thinking out loud:

  • I can imagine that there might be cameras where e.g. no sensor is nearby, so you have continuous recording, and the footage is overwritten or removed after some while...
  • But some cameras would only start recording as soon as a nearby sensor triggers them.

So ffmpeg decodes the stream continuously and stores little mp4 files? Or one single big mp4 file?
I had expected that two exec nodes would somehow be necessary: one for decoding the stream to images (so you can show them e.g. in a dashboard), and another one to encode them back to mp4 for storage, so that the images would be traversing the wires of the Node-RED flow. Or is that impossible (in a performant way), or are there other disadvantages?

Ah, so not the mp4 fragments are being stored, but you should create an mp4 file from the fragments? Can this mechanism then also be used for continuous storage?

Ok. And is that a technical limitation, or just the limits of your spare time?

I'm glad that you have joined the discussion, because I had advised my colleague some time ago to use Node-RED for his home automation, so I'd hate to tell him now afterwards that something is not possible with Node-RED in his house :wink:

I have thought about it many times, with no clear idea of how to integrate it the node-red way. Either I take the internal mp4frag object that lives inside node-red-contrib-mp4frag and make it available via context, or use a 2nd output of node-red-contrib-mp4frag and pass a wrapper object that gives limited access to its api for getting chunks of mp4. At the very least, I feel like it would have to be a separate node that would handle the creation of mp4 files.

Hmm, I understand your doubts. Especially since wrong choices can limit your future developments...

Something like the http request being passed between the http-in and http-out nodes, I assume. If I'm not mistaken, there are some arguments against it, e.g. it doesn't work when the message is being serialized to send it to another remote Node-RED flow via pluggable message routing ...

Could it perhaps be a solution to move that functionality from your mp4frag node into a config node, so that you can share that config node between an mp4frag node and a (future) mp4recorder node? Something like this:

[image: sketch of a shared config node used by both an mp4frag node and an mp4recorder node]

Then the user can create a config node for every video source, and select those in a dropdown from both the mp4frag node and the mp4recorder node.

And then you don't end up with invisible wires, which Node-RED freaks don't really like :wink:
But perhaps it cannot work that way, because I don't know your mp4frag node well enough.

It is just an idea ...

1 Like

I was doing a little more thinking and I should have a way to make the buffered mp4 video pieces available without needing to hit the http api and deal with authentication middleware.

  • on input, listen for a command stating whether to start or stop outputting video buffer data on the 2nd output (like turning a water spigot on or off)
  • add a 2nd output on the node, meant for outputting the video buffer (the spigot)
  • each chunk of video data output will also state its type, such as 'init' or 'segment', and will also have to contain a boolean stating that it is the 'last' segment, so that the receiving node knows when to cut the file to disk
  • the receiving node will be responsible for creating a writestream, accumulating video chunks, and writing the file to disk

Kevin, the existing file out node will accept messages with a parts property (e.g. from a split node), and as long as the last part is set correctly it will close the file. The easiest way to see what I mean is to use a file in node to read a file either a chunk at a time or a line at a time, then feed it to a debug node to see the msg.parts.

1 Like

For best results and control over the mp4 fragments, etc., I think node-red-contrib-mp4frag will save its own files to disk. When finished writing a file to disk, it will pass a msg to the next node containing the file path and some other status. Of course, you will still be able to use a separate exec node with ffmpeg to read from the http api and make your own mp4 file.

The reason for containing the file creation within this node instead of outputting its raw buffer follows:

I ran into a little roadblock pertaining to the new feature of ffmpeg to output fragments that do not contain a keyframe/idr/iframe. This happens based on whether we use frag_keyframe vs frag_every_frame. The way the mp4 fragments work is that I separate the initialization fragment and the media segments. At any point, if I were to combine the initialization fragment with a media segment and write it to disk, it would output a perfectly valid and playable mp4 video. But if I do that without ensuring that the media segment contains a keyframe, the mp4 will not work and ends up just being black with no video rendered.

Much research has led me to the conclusion that I have to parse a little part of a media segment to know if it contains a keyframe. That data is contained in the mp4 atom trun, which is inside traf, inside of moof: moof->traf->trun.

An example of trun data:

has keyframe <Buffer 00 00 0a 05 00 00 00 02 00 00 00 80 02 00 00 00 00 00 1c 7e 00 00 20 00 00 00 06 a5 00 00 50 00>

no keyframe <Buffer 00 00 0a 01 00 00 00 02 00 00 00 7c 00 00 05 74 00 00 20 00 00 00 01 ab 00 00 00 00>

The final determination comes from the flags depends_on = 2 and non_sync_sample = 0. I am attempting to borrow code from hls.js/mp4-inspect.js at master · video-dev/hls.js · GitHub to come up with a solution so that I can ensure that the leading media segment contains a keyframe/idr/iframe when saving to disk.
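To illustrate what that check could look like (a sketch based on the ISO BMFF trun layout, not the actual hls.js-derived code): given the trun box payload starting at the version byte, as in the Buffers above, the first sample's depends_on and non_sync_sample bits can be read like this:

```javascript
// Sketch: decide whether the first sample described by a trun box is a keyframe.
// `trun` is the box payload beginning at the version byte (size/type header stripped).
function firstSampleIsKeyframe(trun) {
    const trFlags = trun.readUIntBE(1, 3);          // 24-bit tr_flags after the version byte
    if (!(trFlags & 0x000004)) return false;        // first-sample-flags not present
    let offset = 4 + 4;                             // skip version/flags and sample_count
    if (trFlags & 0x000001) offset += 4;            // skip data_offset if present
    const sampleFlags = trun.readUInt32BE(offset);
    const dependsOn = (sampleFlags >>> 24) & 0x03;  // 2 => does not depend on other samples
    const nonSync = (sampleFlags >>> 16) & 0x01;    // 0 => sync sample (idr)
    return dependsOn === 2 && nonSync === 0;
}
```

Fed the two example buffers above, this returns true for the first and false for the second.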

p.s. I could just cheat and measure the size of the segment and use that to make the determination. The segments containing a keyframe are always much larger than those containing only B-frames and P-frames. But that seems rather hackish.

2 Likes

Now that I am tracking the keyframe segments (containing an idr), I can save the mp4 to disk and it is playable. The current roadblock I am facing is video playback, with vlc reporting an incorrect duration. For example, instead of showing 22 seconds, it may show 36 hours 42 minutes 10 seconds, which causes the video player scrubber to be pegged to the right. This is because the timestamps and duration are embedded in the media segments based on when the video was started by ffmpeg. A simple fix is passing the data through ffmpeg again and stream copying it, which will remaster it and fix the durations. But I really don't want to call ffmpeg within node-red-contrib-mp4frag. Unless I can figure out how to have mp4box.js remux the mp4 and possibly fix the durations, the recording will have to take place outside of node-red-contrib-mp4frag, in some function node receiving the mp4 buffer and passing it to a spawned ffmpeg.

With mp4, whatever you are using to write the file has to close it so the metadata (duration etc.) is written to the file.

@Colin You are correct. The only problem is that I am not writing the mp4, but copying the fragmented mp4 to disk. The pieces have already been written, and rewriting them is beyond my skill level. The real issue is that I must convert the fragmented mp4 back into a regular mp4 without re-encoding the actual video content, just changing the metadata contained in the mp4 atom boxes to make the video files more playable.

Also, I have learned how to use mp4box.js and realize that it cannot do what I ask of it. Luckily, ffmpeg is perfectly qualified to help me, and it does the job extremely fast and efficiently, since we are stream copying without re-encoding (remuxing only).

As of this morning, I think I have the idea of how this node should handle recording. I am trying not to make the single node do the job of many separate nodes, but...

node-red-contrib-mp4frag

  • will have new options for saving video internally or externally ( not sure what to name that option )
  • if externally, will output video buffer on 2nd output to be handled by node of your choice
  • if internally, then it will save file to disk and then output the data describing the recorded video on the 2nd output, such as file name and path, etc.
  • either internal or external, give option to remaster video with ffmpeg to fix durations
  • you will provide path to ffmpeg ( and be responsible for its installation ) and any additional args such as -movflags +faststart
  • if receiving a "record" command while already recording, it will extend the time of recording (until reaching some hard limit so that you don't accidentally create an mp4 that is 100 hours long, etc.)
  • size/time/duration limit might be set in settings.js so that you have to make that deliberate decision
  • option to record starting from the most recent keyframe or the oldest keyframe in memory (for including video that occurred before the triggering event, such as capturing 5 seconds of video before recording is triggered)

Any feedback or suggestions before I start working on this is appreciated. I might have an hour later to tinker with this, or maybe a few hours this weekend.

3 Likes

Most of this is way above my level of understanding, but with respect to closing a file to save its metadata, I vaguely remember I used C's dup on an open handle, which allows the old handle to close while keeping the existing pipeline intact.

Could someone here please show what the ui_template code looks like when displaying an mp4 file held on the local hard disk?
I can point the browser at file:///home/user/stream.mp4 and it plays; I just cannot play the file within the dashboard ui_template.

w.r.t. the topic, and for time lapse, I use a modified version of the advanced web streaming example https://picamera.readthedocs.io/en/release-1.13/recipes2.html#web-streaming that produces a low-lag (100ms-ish) mjpeg stream from a pi zero w.

In NR I use the node-red-contrib-mjpgcamera node to capture .jpg files into a yyyy/mm/dd folder.
Then with 'cat ./capture/2021/03/12/*.jpg | ffmpeg -f image2pipe -c:v mjpeg -i - -filter:v "setpts=0.1*PTS" todays_video.mp4' I get an mp4 time lapse video that plays in the browser.

So for a pipelined approach, take the mjpeg stream from the pi, convert each frame to a jpg (in my case with mjpg-consumer), and pipe it into the ffmpeg command line above.

P.S. Thank you to all those working on this. I only wish I was more technically competent in this area.

I am now able to send my mp4 video buffer to the "file out" node and it successfully writes the file. I was picking apart the file in/file out nodes and found that the "file in" node sends the "parts" data, but the "file out" node does not use it, unless that is maybe in a newer version than what I have installed. It works well with me just sending a message structured as { payload: <buffer>, filename: 'some_name' } without including any parts info. As long as my follow-up messages include the same filename, it keeps appending the buffer to the file, as it should.

1 Like

You are correct of course - no idea why I got that mixed up - too many nodes I guess :-0... but yes, in theory we could add parts handling around line 159 or so, and if the parts are there then maybe decide not to close the file, even if the filename is dynamic, until we see the last part - but then, as with all nodes that handle parts, there would need to be a way to force it to close in case the last part never arrives. So maybe it's fine as is for now.

But as I said, if you do supply the filename, the default behaviour is to close after every write - so it would be faster to fix the filename in the config and not supply it via the msg - but hey, if it's working anyway, that is great.

For the intermittent recording of mp4 video, most likely the filename will have to be created dynamically, probably based on some timestamp. Originally, I thought it would be a performance issue if the file closed after each write, but I haven't tested enough to be able to put a load on the pi to see if it will be an issue. Realistically, most video chunks will arrive in 2 second intervals, based on the lowest iframe interval of most ip rtsp cams that I have tested, unless someone chooses the other ffmpeg movflags that cause the segments to be output on non-keyframes (frag_keyframe vs frag_every_frame).

If it really becomes a performance issue, then maybe a solution could be to keep the writeStream in memory, perhaps in a Map, and set a timer since the last chunk received. If x number of seconds pass without receiving a new chunk, then close the writeStream. But like you said, it is fine for now.

I can't say how, but I can say what. The best way that I can think of to get those video files from the disk to your browser would be to serve them via http. Is there any node that is designed to run as a simple server, serving static files from a directory?
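One built-in possibility (assuming serving from the Node-RED instance itself is acceptable): Node-RED's settings.js has an httpStatic property that serves a directory of static files, so recordings saved there become reachable over http without any extra node. The path below is only an example:

```javascript
// Fragment of ~/.node-red/settings.js
module.exports = {
    // ...existing settings...

    // Files under this folder are served from the root of the Node-RED server,
    // e.g. /home/pi/mp4frag/fr24_2/clip.mp4 -> http://<host>:1880/fr24_2/clip.mp4
    httpStatic: '/home/pi/mp4frag/',
};
```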

Maybe this could help a bit. I have a view in my system allowing me to play the latest recorded mp4 videos from my security cameras. I put the specific code in a ui_template node as a table with rows for the camera recordings. I hope this could be a starting point for you

<table>
    <tr><!-- Row 1 -->
    		<td style="text-align: center"><video width="405px" controls autoplay><source id="CAM31" src="/cam31.mp4" type="video/mp4"></video></td>
    		<td style="text-align: center"><video width="405px" controls autoplay><source id="CAM12" src="/cam12.mp4" type="video/mp4"></video></td>
    		<td style="text-align: center"><video width="405px" controls autoplay><source id="CAM41" src="/cam41.mp4" type="video/mp4"></video></td>
    		<td style="text-align: center"><video width="405px" controls autoplay><source id="CAM11" src="/cam11.mp4" type="video/mp4"></video></td>
    </tr>
</table>
<table>
    <tr><!-- Row 2 -->
    		<td style="text-align: center"><video width="405px" controls autoplay><source id="CAM32" src="/cam32.mp4" type="video/mp4"></video></td>
    		<td style="text-align: center"><video width="405px" controls autoplay><source id="CAM22" src="/cam22.mp4" type="video/mp4"></video></td>
    		<td style="text-align: center"><video width="405px" controls autoplay><source id="CAM21" src="/cam21.mp4" type="video/mp4"></video></td>
    </tr>
</table>

I made a little progress on triggering an mp4 recording. If you are brave enough, you can try the #recorder branch.

npm install kevinGodell/node-red-contrib-mp4frag#recorder

A 2nd output has been added to the node for feeding the video buffer to the file node.
The output can be triggered by passing an object to the input, such as

{
  action: {
    subject: 'write',
    command: 'start',
    keyframe: -1,
    timeLimit: 5000,
    sizeLimit: 2500000
  }
}

Hopefully the settings and help text and example flow here will be good enough to explain how to use it.

flow:

[{"id":"9215bc1c.b408d","type":"inject","z":"28dd399e.972736","name":"Start stream","props":[{"p":"payload"}],"repeat":"","crontab":"","once":false,"onceDelay":"1","topic":"","payload":"true","payloadType":"bool","x":110,"y":100,"wires":[["f001af15.29445"]]},{"id":"f001af15.29445","type":"switch","z":"28dd399e.972736","name":"","property":"payload","propertyType":"msg","rules":[{"t":"true"},{"t":"false"}],"checkall":"true","repair":false,"outputs":2,"x":261,"y":100,"wires":[["40073444.e625bc"],["a1330022.ca53c"]]},{"id":"40073444.e625bc","type":"exec","z":"28dd399e.972736","command":"ffmpeg -re -i http://f24hls-i.akamaihd.net/hls/live/221147/F24_EN_HI_HLS/master_2000.m3u8 -c:v copy -c:a aac -f mp4 -movflags +frag_keyframe+empty_moov+default_base_moof pipe:1","addpay":false,"append":"","useSpawn":"true","timer":"","oldrc":false,"name":"france 24 news","x":480,"y":100,"wires":[["1d68b87a.0fefc8"],[],["1d68b87a.0fefc8"]]},{"id":"a1330022.ca53c","type":"function","z":"28dd399e.972736","name":"stop","func":"msg = {\n kill:'SIGHUP',\n payload : 'SIGHUP' \n}\n\nreturn msg;","outputs":1,"noerr":0,"initialize":"","finalize":"","x":281,"y":149,"wires":[["40073444.e625bc"]]},{"id":"80cd04c4.71b318","type":"inject","z":"28dd399e.972736","name":"Stop stream","props":[{"p":"payload"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"false","payloadType":"bool","x":110,"y":146,"wires":[["f001af15.29445"]]},{"id":"1d68b87a.0fefc8","type":"mp4frag","z":"28dd399e.972736","name":"","migrate":1e-9,"hlsPlaylistSize":"20","hlsPlaylistExtra":"10","basePath":"fr24_2","processVideo":true,"commandPath":"ffmpeg","commandArgs":"[\"-loglevel\",\"quiet\",\"-f\",\"mp4\",\"-i\",\"pipe:0\",\"-f\",\"mp4\",\"-c\",\"copy\",\"-movflags\",\"+faststart+empty_moov\",\"-t\",\"60\",\"-fs\",\"8000000\",\"pipe:1\"]","x":720,"y":100,"wires":[[],["ea3f12ef.4b81f"]]},{"id":"ea3f12ef.4b81f","type":"file","z":"28dd399e.972736","name":"","filename":"","appendNewline":false,"createDir":true,"overwriteFile":"false","encoding":"none","x":770,"y":200,"wires":[[]]},{"id":"d602be42.d4dbc","type":"inject","z":"28dd399e.972736","name":"write start -1, 5000, 2500000","props":[{"p":"action","v":"{\"subject\":\"write\",\"command\":\"start\",\"keyframe\":-1,\"timeLimit\":5000,\"sizeLimit\":2500000}","vt":"json"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payloadType":"str","x":460,"y":200,"wires":[["1d68b87a.0fefc8"]]},{"id":"a1640b3b.120f68","type":"inject","z":"28dd399e.972736","name":"write start -5, 5000, 2500000","props":[{"p":"action","v":"{\"subject\":\"write\",\"command\":\"start\",\"keyframe\":-5,\"timeLimit\":5000,\"sizeLimit\":2500000}","vt":"json"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","x":480,"y":240,"wires":[["1d68b87a.0fefc8"]]},{"id":"a965e1bc.5d78d","type":"inject","z":"28dd399e.972736","name":"write start with defaults","props":[{"p":"action","v":"{\"subject\":\"write\",\"command\":\"start\"}","vt":"json"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","x":520,"y":280,"wires":[["1d68b87a.0fefc8"]]},{"id":"b7066f7e.57fd3","type":"inject","z":"28dd399e.972736","name":"write stop","props":[{"p":"action","v":"{\"subject\":\"write\",\"command\":\"stop\"}","vt":"json"}],"repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payloadType":"str","x":580,"y":320,"wires":[["1d68b87a.0fefc8"]]}]

One point to note, which may not be obvious: when sending a triggering object to start the video output, if you send a start command before the current output finishes, it will extend the currently running process by resetting the current timeLimit and sizeLimit.

For example, if you send this:

{
  action: {
    subject: 'write',
    command: 'start',
    keyframe: -1,
    timeLimit: 5000,
    sizeLimit: 2500000
  }
}

It will cause the video output to run for 5 seconds or 2.5 MB, whichever is reached first. If you send that same command after just 3 seconds have passed, then the values are reset per message; you may end up with an 8 second video.

Another scenario for this is that you have some triggering event, such as motion detection, sending many triggers to record a video. Each trigger extends the recording time (not cumulatively). If you need to ensure that the resulting file stays within a certain maximum duration or file size, you can pass the -t or -fs parameter to ffmpeg and it will exit gracefully at that point.

Don't hesitate to critique or complain before I merge this to the main branch. Nothing is final at this point. Seems like it works ok, but please try to break it.

1 Like

Hi Kevin,
Thanks for all the time you have spent finding a solution for my question!!! And marvellous to see that you have figured out how to do it the Node-RED way, by delegating the file related stuff to the existing file nodes!!!
Since I'm not able to work on my computer at the moment, I would appreciate it if some other folks could test it and give feedback!!!
Bart

1 Like

It is working great! Recordings are made with high quality; I found them in the /home/pi/mp4frag/fr24_2 folder.

Some things I started to think a bit about:

  • option to configure the path for videos: you may want to be able to set it to an SSD or a network share. I suppose you could mount a share or external SSD in fstab to map the mp4frag folder, but it would maybe be easier for the end user to have a configuration option, also to make it more platform independent

  • option to configure the file name of the recorded video: suppose you are just interested in the latest recording made for a specific camera, you might want the previous recording to be overwritten and replaced under the same file name; this would make it easier to create a ui showing the latest recording

  • in a ui, make it possible to scroll through and select recordings to be viewed, with the list sorted latest first. Could this be an additional feature of the ui_mp4frag node?