Display camera IP stream in dashboard and store video on Raspberry Pi

Hi BartButenaers, can I do the same thing you mentioned in your first suggestion using a Raspberry Pi camera instead of an IP camera?

Hello @nassima100,
This has become a rather long thread, but I guess you mean this one:

If that is your question: it makes no difference whether the images (running through your Node-RED flow) come from an IP camera, a Raspicam or anything else. But I haven't managed to get that working, especially since the timing of the images in the video files (e.g. mp4) has to be correct; otherwise the playback rate afterwards would be wrong ...


In January I was completely stuck, so I contacted Kevin Godell. I got immediate help from Kevin, so I would like to thank him for his time! After some research, he came up with the following ffmpeg command:

-f mjpeg -i pipe:0 -vf scale=320:240 -f image2pipe pipe:1

When you use this command in Node-RED's daemon node, it becomes possible to pass images via memory (stdin stream pipe:0) to ffmpeg (we (ab)use an mjpeg stream to be able to process an infinite series of images). The resized image is returned to Node-RED as an output message (stdout stream pipe:1):

[{"id":"122804f3.f8b7eb","type":"daemon","z":"50df184.124f2e8","name":"Resize image","command":"/home/pi/FFmpeg/ffmpeg","args":"-f mjpeg -i pipe:0 -vf scale=320:240 -f image2pipe pipe:1","autorun":true,"cr":false,"redo":true,"op":"buffer","closer":"SIGTERM","x":560,"y":320,"wires":[["98ce6e9f.81789"],[],[]]},{"id":"631c9e62.c5bf1","type":"inject","z":"50df184.124f2e8","name":"","topic":"","payload":"","payloadType":"date","repeat":"","crontab":"","once":false,"onceDelay":0.1,"x":180,"y":340,"wires":[["afda2c1f.ec754"]]},{"id":"dfde7d7e.a8b0b","type":"image","z":"50df184.124f2e8","name":"","width":200,"x":980,"y":300,"wires":[]},{"id":"afda2c1f.ec754","type":"http request","z":"50df184.124f2e8","name":"","method":"GET","ret":"bin","paytoqs":false,"url":"https://static.independent.co.uk/s3fs-public/thumbnails/image/2017/09/12/11/naturo-monkey-selfie.jpg?w968","tls":"","proxy":"","authType":"basic","x":340,"y":340,"wires":[["122804f3.f8b7eb","c886e69d.47aa98"]]},{"id":"93ccc4c3.6f45c8","type":"image","z":"50df184.124f2e8","name":"","width":200,"x":720,"y":400,"wires":[]},{"id":"c886e69d.47aa98","type":"image-info","z":"50df184.124f2e8","name":"","x":530,"y":400,"wires":[["93ccc4c3.6f45c8"]]},{"id":"98ce6e9f.81789","type":"image-info","z":"50df184.124f2e8","name":"","x":790,"y":300,"wires":[["dfde7d7e.a8b0b"]]},{"id":"853bae8d.c380d","type":"comment","z":"50df184.124f2e8","name":"Image processing without spawn but with delay","info":"-f mjpeg -i pipe:0 -vf scale=320:240 -f image2pipe pipe:1","x":660,"y":260,"wires":[]}]


Advantages:
  • All data is passed via memory; no temporary files (or disk IO) are needed.
  • Only a single ffmpeg daemon process needs to be spawned, which processes all images. I.e. it is not necessary to spawn a new ffmpeg process for every image, as e.g. the ffmpeg-stream package does.

Disadvantage: there is a single-image delay :woozy_face: When you inject an image, nothing happens. As soon as you inject the second image, the first image arrives on the output. And so on ... Not 100% sure what the problem is, but after the first image is injected I only get this output (on stderr, after adding -loglevel debug):

[mjpeg @ 0x7fffcd526700] Before avformat_find_stream_info() pos: 0 bytes read:32768 seeks:0 nb_streams:1

This is similar to an issue I had before: since we (ab)use an mjpeg stream, ffmpeg wants to collect information about that stream. And I think that causes the delay. But I'm not sure ...

Unless somebody else has some useful tips, I will have to stop with this mechanism :weary: That would be a pity, since we are almost there. And it would have been an awesome addition for Node-RED ... P.S. I'm not referring to the image resizing itself, because that is only a simple test which you can also achieve with node-red-contrib-image-tools in pure JavaScript ...

Alright thanks sir.

Hey Bart.

Did you ever find a solution to get rid of the "single image delay" problem?

Morning Kevin,
Nice to hear from you again!
Unfortunately not. And I had some other issues with the pipes (e.g. creating mp4 files did not seem to be possible). So in the end I decided to go for a file-based approach, since that is the standard way to use it. And for systems that mostly don't have a hard drive (like a Raspberry Pi 3), I have been experimenting with in-memory filesystems. Those allow you to achieve better results than the pipe approach (i.e. all ffmpeg commands are possible). You only need enough RAM ...
I have summarized it into another discussion.
Kind regards,

That is strange. The other discussion was "automatically closed 60 days after the last reply" and this much older one was still open.

FYI, piping mp4 is possible as long as you pass the correct -movflags. I connect to my rtsp IP cams and use ffmpeg to mux them into mp4 fragments, which I pass to the Media Source Extensions API in the browser. I also encode some jpegs to send to the browser. No files are ever written to disk.


A while ago a change to the forum was made so that threads marked as solved would automatically be closed 60 days after the last reply. Threads not marked as solved remain open.


Hi Kevin, that sounds very interesting! Would it be possible to share some examples of such ffmpeg command lines? I mean from rtsp -> mp4
Best regards, Walter


Interesting about the mp4. I had read somewhere that it was not possible...
And with the jpeg encoding, do you also have a one-image delay?

There are some options when it comes to piping an mp4. You can take an RTSP input stream and mux it (without re-encoding) out to mp4. The key to a pipe-able mp4 is the -movflags. Some combinations of -movflags include, but are not limited to, +frag_keyframe or +empty_moov or +dash, or any combination including at least one of those. Without the movflags, ffmpeg will error with "muxer does not support non seekable output".

Some actual examples:

ffmpeg -loglevel quiet -progress pipe:1 -hwaccel auto -probesize 32 -analyzeduration 0 -reorder_queue_size 0 -rtsp_transport tcp -i rtsp:// -an -c:v copy -f mp4 -movflags +dash+negative_cts_offsets -metadata title="ip" pipe:3 -c mjpeg -q 10 -r 7 -vf scale=trunc(iw*0.75/2)*2:-2 -f image2pipe pipe:4

ffmpeg -loglevel info -progress pipe:3 -hwaccel auto -analyzeduration 10000000 -probesize 1048576 -re -f hls -i http://commondatastorage.googleapis.com/gtv-videos-bucket/CastVideos/hls/TearsOfSteel.m3u8 -c:a aac -c:v copy -f mp4 -movflags +dash+negative_cts_offsets pipe:1 -c mjpeg -q 10 -r 7 -vf scale=trunc(iw*0.75/2)*2:-2 -f image2pipe pipe:4

You may notice the pipe:3, pipe:4, etc. With nodejs, we are able to create pipes beyond stdin, stdout and stderr.

I don't have a jpeg delay because the jpegs are being encoded from an RTSP source, which constantly pushes input through. From what I remember, you were trying to push in jpegs and have them output in a different size. There must be some magical ffmpeg flags that we have not yet discovered. Perhaps try adding -fflags +nobuffer+flush_packets -flags low_delay

Kevin, thank you very much for this!

Kevin, I will try your proposal next weekend...

Hi Kevin, I am trying to test your 2nd command line to display the "TearsOfSteel" mp4 video in the Node-RED dashboard (as I usually do with my IP cam RTSP stream):
ffmpeg -loglevel info -progress pipe:3 -hwaccel auto -analyzeduration 10000000 -probesize 1048576 -re -f hls -i http://commondatastorage.googleapis.com/gtv-videos-bucket/CastVideos/hls/TearsOfSteel.m3u8 -c:a aac -c:v copy -f mp4 -movflags +dash+negative_cts_offsets pipe:1 -c mjpeg -q 10 -r 7 -vf scale=trunc(iw*0.75/2)*2:-2 -f image2pipe pipe:4
This is my test flow :

[{"id":"8c6b37c4.a5aef8","type":"inject","z":"fe2c7bc6.bd49c8","name":"Start stream","repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"true","payloadType":"str","x":110,"y":4910,"wires":[["dc8a0ca9.117e9"]]},{"id":"2de27e17.be3c12","type":"inject","z":"fe2c7bc6.bd49c8","name":"Stop all streams","repeat":"","crontab":"","once":false,"onceDelay":0.1,"topic":"","payload":"false","payloadType":"str","x":120,"y":4950,"wires":[["dc8a0ca9.117e9"]]},{"id":"40f893c3.a21d1c","type":"image-info","z":"fe2c7bc6.bd49c8","name":"","x":500,"y":4950,"wires":[["21cccabd.e70df6"]]},{"id":"15f19ef2.c099d1","type":"base64","z":"fe2c7bc6.bd49c8","name":"b64","action":"str","property":"payload","x":630,"y":4850,"wires":[["8b9cf7bd.a00318"]]},{"id":"8b9cf7bd.a00318","type":"ui_template","z":"fe2c7bc6.bd49c8","group":"51c96dc0.e62d64","name":"dash oeuf","order":1,"width":6,"height":4,"format":"<img width=\"16\" height=\"360\" src=\"data:image/jpg;base64,{{msg.payload}}\"/>","storeOutMessages":true,"fwdInMessages":true,"templateScope":"local","x":760,"y":4850,"wires":[[]]},{"id":"361bb3b.15d904c","type":"ui_switch","z":"fe2c7bc6.bd49c8","name":"on/off stream","label":"","tooltip":"","group":"51c96dc0.e62d64","order":4,"width":1,"height":1,"passthru":true,"decouple":"false","topic":"","style":"","onvalue":"true","onvalueType":"str","onicon":"","oncolor":"","offvalue":"false","offvalueType":"str","officon":"","offcolor":"","x":110,"y":4930,"wires":[["dc8a0ca9.117e9"]]},{"id":"dc8a0ca9.117e9","type":"switch","z":"fe2c7bc6.bd49c8","name":"","property":"payload","propertyType":"msg","rules":[{"t":"eq","v":"true","vt":"str"},{"t":"eq","v":"false","vt":"str"}],"checkall":"true","repair":false,"outputs":2,"x":270,"y":4890,"wires":[["2ca9755e.222aaa"],["a3c294f8.aed6b8"]]},{"id":"a3c294f8.aed6b8","type":"function","z":"fe2c7bc6.bd49c8","name":"stop","func":"msg= [\n    {\n    kill:'SIGTERM',\n    payload : 'SIGTERM'\n    }\n    \n    \n    ];     // set a new payload & the 
counter\nreturn msg;","outputs":1,"noerr":0,"x":270,"y":4930,"wires":[["2ca9755e.222aaa"]]},{"id":"21cccabd.e70df6","type":"msg-speed","z":"fe2c7bc6.bd49c8","name":"","frequency":"sec","estimation":true,"ignore":false,"x":590,"y":4950,"wires":[["794946e8.e0b618","2cbb32f3.899c1e"],[]]},{"id":"2ca9755e.222aaa","type":"exec","z":"fe2c7bc6.bd49c8","command":"ffmpeg -loglevel info -progress pipe:3 -hwaccel auto -analyzeduration 10000000 -probesize 1048576 -re -f hls -i \"http://commondatastorage.googleapis.com/gtv-videos-bucket/CastVideos/hls/TearsOfSteel.m3u8\" -c:a aac -c:v copy -f mp4 -movflags +dash+negative_cts_offsets pipe:1 -c mjpeg -q 10 -r 7 -vf scale=trunc(iw*0.75/2)*2:-2 -f image2pipe pipe:4","addpay":false,"append":"","useSpawn":"true","timer":"","oldrc":false,"name":"Decode RTSP stream","x":440,"y":4890,"wires":[["40f893c3.a21d1c","15f19ef2.c099d1"],["67ffbec1.566cf"],[]],"info":"oeuf:   MARCHE PAS \nffmpeg -rtsp_transport tcp -i  \"rtsp://admin:271273@\" -f image2pipe pipe:1\n\nmisecu : MARCHE OK\nffmpeg -rtsp_transport tcp -i  \"rtsp://\" -filter:v fps=fps=5 -f image2pipe pipe:1\n\nORIGINAL -> MP4:\n------------------\nffmpeg -loglevel quiet -progress pipe:1 -hwaccel auto -probesize 32 -analyzeduration 0 -reorder_queue_size 0 -rtsp_transport tcp -i rtsp:// -an -c:v copy -f mp4 -movflags +dash+negative_cts_offsets -metadata title=\"ip\" pipe:3 -c mjpeg -q 10 -r 7 -vf scale=trunc(iw*0.75/2)*2:-2 -f image2pipe pipe:4\n                ou\nffmpeg -loglevel info -progress pipe:3 -hwaccel auto -analyzeduration 10000000 -probesize 1048576 -re -f hls -i http://commondatastorage.googleapis.com/gtv-videos-bucket/CastVideos/hls/TearsOfSteel.m3u8 -c:a aac -c:v copy -f mp4 -movflags +dash+negative_cts_offsets pipe:1 -c mjpeg -q 10 -r 7 -vf scale=trunc(iw*0.75/2)*2:-2 -f image2pipe pipe:4\n\n\nmisecu -> MP4 : MARCHE PAS \nffmpeg -loglevel quiet -progress pipe:1 -hwaccel auto -probesize 32 -analyzeduration 0 -reorder_queue_size 0 -rtsp_transport tcp -i \"rtsp://\" -an 
-c:v copy -f mp4 -movflags +dash+negative_cts_offsets -metadata title=\"ip\" pipe:3 -c mjpeg -q 10 -r 7 -vf scale=trunc(iw*0.75/2)*2:-2 -f image2pipe pipe:4\n\n\nffmpeg -loglevel quiet -progress pipe:1 -hwaccel auto -probesize 32 -analyzeduration 0 -reorder_queue_size 0 -rtsp_transport tcp -i rtsp:// -an -c:v copy -f mp4 -movflags +dash+negative_cts_offsets -metadata title=\"ip\" pipe:3 -c mjpeg -q 10 -r 7 -vf scale=trunc(iw*0.75/2)*2:-2 -f image2pipe pipe:4\n\nffmpeg -loglevel quiet -progress pipe:1 -hwaccel auto -probesize 32 -analyzeduration 0 -reorder_queue_size 0 -rtsp_transport tcp -i \"rtsp://\" -an -c:v copy -f mp4 -movflags +dash+negative_cts_offsets -metadata title=\"ip\" pipe:3 -c mjpeg -q 10 -r 7 -vf scale=trunc(iw*0.75/2)*2:-2 -f image2pipe pipe:4\n\n\n\n\n\n"},{"id":"794946e8.e0b618","type":"ui_valuetrail","z":"fe2c7bc6.bd49c8","group":"51c96dc0.e62d64","order":3,"width":3,"height":1,"name":"","label":"","blur":false,"minmax":false,"showvalue":false,"decimals":"0","colorLine":"#ff9900","colorFromTheme":true,"stroke":2,"property":"payload","pointcount":"24","x":800,"y":4950,"wires":[]},{"id":"2cbb32f3.899c1e","type":"ui_text","z":"fe2c7bc6.bd49c8","group":"51c96dc0.e62d64","order":5,"width":1,"height":1,"name":"fps","label":"{{msg.payload}}fps","format":"","layout":"row-center","x":710,"y":4950,"wires":[]},{"id":"67ffbec1.566cf","type":"debug","z":"fe2c7bc6.bd49c8","name":"complet","active":true,"tosidebar":true,"console":false,"tostatus":false,"complete":"true","targetType":"full","x":690,"y":4890,"wires":[]},{"id":"51c96dc0.e62d64","type":"ui_group","z":"","name":"camOeuf","tab":"250de740.57ee78","order":9,"disp":true,"width":"6","collapse":true},{"id":"250de740.57ee78","type":"ui_tab","z":"","name":"Dashboard","icon":"dashboard","order":1,"disabled":false,"hidden":false}]

but without success.

Do you see anything wrong?

Best regards

That command as-is will probably not work. It was just an example of creating a streaming mp4 using the movflags. The various data is output to the pipe:N's and has to be handled by some script that receives the raw data.

In response to the yellow highlighted code: I always see the deprecated pixel format warning and it seems to have no negative side effect. Instead of using +dash+negative_cts_offsets, try just +frag_keyframe. You can get rid of -vf scale=trunc(iw*0.75/2)*2:-2, or change it to -vf scale='trunc(iw*0.75/2)*2:-2'. The quoting needed may differ depending on the context in which it is called.

As of right now, I don't know enough about Node-RED to understand how you handle the data flow. I read through the docs but I was unable to figure out how to get started. Perhaps you can educate me?

If you tell me what you are trying to accomplish, I can probably suggest some appropriate ffmpeg command line to use.


Many of us use IP cameras which stream rtsp. To display these streams on the Node-RED dashboard we use this flow, created thanks to Bart (@BartButenaers) and his hard work on this use case:

The central element is the "DECODE RTSP STREAM" node, which executes a command like the one I am currently using (thanks again Bart):
ffmpeg -rtsp_transport tcp -i "rtsp://" -filter:v fps=fps=5 -f image2pipe pipe:1
We start or stop the rtsp stream with the "buttons" on the left.
The result is a succession of images which are encoded in Base64 and displayed in the dashboard as a video.
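For anyone following along, the Base64 step is essentially just a Buffer-to-string conversion. A rough sketch of the idea (toDataUrl is a hypothetical helper for illustration, not the actual code of the base64 node):

```javascript
// Rough sketch of what the Base64 step does: turn the raw jpeg Buffer coming
// out of ffmpeg into a base64 string that the dashboard template can embed in
// an <img> data URL.
function toDataUrl(jpegBuffer) {
  return 'data:image/jpeg;base64,' + jpegBuffer.toString('base64');
}
```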

It's satisfactory, nothing more. There is a delay of several seconds between launching the command and the display of the stream. The image in 720p is pixelated compared to the original (in VLC player),
[screenshot: VLC vs Node-RED comparison]
and full HD 1080p is not possible without getting "buffer saturated" errors or something like that.

Therefore, a new idea is welcome.

So I tested your command (which, as you said, can't work as-is).

OK, I will.

PM me if you want (so as not to pollute this topic) ?


I forgot to ask, or maybe I overlooked it, but what platform are you on? I have been experimenting with a Raspberry Pi 4 Model B rev 1.1 (4 GB) to see if it can handle my cctv system, and so far the results are very good. The tricky part was taking advantage of hardware decoding. I have a gist that you can look at.

If your issue is cpu load, based on the posted gif, then hardware decoding should help, since the jpeg is being created from h264 video. But of course, that is all platform dependent.


(Unfortunately) I'm running Node-RED on a Raspberry Pi 3B with only 1 GB of RAM.

I don't think this is due to a lack of CPU (I noted 36% CPU in 1080p and 15% in 640x360), because when I analyze the images:

A jpeg starts with the signature FF D8 FF ... and ends with FF D9.
The first image of a corrupted stream, generated by the flow's decoder, starts correctly with FF D8 FF but does not end with FF D9.
It seems to be cut off at the maximum size of the buffer, which appears to be 65536 bytes.
Since FF D9 is never found, we end up with an "Unknown image format" error.

This is a corrupt picture with a gray band = 65536 bytes

This is a good picture: 65387 bytes (< 65536 bytes)

That is easy enough to solve. I ran into that problem some years back. It was worse on Mac, whose pipe buffer size is 8192, while Linux is 65536 (or ~32000) and Windows is somewhere near ~90000, if I remember correctly. Apple's claim was that a smaller pipe size is more efficient, but I found it to be a huge pain when piping data into nodejs from external processes.

The data containing jpegs coming out of ffmpeg first has to be piped into some parser that can hold the data until the entire jpeg has been pushed in. At that point the jpeg buffer can be larger than the system pipe limit, and there is no limit when passing it on to further processing within nodejs.

In the gist, I show the usage of pipe2jpeg, which will do what you need. You can get the complete jpeg by listening to its "jpeg" event.
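For illustration, a stripped-down version of such a parser might look like this (a simplified sketch of the idea behind pipe2jpeg, not its actual code; a production parser has to be more careful, e.g. the FF D9 byte pair can in rare cases appear inside jpeg payload data such as embedded thumbnails):

```javascript
// Minimal jpeg re-assembler: buffer incoming pipe chunks and emit a complete
// image every time an EOI marker (FF D9) is found after an SOI marker (FF D8).
const SOI = Buffer.from([0xff, 0xd8]); // start of image
const EOI = Buffer.from([0xff, 0xd9]); // end of image

function createJpegParser(onJpeg) {
  let buffered = Buffer.alloc(0);
  return function write(chunk) {
    buffered = Buffer.concat([buffered, chunk]);
    let soi;
    let eoi;
    while ((soi = buffered.indexOf(SOI)) !== -1 &&
           (eoi = buffered.indexOf(EOI, soi + 2)) !== -1) {
      onJpeg(buffered.slice(soi, eoi + 2)); // one complete jpeg
      buffered = buffered.slice(eoi + 2);   // keep any trailing bytes
    }
  };
}
```

Piping ffmpeg's image2pipe output through something like this means the downstream nodes always see complete jpegs, regardless of the 65536-byte pipe chunking.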
