Display IP camera in dashboard and store video on Raspberry Pi

Hi everyone,
I have an IP camera that supports protocols such as HTTP, TCP, RTSP, FTP, ...
I want to store the video, display it in the dashboard, and review the history.
What is the easiest way? What should I do? Can you help me?
My English is not very good. Thank you.


Hi Kientrung,

  • In case you want to connect to the camera via HTTP (snapshot JPEG images or MJPEG streams), see my explanation here.
  • For RTSP streams we recently had an interesting discussion, and the conclusion was that you have multiple options (ffmpeg, avconv, VLC media player, the Motion software). In case you are interested in the ffmpeg solution, see my post here.
  • In case you want to store video and review the history, you will need to store the camera images somewhere from your flow (again with ffmpeg, Motion, ...) into a directory; see the sketch below this list. Afterwards you have to show the directory listing somehow in your dashboard. That is on my large todo list, but I have not done anything like that yet ...
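
Just as a rough sketch (the camera URL and target directory are placeholders, I have not tested this myself yet): an ffmpeg command that records an RTSP stream directly into one-minute mp4 segments in a directory would look something like this:

    ffmpeg -rtsp_transport tcp -i rtsp://<camera-ip>/stream -c copy -f segment -segment_time 60 -strftime 1 /home/pi/recordings/%Y%m%d-%H%M%S.mp4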

Bart

Hi everybody,

My good intention for 2019 was to stay focused, and not be distracted by new posts on this forum.
I have persisted for more than 2 days now, which is a personal record...

I'm wondering how we can store separate images as a video (e.g. an mp4 file), so we are able to view it afterwards. Most probably (??) it can be done with tools like Motion, ... but I would like to try it with ffmpeg (in combination with the Exec-node).

  • FFMPEG OUTPUT STREAM
    As mentioned above, there was a nice discussion recently about decoding an RTSP stream into separate images (which were sent across the Node-RED wires). To accomplish that, I had used an Exec-node to execute an Ffmpeg command:

    1. An input message arrives, which triggers the Exec-node
    2. The exec node runs my command, which means it spawns ffmpeg as a separate background process (with my command line parameters).
    3. Ffmpeg will contact the host to set up an RTSP stream, and it will start decoding the images it receives.
    4. Since I have used "-f image2pipe" in the command line, ffmpeg will pipe its output to the available I/O streams, instead of writing files to disk. This way everything works quickly via memory, without having to exchange temporary files on disk. And because I have specified "pipe:1", it will send the images to the stdout stream (pipe:0 = stdin / pipe:1 = stdout / pipe:2 = stderr).
    5. Ffmpeg will send error information on the stderr stream.
    6. The Exec-node listens to data on the stdout stream (of the spawned process) and will create output messages containing the images.
    7. The Exec-node listens to data on the stderr stream, and will generate output messages (on the second output) containing error information.
  • FFMPEG INPUT STREAM
    Ok so far so good. Now I want to do something similar: pass images via a wire to the input of the exec node, and let Ffmpeg encode N images into an mp4 file.

    From the Ffmpeg documentation it seems that again a pipe can be used to send input data (e.g. images) into ffmpeg via the stdin stream: cat *.png | ffmpeg -f image2pipe -i - output.mkv

    image

    1. A message arrives on the input port, containing an image.
    2. For the first message, no ffmpeg process has been spawned yet, so it is spawned now. For all subsequent messages the same process would be reused.
    3. The input data (i.e. the images) is sent to the process via the stdin stream.
    4. Ffmpeg creates an MP4 file from all the streamed images.

    However when I look at the Exec-node code, I don't think this is possible with the Exec node: there is no 'stdin' used in the code. Moreover, every time I would send an image to it, a new Ffmpeg process would be spawned. Or am I wrong, and is it somehow possible with the Exec node to keep sending data (via messages on the node's input) to the stdin stream of an existing process? A rough sketch of what I mean follows below.
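
    To make it concrete, this is a minimal sketch in plain NodeJs (outside Node-RED; the ffmpeg arguments, framerate and file name are just assumptions) of the kind of long-running process with a writable stdin stream that I have in mind:

        const { spawn } = require('child_process');

        // Spawn ONE long-running ffmpeg process that reads JPEG images from stdin
        // and encodes them into an mp4 file (arguments are illustrative only).
        const ffmpeg = spawn('ffmpeg', [
            '-f', 'image2pipe', '-framerate', '5', '-i', 'pipe:0',
            '-c:v', 'libx264', '-pix_fmt', 'yuv420p', 'output.mp4'
        ]);

        // Ffmpeg writes its logging to the stderr stream.
        ffmpeg.stderr.on('data', (data) => console.log(data.toString()));

        // Every image that arrives in the flow would be written to the SAME stdin stream:
        function addImage(jpegBuffer) {
            ffmpeg.stdin.write(jpegBuffer);
        }

        // Only when the recording stops, stdin is closed so ffmpeg can finalize the mp4 file:
        function stopRecording() {
            ffmpeg.stdin.end();
        }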

Thanks !!
Bart


Look at the node-red-node-daemon instead. That runs a long-running process and accepts stdin (and creates stdout and stderr).

Hey Dave (@dceejay),
When looking at the code of node-red-node-daemon, that seems to be indeed exactly what I was looking for. So thanks for the tip !!!

I would like to use that node for ALL my Ffmpeg commands, instead of the Exec node (since I can use the Exec node only for a limited set of ffmpeg commands). However my RTSP command works fine in the Exec node, but I get an error for the SAME command in the daemon node:

image

I don't see any major difference between both nodes, in the way they spawn my command (based on the same require('child_process').spawn):

Do you have any idea in which direction I could start searching?

Completely overlooked it, but it appears that the daemon node doesn't expect me to specify my own quotes:

image

When I remove the quotes in my ffmpeg command, it works fine!


Hey Dave (@dceejay),
Have been trying over and over again to create a single mp4 file from a series of images (arriving on the stdin stream). But unfortunately no mp4 file was generated :woozy_face:. Now I finally figured out that I have to flush the data (i.e. the images) in the stdin stream, to have Ffmpeg generate the mp4 file.

There 'should' be two solutions:

  • Force Ffmpeg to process new data automatically as soon as it arrives. That would be the best solution, because I assume that otherwise memory could fill up rapidly. However I only found a -flush_packets parameter, and that doesn't change anything in my test ...
  • Flush the stdin stream from the node-red-node-daemon node. But it seems that a Writable stream in NodeJs cannot be flushed. So I need to call end() on the stdin stream, and then it works fine ...

Question: can I create a pull request to e.g. add a checkbox 'flush input stream before killing' which calls node.child.stdin.end() before node.child.kill(...) is executed ? Or does anybody have a better solution?
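
Roughly something like this in the daemon node code, just as a sketch (the option and variable names are hypothetical, not the node's real ones):

    if (flushBeforeKill) {          // hypothetical new checkbox in the daemon node's config screen
        node.child.stdin.end();     // close stdin so the spawned process can finalize its output
    }
    node.child.kill(signal);        // the existing kill call, with the configured signal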

Bart

I'm missing something. Why does it need to call kill? The idea of daemon is to keep running.

Dave (@dceejay) ,
you have a good point! But now I'm not sure anymore whether I need to use the Exec node, the Daemon node, or whatever other node ... I just want to execute Ffmpeg with the following functionality:

  • I need stdout and stderr for example to inject images into my flow.
  • I need stdin for example to receive images from my flow.
  • I need to be able to send signals to a running process, to be able to pause/resume/stop it. Indeed a process might run for a long time, e.g. an RTSP stream that captures live video from an IP camera.

And I would like to do everything with a single node (so not Exec node for ffmpeg command A and Daemon node for ffmpeg command B), to avoid confusing our users...

Do you know any node that supports all of this? Or which node I'm allowed to upgrade (via a pull request) to accomplish this?

Probably neither :slight_smile: or both :-)...

daemon is meant to run a fixed command (with parameters) for a long time (i.e. continuously in the background). It accepts stdin and provides stdout and stderr, so it is fine for handling conversions as long as you aren't going to be changing parameters. You can kill the process and restart it, but it will restart the same. You can use stdin for control if the program you are running allows you to.

exec is better for single calls, as you can pass in different parameters each time, but it won't accept stdin (so it is OK for pointing at a file or URL to convert etc., but not for images arriving via a previous node).

Hi folks,

I'm completely stuck, so could use some help!!!!

The node-red-node-daemon node allows me to spawn FFmpeg in a separate process. Moreover I can send data to that process via a stdin stream, and receive results from it via a stdout stream (and error information via a stderr stream):

image

So far so good. Everything is going through memory, and no slow disk I/O is required (to read/write temporary files).

Let's create a simple example node to resize images (to size 320x240). The input messages contain an image, and the output messages will contain a resized image:

image

These are the FFmpeg command parameters that I originally entered in the daemon node:

-f image2pipe -i pipe:0 -vf scale=320:240 -f image2pipe pipe:1
  • The -f image2pipe format allows me to read images from a stream.
  • The -i pipe:0 defines that the input stream will be stdin.
  • The -vf scale=320:240 specifies that the image needs to be resized to 320x240.
  • The -f image2pipe format specifies that the output needs to be written to a stream.
  • The pipe:1 at the end defines that the output stream will be stdout.
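
As a side note: the same pipeline can be exercised outside Node-RED at a shell prompt (a sketch only; test.jpg is a placeholder image), where stdin is closed automatically as soon as cat finishes:

    cat test.jpg | ffmpeg -f image2pipe -i pipe:0 -vf scale=320:240 -f image2pipe pipe:1 > resized.jpg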

However when I inject an image on the node's input, nothing appears on its output (i.e. no output message is sent). The only solution I found is ending the input stream, to force the input data to be flushed:

node.child.stdin.write(msg.payload); // Existing statement
node.child.stdin.end(); // New extra statement added

Then the resized image appears nicely on the output, as you can see in the above flow screenshot. However the FFmpeg process will also be ended, and needs to be spawned again. Since this will occur for every image that arrives on the input, it will result in a lot of overhead. :worried:

I don't want that. The FFmpeg process needs to keep running, and needs to keep processing images that arrive on the stdin stream indefinitely. Have travelled across the whole globe, but cannot accomplish this. Here are some experiments that I have done:

  • Had a tip from one of my partners (@btsimonh) to play with the drain events (a sketch of that experiment follows after this list). Have debugged the writable stdin stream in NodeJs, but it looks to me that the highWaterMark hasn't been exceeded. So I don't think this is a data overflow problem, since I haven't exceeded any data limit.
  • FFmpeg can loop all images in a directory and process them all (you can find plenty of examples to accomplish that). So FFmpeg needs to have something inside to loop its processing, without quitting after the first image. Have tried the below option (see documentation) but it doesn't help:
    -stream_loop number (input)
    Set number of times input stream shall be looped. Loop 0 means no loop, loop -1 means infinite loop.
    
    In older FFmpeg versions this parameter was called -loop, but that also doesn't work.
  • Seems there is another way to flush the stdin stream by writing "\n" to it, but nothing happens:
    node.child.stdin.write(msg.payload); // Existing statement
    node.child.stdin.write("\n"); // New extra statement added
    
  • And today during afternoon tea, our friend @dceejay had a tip to flush the stdout stream using the -blocksize 2048 -flush_packets 1 options, but again nothing happens.
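
For reference, the drain experiment from the first bullet looked roughly like this (a sketch only; the surrounding daemon node code is simplified):

    // Respect back-pressure on the child's stdin stream:
    // write() returns false when the internal buffer (highWaterMark) is full.
    const ok = node.child.stdin.write(msg.payload);
    if (!ok) {
        // Wait until the buffered data has been handed over to the OS pipe before writing more.
        node.child.stdin.once('drain', () => {
            node.warn("stdin drained, ready for the next image");
        });
    }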

Have tried to get some extra information by adding the -loglevel debug parameter, but I didn't learn anything from it :woozy_face:

Any other tips???

Bart

Have never doubted Dave's skills, but it seems he is right again...

  1. I ask FFmpeg to log as much as possible:

    -f image2pipe -i pipe:0 -vf scale=320:240 -f image2pipe pipe:1 -loglevel debug
    
  2. Then I add a debug node to the second output (which represents stderr), because FFmpeg will write all log data there:

    image

  3. As soon as I inject an input image, a debug message is immediately created (although nothing appears on the first output, i.e. no resized image on stdout):

    image

  4. So FFmpeg immediately receives 14211 bytes of data, which is the entire image as you can see in my debugger:

    image

So the question has become: how can I force FFmpeg to process the image as soon as it receives it (without stopping to process further images)?

I suspect it may not be ffmpeg which is buffering, but stdio and the pipe. Instead of using the command ffmpeg .... try unbuffer ffmpeg .....
You may have to

apt install expect

to get the unbuffer command. No guarantees but worth trying.
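
For example, applied to the resize command from earlier (just a sketch):

    unbuffer ffmpeg -f image2pipe -i pipe:0 -vf scale=320:240 -f image2pipe pipe:1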

Hi Colin (@Colin),
indeed in this NodeJs issue they use the unbuffer mechanism. When I try this:

image

Then I think the other (FFmpeg related) input parameters get lost somewhere, because this is my output now:

image

But as I see above, FFmpeg receives the entire file. It just needs to process it, output it, and wait for another input image. I had expected that -stream_loop -1 would have allowed that ...

Did you try the -flush_packets option (of ffmpeg)?

Did you try unbuffer without the -p option?

When I only remove the '-p'

image

Then I get the same error ...

Yes, I use it in all my experiments now:

-f image2pipe -i pipe:0 -vf scale=320:240 -f image2pipe -blocksize 2048 -flush_packets 1 pipe:1

I did not dare to remove it anymore, from the moment you advised that parameter to me :thinking:

This is truly maddening - the ffmpeg docs imply that -flush_packets will act as soon as a packet arrives - but the spawned process's stdin won't be closed unless the application tells it to - so they both seem to be waiting for each other...

@BartButenaers I have slightly lost track of the purpose of what you are doing. I think earlier you said you wanted to combine the pictures into an mp4 file or similar. If that is the case then you can tell ffmpeg to output directly to that file. Perhaps that is not what you are trying to do now though.
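
Something along these lines, for example (a sketch only; the codec options and file name are arbitrary):

    ffmpeg -f image2pipe -i pipe:0 -c:v libx264 -pix_fmt yuv420p recording.mp4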

Morning Colin (@Colin),

Indeed I should have explained that. The question in this discussion was to save video footage from a camera. That is perhaps very easy with external tools (like Motion, ...), but you know me: I want to integrate this entirely inside Node-RED, which is possible by using FFmpeg.

So I did an experiment:

  1. Getting images from a camera (using an mjpeg stream or an rtsp stream): both work fine.
  2. Inject those images into a daemon node, and let FFmpeg create an mp4 file from those images.

However the mp4 file is only created when I end the stdin stream. This is not what I want, since all images are being queued in memory, which might blow up my system. And I want FFmpeg to keep running. First I thought that FFmpeg was queueing the images because mp4 does compression across multiple images (and it perhaps needs N images before it can start creating an mp4 file). But afterwards I thought: perhaps FFmpeg doesn't start compressing until ALL (whatever that means) images have arrived.

But then I realised that even a simple use case doesn't work: if I try to resize a single image, I only get a resized image as output when I end the stdin stream. And that is not normal, so I'm doing something wrong!

Have been experimenting and guessing for 3 evenings, and now I have had enough of it. I have built a debug version of FFmpeg and attached a Visual Studio Code debugger to it. Hopefully I can now debug in the C code what is going wrong, if I can find some time this weekend ...
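
For those interested, building FFmpeg with debug symbols roughly looks like this (a sketch; these configure flags are typical ones, not necessarily the exact flags I used):

    ./configure --enable-debug=3 --disable-optimizations --disable-stripping
    make -j4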

Will keep you guys informed.
Bart