How to display CCTV camera in dashboard (RTSP)

A jpeg image via stdout.

Aha, for that kind of stuff I have developed another node in the past :wink:
If I put a msg-speed node in between, you will see that I have an average of 25 frames (i.e. messages) per second:

[screenshot: msg-speed node showing ~25 messages per second]

Seems rather high, but I think it is correct when I look at the statistics in VLC:

[screenshot: VLC RTSP stream statistics]

I'm also not an ffmpeg expert ... And my time is up for today, and tomorrow I 'have' to go counting votes for the European elections (read: last time I got 19 euro for 11 hours of counting). So no Node-RED tomorrow for me ...

Do you mean that each time you run the exec node you expect a single image? If so, then if you put a debug node on the output of the exec node, do you see a single message each time you run it?

So kept digging... and found this


(but that is trying to use a stdbuf command)
though there does seem to be something that node.js itself is doing ... there is mention of a 200k limit (albeit 4 years ago)...
to be continued


Ah right, it's coming back to me now (from a previous loop around this with Bart) - so I think we got to the understanding that ffmpeg never actually closes the stream. So if we changed the node to put chunks back together it would never emit anything (as there was no end signal), so it is better to send them as we are given them, and reassemble them in a next step.
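If you want to play with that reassembly idea outside of Node-RED first, here is a minimal sketch of the principle (in Python; the function name is my own, this is not the node's internal code). The same buffering logic can of course live in a function node - the point is only that the reassembly step keeps its own buffer between messages:

# Minimal sketch of the "reassemble in a next step" idea: collect chunks of
# arbitrary size and split them into complete JPEG frames on the SOI/EOI
# markers. Fine for plain MJPEG output; JPEGs with embedded thumbnails can
# contain extra markers and would need a smarter parser.
SOI = b'\xff\xd8'   # JPEG start-of-image marker
EOI = b'\xff\xd9'   # JPEG end-of-image marker

def extract_jpegs(buffer, chunk):
    """Append a chunk to the buffer and return (remaining_buffer, complete_frames)."""
    buffer += chunk
    frames = []
    while True:
        start = buffer.find(SOI)
        end = buffer.find(EOI, start + 2)
        if start == -1 or end == -1:
            break                          # no complete frame in the buffer yet
        frames.append(buffer[start:end + 2])
        buffer = buffer[end + 2:]          # keep the leftover for the next chunk
    return buffer, frames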

Dave,
If I remember correctly we only had that flushing issue with stdin: we wanted to send an infinite stream of images into ffmpeg, but ffmpeg waited until the stream was closed. Here I have reported a workaround, but I still have a one-image delay...

But now with stdout, it seems to me (with the Scheveningen Beach stream) that the output arrives without delay. But of course there is so much data involved that I cannot analyse delays ...

In this discussion I see that (for file output) you can tell it to flush, and specify a blocksize... Perhaps ffmpeg has similar parameters for pipes.

@wb666greene it would be nice if you could add a debug node at the output and share a screenshot of the debug panel. Another question: RTSP can transport all kinds of data formats (jpg, png...). Can you see somehow what format your camera is delivering? Because I thought I read that image2pipe doesn't like all formats, and a lot of users advise specifying explicitly which format you want on the output. When the output is files (like in your earlier working test), ffmpeg can determine the output format based on the file extensions.
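To make that concrete, below is roughly the kind of command I mean, wrapped in a small Python test script. The URL, frame rate and size are placeholders for your own camera and the values are only suggestions; the idea is that forcing mjpeg makes the output format explicit instead of letting image2pipe guess it:

import subprocess

# Rough test: run ffmpeg with an explicit mjpeg output format and check that
# JPEG frames actually show up on stdout. URL, rate and size are placeholders.
cmd = [
    'ffmpeg',
    '-rtsp_transport', 'tcp',    # TCP is often more reliable than UDP for RTSP
    '-i', 'rtsp://user:pass@192.168.0.10:554/stream1',
    '-f', 'image2pipe',          # write the images to a pipe instead of to files
    '-vcodec', 'mjpeg',          # be explicit: JPEG frames on the output
    '-q:v', '5',                 # JPEG quality (2 = best, 31 = worst)
    '-r', '5',                   # limit the output to about 5 frames per second
    '-s', '640x360',             # resize to keep each frame small
    'pipe:1',                    # i.e. stdout
]
proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.DEVNULL)

data = proc.stdout.read(200000)  # grab a first batch of bytes
print('JPEG start markers found:', data.count(b'\xff\xd8'))
proc.terminate()

In the exec node it would be the same command on one line, still with pipe:1 as the output.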

My time is up. I'm going to count votes for the Belgian elections. Manual counting in 2019, unbelievable ...

By resizing the images with the ffmpeg -s WxH option, it's looking like 65K is the maximum chunk size. Images below this seem to arrive correctly; larger images seem truncated.

The beach images are like 14K and they work wonderfully for me too. I'm going to compare some jpeg image frames with the frames I get reading the rtsp stream with OpenCV. Unfortunately I won't get very far until Tuesday as this is the Memorial Day holiday weekend in the US.

It's nice to expose these limitations even if there is nothing that can be done to work around them, so I have to give up on this approach for an rtsp-stream-to-MQTT-buffer-stream converter. The Node-RED approach would be very elegant if it could be made to work with more than "smallish" output images.

Resizing my HD security system mp4 image frames to under 65K loses a lot of quality -- it makes little difference to the AI person detection, but matters a lot for the "human in the loop" friend-or-foe determination.

Krambriw,

Thanks for the information here. I have got Motion up and running and I can browse to my camera via the Pi's IP address and the port I have supplied in the config file. Can you tell me how you got the image onto your Node-RED dashboard please?

Thanks in advance

Glenn

Hi, nice to hear,
Try the flow in this thread


Can you please share the flow for your camera in Node-RED? Thanks

It is shared already, just follow the link you see in the posting above yours

Hey Walter,

I was just wondering, can you help me get a notification on movement?
I have an IP camera in my dash and I want to get a notification in my dash if it identifies any movement (not at the DNN analyzer level yet). I have a high-sensitivity setup in the Motion app.

TYIA

Do you mean you are already using Motion for motion detection (GitHub - Motion-Project/motion: Motion, a software motion detector. Home page: https://motion-project.github.io/)? If so, then it can run a script on motion detection, from which you can notify Node-RED (via MQTT for example).

Hi,
In Motion there are several possibilities to achieve this. One that I use myself is the configuration parameter "on_picture_save". Another is "on_motion_detected", and this is the one I use in this simple example.

In the Motion configuration, select the camera and list the configuration parameters. You will/should find this parameter, depending on your Motion version.

I have entered the full path to a simple Python script that I have in my RPi's /home/pi directory
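For reference, the line in the Motion configuration file ends up looking something like this (the script name here just matches the USAGE comment of the script further down; adjust the path to wherever you put yours):

# in motion.conf (or the per-camera configuration file)
on_motion_detected /home/pi/on_motion_detected.py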

The Python script is just an example and can be made much more advanced. When Motion is set to start detection & senses motion above the configured "threshold", it executes the script, which simply connects to your MQTT broker and publishes the event. The event is captured by NR, where all the other processing can be done.


With this example you will most likely get many events when Motion triggers, and it will require you to configure the filters and mask to avoid too many. You can also do filtering in NR, for example using the RBE node and others. As stated, this example is just a starter to make it happen...

Here is the Python script. You will have to install Mosquitto as broker and paho-mqtt for Python if you do not have them already (typical install commands below). You also have to change the IP & port to fit your local setup.
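On a Raspberry Pi the installation usually boils down to something like this (package names and the pip/pip3 choice can differ per distribution):

sudo apt-get install mosquitto
pip3 install paho-mqtt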

#!/usr/bin/python
# USAGE
# python on_motion_detected.py

import time
import paho.mqtt.client as mqtt


def send_mqtt_message(msg):
    # Connect to the broker, publish a single message and disconnect again.
    # Note: Mosquitto() was the class name of the old mosquitto Python module;
    # with paho-mqtt the class is mqtt.Client().
    client = mqtt.Client()
    client.connect(mqtt_host, port)
    result, mid = client.publish(topic_out, msg, 0)
    time.sleep(1)  # give the client a moment before disconnecting
    client.disconnect()
    del client


# Main starts here -----------------------------------------------------------

# MQTT settings - change these to match your own broker and preferred topic
mqtt_host = '192.168.0.241'
port = 1883
topic_out = 'motion'

send_mqtt_message('Motion detected!!!')



Walter, I already have the livestream IP.
How do I get it to show on the Node-RED dash?
It's a template, but I can't find help anywhere.

Here you have a ready made example flow that will work fine with Motion
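In essence such a flow is usually nothing more than a ui_template node that points an image tag at Motion's stream port, something along these lines (replace the IP and port with your own Pi address and stream_port setting):

<img src="http://192.168.0.236:8081" width="100%">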

I already tried your flow.
I was wondering what setup you have in Motion.
I have basic authentication on my stream URL, and localhost is on.
Even though I tried changing it, it still didn't work. I get a "broken file" icon. Any thoughts?

stream_localhost needs to be off to allow viewing cameras from other computers. Have you tried to view a camera directly in a browser from a computer on your network? Like http://192.168.0.236:8081, but with an IP that fits your settings?
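In motion.conf that corresponds to settings along these lines (8081 is just the default stream port):

stream_localhost off
stream_port 8081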

Yes I did. I got it working last night! I had localhost on.

Now I have two other problems:
1- When I open the dash on my iPhone (using Safari) I can't see the livestream;
2- My config has almost a one-minute delay (with motion detection off!).
Any thoughts?

  1. It will only work if your iPhone is connected to the same home WiFi network
  2. Possibly the frame rate is too high. Try reducing it (example settings below)
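In the Motion configuration that would be something along these lines (the values are just a suggestion to start from):

framerate 5
stream_maxrate 5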