I have a Raspberry Pi Zero with a camera module running.
Is it possible to connect to this camera from another Raspberry Pi running Node-RED?
Which nodes do I have to use?
I need to connect to the Raspberry Pi Zero that has the camera on board.
The Node-RED nodes I have seen all assume the camera module is attached to the Pi that runs Node-RED.
In my case the camera is attached to a different Raspberry Pi.
I'll give you a rough outline of how I do it, with still images. I run two Node-RED instances:
Instance "A" runs on the Pi which has the camera attached; this has an MQTT input node, which subscribes to a topic (cameras/a/trigger in my case). Messages arriving with this topic are passed on to a node-red-camerapi node which takes a picture and passes it on as a buffer. This buffer is encoded with a Base-64 node and then sent out as an MQTT message with the topic cameras/a/capture.
Instance "B", on a different Pi on the same LAN, runs the Node-RED dashboard, and contains an MQTT input node which listens to the topic cameras/a/capture. Images arriving with that topic are displayed on the Node-RED dashboard by means of a simple Template node. The dashboard also contains a button which triggers the sending of an MQTT message with the topic cameras/a/trigger, resulting in a new image being captured.
There's a lot more to it of course, such as the MQTT broker/subscriber set-up - let me know if this is of interest to you and I'll elaborate.
Streaming video would not work via MQTT - it's not even a good fit for single images tbh. For live streaming I think you'd be better off setting up your Pi Zero as a generic IP camera and creating a dashboard template node containing for example the VLC browser plugin. This could then be directed to open a stream at a given address (the address of the Pi Zero). Haven't done this myself though, so can't help with the details.
If I was going to turn a Pi Zero (with a Raspberry Pi camera module attached) into a generic IP camera, I would probably use the UV4L library - it seems to be the most lightweight, efficient and simple video streaming server with support for the Raspberry Pi's GPU hardware acceleration. You should be able to play the resulting video stream in any modern browser without needing any plugins.
Hello Yvonne,
I see that @clickworkorange has already given some good ideas. Although MQTT is not designed for passing a continuous audio or video stream, I 'think' you could use it for that purpose, since you could chop the audio and video into small data chunks if necessary (a rough sketch of that chopping follows below). I have never tried it myself... But if I had to pass images between two Raspberries, I would probably also have considered using MQTT.
Of course this might become pure horror, e.g. if the MQTT broker were to start persisting your MQTT messages on disk. But there are MQTT specialists on this forum who can most probably give better advice about this ...
But even if you could get it working, one disadvantage of this approach is that only MQTT clients would be able to receive your video stream...
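Just to make the 'chopping' idea a bit more concrete, here is a rough and completely untested sketch. The chunk size, the topic and the parts info are all my own assumptions, and the receiving side would still have to put the chunks back together again:
```
// Function node on the camera side: split one JPEG frame (a Buffer in
// msg.payload) into small chunks, so each individual MQTT message stays small.
// The index/count information is meant for reassembling the frame on the
// receiving side (e.g. in another Function node).
const CHUNK_SIZE = 16 * 1024;                     // 16 kB per MQTT message
const frame = msg.payload;
const count = Math.ceil(frame.length / CHUNK_SIZE);
const msgs = [];
for (let i = 0; i < count; i++) {
    msgs.push({
        topic: "cameras/a/capture",
        payload: frame.slice(i * CHUNK_SIZE, (i + 1) * CHUNK_SIZE),
        parts: { id: msg._msgid, index: i, count: count }
    });
}
return [msgs];                                    // send all chunks out of output 1
```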
If you want your Raspberry Pi (running Node-RED) to behave like an IP camera, your Node-RED flow should be able to offer e.g. an MJPEG stream as soon as an HTTP request for such a stream arrives. If that is what you want, you could implement it e.g. like this:
1. The multipart-stream-decoder node (in the flow that wants to show the images) sends an HTTP request to the other Node-RED flow.
2. The Http-In node receives that request and passes it to the multipart-stream-encoder node.
3. The multipart-stream-encoder node sends an HTTP response to the multipart-stream-decoder node. That response is an infinite MJPEG stream.
4. The multipart-stream-decoder node decodes the MJPEG stream into individual images. Although I have tried to optimize this node as much as possible, keep in mind that this will cost CPU...
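One extra tip, not part of the decoder node itself: if the dashboard does not need the full frame rate, you can drop images between the decoder and the dashboard, e.g. with a Delay node in rate-limit mode or with a Function node like the sketch below (the 500 ms interval is just an example). Note that this only lightens the load behind the decoder (dashboard, websocket traffic); the decoding itself still runs at the full rate.
```
// Sketch only: forward at most one decoded image every 500 ms and silently
// drop the rest, to reduce the load on whatever comes after the decoder.
// The decoder itself still decodes every frame that arrives.
const INTERVAL_MS = 500;                 // adjust to taste
const now = Date.now();
const last = context.get('lastSent') || 0;
if (now - last < INTERVAL_MS) {
    return null;                         // drop this image
}
context.set('lastSent', now);
return msg;                              // pass this image on
```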
There is a lot of information available in the readme pages of both nodes ...
Good luck!!
Bart
Thank you so much for your help.
I will study your suggestions and come back to you with the results (or more questions ;) )
This can take some time, as I'm busy with other things at the moment, but I'm very grateful to people like you, who are always trying to help.
The CPU load (RPi 3B+, see below) is around 131% on one CPU, and the CPU temperature quickly rises to 57 degrees, which is not very good for a stream that has to run long term - just put your finger on the CPU to verify. And this is while streaming a single camera; imagine the effect of adding (one) more.
Reducing the frame rate lowered the load, but even at 5 fps it still loads one CPU to around 70%.
This works very well. The CPU load is negligible, nothing to worry about - just 2.6% on one CPU!!! This is indeed a very good result. I also think the streaming is even better: it is very smooth and updates quickly, with virtually no delay at all when moving around in the view.