Raspberry Pi camera

I have a Raspberry Pi Zero with a camera module running.
Is it possible to connect to this camera from another Raspberry Pi running Node-RED?

Which nodes do I have to use?
I need to connect to the Pi Zero with the camera on board.
The Node-RED nodes I have seen all assume the camera module is attached to the Raspberry Pi running Node-RED.
Here the camera is attached to another Raspberry Pi.

Thank you so much for your help.

Kind regards,

Yvonne de Vries

Hi Yvonne,

Are you looking to stream live video from the Pi Zero, or just capture individual images?

I'll give you a rough outline of how I do it, with still images. I run two Node-RED instances:

  • Instance "A" runs on the Pi which has the camera attached; this has an MQTT input node, which subscribes to a topic (cameras/a/trigger in my case). Messages arriving with this topic are passed on to a node-red-camerapi node which takes a picture and passes it on as a buffer. This buffer is encoded with a Base-64 node and then sent out as an MQTT message with the topic cameras/a/capture.
  • Instance "B", on a different Pi on the same LAN, runs the Node-RED dashboard, and contains an MQTT input node which listens to the topic cameras/a/capture. Images arriving with that topic are displayed on the Node-RED dashboard by means of a simple Template node. The dashboard also contains a button which triggers the sending of an MQTT message with the topic cameras/a/trigger, resulting in a new image being captured.
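The hand-off between those two flows can be sketched in plain Node.js. This is illustrative only - in the real flows the camera node and the Base-64 node do this work for you, and only the two topic names come from the description above:

```javascript
// Sketch of the image hand-off between instance "A" and instance "B".
// The JPEG bytes are stand-ins; in the real flow they come from the camera node.
const imageBuffer = Buffer.from([0xff, 0xd8, 0xff, 0xd9]); // JPEG start/end markers

// Instance "A": Base64-encode the buffer (what the Base-64 node does) and
// publish it on the capture topic.
const outMsg = {
  topic: "cameras/a/capture",
  payload: imageBuffer.toString("base64"),
};

// Instance "B": decode the payload back into a binary buffer. A dashboard
// Template node could also display the Base64 text directly with something like
//   <img src="data:image/jpeg;base64,{{msg.payload}}"/>
const decoded = Buffer.from(outMsg.payload, "base64");

console.log(outMsg.topic, decoded.equals(imageBuffer)); // -> cameras/a/capture true
```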

There's a lot more to it of course, such as the MQTT broker/subscriber set-up - let me know if this is of interest to you and I'll elaborate.

Hi Clickworkorange,

Thanks for the reply; I'm going to try what you suggest.
I want to stream live video on demand. I will let you know asap.


Hi Yvonne,

Streaming video would not work via MQTT - it's not even a good fit for single images tbh. For live streaming I think you'd be better off setting up your Pi Zero as a generic IP camera and creating a dashboard template node containing for example the VLC browser plugin. This could then be directed to open a stream at a given address (the address of the Pi Zero). Haven't done this myself though, so can't help with the details.

If I was going to turn a Pi Zero (with a Raspberry Pi camera module attached) into a generic IP camera, I would probably use the UV4L library - it seems to be the most lightweight, efficient and simple video streaming server with support for the Raspberry Pi's GPU hardware acceleration. You should be able to play the resulting video stream in any modern browser without needing any plugins.

Hello Yvonne,
I see that @clickworkorange has already given some good ideas. Although MQTT is not designed for passing continuous audio and video streams, I 'think' you could use it for that purpose (since you could chop the audio and video into small data chunks if necessary). I have never tried it myself... But if I had to pass images between two Raspberry Pis, I would probably also have considered using MQTT.

Of course this might become pure horror, e.g. if the MQTT broker were to start persisting your MQTT messages on disk. But there are MQTT specialists on this forum who can most probably give better advice about this...

But even if you could get it working, one disadvantage of this approach is that only MQTT clients would be able to receive your video stream...
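The chunking idea above, as a rough sketch. The chunk size, topic name, and seq field are all assumptions of mine - and as noted, nobody in this thread has actually tried streaming video this way:

```javascript
// Sketch of "chop the video into MQTT-sized chunks" - untested in practice.
const CHUNK_SIZE = 16; // tiny for illustration; real payloads would be larger

// Split a video buffer into sequence-numbered chunks, one MQTT message each.
function toChunks(buffer) {
  const chunks = [];
  for (let offset = 0; offset < buffer.length; offset += CHUNK_SIZE) {
    chunks.push({
      topic: "cameras/a/stream",
      seq: chunks.length, // the receiver needs this to reassemble in order
      payload: buffer.subarray(offset, offset + CHUNK_SIZE),
    });
  }
  return chunks;
}

// The receiving flow concatenates the payloads back in sequence order.
const frame = Buffer.alloc(40, 0xab); // stand-in for one video frame
const reassembled = Buffer.concat(toChunks(frame).map((m) => m.payload));
console.log(reassembled.equals(frame)); // -> true
```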

If you want your Raspberry Pi (running Node-RED) to behave like an IP camera, your Node-RED flow should be able to serve e.g. an MJPEG stream as soon as an HTTP request for such a stream arrives. If that is what you want, you could implement it e.g. like this:

  1. You send the (Base64-encoded) images to my node-red-contrib-multipart-stream-encoder node, which converts the individual images into a continuous MJPEG stream.
  2. In the other Node-RED flow (or any other HTTP client, like a browser) you can use my node-red-contrib-multipart-stream-decoder node to receive this stream:
    1. It sends an HTTP request to the other Node-RED flow.
    2. The HTTP-In node receives the request and passes it to the multipart-stream-encoder node.
    3. The multipart-stream-encoder node sends an HTTP response to the multipart-stream-decoder node. That response is an infinite MJPEG stream.
    4. The multipart-stream-decoder node decodes the MJPEG stream into individual images. Although I have tried to optimise this node as much as possible, keep in mind that this will cost CPU...
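For anyone curious what the encoder node actually puts on the wire: an MJPEG stream is an HTTP response of type multipart/x-mixed-replace, where each JPEG is one "part". A minimal framing sketch (the boundary name and byte values are made up for illustration; the encoder node handles all of this for you):

```javascript
// How one JPEG becomes one part of a multipart/x-mixed-replace (MJPEG) stream.
const BOUNDARY = "myboundary"; // arbitrary; must match the response header

// The HTTP response carrying the stream declares this content type:
const contentType = `multipart/x-mixed-replace; boundary=${BOUNDARY}`;

// Frame a single JPEG buffer as one part of the (endless) stream.
function frameJpeg(jpeg) {
  const head =
    `--${BOUNDARY}\r\n` +
    `Content-Type: image/jpeg\r\n` +
    `Content-Length: ${jpeg.length}\r\n\r\n`;
  return Buffer.concat([Buffer.from(head), jpeg, Buffer.from("\r\n")]);
}

const fakeJpeg = Buffer.from([0xff, 0xd8, 0xff, 0xd9]); // stand-in JPEG bytes
const part = frameJpeg(fakeJpeg);
console.log(part.toString("latin1").includes("Content-Length: 4")); // -> true
```

A decoder (like the multipart-stream-decoder node) scans for the boundary, reads Content-Length bytes, and emits each part as a separate image - which is also why decoding a fast stream costs CPU.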

There is a lot of information available in the README pages of both nodes...
Good luck!!


On second thoughts, scratch that - I'd suggest using streamEye instead; it's FOSS.

Dear clickworkorange and Bart,

Thank you so much for your help.
I will study your suggestions and come back to you with the results (or more questions ;) ).
This can take some time, as I'm busy with other things at the moment, but I'm very grateful for people like you, always trying to help.

Kind regards,



streamEye works, from a technical perspective, but unfortunately it loads the Pi really heavily. I just tried the following:

ffmpeg -f video4linux2 -i /dev/video0 -r 30 -s 640x480 -f mjpeg -qscale 5 - 2>/dev/null | streameye

The CPU load (RPi 3B+, see below) is around 131% on one CPU, and the CPU temperature quickly rises to 57 degrees - not very good for a long-running stream; just put your finger on the CPU to verify. And this is while streaming a single camera; imagine the effect of adding one more.

Reducing the frame rate lowered the load, but even at 5 fps it still loads one CPU to around 70%.


Ouch. So there is no FOSS alternative to UV4L?

Edit: What about mjpg-streamer?

Also VLC (running in non-GUI mode).

This works very well. The CPU load is negligible, nothing to worry about - just 2.6% on one CPU!!! This is indeed a very good result. I also think the streaming is even better: it is very smooth and updates quickly, with virtually no delay at all when moving around in the view.


It looks like we have a winner then! Really appreciate you benchmarking these - very useful info!