I see that @clickworkorange has already given some good ideas. Although MQTT is not designed for passing continuous audio and video streams, I *think* you could use it for that purpose (since you could chop the audio and video into small data chunks if necessary). I have never tried it myself... But if I had to pass images between two Raspberries, I would probably also have considered using MQTT.
Of course this might become pure horror, e.g. if the MQTT broker were to start persisting your MQTT messages on disk. But there are MQTT specialists on this forum who can most probably give better advice about this ...
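Just to illustrate the chunking idea (not an MQTT feature, purely my own sketch): you would split each frame into small payloads, attach a tiny header so the receiver can put them back in order, and publish each chunk as a separate message. The chunk size and header fields here are assumptions:

```python
# Hypothetical sketch: chop a frame into MQTT-sized chunks and reassemble them.
# The 64 KB chunk size and the header fields are my own assumptions.

CHUNK_SIZE = 64 * 1024  # keep individual MQTT message payloads small

def split_frame(frame: bytes, frame_id: int, chunk_size: int = CHUNK_SIZE):
    """Yield (header, payload) pairs; the header lets the receiver reorder."""
    total = (len(frame) + chunk_size - 1) // chunk_size
    for index in range(total):
        payload = frame[index * chunk_size:(index + 1) * chunk_size]
        yield {"frame": frame_id, "index": index, "total": total}, payload

def join_frame(chunks):
    """Reassemble chunks (possibly received out of order) into one frame."""
    ordered = sorted(chunks, key=lambda pair: pair[0]["index"])
    assert len(ordered) == ordered[0][0]["total"], "missing chunk(s)"
    return b"".join(payload for _, payload in ordered)
```

Each pair would then be published on some topic (e.g. one topic per camera), with the header e.g. JSON-encoded alongside the payload.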
But even if you could get it working, one disadvantage of this approach is that only MQTT clients would be able to receive your video stream...
If you want your Raspberry (running Node-RED) to behave like an IP camera, your Node-RED flow should be able to serve e.g. an MJPEG stream as soon as an HTTP request for such a stream arrives. If that is what you want, you could implement it e.g. like this:
- You send the (base64 encoded) images to my node-red-contrib-multipart-stream-encoder node, which converts the individual images into a continuous MJPEG stream.
- In the other Node-RED flow (or any other HTTP client, like a browser) you can use my node-red-contrib-multipart-stream-decoder node to receive this stream:
- It sends an HTTP request to the other Node-RED flow.
- The Http In node receives the request and passes it to the multipart-stream-encoder node.
- The multipart-stream-encoder node sends an HTTP response to the multipart-stream-decoder node. That response is an endless MJPEG stream.
- The multiPart-stream-decoder decodes the mjpeg stream to individual images. Although I have tried to optimize this node as much as possible, keep in mind that this will cost CPU...
There is a lot of information available in the readme pages of both nodes ...