Perhaps @kevinGodell could provide some guidance.
Is it possible to inject a buffer into pipe2jpeg and have that become an endpoint with a 'video' stream?
i.e. I have a single image sitting in flow context as a buffer; this image changes every x minutes.
Can I inject this buffer (every second, for example) directly into pipe2jpeg? It does not seem to work. Is there a way that I could 'emulate' this as a 'stream'?
OK, found the method (I think): save the image to a file and use ffmpeg to feed it into pipe2jpeg
ffmpeg -re -loop 1 -i /path/image.jpg -f image2pipe -c mjpeg -t 00:02:00 pipe:1
I'm intrigued by your use case; can you elaborate on what you are doing, please?
This is just some tomfoolery, bear with me :')
I create a webpage showing a chart, news headlines, or whatever I want.
Then use puppeteer to capture a screenshot of that webpage.
Feed the screenshot into pipe2jpeg to make it available as a stream.
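For context, my understanding is that pipe2jpeg works by scanning the piped bytes for JPEG start/end markers and emitting each complete image. A naive standalone sketch of that idea (the function and names are mine, not pipe2jpeg's actual API, and it ignores edge cases such as embedded thumbnails):

```javascript
// JPEG start-of-image and end-of-image markers.
const SOI = Buffer.from([0xff, 0xd8]);
const EOI = Buffer.from([0xff, 0xd9]);

// Extract complete JPEGs from a concatenated MJPEG byte buffer by
// scanning for SOI/EOI marker pairs.
function extractJpegs(buf) {
  const jpegs = [];
  let pos = 0;
  for (;;) {
    const start = buf.indexOf(SOI, pos);
    if (start === -1) break;
    const end = buf.indexOf(EOI, start + 2);
    if (end === -1) break; // incomplete frame, wait for more data
    jpegs.push(buf.subarray(start, end + 2));
    pos = end + 2;
  }
  return jpegs;
}
```

That is why the `image2pipe` output of ffmpeg above can be split back into individual screenshots on the other end.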
This stream can be exposed to HomeKit (via Homebridge) as a security camera. :')
Now I could create my own 'live station' of the data I want to show. I noticed that I can kill ffmpeg and restart it with another file, and the stream gets updated.
i.e. charts in HomeKit
Interesting idea, but this doesn't seem very efficient for a static image; how about reducing the frame rate to 1?
ffmpeg -re -loop 1 -r 1 -i /path/image.jpg -f image2pipe -c mjpeg -t 00:02:00 pipe:1
Same effect, but much less bandwidth.
EDIT: Alternatively, take a look at @BartButenaers' node -