Is it possible to inject a buffer into pipe2jpeg and have it become an endpoint serving a 'video' stream?
i.e. I have a single image sitting in flow context as a buffer, and this image changes every x minutes.
Can I inject this buffer (every second, for example) directly into pipe2jpeg? It doesn't seem to work. Is there a way I could 'emulate' this as a 'stream'?
The workflow I have in mind:
1. Create a webpage showing a chart, news headlines, or whatever I want.
2. Use puppeteer to capture a screenshot of that webpage.
3. Feed the screenshot into pipe2jpeg to make it available as a stream.
4. Expose this stream to HomeKit (via homebridge) as a security camera.
Now I could create my own 'live station' of whatever data I want to show. I also noticed that I can kill ffmpeg and restart it with another file, and the stream gets updated.