My Shinobi instance has more streaming options available than what is documented:
- Poseidon
- Base64 over Websocket (*)
- JPEG (Auto enabled JPEG API)
- MJPEG (*)
- FLV
- HLS (includes audio) (*)
Only the ones marked with a (*) are documented, and even for those there is no technical discussion of why I might want to use one over another.
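For anyone who wants to reproduce my tests: the stream type is selected per monitor, and in my monitor's exported settings JSON it looks roughly like the fragment below. The field name `stream_type` is taken from my own export; treat it as an assumption if your Shinobi version differs.

```json
{
  "details": {
    "stream_type": "flv"
  }
}
```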
I tested all of these by trial and error. Here is what I found:
- Poseidon - very flaky: different monitors randomly go black, making it effectively unusable
- Base64 over Websocket - CPU usage stays pegged at around 30% per camera
- JPEG (Auto enabled JPEG API) - whenever the JPEG API is enabled (regardless of the stream type selected), CPU utilization jumps to about 30% per camera
- MJPEG - another CPU hog
- FLV - so far the best of the bunch: one of my three monitors did go black once under it, but CPU utilization is negligible and it has otherwise been stable
- HLS (includes audio) - I haven't tested this one yet
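To put actual numbers on the "~30% per camera" observations above, here is the kind of quick check I used (a sketch: it assumes Shinobi's stream transcoders show up as `ffmpeg` processes, which they do on my box, and the helper name is my own):

```python
# Sum the CPU usage of all ffmpeg processes, e.g. the output of:
#   ps -C ffmpeg -o %cpu=
# so stream types can be compared with numbers instead of eyeballing top.
def total_ffmpeg_cpu(ps_output: str) -> float:
    """Sum the %CPU column from `ps -C ffmpeg -o %cpu=` output."""
    return sum(float(field) for field in ps_output.split())

# Illustrative numbers only (three cameras at roughly 30% each), not real data:
sample = "29.8\n30.4\n31.1\n"
print(round(total_ffmpeg_cpu(sample), 1))  # → 91.3
```

Run it a few times per stream type, with the web GUI open and closed, and the differences between the options above become obvious.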
Also: I don't actually watch my cameras all that often, beyond the occasional check-in. Yet for the methods above that use tremendous CPU, the CPU stays pegged even when no one is logged in to the web GUI. Shouldn't Shinobi be smart enough to do less work when there are no active streaming requests?
FWIW, my three cameras are all Hikvision DS-2CD2132F.