Recently I wanted to host my own broadcast on-air. After some digging, I found that this repository is a very useful tool for beginners. Here are some notes that I want to keep:
RTMP (Adobe Real-Time Messaging Protocol):
- [+]: low latency, almost real-time
- [-]: requires Flash support in the browser
HLS (HTTP Live Streaming):
- Slices the RTMP stream into short segments, stores them, and serves them over HTTP
- [+]: can play with HTML5 (no Flash needed)
- [-]: high latency (10s~30s delay)
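Concretely, the "pieces" are short .ts segment files listed in a rolling m3u8 playlist, roughly like this (filenames and durations are illustrative):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:5
#EXT-X-MEDIA-SEQUENCE:120
#EXTINF:5.000,
stream-120.ts
#EXTINF:5.000,
stream-121.ts
#EXTINF:5.000,
stream-122.ts
```

The player keeps re-fetching the playlist and downloading new segments as they appear, which is why the latency is a few segment lengths rather than near-real-time.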
RTMP to HLS:
- 1. Use SRS to create an HLS stream and serve the m3u8 with a customized Nginx. Use FFmpeg to slice.
- 2. Use SRS to create an HTTP stream and serve the m3u8 with a plain Nginx. Use FFmpeg to slice.
- 3. Use Nginx-RTMP to handle everything.
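For the FFmpeg slicing step in options 1 and 2, the command can look roughly like this (a sketch; the stream URL, segment length, and output path are all my assumptions, not values from the repository):

```shell
# Hypothetical wrapper around the FFmpeg slicing command.
# -c copy          : remux without re-encoding (cheap on CPU)
# -hls_time 5      : target ~5-second segments
# -hls_list_size 6 : keep a rolling window of 6 segments in the playlist
# -hls_flags delete_segments : drop old segments to bound disk use
slice_to_hls() {
  ffmpeg -i "$1" \
    -c copy -f hls \
    -hls_time 5 -hls_list_size 6 \
    -hls_flags delete_segments \
    "$2"
}

# Example: slice_to_hls rtmp://localhost/live/stream /var/www/hls/stream.m3u8
```

Nginx then only has to serve the m3u8 and .ts files as static content.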
RTMP/HLS/HDS are distribution methods; RTMP is also the name of the underlying protocol, while HLS is delivered over plain HTTP.
SRS vs Nginx-RTMP:
I haven't tried RTSP yet. I might try it later.
The second problem was handling FFmpeg. Since I use the RTMP+HTTP solution to stream, I need to run a bash script that keeps chopping the RTMP stream into HLS with FFmpeg. However, if the RTMP stream starts after FFmpeg does, the RTMP stream can get blocked for reasons I haven't figured out. Therefore, I cannot run FFmpeg as an always-on background process on my server; I have to start the FFmpeg process manually after the stream (e.g. from OBS) starts.
I wrote a script (with a PID lock) and a frontend UI to solve that problem. I can start the script (which runs the FFmpeg loop) after the stream starts and terminate it after the stream stops.
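The PID-lock part can be sketched like this (file paths and function names are my assumptions, not the actual script):

```shell
#!/bin/sh
# Sketch of a PID lock: refuse to start a second copy of the slicing loop,
# and allow the running one to be stopped cleanly. PIDFILE path is assumed.
PIDFILE=/tmp/hls-slicer.pid

start() {
  # If the recorded PID is still alive, another instance holds the lock.
  if [ -f "$PIDFILE" ] && kill -0 "$(cat "$PIDFILE")" 2>/dev/null; then
    echo "already running (pid $(cat "$PIDFILE"))"
    return 1
  fi
  # In the real script "$@" would be the FFmpeg slicing loop.
  "$@" &
  echo $! > "$PIDFILE"
}

stop() {
  # Kill the recorded process and release the lock.
  [ -f "$PIDFILE" ] || return 0
  kill "$(cat "$PIDFILE")" 2>/dev/null
  rm -f "$PIDFILE"
}
```

A frontend UI then only needs to call `start` and `stop`; the lock file makes repeated clicks on "start" harmless.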
Checking whether the stream exists before chopping would be another solution, but the existence checks cost a lot of I/O. Polling every minute would alleviate the load, but even that is unnecessary if the script can simply be started manually.
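For reference, that polling alternative could look roughly like this (a sketch; the stream URL and the use of ffprobe as the probe are my assumptions):

```shell
# Poll the RTMP endpoint until a publisher shows up, then hand off to FFmpeg.
STREAM_URL="rtmp://localhost/live/stream"   # assumed URL

stream_up() {
  # ffprobe exits 0 only when it can open and probe the stream.
  ffprobe -v quiet "$STREAM_URL" >/dev/null 2>&1
}

wait_for_stream() {
  # Poll once per interval (default 60s) until the stream is up.
  interval="${1:-60}"
  until stream_up; do
    sleep "$interval"
  done
}

# After wait_for_stream returns, start the FFmpeg slicing loop here.
```

Each probe opens a connection to the server, which is where the extra I/O comes from; starting manually avoids that entirely.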