Live Streaming Video on the Web Using SRS
SRS Stack is an all-in-one, out-of-the-box, and open-source video solution for creating online video services, including live streaming and WebRTC, on the cloud or through self-hosting.
SRS Stack makes it easy for you to create an online video service. It is made using Go, Reactjs, SRS, FFmpeg, and WebRTC. It supports protocols like RTMP, WebRTC, HLS, HTTP-FLV, and SRT. It offers features like authentication, streaming on multiple platforms, recording, transcoding, virtual live events, automatic HTTPS, and an easy-to-use HTTP Open API.
Notes
- Video source must be `h264` encoded for maximum compatibility. Set the camera to encode h264 at the source so transcoding can be avoided later (see the ffprobe check after this list).
- HTTPS is required for publishing streams using WebRTC, and it improves security. If you want to embed the video stream in any HTTPS website, such as a WordPress site, you must serve HLS/FLV/WebRTC over HTTPS, or playback will fail for mixed-content security reasons.
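If you are not sure whether a source is already h264, a quick check with ffprobe (assuming it is installed; the RTSP URL below is a placeholder for the camera URL introduced in Step 3) looks like this:
# print the codec of the first video stream; "h264" means no transcoding is needed
ffprobe -v error -rtsp_transport tcp \
  -select_streams v:0 -show_entries stream=codec_name \
  -of default=noprint_wrappers=1:nokey=1 \
  rtsp://username:password@c.c.c.c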
Step 1: Docker Run
Run srs-stack in a single Docker container, then open http://localhost:2022 in a browser:
docker run --restart always -d -it --name srs-stack -v $HOME/data:/data \
-p 2022:2022 -p 2443:2443 -p 1935:1935 -p 8000:8000/udp -p 10080:10080/udp \
ossrs/srs-stack:5
Important: To use WebRTC in a browser, avoid using localhost or 127.0.0.1. Instead, use a private IP (e.g., https://192.168.3.85:2443), a public IP (e.g., https://136.12.117.13:2443), or a domain (e.g., https://your-domain.com:2443). To set up HTTPS, refer to this post.
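Once the container is up, a quick sanity check (using the container name srs-stack from the command above) is to confirm it is running and watch its logs:
docker ps --filter name=srs-stack
docker logs -f srs-stack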
Step 1.2: Docker Compose
version: '3'
services:
  livevideo:
    image: ossrs/srs-stack:5.13.13
    container_name: livevideo.example.com
    restart: always
    network_mode: host
    environment:
      - CANDIDATE=livevideo.example.com
    volumes:
      - ./data:/data
    labels:
      - com.centurylinklabs.watchtower.enable=true
  nginx:
    image: nginx:latest
    container_name: nginx
    restart: always
    # host networking exposes ports 80/443 directly, so no ports mapping is needed
    network_mode: host
    volumes:
      - ./nginx.conf:/etc/nginx/nginx.conf:ro
Step 1.3: Nginx conf
events {}
http {
    server {
        listen 80;
        server_name livevideo.example.com;
        location / {
            proxy_pass http://localhost:2022;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header X-Forwarded-Proto $scheme;
        }
    }
    server {
        listen 443 ssl;
        server_name livevideo.example.com;
        ssl_certificate /path/to/your/fullchain.pem;
        ssl_certificate_key /path/to/your/privkey.pem;
        location / {
            proxy_pass http://localhost:2022;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
            proxy_set_header X-Forwarded-Proto $scheme;
        }
    }
}
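After editing nginx.conf, the configuration can be validated and reloaded from inside the running container (container name nginx from the compose file above):
docker exec nginx nginx -t
docker exec nginx nginx -s reload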
Step 1.4: Deploy SRS Stack in Docker
Run `docker-compose up -d` to deploy the SRS Stack.
After creating the SRS Stack, you can access it through http://livevideo.example.com/mgmt in a browser.
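A typical deploy-and-verify sequence, assuming the compose file above is in the current directory (livevideo is the SRS Stack service name):
docker-compose up -d
docker-compose ps
docker-compose logs -f livevideo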
Step 2: Live Streaming Setup with FFmpeg and SRS
Open the SRS Stack, select Scenarios > Streaming > RTMP: FFmpeg, and copy the stream URL from the RTMP: FFmpeg usage section.
Example Publish Stream URL: `rtmp://livevideo.example.com/live/paemog?secret=743acd8e22af4bc9b5c358e63704e0ba`
Step 3: Setup ffmpeg
ffmpeg is used to pull the camera feed over RTSP and forward it to the SRS server in a suitable streaming container format, so it can then be played with a WHEP player or ffplay.
Input From File for Testing
ffmpeg -re -i test.mp4 -c copy -f flv 'rtmp://livevideo.example.com/live/paemog?secret=743acd8e22af4bc9b5c358e63704e0ba'
Notes
- `-re`: play back the file in real time by reading input at its native frame rate, e.g. a 1-hour video will play back in 1 hour.
- `-i test.mp4`: sample input file for streaming.
- `-c copy`: don't transcode the audio/video streams, just copy them to the output.
- `-f flv`: suitable streaming container format.
- `rtmp://livevideo.example.com/live/paemog?secret=743acd8e22af4bc9b5c358e63704e0ba`: SRS RTMP server location for sending the output stream.
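If you don't have a test.mp4 handy, one way to generate a short h264/aac test clip with FFmpeg's built-in sources (a test pattern plus a sine tone) is:
# 30-second 720p test pattern with a 440 Hz tone, encoded as h264/aac
ffmpeg -f lavfi -i testsrc2=size=1280x720:rate=30:duration=30 \
       -f lavfi -i sine=frequency=440:duration=30 \
       -c:v libx264 -pix_fmt yuv420p -c:a aac -shortest test.mp4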
Input From CCTV Camera
HikVision CCTV cameras provide an RTSP stream on the default TCP port 554, and no path is required. The username and password are the same as those used to access the camera web portal. Ensure the camera is set to `h264` encoding.
# camera IP: c.c.c.c
ffmpeg -re -rtsp_transport tcp -i rtsp://username:password@c.c.c.c -flvflags no_duration_filesize -c copy -f flv 'rtmp://livevideo.example.com/live/paemog?secret=743acd8e22af4bc9b5c358e63704e0ba'
- `-rtsp_transport tcp`: specifies the transport protocol to be used for the RTSP connection. Here it is set to TCP, which can be more reliable than UDP in certain network conditions.
- `-flvflags no_duration_filesize`: sets FLV (Flash Video) flags. Here it is set to `no_duration_filesize`, which means FFmpeg will not write duration and filesize to the FLV header.
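Before involving SRS at all, it can help to confirm the camera feed is reachable by previewing it directly (assuming ffplay is installed, using the camera URL from above):
ffplay -rtsp_transport tcp rtsp://username:password@c.c.c.c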
docker-compose for srs-encoder
version: '3'
services:
  srs_encoder:
    image: ossrs/srs:encoder
    container_name: srs-encoder.techplayr.lan
    restart: always
    command: >
      ffmpeg -re -rtsp_transport tcp -i rtsp://username:password@c.c.c.c
      -flvflags no_duration_filesize -c copy -f flv
      rtmp://livevideo.example.com/live/paemog?secret=743acd8e22af4bc9b5c358e63704e0ba
    stdin_open: true
    tty: true
Notes
- If ffmpeg is segfaulting, use the Ubuntu-supplied ffmpeg binary (`apt install ffmpeg`) and not the static builds from ffmpeg.org.
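With the compose file above, the encoder can be started and monitored like this (service name srs_encoder and container name srs-encoder.techplayr.lan come from that file; run from the directory containing it):
docker-compose up -d srs_encoder
docker logs -f srs-encoder.techplayr.lan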
Step 4: Play WebRTC stream, WHEP player
After publishing the stream, you can view it with a WebRTC HTML5 player.
Access the WHEP player from the RTMP: FFmpeg usage section.
Example: Play WebRTC stream, WHEP player: https://livevideo.example.com/rtc/v1/whep/?app=live&stream=paemog
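Besides the WHEP player, the stream can usually also be pulled as HLS or HTTP-FLV through the same proxy; the exact paths depend on your SRS Stack configuration, so the URLs below are assumptions based on SRS defaults:
# HLS playback (path assumed from SRS defaults)
ffplay https://livevideo.example.com/live/paemog.m3u8
# HTTP-FLV playback (path assumed from SRS defaults)
ffplay https://livevideo.example.com/live/paemog.flv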
Step 5: Latency Check
For the HikVision camera model DS-2CD3026G2-IS, we get 1.4 seconds of latency, as observed with the following configuration. CPU utilization is 4% across 4 CPUs, and memory usage is about 34 MB out of 16 GB total (roughly 0%) for 6 video streams.
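Comparable resource figures can be sampled with docker stats against the containers from the compose file above (container names livevideo.example.com and nginx):
docker stats --no-stream livevideo.example.com nginx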