Not long ago I spent some time researching the currently popular video live streaming space, to understand its overall implementation flow and explore the feasibility of live streaming in mobile HTML5.
The mainstream live streaming solutions on the web today are HLS and RTMP. The mobile web currently relies mainly on HLS (which suffers from latency; RTMP can also be used via video.js), while the PC side relies mainly on RTMP, which offers better real-time performance. This article walks through H5 live streaming around these two streaming protocols.
1. Video streaming protocols: HLS and RTMP

1. HTTP Live Streaming

HTTP Live Streaming (HLS) is an HTTP-based video streaming protocol implemented by Apple. QuickTime and Safari on macOS and Safari on iOS all support HLS well, and newer versions of Android have also added HLS support. Common clients such as MPlayerX and VLC support the HLS protocol as well.
The HLS protocol is based on HTTP, and a server that provides HLS needs to do two things:
Encoding: encode the video as H.264 and the audio as MP3, AAC, or HE-AAC, then package them into an MPEG-2 TS (Transport Stream) container.
Splitting: cut the encoded TS file into small files of equal length with the .ts suffix, and generate a .m3u8 plain-text index file.

The browser consumes the m3u8 file. m3u8 is very similar to the m3u audio playlist format; you can simply think of an m3u8 as a playlist containing multiple ts files. The player plays them one by one in order, and when they have all been played it requests the m3u8 file again to obtain a playlist containing the latest ts files, and the cycle repeats. The entire live stream relies on a constantly updated m3u8 file and a series of small ts files. The m3u8 must be updated dynamically, while the ts files can be served through a CDN. A typical m3u8 file looks like this:
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=200000
gear1/prog_index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=311111
gear2/prog_index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=484444
gear3/prog_index.m3u8
#EXT-X-STREAM-INF:PROGRAM-ID=1,BANDWIDTH=737777
gear4/prog_index.m3u8
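The file above is a master playlist pointing at several bitrate variants. The per-variant playlist that actually lists the ts segments would look roughly like this sketch (the segment names and durations here are illustrative, not from the original article):

```text
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:5
#EXT-X-MEDIA-SEQUENCE:176
#EXTINF:5.000,
fileSequence176.ts
#EXTINF:5.000,
fileSequence177.ts
#EXTINF:5.000,
fileSequence178.ts
```

For a live stream the server keeps increasing #EXT-X-MEDIA-SEQUENCE and rotating segments in and out, which is exactly the "constantly updated m3u8" described above.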
It can be seen that the essence of the HLS protocol is still ordinary HTTP request/response, so it adapts well and is not blocked by firewalls. But it also has a fatal weakness: very noticeable latency. If each ts segment is 5 seconds long and one m3u8 holds 6 ts indexes, the minimum delay is 30 seconds. Shortening each ts and reducing the number of indexes in the m3u8 does reduce latency, but it causes more frequent buffering, and the request pressure on the server grows sharply. So we can only find a compromise based on the actual situation.
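The latency tradeoff above is simple arithmetic; a quick sketch (the function names are mine, not from any library):

```javascript
// Minimum end-to-end HLS delay: the player generally starts
// a full playlist's worth of segments behind the live edge.
function minHlsDelaySeconds(segmentSeconds, segmentsPerPlaylist) {
  return segmentSeconds * segmentsPerPlaylist;
}

// Request pressure for the playlist alone grows as segments shrink:
// the player re-fetches the m3u8 roughly once per segment duration.
function playlistRequestsPerMinute(segmentSeconds) {
  return 60 / segmentSeconds;
}

console.log(minHlsDelaySeconds(5, 6));     // 30 - the example in the text
console.log(minHlsDelaySeconds(2, 3));     // 6, but:
console.log(playlistRequestsPerMinute(5)); // 12
console.log(playlistRequestsPerMinute(2)); // 30 - 2.5x more playlist requests
```

This is why simply shrinking segments trades latency for buffering risk and server load.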
For browsers that support HLS, just write this to play:
<video src="./bipbopall.m3u8" height="300" width="400" preload="auto" autoplay="autoplay" loop="loop" webkit-playsinline="true"></video>
Note: on the PC side, HLS is only supported by Safari; browsers such as Chrome cannot play the m3u8 format through the HTML5 video tag. In that case you can use some relatively mature solutions available online, such as sewise-player, MediaElement, videojs-contrib-hls, and jwplayer.
2. Real-Time Messaging Protocol

Real-Time Messaging Protocol (RTMP) is a set of live video protocols developed by Macromedia and now owned by Adobe. This solution requires a dedicated RTMP streaming service such as Adobe Media Server, and in the browser the player can only be implemented with Flash. Its real-time performance is very good and its latency is small, but its shortcoming is that it cannot be played on the mobile web.
Although RTMP cannot be played in an H5 page on iOS, a native iOS application can implement its own decoding and parsing; RTMP's latency is low and its real-time performance is good. On the browser side, the HTML5 video tag cannot play RTMP streams directly, but playback can be achieved through video.js:
<link href="http://vjs.zencdn.net/5.8.8/video-js.css" rel="stylesheet">
<video id="example_video_1" class="video-js vjs-default-skin" controls preload="auto" width="640" height="264" loop="loop" webkit-playsinline>
<source src="rtmp://10.14.221.17:1935/rtmplive/home" type="rtmp/flv">
</video>
<script src="http://vjs.zencdn.net/5.8.8/video.js"></script>
<script>
videojs.options.flash.swf = 'video.swf';
videojs('example_video_1').ready(function() {
  this.play();
});
</script>

3. Comparison of HLS and RTMP

2. Live broadcast format
At present, live broadcast pages usually follow the format of apps such as YY Live and Yingke Live. You can see that their structure divides into three layers:
① Background video layer
② Follow and comment module
③ Like animation
Building a similar live page in H5 is not technically difficult. It can be divided into:
① The video background at the bottom, played with the video tag
② The follow and comment module, using WebSocket to send and receive new messages in real time, rendered with DOM and CSS3
③ Likes, implemented with CSS3 animation
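As a sketch of point ②, the comment module might decode incoming WebSocket messages like this (the message shape and function names are assumptions of mine, not from any specific product):

```javascript
// Hypothetical payload: {"type":"comment","user":"...","text":"..."}
// or {"type":"like"}. Returns a plain description of what the UI
// layer should do; a real page would append DOM nodes for comments
// and trigger the CSS3 like animation instead.
function handleLiveMessage(raw) {
  const msg = JSON.parse(raw);
  switch (msg.type) {
    case 'comment':
      return { action: 'appendComment', text: `${msg.user}: ${msg.text}` };
    case 'like':
      return { action: 'playLikeAnimation' };
    default:
      return { action: 'ignore' };
  }
}

console.log(handleLiveMessage('{"type":"comment","user":"gao","text":"hi"}'));
// → { action: 'appendComment', text: 'gao: hi' }
```

In the real page this handler would be wired to `ws.onmessage`, with one WebSocket connection shared by the comment and like modules.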
After understanding the live broadcast format, let’s understand the live broadcast process as a whole.
3. Overall live broadcast process

The overall live broadcast process can be roughly divided into:
Video capture end: the audio/video input devices on a computer, or the camera and microphone on a mobile phone; currently mobile-phone capture is dominant.
Live streaming server: an Nginx server receives the video stream (H.264/AAC encoded) transmitted by the capture end, parses and encodes it on the server side, and pushes an RTMP/HLS stream to the playback end.
Video player: a player on the computer (QuickTime Player, VLC), a native player on a mobile phone, or the H5 video tag; currently native mobile players are dominant.
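The three roles above chain together roughly like this:

```text
capture end (phone camera / desktop tools)
    -- RTMP push --> Nginx + nginx-rtmp-module
                        |-- RTMP pull --> Flash / native players (low latency)
                        |-- HLS (m3u8 + ts over HTTP/CDN) --> H5 <video> (higher latency)
```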
4. H5 video recording

For H5 video recording you can use the powerful WebRTC (Web Real-Time Communication), a technology that lets web browsers hold real-time voice or video conversations. Its drawback is that support is good only in Chrome on the PC; support on mobile devices is not yet ideal.
Basic process of recording video with WebRTC:

① Call window.navigator.webkitGetUserMedia() to get the video data from the user's PC camera.
② Convert the obtained video stream data with window.webkitRTCPeerConnection (a video stream data format).
③ Use WebSocket to transmit the video stream data to the server.
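The prefixed APIs above reflect the era this article describes; in current browsers the entry point is navigator.mediaDevices.getUserMedia(constraints). A small sketch of building the constraints object (the resolution values are arbitrary examples, not from the article):

```javascript
// Build a getUserMedia constraints object asking for camera video at
// an ideal resolution plus audio. In a browser, this object would be
// passed to navigator.mediaDevices.getUserMedia(...).
function buildMediaConstraints(width, height) {
  return {
    audio: true,
    video: {
      width: { ideal: width },
      height: { ideal: height },
      facingMode: 'user', // front camera on mobile
    },
  };
}

console.log(buildMediaConstraints(640, 480).video.width.ideal); // 640

// In a browser (not runnable in Node) the capture step would be:
// navigator.mediaDevices.getUserMedia(buildMediaConstraints(640, 480))
//   .then(stream => { videoEl.srcObject = stream; });
```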
Note:
Although Google has been pushing WebRTC and many mature products have appeared, most mobile browsers do not yet support WebRTC (as of iOS 10.0 it is unsupported), so for real video recording the client side (iOS, Android) is still the better choice.
(Figure: WebRTC browser support)
Process for a native iOS application to record video with the camera:

① Audio/video capture: the raw audio and video data streams can be collected with AVCaptureSession and AVCaptureDevice.
② Encode the video as H.264 and the audio as AAC. iOS has wrapped encoding libraries (x264, faac, FFmpeg) for encoding audio and video.
③ Assemble the encoded audio and video data into packets.
④ Establish an RTMP connection and push the packets to the server.
5. Build an Nginx + RTMP streaming server

1. Install nginx and nginx-rtmp-module
① First tap the homebrew nginx repository locally:
brew tap homebrew/nginx
② Execute and install nginx-rtmp-module
brew install nginx-full --with-rtmp-module
2. nginx.conf configuration file, configure RTMP and HLS
Find the nginx.conf configuration file (path/usr/local/etc/nginx/nginx.conf) and configure RTMP and HLS.
① Add the rtmp configuration content before the http node:
② Add hls configuration in http
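The article does not reproduce the configuration itself; a typical nginx-rtmp-module setup consistent with the stream addresses used later (applications rtmplive and hls) looks roughly like this — the hls_path and fragment length are conventional values, adjust them to your machine:

```nginx
# ① before the http node: RTMP server with two applications
rtmp {
    server {
        listen 1935;            # standard RTMP port, matches rtmp://host:1935/...
        application rtmplive {  # plain RTMP relay: push and pull the same URL
            live on;
            record off;
        }
        application hls {       # remux pushed RTMP into HLS segments
            live on;
            hls on;
            hls_path /usr/local/var/www/hls;
            hls_fragment 5s;
        }
    }
}

# ② inside the http { server { ... } } node: serve the m3u8/ts files
location /hls {
    types {
        application/vnd.apple.mpegurl m3u8;
        video/mp2t ts;
    }
    root /usr/local/var/www;
    add_header Cache-Control no-cache;
}
```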
Restart the nginx service:
nginx -s reload
Then open http://localhost:8080 in the browser and check that the welcome page appears to confirm nginx restarted successfully.
6. Live stream format conversion, encoding, and pushing

When the server receives the video stream transmitted from the recording end, it needs to parse and encode it and push an RTMP/HLS stream to the player. Common encoding library solutions include x264, faac, and FFmpeg. Since FFmpeg integrates encoders for multiple audio and video formats, we can use FFmpeg for format conversion, encoding, and stream pushing.
1. Install the FFmpeg tool
brew install ffmpeg
2. Push MP4 files
Video file address:/Users/gao/Desktop/video/test.mp4
Push/pull stream address: rtmp://localhost:1935/rtmplive/home (with the rtmplive application, pushing and pulling use the same address)
# RTMP protocol stream
ffmpeg -re -i /Users/gao/Desktop/video/test.mp4 -vcodec libx264 -acodec aac -f flv rtmp://10.14.221.17:1935/rtmplive/home

# HLS protocol stream
ffmpeg -re -i /Users/gao/Desktop/video/test.mp4 -vcodec libx264 -vprofile baseline -acodec aac -ar 44100 -strict -2 -ac 1 -f flv -q 10 rtmp://10.14.221.17:1935/hls/test
Notice:
After pushing the stream, we can install VLC or ffplay (players that support the RTMP protocol) and pull the stream locally to verify it.
3. FFmpeg push-stream commands

① Push a video file for live streaming:

ffmpeg -re -i /Users/gao/Desktop/video/test.mp4 -vcodec libx264 -vprofile baseline -acodec aac -ar 44100 -strict -2 -ac 1 -f flv -q 10 rtmp://192.168.1.101:1935/hls/test
ffmpeg -re -i /Users/gao/Desktop/video/test.mp4 -vcodec libx264 -vprofile baseline -acodec aac -ar 44100 -strict -2 -ac 1 -f flv -q 10 rtmp://10.14.221.17:1935/hls/test
② Push camera + desktop + microphone recording for live broadcast:

ffmpeg -f avfoundation -framerate 30 -i "1:0" \
-f avfoundation -framerate 30 -video_size 640x480 -i "0" \
-c:v libx264 -preset ultrafast \
-filter_complex 'overlay=main_w-overlay_w-10:main_h-overlay_h-10' \
-acodec libmp3lame -ar 44100 -ac 1 -f flv rtmp://192.168.1.101:1935/hls/test
For more commands, please refer to:
A complete list of FFmpeg commands for processing RTMP streaming media
FFmpeg commonly used streaming commands
7. H5 live video playback

iOS and Android both natively support the HLS protocol. Once the video capture end and the streaming service are in place, you can configure a video tag on the H5 page to play the live stream directly.
<video controls preload="auto" autoplay="autoplay" loop="loop" webkit-playsinline>
<source src="http://10.14.221.8/hls/test.m3u8" type="application/vnd.apple.mpegurl" />
<p class="warning">Your browser does not support HTML5 video.</p>
</video>

8. Summary
This article walked through the whole pipeline: video capture and upload, server-side processing and pushing of the stream, and playing the live video on an H5 page, covering the principles behind a live broadcast implementation. Many performance issues will come up along the way:
① H5 HLS is restricted to H.264 + AAC encoding.
② H5 HLS playback can stutter; the server side can tune its segmentation strategy and put ts files on a CDN, and the front end can try DNS caching, etc.
③ For better real-time interaction, H5 live broadcast can also use the RTMP protocol, played back through video.js.