For video live streaming, I researched quite a few solutions early on, including WebSocket. For various reasons I did not adopt this solution in the end, but I still want to record what I learned.
WebSocket is a protocol, introduced alongside HTML5, that provides full-duplex communication over a single TCP connection.
With the WebSocket API, the browser and the server only need to complete one handshake; after that, a persistent channel exists between them and data can flow directly in both directions.
The browser initiates the WebSocket connection through JavaScript. Once the connection is established, the client and the server exchange data directly over that TCP connection.
After you obtain the WebSocket connection, you can send data to the server with the send() method and receive data returned by the server through the onmessage event.
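As a quick illustration of that API (this snippet is mine, not part of the original project, and the URL is only a placeholder), a minimal client looks like this:

// Minimal WebSocket client sketch; ws://example.com:8080 is a placeholder URL
var socket = new WebSocket("ws://example.com:8080");

socket.onopen = function () {
    // The handshake has completed; the channel is ready for two-way traffic
    socket.send("hello server");
};

socket.onmessage = function (event) {
    // event.data holds whatever the server pushed to us
    console.log("received:", event.data);
};

socket.onclose = function () {
    console.log("connection closed");
};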
The main idea of the process: on the recording page, use setTimeout() to capture the video into a single frame image through canvas at a fixed interval, then send that image data to the server with the WebSocket's socket.send(). On the live broadcast page, create an <img> element, receive the image data through the WebSocket's socket.onmessage(), and display it in the <img> tag; the rapid succession of frames forms the live broadcast.
The code is attached below.
Video page HTML structure
<video autoplay id="sourcevid" style="width:1600px;height:900px;"></video>
<canvas id="output" style="display:none;"></canvas>
Video page JS
<script type="text/javascript" charset="utf-8">
    // Create a WebSocket instance
    var socket = new WebSocket("ws://" + document.domain + ":8080");
    var back = document.getElementById('output');
    // Get a 2D drawing context for the canvas
    var backcontext = back.getContext('2d');
    var video = document.getElementsByTagName('video')[0];
    var success = function (stream) {
        // Get the camera stream and turn it into an object URL for the video tag
        video.src = window.URL.createObjectURL(stream);
    };
    // Once the socket is open, start drawing
    socket.onopen = function () {
        draw();
        console.log("open success");
    };
    // Draw the current video frame onto the canvas; a new frame every 100 ms
    // is enough to look like continuous video to the naked eye.
    var draw = function () {
        try {
            backcontext.drawImage(video, 0, 0, back.width, back.height);
        } catch (e) {
            if (e.name == "NS_ERROR_NOT_AVAILABLE") {
                // The video is not ready yet; try again shortly
                return setTimeout(draw, 100);
            } else {
                throw e;
            }
        }
        if (video.src) {
            // Convert the canvas content into a JPEG data URI and send it to the server;
            // 0.5 is the image quality (compression) factor
            socket.send(back.toDataURL("image/jpeg", 0.5));
        }
        setTimeout(draw, 100);
    };
    // Request the device's camera and feed the stream into the video tag
    navigator.getUserMedia = navigator.getUserMedia || navigator.webkitGetUserMedia || navigator.mozGetUserMedia || navigator.msGetUserMedia;
    navigator.getUserMedia({ video: true, audio: false }, success, console.log);
</script>
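Note that the prefixed navigator.getUserMedia API and assigning a stream to video.src through createObjectURL are both deprecated in current browsers. A sketch of the modern equivalent (not used in the original code above) would be:

// Modern equivalent (sketch): navigator.mediaDevices.getUserMedia returns a Promise,
// and the stream is assigned to video.srcObject instead of video.src
navigator.mediaDevices.getUserMedia({ video: true, audio: false })
    .then(function (stream) {
        video.srcObject = stream;
    })
    .catch(function (err) {
        console.log(err);
    });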
Live page HTML structure:
<img id="receiver" style="width:1600px;height:900px;"/>
Live page JS
<script type="text/javascript" charset="utf-8">
    // Create a socket instance (the receiving page connects to port 8008 here)
    var receiver_socket = new WebSocket("ws://" + document.domain + ":8008");
    alert("ws://" + document.domain + ":8008");
    var image = document.getElementById('receiver');
    // Listen for messages: each message is the data URI of one frame,
    // which is shown directly in the <img> tag
    receiver_socket.onmessage = function (event) {
        image.src = event.data;
    };
</script>
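The article never shows the server side, which simply relays frames from the recording page to the live page. As a rough sketch of what that relay could look like (my own assumption, not from the original; it uses the Node.js "ws" package and a single port 8080, whereas the original pages point at 8080 and 8008, which would need two endpoints or a proxy):

// Minimal relay server sketch (assumption, not from the original article).
// Requires the Node.js "ws" package: npm install ws
const WebSocket = require('ws');

const wss = new WebSocket.Server({ port: 8080 }); // single port is an assumption

wss.on('connection', function (ws) {
    ws.on('message', function (frame) {
        // Forward each received frame (a JPEG data URI string) to every other client
        wss.clients.forEach(function (client) {
            if (client !== ws && client.readyState === WebSocket.OPEN) {
                client.send(frame.toString());
            }
        });
    });
});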