In this article, I want to share my attempts to stream video over WebSockets without third-party browser plugins such as Adobe Flash Player. Read on to find out how it went.
Adobe Flash, formerly Macromedia Flash, is a platform for building applications that run in a web browser. Before the introduction of the Media Stream API, it was practically the only platform for streaming video and audio from a webcam, as well as for building all kinds of conferences and chats in the browser. Its protocol for transferring media, RTMP (Real Time Messaging Protocol), was effectively closed for a long time, which meant: if you wanted to run your own streaming service, please use the software from Adobe itself, Adobe Media Server (AMS).
After some time, in 2012, Adobe "gave up" and released the RTMP specification to the public.
The Adobe Flash platform is more than 20 years old; over that time many critical vulnerabilities have been discovered in it, and support has been announced to end.
For my project, I decided right away to abandon Flash in the browser entirely. The main reason is stated above; besides, Flash is not supported at all on mobile platforms, and I really didn't want to set up Adobe Flash development tooling (under the Wine emulator). So I set out to write a client in JavaScript. This will be just a prototype: later I learned that streaming can be done much more efficiently based on p2p, though in my case it will be peer - server - peers. But more on that another time, because it's not ready yet.
To get started, we need an actual WebSocket server. I made the simplest one, based on the melody Go package:
Server side code
package main

import (
	"log"
	"net/http"

	"github.com/go-chi/chi"
	"gopkg.in/olahol/melody.v1"
)

func main() {
	r := chi.NewRouter()
	m := melody.New()
	m.Config.MaxMessageSize = 204800

	r.Get("/", func(w http.ResponseWriter, r *http.Request) {
		http.ServeFile(w, r, "public/index.html")
	})

	r.Get("/ws", func(w http.ResponseWriter, r *http.Request) {
		m.HandleRequest(w, r)
	})

	// Broadcast the video stream to every connected client
	m.HandleMessageBinary(func(s *melody.Session, msg []byte) {
		m.BroadcastBinary(msg)
	})

	log.Println("Starting server...")
	if err := http.ListenAndServe(":3000", r); err != nil {
		log.Fatal(err)
	}
}
On the client (the transmitting side), we first need to access the camera. We request access (permission) to the camera/microphone through the MediaDevices API, namely navigator.mediaDevices.getUserMedia().
getUserMedia() returns a Promise that resolves with a MediaStream object, a stream of video/audio data. We assign this object to the srcObject property of the video element. The code:
Broadcasting side
<style>
  #videoObjectHtml5ApiServer { width: 320px; height: 240px; background: #666; }
</style>
</head>
<body>
  <!-- Here, in this little "window", the client will see themselves -->
  <video autoplay id="videoObjectHtml5ApiServer"></video>
  <script type="application/javascript">
    var video = document.getElementById('videoObjectHtml5ApiServer');
    // If the MediaDevices API is available, try to get access to the camera (the microphone can be requested too)
    // getUserMedia returns a promise; we subscribe to it, and in the callback we direct the received video stream into the video element on the page
    if (navigator.mediaDevices.getUserMedia) {
      navigator.mediaDevices.getUserMedia({video: true}).then(function (stream) {
        // bind the video stream to the video tag so the client can see themselves and adjust
        video.srcObject = stream;
      });
    }
  </script>
To broadcast the video stream over sockets, we have to encode it, buffer it, and transmit it in chunks; a raw video stream cannot be sent over WebSockets as is. Here the MediaRecorder API comes to our aid.
Encoding the video stream and splitting it into chunks
<style>
  #videoObjectHtml5ApiServer { width: 320px; height: 240px; background: #666; }
</style>
</head>
<body>
  <!-- Here, in this little "window", the client will see themselves -->
  <video autoplay id="videoObjectHtml5ApiServer"></video>
  <script type="application/javascript">
    var video = document.getElementById('videoObjectHtml5ApiServer');
    // If the MediaDevices API is available, try to get access to the camera (the microphone can be requested too)
    // getUserMedia returns a promise; we subscribe to it, and in the callback we direct the received video stream into the video element on the page
    if (navigator.mediaDevices.getUserMedia) {
      navigator.mediaDevices.getUserMedia({video: true}).then(function (stream) {
        // bind the video stream to the video tag so the client can see themselves and adjust
        video.srcObject = stream;
        var recorderOptions = {
              mimeType: 'video/webm; codecs=vp8' // encode the video stream into the webm format with the vp8 codec
            },
            mediaRecorder = new MediaRecorder(stream, recorderOptions); // the MediaRecorder object
        mediaRecorder.ondataavailable = function (e) {
          if (e.data && e.data.size > 0) {
            // a chunk of the video stream arrives in e.data
          }
        };
        mediaRecorder.start(100); // split the stream into chunks of 100 ms each
      });
    }
  </script>
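Not every browser supports the same container/codec combination, so before constructing the MediaRecorder it is sensible to probe with MediaRecorder.isTypeSupported(). Below is a minimal sketch of that selection logic; the helper name pickMimeType and the candidate list are my own illustrative assumptions, not part of the original prototype. The predicate is passed in as a function so the logic itself does not depend on browser globals.

```javascript
// Hypothetical helper: return the first MIME type the predicate accepts.
// In the browser the predicate would be:
//   function (t) { return MediaRecorder.isTypeSupported(t); }
function pickMimeType(isSupported, candidates) {
  for (var i = 0; i < candidates.length; i++) {
    if (isSupported(candidates[i])) {
      return candidates[i];
    }
  }
  return ''; // an empty string lets MediaRecorder fall back to its default
}

// Example with a stub predicate that only "supports" vp8:
var chosen = pickMimeType(
  function (t) { return t.indexOf('vp8') !== -1; },
  ['video/webm; codecs=vp9', 'video/webm; codecs=vp8']
);
```

In the real page you would then pass { mimeType: chosen } as the recorder options instead of hard-coding 'video/webm; codecs=vp8'.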
Now let's add transmission over WebSockets. Surprisingly, all you need is a WebSocket object:
Sending a video stream to the server
<style>
  #videoObjectHtml5ApiServer { width: 320px; height: 240px; background: #666; }
</style>
</head>
<body>
  <!-- Here, in this little "window", the client will see themselves -->
  <video autoplay id="videoObjectHtml5ApiServer"></video>
  <script type="application/javascript">
    var video = document.getElementById('videoObjectHtml5ApiServer');
    // If the MediaDevices API is available, try to get access to the camera (the microphone can be requested too)
    // getUserMedia returns a promise; we subscribe to it, and in the callback we direct the received video stream into the video element on the page
    if (navigator.mediaDevices.getUserMedia) {
      navigator.mediaDevices.getUserMedia({video: true}).then(function (stream) {
        // bind the video stream to the video tag so the client can see themselves and adjust
        video.srcObject = stream;
        var recorderOptions = {
              mimeType: 'video/webm; codecs=vp8' // encode the video stream into the webm format with the vp8 codec
            },
            mediaRecorder = new MediaRecorder(stream, recorderOptions), // the MediaRecorder object
            socket = new WebSocket('ws://127.0.0.1:3000/ws');
        mediaRecorder.ondataavailable = function (e) {
          if (e.data && e.data.size > 0) {
            // a chunk of the video stream arrives in e.data; send it to the server
            socket.send(e.data);
          }
        };
        mediaRecorder.start(100); // split the stream into chunks of 100 ms each
      }).catch(function (err) { console.log(err); });
    }
  </script>
The broadcasting side is ready! Now let's try to receive the video stream and display it on the client. What do we need for this? First, of course, a socket connection. We attach a "listener" to the WebSocket object by subscribing to the 'message' event. Whenever the server receives a chunk of binary data, it broadcasts it to its subscribers, that is, to the clients. On each client this triggers the callback bound to the 'message' event, and the chunk of the vp8-encoded video stream is passed in via the event object.
Receive video stream
<style>
  #videoObjectHtml5ApiServer { width: 320px; height: 240px; background: #666; }
</style>
</head>
<body>
  <!-- Here, in this little "window", the client will see themselves -->
  <video autoplay id="videoObjectHtml5ApiServer"></video>
  <script type="application/javascript">
    var video = document.getElementById('videoObjectHtml5ApiServer'),
        socket = new WebSocket('ws://127.0.0.1:3000/ws'),
        arrayOfBlobs = [];

    socket.addEventListener('message', function (event) {
      // "put" the received chunk into the array
      arrayOfBlobs.push(event.data);
      // the chunks will be read here
      readChunk();
    });
  </script>
For a long time I tried to understand why the received chunks cannot simply be fed straight to the video element for playback. It turns out they cannot: a chunk must first be placed into a special buffer attached to the video element, and only then does the video stream start playing. For this we need the MediaSource API.
MediaSource acts as an intermediary between the media playback object and the source of the media stream. A MediaSource object holds a pluggable buffer for the video/audio source. One quirk: the buffer only accepts raw bytes (a Uint8Array built from an ArrayBuffer), not a Blob, so a FileReader is needed for the conversion. Look at the code and it will become clearer:
Playing a video stream
<style>
  #videoObjectHtml5ApiServer { width: 320px; height: 240px; background: #666; }
</style>
</head>
<body>
  <!-- Here, in this little "window", the client will see themselves -->
  <video autoplay id="videoObjectHtml5ApiServer"></video>
  <script type="application/javascript">
    var video = document.getElementById('videoObjectHtml5ApiServer'),
        socket = new WebSocket('ws://127.0.0.1:3000/ws'),
        mediaSource = new MediaSource(), // the MediaSource object
        vid2url = URL.createObjectURL(mediaSource), // create a URL object to bind the video stream to the player
        arrayOfBlobs = [],
        sourceBuffer = null; // the buffer, a null object for now

    video.src = vid2url; // point the player at the MediaSource via the object URL

    socket.addEventListener('message', function (event) {
      // "put" the received chunk into the array
      arrayOfBlobs.push(event.data);
      // the chunks will be read here
      readChunk();
    });

    // As soon as MediaSource is notified that the source is ready to hand out
    // chunks of the video/audio stream, we create the buffer. Note that the
    // buffer must know which format and which codec the stream was encoded
    // with, so that it can read the video stream the same way
    mediaSource.addEventListener('sourceopen', function () {
      var mediaSource = this;
      sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8"');
    });

    function readChunk() {
      var reader = new FileReader();
      reader.onload = function (e) {
        // as soon as the FileReader is ready and has loaded a chunk of the video stream,
        // we append the chunk, converted from a Blob to a Uint8Array, to the buffer bound
        // to the player, and the player starts playing the received piece of video/audio
        sourceBuffer.appendBuffer(new Uint8Array(e.target.result));
        reader.onload = null;
      };
      reader.readAsArrayBuffer(arrayOfBlobs.shift());
    }
  </script>
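One caveat the listing above glosses over: SourceBuffer.appendBuffer() throws if it is called while the buffer is still processing the previous append (while its updating flag is true), so in practice incoming chunks have to be queued and drained one at a time on the buffer's 'updateend' event. Here is a sketch of that queueing logic with the MediaSource specifics stubbed out; the ChunkQueue name and its API are my own hypothetical additions, shown this way so the logic can be followed (and exercised) without a browser.

```javascript
// Hypothetical queue that hands chunks to an "append" callback one at a
// time; done() plays the role of the SourceBuffer 'updateend' event.
function ChunkQueue(append) {
  this.queue = [];
  this.busy = false;   // mirrors sourceBuffer.updating
  this.append = append;
}
ChunkQueue.prototype.push = function (chunk) {
  this.queue.push(chunk);
  this.drain();
};
ChunkQueue.prototype.drain = function () {
  if (this.busy || this.queue.length === 0) return;
  this.busy = true;
  this.append(this.queue.shift()); // real code: sourceBuffer.appendBuffer(chunk)
};
ChunkQueue.prototype.done = function () { // call from the 'updateend' handler
  this.busy = false;
  this.drain();
};

// Usage sketch: chunks pushed while an append is in flight wait their turn.
var appended = [];
var q = new ChunkQueue(function (c) { appended.push(c); });
q.push(1); q.push(2); q.push(3); // only 1 is appended until done() is called
q.done(); // 2 goes in
q.done(); // 3 goes in
```

In the page above, readChunk() would push into such a queue instead of calling appendBuffer() directly, and the 'updateend' handler on sourceBuffer would call done().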
The prototype of the streaming service is ready. The main disadvantage is that playback lags behind the transmitting side by at least 100 ms; we set that ourselves when splitting the video stream before sending it to the server. Moreover, when I tested it on my laptop, the lag between the transmitting and receiving sides gradually accumulated and was clearly visible. I started looking for ways to overcome this shortcoming... and came across WebRTC.
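One workaround for the accumulating delay (within this WebSocket prototype, before moving to p2p) is to watch how far the playhead has fallen behind the end of the buffered range and jump forward when the gap exceeds a threshold. Below is a sketch of that decision logic only; the function name, the threshold, and the 0.1 s safety margin are my own assumptions, not part of the original prototype.

```javascript
// Hypothetical helper: given the player position and the end of the
// buffered range, decide where (if anywhere) to seek to catch up.
function computeSeekTarget(currentTime, bufferedEnd, maxLagSec) {
  var lag = bufferedEnd - currentTime;
  if (lag > maxLagSec) {
    // jump close to the live edge, leaving a small margin so we do
    // not seek past the data that is actually buffered
    return bufferedEnd - 0.1;
  }
  return null; // lag is acceptable, keep playing
}

// In the browser this could run on the video element's 'timeupdate' event:
//   var b = video.buffered;
//   var end = b.length ? b.end(b.length - 1) : 0;
//   var target = computeSeekTarget(video.currentTime, end, 1.0);
//   if (target !== null) video.currentTime = target;
```

This trades a visible skip for lower latency; how aggressive to make the threshold is a tuning question.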
Source: habr.com