I'm trying to attach an AnalyserNode to an HTML5 video that is streamed over WebSockets with MediaRecorder. The analyser works fine when the video is a static file known up front (no livestream), but when I switch to a livestream (i.e. we append buffers from the WebSocket to the video element), the analyser outputs nothing, even though the video itself plays perfectly fine.
It seems there is no way to use `createMediaElementSource` with a video element to which we append SourceBuffers. Is there a workaround?
The code/analyser below works perfectly fine when:
- we use a MediaStream directly (`createMediaStreamSource`)
- we use `createMediaElementSource` with a static file

but not when:
- we use `createMediaElementSource` on a video to which we append SourceBuffers
Code:
var context = new AudioContext();
// var source = context.createMediaElementSource(video); // no output when the video uses srcObject / appended SourceBuffers
var source = context.createMediaStreamSource(this.userMediaStream); // MediaStream from MediaRecorder: this works correctly
var analyser = context.createAnalyser();
analyser.fftSize = 512;

var frequencyBins = new Uint8Array(analyser.frequencyBinCount);

// connect the source node to the analyser to collect frequency data
source.connect(analyser);