I am trying to detect, from a Chrome extension, when the speaker and the microphone are in use.
As a test case I joined a Google Meet call, then started the extension to record the tab so that I can receive the recorded audio chunks in ondataavailable().
The code in the offscreen JavaScript file is definitely reached, because I can see all the console.log() output, but ondataavailable() never fires and I am not sure why. Does anybody have any idea?
manifest.json
"action": {},
"permissions": ["sidePanel", "storage", "scripting", "tabCapture", "offscreen"]
offscreen.js
console.log("StartRecording Function");
const stream = await navigator.mediaDevices.getUserMedia({
audio: {
mandatory: {
chromeMediaSource: 'tab',
chromeMediaSourceId: streamId
}
}
});
// Continue to play the captured audio to the user.
const output = new AudioContext();
const source = output.createMediaStreamSource(stream);
source.connect(output.destination);
mediaRecorder = new MediaRecorder(stream);
mediaRecorder.start();
mediaRecorder.onstart = () => console.log("================Recording started======================");
mediaRecorder.onerror = (error) => console.error("Recording error: ", error);
console.log("Recording Started after onstart, before ondata available");
mediaRecorder.ondataavailable = (event) => {
console.log("Data available audioChunks pushed...")
console.log(event.data);
};
Everything above is logged as expected, but the ondataavailable handler never fires, so the following line never appears:
console.log("Data available audioChunks pushed...");
Does anyone know why?
I am also wondering whether, with the approach above, I can tell when the other person on the call is speaking.
My goal is to rely only on getUserMedia() and identify which of the two people on the call is currently speaking.
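For that second point, I was considering something along these lines (an untested sketch; watchSpeech, the RMS threshold and the polling interval are my own choices), running it once on a microphone stream from getUserMedia({ audio: true }) for my side and once on the captured tab stream for the remote side:
// Untested sketch: log when there is audible activity on a stream.
function watchSpeech(stream, label, threshold = 0.01) {
  const ctx = new AudioContext();
  const analyser = ctx.createAnalyser();
  analyser.fftSize = 2048;
  ctx.createMediaStreamSource(stream).connect(analyser);

  const buffer = new Float32Array(analyser.fftSize);
  setInterval(() => {
    analyser.getFloatTimeDomainData(buffer);
    // Rough RMS level of the last fftSize samples; above the threshold I treat it as speech.
    const rms = Math.sqrt(buffer.reduce((sum, v) => sum + v * v, 0) / buffer.length);
    if (rms > threshold) {
      console.log(label + " is speaking (rms=" + rms.toFixed(3) + ")");
    }
  }, 200);
}

// Intended usage:
// watchSpeech(micStream, "Me");      // micStream from getUserMedia({ audio: true })
// watchSpeech(tabStream, "Remote");  // tabStream from the tab capture above
Would something like that be a reasonable way to do it, or is there a better approach?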
Thanks.