I’m working on a project where I’m capturing audio from the user’s microphone using getUserMedia and sending it to a server via WebSockets. The server then broadcasts this audio to other users. However, I’m encountering an issue when trying to play the audio on the receiving end. I’m getting the error “Error playing audio: DOMException: The element has no supported sources.”
Here’s the relevant code:
Admin Side (Broadcasting)
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Admin Broadcasting</title>
</head>
<body>
<h1>Admin Broadcasting</h1>
<button id="startBroadcastBtn">Start Broadcasting</button>
<script src="/socket.io/socket.io.js"></script>
<script>
const startBroadcastBtn = document.getElementById('startBroadcastBtn');
const socket = io('ws://localhost:9000');
let mediaRecorder;
startBroadcastBtn.addEventListener('click', async () => {
try {
const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
console.log('MediaStream object:', stream);
const audioTracks = stream.getAudioTracks();
if (audioTracks.length === 0) {
console.error('No audio tracks available');
return;
}
mediaRecorder = new MediaRecorder(stream);
mediaRecorder.ondataavailable = (event) => {
if (event.data.size > 0) {
console.log('Audio chunk captured:', event.data);
socket.emit('admin-audio-chunk', event.data);
}
};
mediaRecorder.start(1000); // emit a chunk roughly every second
console.log('Recording started.');
} catch (error) {
console.error('Error accessing media devices:', error);
}
});
</script>
</body>
</html>
User Side (Receiving)
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>User Listening</title>
</head>
<body>
<h1>User Listening</h1>
<audio id="audioPlayer" controls></audio>
<button id="playButton">Play Audio</button>
<script src="/socket.io/socket.io.js"></script>
<script>
const audioPlayer = document.getElementById('audioPlayer');
const playButton = document.getElementById('playButton');
const socket = io('ws://localhost:9000');
socket.on('user-audio-chunk', (data) => {
console.log('Received audio chunk from user:', data);
try {
const blob = new Blob([data]);
console.log('Received audio format:', blob.type);
audioPlayer.src = URL.createObjectURL(blob);
} catch (error) {
console.error('Error creating Blob object:', error);
}
});
playButton.addEventListener('click', () => {
audioPlayer.play().catch((error) => {
console.error('Error playing audio:', error);
});
});
</script>
</body>
</html>
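For context, the relay on my Node.js server is essentially the handler below (a simplified sketch; `attachRelay` is just a name I'm using here, and the real server is created with `new Server(9000)` from the socket.io package and also serves the two pages):

```javascript
// Sketch of the relay handler on my Node.js server (simplified; the real
// server is created with `new Server(9000)` from the socket.io package).
function attachRelay(io) {
  io.on('connection', (socket) => {
    // Every chunk the admin page emits is forwarded to all other connected
    // clients under the event name the user page listens for.
    socket.on('admin-audio-chunk', (chunk) => {
      socket.broadcast.emit('user-audio-chunk', chunk);
    });
  });
}
```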
Additional details:
1.) On the admin side, the recorded chunk's Blob type is "audio/webm;codecs=opus". On the user side, however, the reconstructed Blob's type is empty, and I suspect this discrepancy is causing the error.
2.) Both the admin and user pages are connected through a Node.js server that uses Socket.IO for real-time communication.
How can I resolve this error and make sure the audio plays correctly on the user's side?
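Following up on point 1, I suspect the user side needs to reattach the MIME type when rebuilding the Blob, since Socket.IO delivers the payload as a bare ArrayBuffer and the original Blob's type is lost in transit. A minimal sketch of what I mean (`sampleChunk` is a stand-in for a received chunk; this runs in Node 18+ or any browser):

```javascript
// Rebuild the received chunk as a Blob with an explicit MIME type; without
// the `type` option, Blob.type is the empty string, not the original codec.
const sampleChunk = new Uint8Array([0x1a, 0x45, 0xdf, 0xa3]).buffer; // EBML/WebM magic bytes
const untyped = new Blob([sampleChunk]);
const typed = new Blob([sampleChunk], { type: 'audio/webm;codecs=opus' });
console.log(untyped.type); // ""
console.log(typed.type); // "audio/webm;codecs=opus"
```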