I have already written a Dart FFI plugin that creates the SRT client, connects to the online radio, and receives the raw MP3 frames as raw bytes. What gives me a headache is spawning an isolate to run in the background as a service that feeds the stream of a custom just_audio StreamAudioSource with the data and plays it back. The actual SRT protocol is irrelevant here; the only problem I have is playing back a non-predetermined amount of bytes from a live stream.
What I already tried is using the request method to fetch just one new MP3 frame from the SRT server per call and feed it as a stream to the StreamAudioResponse. Logging showed the request method only gets called every 5-7 seconds or so, but the SRT server sends me around 30-50 audio frames a second.
class AudioStreamSource extends StreamAudioSource {
  AudioStreamSource() {
    // Initialises the SRT client...
  }

  @override
  Future<StreamAudioResponse> request([int? start, int? end]) async {
    // Gets the next MP3 frame from the SRT server into a buffer...
    final audio = ...; // Pointer<Char> converted to a Stream

    return StreamAudioResponse(
      sourceLength: null,
      contentLength: null,
      offset: null,
      stream: audio,
      contentType: 'audio/mpeg',
    );
  }
}
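Presumably just_audio only calls request when its internal buffer runs low, so handing back a single frame per call starves the player. One idea I'm toying with is returning a single long-lived stream instead, fed continuously by whatever loop receives the SRT frames. A minimal sketch of that, assuming a StreamController and a hypothetical addFrame method that the receive loop would call (the class and method names are mine, not part of just_audio):

import 'dart:async';

import 'package:just_audio/just_audio.dart';

class LiveAudioStreamSource extends StreamAudioSource {
  // Single long-lived buffer of raw MP3 bytes: the SRT receive loop
  // pushes frames in, just_audio pulls them out.
  final _controller = StreamController<List<int>>();

  // Called by whatever loop receives frames from the SRT client
  // (hypothetical method, not part of the just_audio API).
  void addFrame(List<int> frame) => _controller.add(frame);

  @override
  Future<StreamAudioResponse> request([int? start, int? end]) async {
    // Hand just_audio one open-ended stream; it keeps reading as
    // frames arrive instead of needing a new request() per frame.
    return StreamAudioResponse(
      sourceLength: null,  // total length unknown for a live stream
      contentLength: null, // no fixed content length
      offset: null,        // seeking is meaningless for live audio
      stream: _controller.stream,
      contentType: 'audio/mpeg',
    );
  }
}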
In my head the correct concept looks like the following: I initialise the SRT connection, then I spawn an isolate which runs indefinitely (or at least until it gets a message through a port telling the service to stop). That isolate runs a loop receiving the newest MP3 frames as raw bytes, which get fed to a custom just_audio StreamAudioSource (via a MethodChannel?), which then gets played by the audio player. A sketch of what I mean follows.
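To make the idea concrete, here is a rough sketch of that service loop, assuming a SendPort/ReceivePort pair for the isolate-to-main handoff (rather than a MethodChannel, which targets platform code, not isolates) and assuming the SRT client is initialised inside the isolate, since FFI state doesn't transfer between isolates. readNextSrtFrame is a hypothetical stand-in for my FFI call, and LiveAudioStreamSource is the class sketched above:

import 'dart:isolate';
import 'dart:typed_data';

// Hypothetical stand-in for the FFI call that blocks until the next
// MP3 frame has been received from the SRT server.
Uint8List readNextSrtFrame() => throw UnimplementedError();

// Entry point for the background isolate: receive frames forever and
// post each one back to the main isolate.
void srtReceiveLoop(SendPort frames) {
  // The SRT client would be initialised here, inside the isolate.
  while (true) {
    frames.send(readNextSrtFrame());
  }
}

Future<Isolate> startService(LiveAudioStreamSource source) async {
  final receivePort = ReceivePort();
  final isolate = await Isolate.spawn(srtReceiveLoop, receivePort.sendPort);

  // Forward every frame from the isolate into the audio source's
  // stream controller (the addFrame method sketched above).
  receivePort.listen((frame) => source.addFrame(frame as Uint8List));

  // To stop the service later:
  //   isolate.kill(priority: Isolate.immediate);
  return isolate;
}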
Am I wrong? Has anyone tried doing something similar?