I am working on a project where I need to live-stream video from a webcam.
I work with web sockets, and I currently use the AForge library to capture frames (Bitmaps) from my webcam.
Once I have a bitmap, I convert it to a byte array and send it directly to the client over the websocket. So I send one bitmap at a time, but I am not sure that is the right approach. Can someone help me?
private void video_NewFrameSync(object sender, NewFrameEventArgs eventArgs)
{
    // Clone the frame because AForge reuses the underlying buffer
    using (Bitmap bitmap = (Bitmap)eventArgs.Frame.Clone())
    {
        if (webSocket != null && webSocket.State == WebSocketState.Open)
        {
            byte[] byteArray = BitmapToByteArray(bitmap);
            var buffer = new ArraySegment<byte>(byteArray);
            // Blocking with .Wait() here stalls the capture thread until the send completes
            webSocket.SendAsync(buffer, WebSocketMessageType.Binary, true, CancellationToken.None).Wait();
        }
    }
}
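For reference, `BitmapToByteArray` is not shown above; a minimal sketch of one possible implementation, encoding each frame as JPEG so the per-frame payload stays small (the JPEG choice is my assumption, not something fixed by the rest of the code), could look like this:

```csharp
using System.Drawing;
using System.Drawing.Imaging;
using System.IO;

// Hypothetical implementation of the BitmapToByteArray helper used above.
// Saving as JPEG compresses the frame; sending raw/BMP data instead would
// make each websocket message far larger.
private static byte[] BitmapToByteArray(Bitmap bitmap)
{
    using (var stream = new MemoryStream())
    {
        bitmap.Save(stream, ImageFormat.Jpeg); // compress the frame
        return stream.ToArray();
    }
}
```

Each call produces one self-contained image, which is then sent as a single binary websocket message in the handler above.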
I also suspect the client won't even be able to tell that these frames are meant to be a video stream.