My goal is to stream video from my PC to a smartphone.
However, I want the latency between the two to be minimal.
VIDEO -> ENCODING -> TRANSFER -> DECODING -> VIDEO
I already know that the transfer will take approximately 50ms.
What I want to know is which encoding/decoding methods exist and how fast they are. The decoding smartphone will of course have limited computing power, and limited bandwidth as well. As far as I understand, spending less time on encoding/decoding generally means higher bandwidth usage (less compression). The whole operation above should take at most 200 ms, preferably less, which leaves roughly 150 ms for encoding plus decoding once the ~50 ms transfer is accounted for.
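To make that speed/bandwidth trade-off concrete, here is a minimal sketch of what I imagine the encoding side could look like, driven from C# (this is not my actual setup; I used VLC). It assumes ffmpeg is installed and on the PATH, a Windows PC (gdigrab screen capture), and a placeholder phone address of 192.168.1.50:1234.

```csharp
using System.Diagnostics;

class StreamLauncher
{
    static void Main()
    {
        // Assumptions: ffmpeg is on the PATH, the PC runs Windows (gdigrab screen
        // capture), and the phone listens on 192.168.1.50:1234 -- all placeholders.
        // "-preset ultrafast" trades compression efficiency (more bandwidth) for
        // encoding speed; "-tune zerolatency" disables look-ahead buffering so the
        // encoder does not hold on to future frames before emitting output.
        var args = string.Join(" ", new[]
        {
            "-f gdigrab -framerate 30 -i desktop",   // capture the desktop as the video source
            "-c:v libx264 -preset ultrafast -tune zerolatency",
            "-f mpegts udp://192.168.1.50:1234"      // push an MPEG-TS stream to the phone
        });

        var ffmpeg = Process.Start(new ProcessStartInfo("ffmpeg", args)
        {
            UseShellExecute = false
        });
        ffmpeg.WaitForExit();
    }
}
```

Switching the preset towards `slow` would reduce the bitrate but add encoding time, which is exactly the trade-off I am asking about numbers for.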
EDIT:
@gnat I’m just looking for some numbers on how fast a coder/decoder can work. I’m not very experienced with this kind of theory at all, seeing as I only know very basic C#. That is why I asked here, and not on Stack Overflow. I’ve searched around quite a lot but haven’t really found anything about encoding/decoding speed at all; all I wanted was some example numbers like the ones Per was kind enough to provide.
Other than that, this was a fairly theoretical question, as I have used VLC media server for the purpose above. I just found it easier to ask with a simple example.