I am trying to stream some video to a browser using WebRTC. To accomplish this, I have a GStreamer instance that encodes the video and sends it via RTP to a server, which simply wraps it in WebRTC for display in the browser. If I use VP8 (vp8enc), this works as expected. However, when using H.264, my browser does not decode any frames:
gst-launch-1.0 videotestsrc ! videoscale ! video/x-raw,width=640,height=360 ! x264enc ! rtph264pay ! udpsink host=127.0.0.1 port=6001 # < my forwarding server
Furthermore, if I run my pipeline on my Khadas VIM4 and use its hardware H.264 encoder (amlvenc) instead of x264enc, with the exact same pipeline otherwise, the video is displayed (at least in Firefox).
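For reference, the pipeline on the VIM4 looks roughly like this (only the encoder element is swapped, no amlvenc-specific properties set, and the udpsink still pointing at my forwarding server):

gst-launch-1.0 videotestsrc ! videoscale ! video/x-raw,width=640,height=360 ! amlvenc ! rtph264pay ! udpsink host=127.0.0.1 port=6001 # < my forwarding server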
What could be the difference between these two encoders (which both output H.264 video), or how could I find the cause of this problem? I tried looking through the browser's WebRTC debugging pages, but that did not really help me…
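If it helps with narrowing this down: I can also dump the caps each encoder negotiates (profile, stream-format, and so on) by running the pipeline in verbose mode, e.g. with a fakesink in place of my forwarding server:

gst-launch-1.0 -v videotestsrc ! videoscale ! video/x-raw,width=640,height=360 ! x264enc ! rtph264pay ! fakesink

but I am not sure which of the reported fields would actually matter to the browser.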