Why does video resolution change when streaming from Android via WebRTC


getUserMedia constraints only affect the media requested from the browser to the hardware and returned as a stream. getUserMedia constraints don't have any effect on what is done to that stream afterwards (i.e., when it's streamed over a connection). The degradation you're seeing happens in the PeerConnection layer, not in the getUserMedia layer. Degradation is triggered by the WebRTC implementation when hardware and bandwidth statistics indicate low performance, and it is negotiated by both sides.

[Hardware] <-   getUserMedia   -> [javascript client] <- PeerConnection -> [another client]
           <- 640x480 captured ->                     <-  320x240 sent  ->
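
To make that split concrete, here is a minimal sketch in plain browser JavaScript (function and variable names are just for illustration): the constraints you pass to getUserMedia describe the captured stream, while whatever the RTCPeerConnection actually sends is adapted separately.

    // Minimal sketch: constraints govern capture, not transmission.
    async function start() {
      // Ask the hardware for roughly 640x480 video.
      const stream = await navigator.mediaDevices.getUserMedia({
        video: { width: { ideal: 640 }, height: { ideal: 480 } }
      });

      const [track] = stream.getVideoTracks();
      // Reports what was actually captured, e.g. { width: 640, height: 480, ... }
      console.log('captured:', track.getSettings());

      // The PeerConnection layer decides what goes over the wire; it may
      // scale the track down (e.g. to 320x240) when bandwidth or CPU
      // statistics look poor, regardless of the constraints above.
      const pc = new RTCPeerConnection();
      pc.addTrack(track, stream);
    }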

You'll have to dig into the source code of each implementation for documentation and evidence of exactly how it's done, but here are some references to the behavior:

From the O'Reilly chapter on WebRTC:

The good news is that the WebRTC audio and video engines work together with the underlying network transport to probe the available bandwidth and optimize delivery of the media streams. However, DataChannel transfers require additional application logic: the application must monitor the amount of buffered data and be ready to adjust as needed.

...

WebRTC audio and video engines will dynamically adjust the bitrate of the media streams to match the conditions of the network link between the peers. The application can set and update the media constraints (e.g., video resolution, framerate, and so on), and the engines do the rest—this part is easy.
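
For the DataChannel case mentioned in the excerpt, the monitoring is something your application has to do itself. A rough sketch of the usual approach using the standard bufferedAmount / onbufferedamountlow API (the threshold values are illustrative, and pc is assumed to be an existing RTCPeerConnection):

    // Sketch: back off when the channel's send buffer grows too large.
    const dc = pc.createDataChannel('file-transfer');
    const MAX_BUFFERED = 1 * 1024 * 1024;       // illustrative 1 MB cap
    dc.bufferedAmountLowThreshold = 256 * 1024; // resume below 256 kB

    function sendChunk(chunk) {
      if (dc.bufferedAmount > MAX_BUFFERED) {
        // Too much queued; wait for the buffer to drain before sending more.
        dc.onbufferedamountlow = () => {
          dc.onbufferedamountlow = null;
          dc.send(chunk);
        };
      } else {
        dc.send(chunk);
      }
    }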
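As for the "set and update the media constraints" part, the knobs you can turn from application code look roughly like this. Again a sketch, assuming it runs in an async context with the pc from the earlier example; exact support varies by browser:

    // Sketch: nudge capture and encoding from the application side.
    const [sender] = pc.getSenders().filter(s => s.track && s.track.kind === 'video');

    // Update the capture constraints on the live track.
    await sender.track.applyConstraints({ width: 320, height: 240, frameRate: 15 });

    // Or cap the encoder directly via the sender's parameters.
    const params = sender.getParameters();
    if (!params.encodings) params.encodings = [{}];
    params.encodings[0].maxBitrate = 300000;       // ~300 kbps cap
    params.encodings[0].scaleResolutionDownBy = 2; // send at half resolution
    await sender.setParameters(params);

Even with these set, the engines still adapt downward on their own when the link degrades; the application-side settings act as upper bounds, not guarantees.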