
Google Chrome does not autoplay HTML5 video on mobile


<video autoplay loop autobuffer muted playsinline>
    <source src="video/video-hat.mp4" type="video/mp4">
</video>


The problem is that Google wants users to initiate any media playback themselves. If you debug Chrome on your device, you will see the warning "Failed to execute 'play' on 'HTMLMediaElement': API can only be initiated by a user gesture." That means you need to tie the video initialization to a user interaction, for example a click event.
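A minimal sketch of wiring play() to a click, as suggested above. The function name and element ids are my own, not part of the original answer; note that modern Chrome returns a Promise from play(), which should be handled so a rejection doesn't go uncaught:

```javascript
// Sketch: start playback from a user gesture and handle the Promise
// that modern browsers return from play(). Older browsers return
// undefined instead, hence the guard.
function playOnGesture(video, trigger) {
  trigger.addEventListener('click', function () {
    var p = video.play();
    if (p !== undefined) {
      p.then(function () {
        console.log('playback started');
      }).catch(function (err) {
        console.log('play rejected:', err.name);
      });
    }
  });
}

// Usage with assumed markup:
// playOnGesture(document.getElementById('hat-video'),
//               document.getElementById('play-button'));
```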


There doesn't appear to be any great info on this, so I thought I'd post my findings.

I've been debugging HTML5 video playback on desktop Chrome and on mobile: an Android 5.0.1 Samsung S4 with Chrome 61 and the embedded browser, plus Safari 9 and 11, using an automatic JavaScript play/pause directive written in AngularJS (below). The video is embedded in a carousel, so it is sometimes visible and sometimes not. In summary:

  • I recommend providing both webm (VP8/Vorbis) and mp4 (H.264/AAC) formats. These are the most widely supported formats and give equivalent quality at the same bitrate; ffmpeg can encode both.
  • Chrome mobile seems to prefer webm when it can get it, so list that source first.
  • If a browser plays a file when you point it directly at the file URL, that does not mean it will play it when embedded in a video tag, though a successful direct play does confirm the format and codecs are supported. Chrome mobile seems very picky about video sources whose resolution is too high.
  • Safari (and probably iOS) will not play a video unless it is served by a server that supports byte ranges. Apache, nginx and Amazon S3, for example, do support them, but many smaller web servers (such as WSGI servers) do not.
  • The order of the sources matters more than the source media attribute. Always list low-resolution versions of a video first. The example below uses 1920x1080 and 1280x720. If the mobile browser encounters a source that is "too high-res", it seems to stop processing the remaining sources and falls back to the poster.
  • Having a controls attribute and playing manually, versus playing through JavaScript, doesn't appear to make any difference.
  • The muted attribute stops Android from putting a little speaker icon in the status bar when the video plays off-screen, even when the video has no audio. As a side note, think hard about your audience if you intend to autoplay video with sound; personally I think it's a bad idea.
  • The preload attribute doesn't seem to make much difference; the browser tends to preload the selected video's metadata automatically anyway.
  • A source type attribute does not stop the video from playing. If anything, it helps the browser choose the best source.
  • The JS video.oncanplay event is the best way to tell whether the video tag has succeeded. If you don't get it, the video won't play, and the browser won't tell you why.
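To double-check the webm-first ordering on a given browser, you can ask the browser itself which formats it claims to support. A small sketch (the formatSupport helper is mine; the MIME/codec strings are the standard ones for VP8/Vorbis and H.264/AAC):

```javascript
// Sketch: query the browser's codec support via canPlayType().
// Each value comes back as "", "maybe" or "probably".
function formatSupport(video) {
  return {
    webm: video.canPlayType('video/webm; codecs="vp8, vorbis"'),
    mp4:  video.canPlayType('video/mp4; codecs="avc1.42E01E, mp4a.40.2"')
  };
}

// In a browser console:
// formatSupport(document.createElement('video'));
```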

HTML:

<video class="img-responsive-upscale ng-scope" video-auto-ctrl loop muted preload poster="0022.png">
  <source src="vid_small.webm" media="(max-width: 1280px)" type="video/webm">
  <source src="vid_small.mp4" media="(max-width: 1280px)" type="video/mp4">
  <source src="vid.webm" media="(max-width: 1920px)" type="video/webm">
  <source src="vid.mp4" type="video/mp4">
  <img src="0022.png" alt="something"
       title="Your browser does not support the <video> tag">
</video>

Javascript:

<script type="text/javascript">
angular.module('myproducts.videoplay', []).directive('videoAutoCtrl', function() {
  return {
    require: '^uibCarousel',
    link: function(scope, element, attrs) {
      var video = element[0];
      var canplay = false;
      var playing = false;
      var rs = ["HAVE_NOTHING", "HAVE_METADATA", "HAVE_CURRENT_DATA", "HAVE_FUTURE_DATA", "HAVE_ENOUGH_DATA"];
      var ns = ["NETWORK_EMPTY", "NETWORK_IDLE", "NETWORK_LOADING", "NETWORK_NO_SOURCE"];

      function vinfo() {
        console.log("currentSrc = " + video.currentSrc);
        console.log("readyState = " + rs[video.readyState]);
        console.log("networkState = " + ns[video.networkState]);
        bufinfo();
      }

      function bufinfo() {
        // video.buffered is a TimeRanges object
        var tr = video.buffered;
        if (tr.length > 0) {
          var ranges = "";
          for (var i = 0; i < tr.length; i++) {
            ranges += tr.start(i) + '-' + tr.end(i);
            if (i + 1 < tr.length) {
              ranges += ', ';
            }
          }
          console.log("buffered time ranges: " + ranges);
        }
      }

      video.onerror = function () {
        console.log(video.error);
      };

      video.oncanplay = function () {
        canplay = true;
        if (!playing) {
          console.log("canplay!");
          vinfo();
        }
      };

      function playfulfilled(v) {
        console.log("visible so playing " + video.currentSrc.split('/').pop());
        playing = true;
      }

      function playrejected(v) {
        console.log("play failed", v);
      }

      function setstate(visible) {
        if (canplay) {
          if (visible) {
            var p = video.play();
            if (p !== undefined) {
              p.then(playfulfilled, playrejected);
            }
          } else if (playing) {
            video.pause();
            console.log("invisible so paused");
            playing = false;
          }
        } else {
          console.log("!canplay, visible:", visible);
          vinfo();
        }
      }

      // Because $watch calls $parse on the 1st arg, the property doesn't
      // need to exist on first load
      scope.$parent.$watch('active', setstate);
    }
  };
});
</script>