Instructions to do WebM live streaming via DASH

This page describes the recommended way to create, stream, and play back live WebM content using DASH.

Prerequisites

1) FFmpeg - You will need FFmpeg version 2.8 or higher for some of the DASH features to work. You can either download a nightly static build from https://www.ffmpeg.org/download.html or build FFmpeg yourself from the git repository.

2) A web server.

3) Shaka Player (for playback on Web) - https://github.com/google/shaka-player

4) ExoPlayer (for playback on Android) - https://github.com/google/ExoPlayer

5) Dash.js (for playback on Web) - https://github.com/Dash-Industry-Forum/dash.js

Creating Live Content

Encode Video and Audio

FFmpeg can be used to create the audio and video streams for live DASH. This will seem familiar if you have used FFmpeg to create VOD (non-live) DASH streams.

For live streaming WebM files using DASH, the video and audio streams have to be non-muxed and chunked. For more information on what this means, see this link.
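Concretely, "non-muxed" means audio and video are written as separate streams, and "chunked" means each stream consists of an initialization header plus a numbered series of chunk files. A sketch of the on-disk layout produced by the commands on this page (file names match those commands):

```shell
# Layout of the output directory once encoding starts:
#
#   /var/www/webm_live/glass_360.hdr     - video header (initialization data)
#   /var/www/webm_live/glass_360_1.chk   - video chunk 1
#   /var/www/webm_live/glass_360_2.chk   - video chunk 2, and so on
#   /var/www/webm_live/glass_171.hdr     - audio header
#   /var/www/webm_live/glass_171_1.chk   - audio chunk 1, and so on
```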

We are going to use the following encoding settings with libvpx for the VP9 encoder:

VP9_LIVE_PARAMS="-speed 6 -tile-columns 4 -frame-parallel 1 -threads 8 -static-thresh 0 -max-intra-rate 300 -deadline realtime -lag-in-frames 0 -error-resilient 1"

Now, the video and audio streams can be generated by using a command as follows:

ffmpeg \
  -f v4l2 -input_format mjpeg -r 30 -s 1280x720 -i /dev/video0 \
  -f alsa -ar 44100 -ac 2 -i hw:2 \
  -map 0:0 \
  -pix_fmt yuv420p \
  -c:v libvpx-vp9 \
    -s 1280x720 -keyint_min 60 -g 60 ${VP9_LIVE_PARAMS} \
    -b:v 3000k \
  -f webm_chunk \
    -header "/var/www/webm_live/glass_360.hdr" \
    -chunk_start_index 1 \
  /var/www/webm_live/glass_360_%d.chk \
  -map 1:0 \
  -c:a libvorbis \
    -b:a 128k -ar 44100 \
  -f webm_chunk \
    -audio_chunk_duration 2000 \
    -header "/var/www/webm_live/glass_171.hdr" \
    -chunk_start_index 1 \
  /var/www/webm_live/glass_171_%d.chk

This command captures the video and audio from the webcam and microphone respectively and encodes them into a Live WebM Stream.

Things to note: the video stream has no explicit chunk duration; the webm_chunk muxer starts a new video chunk at each keyframe, so "-keyint_min 60 -g 60" at 30 fps yields 2-second chunks. The audio chunk duration is set explicitly (in milliseconds) with "-audio_chunk_duration 2000".
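As a quick sanity check on these settings (values taken from the encoding command above), the chunk duration implied by the keyframe interval must match the "chunk_duration_ms" value used later when creating the manifest:

```shell
# Chunk duration implied by the encoder settings; it must match the
# -chunk_duration_ms value passed when creating the manifest (2000 ms).
FRAME_RATE=30         # from -r 30
KEYFRAME_INTERVAL=60  # from -g 60 / -keyint_min 60
CHUNK_MS=$(( KEYFRAME_INTERVAL * 1000 / FRAME_RATE ))
echo "$CHUNK_MS"      # 2000
```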

Create the DASH Manifest

FFmpeg can be used to create the DASH Manifest by passing the header file created from the previous step as input. Here's a sample command:

ffmpeg \
  -f webm_dash_manifest -live 1 \
  -i /var/www/webm_live/glass_360.hdr \
  -f webm_dash_manifest -live 1 \
  -i /var/www/webm_live/glass_171.hdr \
  -c copy \
  -map 0 -map 1 \
  -f webm_dash_manifest -live 1 \
    -adaptation_sets "id=0,streams=0 id=1,streams=1" \
    -chunk_start_index 1 \
    -chunk_duration_ms 2000 \
    -time_shift_buffer_depth 7200 \
    -minimum_update_period 7200 \
  /var/www/webm_live/glass_live_manifest.mpd

Make sure that the "chunk_start_index" and "chunk_duration_ms" parameters match the values used in the previous step. Also ensure that the previous FFmpeg command has actually written the header files (this usually happens almost instantly) before running this one. If you are wrapping both commands in a script, add at least one second of sleep between them.
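If you do wrap both commands in a script, a slightly more robust alternative to a fixed sleep is to poll until the header file appears. A minimal sketch (the wait_for_file helper is illustrative, not part of FFmpeg):

```shell
#!/bin/sh
# Poll until a file exists, up to a timeout in seconds.
# (Illustrative helper; not part of FFmpeg.)
wait_for_file() {
  file="$1"; timeout="$2"; waited=0
  while [ ! -f "$file" ]; do
    if [ "$waited" -ge "$timeout" ]; then
      return 1
    fi
    sleep 1
    waited=$((waited + 1))
  done
}

# Usage in a wrapper script:
#   ffmpeg ... &                     # encoding command from the previous step
#   wait_for_file /var/www/webm_live/glass_360.hdr 10 || exit 1
#   ffmpeg -f webm_dash_manifest ... # manifest command from this step
```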

Streaming Live Content

Shaka Player is an open source media player built on top of the HTML5 video API. It can play back live WebM via DASH, provided the browser supports Media Source Extensions and WebM/VP9 playback.

Clone and Build Shaka Player:

Clone the repository from https://github.com/google/shaka-player and run its build script as described in the Shaka Player README. The build generates "shaka-player.compiled.js"; copy that file to your web server's directory.

Here's a sample piece of HTML and JavaScript that uses Shaka Player for WebM live playback:

<html>

 <script src="shaka-player.compiled.js"></script>

 <script>

   function startVideo(mpd) {

     shaka.polyfill.installAll();

     var video = document.getElementById('video');

     var player = new shaka.player.Player(video);

     var source = new shaka.player.DashVideoSource(mpd, null, null);

     player.load(source);

   }

 </script>

 <body onload="startVideo('glass_live_manifest.mpd');">

   <video id="video" controls autoplay></video>

 </body>

</html>

Android (ExoPlayer)

ExoPlayer is an extensible open source media player built on top of Android's media APIs. ExoPlayer natively supports WebM live streams via DASH. Please refer to ExoPlayer's sample app to learn how to use ExoPlayer to play back live streams via DASH.

To play back VP9 video on Android devices running versions prior to KitKat, the native ExoPlayer VP9 extension can be used.

Web (Dash.js)

Dash.js is an open source media player built on top of the HTML5 video API.

Simply point Dash.js at the URL of your manifest and it will adaptively stream the live content. Here's a sample piece of HTML and JavaScript that uses Dash.js:

<html>

 <script src="dash.all.js"></script>

 <script>

   function startVideo(mpd) {

     var video, context, player;

     video = document.querySelector('video');

     context = new Dash.di.DashContext();

     player = new MediaPlayer(context);

     player.startup();

     player.attachView(video);

     player.setAutoPlay(true);

     player.attachSource(mpd);

   }

 </script>


 <body onload="startVideo('glass_live_manifest.mpd');">

   <video controls></video>

 </body>

</html>

For more information on the Dash.js API and configuration options, refer to the Dash.js wiki: https://github.com/Dash-Industry-Forum/dash.js/wiki

[Optional] Time Sync Between Server and Clients

Note: This section applies only to ExoPlayer and Dash.js.

DASH live playback works better if the server and client clocks are in sync. To achieve that, the client requests a page from the server, and the server responds with 200 OK and the current server UTC date and time, in ISO 8601 format, as the response body. The strftime format specifier for this format is "%FT%TZ". A sample time page can be found at http://time.akamai.com/?iso
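The expected response body can be previewed from a shell; the date command here only illustrates the format, it is not the time server itself:

```shell
# Print the current UTC time in the ISO 8601 format DASH clients expect,
# e.g. 2015-10-21T07:28:00Z
date -u +%FT%TZ
```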

Once you have the time server ready, you can include the UTCTiming URL in the manifest by passing:

-utc_timing_url "<url_to_iso_time_page_as_described_above>"