Adaptive Streaming via Amazon AWS using S3, CloudFront and Elastic Transcoder
I was recently brought on board to help develop streaming music and video for a new yoga-based subscription site that will be called Love Tribe Vibes.
It’s a Drupal-based site and there are three people on the team so far – the project manager in Nashville, a front end designer in NYC and a coder in Houston.
The first project I took on is enabling adaptive streaming, so that users get video streams at as high (or as low) a resolution as their connection allows.
The project was already making use of JWPlayer, which seems to be a very reasonable choice in terms of flexibility (Flash and HTML5, so it works on both iOS and desktop devices) and price.
This article by Vic Dorfman covers getting set up for “real streaming” (meaning that users can stream starting from any arbitrary section of the video) by hosting the video in an Amazon S3 bucket and delivering it via their CloudFront service.
But there’s another layer of complexity to getting adaptive streams going, which as far as I can tell requires a JWPlayer account as opposed to just downloading the free version. Fortunately we already had an account. Adaptive streaming requires that there are multiple versions of the video to request, and they need to be broken up into segments of the same length, so that if a user needs to switch from one bitrate to another mid-stream the video will remain in sync. This is accomplished via Amazon’s Elastic Transcoder, and there’s a post on the JWPlayer site that explains the process fairly well, plus a YouTube video that also helps clarify it.
Basically you create three buckets in the S3 account: one for the source video files, and two for the output files (video and thumbnails). In Elastic Transcoder, create a pipeline that connects the input and output buckets; that pipeline is assigned when you set up a job to automatically generate the various thumbnails and videos based on selected presets.
Two steps the video and blog post leave out: it’s necessary to add a crossdomain.xml file to each bucket in order for the player to be able to access the data, and to set permissions so that the index file for each video is open to the public for open/download.
My crossdomain.xml files are as simple as:
<cross-domain-policy>
<allow-access-from domain="*"/>
</cross-domain-policy>
For each file (this might also work at the top bucket/directory level), navigate to the file in its bucket (video_folder/index.m3u8), select it, and under Actions select Properties. Under Permissions, add a new permission allowing Everyone to Open/Download. The crossdomain.xml files also need to be publicly accessible for download and reading.
Now you can log in to the JWPlayer account and publish the adaptive streaming videos using the URLs (Domain Names) for the buckets, followed by the directory and file for each video and thumbnail (poster). Something like this: http://dnm3555v1ze7vg.cloudfront.net/directory_named_for_video/index.m3u8 (note we’re already in the hlsstreams directory, which is what this specific CloudFront link points to). Then follow the rest of the JWPlayer instructions for embedding in a page.
With a lot of videos I’ll want more automated approaches to uploading and processing each video via Amazon, publishing via JWPlayer, and embedding in a specific page. Not sure what that will look like yet, though.