Monday, September 29, 2014

Live Streaming with Windows Azure Media Services


In this blog we will look at the Live Streaming capability of Windows Azure Media Services (WAMS), which was recently released in preview on September 10th.
Prior to this capability there were other ways to provide live streaming, such as IIS Media Services, but its configuration becomes complex when you need a scalable and reliable solution (with backup and failover paths). Now with WAMS, "you now have everything you need to quickly build great, extremely scalable, end-to-end media solutions for streaming on-demand video to consumers on any device."

Architecture:
 At a high level, the Live Streaming architecture consists of three components: Channel/Program, Storage and Streaming Endpoints.

Channel/Program - Channels enable live streaming and include the ingest point for your live encoder. As of today, RTMP and fragmented MP4 (Smooth Streaming) are the supported ingest protocols. A program is a logical component inside a channel; it publishes the received data for streaming and can also archive the content.

Streaming Endpoint and Streaming Units- a Streaming Endpoint provides you with a URL from which you can pull your live assets.  Streaming Endpoints also provide dynamic packaging capabilities and secure the delivery of the streams.

Storage- Programs use Azure storage for storing live archives. On-demand streaming and encoding services also use storage.

How to configure Live Streaming:

Steps:
  • Create a new Media Service by clicking New -> Media Service -> Quick Create
  • Once the service is created, click the service name, which will show the screen below. There are new tabs such as Streaming Endpoints and Channels. Let’s look at each of these individually

Streaming Endpoints: Media Services enables you to add multiple streaming endpoints to your account and to configure these endpoints.
Note: 
  • When you create your Media Service, a default streaming endpoint is created for you, but you can add more and configure them
  • The default streaming endpoint cannot be deleted

To configure the streaming endpoint, click the default endpoint (or the new one you just added). Under the Configure tab, “Streaming allowed IP addresses” lets you add rules specifying which IPs can access the published streaming endpoint. If no rule is provided, the endpoint has public access.
Scaling details can be referred to here.
Once you have configured the Streaming Endpoint, make sure the status of the endpoint is Running.
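The allow-list behaves like a simple IP filter. Here is a minimal Python sketch of that rule logic (a hypothetical helper to illustrate the behavior, not the WAMS API):

```python
import ipaddress

def is_allowed(client_ip, allowed_rules):
    """Return True if client_ip matches any allow rule.
    An empty rule list means public access, mirroring the portal behavior."""
    if not allowed_rules:
        return True
    ip = ipaddress.ip_address(client_ip)
    return any(ip in ipaddress.ip_network(rule) for rule in allowed_rules)

# With no rules configured, everyone gets access:
print(is_allowed("203.0.113.7", []))                   # True
# With a CIDR rule, only matching addresses pass:
print(is_allowed("203.0.113.7", ["203.0.113.0/24"]))   # True
print(is_allowed("198.51.100.1", ["203.0.113.0/24"]))  # False
```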
Channels: A Channel is used to stream live content from an Origin over HTTP. For example, you can stream a sporting event to an audience using various devices such as a phone, tablet, computer, or Xbox. The stream employs adaptive bitrate streaming, allowing clients to request different bitrates on a moment-to-moment basis. The stream must be produced by a live encoder, and the encoder must be configured to connect to the Ingest URL.

Ingest URL - When the Ingest URL is available, copy it and connect your encoder to the URL.


Preview URL - Once a video source is connected through the Ingest URL, you can monitor the live stream using the Preview URL. The event does not start streaming until you push Start Streaming.

Start Streaming - The following actions occur after clicking Start Streaming:
  1. A new Program is created
  2. A new Asset is created and associated with the Program
  3. The Channel starts
  4. The Publish URL is created and appears in the UI
  • Click “Create New Channel”
  • Once created, the ingest and preview URLs will be shown
  • Copy the Ingest URL and paste it into the input encoder (for example, I am using Expression Encoder here)
  • Now we are almost ready to send live streams; we just need one more step to publish these live streams to users.
    Go to the Azure Portal and select the Media Service -> Channels -> click the Start Streaming button
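The Start Streaming sequence described above can be sketched in Python. Every function and URL here is a hypothetical stand-in that models what the portal does behind the scenes, not a real WAMS call:

```python
def start_streaming(channel):
    """Hypothetical sketch of the 'Start Streaming' sequence."""
    program = {"name": f"{channel}-program"}   # 1. a new Program is created
    asset = {"name": f"{channel}-asset"}       # 2. a new Asset is created
    program["asset"] = asset                   #    and associated with the Program
    channel_state = "Running"                  # 3. the Channel starts
    # 4. the Publish URL is created and appears in the UI (illustrative URL shape)
    publish_url = f"https://example.streaming.net/{channel}/manifest"
    return program, channel_state, publish_url

program, state, url = start_streaming("mychannel")
print(state)  # Running
```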

Live Streaming Video Snapshots:
The image below shows the encoder pushing live streams, which automatically play on the Azure website.

I have created a video log which can be found here.

Pricing Details:
Pricing information for Live Streaming can be found here.

Saturday, August 31, 2013

Windows Azure Media Services

 
In this blog, I will describe how we used Windows Azure Media Services (WAMS) core features to deliver content targeted at various devices.
 
Media Services Implementation:
 
Here is a high level description of individual components:
  • File System: Represents media files stored in different formats (i.e. formats acceptable to the media service; see here for information on the accepted media formats)
  • WAMS Storage: Stores all the media in separate containers
  • Media Service application: This component polls for any files uploaded to the media store. It also creates an entry in a SQL Azure table containing the locator URLs for each encoded video.
  • Cloud Service: A basic WCF service that gets data from the SQL Azure table. This information supports native players on Windows 8, Windows Phone 7, and Surface devices.
  • Azure Website: The website detects the device type and plays the correct video/audio content
  • Devices: iPad, tablets, etc. connect to the Azure website and the compatible content is played
Implementation Strategy:
The implementation was focused towards the following areas:
  • Supporting videos for different devices/platforms
  • Supporting both Adaptive and Progressive Streaming
  • YouTube like format (clicking on video thumbnail should play the video)
  • Smooth Streaming using Player Framework (showing captions, network & bitrates info., Browser info., Video name etc.)
  • Authentication & Authorization
Workflow of the application (Backend process):
There are two ways to deliver content: the traditional approach and dynamic packaging. We will now look at both options and the pros/cons of one vs. the other.
 
Traditional Encode and Package:
 
 
Figure-1: Traditional Approach to deliver content
 
 
Code Link: http://sdrv.ms/18drgRz (code for the backend process using the traditional approach to upload and encode media to WAMS storage)
 
Below are the implementation steps which you will see once you download the code from the above link.
  • Step1: First an empty Asset is created, and the selected input media (from either the file system or a file share) is uploaded to the Asset container.
  • Step2: The uploaded Asset referenced from step 1 is passed to the encoding routine, which creates an .mp4 format and goes through the tasks below:
    • Declare a new job
    • Create a task with the encoding details, using the string preset “H264 Broadband 720p”
    • Specify the input asset to be encoded
    • Add an output asset to contain the results from the job
    • Submit the job
    • Check the job progress at specified interval
    • On success, get a reference to the output asset from the job
  • Step3: The output Asset from step 2 is now passed to the encoding routine, which creates a Smooth Streaming format. The following tasks are executed:
    • Read the configuration file “MediaPackager_MP4ToSmooth.xml”
    • Declare a new job
    • Create a task with the Encoding details, using the configuration file
    • Specify the input asset to be encoded
    • Add an output asset to contain the results of the job
    • Submit the job
    • Check the job progress at specified interval
    • On success, get a reference to the output asset from the job
  • Step4: At this stage, the output Asset from step 3 is passed to the encoding routine, which creates an HLS format.
    • Read the configuration file “MediaPackager_SmoothToHLS.xml”
    • Identify the smooth stream file from the output Asset of step 3 and mark it as the primary file using the IsPrimary property
    • The rest of the steps are the same as in the step 3 encoding routine
  • Step5: Generate a thumbnail using the Asset created in step 2
    • The same steps are followed as in step 3, except the configuration file read is named “Thumbnails.xml”
  • Step6: Create an Origin Locator URL by using the CreateLocator method for the Smooth Streaming and HLS assets on an Origin Server
  • Step7: Create a SAS Locator URL by using the CreateSasLocator method for the Thumbnail and the .mp4
  • Step8: Add the URLs (Origin Locator for Smooth Streaming & HLS, SAS Locator for Thumbnail & mp4) to the SQL Azure table
  • Steps 1 to 8 are repeated for each file uploaded
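Several of the steps above reduce to the same submit-then-poll pattern ("Submit the job", "Check the job progress at specified interval"). Here is a minimal, generic Python sketch of that pattern; it models the polling loop only and does not use the actual WAMS SDK calls:

```python
import time

def wait_for_job(get_state, interval=0.01, timeout=5.0):
    """Poll a job's state at a fixed interval until it reaches a terminal state."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        state = get_state()
        if state in ("Finished", "Error", "Canceled"):
            return state
        time.sleep(interval)
    raise TimeoutError("job did not complete in time")

# Simulate a job that finishes after three polls:
states = iter(["Queued", "Processing", "Finished"])
print(wait_for_job(lambda: next(states)))  # Finished
```

On success ("Finished"), the caller would then fetch the output asset reference, as in the steps above.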
Pros:
  • Can serve files immediately as the format is available
  • No special configuration required
Cons:
  • Dynamic encoding for different formats is not supported at runtime i.e. each format needs to be generated and stored
  • Storage is wasted as multiple formats of each file are created
  • Execution process is slow
Challenges:
  • Uploading very large media files
 
Dynamically Packaging Assets:
Using the dynamic packaging feature in WAMS, you only need to store an MP4 file in storage. Based on client requests, the MP4 file is dynamically packaged into HLS or Smooth Streaming.
 
Figure-2: Dynamically Packaging to deliver content
 
Pre-requisites:
You need to request at least one On-Demand Streaming reserved unit. This can be set up in the Azure management portal via the Scale tab. For pricing details, please refer here.
 
Note:
To take advantage of dynamic packaging, you must first get at least one On-demand Streaming reserved unit. For more information, see How to Scale a Media Service.
 
 
Code Link: http://sdrv.ms/19ykRym (code for the backend process with dynamic packaging to upload and encode media to WAMS storage)
 
Here are the implementation steps,
  • Step1: First upload the input media (either file system/file share) to the media store using the assets.Create method
  • Step2: Create a job which will EncodeToMultiBitrateMp4 
    • Declare a new job
    • Create a task with the Encoding details, using a string preset "H264 Adaptive Bitrate MP4 Set SD 16x9"
    • Specify the input asset to be encoded
    • Add an output asset to contain the results from the job
    • Submit the job
    • Check the job progress at specified interval
    • On success, get the job reference
    • Get a reference to the encoded mp4 media from the output Asset of the job
    • Get the streaming URLs for mp4, Smooth Streaming & HLS
  • Step3: Generate a thumbnail from the input file
    • Declare a new job
    • Create a task & provide task name, media processor and configuration file “Thumbnails.xml”
    • Specify the input asset to be encoded
    • Add an output asset to contain the results of the job
    • Submit the job
    • Check the job progress
  • Step4: Get SAS Locator URLs for the .mp4 and Thumbnail, and Origin Locator URLs for Smooth Streaming and HLS
  • Step5: Add the URLs (Origin Locator for Smooth Streaming & HLS, SAS Locator for Thumbnail & mp4) to the SQL Azure table
  • Steps 1 to 5 are repeated for each file uploaded
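With dynamic packaging, the per-protocol playback URLs are all derived from one origin locator plus the manifest name; the Smooth Streaming manifest is requested directly, and HLS is requested by appending a format qualifier. A small Python sketch (treat the exact URL shapes as illustrative, and the account/locator names as hypothetical):

```python
def streaming_urls(origin_locator, manifest_name):
    """Build per-protocol playback URLs from one origin locator.
    The .ism/Manifest and (format=m3u8-aapl) suffixes reflect the
    dynamic-packaging manifest formats; exact shapes are illustrative."""
    base = f"{origin_locator.rstrip('/')}/{manifest_name}.ism/Manifest"
    return {
        "smooth": base,                          # Smooth Streaming manifest
        "hls": base + "(format=m3u8-aapl)",      # HLS rendition of same asset
    }

urls = streaming_urls(
    "http://myaccount.origin.mediaservices.windows.net/locator-id", "video")
print(urls["smooth"])
print(urls["hls"])
```

This is why only the multi-bitrate MP4 needs to be stored: the packaging format is chosen per request by the URL, not at encode time.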
Pros:
  • Only the multi-bitrate MP4 file is stored; the rest of the formats are generated on the fly
  • Saves on storage space & cost
  • Execution process is fast
Cons:
  • Additional configuration of “On-demand Streaming reserved units”
Challenges:
  • Uploading very large media files
 
 
Workflow of the application (Frontend process):
 
Azure Website:
We have created a simple MVC and a Silverlight application. The Silverlight application uses Player Framework component and shows Network Bitrate, Video Name & other sets of information while playing the video.

The MVC application performs authentication using Membership API and shows Gallery of all the videos by pulling data from the SQL Azure table. Before the gallery is loaded, there is also a check to see which device is used by executing the following JavaScript snippet:
 
navigator.userAgent.toLowerCase(); [check for Android, iOS & Windows Phone]

Silverlight.isInstalled('<add specific Silverlight version no here>'); [check if Silverlight is installed]
 
Based on the device identified, the appropriate video player is loaded and the compatible video is played.
Example:
  • If the device has Silverlight installed, then Player Framework will be loaded & smooth streaming video is played.
  • If an Android device is detected, an HTML5 video player will play an .mp4 formatted video
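The detection logic above can be expressed as a pure function. This Python sketch mirrors the JavaScript user-agent check; the return labels are hypothetical names for the player choices the post describes:

```python
def detect_device(user_agent, silverlight_installed=False):
    """Classify a browser user-agent string to pick a player/format."""
    ua = user_agent.lower()
    if "android" in ua:
        return "html5-mp4"         # HTML5 player with progressive .mp4
    if "iphone" in ua or "ipad" in ua:
        return "hls"               # Apple devices get the HLS stream
    if "windows phone" in ua:
        return "windows-phone"     # native Windows Phone player
    if silverlight_installed:
        return "smooth-streaming"  # Player Framework plays Smooth Streaming
    return "html5-mp4"             # fallback

print(detect_device("Mozilla/5.0 (Linux; Android 4.4)"))
print(detect_device("Mozilla/5.0 (Windows NT 6.1)", silverlight_installed=True))
```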
Code Link: http://sdrv.ms/18drnN5 (code for the website with the custom Silverlight application)
 
Below are couple of screenshots from the Website.
  • Login Screen:
 
  • Gallery: Playing Smooth Streaming Video with Player Framework
 
  • Gallery: Playing .mp4 video on Android device
 
  • Gallery: Playing HLS video on an Apple iPad
 
 
Windows 8 & Surface:
We have also created an installer app for Windows 8 systems. Below are the installer link and the steps to install the application.
 
Installer Link: http://sdrv.ms/11EvE8n (Client side App.)
 
Code Link: http://sdrv.ms/1aKdntw (Cloud WCF Service)
 
Steps to install the Win 8 app. on your system
  • Step1: Install the app by locating the PowerShell script “Add-AppDevPackage” and running it as Administrator.
 
  • Step2: Go to start and you will see the application shortcut icon
 
  • Step3: The app will load with thumbnails of all the videos. Click any video and it will start playing.
    Note: The app calls the cloud service (WCF) to get the data from the SQL Azure table.

    A couple of things to note here: we are showing the streaming URL (video blob), network connectivity in SD/HD (Standard/High Definition) mode, a full-screen option, and the video name, along with the Player Framework playing the smooth stream video.
 
Windows Phone7:
To configure the project for a Windows Phone 7 device, the following steps are required.
  • Step1: In order to load the Windows Phone 7 project (as shared in the code link below), there are a couple of prerequisite components to set up; you can refer to the link here.
  • Step2: Once step 1 is completed, compile and run the solution; the emulator will load and show the UI.
  • Step3: The next step is to deploy and test the app on the Windows Phone 7 device. You can follow this link to deploy the application to the Windows Phone Store.
Code Link: http://sdrv.ms/18drwA9 (Win Phone 7 code) and http://sdrv.ms/1aKdntw (Cloud WCF Service)
References: