Can Of Code

The History of Video Decoding on Android

In this post I aim to give a brief overview of how video decoding on Android has changed across releases. This overview mainly focuses on what's available if MediaPlayer or VideoView doesn't fit your requirements.

 

From the beginning (Android 1.0)

The first media framework on Android came from PacketVideo and was called OpenCore.

 

Donut time (Android 1.6)

Android-Donut

Image source: http://www.huntlogo.com/android-donut-logo/

 

An updated version of OpenCore was released with the catchy title of "OpenCore 2.0". The new version of OpenCore included support for the OpenMax IL API, which enabled apps to make use of dedicated hardware encoders/decoders.

The inclusion of OpenMax support was a massive step in bringing high-resolution video to mobile devices. A dedicated hardware decoding/encoding chip takes the load off the CPU, which was essential in the days when a device like the HTC Hero had a CPU speed of 528 MHz.

That is all I know of OpenCore, and considering that it's only present on 0.1% of devices in the distribution figures, it's probably not worth targeting.

 

Eclair has Stagefright (Android 2.0 – 2.1)

Android-Eclair

Image source: http://www.huntlogo.com/android-eclair-logo/

 

The Stagefright library was introduced as a replacement for the previously mentioned OpenCore.

From the Java side you only get a handful of classes, which should use hardware decoding when available. If you want to use media formats that are not supported by Eclair you will need to go down the painful route of using private Android libraries.

When going down this route you will find plenty of people warning against it, the main reason being that there is no guarantee the libraries will be the same on every device.

The OMXCodec would be your best bet for decoding via hardware. A good demo of this can be found here. What that demo doesn't mention is how to include the private libraries and headers so you can actually use OMXCodec. One way is to copy libstagefright.so (among others) from an emulator into your project.

 

Gingerbread also has Stagefright (Android 2.3)

Android-Gingerbread

Image source: http://www.huntlogo.com/android-gingerbread-logo/

 

Not much relating to video changed in Gingerbread. One point of interest is that it became possible to use a Surface from native code via the NDK.

 

Honeycomb HLS (Android 3.0)

Android-Honeycomb-2

Image source: http://www.huntlogo.com/android-honeycomb-logo-2/

 

Although nobody uses Honeycomb, it did bring along support for HTTP Live Streaming (HLS). However, Android's implementation is a little flaky, especially when seeking.

 

Ice Cream Sandwich (Android 4.0)

Android-Ice-Cream-Sandwich

Image source: http://www.huntlogo.com/android-ice-cream-sandwich-logo/
 

Ice Cream Sandwich introduced OpenMax AL support via the NDK. One significant limitation of Android's implementation of OpenMax AL is that it only supports the decoding of an "MPEG-2 transport stream via a buffer queue". This rules out passing raw H.264 NALUs, or other media formats for that matter.

 

Jelly Bean brings MediaCodec (Android 4.1 – 4.3)

Android-Jelly-Bean-logo

Image source: http://www.huntlogo.com/android-jelly-bean-logo/

In Jelly Bean the MediaCodec API was introduced, which provides a Java API for encoding/decoding using both software and hardware codecs.

Although the introduction of MediaCodec gives the benefit of staying within the comfort of a Java environment, one frustration with it is the poor documentation, especially if you want to use your own demuxer instead of the MediaExtractor.
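To give a feel for the kind of work a custom demuxer has to do before it can feed MediaCodec, here is a minimal sketch (plain Java, class and method names are my own) that splits an Annex-B H.264 byte stream into individual NAL units by scanning for start codes. This is roughly the framing step you lose when you drop the MediaExtractor:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Minimal Annex-B splitter: finds 0x000001 / 0x00000001 start codes and
// returns each NAL unit payload with the start code stripped. A custom
// demuxer must do this kind of framing before queueing input buffers.
public class AnnexBSplitter {

    public static List<byte[]> split(byte[] stream) {
        List<byte[]> nalUnits = new ArrayList<>();
        int i = 0;
        int start = -1; // index of the first payload byte of the current NAL unit
        while (i + 2 < stream.length) {
            if (stream[i] == 0 && stream[i + 1] == 0 && stream[i + 2] == 1) {
                if (start >= 0) {
                    // Trim one trailing zero when the start code was 4 bytes long.
                    int end = (i > 0 && stream[i - 1] == 0) ? i - 1 : i;
                    nalUnits.add(Arrays.copyOfRange(stream, start, end));
                }
                start = i + 3;
                i += 3;
            } else {
                i++;
            }
        }
        if (start >= 0) {
            nalUnits.add(Arrays.copyOfRange(stream, start, stream.length));
        }
        return nalUnits;
    }
}
```

On Android each NAL unit (or access unit) would then be copied into a buffer obtained from `dequeueInputBuffer()` and submitted with `queueInputBuffer()`, along with a presentation timestamp your demuxer has to supply.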

A good example of the lack of documentation revolves around how codec-specific data should be handled. For example, with an H.264 stream you should put the SPS in a ByteBuffer with the key "csd-0" and the PPS with the key "csd-1". The best approach I found was to study the MediaFormat that is created by the MediaExtractor and then emulate the results in your project.
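As a sketch of what that emulation looks like: when you inspect the MediaFormat produced by MediaExtractor for an H.264 track, the "csd-0" and "csd-1" buffers each contain the NAL unit still prefixed with an Annex-B start code. The helper below builds such a buffer using only java.nio so it runs anywhere; the class name is my own, and the actual MediaFormat call is shown as a comment:

```java
import java.nio.ByteBuffer;

// Prepares H.264 codec-specific data the way MediaExtractor exposes it:
// "csd-0" holds the SPS and "csd-1" holds the PPS, each prefixed with
// the 4-byte Annex-B start code 0x00000001.
public class CodecSpecificData {

    private static final byte[] START_CODE = {0, 0, 0, 1};

    public static ByteBuffer wrapWithStartCode(byte[] nalUnit) {
        ByteBuffer csd = ByteBuffer.allocate(START_CODE.length + nalUnit.length);
        csd.put(START_CODE);
        csd.put(nalUnit);
        csd.flip(); // rewind so the decoder reads from position 0
        return csd;
    }
}
```

On Android you would attach the resulting buffers before configuring the decoder, e.g. `format.setByteBuffer("csd-0", spsBuffer)` and `format.setByteBuffer("csd-1", ppsBuffer)`, then pass the format to `MediaCodec.configure()`.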

A really helpful working example of the MediaCodec can be found here; it's worth keeping the site in your favorites as it also has a few other useful related posts.

Update: Android 4.3 has a new “modular DRM framework” for handling streams that make use of DRM.

 

Finally

That's my overview of video decoding on Android. You may notice I have missed a few of the smaller releases, mainly because nothing interesting changed in them. I am relatively new to both Android and video decoding, so if I have missed something important or made any incorrect statements please let me know in the comments.

4 thoughts on “The History of Video Decoding on Android”

  1. David Gerber says:

    Here’s my experience with the MediaCodec API: http://zapek.com/blog/android-mediacodec/

    Unfortunately you’re right about the codec-specific data part. Not only is there no documentation about it, but you cannot strip it from the buffers. The funny thing is that KitKat adds some API for adaptive streaming (resolution change on the fly) but there’s no way to do that with MediaCodec/MediaExtractor.

  3. Glenn says:

    The codec-specific data is documented here:
    http://developer.android.com/reference/android/media/MediaCodec.html under “Codec-specific Data”.
