Friday, June 28, 2013

How to build ffmpeg shared objects

A simple way of building ffmpeg shared objects.
Refer to this tutorial: https://ffmpeg.org/trac/ffmpeg/wiki/UbuntuCompilationGuide
That guide builds only static libraries.

So add the --enable-shared option on the command line, as shown below.

git clone --depth 1 git://git.videolan.org/x264.git
cd x264
./configure --prefix="$HOME/ffmpeg_build" --bindir="$HOME/bin" --enable-shared --disable-asm
make
make install
make distclean

cd ..
git clone --depth 1 git://github.com/mstorsjo/fdk-aac.git
cd fdk-aac
autoreconf -fiv
./configure --prefix="$HOME/ffmpeg_build" --enable-shared
make
make install
make distclean
cd ..

wget http://downloads.sourceforge.net/project/lame/lame/3.99/lame-3.99.5.tar.gz
tar xzvf lame-3.99.5.tar.gz
cd lame-3.99.5
./configure --prefix="$HOME/ffmpeg_build" --enable-nasm --enable-shared
make
make install
make distclean
cd ..

wget http://downloads.xiph.org/releases/opus/opus-1.0.2.tar.gz
tar xzvf opus-1.0.2.tar.gz
cd opus-1.0.2
./configure --prefix="$HOME/ffmpeg_build" --enable-shared
make
make install
make distclean
cd ..

git clone --depth 1 http://git.chromium.org/webm/libvpx.git
cd libvpx
./configure --prefix="$HOME/ffmpeg_build" --disable-examples --enable-shared
make
make install
make clean
cd ..

cd ffmpeg  # the ffmpeg source tree, cloned as in the guide linked above
PKG_CONFIG_PATH="$HOME/ffmpeg_build/lib/pkgconfig" ./configure --prefix="$HOME/ffmpeg_build" \
  --extra-cflags="-I$HOME/ffmpeg_build/include" --extra-ldflags="-L$HOME/ffmpeg_build/lib" \
  --bindir="$HOME/bin" --extra-libs="-ldl" --enable-gpl --enable-libass --enable-libfdk-aac \
  --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --disable-libvpx \
  --enable-libx264 --enable-nonfree --enable-x11grab --enable-shared --disable-asm

make
make install
make distclean
hash -r
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$HOME/ffmpeg_build/lib
export PKG_CONFIG_PATH=$HOME/ffmpeg_build/lib/pkgconfig

If you hit "error while loading shared libraries: libavdevice.so.52: cannot open shared object file", see:
http://linuxserverguide.wordpress.com/2010/10/15/ffmpeg-error-while-loading-shared-libraries-libavdevice-so-52-cannot-open-shared-object-file/

Wednesday, June 12, 2013

Convert RGB to YUV420 planar format in Java

This method shows how you can convert RGB data to YUV420 planar data in Java.

colorconvertRGB_IYUV_I420 is a public static method which expects RGB data in aRGB[], an int array (4 bytes/pixel), followed by the width and height of the input image. It returns yuv[], a byte array containing the YUV data.

Android refers to this format as COLOR_FormatYUV420Planar.
Its FourCC names are IYUV and I420.


public static byte[] colorconvertRGB_IYUV_I420(int[] aRGB, int width, int height) {
        final int frameSize = width * height;
        final int chromasize = frameSize / 4;
       
        int yIndex = 0;
        int uIndex = frameSize;
        int vIndex = frameSize + chromasize;
        byte [] yuv = new byte[width*height*3/2];
       
        int a, R, G, B, Y, U, V;
        int index = 0;
        for (int j = 0; j < height; j++) {
            for (int i = 0; i < width; i++) {

                //a = (aRGB[index] & 0xff000000) >> 24; //not using it right now
                R = (aRGB[index] & 0xff0000) >> 16;
                G = (aRGB[index] & 0xff00) >> 8;
                B = (aRGB[index] & 0xff);

                // BT.601 RGB -> YUV, fixed point (coefficients scaled by 256, +128 for rounding)
                Y = ((  66 * R + 129 * G +  25 * B + 128) >> 8) +  16;
                U = (( -38 * R -  74 * G + 112 * B + 128) >> 8) + 128;
                V = (( 112 * R -  94 * G -  18 * B + 128) >> 8) + 128;

                yuv[yIndex++] = (byte) ((Y < 0) ? 0 : ((Y > 255) ? 255 : Y));
               
                // 4:2:0 subsampling: one U and one V sample per 2x2 pixel block
                // (index % 2 == 0 matches i % 2 == 0 here, assuming an even width)
                if (j % 2 == 0 && index % 2 == 0)
                {
                    yuv[uIndex++] = (byte)((U < 0) ? 0 : ((U > 255) ? 255 : U));
                    yuv[vIndex++] = (byte)((V < 0) ? 0 : ((V > 255) ? 255 : V));
                }

                index ++;
            }
        }       
        return yuv;
    }
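
For reference, a minimal usage sketch (hedged: it assumes an Android Bitmap with even width and height; the variable names are illustrative, not from the code above):

// uses android.graphics.Bitmap
int width  = bitmap.getWidth();
int height = bitmap.getHeight();
int[] argb = new int[width * height];
bitmap.getPixels(argb, 0, width, 0, 0, width, height); // stride == width
byte[] i420 = colorconvertRGB_IYUV_I420(argb, width, height);
// i420 now holds the Y plane (width*height bytes) followed by the U and V planes,
// ready for a consumer expecting COLOR_FormatYUV420Planar.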

Saturday, May 25, 2013

Encode AAC ADTS using Android MediaCodec API

I faced this problem recently while recording AAC audio with the MediaCodec APIs.

Configure the Android 4.1+ MediaCodec API to record AAC audio as shown in the figure below,


where P1, P2, P3, ..., Pn are the individual AAC audio frames coming out of the MediaCodec AAC encoder's output buffer queue.

For similar source code, you can refer here.

The problem: after you store each packet to a file, or packetize P1, P2, P3, ..., Pn and send them over the network via RTP, most decoders cannot decode these AAC frames, because the packets do not carry the ADTS config header with them.

I refer to the ADTS packet details here.

The solution is to "prepend" a 7-byte (or 9-byte, if a CRC is included) ADTS header to each encoded AAC frame coming out.

When you are configuring the MediaCodec API for encoding AAC, you will mention:
mime type = "audio/mp4a-latm",
bitrate,
sample rate,
channel count, etc.

Once the encoder is created, we can assume that it is honoring our configuration, as sketched below.
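
A minimal configuration sketch (hedged: the 8 kHz mono, AAC-LC, 16 kbps values are illustrative assumptions chosen to match the ADTS header bytes used later, not values from the original post):

// uses android.media.MediaCodec, MediaCodecInfo, MediaFormat
MediaFormat format = MediaFormat.createAudioFormat("audio/mp4a-latm", 8000, 1);
format.setInteger(MediaFormat.KEY_BIT_RATE, 16000);    // assumed bitrate
format.setInteger(MediaFormat.KEY_AAC_PROFILE,
        MediaCodecInfo.CodecProfileLevel.AACObjectLC); // AAC LC profile
MediaCodec encoder = MediaCodec.createEncoderByType("audio/mp4a-latm");
encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
encoder.start();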
 

So, using this info, create the ADTS header (I use 7 bytes) and prepend it to each encoded buffer of the MediaCodec output,

something similar to the code below. (Note: the header is written into the first 7 bytes of the array, so the encoded frame is expected to start at offset 7.)

private void fillInADTSHeader(byte[] ENCByteArray, int encoded_length) {
        int finallength = encoded_length + 7; // 7 bytes of ADTS header

        // Split the 13-bit frame length (header + payload) across bytes 3, 4 and 5
        int length3bits    = (finallength & 0x07) << 5;    // lowest 3 bits
        int length8bits    = (finallength >> 3) & 0xFF;    // middle 8 bits
        int lengthMax2bits = (finallength & 0x1800) >> 11; // top 2 bits

        ENCByteArray[0] = (byte)0xFF;
        ENCByteArray[1] = (byte)0xF1; // end of syncword; MPEG-4, layer = 0, protection absent
        ENCByteArray[2] = (byte)0x6C; // AAC LC, sampling freq index 11 (8000 Hz), private stream 0
        ENCByteArray[3] = (byte)(0x40 | (lengthMax2bits & 0x03)); // 1 channel; length MSBs in last 2 bits
        ENCByteArray[4] = (byte)length8bits;  // middle 8 bits of length
        ENCByteArray[5] = (byte)length3bits;  // length LSBs; buffer fullness bits left 0 here
        ENCByteArray[6] = (byte)0x00;         // number of raw data blocks - 1 = 0 (i.e. 1 frame)
}
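
For completeness, a hedged sketch of draining the encoder and prepending the header (pre-API-21 buffer access; the encoder variable and the timeout are illustrative):

// uses android.media.MediaCodec and java.nio.ByteBuffer
MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
int outIndex = encoder.dequeueOutputBuffer(info, 10000 /* us */);
if (outIndex >= 0) {
    ByteBuffer outBuf = encoder.getOutputBuffers()[outIndex];
    outBuf.position(info.offset);
    outBuf.limit(info.offset + info.size);

    byte[] packet = new byte[info.size + 7];
    outBuf.get(packet, 7, info.size);    // copy the frame, leaving room for the header
    fillInADTSHeader(packet, info.size); // write the 7 ADTS bytes into packet[0..6]
    // packet is now a self-contained ADTS frame: write it to a file or packetize it for RTP

    encoder.releaseOutputBuffer(outIndex, false);
}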

A catch with Android phones: the generated AAC profile is not always the same as the configured AAC profile (depends on the phone :(