In Part 1, we compiled the FFmpeg library for Android. In this post, we’ll actually do something with the library.
Basically, I followed Dranger’s excellent FFmpeg tutorial. I recommend that you go read it; all you need are Parts 1 and 2.
One problem with the Dranger tutorial is that it uses an outdated FFmpeg function, so we need to update it for the current FFmpeg library. The culprit is:
img_convert((AVPicture *)pFrameRGB, PIX_FMT_RGB24,
            (AVPicture *)pFrame, pCodecCtx->pix_fmt,
            pCodecCtx->width, pCodecCtx->height);
As you may have guessed, this function takes a decoded video frame (pFrame) and converts it to a format our program can use. Unfortunately, it is obsolete; we need to use sws_getContext and sws_scale instead. These are like Android’s BitmapFactory and Bitmap.createScaledBitmap: you first create a configuration object, pass it your options, then hand the resulting object to the call that does the actual work. sws_getContext and sws_scale split the job the same way. First, create a SwsContext:
static struct SwsContext *img_convert_ctx;
//...
img_convert_ctx = sws_getContext(
        pCodecCtx->width,   // original width
        pCodecCtx->height,  // original height
        pCodecCtx->pix_fmt, // original format
        targetWidth,        // target width
        targetHeight,       // target height
        PIX_FMT_RGBA,       // conversion format
        SWS_FAST_BILINEAR,  // scaling algorithm
        NULL, NULL, NULL    // leave these alone
);
If you followed the Dranger tutorial, pCodecCtx should already be set up to contain the dimensions and pixel format of the video. The important variables are targetWidth, targetHeight, and PIX_FMT_RGBA: these give us the dimensions and format to use for the conversion. Since we will display the converted frame as an OpenGL ES texture, the dimensions must be a power of 2, and the format must be one OpenGL ES supports. We’ll use PIX_FMT_RGBA because it seems to be faster.
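Since the texture dimensions must be a power of 2, you have to round the video’s dimensions up before creating the context. Here’s a minimal sketch of one way to compute targetWidth and targetHeight; the next_pow2 helper is my own illustration, not an FFmpeg function:

static int next_pow2(int n) {
    // Round n up to the next power of 2, e.g. 480 -> 512.
    int p = 1;
    while (p < n)
        p <<= 1;
    return p;
}

//...
int targetWidth  = next_pow2(pCodecCtx->width);
int targetHeight = next_pow2(pCodecCtx->height);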
Next, actually convert the frame:
sws_scale(
        img_convert_ctx,
        pFrame->data,              // source planes
        pFrame->linesize,          // source strides
        0,                         // first row to process
        pCodecCtx->height,         // number of rows
        pFrameConverted->data,     // destination planes
        pFrameConverted->linesize  // destination strides
);
If you followed the Dranger tutorial, these data structures have already been set up. The converted image will be stored in pFrameConverted->data, i.e., you can pass it to glTexImage2D to use as a texture.
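In case you skipped that part, here’s a sketch of how pFrameConverted might be allocated and how the converted pixels could then be uploaded to OpenGL. It uses the same old-style FFmpeg calls (avcodec_alloc_frame, avpicture_fill) as the rest of this post; treat it as an outline rather than the one true setup:

#include <GLES/gl.h>
#include "libavcodec/avcodec.h"
#include "libswscale/swscale.h"

AVFrame *pFrameConverted = avcodec_alloc_frame();

// Allocate a buffer big enough for an RGBA frame and attach it to the frame.
int numBytes = avpicture_get_size(PIX_FMT_RGBA, targetWidth, targetHeight);
uint8_t *buffer = (uint8_t *)av_malloc(numBytes);
avpicture_fill((AVPicture *)pFrameConverted, buffer,
               PIX_FMT_RGBA, targetWidth, targetHeight);

// ...later, after sws_scale, upload the pixels as a GL_RGBA texture:
glTexImage2D(GL_TEXTURE_2D,
             0,                 // mipmap level
             GL_RGBA,           // internal format
             targetWidth, targetHeight,
             0,                 // border (must be 0 in OpenGL ES)
             GL_RGBA,           // pixel format
             GL_UNSIGNED_BYTE,  // pixel type
             pFrameConverted->data[0]);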
Let’s pretend you wrote a C program that does something with FFmpeg. (Yeah, yeah, I know I should add some sample code, but it’s a lot easier than it sounds. The real problem is figuring out how to link the pre-compiled Android FFmpeg libraries.) The magic code to link this program to FFmpeg in the NDK is:
include $(CLEAR_VARS)

LOCAL_MODULE := <YOUR.MODULE.NAME>
LOCAL_SRC_FILES := <RELATIVE_PATH_TO_FILE>
LOCAL_C_INCLUDES := \
    $(LOCAL_PATH)/include \
    $(LOCAL_PATH)/ffmpeg-android/ffmpeg
LOCAL_LDLIBS := -L$(NDK_PLATFORMS_ROOT)/$(TARGET_PLATFORM)/arch-arm/usr/lib \
    -L$(LOCAL_PATH) \
    -L$(LOCAL_PATH)/ffmpeg-android/build/ffmpeg/armeabi/lib/ \
    -lGLESv1_CM -ldl \
    -lavformat -lavcodec -lavdevice -lavfilter -lavutil -lswscale \
    -llog -lz -lm

include $(BUILD_SHARED_LIBRARY)
Add this to the Android.mk file in the root JNI folder. Line by line:
LOCAL_C_INCLUDES adds the FFmpeg headers to the include path.
LOCAL_LDLIBS adds the locations of the pre-compiled FFmpeg libraries. There may be some crud here; feel free to experiment.
-lGLESv1_CM links to OpenGL ES (which we’ll eventually use).
-lavformat ... links to the FFmpeg libraries we compiled in Part 1.
-llog is for Android debugging.
-lz and -lm are needed by FFmpeg (I believe).
If you run
ndk-build, you’ll have a program that links to the FFmpeg library.
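As a quick sanity check, a minimal JNI entry point can call into FFmpeg and log the result. The class and method names below (com.example.ffmpeg.Player and its native init method) are placeholders I made up; substitute your own:

#include <jni.h>
#include <android/log.h>
#include "libavcodec/avcodec.h"
#include "libavformat/avformat.h"

// Placeholder JNI name: maps to `native void init()` in com.example.ffmpeg.Player.
JNIEXPORT void JNICALL
Java_com_example_ffmpeg_Player_init(JNIEnv *env, jobject thiz)
{
    // Register all formats and codecs before using libavformat.
    av_register_all();

    // If this prints a sane version number in logcat, linking worked.
    __android_log_print(ANDROID_LOG_INFO, "ffmpeg-test",
                        "avcodec version: %u", avcodec_version());
}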
Well, that’s it for today. In the next post, we’ll display the converted frame, pFrameConverted->data, on the Wallpaper screen using OpenGL.