I finally made some sample code available:
In the last post, we saw how to use FFmpeg to decode a video: we loaded a file, decoded a frame, and saved it into pFrameConverted->data. All that remains is to display this frame on the phone. For that we're going to use GLWallpaperService, which works just like a regular WallpaperService except that you can run OpenGL code in it.
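To give you an idea of the shape of the service, here is a minimal sketch built on GLWallpaperService. The class and method names (GLEngine, setRenderer, RENDERMODE_CONTINUOUSLY) follow my recollection of that library's API, and VideoRenderer is a hypothetical renderer of our own; check both against the library version you actually ship.

```java
// Minimal sketch of a video wallpaper service on top of GLWallpaperService.
// VideoRenderer is a hypothetical GLWallpaperService.Renderer that uploads
// each decoded frame to a texture and draws it every frame.
public class VideoWallpaperService extends GLWallpaperService {
    @Override
    public Engine onCreateEngine() {
        return new VideoEngine();
    }

    class VideoEngine extends GLEngine {
        VideoEngine() {
            super();
            // Hand rendering over to our own renderer (hypothetical class).
            setRenderer(new VideoRenderer());
            // Redraw continuously so each new video frame gets displayed.
            setRenderMode(RENDERMODE_CONTINUOUSLY);
        }
    }
}
```

The point is that GLWallpaperService gives you the usual GLSurfaceView-style renderer callbacks inside a wallpaper engine, so the video drawing code stays out of the service plumbing.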
For the remainder, I’m going to assume that you know how to create a live wallpaper—nothing fancy, just the basic “hello world” live wallpaper. Another prerequisite is that you’ve read Dranger’s FFmpeg tutorials, namely Tutorials 1 and 2.
The idea is to grab a video frame, upload it to an OpenGL texture, then draw the texture on the screen at the correct aspect ratio.
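The aspect-ratio part is plain arithmetic: scale the quad so the video fills the screen in one dimension and is letterboxed in the other. A sketch, with a helper name of my own choosing:

```java
// Hypothetical helper: compute a letterbox scale for drawing a video frame
// quad in normalized device coordinates, preserving the frame's aspect ratio.
public class AspectFit {
    // Returns {scaleX, scaleY}: the quad spans [-scaleX, scaleX] x [-scaleY, scaleY].
    public static float[] fit(int videoW, int videoH, int screenW, int screenH) {
        float videoAspect = (float) videoW / videoH;
        float screenAspect = (float) screenW / screenH;
        if (videoAspect > screenAspect) {
            // Video is wider than the screen: fill the width, shrink the height
            // (letterbox bars at top and bottom).
            return new float[] { 1f, screenAspect / videoAspect };
        } else {
            // Video is taller (or equal): fill the height, shrink the width
            // (bars at the sides).
            return new float[] { videoAspect / screenAspect, 1f };
        }
    }
}
```

For example, a square video on a screen twice as wide as it is tall fills the full height but only half the width.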
I took a freelance coding job asking me to code a Video Live Wallpaper. Sounds easy, right? Just create a Live Wallpaper and tell Android to play a video. How hard can that be? Unfortunately, in Android 2.1, 2.2 (and possibly 2.3), you can't play videos in a Live Wallpaper. Well, you can, but it's something you have to program yourself. And that's the problem: this seemingly trivial project is way harder than it looks. While not exactly rocket science, it's not exactly a "walk in the park" either.
Let me explain exactly what’s involved: (1) you have to write your own decoder, which means that (2) you need to work with the NDK.
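In practice, step (2) means exposing the native FFmpeg decoder to Java through JNI. A sketch of what that bridge might look like; every name here (the library name, the class, the native methods) is of my own invention, not from any real project:

```java
// Hypothetical JNI bridge to a native FFmpeg decoder compiled with the NDK.
public class VideoDecoder {
    static {
        // Loads libvideodecoder.so, which ndk-build would produce from the C side.
        System.loadLibrary("videodecoder");
    }

    // Opens the video file; returns an opaque pointer to the native context.
    public static native long nativeOpen(String path);

    // Decodes the next frame into rgbBuffer; returns false at end of stream.
    public static native boolean nativeDecodeFrame(long ctx, byte[] rgbBuffer);

    // Frees the native context.
    public static native void nativeClose(long ctx);
}
```

The renderer would then call nativeDecodeFrame once per draw, hand the buffer to OpenGL as texture data, and draw the quad.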