Friday, April 19, 2013

Accelerated Video Decoding and Rendering in QML with Necessitas Qt 4 for Android

Ok, I did this a couple of months ago, but I just realized it might help someone who is still using Necessitas Qt 4 for a project and cannot yet move to Qt 5.
This is sample code showing how to create a custom QML component in the Qt 4 Necessitas port that uses hardware-accelerated video decoding on any Android device with API level 11 or higher. The result is pretty good; you can check the demo I uploaded to YouTube a couple of months ago (the third application shown is the one implemented on Qt 4):



The description says it all: "The third sample code uses a custom QML component written in C++ using a Qt 4.8.2 port for Android (Necessitas). Regular QML animations are then applied to the custom component. The semi-transparent image is a regular QML Image element with alpha set to 0.5."

The code is now available on GitHub: https://github.com/carlonluca/TextureStreaming. The TextureStreaming project can be opened with Qt Creator and run on a device (assuming the API level constraint is met).

Take into consideration that the Qt developers are working on a QtMultimedia backend for Android, which I think should be available in Qt 5.2. You might want to try that as well: http://qt.gitorious.org/qt/qtmultimedia/trees/dev/src/plugins/android.

How it Works

As you can see from the code, a custom QML component is created and placed in the QML scene. That component instantiates some Java classes through JNI glue code and uses the standard Android MediaPlayer to decode the video and play the audio. The sink is set to a SurfaceTexture instance, which provides the OpenGL texture that the custom QML component renders in the QML scene. The result is pretty good.
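The Java side of this architecture can be sketched roughly as follows. This is a minimal sketch, not the actual code from the repository (the real setup lives in TextureHelper.java there); the class and method names below are assumptions, and it uses MediaPlayer.setSurface(), which requires API level 14 (on API 11-13 a different path is needed). The texture id is assumed to be created on the C++/OpenGL side with the GL_TEXTURE_EXTERNAL_OES target and passed through JNI.

```java
import java.io.IOException;

import android.graphics.SurfaceTexture;
import android.media.MediaPlayer;
import android.view.Surface;

// Hypothetical sketch: wires MediaPlayer output into a SurfaceTexture
// so that the decoded frames land in an OpenGL texture the QML item can draw.
public class VideoTextureBridge implements SurfaceTexture.OnFrameAvailableListener {
    private SurfaceTexture mSurfaceTexture;
    private MediaPlayer mMediaPlayer;

    // texName: OpenGL texture id created on the C++ side (GL_TEXTURE_EXTERNAL_OES).
    public void start(int texName, String mediaPath) throws IOException {
        mSurfaceTexture = new SurfaceTexture(texName);
        mSurfaceTexture.setOnFrameAvailableListener(this);

        mMediaPlayer = new MediaPlayer();
        mMediaPlayer.setDataSource(mediaPath);
        // Route the decoded frames to the SurfaceTexture instead of a view.
        mMediaPlayer.setSurface(new Surface(mSurfaceTexture));
        mMediaPlayer.prepare();
        mMediaPlayer.start();
    }

    @Override
    public void onFrameAvailable(SurfaceTexture surfaceTexture) {
        // A new frame is ready: notify the QML item (through JNI) to schedule
        // a repaint; the render thread then calls updateTexImage() on
        // mSurfaceTexture before drawing the textured quad.
    }
}
```

The key point is that updateTexImage() must be called on the thread that owns the GL context the texture belongs to, which in this architecture is Qt's rendering thread, not the Java callback thread.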

6 comments:

  1. Hello, when playing a file this error occurs:
    W / SurfaceTexture (21637): [unnamed-21637-0] updateTexImage: clearing GL error: 0502

    Device: Acer A510

    ReplyDelete
    Replies
    1. Sorry, but I only have a couple of devices I can test. On those it seems to work.

      Delete
  2. Really interesting! :) Thanks!

    We have an Android unit where the newest Qt Multimedia components play only sound and there is no video. I noticed this post, so I thought maybe we could do something like this in our case (the Android player does work).

    But in your git repo there is the following:
    * the video surface class has just the mTex texture and the mSurfaceTexture surface; paint() uses them, but... there seems to be no place that sets the texture?
    * the QML has the source setting and a call to play() commented out: indeed, the VideoSurface class has no "source" property nor a play() slot;
    In other words: from your git it's clear how painting is done. But could you maybe hint at how the texture update (based on Android's player output) could be done?

    (By the way: in 'source: "/home/pi/usb/out.h264"' in the QML, is /home/pi/usb/out.h264 something like a named pipe? Do you need to start the Android player in a specific way to indicate where the output should go?)

    Am I missing something? Or do you maybe have a fuller example that also includes getting the input from Android's player?

    ReplyDelete
    Replies
    1. This solution was developed before Qt 5 was available for Android and before QtMultimedia started to support Android. Anyway, I read the code of the current QtMultimedia implementation, and the component I wrote here uses the same idea.

      The texture is drawn in the C code. Look there.
      This is not a ready-to-use component; you are supposed to start from this architecture and implement the rest. The media source is set in TextureHelper.java.

      It is not a named pipe. I ported that QML element from another project, and that was the path used there.

      There is no other project. This is the one you see in the video, and it simply works. However, you'll have to implement what is missing yourself.
      Bye.

      Delete
    2. Thanks. I'd missed the source being set in TextureHelper.java.

      Delete
  3. I got the following error with your tutorial (code taken from GitHub):

    11-17 11:25:26.599: E/AndroidRuntime(13341): java.lang.UnsatisfiedLinkError: Couldn't load TextureStreaming from loader dalvik.system.PathClassLoader[DexPathList[[zip file "/data/app/org.kde.necessitas.example.TextureStreaming-1.apk"],nativeLibraryDirectories=[/data/app-lib/org.kde.necessitas.example.TextureStreaming-1, /vendor/lib, /system/lib]]]: findLibrary returned null

    ReplyDelete