Android, Programming, Python


So, one day you wake up and decide that you don’t have enough problems in your life and that you’d like to have more. That’s when you go with the obvious choice: You’ll try to cross compile the Python runtime for use in Android.

These instructions and the required patch apply to Python v2.7.2 (at least, that's the version I know works). I adapted the patch myself, basing it on the Py4A patch. I also got ideas and guidance from the Pygame For Android project, especially their build scripts; if you are interested in this kind of self-inflicted pain, I suggest you have a look at them. All testing and development was done on Ubuntu Natty, 64-bit version.

The first thing we have to do is create a host version of the Python runtime, as it is required to cross compile Python. This is easily done by extracting the Python source, and running the usual configure/make/make install.

Once you have that, extract the Python source to some other place, apply the patch (patch -p0 < python-2.7.2.android.diff), set up the Android NDK, and set some environment variables (valid at least for Android NDK R6):

export ANDROID_NDK=[PATH WHERE THE ANDROID NDK IS]
export PATH="$ANDROID_NDK/toolchains/arm-linux-androideabi-4.4.3/prebuilt/linux-x86/bin/:$ANDROID_NDK:$ANDROID_NDK/tools:/usr/local/bin:/usr/bin:/bin"
export ARCH="armeabi"
export CFLAGS="-DANDROID -mandroid -fomit-frame-pointer --sysroot $ANDROID_NDK/platforms/android-5/arch-arm"
export CXXFLAGS="$CFLAGS"
export CC="arm-linux-androideabi-gcc $CFLAGS"
export CXX="arm-linux-androideabi-g++ $CXXFLAGS"
export AR="arm-linux-androideabi-ar"
export RANLIB="arm-linux-androideabi-ranlib"
export STRIP="arm-linux-androideabi-strip --strip-unneeded"
export MAKE="make -j4 HOSTPYTHON=[PATH TO HOST PYTHON] HOSTPGEN=[PATH TO HOST PGEN] CROSS_COMPILE=arm-eabi- CROSS_COMPILE_TARGET=yes"

In the final variable MAKE above, you have to fill in the full paths to the python and pgen executables that were generated when you compiled Python for the host. (I won't go into the details of compiling Python for the host, as it is rather simple and there's plenty of information on the net about it; as I mentioned, it shouldn't be harder than configure/make/make install.)
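As an illustration, here's a sketch with made-up host-build paths filled in (in a Python 2.7 build tree the host interpreter ends up as ./python and pgen under Parser/); it writes the assignment to a small helper script:

```shell
# Hypothetical location of the host Python build tree; adjust to
# wherever you built the host interpreter.
HOSTBUILD="$HOME/Python-2.7.2-host"
cat > cross-make.sh <<EOF
export MAKE="make -j4 HOSTPYTHON=$HOSTBUILD/python HOSTPGEN=$HOSTBUILD/Parser/pgen CROSS_COMPILE=arm-eabi- CROSS_COMPILE_TARGET=yes"
EOF
cat cross-make.sh
```

Then source cross-make.sh before running the build.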

I configured the cross compilation with:

./configure LDFLAGS="-Wl,--allow-shlib-undefined" CFLAGS="-mandroid -fomit-frame-pointer --sysroot $ANDROID_NDK/platforms/android-5/arch-arm" HOSTPYTHON=[HOST PYTHON PATH] HOSTPGEN=[HOST PGEN PATH] --host=arm-eabi --build=i686-pc-linux-gnu --enable-shared --prefix="[WHERE YOU WANT TO PUT THE GENERATED PYTHON STUFF]"

After this I had to make a small correction to the generated Makefile.

sed -i "s|^INSTSONAME=\(.*.so\).*|INSTSONAME=\\1|g" Makefile

Now you are ready to compile:

$MAKE

With a little bit of luck, that should be it. I promised pain, but it didn't seem like much trouble, right? Well, that's because I just gave you my hard-earned patch that does the trick!

Actually using the compiled library is a different matter entirely, but to give you a hint of where to go, I suggest you take a look at my previous article on how to embed and freeze modules and packages in Python, which is what I did. Once you do that, maybe mix it with a bit of SDL (of special interest is the Android project skeleton they use), and you'll have a fully working Python environment that you can build apps on!

Stay tuned for instructions on how to accomplish something similar (but with a WAY bigger patch) for cross compiling to Windows via Mingw32.

UPDATE: Thanks to Anthony Prieur, who let me know of a couple of typos in the instructions (already fixed) and that the patch has an indentation issue in setup.py, which is trivial to fix if you need that file (I don't have any use for it in the application I'm developing).
UPDATE2: Patch now supports cross compiling from OS X in addition to Linux.

SDL under Android

Aug 25, 2011

Android, Programming


The SDL library supports the Android OS quite well, and this guide is a great starting point.

I have a couple of points to add to that tutorial. I didn't actually use the pre-made SDL project, but rather took all the structure from SDL's Mercurial repository. If you do that, the project will build, but you'll have problems if you try to open a file via SDL (for example using IMG_Load). SDL seems to have good Asset Manager integration, but there's a key piece of glue code missing: a static function called getContext that should be in the SDLActivity.java file. This is my fixed SDLActivity.java, which also includes a workaround to disable OpenGL ES 2.x without recompiling SDL (it's in the initEGL function).

 

package org.libsdl.app;

import javax.microedition.khronos.egl.EGLConfig;
import javax.microedition.khronos.opengles.GL10;
import javax.microedition.khronos.egl.*;

import android.app.*;
import android.content.*;
import android.view.*;
import android.os.*;
import android.util.Log;
import android.graphics.*;
import android.text.method.*;
import android.text.*;
import android.media.*;
import android.hardware.*;

import java.lang.*;

/**
    SDL Activity
*/
public class SDLActivity extends Activity {

    // Main components
    private static SDLActivity mSingleton;
    private static SDLSurface mSurface;

    // Audio
    private static Thread mAudioThread;
    private static AudioTrack mAudioTrack;

    // Load the .so
    static {
        System.loadLibrary("SDL");
        System.loadLibrary("SDL_image");
        System.loadLibrary("mikmod");
        System.loadLibrary("SDL_mixer");
        System.loadLibrary("SDL_ttf");
        System.loadLibrary("main");
    }

    // Setup
    protected void onCreate(Bundle savedInstanceState) {
        //Log.v("SDL", "onCreate()");
        super.onCreate(savedInstanceState);

        // So we can call stuff from static callbacks
        mSingleton = this;

        // Set up the surface
        mSurface = new SDLSurface(getApplication());
        setContentView(mSurface);
        SurfaceHolder holder = mSurface.getHolder();
        holder.setType(SurfaceHolder.SURFACE_TYPE_GPU);
    }

    // Events
    protected void onPause() {
        //Log.v("SDL", "onPause()");
        super.onPause();
    }

    protected void onResume() {
        //Log.v("SDL", "onResume()");
        super.onResume();
    }

    // Messages from the SDLMain thread
    static int COMMAND_CHANGE_TITLE = 1;

    // Handler for the messages
    Handler commandHandler = new Handler() {
        public void handleMessage(Message msg) {
            if (msg.arg1 == COMMAND_CHANGE_TITLE) {
                setTitle((String)msg.obj);
            }
        }
    };

    // Send a message from the SDLMain thread
    void sendCommand(int command, Object data) {
        Message msg = commandHandler.obtainMessage();
        msg.arg1 = command;
        msg.obj = data;
        commandHandler.sendMessage(msg);
    }

    // C functions we call
    public static native void nativeInit();
    public static native void nativeQuit();
    public static native void onNativeResize(int x, int y, int format);
    public static native void onNativeKeyDown(int keycode);
    public static native void onNativeKeyUp(int keycode);
    public static native void onNativeTouch(int action, float x,
                                            float y, float p);
    public static native void onNativeAccel(float x, float y, float z);
    public static native void nativeRunAudioThread();

    // Java functions called from C

    public static boolean createGLContext(int majorVersion, int minorVersion) {
        return mSurface.initEGL(majorVersion, minorVersion);
    }

    public static void flipBuffers() {
        mSurface.flipEGL();
    }

    public static void setActivityTitle(String title) {
        // Called from SDLMain() thread and can't directly affect the view
        mSingleton.sendCommand(COMMAND_CHANGE_TITLE, title);
    }

    public static Context getContext() {
        return mSingleton;
    }

    // Audio
    private static Object buf;

    public static Object audioInit(int sampleRate, boolean is16Bit, boolean isStereo, int desiredFrames) {
        int channelConfig = isStereo ? AudioFormat.CHANNEL_CONFIGURATION_STEREO : AudioFormat.CHANNEL_CONFIGURATION_MONO;
        int audioFormat = is16Bit ? AudioFormat.ENCODING_PCM_16BIT : AudioFormat.ENCODING_PCM_8BIT;
        int frameSize = (isStereo ? 2 : 1) * (is16Bit ? 2 : 1);

        Log.v("SDL", "SDL audio: wanted " + (isStereo ? "stereo" : "mono") + " " + (is16Bit ? "16-bit" : "8-bit") + " " + ((float)sampleRate / 1000f) + "kHz, " + desiredFrames + " frames buffer");

        // Let the user pick a larger buffer if they really want -- but ye
        // gods they probably shouldn't, the minimums are horrifyingly high
        // latency already
        desiredFrames = Math.max(desiredFrames, (AudioTrack.getMinBufferSize(sampleRate, channelConfig, audioFormat) + frameSize - 1) / frameSize);

        mAudioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate,
                channelConfig, audioFormat, desiredFrames * frameSize, AudioTrack.MODE_STREAM);

        audioStartThread();

        Log.v("SDL", "SDL audio: got " + ((mAudioTrack.getChannelCount() >= 2) ? "stereo" : "mono") + " " + ((mAudioTrack.getAudioFormat() == AudioFormat.ENCODING_PCM_16BIT) ? "16-bit" : "8-bit") + " " + ((float)mAudioTrack.getSampleRate() / 1000f) + "kHz, " + desiredFrames + " frames buffer");

        if (is16Bit) {
            buf = new short[desiredFrames * (isStereo ? 2 : 1)];
        } else {
            buf = new byte[desiredFrames * (isStereo ? 2 : 1)];
        }
        return buf;
    }

    public static void audioStartThread() {
        mAudioThread = new Thread(new Runnable() {
            public void run() {
                mAudioTrack.play();
                nativeRunAudioThread();
            }
        });

        // I'd take REALTIME if I could get it!
        mAudioThread.setPriority(Thread.MAX_PRIORITY);
        mAudioThread.start();
    }

    public static void audioWriteShortBuffer(short[] buffer) {
        for (int i = 0; i < buffer.length; ) {
            int result = mAudioTrack.write(buffer, i, buffer.length - i);
            if (result > 0) {
                i += result;
            } else if (result == 0) {
                try {
                    Thread.sleep(1);
                } catch(InterruptedException e) {
                    // Nom nom
                }
            } else {
                Log.w("SDL", "SDL audio: error return from write(short)");
                return;
            }
        }
    }

    public static void audioWriteByteBuffer(byte[] buffer) {
        for (int i = 0; i < buffer.length; ) {
            int result = mAudioTrack.write(buffer, i, buffer.length - i);
            if (result > 0) {
                i += result;
            } else if (result == 0) {
                try {
                    Thread.sleep(1);
                } catch(InterruptedException e) {
                    // Nom nom
                }
            } else {
                Log.w("SDL", "SDL audio: error return from write(byte)");
                return;
            }
        }
    }

    public static void audioQuit() {
        if (mAudioThread != null) {
            try {
                mAudioThread.join();
            } catch(Exception e) {
                Log.v("SDL", "Problem stopping audio thread: " + e);
            }
            mAudioThread = null;

            //Log.v("SDL", "Finished waiting for audio thread");
        }

        if (mAudioTrack != null) {
            mAudioTrack.stop();
            mAudioTrack = null;
        }
    }
}

/**
    Simple nativeInit() runnable
*/
class SDLMain implements Runnable {
    public void run() {
        // Runs SDL_main()
        SDLActivity.nativeInit();

        //Log.v("SDL", "SDL thread terminated");
    }
}

/**
    SDLSurface. This is what we draw on, so we need to know when it's created
    in order to do anything useful.

    Because of this, that's where we set up the SDL thread
*/
class SDLSurface extends SurfaceView implements SurfaceHolder.Callback,
    View.OnKeyListener, View.OnTouchListener, SensorEventListener  {

    // This is what SDL runs in. It invokes SDL_main(), eventually
    private Thread mSDLThread;

    // EGL private objects
    private EGLContext  mEGLContext;
    private EGLSurface  mEGLSurface;
    private EGLDisplay  mEGLDisplay;

    // Sensors
    private static SensorManager mSensorManager;

    // Startup
    public SDLSurface(Context context) {
        super(context);
        getHolder().addCallback(this);

        setFocusable(true);
        setFocusableInTouchMode(true);
        requestFocus();
        setOnKeyListener(this);
        setOnTouchListener(this);

        mSensorManager = (SensorManager)context.getSystemService("sensor");
    }

    // Called when we have a valid drawing surface
    public void surfaceCreated(SurfaceHolder holder) {
        //Log.v("SDL", "surfaceCreated()");

        enableSensor(Sensor.TYPE_ACCELEROMETER, true);
    }

    // Called when we lose the surface
    public void surfaceDestroyed(SurfaceHolder holder) {
        //Log.v("SDL", "surfaceDestroyed()");

        // Send a quit message to the application
        SDLActivity.nativeQuit();

        // Now wait for the SDL thread to quit
        if (mSDLThread != null) {
            try {
                mSDLThread.join();
            } catch(Exception e) {
                Log.v("SDL", "Problem stopping thread: " + e);
            }
            mSDLThread = null;

            //Log.v("SDL", "Finished waiting for SDL thread");
        }

        enableSensor(Sensor.TYPE_ACCELEROMETER, false);
    }

    // Called when the surface is resized
    public void surfaceChanged(SurfaceHolder holder,
                               int format, int width, int height) {
        //Log.v("SDL", "surfaceChanged()");

        int sdlFormat = 0x85151002; // SDL_PIXELFORMAT_RGB565 by default
        switch (format) {
        case PixelFormat.A_8:
            Log.v("SDL", "pixel format A_8");
            break;
        case PixelFormat.LA_88:
            Log.v("SDL", "pixel format LA_88");
            break;
        case PixelFormat.L_8:
            Log.v("SDL", "pixel format L_8");
            break;
        case PixelFormat.RGBA_4444:
            Log.v("SDL", "pixel format RGBA_4444");
            sdlFormat = 0x85421002; // SDL_PIXELFORMAT_RGBA4444
            break;
        case PixelFormat.RGBA_5551:
            Log.v("SDL", "pixel format RGBA_5551");
            sdlFormat = 0x85441002; // SDL_PIXELFORMAT_RGBA5551
            break;
        case PixelFormat.RGBA_8888:
            Log.v("SDL", "pixel format RGBA_8888");
            sdlFormat = 0x86462004; // SDL_PIXELFORMAT_RGBA8888
            break;
        case PixelFormat.RGBX_8888:
            Log.v("SDL", "pixel format RGBX_8888");
            sdlFormat = 0x86262004; // SDL_PIXELFORMAT_RGBX8888
            break;
        case PixelFormat.RGB_332:
            Log.v("SDL", "pixel format RGB_332");
            sdlFormat = 0x84110801; // SDL_PIXELFORMAT_RGB332
            break;
        case PixelFormat.RGB_565:
            Log.v("SDL", "pixel format RGB_565");
            sdlFormat = 0x85151002; // SDL_PIXELFORMAT_RGB565
            break;
        case PixelFormat.RGB_888:
            Log.v("SDL", "pixel format RGB_888");
            // Not sure this is right, maybe SDL_PIXELFORMAT_RGB24 instead?
            sdlFormat = 0x86161804; // SDL_PIXELFORMAT_RGB888
            break;
        default:
            Log.v("SDL", "pixel format unknown " + format);
            break;
        }
        SDLActivity.onNativeResize(width, height, sdlFormat);

        // Now start up the C app thread
        if (mSDLThread == null) {
            mSDLThread = new Thread(new SDLMain(), "SDLThread");
            mSDLThread.start();
        }
    }

    // unused
    public void onDraw(Canvas canvas) {}

    // EGL functions
    public boolean initEGL(int majorVersion, int minorVersion) {
        // Temporarily disable OpenGL ES 2 as the SDL backend is buggy
        if (majorVersion != 1) return false;

        Log.v("SDL", "Starting up OpenGL ES " + majorVersion + "." + minorVersion);

        try {
            EGL10 egl = (EGL10)EGLContext.getEGL();

            EGLDisplay dpy = egl.eglGetDisplay(EGL10.EGL_DEFAULT_DISPLAY);

            int[] version = new int[2];
            egl.eglInitialize(dpy, version);

            int EGL_OPENGL_ES_BIT = 1;
            int EGL_OPENGL_ES2_BIT = 4;
            int renderableType = 0;
            if (majorVersion == 2) {
                renderableType = EGL_OPENGL_ES2_BIT;
            } else if (majorVersion == 1) {
                renderableType = EGL_OPENGL_ES_BIT;
            }
            int[] configSpec = {
                //EGL10.EGL_DEPTH_SIZE,   16,
                EGL10.EGL_RENDERABLE_TYPE, renderableType,
                EGL10.EGL_NONE
            };
            EGLConfig[] configs = new EGLConfig[1];
            int[] num_config = new int[1];
            if (!egl.eglChooseConfig(dpy, configSpec, configs, 1, num_config) || num_config[0] == 0) {
                Log.e("SDL", "No EGL config available");
                return false;
            }
            EGLConfig config = configs[0];

            EGLContext ctx = egl.eglCreateContext(dpy, config, EGL10.EGL_NO_CONTEXT, null);
            if (ctx == EGL10.EGL_NO_CONTEXT) {
                Log.e("SDL", "Couldn't create context");
                return false;
            }

            EGLSurface surface = egl.eglCreateWindowSurface(dpy, config, this, null);
            if (surface == EGL10.EGL_NO_SURFACE) {
                Log.e("SDL", "Couldn't create surface");
                return false;
            }

            if (!egl.eglMakeCurrent(dpy, surface, surface, ctx)) {
                Log.e("SDL", "Couldn't make context current");
                return false;
            }

            mEGLContext = ctx;
            mEGLDisplay = dpy;
            mEGLSurface = surface;

        } catch(Exception e) {
            Log.v("SDL", e + "");
            for (StackTraceElement s : e.getStackTrace()) {
                Log.v("SDL", s.toString());
            }
        }

        return true;
    }

    // EGL buffer flip
    public void flipEGL() {
        try {
            EGL10 egl = (EGL10)EGLContext.getEGL();

            egl.eglWaitNative(EGL10.EGL_NATIVE_RENDERABLE, null);

            // drawing here

            egl.eglWaitGL();

            egl.eglSwapBuffers(mEGLDisplay, mEGLSurface);

        } catch(Exception e) {
            Log.v("SDL", "flipEGL(): " + e);
            for (StackTraceElement s : e.getStackTrace()) {
                Log.v("SDL", s.toString());
            }
        }
    }

    // Key events
    public boolean onKey(View  v, int keyCode, KeyEvent event) {

        if (event.getAction() == KeyEvent.ACTION_DOWN) {
            //Log.v("SDL", "key down: " + keyCode);
            SDLActivity.onNativeKeyDown(keyCode);
            return true;
        }
        else if (event.getAction() == KeyEvent.ACTION_UP) {
            //Log.v("SDL", "key up: " + keyCode);
            SDLActivity.onNativeKeyUp(keyCode);
            return true;
        }

        return false;
    }

    // Touch events
    public boolean onTouch(View v, MotionEvent event) {

        int action = event.getAction();
        float x = event.getX();
        float y = event.getY();
        float p = event.getPressure();

        // TODO: Anything else we need to pass?
        SDLActivity.onNativeTouch(action, x, y, p);
        return true;
    }

    // Sensor events
    public void enableSensor(int sensortype, boolean enabled) {
        // TODO: This uses getDefaultSensor - what if we have >1 accels?
        if (enabled) {
            mSensorManager.registerListener(this,
                            mSensorManager.getDefaultSensor(sensortype),
                            SensorManager.SENSOR_DELAY_GAME, null);
        } else {
            mSensorManager.unregisterListener(this,
                            mSensorManager.getDefaultSensor(sensortype));
        }
    }

    public void onAccuracyChanged(Sensor sensor, int accuracy) {
        // TODO
    }

    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) {
            SDLActivity.onNativeAccel(event.values[0],
                                      event.values[1],
                                      event.values[2]);
        }
    }

}

The other issue I had happened when loading a bunch of files from native code (via the Android NDK). There's an object reference stack that the JNI mechanism relies on, and while the Java VM internals handle it properly on their own, if your native code opens a series of files without returning to the Java VM, you run the risk of going over the 512 object reference limit and getting an exception.

Luckily the solution to this problem is quite simple. In SDL_android.cpp, look for the function: static int Android_JNI_FileOpen(SDL_RWops* ctx)

At the beginning add

mEnv->PushLocalFrame(10);

and before every return expression:

mEnv->PopLocalFrame(NULL);

This will take care of freeing the object references that are no longer in use after the function returns, and then you can load as many images as you need in native code without hitting that limit.

Programming, Python


My latest adventures with SDL bindings eventually led me to Cython, a very recommendable tool if you are looking to squeeze a little more performance out of your Python app, or just to hide your source a little more obscurely. It compiles almost any Python code as is, and it includes extensions to the language that allow even faster generated code through statically typed variables and other niceties. Cython offers a way to automatically compile your .py/.pyx modules, and you load those dynamically with the familiar import command; using the imported module is exactly the same as if it were a native .py module or a compiled C module.

At this point, it's important to mention that the generated .c files and their corresponding linked versions depend on the Python runtime; you can't make a standalone executable out of them... at least not easily. Now, let's suppose you didn't read the "not easily" part I just mentioned, and that you wanted to integrate this module (or any other module you made in C from scratch) into a statically linked Python interpreter. How would you go about it?

The following instructions were tested under Ubuntu Natty 64 bits. First, start by downloading the Python source. Extract, copy Modules/Setup.dist to Modules/Setup and run configure with the following parameters:

./configure LDFLAGS="-Wl,--no-export-dynamic -static-libgcc -static" CPPFLAGS="-static -fPIC" LINKFORSHARED=" " DYNLOADFILE="dynload_stub.o" --disable-shared --prefix="/path/to/where/you/want/it/installed"

 

Follow that with the all too familiar make && make install.

You will see A LOT of errors, mostly related to the fact that the C modules that come with Python won't compile in static mode without some help; you can ¿safely? ignore them. Once this craziness stops, you'll have a static Python interpreter (you can check with ldd ./python that it's actually a standalone executable).

Now, this Python interpreter is severely lacking in content, and no one wants to reinvent the wheel, especially such a fine wheel as the one Python provides. Go to that Modules/Setup file and take a look: search for the #*shared* line and replace it with *static* (with no # sign). Then look for some notable modules and uncomment them. Run the process again (configure and make) and this time you'll end up with some builtin modules that you can import.
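As a sketch of those Modules/Setup edits, run here against a tiny stand-in file (the real Setup is much longer, and the math module is just an example of something you might uncomment):

```shell
# Stand-in for the relevant lines of Modules/Setup.
cat > Setup <<'EOF'
#*shared*
#math mathmodule.c
EOF
# Build the modules below the marker statically:
sed -i 's/^#\*shared\*/*static*/' Setup
# Uncomment a notable module:
sed -i 's/^#math/math/' Setup
cat Setup
```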

By now, you are probably catching my drift. Let's suppose you have a module test.py. Run "cython test.py" on it and you'll get a test.c file; copy it to Modules under the Python source, and edit Modules/Setup, adding the line:

 

test test.c

Do the configure and make dance again, and now you should be able to do "import test" in the new Python interpreter, which will load the module as a builtin. Neat, right?

If you go further down the rabbit hole and start depending on 3rd party libraries (or your own!), you will need to pay attention to how dependencies are specified in Modules/Setup. In short, you put whatever compiler and linker directives you need after the source files for the module.
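For illustration, such a Modules/Setup entry might look like this; the module name, the extra source file, and the foo library are all hypothetical:

```
test test.c test_helper.c -I/opt/foo/include -L/opt/foo/lib -lfoo
```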

This is all fine and dandy, but we haven't broken anything yet, so let's try something more advanced. Imagine you have a full Python package already made (a full hierarchy of modules arranged in folders and subfolders) and you want to do the same Cython-fueled embedding with it. After hitting your head on the wall for a looong while, you'll figure out that you actually can't (easily) do it, basically because the Python interpreter's builtin system is not geared towards packages, but rather towards shallow modules.

So, there’s two ways around it (that I know of). The first one is to use a series of shallow modules, and string them into a package like structure by means of importing submodules from the parent modules…

main.py
import submod1 as _submod1
submod1 = _submod1

This is boring and error prone, requires a lot of glue code, doesn't play well with your module structure if you want it to also work in non-compiled mode, etc.

The alternative is hacking the Python code just a little bit, namely the Python/import.c file. Look for the find_module function and add:

    if (is_builtin(fullname)) {
        strcpy(buf, fullname);
        return &fd_builtin;
    }

Near the top of the function, right above the "if (path != NULL && PyString_Check(path)) {" line, seems like a good place. What this does is check whether the full module name (package1.package2.module) is builtin. The official Python code doesn't do this; for the reasons stated above, it checks only the bare module name.

Besides this little patch, you have to alter the Cython-generated code just a bit. Look for the "Py_InitModule4" line and replace the module name with the whole package name (if the module is package1.package2.module, that line will only say "module"; you need to replace it with the whole enchilada). Doing this by hand is a PITA, but a simple find+sed command takes care of it swiftly. Also, while you are unleashing your sed kung fu, take care of the init??? functions: if a module is at package1.package2.mymodule, replace initmymodule with initpackage1_package2_mymodule (the reason for this will become clear later... or maybe not, and I'm just making this stuff up).
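A sketch of that sed pass, run against a stand-in file (the package layout and the file contents are made up; real Cython output will look different, so adapt the patterns):

```shell
# Create a stand-in for a Cython-generated module file.
mkdir -p package1/package2
cat > package1/package2/mymodule.c <<'EOF'
Py_InitModule4("mymodule", methods, doc, self, api);
void initmymodule(void) { }
EOF
# Fully qualify the name passed to Py_InitModule4:
sed -i 's/Py_InitModule4("mymodule"/Py_InitModule4("package1.package2.mymodule"/' package1/package2/mymodule.c
# Encode the package path into the init entry point:
sed -i 's/initmymodule/initpackage1_package2_mymodule/g' package1/package2/mymodule.c
cat package1/package2/mymodule.c
```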

Now, go back to Modules/Setup and edit the module line you added, appending all your sources (seems like a good job for a Python script, right?). If you run configure and make at this point, you'll see that it doesn't quite work. Why? Because Python depends on a __path__ variable to figure out which module is a package and which one is just a module. Yes, you need to add those.

This is simple enough: in every package's __init__.py file, add a __path__ = ["package1/package2/..."] line with the right path for the location of the file.
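One way to generate those lines automatically, sketched against a made-up tree rooted at src:

```shell
# Build a hypothetical package tree.
mkdir -p src/package1/package2
touch src/package1/__init__.py src/package1/package2/__init__.py
# Append the appropriate __path__ line to each package __init__.py.
find src -name '__init__.py' | while read -r f; do
    pkg=$(dirname "${f#src/}")
    printf '__path__ = ["%s"]\n' "$pkg" >> "$f"
done
cat src/package1/package2/__init__.py
```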

And finally, you are ready... well, not yet. There are two more things you need to do. First, as the Python build system is geared towards shallow packages, you'll have a problem if files in different subpackages share a name, as they'll end up overwriting each other when compiled (this will certainly happen with the __init__.py files), so you have to flatten your structure before adding the files to Modules/Setup. What I do is scan the whole structure and copy the *.c files to a separate folder, replacing the '/' directory separator with a '+' sign; this way package1/package2/module.c becomes package1+package2+module.c. Then add all these files to the same line in Modules/Setup, and then comes the final piece of glue:
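The flattening step can be sketched like this ("src" and "flat" are hypothetical directory names, and the package layout is made up):

```shell
# Build a hypothetical tree of generated .c files.
mkdir -p src/package1/package2 flat
touch src/package1/__init__.c src/package1/package2/module.c
# Copy each file to "flat", encoding the path in the name ('/' -> '+').
find src -name '*.c' | while read -r f; do
    rel=${f#src/}
    cp "$f" "flat/$(printf '%s' "$rel" | tr '/' '+')"
done
ls flat
```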

If your overall package is called, let's say, "test" (to be creative), create a test.c file with something like this:

 

#include "Python.h"
static PyMethodDef nomethods[] = { {NULL, NULL} };
extern void inittest_module1();
extern void inittest_package1();
extern void inittest_package1_submodule();

PyMODINIT_FUNC
inittest() {
    PyObject* __path__;

    // Add a __path__ attribute so Python knows that this is a package
    PyObject* package_test = PyImport_AddModule("test");
    Py_InitModule("test", nomethods);
    __path__ = PyList_New(1);
    PyList_SetItem(__path__, 0, PyString_FromString("test"));
    PyModule_AddObject(package_test, "__path__", __path__);

    // Register every submodule with the builtin table
    PyImport_AppendInittab("test.module1", inittest_module1);
    PyImport_AppendInittab("test.package1", inittest_package1);
    PyImport_AppendInittab("test.package1.submodule", inittest_package1_submodule);
}

Append this file to the Modules/Setup line as well. What it does is create a "test" package, set up the __path__ variable accordingly, and append all of our modules to the Python internal builtin table. Now the reason for renaming the init functions earlier should become clear (just nod even if you got lost).

Finally, run configure and make for the last time and your builtin package should be there... or not; there are literally a hundred places where things can go wrong, and the online documentation on the subject is quite sparse, which is why I'm leaving this here for those brave souls who wish to try it. If something (or everything) in the process is not clear enough, let me know in the comments. Good luck! (You'll definitely need it.)

Android, Programming


So, I got one of those wonderful Asus Transformers, and I had to see if I could make something for it. And so I did! Having some Android experience from the 1.x days, I started looking around for a quick way to get something done, and that's when I found the amazing AndEngine and its Live Wallpaper Extension. While the documentation is sparse (especially when it comes to the newer changes in the engine), there's a big enough community that if you search around in the forums and over the net, you will find what you are looking for. Getting it all in place turned out to be quite simple once I figured out what changes were needed to adapt the provided example to the latest modifications to the engine.

All in all, it’s pretty simple:

(LiveWallpaperService.java)

package org.anddev.wallpaper.live.donprimerizowp;

import java.io.File;
import java.util.Random;

import net.rbgrn.opengl.GLWallpaperService.GLEngine;

import org.anddev.andengine.engine.camera.Camera;
import org.anddev.andengine.engine.handler.timer.ITimerCallback;
import org.anddev.andengine.engine.handler.timer.TimerHandler;
import org.anddev.andengine.engine.options.EngineOptions;
import org.anddev.andengine.engine.options.EngineOptions.ScreenOrientation;
import org.anddev.andengine.engine.options.resolutionpolicy.FillResolutionPolicy;
import org.anddev.andengine.entity.particle.ParticleSystem;
import org.anddev.andengine.entity.particle.modifier.AlphaModifier;
import org.anddev.andengine.entity.particle.modifier.ExpireModifier;
import org.anddev.andengine.entity.scene.Scene;
import org.anddev.andengine.entity.sprite.AnimatedSprite.IAnimationListener;
import org.anddev.andengine.entity.sprite.Sprite;
import org.anddev.andengine.entity.sprite.AnimatedSprite;
import org.anddev.andengine.extension.ui.livewallpaper.BaseLiveWallpaperService;
import org.anddev.andengine.opengl.texture.Texture;
import org.anddev.andengine.opengl.texture.TextureOptions;
import org.anddev.andengine.opengl.texture.atlas.bitmap.BitmapTextureAtlas;
import org.anddev.andengine.opengl.texture.atlas.bitmap.BitmapTextureAtlasTextureRegionFactory;
import org.anddev.andengine.opengl.texture.region.TextureRegion;
import org.anddev.andengine.opengl.texture.region.TextureRegionFactory;
import org.anddev.andengine.opengl.texture.region.TiledTextureRegion;
import org.anddev.andengine.opengl.view.GLSurfaceView.Renderer;
import org.anddev.andengine.opengl.view.RenderSurfaceView;
import org.anddev.andengine.sensor.accelerometer.AccelerometerData;
import org.anddev.andengine.sensor.accelerometer.IAccelerometerListener;
import org.anddev.andengine.sensor.orientation.IOrientationListener;
import org.anddev.andengine.sensor.orientation.OrientationSensorOptions;

import android.app.WallpaperManager;
import android.content.res.Configuration;
import android.os.Bundle;

public class LiveWallpaperService extends BaseLiveWallpaperService implements IAccelerometerListener, IOffsetsChanged {

    protected class MyBaseWallpaperGLEngine extends GLEngine {
        // ===========================================================
        // Fields
        // ===========================================================

        private Renderer mRenderer;

        private IOffsetsChanged mOffsetsChangedListener = null;

        // ===========================================================
        // Constructors
        // ===========================================================

        public MyBaseWallpaperGLEngine(IOffsetsChanged pOffsetsChangedListener) {
                this.setEGLConfigChooser(false);
                this.mRenderer = new RenderSurfaceView.Renderer(LiveWallpaperService.this.mEngine);
                this.setRenderer(this.mRenderer);
                this.setRenderMode(RENDERMODE_CONTINUOUSLY);
                this.mOffsetsChangedListener = pOffsetsChangedListener;
        }

        // ===========================================================
        // Methods for/from SuperClass/Interfaces
        // ===========================================================

        @Override
        public Bundle onCommand(final String pAction, final int pX, final int pY, final int pZ, final Bundle pExtras, final boolean pResultRequested) {
                if(pAction.equals(WallpaperManager.COMMAND_TAP)) {
                    LiveWallpaperService.this.onTap(pX, pY);
                } else if (pAction.equals(WallpaperManager.COMMAND_DROP)) {
                    LiveWallpaperService.this.onDrop(pX, pY);
                }

                return super.onCommand(pAction, pX, pY, pZ, pExtras, pResultRequested);
        }

        @Override
        public void onResume() {
                super.onResume();
                LiveWallpaperService.this.getEngine().onResume();
                LiveWallpaperService.this.onResume();
        }

        @Override
        public void onPause() {
                super.onPause();
                LiveWallpaperService.this.getEngine().onPause();
                LiveWallpaperService.this.onPause();
        }

        @Override
        public void onDestroy() {
                super.onDestroy();
                if (this.mRenderer != null) {
                        // mRenderer.release();
                }
                this.mRenderer = null;
        }

        @Override
        public void onOffsetsChanged(float xOffset, float yOffset,
                        float xOffsetStep, float yOffsetStep, int xPixelOffset,
                        int yPixelOffset) {
                super.onOffsetsChanged(xOffset, yOffset, xOffsetStep, yOffsetStep,
                                xPixelOffset, yPixelOffset);

                if(this.mOffsetsChangedListener != null)
                        this.mOffsetsChangedListener.offsetsChanged(xOffset, yOffset, xOffsetStep, yOffsetStep, xPixelOffset, yPixelOffset);

        }

    }

    // ===========================================================
    // Constants
    // ===========================================================

    private static final int CAMERA_WIDTH = 1280;
    private static final int CAMERA_HEIGHT = 800;

    // ===========================================================
    // Fields
    // ===========================================================

    private BitmapTextureAtlas mTexture;
    private BitmapTextureAtlas mTexture2;
    private BitmapTextureAtlas mTexture3;
    private BitmapTextureAtlas mTexture4;
    private BitmapTextureAtlas mTexture5;

    private TextureRegion mPulperia;
    private TextureRegion mCarreta;
    private TiledTextureRegion mPibito;
    private TiledTextureRegion mMosca;
    private TiledTextureRegion mDPSeq1;
    private TiledTextureRegion mDPSeq2;

    private Sprite mPulperiaSprite;
    private AnimatedSprite mPibitoSprite;
    private Sprite mCarretaSprite;
    private AnimatedSprite mMoscaSprite;
    private AnimatedSprite mDP1Sprite;
    private AnimatedSprite mDP2Sprite;

    private ScreenOrientation mScreenOrientation;
    private Camera mCamera;
    private Scene mScene;

    private IAnimationListener mDP1ListenerF, mDP1ListenerB, mDP2Listener; 

    // ===========================================================
    // Constructors
    // ===========================================================

    // ===========================================================
    // Getter & Setter
    // ===========================================================

    // ===========================================================
    // Methods for/from SuperClass/Interfaces
    // ===========================================================

    @Override
    public org.anddev.andengine.engine.Engine onLoadEngine() {
        mCamera = new Camera(0, 0, CAMERA_WIDTH, CAMERA_HEIGHT);
        return new org.anddev.andengine.engine.Engine(new EngineOptions(true, this.mScreenOrientation, new FillResolutionPolicy(), mCamera));
    }

    @Override
    public void onLoadResources() {
        this.getEngine().disableOrientationSensor(this);
        this.mTexture = new BitmapTextureAtlas(2048, 2048, TextureOptions.BILINEAR);
        this.mTexture2 = new BitmapTextureAtlas(2048, 2048, TextureOptions.BILINEAR);
        this.mTexture3 = new BitmapTextureAtlas(2048, 2048, TextureOptions.BILINEAR);
        this.mTexture4 = new BitmapTextureAtlas(1024, 1024, TextureOptions.BILINEAR);
        this.mTexture5 = new BitmapTextureAtlas(1024, 2048, TextureOptions.BILINEAR);

        /* Creates the needed texture-regions on the texture. */
        this.mPulperia = BitmapTextureAtlasTextureRegionFactory.createFromAsset(this.mTexture, this, "gfx/pulperia.png", 0, 0); // 1280x800
        this.mCarreta = BitmapTextureAtlasTextureRegionFactory.createFromAsset(this.mTexture, this, "gfx/carreta.png", 0, 801); // 263x386
        this.mPibito = BitmapTextureAtlasTextureRegionFactory.createTiledFromAsset(this.mTexture2, this, "gfx/pibito.png", 0, 0, 6, 6); // 2048x1980
        this.mMosca = BitmapTextureAtlasTextureRegionFactory.createTiledFromAsset(this.mTexture3, this, "gfx/mosca.png", 0, 0, 11, 5); // 1980x1070
        this.mDPSeq1 = BitmapTextureAtlasTextureRegionFactory.createTiledFromAsset(this.mTexture4, this, "gfx/dpseq1.png", 0, 0, 7, 4); // 1022x1000
        this.mDPSeq2 = BitmapTextureAtlasTextureRegionFactory.createTiledFromAsset(this.mTexture5, this, "gfx/dpseq2.png", 0, 0, 10, 7); // 990x1393

        this.getEngine().getTextureManager().loadTexture(this.mTexture);
        this.getEngine().getTextureManager().loadTexture(this.mTexture2);
        this.getEngine().getTextureManager().loadTexture(this.mTexture3);
        this.getEngine().getTextureManager().loadTexture(this.mTexture4);
        this.getEngine().getTextureManager().loadTexture(this.mTexture5);
        this.enableAccelerometerSensor(this);
    }

    @Override
    public Scene onLoadScene() {
        mScene = new Scene();
        mPulperiaSprite = new Sprite(0, 0, this.mPulperia);
        mScene.attachChild(mPulperiaSprite);
        mPibitoSprite = new AnimatedSprite(88, 200, this.mPibito);
        mPibitoSprite.setScale((float) 1.6);

        mMoscaSprite = new AnimatedSprite(700, 550, this.mMosca);

        mDP1Sprite = new AnimatedSprite(530, 260, this.mDPSeq1);
        mDP2Sprite = new AnimatedSprite(530, 310, this.mDPSeq2);

        mDP1ListenerF = new IAnimationListener () {
            @Override
            public void onAnimationEnd(final AnimatedSprite pAnimatedSprite) {
                runOnUpdateThread(new Runnable() {
                    @Override
                    public void run() {
                        // Stop the animation, play it backwards
                        mDP1Sprite.stopAnimation();
                        mDP1Sprite.animate(
                                new long[]
                                {100,100,100,100,100,100,100,
                                 100,100,100,100,100,100,100,
                                 100,100,100,100,100,100,100,
                                 100,100,100,100,100,100,100,},
                                new int[] {27,26,25,24,23,22,21,20,19,18,17,16,15,14,13,12,11,10,9,8,7,6,5,4,3,2,1,0}, 0, mDP1ListenerB);

                    }
                });
           }
        };

        mDP1ListenerB = new IAnimationListener () {
            @Override
            public void onAnimationEnd(final AnimatedSprite pAnimatedSprite) {
                runOnUpdateThread(new Runnable() {
                    @Override
                    public void run() {
                        mScene.detachChild(mDP1Sprite);
                        mDP1Sprite.stopAnimation();
                        mDP2Sprite.animate(100, false, mDP2Listener);
                        mScene.attachChild(mDP2Sprite);

                    }
                });
           }
        };

        mDP2Listener = new IAnimationListener () {
            @Override
            public void onAnimationEnd(final AnimatedSprite pAnimatedSprite) {
                runOnUpdateThread(new Runnable() {
                    @Override
                    public void run() {
                        mScene.detachChild(mDP2Sprite);
                        mDP2Sprite.stopAnimation();
                        mDP1Sprite.animate(100, false, mDP1ListenerF);
                        mScene.attachChild(mDP1Sprite);
                    }
                });
           }
        };

        mDP1Sprite.animate(100, false, mDP1ListenerF);
        mScene.attachChild(mDP1Sprite);

        mCarretaSprite = new Sprite(-30, 414, this.mCarreta);
        mScene.attachChild(mCarretaSprite);

        //this.mVelocityInitializer = new VelocityInitializer(-20, 20, -100, -120);

//      this.getEngine().registerPreFrameHandler(new FPSCounter());

        mScene.registerUpdateHandler(new TimerHandler(10f, true, new ITimerCallback() {
            Random randomSrc = new Random();
            @Override
            public void onTimePassed(final TimerHandler pTimerHandler) {
                try {

                    int x = randomSrc.nextInt(10);

                    // Original comment: "Pibito runs 3 out of 10 times".
                    // The comparison below was garbled in the original post
                    // (the '<' was likely eaten as HTML); "x < 5" is an
                    // assumed reconstruction.
                    if (x < 5 && !mMoscaSprite.isAnimationRunning()) {
                        mScene.attachChild(mMoscaSprite);
                        mMoscaSprite.animate(100, false, new IAnimationListener () {
                            @Override
                            public void onAnimationEnd(final AnimatedSprite pAnimatedSprite) {
                                runOnUpdateThread(new Runnable() {
                                    @Override
                                    public void run() {
                                        mScene.detachChild(mMoscaSprite);
                                    }
                                });
                           }
                        });
                    }

                } catch (Exception e) {}
            }
        })); 

        return mScene;
    }

    @Override
    public void onLoadComplete() {

    }

    @Override
    public Engine onCreateEngine() {
            return new MyBaseWallpaperGLEngine(this);
    }

    @Override
    public void offsetsChanged(float xOffset, float yOffset, float xOffsetStep, float yOffsetStep, int xPixelOffset, int yPixelOffset) {
           /* if(mCamera != null){
                // Emulator has 3 screens
                mCamera.setCenter( ((960 * xOffset ) - 240) , mCamera.getCenterY() );
                / * formula: mCamera.setCenter(((CAMERA_WIDTH * (screensCount - 1)) * xOffset) - (CAMERA_WIDTH / 2), mCamera.getCenterY()); * /
            }*/

        mCarretaSprite.setPosition(-xOffset*80, 414);
        mMoscaSprite.setPosition(-xOffset*300+700, 550);
    }

    @Override
    public void onAccelerometerChanged(final AccelerometerData pAccelerometerData) {
    /*  final float minVelocityX = (pAccelerometerData.getX() + 2) * 5;
        final float maxVelocityX = (pAccelerometerData.getX() - 2) * 5;

        final float minVelocityY = (pAccelerometerData.getY() - 8) * 10;
        final float maxVelocityY = (pAccelerometerData.getY() - 10) * 10;
        this.mVelocityInitializer.setVelocity(minVelocityX, maxVelocityX, minVelocityY, maxVelocityY);*/
    }

    @Override
    public void onUnloadResources() {

    }

    @Override
    public void onPauseGame() {
        super.onPause();
        LiveWallpaperService.this.getEngine().onPause();
        LiveWallpaperService.this.onPause();
    }

    @Override
    public void onResumeGame() {
        super.onResume();
        LiveWallpaperService.this.getEngine().onResume();
        LiveWallpaperService.this.onResume();

    }

    @Override
    public void onConfigurationChanged (Configuration newConfig){
                    if(newConfig.orientation == Configuration.ORIENTATION_PORTRAIT)
                    {
                            mScene.setScaleX(1280.0f/800.0f);
                            mScene.setScaleY(1.0f);
                    }
                    else if(newConfig.orientation == Configuration.ORIENTATION_LANDSCAPE)
                    {
                        mScene.setScale(1);
                    }

    }

    // ===========================================================
    // Methods
    // ===========================================================

    // ===========================================================
    // Inner and Anonymous Classes
    // ===========================================================

}
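A side note on the animation listeners above: the backwards playback in mDP1ListenerF hand-writes 28 identical durations and 28 descending frame indices, which is tedious and easy to get wrong. The two arrays that AnimatedSprite.animate(long[], int[], ...) expects can be generated instead. This is a plain-Java sketch, independent of AndEngine; the class and method names (AnimationFrames, reversedFrames, uniformDurations) are my own, not part of the engine:

```java
import java.util.Arrays;

// Helpers to build the duration/frame-index arrays for
// AnimatedSprite.animate(long[], int[], ...) instead of
// hand-writing 28 literals. Nothing AndEngine-specific here.
public class AnimationFrames {

    // {frameCount-1, frameCount-2, ..., 1, 0} — plays the tiles backwards.
    public static int[] reversedFrames(int frameCount) {
        int[] frames = new int[frameCount];
        for (int i = 0; i < frameCount; i++) {
            frames[i] = frameCount - 1 - i;
        }
        return frames;
    }

    // One identical duration per frame, e.g. 28 entries of 100 ms.
    public static long[] uniformDurations(int frameCount, long frameDurationMs) {
        long[] durations = new long[frameCount];
        Arrays.fill(durations, frameDurationMs);
        return durations;
    }

    public static void main(String[] args) {
        // The 28-frame backwards run from the wallpaper becomes:
        System.out.println(Arrays.toString(reversedFrames(28)));
        System.out.println(Arrays.toString(uniformDurations(28, 100L)));
    }
}
```

With these, the backwards animation call collapses to something like `mDP1Sprite.animate(AnimationFrames.uniformDurations(28, 100L), AnimationFrames.reversedFrames(28), 0, mDP1ListenerB);`.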

(IOffsetsChanged.java)

package org.anddev.wallpaper.live.donprimerizowp;
public interface IOffsetsChanged{

    public void offsetsChanged(float xOffset, float yOffset,
                    float xOffsetStep, float yOffsetStep, int xPixelOffset,
                    int yPixelOffset);

}
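For anyone puzzled by the magic numbers in the commented-out block of offsetsChanged() (the `(960 * xOffset) - 240` line), here is the general formula spelled out as a small sketch. It is plain Java, independent of AndEngine, and the names (ParallaxCamera, cameraCenterX) are mine, purely for illustration:

```java
// Sketch of the camera-centering formula commented out in
// offsetsChanged(). With screensCount home screens, each cameraWidth
// pixels wide, the launcher reports xOffset in [0, 1]; the camera
// center sweeps the scrollable range accordingly.
public class ParallaxCamera {

    public static float cameraCenterX(float xOffset, int screensCount, int cameraWidth) {
        // (cameraWidth * (screensCount - 1)) * xOffset - cameraWidth / 2
        return (cameraWidth * (screensCount - 1)) * xOffset - cameraWidth / 2f;
    }

    public static void main(String[] args) {
        // Three screens, 480 px wide: this reproduces the hard-coded
        // "(960 * xOffset) - 240" from the wallpaper source.
        System.out.println(cameraCenterX(0.0f, 3, 480)); // leftmost screen
        System.out.println(cameraCenterX(0.5f, 3, 480)); // middle screen
        System.out.println(cameraCenterX(1.0f, 3, 480)); // rightmost screen
    }
}
```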

I’m making the full project available here. The Live Wallpaper is available from the Android Market.

The license for the code is “do as you please”; for the art it’s “All rights reserved” under a Creative Commons license. All art is from the defunct (oh really? I hear him breathing!) “Epopeya de Don Primerizo Lata” and was created by the great Leonardo Falaschini; you should check out his work at liondart.com