I’ve been a power Kubuntu user (I use it on my main system for everyday work) since 8.04. Every update to a newer version has had its fair share of minor issues, and some not so minor, but all in all I haven’t hit a major roadblock (that I can remember), even when KDE switched from the 3.x to the 4.x series. That is, until now. I’m really surprised by the number of major issues I’ve hit on this release; it marks a sad low point in the history of Ubuntu. After migrating with Muon, I rebooted and…
Boot takes a looooong time, no console or garbage in the console
Nvidia video cards don’t deal well with the vt.handoff parameter that Ubuntu adds by default to the kernel boot line in Grub. The upgrade process installs an /etc/grub.d/10_linux that adds this parameter, and you have to remove it to get a usable console. Solution:
- Select recovery mode in Grub, press e on that option, look for the line with vt.handoff=7 and remove it, then press CTRL-X.
- Select root console
- mount -o rw,remount /
- Edit /etc/grub.d/10_linux, remove vt.handoff=7 (a non-interactive way to do this is shown below)
- update-grub
- reboot
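For reference, once you’re at the recovery root console the whole fix boils down to something like this (the sed call is just a non-interactive way of doing the manual edit described above; editing the file by hand works just as well):

mount -o rw,remount /
# strip the parameter from the script that generates the Grub entries
sed -i 's/vt.handoff=7//g' /etc/grub.d/10_linux
update-grub
reboot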
Now you can see the console, but boot still takes a looooong time: “Waiting for network connection”
It seems Ubuntu is now more serious about network interfaces and their availability/readiness. Too bad they didn’t bother asking users whether they want this, or issuing a big warning during the upgrade process. Anyway, if you have interfaces listed in /etc/network/interfaces that don’t exist in your system or that can’t get an IP quickly enough, the system will keep you waiting without telling you what it’s really waiting for. Most solutions I’ve found online point to yet another random change in 11.10, where they moved /var/run to /run and /var/lock to /run/lock (I guess they felt like keeping things fresh by making pointless changes, or maybe I’m just bitter because of the many hours I spent trying to figure out the problem). Solution:
- Boot in recovery mode
- mount -o rw,remount /
- Edit /etc/network/interfaces, comment out all mentions of network interfaces that you don’t actually have or that are not connected (see the example below)
- reboot
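As a hypothetical example (the interface names here are made up), an /etc/network/interfaces that used to hang the boot could end up looking like this:

# /etc/network/interfaces -- hypothetical example
auto lo
iface lo inet loopback

# eth0 exists and gets a lease quickly, so it can stay
auto eth0
iface eth0 inet dhcp

# eth1 is not present / never gets an IP: commented out so boot doesn't wait on it
#auto eth1
#iface eth1 inet dhcp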
Nvidia proprietary driver no longer works, no X server
I have an Nvidia card configured with the proprietary drivers, so the X server depends on them. The upgrade process removed the Ubuntu X-Swat PPA, so the Nvidia driver didn’t update. As the upgrade process installs a new kernel, some conflict happens (the details of which I don’t care enough to find out, but be certain that it’s a problem), and you are left without an X environment and no warning of any kind. Solution:
- Boot in recovery mode, go to root console.
- mount -o rw,remount /
- Edit /etc/apt/sources.list.d/ubuntu-x-swat (something).list, enable the repository again
- apt-get update; apt-get --reinstall install nvidia-current
- reboot
Now X starts, but KDE complains that dbus is not available and asks you to run qdbus.
- Boot in recovery mode, go to root console.
- mount -o rw,remount /
- apt-get install qdbus
No sound
KMail 2 has a lot of issues and is slow
PostgreSQL 8.4 has 100% CPU usage, PostgreSQL 9 is installed in parallel, Akonadi uses 8.4 instead of 9
Solution:
- Edit /etc/postgresql/9.1/main/postgresql.conf and make sure port = 5432 (I guess the upgrade process had configured it on 5433 because 5432 was already taken by PostgreSQL 8.4)
- Apply this solution, otherwise PostgreSQL 9.1 probably won’t start automatically after boot (though it will start if you run service postgresql start afterwards… for reasons unknown to me!)
- sudo service postgresql restart
- Edit ~/.config/akonadi/akonadiserverrc and set the correct paths (usually this means changing 8.4 to 9.1 in the PGSQL paths; see the excerpt below)
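To illustrate, the relevant part of akonadiserverrc ends up looking something like this (the exact keys vary between Akonadi versions, so treat this as a hypothetical excerpt; the point is that every PostgreSQL path should reference the 9.1 binaries):

[QPSQL]
Name=akonadi
InitDbPath=/usr/lib/postgresql/9.1/bin/initdb
ServerPath=/usr/lib/postgresql/9.1/bin/pg_ctl
StartServer=true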
Skype doesn’t load
Update (Oct 9th, 2012): The third message in this thread explains a simpler alternative to what’s explained below. I haven’t tried it myself but it seems cleaner and easier to implement, so you should probably try that one first.
Update 2 (Feb 21st, 2013): I’m now using the simplified method mentioned above in my engine. I copy org.libsdl.app.SDLActivity into my project and inherit from it in a class with the package and class name that I want to use in my app.
The only change I had to make to the standard SDLActivity.java is in its static section, to load the rest of the libraries I require; doing this in your inherited class doesn’t quite work, depending on how each library depends on the others.
The method stated below is still useful if your app does something that the default SDLActivity can’t really handle, such as a live wallpaper.
The Android SDL backend works great, and it comes with a full skeleton project that you can use to get up to speed quickly. There is, however, one minor problem: everything in said project is hardwired to the “org.libsdl.app” package name, which is all fine and dandy until you want to make a real-world app with your own name… and keep using the SDL source with as few modifications as possible to benefit from the latest and greatest changes provided by the open source community.
So, it’s time to take out the glue gun and put it to good use. What I do is deploy a little piece of code that I call jni_glue.cpp and that goes into jni/src/jni_glue.cpp. In my case the app name is “com.mdqinc.dp”…
#include "jni.h" #include "SDL_config.h" #include "SDL_stdinc.h" //static float fLastAccelerometer[3]; extern "C" { void Java_org_libsdl_app_SDLActivity_nativeRunAudioThread(JNIEnv* env, jclass cls); void Java_org_libsdl_app_SDLActivity_onNativeResize( JNIEnv* env, jclass jcls, jint width, jint height, jint format); void Java_org_libsdl_app_SDLActivity_onNativeKeyDown( JNIEnv* env, jclass jcls, jint keycode); void Java_org_libsdl_app_SDLActivity_onNativeKeyUp( JNIEnv* env, jclass jcls, jint keycode); void Java_org_libsdl_app_SDLActivity_onNativeTouch( JNIEnv* env, jclass jcls, jint touch_device_id_in, jint pointer_finger_id_in, jint action, jfloat x, jfloat y, jfloat p); void Java_org_libsdl_app_SDLActivity_onNativeAccel( JNIEnv* env, jclass jcls, jfloat x, jfloat y, jfloat z); void Java_org_libsdl_app_SDLActivity_nativeQuit( JNIEnv* env, jclass cls); void Java_org_libsdl_app_SDLActivity_nativeRunAudioThread( JNIEnv* env, jclass cls); void Java_org_libsdl_app_SDLActivity_nativeInit(JNIEnv* env, jclass cls, jobject obj); }; // Resize extern "C" void Java_com_mdqinc_dp_SDLActivity_onNativeResize( JNIEnv* env, jclass jcls, jint width, jint height, jint format) { Java_org_libsdl_app_SDLActivity_onNativeResize(env, jcls, width, height, format); } // Keydown extern "C" void Java_com_mdqinc_dp_SDLActivity_onNativeKeyDown( JNIEnv* env, jclass jcls, jint keycode) { Java_org_libsdl_app_SDLActivity_onNativeKeyDown(env, jcls, keycode); } // Keyup extern "C" void Java_com_mdqinc_dp_SDLActivity_onNativeKeyUp( JNIEnv* env, jclass jcls, jint keycode) { Java_org_libsdl_app_SDLActivity_onNativeKeyUp(env, jcls, keycode); } // Touch extern "C" void Java_com_mdqinc_dp_SDLActivity_onNativeTouch( JNIEnv* env, jclass jcls, jint touch_device_id_in, jint pointer_finger_id_in, jint action, jfloat x, jfloat y, jfloat p) { Java_org_libsdl_app_SDLActivity_onNativeTouch(env, jcls, touch_device_id_in, pointer_finger_id_in, action, x, y, p); } // Accelerometer extern "C" void Java_com_mdqinc_dp_SDLActivity_onNativeAccel( JNIEnv* env, jclass jcls, jfloat x, jfloat y, jfloat z) { Java_org_libsdl_app_SDLActivity_onNativeAccel(env, jcls, x, y, z); } // Quit extern "C" void Java_com_mdqinc_dp_SDLActivity_nativeQuit( JNIEnv* env, jclass cls) { Java_org_libsdl_app_SDLActivity_nativeQuit(env, cls); } extern "C" void Java_com_mdqinc_dp_SDLActivity_nativeRunAudioThread( JNIEnv* env, jclass cls) { Java_org_libsdl_app_SDLActivity_nativeRunAudioThread(env, cls); } extern "C" void Java_com_mdqinc_dp_SDLActivity_nativeInit(JNIEnv* env, jclass cls, jobject obj) { Java_org_libsdl_app_SDLActivity_nativeInit(env, cls, obj); }
This glue code should be paired with a Java activity file based on the one SDL ships, but you need to change the line that says “package org.libsdl.app;” to a name that suits your needs (and that matches the glue code function names posted above). The same modification needs to be applied to the AndroidManifest.xml.
Besides these changes you need to take care of one final minor adjustment: placing your activity file in the right location. It no longer goes in src/org/libsdl/app/SDLActivity.java but rather (in my case, for example) in src/com/mdqinc/dp/SDLActivity.java.
You can certainly achieve the same result by modifying the SDL source code directly, but you’ll find that updating it later is a hassle. This approach I’m describing stands separate from the actual SDL code, so you can update it without worries. The only thing you may want to keep an eye on is changes in the SDLActivity.java file, but in case that’s updated you only need to change a single line with the package name at the top.
The most famous Python bindings for SDL are probably those of the PyGame project, but they target the 1.2.x versions of the multiplatform library. There’s also PySDL by Albert Zeyer, whose bindings are automatically generated from the SDL headers.
I tried to do the same for Cython, using an automated generator to parse the header files and output a Cython pxd file. Sadly the tools for that are a bit green and don’t quite work yet, so I’ve handcrafted a Cython header file from the SDL 1.3 headers containing the functions I need (there are SDL, SDL_image and SDL_ttf functions in there). Based on it and the SDL source code it’s very easy to expand, so contributions are most welcome.
Without further ado, SDL.pxd:
cdef extern from "SDL.h": ctypedef unsigned char Uint8 ctypedef unsigned long Uint32 ctypedef unsigned long long Uint64 ctypedef signed long long Sint64 ctypedef signed short Sint16 ctypedef unsigned short Uint16 ctypedef enum: SDL_PIXELFORMAT_ARGB8888 ctypedef enum SDL_BlendMode: SDL_BLENDMODE_NONE = 0x00000000 SDL_BLENDMODE_BLEND = 0x00000001 SDL_BLENDMODE_ADD = 0x00000002 SDL_BLENDMODE_MOD = 0x00000004 ctypedef enum SDL_TextureAccess: SDL_TEXTUREACCESS_STATIC SDL_TEXTUREACCESS_STREAMING SDL_TEXTUREACCESS_TARGET ctypedef enum SDL_RendererFlags: SDL_RENDERER_SOFTWARE = 0x00000001 SDL_RENDERER_ACCELERATED = 0x00000002 SDL_RENDERER_PRESENTVSYNC = 0x00000004 ctypedef enum SDL_bool: SDL_FALSE = 0 SDL_TRUE = 1 cdef struct SDL_Rect: int x, y int w, h ctypedef struct SDL_Point: int x, y cdef struct SDL_Color: Uint8 r Uint8 g Uint8 b Uint8 unused cdef struct SDL_Palette: int ncolors SDL_Color *colors Uint32 version int refcount cdef struct SDL_PixelFormat: Uint32 format SDL_Palette *palette Uint8 BitsPerPixel Uint8 BytesPerPixel Uint8 padding[2] Uint32 Rmask Uint32 Gmask Uint32 Bmask Uint32 Amask Uint8 Rloss Uint8 Gloss Uint8 Bloss Uint8 Aloss Uint8 Rshift Uint8 Gshift Uint8 Bshift Uint8 Ashift int refcount SDL_PixelFormat *next cdef struct SDL_BlitMap cdef struct SDL_Surface: Uint32 flags SDL_PixelFormat *format int w, h int pitch void *pixels void *userdata int locked void *lock_data SDL_Rect clip_rect SDL_BlitMap *map int refcount ctypedef enum SDL_EventType: SDL_FIRSTEVENT = 0, SDL_QUIT = 0x100 SDL_WINDOWEVENT = 0x200 SDL_SYSWMEVENT SDL_KEYDOWN = 0x300 SDL_KEYUP SDL_TEXTEDITING SDL_TEXTINPUT SDL_MOUSEMOTION = 0x400 SDL_MOUSEBUTTONDOWN SDL_MOUSEBUTTONUP SDL_MOUSEWHEEL SDL_INPUTMOTION = 0x500 SDL_INPUTBUTTONDOWN SDL_INPUTBUTTONUP SDL_INPUTWHEEL SDL_INPUTPROXIMITYIN SDL_INPUTPROXIMITYOUT SDL_JOYAXISMOTION = 0x600 SDL_JOYBALLMOTION SDL_JOYHATMOTION SDL_JOYBUTTONDOWN SDL_JOYBUTTONUP SDL_FINGERDOWN = 0x700 SDL_FINGERUP SDL_FINGERMOTION SDL_TOUCHBUTTONDOWN SDL_TOUCHBUTTONUP SDL_DOLLARGESTURE = 0x800 SDL_DOLLARRECORD SDL_MULTIGESTURE SDL_CLIPBOARDUPDATE = 0x900 SDL_EVENT_COMPAT1 = 0x7000 SDL_EVENT_COMPAT2 SDL_EVENT_COMPAT3 SDL_USEREVENT = 0x8000 SDL_LASTEVENT = 0xFFFF ctypedef enum SDL_WindowEventID: SDL_WINDOWEVENT_NONE #< Never used */ SDL_WINDOWEVENT_SHOWN #< Window has been shown */ SDL_WINDOWEVENT_HIDDEN #< Window has been hidden */ SDL_WINDOWEVENT_EXPOSED #< Window has been exposed and should be # redrawn */ SDL_WINDOWEVENT_MOVED #< Window has been moved to data1, data2 # */ SDL_WINDOWEVENT_RESIZED #< Window has been resized to data1xdata2 */ SDL_WINDOWEVENT_SIZE_CHANGED #< The window size has changed, either as a result of an API call or through the system or user changing the window size. 
*/ SDL_WINDOWEVENT_MINIMIZED #< Window has been minimized */ SDL_WINDOWEVENT_MAXIMIZED #< Window has been maximized */ SDL_WINDOWEVENT_RESTORED #< Window has been restored to normal size # and position */ SDL_WINDOWEVENT_ENTER #< Window has gained mouse focus */ SDL_WINDOWEVENT_LEAVE #< Window has lost mouse focus */ SDL_WINDOWEVENT_FOCUS_GAINED #< Window has gained keyboard focus */ SDL_WINDOWEVENT_FOCUS_LOST #< Window has lost keyboard focus */ SDL_WINDOWEVENT_CLOSE #< The window manager requests that the # window be closed */ ctypedef enum SDL_WindowFlags: SDL_WINDOW_FULLSCREEN = 0x00000001 SDL_WINDOW_OPENGL = 0x00000002 SDL_WINDOW_SHOWN = 0x00000004 SDL_WINDOW_HIDDEN = 0x00000008 SDL_WINDOW_BORDERLESS = 0x00000010 SDL_WINDOW_RESIZABLE = 0x00000020 SDL_WINDOW_MINIMIZED = 0x00000040 SDL_WINDOW_MAXIMIZED = 0x00000080 SDL_WINDOW_INPUT_GRABBED = 0x00000100 SDL_WINDOW_INPUT_FOCUS = 0x00000200 SDL_WINDOW_MOUSE_FOCUS = 0x00000400 SDL_WINDOW_FOREIGN = 0x00000800 ctypedef enum SDL_RendererFlip: SDL_FLIP_NONE = 0x00000000 SDL_FLIP_HORIZONTAL = 0x00000001 SDL_FLIP_VERTICAL = 0x00000002 cdef struct SDL_MouseMotionEvent: Uint32 type Uint32 windowID Uint8 state Uint8 padding1 Uint8 padding2 Uint8 padding3 int x int y int xrel int yrel cdef struct SDL_MouseButtonEvent: Uint32 type Uint32 windowID Uint8 button Uint8 state Uint8 padding1 Uint8 padding2 int x int y cdef struct SDL_WindowEvent: Uint32 type Uint32 windowID Uint8 event Uint8 padding1 Uint8 padding2 Uint8 padding3 int data1 int data2 ctypedef Sint64 SDL_TouchID ctypedef Sint64 SDL_FingerID cdef struct SDL_TouchFingerEvent: Uint32 type Uint32 windowID SDL_TouchID touchId SDL_FingerID fingerId Uint8 state Uint8 padding1 Uint8 padding2 Uint8 padding3 Uint16 x Uint16 y Sint16 dx Sint16 dy Uint16 pressure cdef struct SDL_KeyboardEvent: pass cdef struct SDL_TextEditingEvent: pass cdef struct SDL_TextInputEvent: pass cdef struct SDL_MouseWheelEvent: Uint32 type Uint32 windowID int x int y cdef struct SDL_JoyAxisEvent: pass cdef struct SDL_JoyBallEvent: pass cdef struct SDL_JoyHatEvent: pass cdef struct SDL_JoyButtonEvent: pass cdef struct SDL_QuitEvent: pass cdef struct SDL_UserEvent: pass cdef struct SDL_SysWMEvent: pass cdef struct SDL_TouchFingerEvent: pass cdef struct SDL_TouchButtonEvent: pass cdef struct SDL_MultiGestureEvent: pass cdef struct SDL_DollarGestureEvent: pass cdef union SDL_Event: Uint32 type SDL_WindowEvent window SDL_KeyboardEvent key SDL_TextEditingEvent edit SDL_TextInputEvent text SDL_MouseMotionEvent motion SDL_MouseButtonEvent button SDL_MouseWheelEvent wheel SDL_JoyAxisEvent jaxis SDL_JoyBallEvent jball SDL_JoyHatEvent jhat SDL_JoyButtonEvent jbutton SDL_QuitEvent quit SDL_UserEvent user SDL_SysWMEvent syswm SDL_TouchFingerEvent tfinger SDL_TouchButtonEvent tbutton SDL_MultiGestureEvent mgesture SDL_DollarGestureEvent dgesture cdef struct SDL_RendererInfo: char *name Uint32 flags Uint32 num_texture_formats Uint32 texture_formats[16] int max_texture_width int max_texture_height ctypedef struct SDL_Texture ctypedef struct SDL_Renderer ctypedef struct SDL_Window ctypedef struct SDL_DisplayMode: Uint32 format int w int h int refresh_rate void *driverdata cdef struct SDL_RWops: long (* seek) (SDL_RWops * context, long offset,int whence) size_t(* read) ( SDL_RWops * context, void *ptr, size_t size, size_t maxnum) size_t(* write) (SDL_RWops * context, void *ptr,size_t size, size_t num) int (* close) (SDL_RWops * context) cdef SDL_Renderer * SDL_CreateRenderer(SDL_Window * window, int index, Uint32 flags) cdef SDL_Texture * 
SDL_CreateTexture(SDL_Renderer * renderer, Uint32 format, int access, int w, int h) cdef SDL_Texture * SDL_CreateTextureFromSurface(SDL_Renderer * renderer, SDL_Surface * surface) cdef SDL_Surface * SDL_CreateRGBSurface(Uint32 flags, int width, int height, int depth, Uint32 Rmask, Uint32 Gmask, Uint32 Bmask, Uint32 Amask) cdef int SDL_RenderCopy(SDL_Renderer * renderer, SDL_Texture * texture, SDL_Rect * srcrect, SDL_Rect * dstrect) cdef int SDL_RenderCopyEx(SDL_Renderer * renderer, SDL_Texture * texture, SDL_Rect * srcrect, SDL_Rect * dstrect, double angle, SDL_Point *center, SDL_RendererFlip flip) cdef void SDL_RenderPresent(SDL_Renderer * renderer) cdef SDL_bool SDL_RenderTargetSupported(SDL_Renderer *renderer) cdef int SDL_SetTargetTexture(SDL_Texture *texture) cdef SDL_bool SDL_ResetTargetTexture(SDL_Renderer *renderer) cdef void SDL_DestroyTexture(SDL_Texture * texture) cdef void SDL_FreeSurface(SDL_Surface * surface) cdef int SDL_UpperBlit (SDL_Surface * src, SDL_Rect * srcrect, SDL_Surface * dst, SDL_Rect * dstrect) cdef int SDL_LockTexture(SDL_Texture * texture, SDL_Rect * rect, void **pixels, int *pitch) cdef void SDL_UnlockTexture(SDL_Texture * texture) cdef void SDL_GetWindowSize(SDL_Window * window, int *w, int *h) cdef SDL_Window * SDL_CreateWindow(char *title, int x, int y, int w, int h, Uint32 flags) cdef int SDL_SetRenderDrawColor(SDL_Renderer * renderer, Uint8 r, Uint8 g, Uint8 b, Uint8 a) cdef int SDL_RenderClear(SDL_Renderer * renderer) cdef int SDL_SetTextureBlendMode(SDL_Texture * texture, SDL_BlendMode blendMode) cdef int SDL_GetTextureBlendMode(SDL_Texture * texture, SDL_BlendMode *blendMode) cdef SDL_Surface * SDL_CreateRGBSurfaceFrom(void *pixels, int width, int height, int depth, int pitch, Uint32 Rmask, Uint32 Gmask, Uint32 Bmask, Uint32 Amask) cdef int SDL_Init(Uint32 flags) cdef void SDL_Quit() cdef int SDL_EnableUNICODE(int enable) cdef Uint32 SDL_GetTicks() cdef void SDL_Delay(Uint32 ms) cdef int SDL_PollEvent(SDL_Event * event) cdef SDL_RWops * SDL_RWFromFile(char *file, char *mode) cdef void SDL_FreeRW(SDL_RWops *area) cdef int SDL_GetRendererInfo(SDL_Renderer *renderer, SDL_RendererInfo *info) cdef int SDL_RenderSetViewport(SDL_Renderer * renderer, SDL_Rect * rect) cdef int SDL_GetCurrentDisplayMode(int displayIndex, SDL_DisplayMode * mode) cdef int SDL_GetDesktopDisplayMode(int displayIndex, SDL_DisplayMode * mode) cdef int SDL_SetTextureColorMod(SDL_Texture * texture, Uint8 r, Uint8 g, Uint8 b) cdef int SDL_SetTextureAlphaMod(SDL_Texture * texture, Uint8 alpha) cdef char * SDL_GetError() cdef extern from "SDL_image.h": cdef SDL_Surface *IMG_Load(char *file) cdef extern from "SDL_ttf.h": ctypedef struct TTF_Font cdef int TTF_Init() cdef TTF_Font * TTF_OpenFont( char *file, int ptsize) cdef TTF_Font * TTF_OpenFontIndex( char *file, int ptsize, long index) cdef TTF_Font * TTF_OpenFontRW(SDL_RWops *src, int freesrc, int ptsize) cdef TTF_Font * TTF_OpenFontIndexRW(SDL_RWops *src, int freesrc, int ptsize, long index) #Set and retrieve the font style #define TTF_STYLE_NORMAL 0x00 #define TTF_STYLE_BOLD 0x01 #define TTF_STYLE_ITALIC 0x02 #define TTF_STYLE_UNDERLINE 0x04 #define TTF_STYLE_STRIKETHROUGH 0x08 cdef int TTF_GetFontStyle( TTF_Font *font) cdef void TTF_SetFontStyle(TTF_Font *font, int style) cdef int TTF_GetFontOutline( TTF_Font *font) cdef void TTF_SetFontOutline(TTF_Font *font, int outline) #Set and retrieve FreeType hinter settings */ #define TTF_HINTING_NORMAL 0 #define TTF_HINTING_LIGHT 1 #define TTF_HINTING_MONO 2 #define TTF_HINTING_NONE 3 cdef 
int TTF_GetFontHinting( TTF_Font *font) cdef void TTF_SetFontHinting(TTF_Font *font, int hinting) #Get the total height of the font - usually equal to point size cdef int TTF_FontHeight( TTF_Font *font) ## Get the offset from the baseline to the top of the font #This is a positive value, relative to the baseline. #*/ cdef int TTF_FontAscent( TTF_Font *font) ## Get the offset from the baseline to the bottom of the font # This is a negative value, relative to the baseline. # */ cdef int TTF_FontDescent( TTF_Font *font) ## Get the recommended spacing between lines of text for this font */ cdef int TTF_FontLineSkip( TTF_Font *font) ## Get/Set whether or not kerning is allowed for this font */ cdef int TTF_GetFontKerning( TTF_Font *font) cdef void TTF_SetFontKerning(TTF_Font *font, int allowed) ## Get the number of faces of the font */ cdef long TTF_FontFaces( TTF_Font *font) ## Get the font face attributes, if any */ cdef int TTF_FontFaceIsFixedWidth( TTF_Font *font) cdef char * TTF_FontFaceFamilyName( TTF_Font *font) cdef char * TTF_FontFaceStyleName( TTF_Font *font) ## Check wether a glyph is provided by the font or not */ cdef int TTF_GlyphIsProvided( TTF_Font *font, Uint16 ch) ## Get the metrics (dimensions) of a glyph # To understand what these metrics mean, here is a useful link: # http://freetype.sourceforge.net/freetype2/docs/tutorial/step2.html # */ cdef int TTF_GlyphMetrics(TTF_Font *font, Uint16 ch,int *minx, int *maxx, int *miny, int *maxy, int *advance) ## Get the dimensions of a rendered string of text */ cdef int TTF_SizeText(TTF_Font *font, char *text, int *w, int *h) cdef int TTF_SizeUTF8(TTF_Font *font, char *text, int *w, int *h) cdef int TTF_SizeUNICODE(TTF_Font *font, Uint16 *text, int *w, int *h) # Create an 8-bit palettized surface and render the given text at # fast quality with the given font and color. The 0 pixel is the # colorkey, giving a transparent background, and the 1 pixel is set # to the text color. # This function returns the new surface, or NULL if there was an error. #*/ cdef SDL_Surface * TTF_RenderText_Solid(TTF_Font *font, char *text, SDL_Color fg) cdef SDL_Surface * TTF_RenderUTF8_Solid(TTF_Font *font, char *text, SDL_Color fg) cdef SDL_Surface * TTF_RenderUNICODE_Solid(TTF_Font *font, Uint16 *text, SDL_Color fg) # Create an 8-bit palettized surface and render the given glyph at # fast quality with the given font and color. The 0 pixel is the # colorkey, giving a transparent background, and the 1 pixel is set # to the text color. The glyph is rendered without any padding or # centering in the X direction, and aligned normally in the Y direction. # This function returns the new surface, or NULL if there was an error. #*/ cdef SDL_Surface * TTF_RenderGlyph_Solid(TTF_Font *font, Uint16 ch, SDL_Color fg) # Create an 8-bit palettized surface and render the given text at # high quality with the given font and colors. The 0 pixel is background, # while other pixels have varying degrees of the foreground color. # This function returns the new surface, or NULL if there was an error. #*/ cdef SDL_Surface * TTF_RenderText_Shaded(TTF_Font *font, char *text, SDL_Color fg, SDL_Color bg) cdef SDL_Surface * TTF_RenderUTF8_Shaded(TTF_Font *font, char *text, SDL_Color fg, SDL_Color bg) cdef SDL_Surface * TTF_RenderUNICODE_Shaded(TTF_Font *font, Uint16 *text, SDL_Color fg, SDL_Color bg) # Create an 8-bit palettized surface and render the given glyph at # high quality with the given font and colors. 
The 0 pixel is background, # while other pixels have varying degrees of the foreground color. # The glyph is rendered without any padding or centering in the X # direction, and aligned normally in the Y direction. # This function returns the new surface, or NULL if there was an error. # cdef SDL_Surface * TTF_RenderGlyph_Shaded(TTF_Font *font, Uint16 ch, SDL_Color fg, SDL_Color bg) # Create a 32-bit ARGB surface and render the given text at high quality, # using alpha blending to dither the font with the given color. # This function returns the new surface, or NULL if there was an error. #*/ cdef SDL_Surface * TTF_RenderText_Blended(TTF_Font *font, char *text, SDL_Color fg) cdef SDL_Surface * TTF_RenderUTF8_Blended(TTF_Font *font, char *text, SDL_Color fg) cdef SDL_Surface * TTF_RenderUNICODE_Blended(TTF_Font *font, Uint16 *text, SDL_Color fg) # Create a 32-bit ARGB surface and render the given glyph at high quality, # using alpha blending to dither the font with the given color. # The glyph is rendered without any padding or centering in the X # direction, and aligned normally in the Y direction. # This function returns the new surface, or NULL if there was an error. #*/ cdef SDL_Surface * TTF_RenderGlyph_Blended(TTF_Font *font, Uint16 ch, SDL_Color fg) # For compatibility with previous versions, here are the old functions */ #define TTF_RenderText(font, text, fg, bg) \ # TTF_RenderText_Shaded(font, text, fg, bg) #define TTF_RenderUTF8(font, text, fg, bg) \ # TTF_RenderUTF8_Shaded(font, text, fg, bg) #define TTF_RenderUNICODE(font, text, fg, bg) \ # TTF_RenderUNICODE_Shaded(font, text, fg, bg) # Close an opened font file */ cdef void TTF_CloseFont(TTF_Font *font) # De-initialize the TTF engine */ cdef void TTF_Quit() # Check if the TTF engine is initialized */ cdef int TTF_WasInit() # Get the kerning size of two glyphs */ cdef int TTF_GetFontKerningSize(TTF_Font *font, int prev_index, int index)
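As a quick usage sketch (hypothetical module and numbers; the compiler and linker flags needed to build against SDL are not shown), a .pyx file can cimport straight from this SDL.pxd:

# demo.pyx -- hypothetical sketch of using the declarations above
from SDL cimport SDL_Window, SDL_Renderer
from SDL cimport SDL_Init, SDL_Quit, SDL_Delay
from SDL cimport SDL_CreateWindow, SDL_CreateRenderer
from SDL cimport SDL_SetRenderDrawColor, SDL_RenderClear, SDL_RenderPresent

def run():
    cdef SDL_Window *window
    cdef SDL_Renderer *renderer
    SDL_Init(0x00000020)                                                # SDL_INIT_VIDEO (flag not declared in the pxd)
    window = SDL_CreateWindow("demo", 100, 100, 640, 480, 0x00000004)   # SDL_WINDOW_SHOWN
    renderer = SDL_CreateRenderer(window, -1, 0x00000002)               # SDL_RENDERER_ACCELERATED
    SDL_SetRenderDrawColor(renderer, 0, 0, 0, 255)                      # clear to black and present one frame
    SDL_RenderClear(renderer)
    SDL_RenderPresent(renderer)
    SDL_Delay(2000)
    SDL_Quit()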
If you are interested in cross compiling Python 2.7.2 for use on Windows using the Mingw32 cross compiling tools, I suggest you first read my post on how to cross compile it for Android, as much of the process is the same.
Again, these instructions were tested under Ubuntu 11.04 (Natty) 64-bit. The patch that you’ll need is based on the Android patch and several other pieces from around the web (including several proposed patches in the Python tracker).
Instead of repeating the instructions all over again, I’ll just point out the differences in the process. Assuming you’ve already downloaded the Python 2.7.2 source and the patch, created a host version of the Python interpreter and applied the patch for cross compilation, you’ll then need to install the Mingw32 toolset (if you haven’t already):
sudo apt-get install mingw32 mingw32-binutils mingw32-runtime
Prepare the environment:
export ARCH="win32" export CFLAGS="" export CXXFLAGS="" export CC="i586-mingw32msvc-gcc" export CXX="i586-mingw32msvc-g++" export AR="i586-mingw32msvc-ar" export RANLIB="i586-mingw32msvc-ranlib" export STRIP="i586-mingw32msvc-strip --strip-unneeded" export LD="i586-mingw32msvc-ld" export AS="i586-mingw32msvc-as" export NM="i586-mingw32msvc-nm" export DLLTOOL="i586-mingw32msvc-dlltool" export OBJDUMP="i586-mingw32msvc-objdump" export RESCOMP="i586-mingw32msvc-windres" export MAKE="make -k -j4 HOSTPYTHON=[PATH TO HOST PYTHON] HOSTPGEN=[PATH TO HOST PGEN] CROSS_COMPILE=mingw32msvc CROSS_COMPILE_TARGET=yes" export EXTRALIBS="-lstdc++ -lgcc -lodbc32 -lwsock32 -lwinspool -lwinmm -lshell32 -lcomctl32 -lctl3d32 -lodbc32 -ladvapi32 -lopengl32 -lglu32 -lole32 -loleaut32 -luuid"
Then configure the build:
./configure LDFLAGS="-Wl,--no-export-dynamic -static-libgcc -static $EXTRALIBS" CFLAGS="-DMS_WIN32 -DMS_WINDOWS -DHAVE_USABLE_WCHAR_T" CPPFLAGS="-static" LINKFORSHARED=" " LIBOBJS="import_nt.o dl_nt.o getpathp.o" THREADOBJ="Python/thread.o" DYNLOADFILE="dynload_win.o" --disable-shared HOSTPYTHON=[PATH TO HOST PYTHON] HOSTPGEN=[PATH TO HOST PGEN] --host=i586-mingw32msvc --build=i686-pc-linux-gnu --prefix="[WHERE YOU WANT TO PUT THE GENERATED PYTHON STUFF]"
You may have noticed that there are a few static-compilation-related switches in there. I think it’s possible to compile Python without them, but as I’m building a monolithic app with all the modules compiled into the same exe file, I haven’t really tested the shared library part of the process. If you do test with shared libraries enabled, feel free to leave feedback in the comments and I’ll update the article with whatever caveats there might be.
Now comes the hacky stuff…not everything that should be fixed for the cross compilation to succeed is fixed in the patch, so we need to do some manual repairs after the config process is finished.
sed -i "s|\${LIBOBJDIR}fileblocks\$U\.o||g" Makefile # Enable NT Threads sed -i "s|.*NT_THREADS.*|#define NT_THREADS|g" pyconfig.h # Disable PTY stuff that gets activated because of errors in the configure script sed -i "s|.*HAVE_OPENPTY.*|#undef HAVE_OPENPTY|g" pyconfig.h sed -i "s|.*HAVE__GETPTY.*|#undef HAVE__GETPTY|g" pyconfig.h sed -i "s|.*HAVE_DEV_PTMX.*|#undef HAVE_DEV_PTMX|g" pyconfig.h
Finally:
$MAKE
Hopefully you’ll end up with a static Python binary (.exe) and a static library that you can link into other programs. The interactive console seems to work as well (if you copy the generated files to Windows and run python.exe).
So, one day you wake up and decide that you don’t have enough problems in your life and that you’d like to have more. That’s when you go with the obvious choice: You’ll try to cross compile the Python runtime for use in Android.
These instructions and the required patch apply to Python 2.7.2 (at least that’s the version I know works). The patch was adapted by me and is based on the Py4A patch. I also got ideas and guidance from the Pygame For Android project, especially their build scripts; if you are interested in this kind of self-inflicted pain, I suggest you have a look at them. All testing and development was done on Ubuntu Natty 64-bit.
The first thing we have to do is create a host version of the Python runtime, as it is required to cross compile Python. This is easily done by extracting the Python source, and running the usual configure/make/make install.
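In case you want the exact commands, the host build is nothing more exotic than the following (the tarball name and install prefix are placeholders for whatever you use; the interesting outputs are the python binary and Parser/pgen left in the build directory, which become HOSTPYTHON and HOSTPGEN below):

tar xzf Python-2.7.2.tgz
cd Python-2.7.2
./configure --prefix=$HOME/python-host
make
make install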
Once you have that, after extracting the Python source to some other place and applying the patch (patch -p0 < python-2.7.2.android.diff), you need to set up the Android NDK and some environment variables (valid at least for Android NDK r6):
export ANDROID_NDK=[PATH WHERE THE ANDROID NDK IS]
export PATH="$ANDROID_NDK/toolchains/arm-linux-androideabi-4.4.3/prebuilt/linux-x86/bin/:$ANDROID_NDK:$ANDROID_NDK/tools:/usr/local/bin:/usr/bin:/bin"
export ARCH="armeabi"
export CFLAGS="-DANDROID -mandroid -fomit-frame-pointer --sysroot $ANDROID_NDK/platforms/android-5/arch-arm"
export CXXFLAGS="$CFLAGS"
export CC="arm-linux-androideabi-gcc $CFLAGS"
export CXX="arm-linux-androideabi-g++ $CXXFLAGS"
export AR="arm-linux-androideabi-ar"
export RANLIB="arm-linux-androideabi-ranlib"
export STRIP="arm-linux-androideabi-strip --strip-unneeded"
export MAKE="make -j4 HOSTPYTHON=[PATH TO HOST PYTHON] HOSTPGEN=[PATH TO HOST PGEN] CROSS_COMPILE=arm-eabi- CROSS_COMPILE_TARGET=yes"
In the MAKE variable above you have to fill in the full paths to the python and pgen executables that you generated when you compiled Python for the host. I won’t go into the details of compiling Python for the host, as it’s rather simple and there’s plenty of information on the net about it; as I mentioned, it shouldn’t be harder than the configure/make/make install shown above.
I configured the cross compilation with:
./configure LDFLAGS="-Wl,--allow-shlib-undefined" CFLAGS="-mandroid -fomit-frame-pointer --sysroot $ANDROID_NDK/platforms/android-5/arch-arm" HOSTPYTHON=[HOST PYTHON PATH] HOSTPGEN=[HOST PGEN PATH] --host=arm-eabi --build=i686-pc-linux-gnu --enable-shared --prefix="[WHERE YOU WANT TO PUT THE GENERATED PYTHON STUFF]"
After this I had to make a small correction to the generated Makefile.
sed -i "s|^INSTSONAME=\(.*.so\).*|INSTSONAME=\\1|g" Makefile
Now you are ready to compile:
$MAKE
With a little bit of luck that should be it. I promised pain, but it didn’t seem like much trouble, right? Well, that’s because I just gave you my hard-earned patch that does the trick!
Actually using the compiled library is a different matter entirely, but to give you a hint of where to go I suggest that you take a look at my previous article on how to embed and freeze modules and packages in Python, which is what I did. Once you do that, maybe mix it with a bit of SDL (of special interest is the Android project skeleton that they use), and you’ll have a fully working Python environment that you can build apps on!
Stay tuned for instructions on how to accomplish something similar (but with a WAY bigger patch) for cross compiling to Windows via Mingw32
UPDATE: Thanks to Anthony Prieur who let me know of a couple of typos in the instructions (already fixed) and that the patch has an indentation issue in setup.py, which is trivial to fix if you need the file (I don’t have any use for that file in the application I’m developing).
UPDATE2: Patch now supports cross compiling from OS X in addition to Linux.
The SDL library supports the Android OS quite well, and this guide is a great starting point.
I have a couple of points to add to that tutorial. I didn’t actually use the pre-made SDL project, but rather took all the structure from SDL’s Mercurial repository. If you do that, the project will build, but you’ll have problems if you try to open a file via SDL (for example using IMG_Load). SDL seems to have good Asset Manager integration, but there’s a key piece of glue code missing: a static function called getContext that should be in the SDLActivity.java file. This is my fixed SDLActivity.java file, which also includes a workaround to disable OpenGL ES 2.x without recompiling SDL (it’s in the initEGL function).
package org.libsdl.app; import javax.microedition.khronos.egl.EGLConfig; import javax.microedition.khronos.opengles.GL10; import javax.microedition.khronos.egl.*; import android.app.*; import android.content.*; import android.view.*; import android.os.*; import android.util.Log; import android.graphics.*; import android.text.method.*; import android.text.*; import android.media.*; import android.hardware.*; import android.content.*; import java.lang.*; /** SDL Activity */ public class SDLActivity extends Activity { // Main components private static SDLActivity mSingleton; private static SDLSurface mSurface; // Audio private static Thread mAudioThread; private static AudioTrack mAudioTrack; // Load the .so static { System.loadLibrary("SDL"); System.loadLibrary("SDL_image"); System.loadLibrary("mikmod"); System.loadLibrary("SDL_mixer"); System.loadLibrary("SDL_ttf"); System.loadLibrary("main"); } // Setup protected void onCreate(Bundle savedInstanceState) { //Log.v("SDL", "onCreate()"); super.onCreate(savedInstanceState); // So we can call stuff from static callbacks mSingleton = this; // Set up the surface mSurface = new SDLSurface(getApplication()); setContentView(mSurface); SurfaceHolder holder = mSurface.getHolder(); holder.setType(SurfaceHolder.SURFACE_TYPE_GPU); } // Events protected void onPause() { //Log.v("SDL", "onPause()"); super.onPause(); } protected void onResume() { //Log.v("SDL", "onResume()"); super.onResume(); } // Messages from the SDLMain thread static int COMMAND_CHANGE_TITLE = 1; // Handler for the messages Handler commandHandler = new Handler() { public void handleMessage(Message msg) { if (msg.arg1 == COMMAND_CHANGE_TITLE) { setTitle((String)msg.obj); } } }; // Send a message from the SDLMain thread void sendCommand(int command, Object data) { Message msg = commandHandler.obtainMessage(); msg.arg1 = command; msg.obj = data; commandHandler.sendMessage(msg); } // C functions we call public static native void nativeInit(); public static native void nativeQuit(); public static native void onNativeResize(int x, int y, int format); public static native void onNativeKeyDown(int keycode); public static native void onNativeKeyUp(int keycode); public static native void onNativeTouch(int action, float x, float y, float p); public static native void onNativeAccel(float x, float y, float z); public static native void nativeRunAudioThread(); // Java functions called from C public static boolean createGLContext(int majorVersion, int minorVersion) { return mSurface.initEGL(majorVersion, minorVersion); } public static void flipBuffers() { mSurface.flipEGL(); } public static void setActivityTitle(String title) { // Called from SDLMain() thread and can't directly affect the view mSingleton.sendCommand(COMMAND_CHANGE_TITLE, title); } public static Context getContext() { return mSingleton; } // Audio private static Object buf; public static Object audioInit(int sampleRate, boolean is16Bit, boolean isStereo, int desiredFrames) { int channelConfig = isStereo ? AudioFormat.CHANNEL_CONFIGURATION_STEREO : AudioFormat.CHANNEL_CONFIGURATION_MONO; int audioFormat = is16Bit ? AudioFormat.ENCODING_PCM_16BIT : AudioFormat.ENCODING_PCM_8BIT; int frameSize = (isStereo ? 2 : 1) * (is16Bit ? 2 : 1); Log.v("SDL", "SDL audio: wanted " + (isStereo ? "stereo" : "mono") + " " + (is16Bit ? 
"16-bit" : "8-bit") + " " + ((float)sampleRate / 1000f) + "kHz, " + desiredFrames + " frames buffer"); // Let the user pick a larger buffer if they really want -- but ye // gods they probably shouldn't, the minimums are horrifyingly high // latency already desiredFrames = Math.max(desiredFrames, (AudioTrack.getMinBufferSize(sampleRate, channelConfig, audioFormat) + frameSize - 1) / frameSize); mAudioTrack = new AudioTrack(AudioManager.STREAM_MUSIC, sampleRate, channelConfig, audioFormat, desiredFrames * frameSize, AudioTrack.MODE_STREAM); audioStartThread(); Log.v("SDL", "SDL audio: got " + ((mAudioTrack.getChannelCount() >= 2) ? "stereo" : "mono") + " " + ((mAudioTrack.getAudioFormat() == AudioFormat.ENCODING_PCM_16BIT) ? "16-bit" : "8-bit") + " " + ((float)mAudioTrack.getSampleRate() / 1000f) + "kHz, " + desiredFrames + " frames buffer"); if (is16Bit) { buf = new short[desiredFrames * (isStereo ? 2 : 1)]; } else { buf = new byte[desiredFrames * (isStereo ? 2 : 1)]; } return buf; } public static void audioStartThread() { mAudioThread = new Thread(new Runnable() { public void run() { mAudioTrack.play(); nativeRunAudioThread(); } }); // I'd take REALTIME if I could get it! mAudioThread.setPriority(Thread.MAX_PRIORITY); mAudioThread.start(); } public static void audioWriteShortBuffer(short[] buffer) { for (int i = 0; i < buffer.length; ) { int result = mAudioTrack.write(buffer, i, buffer.length - i); if (result > 0) { i += result; } else if (result == 0) { try { Thread.sleep(1); } catch(InterruptedException e) { // Nom nom } } else { Log.w("SDL", "SDL audio: error return from write(short)"); return; } } } public static void audioWriteByteBuffer(byte[] buffer) { for (int i = 0; i < buffer.length; ) { int result = mAudioTrack.write(buffer, i, buffer.length - i); if (result > 0) { i += result; } else if (result == 0) { try { Thread.sleep(1); } catch(InterruptedException e) { // Nom nom } } else { Log.w("SDL", "SDL audio: error return from write(short)"); return; } } } public static void audioQuit() { if (mAudioThread != null) { try { mAudioThread.join(); } catch(Exception e) { Log.v("SDL", "Problem stopping audio thread: " + e); } mAudioThread = null; //Log.v("SDL", "Finished waiting for audio thread"); } if (mAudioTrack != null) { mAudioTrack.stop(); mAudioTrack = null; } } } /** Simple nativeInit() runnable */ class SDLMain implements Runnable { public void run() { // Runs SDL_main() SDLActivity.nativeInit(); //Log.v("SDL", "SDL thread terminated"); } } /** SDLSurface. This is what we draw on, so we need to know when it's created in order to do anything useful. Because of this, that's where we set up the SDL thread */ class SDLSurface extends SurfaceView implements SurfaceHolder.Callback, View.OnKeyListener, View.OnTouchListener, SensorEventListener { // This is what SDL runs in. 
It invokes SDL_main(), eventually private Thread mSDLThread; // EGL private objects private EGLContext mEGLContext; private EGLSurface mEGLSurface; private EGLDisplay mEGLDisplay; // Sensors private static SensorManager mSensorManager; // Startup public SDLSurface(Context context) { super(context); getHolder().addCallback(this); setFocusable(true); setFocusableInTouchMode(true); requestFocus(); setOnKeyListener(this); setOnTouchListener(this); mSensorManager = (SensorManager)context.getSystemService("sensor"); } // Called when we have a valid drawing surface public void surfaceCreated(SurfaceHolder holder) { //Log.v("SDL", "surfaceCreated()"); enableSensor(Sensor.TYPE_ACCELEROMETER, true); } // Called when we lose the surface public void surfaceDestroyed(SurfaceHolder holder) { //Log.v("SDL", "surfaceDestroyed()"); // Send a quit message to the application SDLActivity.nativeQuit(); // Now wait for the SDL thread to quit if (mSDLThread != null) { try { mSDLThread.join(); } catch(Exception e) { Log.v("SDL", "Problem stopping thread: " + e); } mSDLThread = null; //Log.v("SDL", "Finished waiting for SDL thread"); } enableSensor(Sensor.TYPE_ACCELEROMETER, false); } // Called when the surface is resized public void surfaceChanged(SurfaceHolder holder, int format, int width, int height) { //Log.v("SDL", "surfaceChanged()"); int sdlFormat = 0x85151002; // SDL_PIXELFORMAT_RGB565 by default switch (format) { case PixelFormat.A_8: Log.v("SDL", "pixel format A_8"); break; case PixelFormat.LA_88: Log.v("SDL", "pixel format LA_88"); break; case PixelFormat.L_8: Log.v("SDL", "pixel format L_8"); break; case PixelFormat.RGBA_4444: Log.v("SDL", "pixel format RGBA_4444"); sdlFormat = 0x85421002; // SDL_PIXELFORMAT_RGBA4444 break; case PixelFormat.RGBA_5551: Log.v("SDL", "pixel format RGBA_5551"); sdlFormat = 0x85441002; // SDL_PIXELFORMAT_RGBA5551 break; case PixelFormat.RGBA_8888: Log.v("SDL", "pixel format RGBA_8888"); sdlFormat = 0x86462004; // SDL_PIXELFORMAT_RGBA8888 break; case PixelFormat.RGBX_8888: Log.v("SDL", "pixel format RGBX_8888"); sdlFormat = 0x86262004; // SDL_PIXELFORMAT_RGBX8888 break; case PixelFormat.RGB_332: Log.v("SDL", "pixel format RGB_332"); sdlFormat = 0x84110801; // SDL_PIXELFORMAT_RGB332 break; case PixelFormat.RGB_565: Log.v("SDL", "pixel format RGB_565"); sdlFormat = 0x85151002; // SDL_PIXELFORMAT_RGB565 break; case PixelFormat.RGB_888: Log.v("SDL", "pixel format RGB_888"); // Not sure this is right, maybe SDL_PIXELFORMAT_RGB24 instead? sdlFormat = 0x86161804; // SDL_PIXELFORMAT_RGB888 break; default: Log.v("SDL", "pixel format unknown " + format); break; } SDLActivity.onNativeResize(width, height, sdlFormat); // Now start up the C app thread if (mSDLThread == null) { mSDLThread = new Thread(new SDLMain(), "SDLThread"); mSDLThread.start(); } } // unused public void onDraw(Canvas canvas) {} // EGL functions public boolean initEGL(int majorVersion, int minorVersion) { // Temporarily disable OpenGL ES 2 as the SDL backend is buggy if (majorVersion != 1) return false; Log.v("SDL", "Starting up OpenGL ES " + majorVersion + "." 
+ minorVersion); try { EGL10 egl = (EGL10)EGLContext.getEGL(); EGLDisplay dpy = egl.eglGetDisplay(EGL10.EGL_DEFAULT_DISPLAY); int[] version = new int[2]; egl.eglInitialize(dpy, version); int EGL_OPENGL_ES_BIT = 1; int EGL_OPENGL_ES2_BIT = 4; int renderableType = 0; if (majorVersion == 2) { renderableType = EGL_OPENGL_ES2_BIT; } else if (majorVersion == 1) { renderableType = EGL_OPENGL_ES_BIT; } int[] configSpec = { //EGL10.EGL_DEPTH_SIZE, 16, EGL10.EGL_RENDERABLE_TYPE, renderableType, EGL10.EGL_NONE }; EGLConfig[] configs = new EGLConfig[1]; int[] num_config = new int[1]; if (!egl.eglChooseConfig(dpy, configSpec, configs, 1, num_config) || num_config[0] == 0) { Log.e("SDL", "No EGL config available"); return false; } EGLConfig config = configs[0]; EGLContext ctx = egl.eglCreateContext(dpy, config, EGL10.EGL_NO_CONTEXT, null); if (ctx == EGL10.EGL_NO_CONTEXT) { Log.e("SDL", "Couldn't create context"); return false; } EGLSurface surface = egl.eglCreateWindowSurface(dpy, config, this, null); if (surface == EGL10.EGL_NO_SURFACE) { Log.e("SDL", "Couldn't create surface"); return false; } if (!egl.eglMakeCurrent(dpy, surface, surface, ctx)) { Log.e("SDL", "Couldn't make context current"); return false; } mEGLContext = ctx; mEGLDisplay = dpy; mEGLSurface = surface; } catch(Exception e) { Log.v("SDL", e + ""); for (StackTraceElement s : e.getStackTrace()) { Log.v("SDL", s.toString()); } } return true; } // EGL buffer flip public void flipEGL() { try { EGL10 egl = (EGL10)EGLContext.getEGL(); egl.eglWaitNative(EGL10.EGL_NATIVE_RENDERABLE, null); // drawing here egl.eglWaitGL(); egl.eglSwapBuffers(mEGLDisplay, mEGLSurface); } catch(Exception e) { Log.v("SDL", "flipEGL(): " + e); for (StackTraceElement s : e.getStackTrace()) { Log.v("SDL", s.toString()); } } } // Key events public boolean onKey(View v, int keyCode, KeyEvent event) { if (event.getAction() == KeyEvent.ACTION_DOWN) { //Log.v("SDL", "key down: " + keyCode); SDLActivity.onNativeKeyDown(keyCode); return true; } else if (event.getAction() == KeyEvent.ACTION_UP) { //Log.v("SDL", "key up: " + keyCode); SDLActivity.onNativeKeyUp(keyCode); return true; } return false; } // Touch events public boolean onTouch(View v, MotionEvent event) { int action = event.getAction(); float x = event.getX(); float y = event.getY(); float p = event.getPressure(); // TODO: Anything else we need to pass? SDLActivity.onNativeTouch(action, x, y, p); return true; } // Sensor events public void enableSensor(int sensortype, boolean enabled) { // TODO: This uses getDefaultSensor - what if we have >1 accels? if (enabled) { mSensorManager.registerListener(this, mSensorManager.getDefaultSensor(sensortype), SensorManager.SENSOR_DELAY_GAME, null); } else { mSensorManager.unregisterListener(this, mSensorManager.getDefaultSensor(sensortype)); } } public void onAccuracyChanged(Sensor sensor, int accuracy) { // TODO } public void onSensorChanged(SensorEvent event) { if (event.sensor.getType() == Sensor.TYPE_ACCELEROMETER) { SDLActivity.onNativeAccel(event.values[0], event.values[1], event.values[2]); } } }
The other issue I had happened when loading a bunch of files from native code (via the Android NDK). It so happens that there’s a local object reference table that the JNI mechanism relies on, and while it’s properly handled by the Java VM internals, if your native code opens a series of files without returning to the Java VM, you run the risk of going over the 512 local reference limit and getting an exception.
Luckily the solution to this problem is quite simple. In SDL_android.cpp, look for the function: static int Android_JNI_FileOpen(SDL_RWops* ctx)
At the beginning add
mEnv->PushLocalFrame(10);
and before every return expression:
mEnv->PopLocalFrame(NULL);
This will take care of freeing the object references that are no longer in use after the function returns, and then you can load as many images as you need in native code without hitting that limit.
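Schematically, the patched function ends up with this shape (the body here is elided pseudocode; only the placement of the frame calls matters):

static int Android_JNI_FileOpen(SDL_RWops* ctx)
{
    // Reserve a local reference frame for the objects created while opening the asset
    mEnv->PushLocalFrame(10);

    /* ... original body: grab the AssetManager, call its open() method, etc. ... */

    if (0 /* some step failed */) {
        mEnv->PopLocalFrame(NULL);  // drop the local references on the error path
        return -1;
    }

    mEnv->PopLocalFrame(NULL);      // drop the local references on the success path too
    return 0;
}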
My latest adventures with SDL bindings eventually led me to Cython, a very recommendable tool if you are looking to extract a little more juice out of your Python app’s performance, or just hide your source a little more obscurely. It compiles almost any Python code as is, and it includes extensions to the language that allow even faster generated code through statically typed variables and other niceties. Cython offers a way to automatically compile your .py/.pyx modules, and you load those dynamically with the familiar import command; the usage of the imported module is exactly the same as if it were a native .py module or a compiled C module.
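For reference, that normal (dynamically loaded) workflow looks something like this hypothetical setup.py, after which a plain python setup.py build_ext --inplace gives you an importable test module:

# setup.py -- hypothetical example of the regular, dynamically loaded Cython workflow
from distutils.core import setup
from distutils.extension import Extension
from Cython.Distutils import build_ext

setup(
    cmdclass={'build_ext': build_ext},
    ext_modules=[Extension("test", ["test.py"])],
)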
At this point, it’s important to mention that the generated .c files and their compiled versions depend on the Python runtime; you can’t make a standalone executable out of them… at least not easily. Now, let’s suppose you didn’t read the “not easily” part I just mentioned, and that you wanted to integrate this module (or any other module you made in C from scratch) into a statically linked Python interpreter. How would you go about it?
The following instructions were tested under Ubuntu Natty 64-bit. First, start by downloading the Python source. Extract it, copy Modules/Setup.dist to Modules/Setup and run configure with the following parameters:
./configure LDFLAGS="-Wl,--no-export-dynamic -static-libgcc -static" CPPFLAGS="-static -fPIC" LINKFORSHARED=" " DYNLOADFILE="dynload_stub.o" --disable-shared --prefix="/path/to/where/you/want/it/installed"
Followed by the all-too-familiar make && make install
You will see A LOT of errors that you can ¿safely? ignore, mostly related to the fact that the C modules that ship with Python won’t compile in static mode without some help. Once this craziness stops, you’ll have a static Python interpreter (you can check with ldd ./python to see that it’s actually a standalone executable).
Now, this Python interpreter is severely lacking in content, and no one wants to reinvent the wheel, especially such a fine wheel as Python provides… Go to that Modules/Setup file and take a look… search for the #*shared* line, remove it and replace it with *static* (with no # sign)… now look for some notable modules and uncomment them. Run the process again (configure and make) and this time you’ll end up with some builtin modules that you can import.
By now, you are probably catching my drift… let’s suppose you have a module test.py. Run “cython test.py” on it and you’ll get a test.c file… copy it to Modules under the Python source, and edit Modules/Setup adding a line:
test test.c
Do the configure and make dance again, and now you should be able to do “import test” in the new Python interpreter, which will load the module as a builtin. Neat, right?
If you go further down the rabbit hole and start depending on 3rd party libraries (or your own!), you will need to pay attention to how dependencies are specified in Modules/Setup. In short, you put whatever compiler and linker directives you need after the source files for the module.
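For instance, Python’s own Setup.dist contains lines roughly like the first one below, and a hypothetical module of yours would follow the same pattern (module name first, then sources, then any -I/-D/-L/-l flags):

zlib zlibmodule.c -I$(prefix)/include -L$(exec_prefix)/lib -lz
mymodule mymodule.c extra_helper.c -I/path/to/mylib/include -L/path/to/mylib/lib -lmylib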
This is all fine and dandy, but we haven’t broken anything yet… Let’s try something more advanced… imagine you have a full Python package already made (as in a full hierarchy of modules arranged in folders and subfolders, etc.), and you want to do the same Cython-fueled embedding with it… After hitting your head against the wall for a looong while, you’ll figure out that you actually can’t (easily) do it… basically because the Python interpreter’s builtin system is not geared towards packages, but rather towards shallow modules.
So, there are two ways around it (that I know of). The first one is to use a series of shallow modules and string them into a package-like structure by importing the submodules from the parent modules…
# main.py
import submod1 as _submod1
submod1 = _submod1
This is boring, error-prone, and requires a lot of glue code, and it doesn’t play well with your module structure if you want it to also work in non-compiled mode.
The alternative is hacking the Python code just a little bit. Namely, in the Python/import.c file, look for the find_module function and add:
if (is_builtin(fullname)) {
    strcpy(buf, fullname);
    return &fd_builtin;
}
Right above the “if (path != NULL && PyString_Check(path)) {” line near the top of the function seems like a good place. What this does is check whether the full module name (package1.package2.module) is a builtin. The official Python code doesn’t do this; it only checks the bare module name, for the reasons stated above.
Besides this little patch, you have to alter the Cython-generated code just a bit… look for the “Py_InitModule4” line, and replace the module name with the whole package name (if the module is package1.package2.module that line will only say “module”; you need to replace it with the whole enchilada). Doing this by hand is a PITA, but a simple find+sed command takes care of it swiftly. Also, while you are unleashing your sed kung fu, take care of the init??? functions: if a module is at package1.package2.mymodule, replace initmymodule with initpackage1_package2_mymodule (the reason you have to do this will become clear later… or maybe not, and I’m just making this stuff up).
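For illustration, for a module living at package1/package2/mymodule the sed pass could look roughly like this (the module names are placeholders and the exact shape of the generated Py_InitModule4 line differs between Cython versions, so adjust the patterns to whatever your .c files actually contain):

# rename the init function: initmymodule -> initpackage1_package2_mymodule
sed -i 's/initmymodule/initpackage1_package2_mymodule/g' package1/package2/mymodule.c
# fully qualify the name handed to Py_InitModule4: "mymodule" -> "package1.package2.mymodule"
sed -i 's/Py_InitModule4(\(.*\)"mymodule"/Py_InitModule4(\1"package1.package2.mymodule"/' package1/package2/mymodule.c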
Now, you have to go back to Modules/Setup and edit the module line you added by appending all your sources (seems like a good job for a Python script, right?). If you run configure and make at this point you’ll see that… it doesn’t quite work. Why? Because Python depends on a __path__ variable to figure out which module is a package and which one is just a module. Yes, you need to add those…
This is simple enough: in every package’s __init__.py file, add a __path__ = ['package1/package2/…',] line with the right path for the location of the file.
And finally, you are ready… well, not yet. There are two more things you need to do. First, as the Python build system is geared towards shallow packages, you’ll have a problem if files in different subpackages have the same name, as they’ll end up overwriting each other when they are compiled (this will certainly happen for the __init__.py files), so you have to figure out a way to flatten your structure before adding them to Modules/Setup. What I do is scan the whole structure and copy the *.c files to a separate folder, replacing the ‘/’ directory separator with a ‘+’ sign; this way package1/package2/module.c becomes package1+package2+module.c (a rough script for this step is sketched further below). Then add all these files to the same line in Modules/Setup, and then comes the final piece of glue:
If your overall package is called…let’s say “test” to be creative, create a test.c file with something like this:
#include "Python.h" static PyMethodDef nomethods[] = { {NULL, NULL}}; extern void inittest_module1(); extern void inittest_package1(); extern void inittest_package1_submodule(); PyMODINIT_FUNC inittest(){ PyObject* module; PyObject* __path__; // Add a __path__ attribute so Python knows that this is a package PyObject* package_gilbert = PyImport_AddModule("test"); Py_InitModule("test", nomethods); __path__ = PyList_New(1); PyList_SetItem(__path__, 0, PyString_FromString("test")); PyModule_AddObject(package_test, "__path__", __path__); PyImport_AppendInittab("test.package1", inittest_package1); PyImport_AppendInittab("test.package1.submodule", inittest_package1_submodule); }
Append this file also to the Modules/Setup line. What this file does is create a “test” package, set up its __path__ variable accordingly, and append all of our modules to Python’s internal builtin table. Now the reason for renaming the init functions earlier should become clear (just nod even if you got lost at “Statically linking Python”…).
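Going back to the flattening step mentioned earlier, here’s a rough sketch of how it can be automated (the package and folder names are placeholders, and the printed fragment still needs to be merged into the module’s line in Modules/Setup by hand):

# flatten.py -- hypothetical helper: copies package1/package2/module.c to flat/package1+package2+module.c
import os
import shutil

PACKAGES = ["package1"]   # top-level package directories to flatten (placeholders)
DST_DIR = "flat"          # where the flattened .c files end up (placeholder)

if not os.path.isdir(DST_DIR):
    os.makedirs(DST_DIR)

flat_names = []
for pkg in PACKAGES:
    for dirpath, dirnames, filenames in os.walk(pkg):
        for filename in filenames:
            if filename.endswith(".c"):
                src = os.path.join(dirpath, filename)
                flat_name = src.replace(os.sep, "+")   # package1/package2/module.c -> package1+package2+module.c
                shutil.copy(src, os.path.join(DST_DIR, flat_name))
                flat_names.append(flat_name)

# print the source list to paste onto the module's line in Modules/Setup
print " ".join(flat_names)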
Finally, run configure and make for the last time and your builtin package should be there… or not; there are literally a hundred places where things can go wrong, and the online documentation on the subject is quite sparse, which is why I’m leaving this here for those brave souls who wish to try it. If something (or everything) in the process is not clear enough, let me know in the comments, and good luck! (You’ll definitely need it.)
So, I got one of those wonderful Asus Transformers… I had to see if I could make something for it. And so I did! Having some Android experience from the 1.x days, I started looking around for a quick way to get something done, and it was then that I found the amazing AndEngine and its Live Wallpaper Extension. While the documentation is sparse (especially when it comes to the newer changes in the engine), there’s a big enough community that if you search around the forums and the net you will find what you are looking for. Getting it all in place turned out to be quite simple once I figured out what changes were needed to adapt the provided example to the latest modifications to the engine.
All in all, it’s pretty simple:
(LiveWallpaperService.java)
package org.anddev.wallpaper.live.donprimerizowp;

import java.io.File;
import java.util.Random;

import net.rbgrn.opengl.GLWallpaperService.GLEngine;

import org.anddev.andengine.engine.camera.Camera;
import org.anddev.andengine.engine.handler.timer.ITimerCallback;
import org.anddev.andengine.engine.handler.timer.TimerHandler;
import org.anddev.andengine.engine.options.EngineOptions;
import org.anddev.andengine.engine.options.EngineOptions.ScreenOrientation;
import org.anddev.andengine.engine.options.resolutionpolicy.FillResolutionPolicy;
import org.anddev.andengine.entity.particle.ParticleSystem;
import org.anddev.andengine.entity.particle.modifier.AlphaModifier;
import org.anddev.andengine.entity.particle.modifier.ExpireModifier;
import org.anddev.andengine.entity.scene.Scene;
import org.anddev.andengine.entity.sprite.AnimatedSprite.IAnimationListener;
import org.anddev.andengine.entity.sprite.Sprite;
import org.anddev.andengine.entity.sprite.AnimatedSprite;
import org.anddev.andengine.extension.ui.livewallpaper.BaseLiveWallpaperService;
import org.anddev.andengine.opengl.texture.Texture;
import org.anddev.andengine.opengl.texture.TextureOptions;
import org.anddev.andengine.opengl.texture.atlas.bitmap.BitmapTextureAtlas;
import org.anddev.andengine.opengl.texture.atlas.bitmap.BitmapTextureAtlasTextureRegionFactory;
import org.anddev.andengine.opengl.texture.region.TextureRegion;
import org.anddev.andengine.opengl.texture.region.TextureRegionFactory;
import org.anddev.andengine.opengl.texture.region.TiledTextureRegion;
import org.anddev.andengine.opengl.view.GLSurfaceView.Renderer;
import org.anddev.andengine.opengl.view.RenderSurfaceView;
import org.anddev.andengine.sensor.accelerometer.AccelerometerData;
import org.anddev.andengine.sensor.accelerometer.IAccelerometerListener;
import org.anddev.andengine.sensor.orientation.IOrientationListener;
import org.anddev.andengine.sensor.orientation.OrientationSensorOptions;

import android.app.WallpaperManager;
import android.content.res.Configuration;
import android.os.Bundle;

public class LiveWallpaperService extends BaseLiveWallpaperService implements IAccelerometerListener, IOffsetsChanged {

    protected class MyBaseWallpaperGLEngine extends GLEngine {
        // ===========================================================
        // Fields
        // ===========================================================

        private Renderer mRenderer;
        private IOffsetsChanged mOffsetsChangedListener = null;

        // ===========================================================
        // Constructors
        // ===========================================================

        public MyBaseWallpaperGLEngine(IOffsetsChanged pOffsetsChangedListener) {
            this.setEGLConfigChooser(false);
            this.mRenderer = new RenderSurfaceView.Renderer(LiveWallpaperService.this.mEngine);
            this.setRenderer(this.mRenderer);
            this.setRenderMode(RENDERMODE_CONTINUOUSLY);
            this.mOffsetsChangedListener = pOffsetsChangedListener;
        }

        // ===========================================================
        // Methods for/from SuperClass/Interfaces
        // ===========================================================

        @Override
        public Bundle onCommand(final String pAction, final int pX, final int pY, final int pZ, final Bundle pExtras, final boolean pResultRequested) {
            if (pAction.equals(WallpaperManager.COMMAND_TAP)) {
                LiveWallpaperService.this.onTap(pX, pY);
            } else if (pAction.equals(WallpaperManager.COMMAND_DROP)) {
                LiveWallpaperService.this.onDrop(pX, pY);
            }
            return super.onCommand(pAction, pX, pY, pZ, pExtras, pResultRequested);
        }

        @Override
        public void onResume() {
            super.onResume();
            LiveWallpaperService.this.getEngine().onResume();
            LiveWallpaperService.this.onResume();
        }

        @Override
        public void onPause() {
            super.onPause();
            LiveWallpaperService.this.getEngine().onPause();
            LiveWallpaperService.this.onPause();
        }

        @Override
        public void onDestroy() {
            super.onDestroy();
            if (this.mRenderer != null) {
                // mRenderer.release();
            }
            this.mRenderer = null;
        }

        @Override
        public void onOffsetsChanged(float xOffset, float yOffset, float xOffsetStep, float yOffsetStep, int xPixelOffset, int yPixelOffset) {
            // TODO Auto-generated method stub
            super.onOffsetsChanged(xOffset, yOffset, xOffsetStep, yOffsetStep, xPixelOffset, yPixelOffset);
            if (this.mOffsetsChangedListener != null)
                this.mOffsetsChangedListener.offsetsChanged(xOffset, yOffset, xOffsetStep, yOffsetStep, xPixelOffset, yPixelOffset);
        }
    }

    // ===========================================================
    // Constants
    // ===========================================================

    private static final int CAMERA_WIDTH = 1280;
    private static final int CAMERA_HEIGHT = 800;

    // ===========================================================
    // Fields
    // ===========================================================

    private BitmapTextureAtlas mTexture;
    private BitmapTextureAtlas mTexture2;
    private BitmapTextureAtlas mTexture3;
    private BitmapTextureAtlas mTexture4;
    private BitmapTextureAtlas mTexture5;
    private TextureRegion mPulperia;
    private TextureRegion mCarreta;
    private TiledTextureRegion mPibito;
    private TiledTextureRegion mMosca;
    private TiledTextureRegion mDPSeq1;
    private TiledTextureRegion mDPSeq2;
    private Sprite mPulperiaSprite;
    private AnimatedSprite mPibitoSprite;
    private Sprite mCarretaSprite;
    private AnimatedSprite mMoscaSprite;
    private AnimatedSprite mDP1Sprite;
    private AnimatedSprite mDP2Sprite;
    private ScreenOrientation mScreenOrientation;
    private Camera mCamera;
    private Scene mScene;
    private IAnimationListener mDP1ListenerF, mDP1ListenerB, mDP2Listener;

    // ===========================================================
    // Constructors
    // ===========================================================

    // ===========================================================
    // Getter & Setter
    // ===========================================================

    // ===========================================================
    // Methods for/from SuperClass/Interfaces
    // ===========================================================

    @Override
    public org.anddev.andengine.engine.Engine onLoadEngine() {
        mCamera = new Camera(0, 0, CAMERA_WIDTH, CAMERA_HEIGHT);
        return new org.anddev.andengine.engine.Engine(new EngineOptions(true, this.mScreenOrientation, new FillResolutionPolicy(), mCamera));
    }

    @Override
    public void onLoadResources() {
        this.getEngine().disableOrientationSensor(this);

        this.mTexture = new BitmapTextureAtlas(2048, 2048, TextureOptions.BILINEAR);
        this.mTexture2 = new BitmapTextureAtlas(2048, 2048, TextureOptions.BILINEAR);
        this.mTexture3 = new BitmapTextureAtlas(2048, 2048, TextureOptions.BILINEAR);
        this.mTexture4 = new BitmapTextureAtlas(1024, 1024, TextureOptions.BILINEAR);
        this.mTexture5 = new BitmapTextureAtlas(1024, 2048, TextureOptions.BILINEAR);

        /* Creates the needed texture-regions on the texture. */
        this.mPulperia = BitmapTextureAtlasTextureRegionFactory.createFromAsset(this.mTexture, this, "gfx/pulperia.png", 0, 0); // 1280x800
        this.mCarreta = BitmapTextureAtlasTextureRegionFactory.createFromAsset(this.mTexture, this, "gfx/carreta.png", 0, 801); // 263x386
        this.mPibito = BitmapTextureAtlasTextureRegionFactory.createTiledFromAsset(this.mTexture2, this, "gfx/pibito.png", 0, 0, 6, 6); // 2048x1980
        this.mMosca = BitmapTextureAtlasTextureRegionFactory.createTiledFromAsset(this.mTexture3, this, "gfx/mosca.png", 0, 0, 11, 5); // 1980x1070
        this.mDPSeq1 = BitmapTextureAtlasTextureRegionFactory.createTiledFromAsset(this.mTexture4, this, "gfx/dpseq1.png", 0, 0, 7, 4); // 1022x1000
        this.mDPSeq2 = BitmapTextureAtlasTextureRegionFactory.createTiledFromAsset(this.mTexture5, this, "gfx/dpseq2.png", 0, 0, 10, 7); // 990x1393

        this.getEngine().getTextureManager().loadTexture(this.mTexture);
        this.getEngine().getTextureManager().loadTexture(this.mTexture2);
        this.getEngine().getTextureManager().loadTexture(this.mTexture3);
        this.getEngine().getTextureManager().loadTexture(this.mTexture4);
        this.getEngine().getTextureManager().loadTexture(this.mTexture5);

        this.enableAccelerometerSensor(this);
    }

    @Override
    public Scene onLoadScene() {
        mScene = new Scene();

        mPulperiaSprite = new Sprite(0, 0, this.mPulperia);
        mScene.attachChild(mPulperiaSprite);

        mPibitoSprite = new AnimatedSprite(88, 200, this.mPibito);
        mPibitoSprite.setScale((float) 1.6);

        mMoscaSprite = new AnimatedSprite(700, 550, this.mMosca);
        mDP1Sprite = new AnimatedSprite(530, 260, this.mDPSeq1);
        mDP2Sprite = new AnimatedSprite(530, 310, this.mDPSeq2);

        // These three listeners chain the Don Primerizo animation in a loop:
        // sequence 1 forward, then sequence 1 backwards, then sequence 2, then back to the start.
        mDP1ListenerF = new IAnimationListener() {
            @Override
            public void onAnimationEnd(final AnimatedSprite pAnimatedSprite) {
                runOnUpdateThread(new Runnable() {
                    @Override
                    public void run() {
                        // Stop the animation, play it backwards
                        mDP1Sprite.stopAnimation();
                        mDP1Sprite.animate(
                                new long[] {100, 100, 100, 100, 100, 100, 100,
                                            100, 100, 100, 100, 100, 100, 100,
                                            100, 100, 100, 100, 100, 100, 100,
                                            100, 100, 100, 100, 100, 100, 100},
                                new int[] {27, 26, 25, 24, 23, 22, 21, 20, 19, 18, 17, 16, 15, 14,
                                           13, 12, 11, 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0},
                                0, mDP1ListenerB);
                    }
                });
            }
        };

        mDP1ListenerB = new IAnimationListener() {
            @Override
            public void onAnimationEnd(final AnimatedSprite pAnimatedSprite) {
                runOnUpdateThread(new Runnable() {
                    @Override
                    public void run() {
                        mScene.detachChild(mDP1Sprite);
                        mDP1Sprite.stopAnimation();
                        mDP2Sprite.animate(100, false, mDP2Listener);
                        mScene.attachChild(mDP2Sprite);
                    }
                });
            }
        };

        mDP2Listener = new IAnimationListener() {
            @Override
            public void onAnimationEnd(final AnimatedSprite pAnimatedSprite) {
                runOnUpdateThread(new Runnable() {
                    @Override
                    public void run() {
                        mScene.detachChild(mDP2Sprite);
                        mDP2Sprite.stopAnimation();
                        mDP1Sprite.animate(100, false, mDP1ListenerF);
                        mScene.attachChild(mDP1Sprite);
                    }
                });
            }
        };

        mDP1Sprite.animate(100, false, mDP1ListenerF);
        mScene.attachChild(mDP1Sprite);

        mCarretaSprite = new Sprite(-30, 414, this.mCarreta);
        mScene.attachChild(mCarretaSprite);

        //this.mVelocityInitializer = new VelocityInitializer(-20, 20, -100, -120);
        // this.getEngine().registerPreFrameHandler(new FPSCounter());

        // Every 10 seconds, roll a die and maybe play the fly (mosca) animation.
        mScene.registerUpdateHandler(new TimerHandler(10f, true, new ITimerCallback() {
            Random randomSrc = new Random();

            @Override
            public void onTimePassed(final TimerHandler pTimerHandler) {
                try {
                    int x = randomSrc.nextInt(10);
                    //Pibito runs 3 out of 10 times
                    if (x > 5 && !mMoscaSprite.isAnimationRunning()) {
                        mScene.attachChild(mMoscaSprite);
                        mMoscaSprite.animate(100, false, new IAnimationListener() {
                            @Override
                            public void onAnimationEnd(final AnimatedSprite pAnimatedSprite) {
                                runOnUpdateThread(new Runnable() {
                                    @Override
                                    public void run() {
                                        mScene.detachChild(mMoscaSprite);
                                    }
                                });
                            }
                        });
                    }
                } catch (Exception e) {}
            }
        }));

        return mScene;
    }

    @Override
    public void onLoadComplete() {
    }

    @Override
    public Engine onCreateEngine() {
        // TODO Auto-generated method stub
        return new MyBaseWallpaperGLEngine(this);
    }

    @Override
    public void offsetsChanged(float xOffset, float yOffset, float xOffsetStep, float yOffsetStep, int xPixelOffset, int yPixelOffset) {
        /* if(mCamera != null){
            // Emulator has 3 screens
            mCamera.setCenter( ((960 * xOffset ) - 240) , mCamera.getCenterY() );
            / *formel mCamera.setCenter(( (Camera-WIDTH * (screensCount-1)) * xOffset ) - (Camera-WIDTH / 2) ,mCamera.getCenterY() ); * /
        }*/
        mCarretaSprite.setPosition(-xOffset * 80, 414);
        mMoscaSprite.setPosition(-xOffset * 300 + 700, 550);
    }

    @Override
    public void onAccelerometerChanged(final AccelerometerData pAccelerometerData) {
        /* final float minVelocityX = (pAccelerometerData.getX() + 2) * 5;
        final float maxVelocityX = (pAccelerometerData.getX() - 2) * 5;
        final float minVelocityY = (pAccelerometerData.getY() - 8) * 10;
        final float maxVelocityY = (pAccelerometerData.getY() - 10) * 10;
        this.mVelocityInitializer.setVelocity(minVelocityX, maxVelocityX, minVelocityY, maxVelocityY);*/
    }

    @Override
    public void onUnloadResources() {
        // TODO Auto-generated method stub
    }

    @Override
    public void onPauseGame() {
        super.onPause();
        LiveWallpaperService.this.getEngine().onPause();
        LiveWallpaperService.this.onPause();
    }

    @Override
    public void onResumeGame() {
        super.onResume();
        LiveWallpaperService.this.getEngine().onResume();
        LiveWallpaperService.this.onResume();
    }

    @Override
    public void onConfigurationChanged(Configuration newConfig) {
        if (newConfig.orientation == Configuration.ORIENTATION_PORTRAIT) {
            mScene.setScaleX(1280.0f / 800.0f);
            mScene.setScaleY(1.0f);
        } else if (newConfig.orientation == Configuration.ORIENTATION_LANDSCAPE) {
            mScene.setScale(1);
        }
    }

    // ===========================================================
    // Methods
    // ===========================================================

    // ===========================================================
    // Inner and Anonymous Classes
    // ===========================================================
}
(IOffsetsChanged.java)
package org.anddev.wallpaper.live.donprimerizowp;

public interface IOffsetsChanged {
    public void offsetsChanged(float xOffset, float yOffset, float xOffsetStep,
            float yOffsetStep, int xPixelOffset, int yPixelOffset);
}
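If you want to reuse the interface outside this project, here is a minimal, hypothetical sketch of how a listener can drive a parallax effect: xOffset goes from 0.0 on the leftmost home screen to 1.0 on the rightmost, so multiplying it by a negative factor slides a sprite against the paging direction, which is the same trick offsetsChanged() plays with mCarretaSprite and mMoscaSprite above. The ParallaxOffsetsListener class and its constructor parameters are illustrative only; they are not part of the project code.

package org.anddev.wallpaper.live.donprimerizowp;

import org.anddev.andengine.entity.sprite.Sprite;

// Hypothetical listener: scrolls a single sprite slower than the home-screen
// pages, producing a simple parallax effect.
public class ParallaxOffsetsListener implements IOffsetsChanged {
    private final Sprite mSprite;
    private final float mFactor; // total horizontal travel across all home screens, in pixels
    private final float mBaseX;
    private final float mBaseY;

    public ParallaxOffsetsListener(final Sprite pSprite, final float pFactor,
            final float pBaseX, final float pBaseY) {
        this.mSprite = pSprite;
        this.mFactor = pFactor;
        this.mBaseX = pBaseX;
        this.mBaseY = pBaseY;
    }

    @Override
    public void offsetsChanged(final float xOffset, final float yOffset,
            final float xOffsetStep, final float yOffsetStep,
            final int xPixelOffset, final int yPixelOffset) {
        // Shift the sprite opposite to the paging direction.
        this.mSprite.setPosition(this.mBaseX - xOffset * this.mFactor, this.mBaseY);
    }
}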
I’m making the full project available here. The live wallpaper itself is available from the Android Market.
The license for the code is “do as you please”; for the art it’s “All rights reserved under Creative Commons License”. All the art comes from the defunct (oh really? I hear him breathing!) “Epopeya de Don Primerizo Lata” and was created by the great Leonardo Falaschini; you should check out his stuff at liondart.com.