Monday, 6 July 2009

Android Audio?

I've been investigating Android recently, and have downloaded the SDK and had a play with it.

In summary, I'm quite staggered at how poor the support for audio is in the Android SDK.

You can find out more here.

The fundamental problem is that there is no way for a program to generate its own stream of audio. All audio resources must come from a URI of some sort, from data bundled into the application, or from a file in the file system.

Of course, some bright spark out there might tell me that you can easily create your own "in-app" audio server with Java that you can access through a URI from that self-same application, and that can generate an audio stream on demand; I'd love to hear if this is the case!

On another note, the audio data capture system only allows you to send data to a file; there is no way for your application to get access to the audio as it arrives. So whichever way you look at it, there is no way to create audio FX boxes for Android.

This all reminds me very much of the JSR-135 multimedia APIs for Java, which were completely useless from the perspective of a low-level audio programmer.

I'm pleased to report however that if you want to create interesting audio apps for Mobile, then your needs are well served both by the iPhone/iPod SDK, and the Windows Mobile SDK. Shame about Android though!


Daniel Webb said...


Did you look at the native part of the SDK (the NDK)? It's a shame if there's no audio goodness there, as a stated goal for the NDK is signal processing.

Daniel Webb
(btw the link to find out more goes to a wikipedia article on test automation - I read it, it was interesting, but I didn't find out more about Android audio...)

Pete Cole said...

Ah, sorry, link now fixed! :)

Yes, I've looked at the NDK and am pretty disappointed with it. I hope to post on that shortly!

Best wishes,


Pete Cole said...

I now realise that I've already posted on the NDK - a couple of days back. :)

The way the NDK is supposed to help with DSP is that you can write your algorithms in C/C++, so they have the potential to be far more efficient than if implemented in Java.

However, you can't actually apply your fancy DSP code to any real-time audio streams; unless, that is, there were some way to implement a server within the application that could deliver a stream via a URI back to that same application (a long shot, I know!).


Jay Foad said...

You'll be wanting an AudioTrack in streaming mode.
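For anyone finding this later, here is a minimal sketch of the idea. The AudioTrack calls themselves only exist on Android, so they appear as comments; the buffer-filling code is plain Java, and the class name and parameter values are illustrative rather than taken from any real app:

```java
// Sketch: generate a PCM buffer of the kind you would feed to
// AudioTrack.write() in MODE_STREAM. The Android-only calls are
// shown in comments; everything else is plain Java.
public class ToneBuffer {
    // Fill a 16-bit PCM buffer with a sine tone at the given frequency.
    static short[] sineTone(double freqHz, int sampleRate, int numSamples) {
        short[] buf = new short[numSamples];
        for (int i = 0; i < numSamples; i++) {
            double t = (double) i / sampleRate;
            buf[i] = (short) (Math.sin(2 * Math.PI * freqHz * t) * Short.MAX_VALUE);
        }
        return buf;
    }

    public static void main(String[] args) {
        short[] buf = sineTone(440.0, 44100, 44100); // one second of A440
        // On Android you would then do something like:
        //   AudioTrack track = new AudioTrack(AudioManager.STREAM_MUSIC, 44100,
        //       AudioFormat.CHANNEL_OUT_MONO, AudioFormat.ENCODING_PCM_16BIT,
        //       bufferSize, AudioTrack.MODE_STREAM);
        //   track.play();
        //   track.write(buf, 0, buf.length);  // call repeatedly to keep streaming
        System.out.println(buf.length);
    }
}
```

The key point is that in streaming mode your code computes each buffer itself, so the app finally gets to generate its own audio rather than pointing at a URI or file.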

Pete Cole said...

Hi Jay,

Aha - many thanks for spotting that, and for letting me know! :)


Painter said...

Android Audio would work better in the NDK if Android Bug 3434 could be solved.

TiGeR said...

So, any news about the possibility of streaming on Android?

Pete Cole said...


The approach to use is the one Jay suggested: the AudioTrack API in streaming mode.

Hoping that helps!

Best wishes,


TiGeR said...

So, no native code at all? Just try to make the best use of the AudioTrack class?

Pete Cole said...

Yes; one of Android's primary limitations is that all APIs are Java-based. If you want to do anything with audio, you have to use the high-latency Java audio APIs.

Of course, the core of your audio processing pipeline can be done in C++, with only the adaptors talking to Java.
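As an illustration of that split, here is a sketch of a DSP stage of the kind you would push down into C/C++. The inner loop is stubbed in pure Java so it runs anywhere; in a real app the loop body would be a native function called through JNI, and the class and method names here are made up for the example:

```java
// Sketch: a gain stage as it might look on the Java side of the
// adaptor boundary. In a real app the loop body would live in a
// C/C++ function reached via a native method and System.loadLibrary().
public class GainStage {
    // Apply a gain to a 16-bit PCM buffer in place, clipping to range.
    static void applyGain(short[] pcm, double gain) {
        for (int i = 0; i < pcm.length; i++) {
            int v = (int) Math.round(pcm[i] * gain);
            if (v > Short.MAX_VALUE) v = Short.MAX_VALUE;
            if (v < Short.MIN_VALUE) v = Short.MIN_VALUE;
            pcm[i] = (short) v;
        }
    }

    public static void main(String[] args) {
        short[] pcm = {1000, -1000, 30000};
        applyGain(pcm, 2.0);
        // prints "2000 -2000 32767" (the last sample clips)
        System.out.println(pcm[0] + " " + pcm[1] + " " + pcm[2]);
    }
}
```

The adaptor's job is just to marshal buffers across the boundary; all the per-sample arithmetic stays in the native core.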

The real problem then comes with the poor integration between Java and C++ on Android. For example, debugging your JNI code (in C/C++) with GDB is something I have not yet been able to get working; none of the leads I've followed up on the net seem to lead anywhere! This makes debugging real-time audio code very difficult :)

IMO, the development environment really needs far better integration between C++ and Java, with integrated debugger support, proper dependency checking, and so on.

Until then, if you want to use a lot of C/C++ code in your Android apps, you might instead want to consider alternative approaches such as the AirPlay SDK or Antix Labs.