
Obtaining Android device latency times

kapsy Posts: 30
edited February 2013 in Pd Everywhere

Does anyone know of an easy way of obtaining Android device latency times? I'm wondering whether there's a method on Android (or an object in Pd) that could accurately return the device's playback latency.

I'm building a graphic/sound app and using a delay on the graphics to sync with the sound coming from Pd. At the moment I'm just guessing the value and hard-coding it in a ScheduledExecutorService. I was thinking of firing a one-sample impulse from Pd, sending a trigger on the Pd side when that sound is played back, and having Android measure the difference - but I'm sure there must be an easier and more accurate way.
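For what it's worth, the "hard-coded delay" part of this can be sketched in plain Java. This is only an illustration of the workaround described above, not a real measurement: the class name, the `runDelayed` helper, and the 490ms figure (taken from a later post in this thread) are all just for the example.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class GraphicsSyncSketch {
    // Schedule a "draw" task delayMs after the audio is queued, and return
    // how long the scheduler actually waited (in milliseconds).
    static long runDelayed(long delayMs) throws Exception {
        ScheduledExecutorService scheduler = Executors.newSingleThreadScheduledExecutor();
        long start = System.nanoTime();
        // In the real app this task would trigger the graphics update that
        // matches the sound sent to Pd at `start`.
        long elapsed = scheduler.schedule(
                () -> (System.nanoTime() - start) / 1_000_000,
                delayMs, TimeUnit.MILLISECONDS).get();
        scheduler.shutdown();
        return elapsed;
    }

    public static void main(String[] args) throws Exception {
        // 490ms is the hand-tuned latency guess mentioned later in the thread.
        System.out.println("waited ~" + runDelayed(490) + "ms before drawing");
    }
}
```

The scheduler only guarantees the task fires no *earlier* than the delay, which is another reason a hard-coded value can only ever be approximate.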

Would be great if there were a catch-all way of obtaining latency so the graphics stay in sync on both high- and low-latency devices - keen to hear any ideas.


  • Hi,
    I have the same issue. My app receives data via Bluetooth and generates sound via libpd, and I have to measure the latency between those two actions.

    Your solution doesn't include the delay between writing the sound buffer and the sound actually coming out of the speaker, which is specific to each device and Android version - or am I wrong?

  • kapsy Posts: 30
    edited February 2013

    Hi heavylolo - no solution as of yet - I was just thinking out loud. My idea was to try to handle it almost 100% on the Pd side and let Android measure the time difference - I'm not even sure whether that's possible yet.

    There is a latency tester here:

    But I think it uses a round-trip playback-record method, and it's slow. I'm after something fast and invisible to the user - for playback only.

    There is also the AudioTrack.getMinBufferSize() method - I have yet to try it but am keen to see how it goes.

    Edit: AudioParameters.suggestOutputBufferSize(sampleRate) might also have what we're looking for - I don't want to get excited until I've tested it and the results are accurate (on different devices too).

  • kapsy Posts: 30
    edited February 2013

    Update: OK, gave the following a go, requesting a sample rate of 11025Hz:

    float buffersize_at = (float)AudioTrack.getMinBufferSize(11025, AudioFormat.CHANNEL_OUT_STEREO, AudioFormat.ENCODING_PCM_16BIT);

    Which returns:

    7680 on Apad 7
    and 3168 on the Galaxy S

    These values are in bytes, so I used the following to convert to milliseconds:

    float buffersizemillis = buffersize_at * (1000F/(11025F * 2F * 2F));

    Which gives 174.15ms for the Apad and 71.83ms for the Galaxy S.
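The conversion above can be packaged as a small standalone function (the class and method names here are just for illustration). The divisor is sample rate × channels × bytes per sample - 11025 × 2 × 2 for 16-bit stereo at 11025Hz:

```java
public class BufferLatency {
    // Convert an AudioTrack buffer size in bytes to milliseconds.
    // bytesPerFrame = channels * bytesPerSample (2 * 2 for 16-bit stereo).
    static float bufferBytesToMillis(int bytes, int sampleRate,
                                     int channels, int bytesPerSample) {
        return bytes * 1000f / (sampleRate * channels * bytesPerSample);
    }

    public static void main(String[] args) {
        // The values reported above: 7680 bytes on the Apad 7 and 3168 bytes
        // on the Galaxy S, both 16-bit stereo at 11025Hz.
        System.out.printf("Apad 7:   %.2fms%n",
                bufferBytesToMillis(7680, 11025, 2, 2)); // ~174.1ms
        System.out.printf("Galaxy S: %.2fms%n",
                bufferBytesToMillis(3168, 11025, 2, 2)); // ~71.8ms
    }
}
```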

    The value I'm entering manually into Executors.newSingleThreadScheduledExecutor() to delay the graphics into sync is 490ms. So that's a huge difference. Accepted, there might be some lag drawing the graphics - but at most that should be about 50ms.

    Of course we could scale the value AudioTrack gives us up (say x3), BUT the 490ms value I've guessed works roughly equally well for both devices, i.e. the Apad's latency is without a doubt NOT more than twice that of the Galaxy.

    ^ Update - WRONG - my bad - see post below!

    So, back to searching for another method - AudioTrack.getMinBufferSize() is not what I'm after as far as I can tell.

  • pbrinkmann Posts: 686 ✭✭

    I'm afraid this approach won't work, for several reasons. Most importantly, the buffer size only gives you a lower bound on the latency; it doesn't tell you how much time will pass before a newly enqueued buffer actually comes out as sound. Moreover, the buffer size reported by AudioTrack is not necessarily the same as the one that libpd uses. AudioParameters.suggestOutputBufferSize(sampleRate) will tell you what libpd is using, but as I said, that doesn't account for the latency of the overall audio stack.

  • kapsy Posts: 30

    Thanks - so getMinBufferSize() is more of a guideline than an accurate measure? I'm also guessing that latency changes depending on what's going on in the background. The more I dig into this, the more complex it seems, with no real easy catch-all solution. I've even heard reports of people getting different latency times every time they reboot their device.

    Having said that, I have to eat my words a bit - I'd upgraded to Cyanogen 4.2 on the Galaxy S. When I had 4.0.4 installed, the lag WAS roughly the same as the Apad, but after swearing I was right in an ever so public fashion, I just found on closer inspection that running 4.2 the latency DOES seem to be about half (just guessing using eyes/ears) - which correlates with what AudioTrack was giving me. My bad...

    So while it may not be 100% accurate, at least it gives a value that can be factored up/down based on the graphics - hopefully this is useful for heavylolo too.

    With suggestOutputBufferSize() - the value is returned in frames - is there a way to find out how many samples/ms that is? I read somewhere 64 - but want to be sure...

  • pbrinkmann Posts: 686 ✭✭

    The number of frames is what you want; counting frames instead of samples eliminates ambiguities due to channel counts. The number of frames in a buffer is the same as the number of samples for mono buffers.
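Since one frame is one sample per channel, a buffer size in frames converts to milliseconds using only the sample rate - the channel count drops out entirely, which is the ambiguity-elimination mentioned above. A minimal sketch (the class name and the 1024/44100 figures are just illustrative, not values from this thread):

```java
public class FrameLatency {
    // One frame = one sample per channel, so a buffer's duration in
    // milliseconds depends only on the sample rate, not the channel count.
    static float framesToMillis(int frames, int sampleRate) {
        return frames * 1000f / sampleRate;
    }

    public static void main(String[] args) {
        // Hypothetical example: a 1024-frame buffer at 44100Hz.
        System.out.printf("%.1fms%n", framesToMillis(1024, 44100)); // ~23.2ms
    }
}
```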

  • leopq Posts: 1
    As for the audio layer latency, you could try this:
    This app was created by Raph Levien from Google in order to measure audio latency and determine the best buffer size/sample rate values for the device. A database is being generated from the user results, which should also show you which devices are better for achieving the desired latency and consequently would be better to work with your app.