How to get most consistent timing for music app?

:information_source: Attention Topic was automatically imported from the old Question2Answer platform.
:bust_in_silhouette: Asked By ESN42

I’m working on a music app. I have a very short audio sample (a drum hit), which I’ve connected to a timer so that the sound plays every 200 msec. When I run the scene, the tempo that is produced, while mostly consistent, is inconsistent enough that it’s not musically useful. Is there anything I can do to get more even timing? I don’t know if the inconsistency is due to how the Timer works or just audio latency.

I have the Timer node set to physics mode. I also have V-Sync enabled. Is there a way to increase the frame rate, if that would help? What about using AudioServer instead of AudioStreamPlayer?
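For reference, the setup described above might look something like the following sketch (node names and the scene layout are assumptions; this reconstructs the described approach, not the asker’s actual code):

```gdscript
extends Node

# Assumed scene layout: a Timer and an AudioStreamPlayer as children.
onready var timer = $Timer
onready var player = $AudioStreamPlayer

func _ready():
	timer.wait_time = 0.2  # 200 ms between drum hits
	timer.process_mode = Timer.TIMER_PROCESS_PHYSICS
	timer.connect("timeout", self, "_on_timer_timeout")
	timer.start()

func _on_timer_timeout():
	player.play()  # retrigger the drum sample
```

Because the Timer fires on physics ticks, each hit can land anywhere within one physics frame (16.7 ms at the default 60 ticks/s), which is large enough to be audible as uneven tempo.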

You could try upping the physics process rate, but that’s probably a band-aid at best.

Pieter-Jan Briers | 2018-03-18 15:34

I actually just tried that and it helped tremendously. Timing is still not perfect though.
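The tick-rate change discussed above can be made in Project Settings (physics/common/physics_fps) or from a script; a minimal sketch, where 240 is just an example value:

```gdscript
func _ready():
	# Raise the physics tick rate from the default 60 to 240 ticks per
	# second, so a physics-mode Timer is checked four times as often and
	# its timeout lands closer to the intended 200 ms mark.
	Engine.iterations_per_second = 240
```

This shrinks the worst-case jitter from one 16.7 ms tick to about 4 ms, which is why it helps without fully fixing the problem.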

ESN42 | 2018-03-18 15:57

Are you using AudioServer or SamplePlayer? Since AudioServer provides low-level access to audio, you can greatly reduce the delay.

Löne | 2018-03-18 16:36

I’m using SamplePlayer. I’d like to give AudioServer a shot, but I can’t seem to find any tutorials on it.

ESN42 | 2018-03-18 17:59

You should take a look at the official documentation.

Löne | 2018-03-18 20:24

I did, and the only thing I’ve found in reference to AudioServer is the class description. No examples on usage.

ESN42 | 2018-03-18 21:13

The link I gave you explains how it works. You have to use buses if you use the graphical interface. The graphical interface does the same thing as using AudioServer in scripts.
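For what it’s worth, the operations done in the Audio bus panel correspond to AudioServer calls in script. A minimal sketch (the bus name and effect are arbitrary examples, not from the thread):

```gdscript
func _ready():
	# Create a new bus routed to Master, as the bus editor panel would.
	var idx = AudioServer.bus_count
	AudioServer.add_bus()
	AudioServer.set_bus_name(idx, "Drums")
	AudioServer.set_bus_send(idx, "Master")
	# Add an effect, as dragging one into the panel would.
	AudioServer.add_bus_effect(idx, AudioEffectCompressor.new())
```

An AudioStreamPlayer would then route through it with `player.bus = "Drums"`; if no bus is assigned, it plays through Master, so all playback already goes through the audio server’s mixer either way.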

Löne | 2018-03-18 21:19

Earlier you said this:

“Are you using AudioServer or SamplePlayer? Since AudioServer provides low-level access to audio, you can greatly reduce the delay.”

I haven’t (intentionally) done anything with AudioServer yet, and I can’t find anything called SamplePlayer. Is that from an older version? I’m just using an AudioStreamPlayer to generate my sounds. I’m not specifying a bus, so presumably it’s using a default bus routing. Are you saying that, if I’m playing through an audio bus, I’m essentially already going through AudioServer?

“The link I gave you explains how it works.”

I’m sorry but I don’t understand what you’re getting at with that page of the documentation. It just describes the general purpose of an audio bus. It doesn’t mention anything about settings that might lead to reduced latency, and there’s no mention of AudioServer at all.

“You have to use buses if you use the graphical interface. The graphical interface does the same thing as using AudioServer in scripts.”

I’m not using the graphical interface; I’m doing everything in script. What I was looking for was an example usage of AudioServer in a script, which I can’t find.
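A note for later readers: AudioServer is mostly a bus-management API rather than a separate, faster playback path, and AudioStreamPlayer already mixes through it. What can tighten timing is lowering the driver buffer in Project Settings (audio/output_latency, in milliseconds) and accounting for the reported latency when scheduling; a hedged sketch (availability of these calls may vary by Godot version):

```gdscript
func _ready():
	# Smaller driver buffers (set via audio/output_latency in Project
	# Settings) reduce the delay between play() and audible output,
	# at the cost of more CPU and a higher risk of audio dropouts.
	var latency = AudioServer.get_output_latency()  # seconds
	print("Driver output latency: ", latency)
```

The reported latency is roughly constant, so it shifts all hits equally; the jitter itself comes from quantizing `play()` calls to process or physics ticks, which is why raising the tick rate helped above.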

ESN42 | 2018-03-20 17:12