
directsound: underruns with 44100Hz/1024 samples, not in SDL1.2, due to timer resolution change #1827

Closed

SDLBugzilla opened this issue Feb 10, 2021

This bug report was migrated from our old Bugzilla tracker.

Reported in version: HG 2.1
Reported for operating system, platform: Windows 8, x86_64

Comments on the original bug report:

On 2015-04-14 20:21:34 +0000, Eric Wasylishen wrote:

Requesting a 1024 sample buffer for 44100Hz audio worked fine with SDL1.2, but I get buffer underruns with these settings in SDL2 with the directsound backend.

Steps to reproduce:
In test/loopwave.c:

  • SDL_AudioInit("directsound");
  • wave.spec.callback = fillerup;
  • wave.spec.samples = 1024;
  • wave.spec.freq = 44100;

Also replace sample.wav with a 44100Hz sample, otherwise you'll get sped-up audio. Expected: pop-free audio. Observed: occasional pops in the audio (every few seconds). I can't reproduce it 100% of the time, but the issue usually comes back after a reboot or after restarting VS.
Tested on Windows 8 on two systems, one with Cirrus Logic audio and one with Realtek audio.
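
For illustration, here's a minimal self-contained sketch of those repro settings. The real test uses loopwave.c's `wave` struct and `fillerup` callback, which copy actual wave data; here a silent stand-in callback is used so the snippet compiles on its own, so treat this as a sketch rather than the actual test code:

```c
#include "SDL.h"

/* Silent stand-in for loopwave.c's fillerup() callback. */
static void SDLCALL fillerup(void *userdata, Uint8 *stream, int len)
{
    SDL_memset(stream, 0, len);   /* loopwave.c copies wave data instead */
}

int main(int argc, char *argv[])
{
    SDL_AudioSpec spec;

    SDL_AudioInit("directsound");   /* force the directsound backend */

    SDL_zero(spec);
    spec.freq = 44100;
    spec.format = AUDIO_S16;
    spec.channels = 2;
    spec.samples = 1024;            /* ~23ms of buffer at 44100Hz */
    spec.callback = fillerup;

    if (SDL_OpenAudio(&spec, NULL) < 0) {
        SDL_Log("Couldn't open audio: %s", SDL_GetError());
        return 1;
    }
    SDL_PauseAudio(0);              /* start playback; listen for pops */
    SDL_Delay(10000);
    SDL_CloseAudio();
    SDL_AudioQuit();
    return 0;
}
```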

Analysis:
The cause seems to be that SDL1.2 set the OS timer resolution to 1ms at startup (in SDL_StartTicks), but SDL2 no longer does this. The default 15ms resolution in Windows 8 is too coarse for the directsound backend to work with 1024 samples / 44100Hz.

I realize that changing the default timer resolution is a separate topic, and defaulting to 1ms in SDL2 would increase power use. However, I don't think it's unreasonable to expect a 1024 sample buffer to work with directsound; that's about 23ms at 44100Hz, and it used to work in SDL1.2.

One possible solution would be calling timeBeginPeriod/timeEndPeriod in the directsound wait function, so the extra power use only applies while the directsound backend is in use, and the timer resolution change won't be visible outside SDL (assuming it works to change the resolution for a short block of time like that).
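
A rough sketch of that pattern (the function name is hypothetical; the real directsound wait loop is more involved than a single Sleep):

```c
#include <windows.h>
#include <mmsystem.h>   /* timeBeginPeriod/timeEndPeriod; link with winmm.lib */

/* Hypothetical sketch of the proposal: raise the scheduler resolution
   only around the backend's wait, so the extra power cost is paid only
   while directsound audio is actually running. */
static void DSOUND_WaitDevice_sketch(void)
{
    timeBeginPeriod(1);   /* request 1ms timer granularity */
    Sleep(1);             /* stand-in for the real wait on the DirectSound buffer */
    timeEndPeriod(1);     /* restore the previous granularity */
}
```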

On 2015-04-20 16:22:09 +0000, Ryan C. Gordon wrote:

Yeah, we should be upping the timer resolution by default and offering a hint to tell SDL not to, for the unlikely cases where you wouldn't want it. There's some code rot that is preventing this from working correctly, though.

Fixing it now.

--ryan.
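
For reference, the opt-out hint Ryan describes shipped in SDL2 as SDL_HINT_TIMER_RESOLUTION. A minimal sketch of using it (assuming an SDL2 release that includes this fix):

```c
#include "SDL.h"

int main(int argc, char *argv[])
{
    /* "0" opts out of the timer-resolution change; the default of "1"
       requests 1ms granularity on Windows. Set this before SDL_Init. */
    SDL_SetHint(SDL_HINT_TIMER_RESOLUTION, "0");
    SDL_Init(SDL_INIT_AUDIO);
    /* ... */
    SDL_Quit();
    return 0;
}
```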

On 2015-04-20 17:45:13 +0000, Ryan C. Gordon wrote:

This is fixed in https://hg.libsdl.org/SDL/rev/7454bfce9202, thanks!

--ryan.

On 2015-04-21 18:13:05 +0000, Eric Wasylishen wrote:

Thanks! Confirmed the fix works for me.
