
Bug 106 - Double buffering, DGA and SDL_DisplayFormat => problem?
Summary: Double buffering, DGA and SDL_DisplayFormat => problem?
Status: RESOLVED FIXED
Alias: None
Product: SDL
Classification: Unclassified
Component: video
Version: don't know
Hardware: x86 Linux
Importance: P2 blocker
Assignee: Sam Lantinga
QA Contact: Sam Lantinga
URL:
Keywords:
Depends on:
Blocks:
 
Reported: 2006-01-29 01:59 UTC by Sam Lantinga
Modified: 2006-05-09 02:45 UTC
CC List: 1 user

See Also:


Attachments
Test.cpp (2.83 KB, text/plain), 2006-01-29 01:59 UTC, Sam Lantinga
font.png (4.20 KB, image/x-png), 2006-01-29 01:59 UTC, Sam Lantinga

Description Sam Lantinga 2006-01-29 01:59:01 UTC
Date: Mon, 27 Dec 2004 15:09:49 +0100
From: Gaetan de Menten <gedeonlecanard@gmail.com>
Subject: [SDL] Double buffering, DGA and SDL_DisplayFormat => problem?

Hello,

I recently noticed my current game doesn't work well with double
buffering enabled on the DGA backend (it works fine on the X11
backend). It seems like my sprites all get swapped (one sprite takes
the place of another) and some of them are slightly garbled.

While searching for the bug, I noticed that if I didn't convert my
sprites to the display format, the bug disappeared, and if I converted
them multiple times in a row, the bug became more noticeable...

Attached is a minimal code sample and my font bitmap. The sample
first loads the font bitmap (using SDL_image), splits it into several
individual surfaces (one for each character), and then converts each of
these surfaces to the display format multiple times in a row...
Finally, it displays the first 26 characters of my font, which are
"!.0123456789:?ABCDEFGHIJK".

With the default backend (x11), it works just fine. With the dga
backend (you have to set the SDL_VIDEODRIVER environment variable to
"dga" and run the program as root), the display looks like
"!!0022446688...". Although it seems weird to me, the result is
consistent from one run to the other (always the same result) on my
computer (might not be the case on someone else's computer).

I suppose it's much more likely that the bug is in my code and not in
SDL, but since it's driving me crazy, could someone look at it and tell
me what I'm doing wrong? It's probably something obvious that I don't
see because I've been looking at that code for too long...

For what it's worth, I'm using Debian unstable, but I tried with a
hand-compiled version of SDL too (CVS from yesterday) and the results
are the same.

-Gaetan.
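[Editor's note: for readers without the attachment, the conversion step the reporter describes might look roughly like the following sketch. This is not the attached Test.cpp; the function name and the NUM_CONVERSIONS constant are illustrative. It uses the SDL 1.2 API (SDL_DisplayFormat, SDL_FreeSurface) and assumes SDL headers are available.]

```c
/* Hypothetical sketch of the repeated SDL_DisplayFormat() conversion
 * described above. SDL 1.2 API; NUM_CONVERSIONS is illustrative. */
#include <SDL/SDL.h>

#define NUM_CONVERSIONS 4  /* convert each glyph several times in a row */

static SDL_Surface *convert_repeatedly(SDL_Surface *glyph)
{
    int i;
    for (i = 0; i < NUM_CONVERSIONS; i++) {
        /* SDL_DisplayFormat() returns a NEW surface in the display's
         * pixel format; the caller must free the old one. */
        SDL_Surface *converted = SDL_DisplayFormat(glyph);
        if (converted == NULL) {
            return glyph;  /* conversion failed; keep the current surface */
        }
        SDL_FreeSurface(glyph);
        glyph = converted;
    }
    return glyph;
}
```

A single conversion is the normal idiom; converting repeatedly should be a no-op, which is why the reporter's observation that repeated conversions made the corruption worse pointed at the DGA backend rather than the application code.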
Comment 1 Sam Lantinga 2006-01-29 01:59:17 UTC
Created attachment 49 [details]
Test.cpp
Comment 2 Sam Lantinga 2006-01-29 01:59:44 UTC
Created attachment 50 [details]
font.png
Comment 3 Ga 2006-05-06 11:15:19 UTC
Just in case you need more information, please direct your mails to "ged bugfactory org"; I don't use the other address anymore...
Comment 4 Sam Lantinga 2006-05-07 17:15:42 UTC
I'd like to get this fixed for SDL 1.2.10 release, if possible.
Comment 5 Sam Lantinga 2006-05-09 02:45:13 UTC
This is fixed in subversion - thanks for the great test case!