
Bug 2401 - CreateTexture does not accurately match supplied PixelFormat
Status: RESOLVED FIXED
Alias: None
Product: SDL
Classification: Unclassified
Component: video
Version: 2.0.1
Hardware: x86 Windows (XP)
Importance: P2 normal
Assignee: Sam Lantinga
QA Contact: Sam Lantinga
URL:
Keywords:
Depends on:
Blocks:
 
Reported: 2014-02-18 04:52 UTC by mattreecebentley
Modified: 2017-06-23 13:41 UTC
CC List: 2 users

See Also:


Description mattreecebentley 2014-02-18 04:52:17 UTC
The following code works (atlas_texture and s_renderer are predefined; plf_create_surface is a macro function wrapping SDL_CreateRGBSurface plus the usual little/big-endian mask handling):

	SDL_Surface *surface = plf_create_surface(atlas_width, atlas_height);

	if (surface == NULL)
	{
		return;
	}

	atlas_texture = SDL_CreateTextureFromSurface(s_renderer, surface);
	SDL_FreeSurface(surface);
	surface = NULL;

The following code does not work:

	SDL_Surface *surface = plf_create_surface(atlas_width, atlas_height);

	if (surface == NULL)
	{
		return;
	}

	Uint32 pixel_format = surface->format->format;
	SDL_FreeSurface(surface);
	surface = NULL;

	atlas_texture = SDL_CreateTexture(s_renderer, pixel_format, SDL_TEXTUREACCESS_STATIC, atlas_width, atlas_height);


When copying other surfaces into the first example's texture with UpdateTexture, the alpha channels are copied correctly. When doing the same with the second example's texture, the alpha channels come out white.

The only difference is using CreateTextureFromSurface, versus grabbing the pixel format from the same surface and passing it to CreateTexture instead.

Speaking of which, is there a way to determine the renderer's preferred default pixel format without resorting to creating RGB surfaces? It seems like a kludge, and the renderers usually support multiple formats, but the renderer info doesn't specify the default one.
M@
Comment 1 Sam Lantinga 2014-03-05 05:11:03 UTC
The texture formats listed in SDL_RendererInfo are sorted in order of preference. Most of the renderers only support one RGBA format, the optimal one.

What format is the surface returned by plf_create_surface()?
Comment 2 mattreecebentley 2014-03-05 07:50:22 UTC
plf_create_surface is returning a surface with surface->format->format = 376840196 - I don't know which pixelformat that is, unfortunately.
Comment 3 mattreecebentley 2014-03-12 08:27:30 UTC
I found the basic issue is this:
https://wiki.libsdl.org/SDL_QueryTexture

"a pointer filled in with the raw format of the texture; the actual format may differ, but pixel transfers will use this format"

So this is useless for determining what pixelformat to create new textures in.
It's only useful for determining what pixelformat to make your surfaces that you want to transfer to a texture.

But to create a texture to match those surfaces, you pretty much have to use CreateTextureFromSurface, otherwise you're going to lose your alpha channel most of the time.

What I want is this: a function that gives me the _actual_ pixelformat that the texture is using. Then I can create a small texture from an RGBA (or ARGB or whatever) surface in my initialisation code, then grab the pixelformat off it to determine the default format to create my textures in if I want to transfer ARGB (or RGBA) surfaces to them.

Then I want to have the other 'pixel transfer' format specified by QueryTexture so I can use ConvertPixels on loaded image textures to transform them to a format that I can copy to the texture.

At the moment I have to create a blank surface the size of my texture_atlas, and then CreateTextureFromSurface to make the atlas. It's very slow.

Basically QueryTexture needs an additional argument - 'actual_format'.
Comment 4 mattreecebentley 2014-03-12 08:36:33 UTC
Basically what I'm trying to do is to create 32-bit red-green-blue-alpha textures in whatever the renderer's default or preferred 32-bit format is.
If you know of a better way to determine this, please let me know-
Cheers,
Matt
Comment 5 Adam M. 2014-05-12 21:31:22 UTC
Just a comment: pixel format 376840196 is SDL_PIXELFORMAT_ABGR8888 (which I find confusing because it's actually RGBA in memory).
Comment 6 Adam M. 2014-05-12 21:33:00 UTC
Also, why is this still marked as WAITING? mattreecebentley answered the question months ago.
Comment 7 mattreecebentley 2014-05-12 21:56:59 UTC
Because I was waiting for an answer about how to determine the best pixelformat for textures without using SDL_CreateTextureFromSurface.
Unfortunately nobody got back to me, so I posted about it on the forums.
After looking up the SDL source code I was able to determine what SDL_CreateTextureFromSurface was doing and replicate it. By that point I'd forgotten about this bug report.
Closed.
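For readers hitting the same wall: here is a rough sketch of what SDL_CreateTextureFromSurface does internally in SDL2, which is presumably the behaviour being replicated. Error handling is trimmed and the format choice is simplified (the real function prefers a renderer format that matches the surface where one is supported; this sketch just takes the first listed format):

```c
#include <SDL.h>

/* Sketch (error handling trimmed) of SDL_CreateTextureFromSurface's approach:
 * pick a renderer texture format, convert the surface to it if needed,
 * create a static texture, and upload the pixels. */
static SDL_Texture *create_texture_like_sdl(SDL_Renderer *renderer,
                                            SDL_Surface *surface)
{
    SDL_RendererInfo info;
    SDL_GetRendererInfo(renderer, &info);

    /* texture_formats[0] is the renderer's most-preferred format. */
    Uint32 format = info.texture_formats[0];

    SDL_Texture *texture = SDL_CreateTexture(renderer, format,
                                             SDL_TEXTUREACCESS_STATIC,
                                             surface->w, surface->h);

    if (surface->format->format == format) {
        SDL_UpdateTexture(texture, NULL, surface->pixels, surface->pitch);
    } else {
        /* Convert the pixels into the texture's format before uploading. */
        SDL_Surface *converted = SDL_ConvertSurfaceFormat(surface, format, 0);
        SDL_UpdateTexture(texture, NULL, converted->pixels, converted->pitch);
        SDL_FreeSurface(converted);
    }

    /* CreateTextureFromSurface also enables blending when the surface has
     * an alpha channel; skipping this is one way the alpha ends up ignored. */
    SDL_SetTextureBlendMode(texture, SDL_BLENDMODE_BLEND);
    return texture;
}
```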
Comment 8 mattreecebentley 2014-05-12 21:58:02 UTC
(In reply to Adam M. from comment #5)
> Just a comment: pixel format 376840196 is SDL_PIXELFORMAT_ABGR8888 (which I
> find confusing because it's actually RGBA in memory).

ps. How did you look up the pixelformat code, just out of interest?
Comment 9 Adam M. 2014-05-12 22:22:49 UTC
My question was really directed at Sam, who is the one who moved the bug into the WAITING state when he asked you what the pixel format was, but never moved it back out of the WAITING state or made any other reply when you answered the question months ago. Maybe he's too busy, but if that's the case they should advertise on the forums and website for somebody who has the time and desire to take the bug database seriously. (Sorry for the mini-rant, but I'm a bit frustrated with it. I just hate seeing bugs that are years or months old with no meaningful responses.)

Anyway, as for how I got the pixel format, I created a C# game library built partly on top of SDL, and it includes a binding for the SDL API. It's easy to convert an enum value into its symbolic name in C#. (But actually, I believe Visual Studio will do that even for C/C++ code, if you cast a number to an enum type. I actually just typed (SDL.PixelFormat)376840196 into the watch window.)
Comment 10 mattreecebentley 2014-05-12 22:51:34 UTC
Thank you - yes, SDL in general does seem low-priority for many of the contributors at present, but I expect full-time work with Valve is taking its toll.
I'm not quite sure how I'd do something equivalent in codelite, but it'd be good to know.
Cheers Adam
Comment 11 Adam M. 2014-05-12 23:46:32 UTC
Well, a more tedious method is to look at the way SDL_PixelFormatEnum is defined. It's a bitwise combination of multiple fields: one field describes the bits per pixel, another the bytes per pixel, another the pixel type, another the packing order, and so on. It would be possible to write code like the following to extract these parts (not valid for YUV pixel formats):

BytesPerPixel = (byte)format;
BitsPerPixel  = (byte)((int)format >> 8);
ChannelLayout = (format >> 16) & 0xF;
ChannelOrder  = (format >> 20) & 0xF;
PixelType     = (format >> 24) & 0xF;
Comment 12 Sam Lantinga 2014-06-22 18:16:14 UTC
FYI, the preferred format for textures is SDL_PIXELFORMAT_ARGB8888.
If you get the renderer info, it lists the native texture formats in preferred order.