| Summary: | CreateTexture does not accurately match supplied PixelFormat | ||
|---|---|---|---|
| Product: | SDL | Reporter: | mattreecebentley |
| Component: | video | Assignee: | Sam Lantinga <slouken> |
| Status: | RESOLVED FIXED | QA Contact: | Sam Lantinga <slouken> |
| Severity: | normal | ||
| Priority: | P2 | CC: | adam, axel.mattauch |
| Version: | 2.0.1 | ||
| Hardware: | x86 | ||
| OS: | Windows (XP) | ||
The texture formats listed in SDL_RendererInfo are sorted in order of preference. Most of the renderers only support one RGBA format, the optimal one. What format is the surface returned by plf_create_surface()?

plf_create_surface is returning a surface with surface->format->format = 376840196 - I don't know which pixel format that is, unfortunately.

I found the basic issue is this: https://wiki.libsdl.org/SDL_QueryTexture

> "a pointer filled in with the raw format of the texture; the actual format may differ, but pixel transfers will use this format"

So this is useless for determining what pixel format to create new textures in. It's only useful for determining what pixel format to make the surfaces that you want to transfer to a texture. But to create a texture that matches those surfaces, you pretty much have to use CreateTextureFromSurface; otherwise you're going to lose your alpha channel most of the time.

What I want is a function that gives me the _actual_ pixel format that the texture is using. Then I can create a small texture from an RGBA (or ARGB, or whatever) surface in my initialisation code, then grab the pixel format off it to determine the default format to create my textures in when I want to transfer ARGB (or RGBA) surfaces to them. Then I want to have the other 'pixel transfer' format specified by QueryTexture, so I can use ConvertPixels on loaded images to transform them into a format that I can copy to the texture.

At the moment I have to create a blank surface the size of my texture_atlas and then use CreateTextureFromSurface to make the atlas. It's very slow. Basically, QueryTexture needs an additional argument: 'actual_format'.

Basically, what I'm trying to do is create 32-bit red-green-blue-alpha textures in whatever the renderer's default or preferred 32-bit format is.
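For reference, the renderer's preferred format can be read straight from SDL_GetRendererInfo without creating a throwaway surface. A minimal sketch (error handling kept brief; `renderer` is assumed to be an already-created SDL_Renderer):

```c
#include <SDL.h>
#include <stdio.h>

/* List the renderer's native texture formats.  texture_formats[0] is the
   preferred one, so it is the best choice to pass to SDL_CreateTexture. */
static Uint32 preferred_texture_format(SDL_Renderer *renderer)
{
    SDL_RendererInfo info;
    if (SDL_GetRendererInfo(renderer, &info) != 0 ||
        info.num_texture_formats == 0) {
        return SDL_PIXELFORMAT_UNKNOWN;
    }
    for (Uint32 i = 0; i < info.num_texture_formats; ++i) {
        printf("format %u: %s\n", i,
               SDL_GetPixelFormatName(info.texture_formats[i]));
    }
    return info.texture_formats[0];
}
```

The helper name is made up for this sketch; SDL_GetRendererInfo and SDL_GetPixelFormatName are the standard SDL2 calls.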
If you know of a better way to determine this, please let me know.
Cheers, Matt

Just a comment: pixel format 376840196 is SDL_PIXELFORMAT_ABGR8888 (which I find confusing, because it's actually RGBA in memory).

Also, why is this still marked as WAITING? mattreecebentley answered the question months ago.

Because I was waiting for an answer about how to determine the best pixel format for textures without using SDL_CreateTextureFromSurface. Unfortunately nobody got back to me, so I posted about it on the forums. After looking at the SDL source code I was able to determine what SDL_CreateTextureFromSurface was doing and replicate it. By that point I'd forgotten about this bug report. Closed.

(In reply to Adam M. from comment #5)
> Just a comment: pixel format 376840196 is SDL_PIXELFORMAT_ABGR8888 (which I
> find confusing because it's actually RGBA in memory).

ps. How did you look up the pixel format code, just out of interest?

My question was really directed at Sam, who is the one who moved the bug into the WAITING state when he asked you what the pixel format was, but never moved it back out of the WAITING state or made any other reply when you answered the question months ago. Maybe he's too busy, but if that's the case they should advertise on the forums and website for somebody who has the time and desire to take the bug database seriously. (Sorry for the mini-rant, but I'm a bit frustrated with it. I just hate seeing bugs that are years or months old with no meaningful responses.)

Anyway, as for how I got the pixel format: I created a C# game library built partly on top of SDL, and it includes a binding for the SDL API. It's easy to convert an enum value into its symbolic name in C#. (But actually, I believe Visual Studio will do that even for C/C++ code, if you cast a number to an enum type. I actually just typed (SDL.PixelFormat)376840196 into the watch window.)
Thank you - yes, SDL in general does seem low-priority for many of the contributors at present, but I expect full-time work with Valve is taking its toll. I'm not quite sure how I'd do something equivalent in CodeLite, but it'd be good to know. Cheers, Adam

Well, a more tedious method is to look at the way SDL_PixelFormatEnum is defined. It's a bitwise combination of multiple fields. One field describes the bits per pixel, another the bytes per pixel, another the pixel type, another the packing order, etc. It would be possible to write code like the following to extract these parts (not valid for YUV pixel formats):

```csharp
BytesPerPixel = (byte)format;
BitsPerPixel  = (byte)((int)format >> 8);
ChannelLayout = (format >> 16) & 0xF;
ChannelOrder  = (format >> 20) & 0xF;
PixelType     = (format >> 24) & 0xF;
```

FYI, the preferred format for textures is SDL_PIXELFORMAT_ARGB8888. If you get the renderer info, it lists the native texture formats in preferred order.
The following code works (atlas_texture and s_renderer are predefined; plf_create_surface is a macro wrapping CreateRGBSurface plus the usual little/big-endian mask selection):

```c
SDL_Surface *surface = plf_create_surface(atlas_width, atlas_height);

if (surface == NULL)
{
    return;
}

atlas_texture = SDL_CreateTextureFromSurface(s_renderer, surface);

SDL_FreeSurface(surface);
surface = NULL;
```

The following code does not work:

```c
SDL_Surface *surface = plf_create_surface(atlas_width, atlas_height);

if (surface == NULL)
{
    return;
}

Uint32 pixel_format = surface->format->format;

SDL_FreeSurface(surface);
surface = NULL;

atlas_texture = SDL_CreateTexture(s_renderer, pixel_format,
    SDL_TEXTUREACCESS_STATIC, atlas_width, atlas_height);
```

When using UpdateTexture with other surfaces and the first resulting texture, the alpha channels are copied correctly. When doing the same with the second example, the alpha channels are white once copied. The only difference is using CreateTextureFromSurface instead of grabbing the pixel_format from the same surface and using CreateTexture.

Speaking of which, is there a way to determine the renderer's preferred default pixel format without resorting to creating RGBSurfaces? It seems like a kludge, and the renderers usually support multiple formats, but the renderer info doesn't specify the default one.

M@
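A way to reconcile the two snippets, sketched under the assumption that the renderer's preferred format (texture_formats[0]) differs from the surface's ABGR8888: create the texture in the preferred format and convert the surface's pixels before uploading, which is essentially what SDL_CreateTextureFromSurface does internally. The helper name is invented for this sketch:

```c
#include <SDL.h>

/* Hypothetical helper: create a static texture in the renderer's preferred
   format and upload an existing RGBA surface into it. */
static SDL_Texture *create_atlas_texture(SDL_Renderer *renderer,
                                         SDL_Surface *surface)
{
    SDL_RendererInfo info;
    if (SDL_GetRendererInfo(renderer, &info) != 0) {
        return NULL;
    }

    Uint32 preferred = info.texture_formats[0];  /* e.g. ARGB8888 */

    SDL_Texture *texture = SDL_CreateTexture(renderer, preferred,
                                             SDL_TEXTUREACCESS_STATIC,
                                             surface->w, surface->h);
    if (texture == NULL) {
        return NULL;
    }

    /* Convert the (possibly ABGR8888) surface to the texture's actual
       format so the alpha channel survives the upload. */
    SDL_Surface *converted = SDL_ConvertSurfaceFormat(surface, preferred, 0);
    if (converted == NULL) {
        SDL_DestroyTexture(texture);
        return NULL;
    }

    SDL_UpdateTexture(texture, NULL, converted->pixels, converted->pitch);
    SDL_FreeSurface(converted);
    return texture;
}
```

SDL_ConvertSurfaceFormat and SDL_UpdateTexture are standard SDL2 calls; whether the conversion is needed at all depends on whether the surface's format already matches texture_formats[0].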