
Bug 1331 - SDL_CreateTextureFromSurface fails for OpenGL + Win XP 64 NVidia 285.58 with glTexImage2D(): GL_INVALID_ENUM
Summary: SDL_CreateTextureFromSurface fails for OpenGL + Win XP 64 NVidia 285.58 with ...
Status: RESOLVED FIXED
Alias: None
Product: SDL
Classification: Unclassified
Component: video (show other bugs)
Version: HG 2.0
Hardware: x86 Windows 7
Importance: P2 major
Assignee: Sam Lantinga
QA Contact: Sam Lantinga
URL:
Keywords:
Depends on:
Blocks:
 
Reported: 2011-11-15 16:38 UTC by Ellie
Modified: 2014-06-25 09:19 UTC (History)
2 users (show)

See Also:


Attachments

Description Ellie 2011-11-15 16:38:41 UTC
SDL_CreateTextureFromSurface/SDL_CreateTexture fails for OpenGL + Win XP 64 NVidia 285.58 with glTexImage2D(): GL_INVALID_ENUM.

Win Vista 64 bit NVidia/OpenGL with recent drivers was also reported to me to be affected.

On the same configurations, when picking a Direct3D renderer to create the textures on, it works perfectly fine. The OpenGL path also works fine on other non-NVidia machines (Intel 945GME chipset on x86 Fedora 16, Intel i5 on-board graphics on Windows 7 64-bit); it fails only on those NVidia configurations, and only with OpenGL. Direct3D and software rendering work fine.

Since NVidia's recent drivers most likely support the required OpenGL functionality, I guess the SDL devs should look into why this breaks on those configurations.

Also to be more explicit on this, I am just talking about the SDL 1.3 SDL_Texture interface with those new accelerated 2d renderers, *not* about manual OpenGL usage with SDL.
Comment 1 Ellie 2011-11-25 01:49:24 UTC
Sorry for putting the wrong OS, it should be Windows of course.
Comment 2 Arnaud Geltzenlichter 2011-12-21 12:43:53 UTC
Same problem with SDL_CreateTextureFromSurface() using an SDL_Surface* loaded from a PNG image, on Windows 7 x86_64 SP1 with NVidia driver 285.62.

D3D and Software renderer work fine.

Works fine on Ubuntu 10.10 (kernel 2.6.38) with OpenGL 3.3 too.
Comment 3 Sam Lantinga 2011-12-29 14:08:21 UTC
Do you get this with the testsprite2 sample program?  If not, can you attach a minimal test case that shows the issue?

I don't have an NVidia card here, so if anybody else wants to take a crack at this, please do!
Comment 4 Arnaud Geltzenlichter 2011-12-29 15:15:28 UTC
(In reply to comment #3)
> Do you get this with the testsprite2 sample program?  If not, can you attach a
> minimal test case that shows the issue?
> 
> I don't have an NVidia card here, so if anybody else wants to take a crack at
> this, please do!

Yes, same problem with testsprite2 on Windows 7 with driver NVidia 285.58.
Comment 5 Ellie 2011-12-29 15:37:17 UTC
I attempted to make a minimal test case while Arnaud Geltzenlichter apparently checked out the testsprite2 program. It boils down to this:

#include "SDL.h"
#include <stdio.h>
#include <string.h>

static SDL_Window* mainwindow;
static SDL_Renderer* mainrenderer;

int main(int argc, char** argv) {
    //initialisation stuff
    if (SDL_Init(SDL_INIT_VIDEO) < 0) {
        return -1;
    }
    mainwindow = SDL_CreateWindow("SDL bug", 0,0, 512,512, 0);
    if (!mainwindow) {
        SDL_Quit();
        return -1;
    }

    //find the opengl renderer
    int rendererindex = -1;
    int count = SDL_GetNumRenderDrivers();
    int r = 0;
    while (r < count) {
        SDL_RendererInfo info;
        SDL_GetRenderDriverInfo(r, &info);
        if (SDL_strcasecmp(info.name, "opengl") == 0) { /* SDL's portable strcasecmp */
            rendererindex = r;
            break;
        }
        r++;
    }

    //create renderer
    mainrenderer = SDL_CreateRenderer(mainwindow, rendererindex, SDL_RENDERER_ACCELERATED|SDL_RENDERER_PRESENTVSYNC);
    if (!mainrenderer) {
        SDL_Quit();
        return -1;
    }

    //create texture
    SDL_Texture* t = SDL_CreateTexture(mainrenderer, SDL_PIXELFORMAT_ABGR8888, SDL_TEXTUREACCESS_STREAMING, 512, 512);
    if (!t) {
        printf("OpenGL NVIDIA issue appears to be triggered: %s\n", SDL_GetError());
        SDL_Quit();
        return 0;
    } else {
        printf("OpenGL NVIDIA issue is apparently not triggered\n");
    }
    printf("Please press return to close\n");
    fflush(stdout);

    //wait for return key
    int c = fgetc(stdin);
    while (c != '\n') {
        c = fgetc(stdin);
    }

    SDL_Quit();
}

The output is:

OpenGL NVIDIA issue appears to be triggered: glTexImage2D(): GL_INVALID_ENUM
Please press return to close

Tested on Windows XP 64-bit with driver version 285.58 as well.
Comment 6 Arnaud Geltzenlichter 2012-01-08 04:06:13 UTC
Sam, if you can guide me, I can try to find the problem inside the SDL 1.3 OpenGL code.

Do you have any idea about this issue?
Comment 7 Sam Lantinga 2012-01-08 10:23:37 UTC
SDL's OpenGL renderer internally only supports one RGB format, and that's SDL_PIXELFORMAT_ARGB8888.

It maps this into the following set of OpenGL constants:
        *internalFormat = GL_RGBA8;
        *format = GL_BGRA;
        *type = GL_UNSIGNED_INT_8_8_8_8_REV;

I'm guessing that the latest NVidia drivers don't support GL_UNSIGNED_INT_8_8_8_8_REV?

You can confirm this by changing convert_format() in src/render/opengl/SDL_render_gl.c
Comment 8 Arnaud Geltzenlichter 2012-01-09 13:20:55 UTC
Well, it works with GL_UNSIGNED_BYTE instead of GL_UNSIGNED_INT_8_8_8_8_REV.


GL_UNSIGNED_BYTE should work with format GL_RGBA, giving 8 bits per component.


I'm currently testing the OpenGL renderer blit speed with this type; I will post results as soon as possible.
Comment 9 Arnaud Geltzenlichter 2012-01-09 13:54:18 UTC
Some additional information:

I confirm my previous comment, but there is more.

The glTexImage2D() call seems to always return a GL_INVALID_ENUM error, even with the type set to GL_UNSIGNED_BYTE.

But if I ignore the error check just after it, all the rest of the code works well and the texture is created without problem.

In src/render/opengl/SDL_render_gl.c, in GL_CreateTexture() around line 547:

...

#endif
    {
        renderdata->glTexImage2D(data->type, 0, internalFormat, texture_w,
                                 texture_h, 0, format, type, NULL);

        // Always return GL_INVALID_ENUM ... ??
    }
    renderdata->glDisable(data->type);
    result = renderdata->glGetError();

  /*  if (result != GL_NO_ERROR) {
        GL_SetError("glTexImage2D()", result);
        return -1;
    }*/

...
Comment 10 Sam Lantinga 2012-01-09 18:14:56 UTC
That's crazy!
Comment 11 Arnaud Geltzenlichter 2012-01-10 12:51:15 UTC
I get an FPS with the OpenGL renderer that is only one-tenth of the D3D one!
Comment 12 Arnaud Geltzenlichter 2012-01-10 14:54:49 UTC
Well, in reality, the GL_INVALID_ENUM error comes from each of the following calls:

renderdata->glTexParameteri(data->type, GL_TEXTURE_WRAP_S,
                                GL_CLAMP_TO_EDGE);

renderdata->glTexParameteri(data->type, GL_TEXTURE_WRAP_T,
                                GL_CLAMP_TO_EDGE);

This GL error is stored until these lines:

    result = renderdata->glGetError();
    if (result != GL_NO_ERROR) {
        GL_SetError("glTexImage2D()", result);
        return -1;
    }

So the if condition evaluates to true.

With GL_CLAMP, I get no GL error at all.

But texture drawing with the OpenGL renderer is still slow.
Comment 13 Sam Lantinga 2012-01-10 18:01:35 UTC
Good find.  Can you try this fix out?
http://hg.libsdl.org/SDL/rev/15ff38383cb7
Comment 14 Sam Lantinga 2012-01-10 18:02:25 UTC
Out of curiosity, is your framerate 60? :)
Comment 15 Arnaud Geltzenlichter 2012-01-11 12:51:08 UTC
Sam,

Your patch doesn't work; you're still using GL_CLAMP_TO_EDGE. Is that a mistake on your part? The if condition is still true, so the same code as before is executed.

For the FPS, I do not limit the framerate. First, I try to get the maximum FPS for different texture sizes, then I check the CPU consumption at a fixed, given FPS.

My results:

Texture size       D3D FPS     OpenGL FPS
64x64              2808        1520
128x128            2650        1072
256x256            2383        710
512x512            2051        231
1024x1024          1478        108

I can't believe that OpenGL performs this poorly...?

For a fixed 60 FPS, the Windows task manager shows 0% CPU consumption, while 3% for OpenGL.
Comment 16 Arnaud Geltzenlichter 2012-01-11 12:52:07 UTC
0 CPU for D3D renderer of course ... :)
Comment 17 Arnaud Geltzenlichter 2012-02-01 08:19:38 UTC
Sam,

Is there a specific reason to use GL_CLAMP_TO_EDGE and not GL_CLAMP?

Is it about texture borders?
Comment 18 Arnaud Geltzenlichter 2012-02-21 12:33:33 UTC
Problem still occurs with the new Nvidia driver 295.73.
Comment 19 Arnaud Geltzenlichter 2012-02-21 14:26:25 UTC
Well, in fact, it seems to be OK with the latest SDL 2.0 snapshot 6302 and driver 295.73!
Comment 20 Adam M. 2014-05-12 21:46:19 UTC
Should this be closed now?
Comment 21 Sam Lantinga 2014-06-25 09:19:29 UTC
It sounds like this is now fixed.