
Cube map doesn't work with optirun on Ubuntu


I can't understand why this code works fine normally but fails as soon as I run it under optirun:

//Initialize GLUT and OpenGL
void ViewPort::startVideo(int argc, char** argv) {
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_RGBA | GLUT_DOUBLE | GLUT_DEPTH);
    glutInitWindowSize(XWIN, YWIN);
    glutCreateWindow("Titolo");

    glEnable(GL_DEPTH_TEST);
    glEnable(GL_LIGHTING);
    glEnable(GL_COLOR_MATERIAL);
    glEnable(GL_TEXTURE_CUBE_MAP);
    glHint(GL_LINE_SMOOTH_HINT, GL_NICEST);
    glEnable(GL_CULL_FACE);
    glCullFace(GL_BACK);

    //OpenGL INFO
    printf("GL_VERSION:  %s\n", glGetString(GL_VERSION));
    printf("GL_RENDERER: %s\n", glGetString(GL_RENDERER));
    printf("GL_VENDOR:   %s\n", glGetString(GL_VENDOR));

    shadow = new ShadowMap(256 * SHADOW_QUALITY, 256 * SHADOW_QUALITY);
}

/*...*/

ShadowMap::ShadowMap(int w, int h) {
    this->h = h;
    this->w = w;

    shadowShader = new Shader("./basicShader.vs", "./shadowShader.fs");

    // Create the cube map
    glGenTextures(1, &shadowCubeID);
    glBindTexture(GL_TEXTURE_CUBE_MAP, shadowCubeID);

    if (int err = glGetError())
        cerr << "glDebug: " << gluErrorString(err) << " shadow map" << endl;

    for (unsigned int i = 0; i < 6; i++) {
        // Cube map faces must be square, hence w x w
        glTexImage2D(GL_TEXTURE_CUBE_MAP_POSITIVE_X + i, 0, GL_DEPTH_COMPONENT,
                     w, w, 0, GL_DEPTH_COMPONENT, GL_FLOAT, NULL);
        glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
        glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
        glTexParameteri(GL_TEXTURE_CUBE_MAP, GL_TEXTURE_WRAP_R, GL_CLAMP_TO_EDGE);
        if (int err = glGetError())
            cerr << "glDebug: " << gluErrorString(err) << " shadow map" << endl;
    }

    // Now I render to the shadowFBO
    glGenFramebuffers(1, &shadowFBO);
    glBindFramebuffer(GL_FRAMEBUFFER, shadowFBO);

    // Disable reads and writes to the color buffer
    glDrawBuffer(GL_NONE);
    glReadBuffer(GL_NONE);

    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT,
                           GL_TEXTURE_CUBE_MAP_POSITIVE_X, shadowCubeID, 0);

    errorCheckFBO();

    glBindFramebuffer(GL_FRAMEBUFFER, 0);
    glBindTexture(GL_TEXTURE_CUBE_MAP, 0);
}

The error (invalid operation) occurs when I create each of the six faces of the cube map with glTexImage2D(GL_TEXTURE_CUBE_MAP_POSITIVE_X + i, ...). The only thing I can imagine is that there is some incompatibility with the OpenGL version.
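(Editor's note, not part of the original question: since glGetError returns queued errors from any earlier call, draining the error flag right before the suspect call ensures the INVALID_OPERATION really comes from glTexImage2D and not from something earlier. A minimal sketch in the same style as the code above:)

    // Drain any stale errors so they are not misattributed to the next call
    while (glGetError() != GL_NO_ERROR) {}

    glTexImage2D(GL_TEXTURE_CUBE_MAP_POSITIVE_X, 0, GL_DEPTH_COMPONENT,
                 w, w, 0, GL_DEPTH_COMPONENT, GL_FLOAT, NULL);

    if (GLenum err = glGetError())
        cerr << "glTexImage2D failed: " << gluErrorString(err) << endl;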

The OpenGL info printed when the program works is:

GL_VERSION:   3.0 Mesa 10.1.3
GL_RENDERER: Mesa DRI Intel(R) Haswell Mobile 
GL_VENDOR:  Intel Open Source Technology Center

When it doesn't work:

GL_VERSION:   2.1 Mesa 10.1.3
GL_RENDERER: Gallium 0.4 on llvmpipe (LLVM 3.4, 256 bits)
GL_VENDOR:  VMware, Inc.

How can I work around the problem? (Sorry for my English.)


Solution

  • Cube map depth textures are not supported in OpenGL 2.1. From the 2.1 spec:

    Textures with a base internal format of DEPTH_COMPONENT are supported by texture image specification commands only if target is TEXTURE_1D, TEXTURE_2D, PROXY_TEXTURE_1D or PROXY_TEXTURE_2D. Using this format in conjunction with any other target will result in an INVALID_OPERATION error.

    The corresponding paragraph in the 3.0 spec changes to:

    Textures with a base internal format of DEPTH_COMPONENT or DEPTH_STENCIL are supported by texture image specification commands only if target is TEXTURE_1D, TEXTURE_2D, TEXTURE_1D_ARRAY, TEXTURE_2D_ARRAY, TEXTURE_CUBE_MAP, PROXY_TEXTURE_1D, PROXY_TEXTURE_2D, PROXY_TEXTURE_1D_ARRAY, PROXY_TEXTURE_2D_ARRAY, or PROXY_TEXTURE_CUBE_MAP. Using these formats in conjunction with any other target will result in an INVALID_OPERATION error.

    Based on this, cube map depth textures are a feature of OpenGL 3.0 and higher.
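
    You can therefore detect the problem at runtime before allocating the texture. A minimal sketch (my own addition, not from the original answer): the GL_VERSION string is guaranteed by the spec to begin with "<major>.<minor>" (e.g. "3.0 Mesa 10.1.3"), so parsing its prefix tells you whether depth cube maps can be created. The helper name is hypothetical, and the GL headers are assumed to be the ones the question's code already uses.

        #include <cstdio>

        // Returns true if the current context is at least GL 3.0,
        // the minimum version that allows depth cube map textures.
        bool depthCubeMapsSupported() {
            const char* ver = (const char*) glGetString(GL_VERSION);
            if (!ver)
                return false;              // no current GL context
            int major = 0, minor = 0;
            // GL_VERSION begins with "<major>.<minor>", possibly
            // followed by vendor-specific text.
            if (sscanf(ver, "%d.%d", &major, &minor) != 2)
                return false;
            return major >= 3;
        }

    Incidentally, llvmpipe is Mesa's software rasterizer, so the GL_RENDERER string in your failing case suggests optirun is falling back to software rendering instead of using the discrete GPU. Fixing the Bumblebee/driver configuration so that optirun actually exposes the NVIDIA card should give you a context of at least 3.0 and make the depth cube map work.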