Tags: opengl, nvidia, texture-mapping

nVidia openGL fails to display simple COLOR_INDEX texture


I have my first simple OpenGL program to display 2D images. I'm using an index-based image and calling glTexImage2D(..., GL_RGB, ..., GL_COLOR_INDEX, ...). This works as expected on an ATI card. Having swapped to an nVidia card, I see a black quad instead of my image. Given that it works on the ATI, I guess the code is basically correct, but maybe I have missed a setting, or maybe the card doesn't support what I'm doing (?!)

First, the setup code (I'm using Qt, by the way, so there are probably some context calls I'm not showing):-

glClearColor( 0.1, 0.1, 0.25, 0); // background color

glEnable(GL_DEPTH_TEST);
glDepthFunc(GL_LEQUAL);

glPixelStorei(GL_UNPACK_ALIGNMENT, 4);      // 4-byte pixel alignment

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR ); 
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR );
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP);
glTexEnvf( GL_TEXTURE_ENV, GL_TEXTURE_ENV_MODE, GL_DECAL );
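
Regarding the Qt context calls mentioned above: a QGLWidget subclass normally puts one-time state in initializeGL() and drawing in paintGL(), and Qt makes the GL context current before calling either, so no extra context calls should be needed inside them. A sketch of that structure (class and member names are illustrative, not the asker's actual code):

#include <QGLWidget>

class ImageWidget : public QGLWidget
{
protected:
    void initializeGL()          // one-time state: the setup code above
    {
        glClearColor(0.1, 0.1, 0.25, 0);
        glEnable(GL_DEPTH_TEST);
        // ... remaining texture parameter / environment calls ...
    }
    void resizeGL(int w, int h)  // viewport and projection
    {
        glViewport(0, 0, w, h);
    }
    void paintGL()               // per-frame drawing: the display code further down
    {
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        // ... draw the textured quad ...
    }
};

Work done outside these overrides (such as uploading a texture) does need an explicit makeCurrent(), which the texture-setup code below already calls.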

Here's the code to set the texture :-

GLfloat Greys[256];
GLfloat Ones[256];
for( int I(0); I < 256; ++I )
{
  Greys[I] = (GLfloat)I/256;
  Ones[I] = 1.0;
}

makeCurrent();
glPixelMapfv( GL_PIXEL_MAP_I_TO_R, 256, Greys );
glPixelMapfv( GL_PIXEL_MAP_I_TO_G, 256, Greys );
glPixelMapfv( GL_PIXEL_MAP_I_TO_A, 256, Ones );

glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB, m_pImage->size().width(), m_pImage->size().height(), 0, GL_COLOR_INDEX, GL_UNSIGNED_BYTE, m_pImage->bits() );

Here's the display code:-

glLoadIdentity();
// Get the camera in the right place
glRotatef( 180, 1, 0, 0 );
// Apply the Pan(Offset), and Zoom
glTranslatef( m_Offset.x(), m_Offset.y(), 0);
glScalef( m_Zoom, m_Zoom, 1 );

// Display the image texture mapped to a rectangle
glColor3f( 1,1,0 );

glEnable(GL_TEXTURE_2D);
glBegin(GL_QUADS);
glTexCoord2f( 0, 0 );  glVertex3f( 0, 0, 10 );
glTexCoord2f( 1, 0 );   glVertex3f( ImSize.width(), 0, 10 );
glTexCoord2f( 1, 1 );   glVertex3f( ImSize.width(), ImSize.height(), 10 );
glTexCoord2f( 0, 1 );   glVertex3f( 0, ImSize.height(), 10 );
glEnd();
glDisable(GL_TEXTURE_2D);

I also display the same image in full colour, in a separate window, using a straight RGB-to-RGB call to glTexImage2D. So I'm confident the dimensions are acceptable.
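
For reference, that full-colour path is presumably just the same call with matching source and internal formats; a sketch, assuming the QImage is already packed as 24-bit RGB (the actual flags may differ):

// Hypothetical straight RGB upload for the full-colour window
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB,
             m_pImage->size().width(), m_pImage->size().height(), 0,
             GL_RGB, GL_UNSIGNED_BYTE, m_pImage->bits());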

If I remove the call to glTexImage2D then I get a yellow quad as expected. Thus I suspect I have a problem with my calls to set the colour LUTs.

The board is an ASUS GeForce 210 Silent, running Windows XP 32-bit, with nVidia drivers 6.14.13.681 (9-23-2012), R306.81 (branch: r304_70-122).


Solution

  • Did you test for OpenGL error codes? You may use this code: https://gist.github.com/4144988. Regarding color index formats: I wouldn't be surprised if they simply weren't supported by the driver; nobody uses color index formats these days. If you want to draw a paletted texture, upload the palette into a 1D RGB texture and the color-indexed image into a single-channel (GL_RED or GL_LUMINANCE, depending on the OpenGL version) 2D texture, then use the sampled value as an index into the palette texture, as sketched below.
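
A minimal error-check sketch along the lines of that gist (the checkGLError helper name is made up for illustration):

#include <cstdio>

// Drain and report any pending OpenGL errors, tagged with a location label
static void checkGLError(const char* where)
{
    for (GLenum err = glGetError(); err != GL_NO_ERROR; err = glGetError())
        std::fprintf(stderr, "%s: OpenGL error 0x%04X\n", where, (unsigned)err);
}

// Call it right after each suspect call, e.g.:
//   glPixelMapfv(GL_PIXEL_MAP_I_TO_R, 256, Greys);  checkGLError("glPixelMapfv");
//   glTexImage2D(...);                              checkGLError("glTexImage2D");

And a rough sketch of the suggested palette approach, assuming a compatibility-profile context with shader support (on Windows the shader entry points need an extension loader or Qt's QGLFunctions); the names, the grey-ramp palette, and the width/height/indexedPixels placeholders are illustrative only:

// 1D palette texture: 256 RGB entries (here a grey ramp, matching the
// I_TO_R / I_TO_G maps in the question)
GLubyte palette[256][3];
for (int i = 0; i < 256; ++i)
    palette[i][0] = palette[i][1] = palette[i][2] = (GLubyte)i;

GLuint paletteTex = 0, indexTex = 0;
glGenTextures(1, &paletteTex);
glBindTexture(GL_TEXTURE_1D, paletteTex);
glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_1D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage1D(GL_TEXTURE_1D, 0, GL_RGB, 256, 0, GL_RGB, GL_UNSIGNED_BYTE, palette);

// 2D index texture: one byte per pixel, uploaded as luminance
// (use GL_RED instead on core profiles)
glGenTextures(1, &indexTex);
glBindTexture(GL_TEXTURE_2D, indexTex);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, width, height, 0,
             GL_LUMINANCE, GL_UNSIGNED_BYTE, indexedPixels);

// Fragment shader: read the normalised index, remap it to the centre of the
// matching palette texel, then fetch the colour from the 1D palette
const char* fragSrc =
    "uniform sampler2D indexTex;\n"
    "uniform sampler1D paletteTex;\n"
    "void main()\n"
    "{\n"
    "    float index = texture2D(indexTex, gl_TexCoord[0].xy).r;\n"
    "    gl_FragColor = texture1D(paletteTex, index * (255.0/256.0) + 0.5/256.0);\n"
    "}\n";

Compile and link fragSrc with the usual glCreateShader / glShaderSource / glCompileShader / glAttachShader / glLinkProgram sequence, bind indexTex and paletteTex to two texture units, set the two sampler uniforms to those units, and draw the same textured quad with the program bound.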