Tags: c++, opengl, textures

storing integers in single value texture in opengl not working


I am trying to write a compute shader that works on a state. I want to store the initial value of this state in a single-channel integer texture. However, this does not work.

I have the following code:

GLuint tex;
glGenTextures(1, &tex);

int time_horizon = 1000;
std::vector<decltype(time_horizon)> tex_vals(width * height * 4, time_horizon);
for (auto x: tex_vals)
  assert(x == time_horizon);


glBindTexture(GL_TEXTURE_2D, tex);
assert(glGetError() == 0);

glTexImage2D(GL_TEXTURE_2D, 0, GL_R32I, width, height, 0, GL_RED_INTEGER, GL_INT, tex_vals.data());

auto err = glGetError();
if (err) {
  std::cerr << "gl error is: " << err << '\n';
  exit(1);
}

decltype(time_horizon) test[static_cast<size_t>(width*height)];

glBindTexture(GL_TEXTURE_2D, tex);
glReadPixels(0, 0, width, height, GL_RED_INTEGER, GL_SHORT, test);

for (auto x: test) {
  std::cerr << x << '\n';
  assert(time_horizon == x);
}

The call to glGetError returns 0 in both cases, and the assertion after the glReadPixels call is tripped on its first iteration. The texture consists primarily of 0s, but there also seem to be a few blocks of 1s, -1s, and random values.

I've tried storing ints and shorts and nothing seems to work. I've also fiddled with different GLenum values in the glTexImage2D call, to no avail.


Solution

  • I will just post here the answer that was already provided in the comments on the question.

    glReadPixels reads pixels from the currently bound framebuffer, not from the bound texture.

    To read the pixels back from a texture, use either glGetTexImage or the more modern glGetTextureImage (OpenGL 4.5), which takes the texture handle directly, so the texture does not need to be bound to the context.
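
    For example, here is a minimal sketch of reading the texture back with either call, reusing the tex, width, height, and time_horizon variables from the question (the commented-out variant additionally assumes an OpenGL 4.5 context):

    std::vector<GLint> readback(width * height);

    // Classic path: bind the texture, then read back its level-0 image
    // using the same integer format/type the data was uploaded with.
    glBindTexture(GL_TEXTURE_2D, tex);
    glGetTexImage(GL_TEXTURE_2D, 0, GL_RED_INTEGER, GL_INT, readback.data());

    // OpenGL 4.5 direct-state-access path: no binding needed, and the
    // destination buffer size is validated.
    // glGetTextureImage(tex, 0, GL_RED_INTEGER, GL_INT,
    //                   readback.size() * sizeof(GLint), readback.data());

    for (auto x : readback)
      assert(x == time_horizon);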