
Can't see my texture on 2d object using OpenGL


I'm trying to use textures in my mini OpenGL program. Since using OpenGL involves a lot of duplicated code, I abstracted the texture handling into a class, but now I can't see anything in the window. OpenGL doesn't report any errors. I'm using stb_image.h from https://github.com/nothings/stb/ to load image files. What am I doing wrong? I'm on Ubuntu 20.04.
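
The glcall macro wrapping every call below is an error-checking helper; its definition isn't included in the question, but a minimal sketch of such a wrapper (assuming it drains and reports glGetError, which is not confirmed by the original post) would look like:

#include <iostream>

// Assumed glGetError-based wrapper; the real glcall definition
// is not part of this question.
#define glcall(x)                                                        \
    do {                                                                 \
        while (glGetError() != GL_NO_ERROR); /* clear stale errors */    \
        x;                                                               \
        while (GLenum err = glGetError())                                \
            std::cerr << "[OpenGL error] " << err << " in " #x "\n";     \
    } while (0)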

Class declaration:

class texture {
protected:
    unsigned int textureid;
    unsigned char* localbuf;
    int length, width, pixelbit;
    std::string srcpath;
public:
    texture(std::string file);
    ~texture();
    int getwidth() const;
    int getlength() const;
    void bind(unsigned int slot = 0) const;
    void unbind() const;
};

Class implementation:

texture::texture(std::string file) {
    srcpath = file;

    stbi_set_flip_vertically_on_load(1);
    localbuf = stbi_load(file.c_str(), &width, &length, &pixelbit, 4);

    glcall(glGenTextures(1, &textureid));
    glcall(glBindTexture(GL_TEXTURE_2D, textureid));

    glcall(glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR));
    glcall(glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR));
    glcall(glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE));
    glcall(glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE));

    glcall(glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, length, 0, GL_RGBA, GL_UNSIGNED_BYTE, localbuf));

    glcall(glBindTexture(GL_TEXTURE_2D, 0));

    if (localbuf) stbi_image_free(localbuf);
}

texture::~texture() {
    glcall(glDeleteTextures(1, &textureid));
}

void texture::bind(unsigned int slot) const {
    glcall(glActiveTexture(GL_TEXTURE0 + slot));
    glcall(glBindTexture(GL_TEXTURE_2D, textureid));
}

void texture::unbind() const {
    glcall(glBindTexture(GL_TEXTURE_2D, 0));
}

int texture::getwidth() const {
    return width;
}

int texture::getlength() const {
    return length;
}

Vertex shader:

#version 330 core
layout(location = 0) in vec4 position;
layout(location = 1) in vec2 texpos;
out vec2 v_texpos;
void main() {
    gl_Position = position;
    v_texpos = texpos;
}

Fragment shader:

#version 330 core
layout(location = 0) out vec4 color;
in vec2 v_texpos; 
uniform sampler2D u_texture;
void main() {
    vec4 texcolor = texture(u_texture, v_texpos);
    color = texcolor;
}

main function:

int main() {
    ...
    texture mytexture("path/to/image/file.png");
    mytexture.bind();
    glUniform1i(glGetUniformLocation(shader, "u_texture"), 0);
    ...
    while (window_open) {
        ...
        glDrawElements(...);
        ...
    }
    ...
}

Solution

  • I'm using a class to handle the layout of vertices. In the implementation of that class, there is a loop like this:

    for (int i = 0; i < ...; i++) {
        ...
        glEnableVertexAttribArray(i);
        ...
    }
    

    The loop above is correct, but I had previously made the mistake of typing 0 instead of i:

    glEnableVertexAttribArray(0);
    

    When I wasn't dealing with textures, this loop ran only once and i was always 0, so the mistake had no effect. But once I started dealing with textures, the loop ran twice (position and texture coordinates), i wasn't always 0, and attribute 1 was never enabled, which is why the texture never showed up. A fuller sketch of the corrected loop is below.
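
    For reference, here is roughly what the corrected loop looks like in context. The element struct, the applylayout signature, and the stride/offset handling are simplified placeholders for illustration, not my actual class:

    #include <cstddef>
    #include <vector>

    struct element {            // simplified stand-in for a layout element
        unsigned int type;      // e.g. GL_FLOAT
        unsigned int count;     // components per vertex: 2 for position, 2 for texpos
        unsigned char normalized;
    };

    void applylayout(const std::vector<element>& elements, unsigned int stride) {
        std::size_t offset = 0;
        for (int i = 0; i < (int)elements.size(); i++) {
            const element& e = elements[i];
            // The index must be i, not a hard-coded 0, so that attribute 1
            // (the texture coordinates) gets enabled as well as attribute 0.
            glcall(glEnableVertexAttribArray(i));
            glcall(glVertexAttribPointer(i, e.count, e.type, e.normalized,
                                         stride, (const void*)offset));
            offset += e.count * sizeof(float); // assumes GL_FLOAT components
        }
    }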