Tags: java, opengl, textures, lwjgl

OpenGL drawing texture wrong


I'm using LWJGL and trying to draw a texture. Its render code is the following:

public static void main(String[] args) {
    GLFWErrorCallback.createPrint(System.err).set();
    if (!GLFW.glfwInit()) {
        throw new IllegalStateException("Unable to initialize GLFW");
    }
    GLFW.glfwWindowHint(GLFW.GLFW_VISIBLE, GLFW.GLFW_FALSE);
    GLFW.glfwWindowHint(GLFW.GLFW_RESIZABLE, GLFW.GLFW_TRUE);
    window = GLFW.glfwCreateWindow(1280, 720, "Test", 0, 0);
    GLFW.glfwMakeContextCurrent(window);
    GL.createCapabilities();
    GL11.glMatrixMode(GL11.GL_PROJECTION);
    GL11.glLoadIdentity();
    GL11.glOrtho(0, 1280, 0, 720, 1, -1);
    GL11.glMatrixMode(GL11.GL_MODELVIEW);
    GL11.glViewport(0, 0, 1920, 1200);
    GL11.glClearColor(1.0F, 1.0F, 1.0F, 1.0F);
    int x = 0, y = 0;
    ByteBuffer imageBuffer = readFile(filename);
    IntBuffer w = BufferUtils.createIntBuffer(1);
    IntBuffer h = BufferUtils.createIntBuffer(1);
    IntBuffer comp = BufferUtils.createIntBuffer(1);
    ByteBuffer image = STBImage.stbi_load_from_memory(imageBuffer, w, h, comp, 0);  
    int textureId = GL11.glGenTextures();
    int glTarget = GL11.GL_TEXTURE_2D;
    GL11.glBindTexture(glTarget, textureId);
    glTexParameteri(glTarget, GL11.GL_TEXTURE_WRAP_S, GL12.GL_CLAMP_TO_EDGE);
    glTexParameteri(glTarget, GL11.GL_TEXTURE_WRAP_T, GL12.GL_CLAMP_TO_EDGE);
    glTexParameteri(glTarget, GL11.GL_TEXTURE_MIN_FILTER, GL11.GL_NEAREST);
    glTexParameteri(glTarget, GL11.GL_TEXTURE_MAG_FILTER, GL11.GL_NEAREST);
    int width = w.get(0);
    int height = h.get(0);
    /* Send texel data to OpenGL if the texture is a 2D texture */
    if (glTarget == GL11.GL_TEXTURE_2D) {
        if(comp.get(0) == 3){
            GL11.glTexImage2D(glTarget, 0, GL11.GL_RGB, width, height, 0, GL11.GL_RGB, GL11.GL_UNSIGNED_BYTE, image);
        }
        else{
            GL11.glTexImage2D(glTarget, 0, GL11.GL_RGBA, width, height, 0, GL11.GL_RGBA, GL11.GL_UNSIGNED_BYTE, image);
            GL11.glEnable(GL11.GL_BLEND);
            GL11.glBlendFunc(GL11.GL_SRC_ALPHA, GL11.GL_ONE_MINUS_SRC_ALPHA);
        }
    }
    while (Sgl.window.isAlive()) {
        GLFW.glfwPollEvents();
        GL11.glClear(GL11.GL_COLOR_BUFFER_BIT | GL11.GL_DEPTH_BUFFER_BIT);
        /* Texture display part */
        bind();
        GL11.glEnable(glTarget);
        GL11.glBegin(GL11.GL_QUADS);
        GL11.glTexCoord2f(0,0);
        GL11.glVertex2f(x,y);
        GL11.glTexCoord2f(1,0);
        GL11.glVertex2f(x+width,y);
        GL11.glTexCoord2f(1,1);
        GL11.glVertex2f(x+width,y+height);
        GL11.glTexCoord2f(0,1);
        GL11.glVertex2f(x,y+height);
        GL11.glEnd();
        GL11.glDisable(glTarget);
        /*End texture display part*/
        GLFW.glfwSwapBuffers(window);
    }
}

The problem is that the window is 1280x720 big and the image only 392x69 but it is displayed like this:

[Screenshot: the texture is rendered upside down, much larger than 392x69, in the lower-left region of the window]

So, it's upside down, much bigger than expected and at the wrong position.

What am I doing wrong?

Edit: I remove some if clauses due to size of the code.


Solution

  • Going through your issues one by one

    1. So, it's upside down,

    OpenGL's texture coordinates (not only GL's, this is generally true for all common rendering APIs) are defined in such a way that the origin (0,0) refers to the very first pixel you specify when uploading the data. Your image is most likely defined and loaded with the convention left-to-right, top-to-bottom in mind - so the vertex to which you assign the (0,0) texcoords will show the upper left corner of your image.

    Now, GL's window space is defined (by default, at least) with mathematical conventions: the origin is at the bottom left. And you're setting up a projection matrix:

    GL11.glOrtho(0, 1280, 0, 720, 1, -1);

    This will map x=0 to x_ndc=-1 and x=1280 to x_ndc=1, and y=0 to y_ndc=-1 and y=720 to y_ndc=1. (It will map the z coordinate flipped relative to the range [1,-1] you specified, so z=-1 to z_ndc=-1 and z=1 to z_ndc=1, but that is irrelevant here.)

    NDC are the normalized device coordinates, where (-1,-1) is the bottom left corner of your viewport, and (1,1) the top-right corner, respectively.

    So when you now do:

      GL11.glTexCoord2f(0,0);
      GL11.glVertex2f(x,y); // NOTE: x and y are just 0 here


    the above transformations will be applied, and the vertex with the top-left texel will end up at the bottom-left corner of your viewport.
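    In the question's own immediate-mode style, a minimal fix (a sketch, assuming the image really is stored top-to-bottom) is to flip the V coordinate, v -> 1 - v, when assigning texcoords, so texel row 0 ends up at the top of the on-screen quad:

```java
// Sketch: same quad as in the question, but with V flipped (v -> 1 - v)
// so the image's first (top) row, at texcoord v=0, is drawn at the TOP.
// Assumes a texture is bound and a fixed-function context is current.
GL11.glBegin(GL11.GL_QUADS);
GL11.glTexCoord2f(0, 1); GL11.glVertex2f(x, y);                  // bottom-left
GL11.glTexCoord2f(1, 1); GL11.glVertex2f(x + width, y);          // bottom-right
GL11.glTexCoord2f(1, 0); GL11.glVertex2f(x + width, y + height); // top-right
GL11.glTexCoord2f(0, 0); GL11.glVertex2f(x, y + height);         // top-left
GL11.glEnd();
```

    Alternatively, calling STBImage.stbi_set_flip_vertically_on_load(true) before loading makes stb_image hand you the rows bottom-to-top, matching GL's convention, so the original texcoords would then be correct.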

    2. much bigger than expected

    You set up your viewport transformation as follows:

    GL11.glViewport(0, 0, 1920, 1200);
    

    Now, this just defines another transformation, this time from NDC to window space: x_ndc=-1 is mapped to x_win=0 and x_ndc=1 to x_win=1920, and for y respectively (y_ndc=-1 to y_win=0, y_ndc=1 to y_win=1200).

    So, the input coordinate (0,0) is mapped to (-1,-1) in NDC, and further to (0,0) in window space, which is still the bottom left corner. (392,69) will be mapped to roughly (-0.39,-0.81) in NDC, and to (588,115) in window space - which is way bigger than your image actually is.
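    Spelled out as plain arithmetic (using the ortho and viewport numbers from the question), the two mappings compose like this:

```java
// ortho:    x in [0,1280] -> x_ndc in [-1,1];   y in [0,720] -> y_ndc in [-1,1]
// viewport: x_ndc in [-1,1] -> x_win in [0,1920]; y_ndc in [-1,1] -> y_win in [0,1200]
double x = 392, y = 69;               // the image's far corner
double xNdc = 2 * x / 1280 - 1;       // -0.3875
double yNdc = 2 * y / 720 - 1;        // about -0.808
double xWin = (xNdc + 1) / 2 * 1920;  // 588.0
double yWin = (yNdc + 1) / 2 * 1200;  // 115.0
```

    So the 392x69 image ends up covering 588x115 window pixels.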

    It should still fit inside your window, but from the screenshot, it looks like it doesn't. There may be several explanations:

    • your code isn't exactly reproducing the issue, and your remark

      I remove some if clauses due to size of the code.

      might or might not have something to do with that.

    • you are using some "high DPI scaling" (as Microsoft calls it) or a similar feature of your operating system. The size of the window you specify in GLFW is not in pixels, but in some system- and platform-specific unit. GLFW provides means to query the actual size in pixels (glfwGetFramebufferSize), which you should use to set an appropriate viewport for the window.
    • ...
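    To rule the DPI issue out, you can query the framebuffer size in pixels and use that for the viewport (a sketch, assuming `window` is the GLFW window handle from the question):

```java
// glfwGetFramebufferSize returns the drawable size in PIXELS, which can
// differ from the requested 1280x720 "screen units" under DPI scaling
// (e.g. at 150% scaling, a 1280-unit-wide window is 1920 pixels wide).
try (MemoryStack stack = MemoryStack.stackPush()) {
    IntBuffer fbW = stack.mallocInt(1);
    IntBuffer fbH = stack.mallocInt(1);
    GLFW.glfwGetFramebufferSize(window, fbW, fbH);
    GL11.glViewport(0, 0, fbW.get(0), fbH.get(0));
}
```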

    3. and at the wrong position

    That's also a result of your mis-match of OpenGL coordinate conventions.

    4. What am I doing wrong?

    The code you have written uses deprecated OpenGL. You are relying on the fixed-function pipeline, which was deprecated over a decade ago and has been completely removed from modern core profiles of OpenGL. And even before that, immediate-mode rendering via glBegin()/glEnd() had basically been superseded by vertex arrays some 20 years ago. If you are learning OpenGL right now, you should really try to avoid learning that old stuff, and start with a clean core-profile OpenGL context.
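    For comparison, the modern route looks roughly like this (a sketch only; shader compilation and vertex-attribute setup are omitted):

```java
// 1. Request a core profile before creating the window:
GLFW.glfwWindowHint(GLFW.GLFW_CONTEXT_VERSION_MAJOR, 3);
GLFW.glfwWindowHint(GLFW.GLFW_CONTEXT_VERSION_MINOR, 3);
GLFW.glfwWindowHint(GLFW.GLFW_OPENGL_PROFILE, GLFW.GLFW_OPENGL_CORE_PROFILE);

// 2. Upload the quad ONCE into a vertex buffer (two triangles, x,y,u,v),
//    instead of re-sending it every frame with glBegin()/glEnd():
float[] vertices = {
    //  x,    y,   u,  v   (V already flipped so the image appears upright)
      0f,   0f,  0f, 1f,
    392f,   0f,  1f, 1f,
    392f,  69f,  1f, 0f,
    392f,  69f,  1f, 0f,
      0f,  69f,  0f, 0f,
      0f,   0f,  0f, 1f,
};
int vbo = GL15.glGenBuffers();
GL15.glBindBuffer(GL15.GL_ARRAY_BUFFER, vbo);
GL15.glBufferData(GL15.GL_ARRAY_BUFFER, vertices, GL15.GL_STATIC_DRAW);

// 3. Draw with a small vertex/fragment shader pair that applies your own
//    orthographic matrix; glOrtho and the matrix stack don't exist in core.
```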