Tags: c++, opengl, freetype2

Can't load glyph properly for OpenGL using FreeType2


It seems as if the FreeType library isn't loading the font properly: the glyphs aren't being converted to bytes for later use as an OpenGL texture.

Here is the result I'm getting with FreeType:

[Image: the garbled glyph output that is rendered]

I've already tried using another texture, so it's unlikely to be a problem with my texture manager.

Font(const std::string& font) // constructor
        :name(font)
    {
        std::string _font = "fonts/" + font + ".ttf";

        FT_Library ft;
        if (FT_Init_FreeType(&ft))
            EXIT_ERROR(-10);

        FT_Face face;
        if (FT_New_Face(ft, _font.c_str(), 0, &face))
            EXIT_ERROR(-11);

        FT_Set_Pixel_Sizes(face, 0, 48);

        for (unsigned int a = 1; a < 128; a++)
        {
            char c = a;

            if (FT_Load_Char(face, c, FT_LOAD_RENDER))
            {
                EXIT_ERROR(-12);
            }

            // This just creates the FontTexture object; as I said before, it
            // works fine. FontTexture is derived from BaseTexture, which
            // stores a char* with the data. I also made sure the data is
            // loaded there properly from inside FontTexture.
            FontTexture* _char = new FontTexture(
                face->glyph->bitmap.buffer, // the glyph's bitmap buffer is passed here
                0,
                { static_cast<float>(face->glyph->bitmap.width),
                  static_cast<float>(face->glyph->bitmap.rows) });

            _char->SetAdvance(face->glyph->advance.x);
            _char->SetBearing({ static_cast<float>(face->glyph->bitmap_left),
                static_cast<float>(face->glyph->bitmap_top) });

            // This just caches the texture, so instead of loading it multiple
            // times I can call getTexture(name), store that pointer in the
            // entity's memory, and bind it when Draw() is called.
            TextureManager::getTextureManager().PrecacheTexture(std::to_string(a) + font, _char);

            Characters.insert(std::pair<char, FontTexture*>(c, _char));
        }

        FT_Done_Face(face);
        FT_Done_FreeType(ft);
    }

Solution

  • The GL_UNPACK_ALIGNMENT parameter defines the alignment of the first pixel in each row (line) of an image when the image is read from the buffer. By default this parameter is 4.
    Each pixel of the glyph image is encoded in one byte and the image is tightly packed, so the alignment of the glyph image is 1. The parameter therefore has to be changed before the image is read and the texture is specified:

    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
    

    If that is omitted, it causes a shift effect at each line of the image (except when the width of the image is divisible by 4): with the default alignment of 4, OpenGL assumes each row starts on a 4-byte boundary, so for a glyph 13 pixels wide it skips 3 extra bytes after every row, shifting each successive line.
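
    For completeness, here is a minimal sketch of how the glyph bitmap could be uploaded inside FontTexture with the alignment set first. The FontTexture internals aren't shown in the question, so this upload path is an assumption; the calls themselves are standard OpenGL, and GL_RED is used because the glyph bitmap has one byte per pixel.

    // Hypothetical upload path inside FontTexture; assumes a GL 3.x core context.
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1); // glyph rows are tightly packed, 1 byte per pixel

    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);

    // One byte per pixel -> single-channel GL_RED texture
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RED,
                 face->glyph->bitmap.width, face->glyph->bitmap.rows,
                 0, GL_RED, GL_UNSIGNED_BYTE, face->glyph->bitmap.buffer);

    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    // Optionally restore the default afterwards so other texture uploads are unaffected:
    glPixelStorei(GL_UNPACK_ALIGNMENT, 4);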