I'm using FreeType to load the glyphs: FT_Load_Char renders each glyph bitmap, I upload it as an OpenGL texture, and then I create an instance of my custom class Character that holds the metrics and texture for rendering later:
for (unsigned int c = min; c < max; c++)
{
    // Skip characters the face has no glyph for
    if (FT_Get_Char_Index(faces[faceIndex].second, c) == 0)
        continue;
    if (FT_Load_Char(faces[faceIndex].second, c, FT_LOAD_RENDER))
    {
        cout << "Error: Failed to load Glyph: " << c << endl;
        continue;
    }
    FT_GlyphSlot glyph = faces[faceIndex].second->glyph;

    // Upload the 8-bit grayscale bitmap as a single-channel texture
    GLuint texture;
    glGenTextures(1, &texture);
    glBindTexture(GL_TEXTURE_2D, texture);
    glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RED, glyph->bitmap.width, glyph->bitmap.rows, 0, GL_RED, GL_UNSIGNED_BYTE, glyph->bitmap.buffer);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    // Store the metrics and texture for later rendering
    Character character;
    character.texture = texture;
    character.bearingX = glyph->bitmap_left;
    character.bearingY = glyph->bitmap_top;
    character.width = glyph->bitmap.width;
    character.height = glyph->bitmap.rows;
    character.advance = glyph->advance.x;
    characters[faceIndex][c] = character;
    count++;
}
This works correctly for the vast majority of characters, but typing something like asd-=_+123 produces the following:
So some glyphs such as +, = and - are not loading correctly. I'm using NotoMono-Regular, so the font obviously has these basic glyphs.
Further debugging by printing out the glyph's bitmap buffer gives the following for "bad" characters:
bitmap buffer of - gives: α α α α α α α α α α
while for "good" characters it's something like:
bitmap buffer of 1 gives: 94
So I think the problem is here, but I'm not sure how to fix it:
Character character;
character.texture = texture;
character.bearingX = glyph->bitmap_left;
character.bearingY = glyph->bitmap_top;
character.width = glyph->bitmap.width;
character.height = glyph->bitmap.rows;
character.advance = glyph->advance.x;
As it turns out, using unsigned int for the height and int for bearingY is semantically correct. However, there is arithmetic down the line that goes like

transformation.translation().y() - (ch.height - ch.bearingY) * size * renderer->aspectRatio

where in ch.height - ch.bearingY the signed operand is implicitly converted to unsigned int (the usual arithmetic conversions), and the unsigned result is then implicitly converted to float. As you can guess, if the expected result of ch.height - ch.bearingY is negative, bad things happen: the subtraction wraps around to a huge unsigned value. Who decided unsigned int has the higher conversion rank anyway....