I am using an OpenGL 3.3 context with GLFW3 for windowing, in C. I am using STB_truetype to rasterize characters into a dynamically allocated unsigned char buffer.
I get the data from STB_truetype as follows:
unsigned char ttf_buffer[1<<25];
stbtt_fontinfo font;
fread(ttf_buffer, 1, 1<<25, fopen("Script Thing.ttf", "rb"));
stbtt_InitFont(&font, ttf_buffer, stbtt_GetFontOffsetForIndex(ttf_buffer, 0));
int w, h, xoff, yoff;
int c = 'C', s = 60;
unsigned char *bitmap = stbtt_GetCodepointBitmap(&font, 0, stbtt_ScaleForPixelHeight(&font, s), c, &w, &h, &xoff, &yoff);
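(Side note: the fopen/fread calls above are unchecked; a sketch of the same loading step with basic error checking, reusing the ttf_buffer and font declarations above and assuming stdio.h is included, would look like this:)
FILE *f = fopen("Script Thing.ttf", "rb");
if (!f)
{
    fprintf(stderr, "could not open font file\n");
    return -1;
}
fread(ttf_buffer, 1, sizeof(ttf_buffer), f);
fclose(f);   /* close the file once it has been read */
if (!stbtt_InitFont(&font, ttf_buffer, stbtt_GetFontOffsetForIndex(ttf_buffer, 0)))
{
    fprintf(stderr, "stbtt_InitFont failed\n");
    return -1;
}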
To make sure I have the right data, I print it to the console:
for(int i = 0; i < h; i++)
{
    for(int x = 0; x < w; x++)
    {
        /* treat any pixel with coverage >= 32 as ink */
        printf("%c", bitmap[x + i*w] >> 5 ? '1' : ' ');
    }
    printf("\n");
}
Then I create the 2D OpenGL texture:
glGenTextures(1, &Texture);
glBindTexture(GL_TEXTURE_2D, Texture);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RED, w, h,
             0, GL_RED, GL_UNSIGNED_BYTE, bitmap);
I tested it with the 'A' character and I get:
11111
1111111
111111111
1111111111
11111111111
111111 11111
111111 11111
111111 1111
111111 1111
11111 1111
111111 1111
111111111111 11111
1111111111111111111111
111111111111111111111
111111111111111111111
111111111 1111111111
11111 111111
1111 11111111
11111 11111111
1111 11111111
1111 111 111
1111 111 1111
1111 1111 111
111 1111 111
111 111 111
1111 111 1111
1111 11111111
1111 1111111
1 111
I render the texture in the OpenGL context, and it comes out right, as you can see.
However, when I try the 'C' character:
11
1111111111
1111111111111
111111111111111
11111111111111111
1111111 111111
111111 11111
111111 1111
11111 1111
11111 11
1111
11111
1111
1111
1111
1111
1111
1111
1111
111
111
1111
1111
111
1111
1111 11
111111 11111
11111111111111111
111111111111
11
Same for other characters, such as 'X':
1111 1111
11111 11111
11111 111111
111111 111111
11111 1111111
11111 1111111
11111 1111111
1111 1111111
11111111111
1111111111
11111111
1111111
111111
111111
1111111
11111111
111111111
11111 1111
11111 1111
11111 1111
1111 1111
1111 1111
1111 111
111 1111
111 111
111 1111
11 11111
1111 11111
11 1
Any idea what I might be doing wrong? Thanks.
EDIT: Fixed it by adding glPixelStorei(GL_UNPACK_ALIGNMENT, 1); just before glTexImage2D. Thanks again to doron.
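In context (using the same w, h and bitmap as above), the upload becomes:
/* stb_truetype bitmaps are tightly packed (1 byte per pixel, no row padding),
   while OpenGL defaults to 4-byte row alignment when unpacking client memory */
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RED, w, h,
             0, GL_RED, GL_UNSIGNED_BYTE, bitmap);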
Normally the size of each row will be the width multiplied by the bytes per pixel. But some 2D rasterization software requires additional dead space after each row (for alignment purposes). The jump from row to row is known as the stride, and it can be greater than the width. From the images you have displayed, it looks like the stride of your bitmap is greater than the width.
So what you will have to do is find out what the stride is, and either copy the bitmap into one whose stride equals the width, or use glPixelStorei with the correct GL_UNPACK_ROW_LENGTH.
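For example, a sketch of the glPixelStorei route, where stride is a placeholder for whatever row length (in pixels) your source bitmap actually uses:
glPixelStorei(GL_UNPACK_ROW_LENGTH, stride);   /* pixels from the start of one row to the next */
glPixelStorei(GL_UNPACK_ALIGNMENT, 1);         /* rows start on 1-byte boundaries */
glTexImage2D(GL_TEXTURE_2D, 0, GL_RED, w, h,
             0, GL_RED, GL_UNSIGNED_BYTE, bitmap);
glPixelStorei(GL_UNPACK_ROW_LENGTH, 0);        /* restore the default so later uploads are unaffected */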