
Freetype2 FT_Load_Char memory leaks


I am hunting my program's memory leaks with Visual Leak Detector and cannot figure out how to get rid of a leak reported for FT_Load_Char; the documentation also doesn't say anything about deallocating the memory of a GlyphSlot.

Here is the code snippet where I get a leak of about 350 bytes.

// creating ASCII symbol map; g is presumably face->glyph (the glyph slot),
// and ox/oy track the write position inside the atlas texture
for (int i = 32; i < 128; i++) {
    if (FT_Load_Char(face, i, FT_LOAD_RENDER)) { // leak comes from here
        fprintf(stderr, "Loading character %c failed!\n", i);
        continue;
    }

    glTexSubImage2D(GL_TEXTURE_2D, 0, ox, oy, g->bitmap.width, g->bitmap.rows, 
        GL_ALPHA, GL_UNSIGNED_BYTE, g->bitmap.buffer);

    // advance is in 26.6 fixed-point units; shift by 6 to get pixels
    float ax = g->advance.x >> 6;
    float ay = g->advance.y >> 6;

    float bw = g->bitmap.width;
    float bh = g->bitmap.rows;

    float bl = g->bitmap_left;
    float bt = g->bitmap_top;

    m_GlyphMap[i] = Glyph(ax,ay, bw, bh, bl, bt, ox, oy);

    ox += g->bitmap.width + 1;

    // there should be some sort of deallocation...
}

So the main question: is there some function to deallocate the GlyphSlot that I am missing? Or is it a bug in FreeType?


Solution

  • Make sure you call FT_Done_FreeType(lib_); before your program exits or once you stop using FreeType; the glyph slot is owned by the face, so its memory is only released when the face and the library are destroyed. If you already do that, make sure you are using the latest FreeType version. I have almost the same loop and it works just fine on Windows 8 x64. Here's my code (a minimal init/teardown sketch follows at the end of this answer):

    // glyphSlot is face_->glyph; err_, previous, delta, width and glyphs
    // are declared before the loop
    for (UINT32 i = 0; i < text.length(); i++) {
        err_ = FT_Load_Char(face_, text[i], FT_LOAD_RENDER);
        if (err_) {
            LOGW("Unable to select, load and render character."
                " Error code: %d", err_);
            continue;
        }
        FT_Bitmap bitmap = glyphSlot->bitmap;
        FT_UInt glyphIndex = FT_Get_Char_Index(face_, text[i]);
        err_ = FT_Get_Kerning(face_, previous, glyphIndex,
            FT_KERNING_DEFAULT, &delta);
        if (err_) {
            LOGW("Unable to get kerning for character."
                " Error code: %d", err_);
            continue;
        }
        Glyph tmp;
        tmp.kerningOffset = delta.x >> 6;
        tmp.buffer = new UINT8[bitmap.rows * bitmap.width];
        memcpy(tmp.buffer, bitmap.buffer, bitmap.rows * bitmap.width);
        tmp.height = bitmap.rows;
        tmp.width = bitmap.width;
        tmp.offsetLeft = glyphSlot->bitmap_left;
        if (tmp.offsetLeft < 0) {
            tmp.offsetLeft = 0;
        }
        tmp.offsetTop = glyphSlot->bitmap_top;
        tmp.advanceX = glyphSlot->advance.x >> 6;
        tmp.advanceY = glyphSlot->advance.y >> 6;
        glyphs.push_back(tmp);
        previous = glyphIndex;
        width += tmp.advanceX + tmp.kerningOffset;
    }
    

    Also, don't forget to delete the symbol buffers if you allocate them separately:

    for (SIZE i = 0; i < glyphs.size(); i++) {
        Glyph g = glyphs[i];
        delete [] g.buffer;
    }