Currently I'm writing a renderer that uses many textures and will fill up my graphics card's video memory (3 GB on my NVIDIA GTX 780 Ti). So I pre-compressed all the necessary images with Mali's texture compression tool and integrated libktx into my renderer for loading the compressed textures (*.ktx).
The compression works really well. For RGB images (compressed with GL_COMPRESSED_RGB8_ETC2) it consistently reaches 4 bpp, and for RGBA ones (GL_COMPRESSED_RGBA8_ETC2_EAC) 8 bpp, as stated in the specs. But whenever those compressed images are uploaded to the GPU, they take up their original (uncompressed) sizes in video memory.
I'm loading the compressed textures using:
ktxLoadTextureN(...);
and I can see that inside that function, libktx will call:
glCompressedTexImage2D(GLenum target, GLint level,
                       GLenum internalformat,
                       GLsizei width, GLsizei height,
                       GLint border,
                       GLsizei imageSize,
                       const GLvoid* data);
The imageSize parameter passed to glCompressedTexImage2D() matches my compressed data size, but after this function executes, video memory usage increases by the decompressed image size.
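(For reference, the expected size of one ETC2 mip level can be computed from the block layout; this helper is just an illustrative sketch, not part of libktx:)

    /* ETC2 encodes 4x4 texel blocks: 8 bytes per block for
     * GL_COMPRESSED_RGB8_ETC2 (4 bpp) and 16 bytes per block for
     * GL_COMPRESSED_RGBA8_ETC2_EAC (8 bpp). Dimensions round up
     * to whole blocks. */
    GLsizei etc2ImageSize(GLenum internalformat, GLsizei width, GLsizei height)
    {
        GLsizei blocksWide = (width  + 3) / 4;
        GLsizei blocksHigh = (height + 3) / 4;
        GLsizei bytesPerBlock =
            (internalformat == GL_COMPRESSED_RGBA8_ETC2_EAC) ? 16 : 8;
        return blocksWide * blocksHigh * bytesPerBlock;
    }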
So my question is: are compressed textures always decompressed before being uploaded to the GPU? If so, is there any standardized texture compression format that allows a compressed texture to stay compressed in video memory and be decoded on the fly by the GPU?
The ETC2 and ETC formats are not commonly used by desktop applications. As such, they might not be natively supported by the desktop GPU and/or its driver. However, they are required for OpenGL ES 3.0 compatibility, so if your desktop OpenGL driver reports GL_ARB_ES3_compatibility, then it must also accept the ETC2 formats. Because many developers want to develop GLES 3.0 applications on their desktops to avoid constant deployment and to make debugging easier, it is desirable for the driver to report this extension.
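You can verify what your driver reports at run time; here is a minimal sketch, assuming a GL 3.0+ context with function pointers already loaded (e.g. via GLEW or GLAD):

    #include <string.h>

    /* Returns 1 if the current context advertises the named extension. */
    int hasExtension(const char* name)
    {
        GLint count = 0, i;
        glGetIntegerv(GL_NUM_EXTENSIONS, &count);
        for (i = 0; i < count; ++i) {
            const char* ext = (const char*)glGetStringi(GL_EXTENSIONS, (GLuint)i);
            if (ext && strcmp(ext, name) == 0)
                return 1;
        }
        return 0;
    }

    /* hasExtension("GL_ARB_ES3_compatibility") -> ETC2 must be accepted,
     * but possibly only via software decompression. */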
It is likely that your driver is merely emulating support for the ETC2 format by decompressing the data in software to an uncompressed RGB(A) target. This would explain why memory usage is unchanged from uncompressed textures. This isn't necessarily true for every desktop driver, but it is likely true for most. It is still compliant with the spec: although one might assume otherwise, there is no requirement that a compressed texture consume only the imageSize passed into glCompressedTexImage2D().
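One way to inspect what the driver stored is to query the level after upload. Note that an emulating driver may still report the compressed token to stay conformant, so the memory counter you are already watching remains the most reliable signal; this is just an extra data point:

    /* With the texture bound, after glCompressedTexImage2D: */
    GLint isCompressed = GL_FALSE, storedFormat = 0, storedSize = 0;
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_COMPRESSED, &isCompressed);
    glGetTexLevelParameteriv(GL_TEXTURE_2D, 0, GL_TEXTURE_INTERNAL_FORMAT, &storedFormat);
    if (isCompressed == GL_TRUE) {
        /* Only legal to query for levels actually stored compressed. */
        glGetTexLevelParameteriv(GL_TEXTURE_2D, 0,
                                 GL_TEXTURE_COMPRESSED_IMAGE_SIZE, &storedSize);
    }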
If you want to achieve the same level of memory savings on your desktop, you should compress your textures to a commonly supported desktop format, such as one of the S3TC formats exposed by the GL_EXT_texture_compression_s3tc extension, which should be available on virtually all desktop GPU drivers.
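Uploading S3TC data looks exactly like your ETC2 path; only the format token and block size change. A minimal sketch, assuming dxt5Data holds the output of your offline compressor:

    /* DXT5 also uses 4x4 blocks, at 16 bytes per block (8 bpp), so
     * imageSize is computed the same way as for RGBA8_ETC2_EAC. */
    GLsizei blocksWide = (width  + 3) / 4;
    GLsizei blocksHigh = (height + 3) / 4;
    GLsizei imageSize  = blocksWide * blocksHigh * 16;

    glBindTexture(GL_TEXTURE_2D, tex);
    glCompressedTexImage2D(GL_TEXTURE_2D, 0,
                           GL_COMPRESSED_RGBA_S3TC_DXT5_EXT,
                           width, height, 0,
                           imageSize, dxt5Data);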