The topic involves OpenGL ES 2.0.
I have a device that, when queried for extensions via glGetString(GL_EXTENSIONS), returns a list of supported extensions, none of which is GL_EXT_texture_compression_s3tc.
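For reference, this is roughly how such an extension check can be done (a minimal sketch; `has_extension` is a hypothetical helper). Note that a naive strstr() is not enough, because it would also match a prefix of a longer extension name:

```c
#include <string.h>

/* Hypothetical helper: returns 1 if `name` appears as a complete,
 * space-delimited token in the extension string `list`. A plain
 * strstr() would also match e.g. "GL_EXT_texture_compression_s3tc"
 * inside "GL_EXT_texture_compression_s3tc_srgb". */
static int has_extension(const char *list, const char *name)
{
    size_t len = strlen(name);
    const char *p = list;
    while ((p = strstr(p, name)) != NULL) {
        int starts_token = (p == list) || (p[-1] == ' ');
        int ends_token = (p[len] == '\0') || (p[len] == ' ');
        if (starts_token && ends_token)
            return 1;
        p += len;
    }
    return 0;
}

/* On a real device the list would come from the driver:
 *   const char *exts = (const char *)glGetString(GL_EXTENSIONS);
 *   int s3tc = has_extension(exts, "GL_EXT_texture_compression_s3tc");
 */
```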
AFAIK, without GL_EXT_texture_compression_s3tc it shouldn't be possible to use DXT-compressed textures.
However, when DXT-compressed textures are used on the device, they render without any problems. Texture data is committed using glCompressedTexImage2D. I tried DXT1, DXT3 and DXT5.
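For context, the upload looks roughly like this (a sketch; the size formula follows from the S3TC block layout of 4×4 texel blocks, 8 bytes per block for DXT1 and 16 for DXT3/DXT5; the format constant and the `data` pointer are assumptions):

```c
/* Size in bytes of one DXT mip level: the image is stored as
 * 4x4 texel blocks, block_size = 8 for DXT1, 16 for DXT3/DXT5. */
static unsigned dxt_level_size(unsigned width, unsigned height,
                               unsigned block_size)
{
    unsigned blocks_x = (width  + 3) / 4;  /* round up to whole blocks */
    unsigned blocks_y = (height + 3) / 4;
    return blocks_x * blocks_y * block_size;
}

/* On the device, the upload itself would be (constant from the
 * EXT_texture_compression_s3tc extension header):
 *   glCompressedTexImage2D(GL_TEXTURE_2D, level,
 *                          GL_COMPRESSED_RGBA_S3TC_DXT5_EXT,
 *                          width, height, 0,
 *                          dxt_level_size(width, height, 16), data);
 */
```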
Why does it work? Is it safe to use a compressed texture format even though it appears to be unsupported?
I think that missing support for GL_EXT_texture_compression_s3tc does not mean you can't use compressed formats. They may be supported anyway.
Quote from the glCompressedTexImage2D doc page for ES2:

The texture image is decoded according to the extension specification defining the specified internalformat. OpenGL ES (...) provide a mechanism to obtain symbolic constants for such formats provided by extensions. The number of compressed texture formats supported can be obtained by querying the value of GL_NUM_COMPRESSED_TEXTURE_FORMATS. The list of specific compressed texture formats supported can be obtained by querying the value of GL_COMPRESSED_TEXTURE_FORMATS.
Note that there is nothing in there about GL_EXT_texture_compression_s3tc. A capability may be implemented even though the extension that 'standardizes' it is not listed as supported.
You should probably query those constants (GL_NUM_COMPRESSED_TEXTURE_FORMATS and GL_COMPRESSED_TEXTURE_FORMATS) using glGetIntegerv() to learn which compressed formats are actually supported.
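A sketch of that query (the two glGetIntegerv calls are shown as comments because they need a live GL context; the DXT constants are the standard values from the EXT_texture_compression_s3tc extension header, and `contains_format` is a hypothetical helper):

```c
/* S3TC internal-format constants (values from the
 * EXT_texture_compression_s3tc extension header). */
#define GL_COMPRESSED_RGB_S3TC_DXT1_EXT  0x83F0
#define GL_COMPRESSED_RGBA_S3TC_DXT1_EXT 0x83F1
#define GL_COMPRESSED_RGBA_S3TC_DXT3_EXT 0x83F2
#define GL_COMPRESSED_RGBA_S3TC_DXT5_EXT 0x83F3

typedef int GLint;

/* Hypothetical helper: scan the queried format list for one format. */
static int contains_format(const GLint *formats, int count, GLint fmt)
{
    for (int i = 0; i < count; ++i)
        if (formats[i] == fmt)
            return 1;
    return 0;
}

/* On the device, the list would be filled like this:
 *   GLint count = 0;
 *   glGetIntegerv(GL_NUM_COMPRESSED_TEXTURE_FORMATS, &count);
 *   GLint *formats = malloc(count * sizeof *formats);
 *   glGetIntegerv(GL_COMPRESSED_TEXTURE_FORMATS, formats);
 *   int dxt5_ok = contains_format(formats, count,
 *                                 GL_COMPRESSED_RGBA_S3TC_DXT5_EXT);
 */
```

If the DXT formats show up in that list, the driver supports them regardless of what the extension string says.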