So I'm using SDL_image to load a heightmap and create terrain in my OpenGL app. This is how I initialize SDL_image:
int flags = IMG_INIT_PNG;
int initted = IMG_Init(flags);
if((initted & flags) != flags) {
    printf("IMG_Init: Failed to init required png support!\n");
    printf("IMG_Init: %s\n", IMG_GetError());
    return;
}
Load(filename);
...And this is my Load function:
void Load(string filename) {
    img = IMG_Load(filename.c_str());
    if(!img) {
        printf("IMG_Load: %s\n", IMG_GetError());
        return;
    }
    xsize = img->w;
    ysize = img->h;
    SDL_LockSurface(img);
    imgData = (Uint32*)img->pixels;  /* keep a raw pointer to the pixel data */
    SDL_UnlockSurface(img);
}
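Side note: to avoid depending on whatever pixel format the PNG happens to decode to, the surface could be converted to a fixed format right after loading. A minimal sketch using SDL2's SDL_ConvertSurfaceFormat:
    /* Convert to a known 4-bytes-per-pixel format so pixel reads are uniform. */
    SDL_Surface* converted = SDL_ConvertSurfaceFormat(img, SDL_PIXELFORMAT_RGBA8888, 0);
    if(converted) {
        SDL_FreeSurface(img);  /* the original surface is no longer needed */
        img = converted;       /* img->format->BytesPerPixel is now always 4 */
    }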
Then, in my Terrain class, where I'm preparing my vertex buffer, I read pixel values with this method:
Uint32 getPixel(int x, int y) {
    SDL_LockSurface(img);
    int bpp = img->format->BytesPerPixel;
    /* Here p is the address of the pixel we want to retrieve */
    Uint8 *p = (Uint8 *)img->pixels + y * img->pitch + x * bpp;
    Uint32 pixel = 0; /* default: shouldn't happen, but avoids warnings */
    switch(bpp) {
    case 1:
        pixel = *p;
        break;
    case 2:
        pixel = *(Uint16 *)p;
        break;
    case 3:
        if(SDL_BYTEORDER == SDL_BIG_ENDIAN)
            pixel = p[0] << 16 | p[1] << 8 | p[2];
        else
            pixel = p[0] | p[1] << 8 | p[2] << 16;
        break;
    case 4:
        pixel = *(Uint32 *)p;
        break;
    }
    /* Unlock only after the read: the pixel pointer isn't guaranteed
       to stay valid once the surface is unlocked. */
    SDL_UnlockSurface(img);
    return pixel;
}
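For context, each vertex height then comes from that packed value; decoding it with SDL_GetRGB and taking the red channel would look roughly like this (a sketch; heightScale is a made-up constant):
    /* Decode the packed pixel using the surface's own format description,
       then map the red channel (0..255) onto a vertex height. */
    Uint8 r, g, b;
    SDL_GetRGB(getPixel(x, y), img->format, &r, &g, &b);
    float height = (r / 255.0f) * heightScale;  /* heightScale: illustrative constant */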
It turns out that every time I run the program, img->format->BytesPerPixel
returns a random value... What the heck? Does anyone have any idea? It should only ever return 1, 2, 3, or 4.
OK, I was just being stupid... But in case anyone ever has a problem like mine: I was including the wrong version of SDL_image: #include <SDL/SDL_image.h>
instead of #include <SDL2/SDL_image.h>. Everything now works as expected :)
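For anyone else hitting this: the SDL2 build of the library ships its header under SDL2/ and links as SDL2_image, so the include and the linker flag have to match. On a typical Linux/GCC setup that would be something like this (terrain.cpp stands in for your own source file):
    #include <SDL2/SDL.h>
    #include <SDL2/SDL_image.h>
    /* build: g++ terrain.cpp -lSDL2 -lSDL2_image */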