I was surprised when I found this out. At first I assumed something was wrong with my algorithm, but after closer inspection I found that the more often you change the draw color, the bigger the impact on performance. Why is that?
Here's the full code:
#include <iostream>
#include <cstdlib>   // for rand()
#include <SDL2/SDL.h>

const int WIDTH = 1024;
const int HEIGHT = 768;

int main(int argc, char *argv[])
{
    SDL_Window *window;
    SDL_Renderer *renderer;
    SDL_Texture *texture;
    SDL_Event event;

    if (SDL_Init(SDL_INIT_VIDEO) < 0)
    {
        SDL_LogError(SDL_LOG_CATEGORY_APPLICATION, "Couldn't initialize SDL: %s", SDL_GetError());
        return 3;
    }

    window = SDL_CreateWindow("SDL_CreateTexture",
                              SDL_WINDOWPOS_UNDEFINED,
                              SDL_WINDOWPOS_UNDEFINED,
                              1024, 768,
                              SDL_WINDOW_RESIZABLE);
    renderer = SDL_CreateRenderer(window, -1, SDL_RENDERER_ACCELERATED);
    texture = SDL_CreateTexture(renderer, SDL_PIXELFORMAT_RGBA8888, SDL_TEXTUREACCESS_TARGET, WIDTH, HEIGHT);

    bool alive = true;
    while (alive)
    {
        while (SDL_PollEvent(&event) > 0)
        {
            switch (event.type)
            {
            case SDL_QUIT:
                alive = false;
                break;
            }
        }

        const Uint64 start = SDL_GetPerformanceCounter();

        // Draw 10000 randomly colored points into the render-target texture.
        SDL_SetRenderTarget(renderer, texture);
        SDL_SetRenderDrawColor(renderer, 0x00, 0x00, 0x00, 0x00);
        SDL_RenderClear(renderer);
        for (int i = 0; i < 10000; ++i)
        {
            SDL_SetRenderDrawColor(renderer, rand() % 255, rand() % 255, rand() % 255, 255);
            SDL_RenderDrawPoint(renderer, rand() % WIDTH, rand() % HEIGHT);
        }

        // Copy the texture to the window and measure the frame time.
        SDL_SetRenderTarget(renderer, NULL);
        SDL_RenderCopy(renderer, texture, NULL, NULL);
        SDL_RenderPresent(renderer);

        const Uint64 end = SDL_GetPerformanceCounter();
        static const Uint64 freq = SDL_GetPerformanceFrequency();
        const double seconds = (end - start) / static_cast<double>(freq);
        std::cout << "Frame time: " << seconds * 1000.0 << "ms" << std::endl;
    }

    SDL_DestroyRenderer(renderer);
    SDL_DestroyWindow(window);
    SDL_Quit();
    return 0;
}
The problem is this block of code:
for (int i = 0; i < 10000; ++i)
{
    SDL_SetRenderDrawColor(renderer, rand() % 255, rand() % 255, rand() % 255, 255);
    SDL_RenderDrawPoint(renderer, rand() % WIDTH, rand() % HEIGHT);
}
And here's the version I compared it against, with the color set only once, outside the loop:
SDL_SetRenderDrawColor(renderer, 255, 255, 255, 255);
for (int i = 0; i < 10000; ++i)
{
    SDL_RenderDrawPoint(renderer, rand() % WIDTH, rand() % HEIGHT);
}
Changing the color on every point has a huge performance impact: the first version is over 100 times slower. What am I doing wrong? Or is this how it's supposed to work?
I asked someone who knows SDL well (Gorbit99), and he told me the problem lies in combining a render-target texture with SDL_SetRenderDrawColor, which works on the GPU; per-pixel manipulation is faster on the CPU, so instead of drawing into an SDL_Texture you write the pixels into an SDL_Surface. Here's my final code (frame time ~2ms):
SDL_Surface *surface = SDL_CreateRGBSurfaceWithFormat(0, WIDTH, HEIGHT, 32, SDL_PIXELFORMAT_ABGR32);

// Write the pixels directly into the surface's memory on the CPU.
uint32_t *colors = (uint32_t *)surface->pixels;
for (unsigned int i = 0; i < 1000; i++)
{
    int x = rand() % WIDTH;
    int y = rand() % HEIGHT;
    int offset = (WIDTH * y) + x;
    colors[offset] = 0x00ff00ff; // 0xrrggbbaa
}

// Upload the finished surface to a texture and draw it.
SDL_Texture *texture = SDL_CreateTextureFromSurface(renderer, surface);
SDL_RenderCopy(renderer, texture, NULL, NULL);
SDL_RenderPresent(renderer);
SDL_DestroyTexture(texture);
SDL_FreeSurface(surface);
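For context, here is a minimal sketch of how that surface-based fragment might slot into the original main loop. This is not my exact final program: it assumes the same window/renderer setup as the code at the top, reuses one surface instead of re-creating it each frame, and uses SDL_MapRGBA plus pitch-based addressing (my additions) to reproduce the random colors from the original version.

#include <iostream>
#include <cstdlib>
#include <SDL2/SDL.h>

const int WIDTH = 1024;
const int HEIGHT = 768;

int main(int, char **)
{
    if (SDL_Init(SDL_INIT_VIDEO) < 0)
        return 3;

    SDL_Window *window = SDL_CreateWindow("SDL_Surface points",
                                          SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED,
                                          WIDTH, HEIGHT, SDL_WINDOW_RESIZABLE);
    SDL_Renderer *renderer = SDL_CreateRenderer(window, -1, SDL_RENDERER_ACCELERATED);

    // CPU-side pixel buffer, created once and reused every frame.
    SDL_Surface *surface = SDL_CreateRGBSurfaceWithFormat(0, WIDTH, HEIGHT, 32, SDL_PIXELFORMAT_ABGR32);

    bool alive = true;
    SDL_Event event;
    while (alive)
    {
        while (SDL_PollEvent(&event) > 0)
            if (event.type == SDL_QUIT)
                alive = false;

        const Uint64 start = SDL_GetPerformanceCounter();

        // Plot the points on the CPU: one 32-bit write per point.
        SDL_FillRect(surface, NULL, 0); // clear to black
        for (int i = 0; i < 10000; ++i)
        {
            int x = rand() % WIDTH;
            int y = rand() % HEIGHT;
            // Use the surface pitch in case rows are padded.
            uint32_t *row = (uint32_t *)((uint8_t *)surface->pixels + y * surface->pitch);
            row[x] = SDL_MapRGBA(surface->format,
                                 rand() % 255, rand() % 255, rand() % 255, 255);
        }

        // Upload the whole surface once per frame and draw it.
        SDL_Texture *texture = SDL_CreateTextureFromSurface(renderer, surface);
        SDL_RenderCopy(renderer, texture, NULL, NULL);
        SDL_RenderPresent(renderer);
        SDL_DestroyTexture(texture);

        const Uint64 end = SDL_GetPerformanceCounter();
        static const Uint64 freq = SDL_GetPerformanceFrequency();
        std::cout << "Frame time: " << (end - start) * 1000.0 / freq << "ms" << std::endl;
    }

    SDL_FreeSurface(surface);
    SDL_DestroyRenderer(renderer);
    SDL_DestroyWindow(window);
    SDL_Quit();
    return 0;
}

The key difference from the original is that the GPU only sees one texture upload and one copy per frame, instead of 10000 draw calls whose color state changes between every point.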