Tags: architecture, game-engine

What is the difference between a GUI system and Rendering System in Game Engine Design?


After reading part 1 of Michael Kissner's blog article on writing a game engine from scratch, a lot of concepts became much clearer to me. But one question I seem to be stuck on is: what is the difference between the system used for rendering and the system used for the GUI? What makes these two systems separate from one another? My thinking may be flawed, but doesn't the GUI also have to be "rendered", or is this done differently than something like a sprite? Thanks for any help!


Solution

  • Actually, your thinking is on the right track!

    The GUI system usually does use the rendering system of the game. However, the two systems still tend to be kept separate. To give an idea, the GUI system handles the data of buttons, layouts, fonts, clicks, and scrolling. It then flattens everything into a bunch of vertices, triangles, and textures and hands them to the rendering system (see the first sketch after this answer).

    This is how the HUD of Unreal Engine 4 works, or our dear imgui, for example.

    You can also have additional GUI systems for the game editor or game tools, but these will usually be external and contain the game engine rather than the other way around.

    In this case, the rendering system outputs the entire frame into a texture that you may show anywhere you like. You will see it done this way with libraries like Qt, as extensions for software like Maya, or in proprietary editors like the Unreal Editor or the Unity Editor (see the second sketch below). This type of GUI will never end up on the player's side, except perhaps as a modding tool.
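
To make the split concrete, here is a minimal sketch of the first case: a GUI built on top of the in-game renderer. All type and function names here (GuiSystem, DrawCommand, Renderer::submit, and so on) are hypothetical and not taken from any particular engine; the point is only that the GUI system owns the widget state and flattens it into plain triangles, while the rendering system just draws whatever geometry it is given.

```cpp
#include <cstdint>
#include <string>
#include <utility>
#include <vector>

// A GUI vertex: 2D position, texture coordinates, packed RGBA color.
struct Vertex { float x, y, u, v; uint32_t color; };

// One batch of triangles sharing a texture (font atlas, icon sheet, ...).
struct DrawCommand {
    uint32_t textureId;
    std::vector<Vertex> vertices; // triangle list, 3 vertices per triangle
};

// The GUI system: knows about buttons, layout, fonts, clicks, scrolling...
class GuiSystem {
public:
    void addButton(float x, float y, float w, float h, const std::string& label) {
        buttons_.push_back({x, y, w, h, label});
    }

    // Input handling (clicks, scrolling) would mutate widget state here.

    // Flatten all widgets into vertices, triangles, and texture references.
    std::vector<DrawCommand> buildDrawList() const {
        std::vector<DrawCommand> commands;
        for (const auto& b : buttons_) {
            DrawCommand cmd;
            cmd.textureId = 0; // e.g. a plain UI atlas texture
            // Two triangles forming the button's background quad.
            cmd.vertices = {
                {b.x,       b.y,       0, 0, 0xFF404040},
                {b.x + b.w, b.y,       1, 0, 0xFF404040},
                {b.x + b.w, b.y + b.h, 1, 1, 0xFF404040},
                {b.x,       b.y,       0, 0, 0xFF404040},
                {b.x + b.w, b.y + b.h, 1, 1, 0xFF404040},
                {b.x,       b.y + b.h, 0, 1, 0xFF404040},
            };
            // Text would be appended the same way: one textured quad per
            // glyph, sampled from the font atlas.
            commands.push_back(std::move(cmd));
        }
        return commands;
    }

private:
    struct Button { float x, y, w, h; std::string label; };
    std::vector<Button> buttons_;
};

// The rendering system only sees geometry; it neither knows nor cares
// that these particular triangles happen to be a GUI.
class Renderer {
public:
    void submit(const std::vector<DrawCommand>& commands) {
        // Upload vertices, bind each command's texture, issue draw calls...
        (void)commands;
    }
};

int main() {
    GuiSystem gui;
    Renderer renderer;
    gui.addButton(10, 10, 120, 32, "Start");
    renderer.submit(gui.buildDrawList()); // GUI ends up as ordinary draw calls
    return 0;
}
```

The design point is the direction of the dependency: the GUI system depends on the renderer's draw-list interface, never the other way around.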
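And a sketch of the second, editor-style case, again with hypothetical names (Engine::renderFrameToTexture, EditorUi::showViewport): here the external GUI toolkit owns the main loop and treats the engine's output as just another image to display in a viewport panel.

```cpp
#include <cstdint>

// Handle to a GPU texture; in a real engine this would wrap an OpenGL
// texture id, a Vulkan image, a D3D resource, etc.
using TextureHandle = uint64_t;

class Engine {
public:
    // Render the current game frame into an off-screen render target
    // instead of directly to the window's backbuffer.
    TextureHandle renderFrameToTexture(int width, int height) {
        (void)width; (void)height;
        // ...culling, drawing, post-processing into a framebuffer...
        return frameTexture_;
    }
private:
    TextureHandle frameTexture_ = 1;
};

// Stand-in for an external GUI toolkit (Qt, dear imgui, a Maya plugin, ...).
class EditorUi {
public:
    void showViewport(TextureHandle frame) {
        // The toolkit draws the frame as just another image widget,
        // surrounded by inspectors, asset browsers, toolbars, ...
        (void)frame;
    }
};

int main() {
    Engine engine;   // the editor *contains* the engine,
    EditorUi editor; // not the other way around
    // Editor main loop, simplified to a single iteration here.
    TextureHandle frame = engine.renderFrameToTexture(1280, 720);
    editor.showViewport(frame);
    return 0;
}
```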