
What is a subpass? And how does it relate to swapchains and framebuffers?


So I'm following this Vulkan tutorial, and my understanding is that calling vkAcquireNextImageKHR makes a swapchain image available to the application for rendering; the render pass's framebuffer attachments point at image views, which in turn reference the swapchain's images in memory. But I don't understand what a subpass is. I found this definition on the web:
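To make the relationship concrete, here is a minimal sketch of the acquire-then-begin flow described above. All handle and parameter names are hypothetical, the handles are assumed to have been created elsewhere, and error handling is omitted:

```c
#include <vulkan/vulkan.h>

// Sketch: acquire the next swapchain image, then begin the render pass
// using the framebuffer that wraps that image's view. One framebuffer
// per swapchain image is a common pattern.
void beginFrame(VkDevice device, VkSwapchainKHR swapchain,
                VkSemaphore imageAvailable, VkRenderPass renderPass,
                const VkFramebuffer *framebuffers, VkExtent2D extent,
                VkCommandBuffer cmd) {
    uint32_t imageIndex = 0;
    vkAcquireNextImageKHR(device, swapchain, UINT64_MAX,
                          imageAvailable, VK_NULL_HANDLE, &imageIndex);

    VkRenderPassBeginInfo begin = {
        .sType = VK_STRUCTURE_TYPE_RENDER_PASS_BEGIN_INFO,
        .renderPass = renderPass,
        // The framebuffer matching the image we just acquired.
        .framebuffer = framebuffers[imageIndex],
        .renderArea = {{0, 0}, extent},
    };
    vkCmdBeginRenderPass(cmd, &begin, VK_SUBPASS_CONTENTS_INLINE);
}
```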

"Vulkan introduces the concept of sub-passes to subdivide a single render pass into separate logical phases. The benefit of using sub-passes over multiple render passes is that a GPU is able to perform various optimizations."

What phases? And why can it optimize more easily?


Solution

  • Subpasses are individual parts of a renderpass that share a common set of attachments. But it's important to remember that a subpass can only read attachment data that was written, within the same renderpass, at the same exact pixel. It's easier to understand with an example.

    Say you have a deferred rendering setup that draws all your 3D objects to a number of different color attachments (say, one containing color, one containing normals, one containing additional material or texture information, etc.). As long as you're only writing to those attachments, you can switch pipelines and draw calls freely, so you might do a solid-geometry pass and then a transparency pass within the same subpass.

    But then, to actually produce something suitable to show the user, you need to take those attachments and combine them in a lighting pass, and for that you'd want to take the color attachment outputs of the previous subpass and use them as inputs to a new subpass. Because the input and output attachments have changed, you need a new subpass; but because each pixel's output only depends on the same pixel in the inputs, it can stay inside the same renderpass.
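The two-subpass layout described above can be sketched in Vulkan's renderpass-creation structures. This is an assumption-laden sketch, not a complete renderpass: attachment descriptions, layouts, and creation calls are omitted, and the attachment indices (0 = color, 1 = normals, 2 = material, 3 = final output) are hypothetical:

```c
#include <vulkan/vulkan.h>

// Subpass 0 writes the G-buffer; subpass 1 reads it as input attachments
// and writes the final color.
static const VkAttachmentReference gbufferWrites[] = {
    {0, VK_IMAGE_LAYOUT_COLOR_ATTACHMENT_OPTIMAL},
    {1, VK_IMAGE_LAYOUT_COLOR_ATTACHMENT_OPTIMAL},
    {2, VK_IMAGE_LAYOUT_COLOR_ATTACHMENT_OPTIMAL},
};
static const VkAttachmentReference gbufferReads[] = {
    {0, VK_IMAGE_LAYOUT_SHADER_READ_ONLY_OPTIMAL},
    {1, VK_IMAGE_LAYOUT_SHADER_READ_ONLY_OPTIMAL},
    {2, VK_IMAGE_LAYOUT_SHADER_READ_ONLY_OPTIMAL},
};
static const VkAttachmentReference finalColor =
    {3, VK_IMAGE_LAYOUT_COLOR_ATTACHMENT_OPTIMAL};

static const VkSubpassDescription subpasses[2] = {
    { // Subpass 0: geometry, writes the G-buffer.
      .pipelineBindPoint = VK_PIPELINE_BIND_POINT_GRAPHICS,
      .colorAttachmentCount = 3,
      .pColorAttachments = gbufferWrites },
    { // Subpass 1: lighting, reads the G-buffer per pixel.
      .pipelineBindPoint = VK_PIPELINE_BIND_POINT_GRAPHICS,
      .inputAttachmentCount = 3,
      .pInputAttachments = gbufferReads,
      .colorAttachmentCount = 1,
      .pColorAttachments = &finalColor },
};

// The dependency says subpass 1's fragment-shader reads must wait for
// subpass 0's color writes. BY_REGION expresses the per-pixel rule: only
// the same region of the attachment needs to be finished before reading.
static const VkSubpassDependency dependency = {
    .srcSubpass = 0, .dstSubpass = 1,
    .srcStageMask = VK_PIPELINE_STAGE_COLOR_ATTACHMENT_OUTPUT_BIT,
    .dstStageMask = VK_PIPELINE_STAGE_FRAGMENT_SHADER_BIT,
    .srcAccessMask = VK_ACCESS_COLOR_ATTACHMENT_WRITE_BIT,
    .dstAccessMask = VK_ACCESS_INPUT_ATTACHMENT_READ_BIT,
    .dependencyFlags = VK_DEPENDENCY_BY_REGION_BIT,
};
```

These arrays would be wired into a VkRenderPassCreateInfo (subpassCount = 2, dependencyCount = 1) and passed to vkCreateRenderPass.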

    However, if you wanted to do something like a bloom or blur effect, you would need to put it into an entirely new renderpass, because such effects require reading from arbitrary locations in the input attachments. The specific reasons for these per-pixel restrictions have to do with how some GPU architectures work (notably tile-based renderers), and aren't really critical to understand as long as you follow the rules.

    Framebuffers are how attachments are bundled together for a renderpass: a framebuffer must have every attachment you'll use across all of your subpasses connected to it, and the subpasses then reference those attachments by index into the framebuffer's attachment list.
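A framebuffer for the deferred setup above might be created like this. The view names and parameters are hypothetical; the image views are assumed to have been created already, matching the attachment order the renderpass expects:

```c
#include <vulkan/vulkan.h>

// Sketch: bundle all four attachments used across both subpasses into
// one framebuffer. attachments[i] is what the subpasses reference as
// attachment index i.
VkFramebuffer makeFramebuffer(VkDevice device, VkRenderPass renderPass,
                              VkImageView albedoView, VkImageView normalView,
                              VkImageView materialView, VkImageView finalView,
                              uint32_t width, uint32_t height) {
    VkImageView attachments[4] = {
        albedoView, normalView, materialView, finalView
    };
    VkFramebufferCreateInfo info = {
        .sType = VK_STRUCTURE_TYPE_FRAMEBUFFER_CREATE_INFO,
        .renderPass = renderPass,   // must be compatible with this renderpass
        .attachmentCount = 4,
        .pAttachments = attachments,
        .width = width,
        .height = height,
        .layers = 1,
    };
    VkFramebuffer fb = VK_NULL_HANDLE;
    vkCreateFramebuffer(device, &info, NULL, &fb);
    return fb;
}
```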

    Swapchains are special containers set up for presentation to a surface; they contain a number of images, which in turn can be put into framebuffers as attachments (via image views).
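That last step, getting from swapchain to framebuffer attachment, can be sketched as follows. Function and parameter names are hypothetical; swapchainFormat is assumed to be the format the swapchain was created with, and error handling is omitted:

```c
#include <stdlib.h>
#include <vulkan/vulkan.h>

// Sketch: fetch the swapchain's images and wrap each one in a
// VkImageView so it can be attached to a framebuffer.
void getSwapchainViews(VkDevice device, VkSwapchainKHR swapchain,
                       VkFormat swapchainFormat,
                       VkImageView **outViews, uint32_t *outCount) {
    // Standard two-call pattern: query the count, then fetch the handles.
    uint32_t count = 0;
    vkGetSwapchainImagesKHR(device, swapchain, &count, NULL);
    VkImage *images = malloc(count * sizeof *images);
    vkGetSwapchainImagesKHR(device, swapchain, &count, images);

    VkImageView *views = malloc(count * sizeof *views);
    for (uint32_t i = 0; i < count; ++i) {
        VkImageViewCreateInfo info = {
            .sType = VK_STRUCTURE_TYPE_IMAGE_VIEW_CREATE_INFO,
            .image = images[i],
            .viewType = VK_IMAGE_VIEW_TYPE_2D,
            .format = swapchainFormat,
            // One color aspect, one mip level, one array layer.
            .subresourceRange = {VK_IMAGE_ASPECT_COLOR_BIT, 0, 1, 0, 1},
        };
        vkCreateImageView(device, &info, NULL, &views[i]);
    }
    free(images);
    *outViews = views;
    *outCount = count;
}
```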

    Sascha Willems has a more in-depth explanation here