c++, shader, hlsl, direct3d, direct3d11

Mesh getting cut off


I'm using DirectX 11. I'm trying to draw a cube mesh to the screen, but the bottom half is getting cut off. If I move the camera up or down, the bottom half is still cut off, which leads me to think it's not a viewport/rasterizer issue, but I'm not sure. The pictures show the cube while looking down and then while looking up; you can see the cube is cut off regardless of the camera position. I think it might be an issue with my projection matrix.

[Screenshot: the cube viewed looking down] [Screenshot: the cube viewed looking up]

I've attached the RenderDoc capture here, and you can see that the VS input is correct, but when viewing the VS output with solid shading, the same thing happens. https://drive.google.com/file/d/1sh7tj0hPYwD936BEQCL0wtH8ZzXMiEno/view?usp=sharing

This is how I'm calculating my matrices:

mat4 LookAtMatrix(float3 Position, float3 Target, float3 Up) {
    float3 Forward = Normalise(Target - Position);
    float3 Right = Cross(Normalise(Up), Forward);
    float3 UpV = Cross(Forward, Right);
    
    mat4 Out;
    Out.v[0] = float4(Right, 0);
    Out.v[1] = float4(UpV, 0);
    Out.v[2] = float4(Forward, 0);
    Out.v[3] = float4(Position, 1);
    return Out;
}

mat4 ProjectionMatrix(f32 FOV, f32 Aspect, f32 Near, f32 Far) {
    mat4 Out;
    f32 YScale = 1.0f / tan((FOV * Deg2Rad) / 2.0f);
    f32 XScale = YScale / Aspect;
    f32 NmF = Near - Far;
    
    Out.v[0] = float4(XScale, 0, 0, 0);
    Out.v[1] = float4(0, YScale, 0, 0);
    Out.v[2] = float4(0, 0, (Far + Near) / NmF, -1.0f);
    Out.v[3] = float4(0, 0, 2 * Far * Near / NmF, 0);
    
    return Out;
}

And this is how I'm calling these functions (the issue happens regardless of whether I use rotation or not):

    D3D11_MAPPED_SUBRESOURCE Resource;
    HRESULT Result = DeviceContext->Map(ConstantBuffer, 0, D3D11_MAP_WRITE_DISCARD, 0, &Resource);
    if(FAILED(Result)) FatalError("DeviceContext->Map failed");
    matrix_buffer *Buffer = (matrix_buffer *)Resource.pData;
    
    static float yR = 0.0f;
    yR += 50.0f * DeltaTime;
    while(yR > 360.0f) yR -= 360.0f;
    while(yR < 0.0f) yR += 360.0f;
    quat R = QuatFromAngles(0.0f, yR, 0.0f);
    
    const float Speed = 100.0f;
    static float3 Position = float3(0, 0, -300);
    if(WDown) Position.z += Speed * DeltaTime;
    if(ADown) Position.x += Speed * DeltaTime;
    if(SDown) Position.z -= Speed * DeltaTime;
    if(DDown) Position.x -= Speed * DeltaTime;
    if(QDown) Position.y -= Speed * DeltaTime;
    if(EDown) Position.y += Speed * DeltaTime;
    
    Buffer->WorldMatrix = RotationMatrix(R, float3(0, 0, 0));
    Buffer->ViewMatrix = LookAtMatrix(Position, Position+float3(0, 0, 1), float3(0, 1, 0));
    Buffer->ProjectionMatrix = ProjectionMatrix(45.0f, 1366/768, 0.1f, 1000.0f);
    
    DeviceContext->Unmap(ConstantBuffer, 0);

And this is my vertex shader code:

struct vertex_data {
    float3 Position : POSITION;
    float2 UV : TEXCOORD;
    float4 Colour : COLOR;
    float3 Normal : NORMAL;
};

struct pixel_data {
    float4 Position : SV_POSITION;
    float2 UV : TEXCOORD;
    float4 Colour : COLOR;
    float3 Normal : NORMAL;
};

cbuffer MatrixBuffer {
    float4x4 WorldMatrix;
    float4x4 ViewMatrix;
    float4x4 ProjectionMatrix;
};

pixel_data VertexMain(vertex_data Input) {
    pixel_data Output;

    float4 V = float4(Input.Position, 1);
    Output.Position = mul(V, transpose(WorldMatrix));
    Output.Position = mul(Output.Position, transpose(ViewMatrix));
    Output.Position = mul(Output.Position, transpose(ProjectionMatrix));

    Output.UV = Input.UV;
    Output.Colour = Input.Colour;
    Output.Normal = Input.Normal;

    return Output;
}

Here is my code for setting up the viewport (Width/Height are 1366/768 - the size of the window):

    D3D11_VIEWPORT Viewport;
    Viewport.Width = (float)Width;
    Viewport.Height = (float)Height;
    Viewport.MinDepth = 0.0f;
    Viewport.MaxDepth = 1.0f;
    Viewport.TopLeftX = 0.0f;
    Viewport.TopLeftY = 0.0f;
    
    DeviceContext->RSSetViewports(1, &Viewport);

Solution

  • I've seen similar issues caused by:

    1. Transposed matrices (are you using row-major or column-major matrices? Do you need a #pragma pack_matrix? It looks like you've fiddled with transposing quite a bit - avoid doing that, as you will make mistakes that are difficult to reason about; see the sketch after this list.)

    2. Otherwise messed up matrix multiplication order. If you bob the camera up/down/left/right or arcball it around & rotate the model, does it actually work? Make sure you incorporate camera rotations with camera translations and object rotations / translations, otherwise you might incorrectly think your code works. What if you zoom near or far?
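
    For example, one way to take transposes out of the picture entirely is to pick a single convention: transpose once on the CPU when filling the constant buffer, and drop every transpose() call in the shader. A minimal sketch of that, assuming a Transpose() helper for your mat4 type (use whatever your math library actually provides):

        // Transpose once, CPU-side, so the shader can just do mul(V, WorldMatrix)
        // with no transpose() calls at all.
        Buffer->WorldMatrix      = Transpose(RotationMatrix(R, float3(0, 0, 0)));
        Buffer->ViewMatrix       = Transpose(LookAtMatrix(Position, Position + float3(0, 0, 1), float3(0, 1, 0)));
        Buffer->ProjectionMatrix = Transpose(ProjectionMatrix(45.0f, 1366.0f / 768.0f, 0.1f, 1000.0f));

    The equivalent shader-side option is to declare the cbuffer matrices row_major (or use #pragma pack_matrix(row_major)) and likewise remove the transpose() calls - either way, the convention lives in exactly one place.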

    When debugging these issues, I recommend that you first try running your shader transformations in CPU code:

    1. Take a simple model-space coordinate (e.g. 0,0,0).
    2. Pass it through your world matrix, and check if it looks right.
    3. Pass it through your view matrix, verify it.
    4. Then your proj matrix.

    Even that simple test can be quite revealing. Basically, if you think your vertex shader is wrong, that's fortunately usually the easiest shader to validate in software! If this passes, try a few other vertices, like the corners of your box. If that succeeds in software, then you know it has to do with how you're passing vertex data to the GPU (e.g. row-major vs column-major). If not, then you've built a simple CPU-side repro, great.
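
    For example, here's roughly what that CPU-side check could look like; MulPoint() is just a stand-in for whatever vector-by-matrix multiply your math library has, and the matrices are the same ones you write into the constant buffer:

        // Push one model-space point through the same chain the vertex shader uses
        // and inspect each stage in the debugger.
        float4 P  = float4(0, 0, 0, 1);              // model-space test point
        float4 Pw = MulPoint(P,  WorldMatrix);       // world space
        float4 Pv = MulPoint(Pw, ViewMatrix);        // view space
        float4 Pc = MulPoint(Pv, ProjectionMatrix);  // clip space

        // After the perspective divide, a visible point should land in -1..1
        // for x and y, and 0..1 for z in D3D11.
        float3 Ndc = float3(Pc.x / Pc.w, Pc.y / Pc.w, Pc.z / Pc.w);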

    (Also, I'm not sure what your pixel shader does, but to rule it out and isolate the vertex shader, consider making the pixel shader just return solid white.)