c++, directx, directxtk

Input Assembler - Vertex Shader Linkage error


I'm trying to render text in my DX11 project, by using SpriteFont and SpriteBatch.

Everything worked before adding this code, but when the code is uncommented, I get the following errors and nothing renders.

D3D11 ERROR: ID3D11DeviceContext::DrawIndexed: Input Assembler - Vertex Shader linkage error: Signatures between stages are incompatible. Semantic 'TEXCOORD' is defined for mismatched hardware registers between the output stage and input stage. [ EXECUTION ERROR #343: DEVICE_SHADER_LINKAGE_REGISTERINDEX]
D3D11 ERROR: ID3D11DeviceContext::DrawIndexed: Input Assembler - Vertex Shader linkage error: Signatures between stages are incompatible. The input stage requires Semantic/Index (POSITION,0) as input, but it is not provided by the output stage. [ EXECUTION ERROR #342: DEVICE_SHADER_LINKAGE_SEMANTICNAME_NOT_FOUND]
D3D11 ERROR: ID3D11DeviceContext::DrawIndexed: Input Assembler - Vertex Shader linkage error: Signatures between stages are incompatible. The input stage requires Semantic/Index (NORMAL,0) as input, but it is not provided by the output stage. [ EXECUTION ERROR #342: DEVICE_SHADER_LINKAGE_SEMANTICNAME_NOT_FOUND]

I have tried using a different Device and DeviceContext, but I still get the same error. This is how the SpriteBatch and SpriteFont are created:

    //Draw Text
    g_spriteBatch = std::make_unique<SpriteBatch>(g_pImmediateContext);
    g_spriteFont = std::make_unique<SpriteFont>(g_pd3dDevice1, L"Data\\game_font.spritefont");

    //Render Text
    g_spriteBatch->Begin();
    //g_spriteFont->DrawString(g_spriteBatch.get(), L"Score: 0", XMFLOAT2(0,0), DirectX::Colors::White);
    g_spriteBatch->End();



I have been trying many different ways to get this working. This is what my render function looks like at the moment:

Updated: 18/02/2019

    void Render() {

        float backgroundColour[] = { 0.0f, 0.1f, 0.5f, 0.0f };
        g_pImmediateContext->ClearRenderTargetView(g_pRenderTargetView, backgroundColour);
        g_pImmediateContext->ClearDepthStencilView(g_depthStencilView, D3D11_CLEAR_DEPTH | D3D11_CLEAR_STENCIL, 1.0f, 0);

        g_pImmediateContext->IASetInputLayout(g_pVertexLayout);
        g_pImmediateContext->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY::D3D10_PRIMITIVE_TOPOLOGY_TRIANGLELIST);
        g_pImmediateContext->RSSetState(g_rasterState);
        g_pImmediateContext->OMSetDepthStencilState(g_depthStencilState, 0);
        g_pImmediateContext->VSSetShader(g_pVertexShader, NULL, 0);
        g_pImmediateContext->PSSetShader(g_pPixelShader, NULL, 0);

        UINT stride = sizeof(SimpleVertex);
        UINT offset = 0;

        g_pImmediateContext->IASetVertexBuffers(0, 1, &g_pVertexBuffer, &stride, &offset);
        g_pImmediateContext->Draw(3, 0);
        //g_pImmediateContext->DrawIndexed(36, 0, 0);

        //Draw Text
        spriteBatch->Begin();
        spriteFont->DrawString(spriteBatch.get(), L"Test", DirectX::XMFLOAT2(0, 0), DirectX::Colors::White);
        spriteBatch->End();

        //Present our back buffer to the front buffer
        g_pSwapChain->Present(0, NULL);

    }

When running the project, I get the following error D3D11 ERROR: ID3D11DeviceContext::IASetVertexBuffers: A Buffer trying to be bound to slot 0 did not have the appropriate bind flag set at creation time to allow the Buffer to be bound as a VertexBuffer. [ STATE_SETTING ERROR #238: IASETVERTEXBUFFERS_INVALIDBUFFER]

This is the code that stores the data for a triangle and sets up the index/vertex buffer:

    SimpleVertex vertices[] =
    {           
        SimpleVertex(-0.5f, -0.5f, 1.0f, 1.0f, 0.0f, 0.0f),
        SimpleVertex( 0.0f,  0.5f, 1.0f, 1.0f, 0.0f, 0.0f),
        SimpleVertex( 0.5f, -0.5f, 1.0f, 1.0f, 0.0f, 0.0f),
    };

    D3D11_BUFFER_DESC bd = {};
    ZeroMemory(&bd, sizeof(bd));
    bd.Usage = D3D11_USAGE_DEFAULT;
    bd.ByteWidth = sizeof( SimpleVertex ) * ARRAYSIZE(vertices);
    bd.BindFlags = D3D11_BIND_INDEX_BUFFER;
    bd.CPUAccessFlags = 0;
    bd.MiscFlags = 0;

    D3D11_SUBRESOURCE_DATA InitData;
    InitData.pSysMem = vertices;
    hr = g_pd3dDevice->CreateBuffer( &bd, &InitData, &g_pVertexBuffer );
    if (FAILED(hr))
    {
        printf("Failed ro Create Vertex Buffer.");
        return hr;
    }

I have changed D3D11_BIND_INDEX_BUFFER to D3D11_BIND_VERTEX_BUFFER and the same error occurs.

In addition, the following warnings are produced:

D3D11 WARNING: ID3D11DeviceContext::Draw: Vertex Buffer at the input vertex slot 0 is not big enough for what the Draw*() call expects to traverse. This is OK, as reading off the end of the Buffer is defined to return 0. However the developer probably did not intend to make use of this behavior.  [ EXECUTION WARNING #356: DEVICE_DRAW_VERTEX_BUFFER_TOO_SMALL]

D3D11 WARNING: ID3D11DeviceContext::Draw: The size of the Constant Buffer at slot 0 of the Vertex Shader unit is too small (64 bytes provided, 192 bytes, at least, expected). This is OK, as out-of-bounds reads are defined to return 0. It is also possible the developer knows the missing data will not be used anyway. This is only a problem if the developer actually intended to bind a sufficiently large Constant Buffer for what the shader expects.  [ EXECUTION WARNING #351: DEVICE_DRAW_CONSTANT_BUFFER_TOO_SMALL]

D3D11 WARNING: ID3D11DeviceContext::Draw: Input vertex slot 0 has stride 24 which is less than the minimum stride logically expected from the current Input Layout (32 bytes). This is OK, as hardware is perfectly capable of reading overlapping data. However the developer probably did not intend to make use of this behavior.  [ EXECUTION WARNING #355: DEVICE_DRAW_VERTEX_BUFFER_STRIDE_TOO_SMALL]

Solution

  • TL;DR: You need to set your cube's Input Layout, Vertex Buffer, Vertex Shader, and Pixel Shader each frame before you call Draw.

    Specifically, you forgot to set your Vertex Buffer as you left that code commented out:

        // >>> Set vertex buffer
        //g_pImmediateContext->IASetVertexBuffers(0, 1, &g_pVertexBuffer, &stride, &offset);
    

    The Debug Device error you quoted is what caused you to comment it out in the first place, but the actual bug is in your call to CreateBuffer for the vertex buffer: D3D11_BUFFER_DESC.BindFlags needs to be set to D3D11_BIND_VERTEX_BUFFER.
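
    A minimal sketch of the corrected buffer creation, reusing the names from your own code; the only changes are the bind flag and making sure the buffer is actually bound before the draw:

        D3D11_BUFFER_DESC bd = {};
        bd.Usage = D3D11_USAGE_DEFAULT;
        bd.ByteWidth = sizeof(SimpleVertex) * ARRAYSIZE(vertices);
        bd.BindFlags = D3D11_BIND_VERTEX_BUFFER;   // was D3D11_BIND_INDEX_BUFFER
        bd.CPUAccessFlags = 0;
        bd.MiscFlags = 0;

        D3D11_SUBRESOURCE_DATA initData = {};
        initData.pSysMem = vertices;

        HRESULT hr = g_pd3dDevice->CreateBuffer(&bd, &initData, &g_pVertexBuffer);
        if (FAILED(hr))
            return hr;

        // ...and in Render(), bind it before drawing so the draw actually has data:
        UINT stride = sizeof(SimpleVertex);
        UINT offset = 0;
        g_pImmediateContext->IASetVertexBuffers(0, 1, &g_pVertexBuffer, &stride, &offset);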

    Before each call to Draw* you need to set any state that is required for that draw.

    The exact state that is changed by calling SpriteBatch::Begin/End is documented on the wiki. In particular, it changes both the input layout and the vertex shader. Your "cube drawing" code relies on those states being unchanged from frame to frame, which really only works in trivial cases.

    At the start of each render frame, you should do the following (a code sketch follows the list):

    • Clear the render target and/or depth stencil buffer
    • Call OMSetRenderTargets to set your primary render target
    • Call RSSetViewports to set your viewport state
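
    A minimal sketch of that start-of-frame setup, using the globals from your code plus hypothetical g_windowWidth/g_windowHeight variables for the viewport size:

        float backgroundColour[] = { 0.0f, 0.1f, 0.5f, 0.0f };
        g_pImmediateContext->ClearRenderTargetView(g_pRenderTargetView, backgroundColour);
        g_pImmediateContext->ClearDepthStencilView(g_depthStencilView,
            D3D11_CLEAR_DEPTH | D3D11_CLEAR_STENCIL, 1.0f, 0);

        g_pImmediateContext->OMSetRenderTargets(1, &g_pRenderTargetView, g_depthStencilView);

        D3D11_VIEWPORT vp = {};
        vp.Width = static_cast<float>(g_windowWidth);    // hypothetical size variables
        vp.Height = static_cast<float>(g_windowHeight);
        vp.MaxDepth = 1.0f;
        g_pImmediateContext->RSSetViewports(1, &vp);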

    Then, before each time you call Draw*, set all the state that draw relies on. For simple 3D cube drawing this includes, at a minimum (see the sketch after this list):

    • Render Target view, Depth Stencil view, Viewport (which in simple cases can be set once at the start of the frame)
    • BlendState, DepthStencilState, RasterizerState
    • Input layout
    • Any Constant buffers, Pixel shader, Vertex shader, and any required SamplerState or Shader resources.
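
    Something like this, using the globals from your code plus a hypothetical g_pConstantBuffer and g_pIndexBuffer (the 36-index draw matches your commented-out cube call):

        g_pImmediateContext->OMSetBlendState(nullptr, nullptr, 0xFFFFFFFF);
        g_pImmediateContext->OMSetDepthStencilState(g_depthStencilState, 0);
        g_pImmediateContext->RSSetState(g_rasterState);

        g_pImmediateContext->IASetInputLayout(g_pVertexLayout);
        g_pImmediateContext->IASetPrimitiveTopology(D3D11_PRIMITIVE_TOPOLOGY_TRIANGLELIST);

        UINT stride = sizeof(SimpleVertex);
        UINT offset = 0;
        g_pImmediateContext->IASetVertexBuffers(0, 1, &g_pVertexBuffer, &stride, &offset);
        g_pImmediateContext->IASetIndexBuffer(g_pIndexBuffer, DXGI_FORMAT_R16_UINT, 0);

        g_pImmediateContext->VSSetShader(g_pVertexShader, nullptr, 0);
        g_pImmediateContext->VSSetConstantBuffers(0, 1, &g_pConstantBuffer);
        g_pImmediateContext->PSSetShader(g_pPixelShader, nullptr, 0);

        g_pImmediateContext->DrawIndexed(36, 0, 0);

        // SpriteBatch text drawing then follows; it sets its own state in Begin.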

    The DirectX Tool Kit includes a number of helpers like CommonStates and Effects to make this easier to manage, but you still have to make sure all state you rely on is set.
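
    For instance, a rough sketch with CommonStates (the object is created once after device creation, not every frame; the header is CommonStates.h):

        // Created once at startup:
        auto states = std::make_unique<DirectX::CommonStates>(g_pd3dDevice);

        // Then per draw, instead of hand-rolled state objects:
        g_pImmediateContext->OMSetBlendState(states->Opaque(), nullptr, 0xFFFFFFFF);
        g_pImmediateContext->OMSetDepthStencilState(states->DepthDefault(), 0);
        g_pImmediateContext->RSSetState(states->CullCounterClockwise());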

    State management

    State management is an essential requirement for Direct3D programming, and particularly important for performance. At any given time when you call Draw*, all the state needed for that draw needs to be set on the rendering pipeline, yet you really want to minimize the number of times you change state for performance.

    There are really three basic strategies for state management in utility libraries like DirectX Tool Kit:

    1. Capture and restore all state. This is what the legacy Direct3D 9 era ID3DXFont code in the deprecated D3DX9 library did. In Direct3D 9, this was originally accomplished through the fact that the runtime tracked the current state. The "PURE" device was introduced specifically to try to remove this overhead, so state blocks were added as a way to try to make it work. The state vector in Direct3D 11 is also quite large, and the runtime doesn't support 'state blocks' in the same way. Overall, this approach is super easy to use, but never fast.
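
    A rough sketch of what "capture and restore" looks like in Direct3D 11 for just one piece of state, which is why doing it for the whole pipeline is so expensive:

        // Save the currently bound vertex shader (VSGetShader AddRef's it).
        ID3D11VertexShader* savedVS = nullptr;
        g_pImmediateContext->VSGetShader(&savedVS, nullptr, nullptr);

        // ... helper sets its own vertex shader and draws ...

        // Restore and release the saved shader.
        g_pImmediateContext->VSSetShader(savedVS, nullptr, 0);
        if (savedVS)
            savedVS->Release();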

    2. Set all required state, clear used state afterwards. Another option would be to set all the state every time you use a helper like SpriteBatch::Begin, and then have SpriteBatch::End clear any state values it touched. This approach is also conceptually easy because there are no lingering side effects, but in practice it is slow because of all the extra calls to set null state.

    3. Set required state, let later draws clean up the state. This is the option that DirectX Tool Kit uses. I set the state I need to draw, and then leave it alone. If you have more drawing to do after you call my code and before the end of the frame, you need to set that state yourself. In practice the helpers in DirectX Tool Kit only use the standard states common to basically all drawing, but you can definitely get some strange interactions if you aren't aware of this behavior. For example, if you have a Geometry Shader set before calling SpriteBatch, weird things will happen when you call End. I assume that if you used something beyond the usual VS/PS combo, you cleared it before calling into my code. I document all the state used by each helper on the wiki, and in practice this is a good balance of performance and predictable behavior, even though it does sometimes confuse people.
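
    For example, if your own drawing had bound a Geometry Shader, you would clear it yourself before handing the context to SpriteBatch (a sketch using the objects from your Render function):

        // SpriteBatch does not touch the GS stage, so a leftover GS would still be active.
        g_pImmediateContext->GSSetShader(nullptr, nullptr, 0);

        spriteBatch->Begin();
        spriteFont->DrawString(spriteBatch.get(), L"Test", DirectX::XMFLOAT2(0, 0), DirectX::Colors::White);
        spriteBatch->End();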

    Direct3D 11 vs. Direct3D 12 Notes

    In Direct3D 11, any state set at the end of a frame when you call Present is still set at the start of the next frame, so you'll see some older tutorials set state up once before starting the rendering loop and assume it remains unchanged. This is not true of Direct3D 12, where the state is completely reset by the Present. Therefore, it's much better to get in the habit of setting all the state every frame. One way to validate that you did this correctly is to call ClearState right after the Present each frame, although you probably only want to do this in Debug builds since it's a waste of time in a properly written program.
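
    A sketch of that debug-only validation, right after Present in the render loop:

        g_pSwapChain->Present(0, 0);

        #ifdef _DEBUG
        // Wipe all context state so anything the next frame forgets to set
        // shows up immediately instead of silently reusing last frame's state.
        g_pImmediateContext->ClearState();
        #endif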

    In Direct3D 11, the call to RSSetViewports to set the viewport will also initialize the clipping scissor rectangles to the same value. In Direct3D 12, you need to do this explicitly by calling RSSetScissorRects otherwise you end up with a (0,0,0,0) clipping rectangle which clips out everything.
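
    In code that means something like this on the command list each frame (commandList, width, and height are placeholders for your own objects):

        D3D12_VIEWPORT viewport = { 0.0f, 0.0f,
            static_cast<float>(width), static_cast<float>(height), 0.0f, 1.0f };
        D3D12_RECT scissorRect = { 0, 0,
            static_cast<LONG>(width), static_cast<LONG>(height) };
        commandList->RSSetViewports(1, &viewport);
        commandList->RSSetScissorRects(1, &scissorRect);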

    See GitHub for some examples of these basic render loops.