Tags: c++, windows, directx, directx-11, dxgi

How to check if a true hardware video adapter is used


I am developing an application that shows something like a video in its window. I use the technologies described in Introducing Direct2D 1.1. In my case the only difference is that eventually I create a bitmap using

ID2D1DeviceContext::CreateBitmap

then I use

ID2D1Bitmap::CopyFromMemory

to copy raw RGB data to it and then I call

ID2D1DeviceContext::DrawBitmap

to draw the bitmap. I use the high-quality cubic interpolation mode D2D1_INTERPOLATION_MODE_HIGH_QUALITY_CUBIC for scaling to get the best picture, but in some cases (RDP, Citrix, virtual machines, etc.) it is very slow and consumes a lot of CPU. This happens because a non-hardware video adapter is used in those cases, so for non-hardware adapters I want to turn off the interpolation and use a faster method. The problem is that I cannot reliably check whether the system has a true hardware adapter.
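
For illustration, the fallback boils down to choosing the interpolation mode for ID2D1DeviceContext::DrawBitmap (a minimal sketch; m_pD2dContext, m_pBitmap, destRect and bHardwareAdapter are placeholder names):

// Pick the interpolation mode once, based on whether a real GPU was detected.
D2D1_INTERPOLATION_MODE mode = bHardwareAdapter
    ? D2D1_INTERPOLATION_MODE_HIGH_QUALITY_CUBIC // best quality, cheap only on a real GPU
    : D2D1_INTERPOLATION_MODE_LINEAR;            // fast fallback for software rendering

m_pD2dContext->BeginDraw();
m_pD2dContext->DrawBitmap(m_pBitmap.Get(), &destRect, 1.0f, mode, nullptr);
m_pD2dContext->EndDraw();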

When I call D3D11CreateDevice, I use it with D3D_DRIVER_TYPE_HARDWARE, but on virtual machines it typically returns "Microsoft Basic Render Driver", which is a software driver that does not use the GPU (it consumes CPU instead). So currently I check the vendor ID: if the vendor is AMD (ATI), NVIDIA or Intel, I use the cubic interpolation; otherwise I use the fastest method, which does not consume much CPU.
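
For context, the device is created more or less as in the Direct2D 1.1 article (a sketch; the exact flags and feature levels are assumptions, not my real code):

// Even with D3D_DRIVER_TYPE_HARDWARE this can end up on the
// "Microsoft Basic Render Driver" when no real GPU is available.
Microsoft::WRL::ComPtr<ID3D11Device> m_pD3dDevice;
Microsoft::WRL::ComPtr<ID3D11DeviceContext> d3dContext;
HRESULT hr = D3D11CreateDevice(
    nullptr,                          // default adapter
    D3D_DRIVER_TYPE_HARDWARE,         // request a hardware driver
    nullptr,
    D3D11_CREATE_DEVICE_BGRA_SUPPORT, // required for Direct2D interop
    nullptr, 0,                       // default feature levels
    D3D11_SDK_VERSION,
    &m_pD3dDevice, nullptr, &d3dContext);

The vendor ID check on the resulting device then looks like this: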

Microsoft::WRL::ComPtr<IDXGIDevice> dxgiDevice;
if (SUCCEEDED(m_pD3dDevice->QueryInterface(__uuidof(IDXGIDevice), reinterpret_cast<void**>(dxgiDevice.GetAddressOf()))))
{
    Microsoft::WRL::ComPtr<IDXGIAdapter> adapter;
    if (SUCCEEDED(dxgiDevice->GetAdapter(&adapter)))
    {
        DXGI_ADAPTER_DESC desc;
        if (SUCCEEDED(adapter->GetDesc(&desc)))
        {
            // NVIDIA
            if (desc.VendorId == 0x10DE ||
                // AMD
                desc.VendorId == 0x1002 || // 0x1022 ?
                // Intel
                desc.VendorId == 0x8086) // 0x163C, 0x8087 ?
            {
                bSupported = true;
            }
        }
    }
}

It works for physical (console) Windows sessions, even in virtual machines. But for RDP sessions on real machines IDXGIAdapter still reports those vendors, yet the GPU is not used (I can see that via Process Hacker 2 and AMD System Monitor in the case of an ATI Radeon), so I still get high CPU consumption with the cubic interpolation. For an RDP session to Windows 7 with an ATI Radeon it is about 10% higher than via the physical console.

Or am I mistaken and RDP somehow does use GPU resources, and that is why a real hardware adapter is reported via IDXGIAdapter::GetDesc?

DirectDraw

Also I looked at the DirectX Diagnostic Tool. It looks like the "DirectDraw Acceleration" info field shows exactly what I need. For physical (console) sessions it says "Enabled". For RDP and virtual machine sessions (without hardware video acceleration) it says "Not Available". I looked at the sources, and theoretically I could use its verification algorithm, but it is for DirectDraw, which I do not use in my application. I would like to use something directly related to ID3D11Device, IDXGIDevice, IDXGIAdapter and so on.

IDXGIAdapter1::GetDesc1 and DXGI_ADAPTER_FLAG

I also tried to use IDXGIAdapter1::GetDesc1 and check the flags.

Microsoft::WRL::ComPtr<IDXGIDevice> dxgiDevice;
if (SUCCEEDED(m_pD3dDevice->QueryInterface(__uuidof(IDXGIDevice), reinterpret_cast<void**>(dxgiDevice.GetAddressOf()))))
{
    Microsoft::WRL::ComPtr<IDXGIAdapter> adapter;
    if (SUCCEEDED(dxgiDevice->GetAdapter(&adapter)))
    {
        Microsoft::WRL::ComPtr<IDXGIAdapter1> adapter1;
        if (SUCCEEDED(adapter->QueryInterface(__uuidof(IDXGIAdapter1), reinterpret_cast<void**>(adapter1.GetAddressOf()))))
        {
            DXGI_ADAPTER_DESC1 desc;
            if (SUCCEEDED(adapter1->GetDesc1(&desc)))
            {
                // desc.Flags
                // DXGI_ADAPTER_FLAG_NONE         = 0,
                // DXGI_ADAPTER_FLAG_REMOTE       = 1,
                // DXGI_ADAPTER_FLAG_SOFTWARE     = 2,
                // DXGI_ADAPTER_FLAG_FORCE_DWORD  = 0xffffffff
            }
        }
    }
}
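
Inside the innermost block above, the actual test would be something like this (bSoftwareAdapter is just a placeholder name):

// Set for WARP / "Microsoft Basic Render Driver"; as the results below show,
// it is not set for an RDP session on a machine that has a real GPU.
bool bSoftwareAdapter = (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE) != 0;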

Information about the DXGI_ADAPTER_FLAG_SOFTWARE flag

 Virtual machine, RDP, Win Server 2012 (Microsoft Basic Render Driver) -> (0x02) DXGI_ADAPTER_FLAG_SOFTWARE
 Physical Win 10 (Intel Video)  -> (0x00) DXGI_ADAPTER_FLAG_NONE
 Physical Win 7 (ATI Radeon)    -> (0x00) DXGI_ADAPTER_FLAG_NONE
 RDP Win 10 (Intel Video)       -> (0x00) DXGI_ADAPTER_FLAG_NONE
 RDP Win 7 (ATI Radeon)         -> (0x00) DXGI_ADAPTER_FLAG_NONE

For an RDP session on a real machine with a hardware adapter, Flags == 0, but as I can see via Process Hacker 2 the GPU is not used. At least on Windows 7 with an ATI Radeon I see higher CPU usage during an RDP session. So it looks like DXGI_ADAPTER_FLAG_SOFTWARE is set only for the Microsoft Basic Render Driver, and the issue is not solved.

The question

Is there a correct way to check whether a real hardware video card (GPU) is used for the current Windows session? Or maybe it is possible to check whether a specific interpolation mode of ID2D1DeviceContext::DrawBitmap has a hardware implementation and uses the GPU in the current session?

UPD

The topic is not about detecting RDP or Citrix sessions, and not about detecting whether the application runs inside a virtual machine. I already have all those checks and use the linear interpolation in those cases. The topic is about detecting whether a real GPU is used to display the desktop in the current Windows session. I am looking for a more sophisticated solution that makes the decision using features of DirectX and DXGI.


Solution

  • Answering a 3-year-old question, as I struggled with this myself.

    I had to go through the registry. The first thing is to find the adapter LUID in the registry in order to get the adapter GUID:

    private string GetAdapterGuid(long luid)
    {
        var directXRegistryKey = Registry.LocalMachine.OpenSubKey(@"SOFTWARE\Microsoft\DirectX");
        if (directXRegistryKey == null)
            return "";
        var subKeyNames = directXRegistryKey.GetSubKeyNames();
        foreach (var subKeyName in subKeyNames)
        {
            var subKey = directXRegistryKey.OpenSubKey(subKeyName);
            // GetValueKind throws if the value does not exist, so check for it first
            if (subKey == null || subKey.GetValue("AdapterLuid") == null ||
                subKey.GetValueKind("AdapterLuid") != RegistryValueKind.QWord)
                continue;
            var luidValue = (long)subKey.GetValue("AdapterLuid");
            if (luidValue == luid)
                return subKeyName;
        }
        return "";
    }
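
    On the C++/DXGI side of the question, the luid argument for this lookup corresponds to DXGI_ADAPTER_DESC::AdapterLuid; a minimal C++ sketch, reusing the adapter from the question's code:

    DXGI_ADAPTER_DESC desc;
    if (SUCCEEDED(adapter->GetDesc(&desc)))
    {
        // Combine the two LUID halves into the 64-bit value that is stored as
        // the AdapterLuid REG_QWORD under HKLM\SOFTWARE\Microsoft\DirectX.
        LONGLONG luid = (static_cast<LONGLONG>(desc.AdapterLuid.HighPart) << 32)
                      | desc.AdapterLuid.LowPart;
        // pass luid to GetAdapterGuid(...)
    }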
    

    Once you have that GUID, you can look up the details of the graphics card in HKLM like this. If it is virtual, the service name will be "INDIRECTKMD":

    private bool IsVirtualAdapter(string adapterGuid)
    {
        var videoRegistryKey = Registry.LocalMachine.OpenSubKey($@"SYSTEM\CurrentControlSet\Control\Video\{adapterGuid}\Video");
        if (videoRegistryKey == null)
            return false;
        // GetValueKind throws if the value does not exist, so check for it first
        if (videoRegistryKey.GetValue("Service") == null ||
            videoRegistryKey.GetValueKind("Service") != RegistryValueKind.String)
            return false;
        var serviceName = (string)videoRegistryKey.GetValue("Service");
        return serviceName.ToUpper() == "INDIRECTKMD";
    }
    

    Checking the service name felt easier than parsing the DeviceDesc value.

    My use case involved having the GUID ready, so I split it into two functions; you could merge them into one.

    This also only detects RDP/MSTSC; additional service names might be needed for other virtual adapters. Or you could try to detect only NVIDIA/AMD/Intel driver names... up to you.
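
    For completeness, a rough C++ equivalent of the same registry lookup (an untested sketch; the function and variable names are mine, and it needs Advapi32.lib):

    #include <windows.h>
    #include <wchar.h>
    #include <string>

    // Find the adapter GUID whose AdapterLuid value matches the given LUID.
    std::wstring GetAdapterGuid(LONGLONG luid)
    {
        HKEY hKey = nullptr;
        if (RegOpenKeyExW(HKEY_LOCAL_MACHINE, L"SOFTWARE\\Microsoft\\DirectX",
                          0, KEY_READ, &hKey) != ERROR_SUCCESS)
            return L"";

        std::wstring result;
        wchar_t name[256];
        for (DWORD i = 0; ; ++i)
        {
            DWORD nameLen = ARRAYSIZE(name);
            if (RegEnumKeyExW(hKey, i, name, &nameLen,
                              nullptr, nullptr, nullptr, nullptr) != ERROR_SUCCESS)
                break;

            LONGLONG value = 0;
            DWORD size = sizeof(value);
            if (RegGetValueW(hKey, name, L"AdapterLuid", RRF_RT_REG_QWORD,
                             nullptr, &value, &size) == ERROR_SUCCESS &&
                value == luid)
            {
                result = name;
                break;
            }
        }
        RegCloseKey(hKey);
        return result;
    }

    // The adapter is a virtual (RDP/indirect display) adapter if its service is INDIRECTKMD.
    bool IsVirtualAdapter(const std::wstring& adapterGuid)
    {
        const std::wstring path =
            L"SYSTEM\\CurrentControlSet\\Control\\Video\\" + adapterGuid + L"\\Video";
        wchar_t service[256] = {};
        DWORD size = sizeof(service);
        if (RegGetValueW(HKEY_LOCAL_MACHINE, path.c_str(), L"Service",
                         RRF_RT_REG_SZ, nullptr, service, &size) != ERROR_SUCCESS)
            return false;
        return _wcsicmp(service, L"INDIRECTKMD") == 0;
    }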