Tags: visual-studio, sse, visual-studio-debugging, cpu-registers

Meaning of XMM register values shown in Visual Studio debugger's register window


I am finding it hard to interpret the values of the XMM registers in the Registers window of Visual Studio. The window displays the following:

XMM0 = 00000000000000004018000000000000 XMM1 = 00000000000000004020000000000000 
XMM2 = 00000000000000000000000000000000 XMM3 = 00000000000000000000000000000000 
XMM4 = 00000000000000000000000000000000 XMM5 = 00000000000000000000000000000000 
XMM6 = 00000000000000000000000000000000 XMM7 = 00000000000000000000000000000000 

XMM00 = +0.00000E+000      XMM01 = +2.37500E+000      XMM02 = +0.00000E+000      
XMM03 = +0.00000E+000      XMM10 = +0.00000E+000      XMM11 = +2.50000E+000      
XMM12 = +0.00000E+000      XMM13 = +0.00000E+000

From the code that I am running, the values of XMM0 and XMM1 should be 6 and 8 (or the other way round). The register value shown here is: XMM01 = +2.37500E+000

What does this translate to?


Solution

  • Yes, it looks like:

    XMM0 = { 6.0, 0.0 }  // 6.0 = 0x4018000000000000 (double precision)
    XMM1 = { 8.0, 0.0 }  // 8.0 = 0x4020000000000000 (double precision)
    

    The reason you are having trouble interpreting this is that your debugger displays each 128-bit XMM register in hex and then, below that, as 4 x single-precision floats, but you are evidently using double-precision floats.
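    To make the translation concrete, here is a small standalone C sketch (not part of the original post) that takes the low quadword of XMM0 from the dump above and reinterprets it first as a double, then reinterprets its upper 32-bit lane (the one the debugger labels XMM01) as a single-precision float:

        #include <stdio.h>
        #include <stdint.h>
        #include <string.h>

        int main(void)
        {
            /* Low 64 bits of XMM0 as shown in the hex dump */
            uint64_t bits = 0x4018000000000000ULL;

            double d;
            memcpy(&d, &bits, sizeof d);            /* same bits viewed as a double */
            printf("as double       : %g\n", d);    /* prints 6 */

            /* Bits 32-63 are the lane the debugger calls XMM01 */
            uint32_t lane1 = (uint32_t)(bits >> 32);
            float f;
            memcpy(&f, &lane1, sizeof f);           /* same bits viewed as a float */
            printf("lane 1 as float : %g\n", f);    /* prints 2.375 */

            return 0;
        }

    Running the same exercise on 0x4020000000000000 gives 8.0 as a double and 2.5 for lane 1, matching XMM1 and XMM11 in the dump.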

    I'm not familiar with the Visual Studio debugger, but there should ideally be a way to change the representation of your XMM registers; you may have to look at the manual or online help for this.

    Note that in general using double precision with SSE is rarely of any value, particularly if you have a fairly modern x86 CPU with two FPUs.
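
    As a side note on how values like these typically land in XMM0 and XMM1 in the first place: on x64 Windows the calling convention passes the first floating-point arguments in XMM0-XMM3, so a call like the hypothetical one below (function and variable names are illustrative, not taken from the asker's code) would show 6.0 and 8.0 in the low 64 bits of XMM0 and XMM1 when you break inside the callee:

        #include <stdio.h>

        /* Hypothetical callee: under the Microsoft x64 calling convention the
           two double arguments arrive in the low quadwords of XMM0 and XMM1. */
        static double sum(double a, double b)
        {
            return a + b;   /* set a breakpoint here and open the Registers window */
        }

        int main(void)
        {
            printf("%g\n", sum(6.0, 8.0));
            return 0;
        }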