
CUDA NVCC compiler binary variable


I'm trying to do something like this in CUDA:

char_sig=code[k][1] & 0b00000010;

And the NVCC compiler keeps giving me the error `expected a ";"`.

The same code compiles with the GCC C compiler. I noticed the problem is with writing the binary value as `0b00000010`. Is there some other notation that NVCC expects?


Solution

  • Binary constants using the `0b` prefix are a gcc extension and are not part of standard C99 or C++98/C++03. The Open64 and LLVM/clang compilers on which the CUDA toolchain is based don't support this extension. You will need to convert your constants to octal, hexadecimal, or decimal to use them in CUDA. (Binary literals were later standardized in C++14, so recent nvcc releases should accept `0b` notation when compiling with `-std=c++14` or newer.)