Is there a performance issue or any other reason?
There is no performance benefit to using one over the other. Note, however, that a suffix-free hexadecimal literal can take an unsigned integral type, whereas a suffix-free decimal literal is only ever given a signed type, and that can have surprising effects:
void foo(const unsigned&)
{
    // pay me a bonus
}

void foo(const long&)
{
    // reformat my hard disk
}

int main()
{
    foo(0xffffffff); // thankfully unsigned int on a platform with a 32-bit int
}
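For contrast, replace the call in main above with the equivalent decimal literal (a sketch assuming an LP64 platform where long is 64 bits wide; on a platform with a 32-bit long the literal would instead be long long and the call would be ambiguous):

foo(4294967295); // decimal, so deduced as long on LP64: the disk-reformatting overload wins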
See http://en.cppreference.com/w/cpp/language/integer_literal, including the link to C at the bottom of that page.