Here's an (admittedly brain-dead) refactoring algorithm I've performed on several occasions:

1. Start with a .cpp file that compiles cleanly and (AFAICT) works correctly.
2. For each local variable whose declaration lacks the const keyword, prepend the const keyword to its declaration.
3. Compile the .cpp file again.
4. If a variable's declaration no longer compiles with const, remove the const keyword from it; otherwise fix whatever underlying issue the const keyword's addition has revealed.
5. Repeat steps 3 and 4 until the .cpp file again compiles cleanly.

Setting aside for the moment whether or not it's a good idea to "const all the local variables", is there any risk of this practice introducing a run-time/logic error into the program that wouldn't be caught at compile time? AFAICT this seems "safe" in that it won't introduce regressions, only compile-time errors which I can then fix right away; but C++ is a many-splendored thing, so perhaps there is some risk I haven't thought of.
If you're willing to accept a contrived example, you could enter the world of undefined behavior.
#include <iostream>

void increment(int & num)
{
    ++num;
}

int main()
{
    int n = 99;
    increment(const_cast<int&>(n));
    std::cout << n;
}
The above compiles and outputs 100. The below compiles and is allowed to do whatever it wants (but happened to output 99 for me). Modifying a const object through a non-const access path results in undefined behavior.
#include <iostream>

void increment(int & num)
{
    ++num;
}

int main()
{
    const int n = 99;
    increment(const_cast<int&>(n));
    std::cout << n;
}
Yes, this is contrived, because why would someone do a const_cast on a non-const object? On the other hand, this is a simple example; in more complex code, something like this might actually come up. Shrug. I won't claim that this is a big risk, but it does fall under "any risk", as stated in the question.