I'm reading the CLR via C# book which says:
the biggest benefit of IL (intermediate language) isn’t that it abstracts away the underlying CPU. The biggest benefit IL provides is application robustness and security. While compiling IL into native CPU instructions, the CLR performs a process called verification. Verification examines the high-level IL code and ensures that everything the code does is safe. For example, verification checks that every method is called with the correct number of parameters, that each parameter passed to every method is of the correct type, that every method’s return value is used properly, that every method has a return statement, and so on.
I'm a little bit confused. For C or C++, which don't use IL, if we define a function that takes one argument but call it with no arguments, gcc will report a compile-time error saying there are too few arguments. So C and C++ can also provide robustness and security without the concept of an intermediate language. What's the real benefit of using IL?
In C/C++ you can, for example, take a function pointer and cast it to any other function pointer type, even one that does not conform to the original function's signature, and then call through it. That can cause an access violation, stack corruption, or any other undefined behaviour.
Verified IL provides both memory safety and type safety: the verifier guarantees that everything the code does has defined behaviour, rejecting whole classes of errors before the code ever runs. In C/C++, by contrast, undefined behaviour can happen very easily.