I am trying my hand at a bit of C. And I thought I had understood this linking business. But I guess not. I have a simple file main.c:
#include "function.h"
int main(void)
{
    int print = myfunction();
    return print;
}
then a second pair of files function.c/function.h
int myfunction(); //function.h
int myfunction() //function.c
{
    return 5;
}
Compiling this works great. However, it works regardless of whether I use #include "function.h" in my main file or not. Why would I need to include function.h then?
A C compiler doesn't require that you specify the prototype for a function1 before you use it. (Strictly speaking, that was true through C89; C99 removed implicit function declarations, though many compilers still accept them with only a warning.) The prototype just lets the compiler verify that the types of the arguments you pass match the types the function expects -- and implicitly convert an argument to the right type when it doesn't match, provided an implicit conversion between the types involved exists.
As long as your code is perfect -- no mismatch between how you use a function and how that function was intended to be used -- you won't have a problem. In your test, you have a function that takes no parameters and returns an int, and the code that uses it does essentially nothing else. That's a situation that's pretty hard to screw up, and it works just fine. In a real program with hundreds or thousands of functions taking multiple parameters of complex types, the situation changes quickly, and letting the compiler verify that you're calling functions correctly becomes much more important.
1 With the exception of a variadic function -- and even there, the "variable" arguments follow essentially the same rules (the default argument promotions) as if there were no prototype at all.