All my questions are related to the VC++ compiler, but I guess other C++ compilers have the same behavior.
//stdafx.h
#include <boost/variant/variant.hpp>
//test1.cpp
#include "stdafx.h"
#include <boost/variant/variant.hpp>
...
//test2.cpp
#include "stdafx.h"
...
I have no particular knowledge of VC++'s innards. However, based on some knowledge of compiler design and theory, these so-called "precompiled headers" can't be anything more than the result of the initial lexical analysis and tokenization phases of a classical compiler design.
Consider a simple header file that contains the following:
#ifdef FOO
#define BAR 10
#else
#undef FOOBAR
class Foo {
public:
    void bar();
};
#include "foobar.h"
#endif
You have to understand that the effect of using a so-called "precompiled" header must be identical to using the header file as-is.
Here, you don't really know what this header file is going to do. It all depends on which preprocessor macros are defined at the point where the header is actually included. You don't know which macros this header file will define. You don't know which macros it will undefine. You don't know which other header files it will include. You really don't know a lot here.
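To make this concrete, here is a minimal sketch of two hypothetical translation units (the file names and the variable are illustrative) that include the header above, saved as "example.h". The same header behaves completely differently in each one:
//consumer1.cpp -- FOO is defined before inclusion
#define FOO
#include "example.h" // takes the #ifdef branch: only #define BAR 10
int value = BAR;     // OK: BAR expands to 10; class Foo was never declared

//consumer2.cpp -- FOO is not defined
#include "example.h" // takes the #else branch: declares class Foo,
                     // undefines FOOBAR, and pulls in "foobar.h"
Foo instance;        // OK: Foo was declared by this inclusion
A precompiled version of "example.h" would have to reproduce exactly this behavior in both files, which is why so little of the real compilation work can be done ahead of time.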
The only thing you can conceptually do to "precompile" a header file is to pre-parse it: convert the individual elements of the language -- keywords and directives like "#ifdef" and "class", and all the others -- into individual binary tokens, and strip out comments, whitespace, etc.
The first phase of compiling a traditional language is parsing the plain-text source into the internal language elements: the lexical analysis and tokenization phase. Only after the individual language elements have been parsed does the compiler attempt to figure out how the resulting, parsed source code should be turned into an object module, and that's where 99% of the compiler's work happens. The initial lexical analysis phase is comparatively little work, but it's pretty much all you can do to "precompile" the source code: save the internal binary representation of the tokenized source, so that this phase can be skipped when code that actually uses the "precompiled" source is compiled.
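As an illustration only -- this is not VC++'s actual on-disk format, and the token categories are deliberately crude -- the kind of tokenizing pass whose output could be cached might look roughly like this:
#include <cctype>
#include <iostream>
#include <string>
#include <vector>

// A token: a coarse category plus the original spelling.
struct Token {
    enum Kind { Directive, Identifier, Punctuation } kind;
    std::string text;
};

// Split the raw text into tokens, discarding whitespace along the way
// (comment stripping is omitted for brevity).
std::vector<Token> tokenize(const std::string& src) {
    std::vector<Token> tokens;
    std::size_t i = 0;
    while (i < src.size()) {
        unsigned char c = src[i];
        if (std::isspace(c)) { ++i; continue; }
        if (c == '#' || c == '_' || std::isalpha(c)) {
            std::size_t start = i++;
            while (i < src.size() &&
                   (src[i] == '_' ||
                    std::isalnum(static_cast<unsigned char>(src[i]))))
                ++i;
            tokens.push_back({c == '#' ? Token::Directive : Token::Identifier,
                              src.substr(start, i - start)});
        } else {
            tokens.push_back({Token::Punctuation, std::string(1, src[i++])});
        }
    }
    return tokens;
}

int main() {
    for (const Token& t : tokenize("#ifdef FOO\nclass Foo { };\n#endif\n"))
        std::cout << t.text << '\n';
}
Serializing a token stream like this is, conceptually, all the work that can safely be done ahead of time; everything that depends on which macros happen to be defined still has to wait until the header is actually used.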
I am assuming that VC++ places few, if any, restrictions on the contents of precompiled headers. However, if there are some restrictions -- say, the precompiled headers cannot contain any conditional preprocessor directives (#ifdef/#ifndef) except for the classical guards -- then it would be possible to do more work when producing the precompiled headers, and save a little more work here. Other restrictions on the contents of precompiled headers could likewise shift additional functionality into the precompilation phase.
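For reference, the "classical guards" mentioned above are the familiar include-guard pattern (the header and macro names here are illustrative):
// foobar.h
#ifndef FOOBAR_H_INCLUDED
#define FOOBAR_H_INCLUDED

class Foobar {
public:
    void frob();
};

#endif // FOOBAR_H_INCLUDED
After the first inclusion the guard macro is defined and every later inclusion becomes a no-op, so the header's effect is predictable; that predictability is exactly what would let a precompiler safely do more than just tokenize.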