Compilers can process huge amounts of source code, and that code is translated into an AST during compilation. I imagine that if the code is huge, the AST becomes huge as well.
Can we assume that a compiler on a modern computer will never run out of memory while building the AST and keeping it in memory?
With virtual memory, the compiler/linker tools don't really have to worry much about memory footprint.
The tool requests what it needs, and the OS either provides it as virtual memory in the process address space, or [policy decision for the particular machine] refuses to grow the address space at some point, in which case the tool gets an error and typically quits.
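As a minimal sketch of what "gets an error and quits" looks like, here is a toy C++ front end that allocates AST nodes until the allocation fails. The `AstNode` type is hypothetical, purely for illustration; the point is only that a refused allocation surfaces as `std::bad_alloc`, which the tool reports before exiting:

```cpp
#include <cstdlib>
#include <iostream>
#include <memory>
#include <new>
#include <string>
#include <vector>

// Hypothetical AST node, used only for illustration.
struct AstNode {
    std::string kind;
    std::vector<std::unique_ptr<AstNode>> children;
};

int main() {
    std::vector<std::unique_ptr<AstNode>> arena;
    try {
        // Keep allocating nodes, as a front end would while parsing a huge program.
        for (;;) {
            arena.push_back(std::make_unique<AstNode>());
        }
    } catch (const std::bad_alloc&) {
        // The OS refused to grow the address space (or a limit was hit):
        // operator new throws, and the tool reports the failure and quits.
        std::cerr << "fatal error: out of memory while building the AST\n";
        return EXIT_FAILURE;
    }
}
```

(Caveat: on systems that overcommit memory, such as Linux with default settings, the kernel's OOM killer may terminate the process before `bad_alloc` is ever observed, but the outcome for the tool is the same.)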
Of course, you may have a system with a huge VM limit and not enough physical memory to support it. Then the tool will page-thrash until it succeeds or the operator stops it in disgust.
[We have our own compiler that we often use to compile a several million line monolithic program. That compiler typically wants several hundred megabytes of VM to process that program. This is easily provided by most modern PCs.]
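For a rough back-of-envelope on why that footprint is plausible, the sketch below multiplies lines by an assumed node count per line and an assumed cost per node. The specific numbers (5 nodes per line, 40 bytes per node) are illustrative assumptions, not measurements of that compiler:

```cpp
#include <cstdio>

int main() {
    // Illustrative assumptions only, not measurements.
    const double lines          = 3e6;  // "several million" lines
    const double nodes_per_line = 5.0;  // assumed AST nodes per source line
    const double bytes_per_node = 40.0; // assumed bytes per node, incl. overhead

    const double total_bytes = lines * nodes_per_line * bytes_per_node;
    std::printf("estimated AST footprint: %.0f MB\n", total_bytes / 1e6);
    // ~600 MB with these numbers: in the "several hundred megabytes" range,
    // which a 64-bit address space on a modern PC provides without trouble.
}
```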