Suppose that a file server has an uptime of 80%. How many more replicated servers with the same uptime have to be used to give an availability of at least 99.99 percent?
The logic you require follows this pattern:
Server1 is up 80% of the time.
Server2 is up 80% of the time that Server1 isn't up: in other words, server2 covers 80% of the 20% of the time that server1 is down, leaving 20% of 20% = 4% with both down.
Server3 is up 80% of the time that Server1 and Server2 are both down: in other words, server3 covers 80% of that 4%, leaving 20% of 4% = 0.8% with all three down.
Server4 is up 80% of the time that Server1, Server2 and Server3 are all down: in other words, server4 covers 80% of that 0.8%, leaving 20% of 0.8% = 0.16% with all four down.
Etc., etc., ad nauseam.
Does that give you enough information to figure the answer out?
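If you'd rather check your arithmetic than grind it by hand, here's a minimal sketch of the pattern above: each extra server covers 80% of the remaining downtime, so with n servers the combined downtime is 0.2 to the nth power. The variable names are just for illustration.

```python
uptime = 0.80    # each server's individual availability
target = 0.9999  # required combined availability

# Start with the original server and add replicas until the
# combined availability 1 - (downtime)**n reaches the target.
n = 1
while 1 - (1 - uptime) ** n < target:
    n += 1

print("total servers:", n)
print("additional servers:", n - 1)
```

Run it and compare the result with what the pattern above predicts.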