Recently had this T/F question on a Comp. Systems quiz:
Consider the CPU time formula:
CPU Time = IC × CPI × (clock cycle time).
If we only compare the first term IC, RISC performs better.
And the answer was false. Can someone explain why? I thought that since RISC has fewer instructions than CISC, the IC on RISC would be lower, leading to a better CPU time.
IC is the instruction count. It does not mean "how many instructions the ISA defines" but "how many instructions are executed to run a given program".
Since the instructions on a RISC machine tend to be simpler than those on a CISC machine, you need to execute more of them to achieve the same result. For example, a single CISC memory-to-memory add might compile to three RISC instructions: a load, an add, and a store.
i.e., on RISC, IC is higher and therefore worse in isolation (but of course we expect a lower CPI and cycle time to make up for it).
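
A quick sketch with made-up numbers shows how the three terms trade off. The IC/CPI/cycle-time figures below are purely illustrative, not measurements of any real machine:

```python
def cpu_time(ic, cpi, cycle_time_ns):
    """CPU Time = IC x CPI x clock cycle time (result in ns)."""
    return ic * cpi * cycle_time_ns

# Hypothetical CISC machine: fewer instructions executed,
# but each one is complex, so CPI and cycle time are higher.
cisc = cpu_time(ic=1_000_000, cpi=4.0, cycle_time_ns=2.0)   # 8.00 ms

# Hypothetical RISC machine: ~30% more instructions executed,
# but each is simple, so CPI and cycle time are lower.
risc = cpu_time(ic=1_300_000, cpi=1.2, cycle_time_ns=1.0)   # 1.56 ms

print(f"CISC: {cisc / 1e6:.2f} ms, RISC: {risc / 1e6:.2f} ms")
# RISC loses on IC alone, yet wins overall once CPI and
# clock cycle time are factored in.
```

So judging by IC alone, RISC looks worse; the quiz statement is false because a single term of the formula doesn't determine performance.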