In this slide, things look a little off to me. Clock cycle time, or clock period, is already the time required per clock cycle. My question is: does the term "Clock Rate" make sense here?
It also says, "Hardware designers must often trade off clock rate against cycle count". But they are inversely related: if one increases the clock rate, the clock period (time per clock cycle) will decrease automatically. Why would there be a choice?
Or am I missing something?
Clock rate simply means frequency, which is the reciprocal of the duration of a single clock cycle, so the equations make perfect sense.
Regarding the second question, cycle count is the same as "CPU Clock Cycles", the total number of cycles a program takes to execute; it is not the same as the clock period or time per clock cycle. The trade-off exists because design changes that raise the clock rate (e.g., a deeper pipeline) often increase the number of cycles a program needs, so a higher clock rate does not automatically mean lower CPU time.
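To make the relationship concrete, here is a small worked example (with made-up numbers) showing that the two forms of the classic CPU time equation, CPU Time = CPU Clock Cycles × Clock Cycle Time = CPU Clock Cycles / Clock Rate, agree, and that cycle count and clock period are independent quantities:

```python
# Hypothetical numbers: a program needing 10 billion clock cycles
# on a CPU running at 4 GHz.
clock_rate = 4e9             # cycles per second (Hz)
cycle_time = 1 / clock_rate  # seconds per cycle: the reciprocal of clock rate
cpu_clock_cycles = 10e9      # total cycles the program executes (cycle count)

# Both forms of the CPU time equation give the same result:
cpu_time_a = cpu_clock_cycles * cycle_time  # cycles * seconds/cycle
cpu_time_b = cpu_clock_cycles / clock_rate  # cycles / (cycles/second)
print(cpu_time_a, cpu_time_b)  # both 2.5 seconds
```

Note that doubling the clock rate halves the cycle time automatically (they are reciprocals), but it says nothing about `cpu_clock_cycles`, which is why the designer faces a genuine trade-off between the two.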