CS 473 - Introduction

History of RISC and CISC

The ``semantic gap'' drove computer architecture for a long time. The notion was that a gap in level of abstraction exists between the various layers of a computer system (e.g. raw hardware, programmer-visible instruction set, high-level language), which is obviously true, and that this gap is a bad thing, which seemed obviously true as well at the time.

So: both of the earlier extremes were fallacies. CISC attempts to minimize the number of instructions executed, at the expense of CPI and clock rate. RISC attempts to minimize CPI and maximize clock rate, at the expense of instruction count. More current approaches attempt to minimize execution time itself, and are willing to trade off all the other parameters to do it.
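
The tradeoff is easiest to see in the standard processor performance equation, whose three factors are exactly the parameters above:

\[
  \text{Execution time} \;=\; \text{Instruction count} \times \text{CPI} \times \text{Clock cycle time}
\]

CISC attacks the first factor and RISC the last two; what matters in the end is the whole product.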

A note on the DEC view of CPI: when the Alpha was introduced, DEC looked at the VAX's lifespan (~25 years) and saw about a factor of 1000 performance improvement over that time. The assumption: there will be another factor of 1000 in the next 25 years. How?
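
As a rough sanity check on that assumption, a factor of 1000 over 25 years works out to a doubling roughly every 2.5 years:

\[
  1000 \approx 2^{10}, \qquad \frac{25\ \text{years}}{10\ \text{doublings}} \approx 2.5\ \text{years per doubling}.
\]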