Apple put out a minor speed bump on their G5 Xserve machines (they went from 2.0 GHz to 2.3 GHz, and nothing else in the system or architecture changed).
The dual 2.0 GHz Xserve could yield 9 gigaflops of raw processing power (which is really fast to begin with). The dual 2.3 GHz Xserve supposedly yields 35 gigaflops.
Can someone please explain to me how a 15% increase in CPU clock speed gives you nearly four times the processing power? Seems a little odd if you ask me.
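Just to put numbers on why this looks odd: if nothing else in the system changed, throughput should scale roughly linearly with clock speed. A quick back-of-the-envelope check (using the two gigaflops figures quoted above):

```python
# Sanity check: flops should scale roughly linearly with clock
# speed when the architecture is otherwise unchanged.
old_clock = 2.0   # GHz, old dual Xserve
new_clock = 2.3   # GHz, new dual Xserve
old_flops = 9.0   # gigaflops, figure quoted for the old model
claimed_flops = 35.0  # gigaflops, figure quoted for the new model

scaling = new_clock / old_clock          # 1.15, i.e. the 15% bump
expected_flops = old_flops * scaling     # ~10.35 gigaflops

print(f"expected from clock alone: {expected_flops:.2f} gigaflops")
print(f"unexplained factor: {claimed_flops / expected_flops:.1f}x")
```

So a 15% clock bump alone only accounts for about 10.35 gigaflops, leaving an unexplained factor of roughly 3.4x, which is presumably why the two numbers are being measured or marketed differently.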