Are we stuck around 3 GHz? I haven't seen much speed
improvement in years other than more cores and 64-bit. Some
4 GHz parts are around; seems like we should be at 7 or 10 GHz by now.
The gigahertz myth [according to the wiki] is the misconception
that clock rate alone tells you how fast a CPU is, since what a
CPU can or can't do in one cycle varies between designs. [Work
done at 4x the clock speed is identical to work engineered to be
4x more efficient per cycle at 1/4 the speed.]
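A quick back-of-the-envelope sketch of that point (the chips and
numbers here are made up, purely to illustrate clock rate times
work-per-cycle):

    # Effective throughput = clock rate x instructions retired per cycle (IPC).
    # Two hypothetical CPUs: one fast-and-narrow, one slow-and-wide.

    def throughput_gips(clock_ghz, ipc):
        """Billions of instructions per second."""
        return clock_ghz * ipc

    fast_narrow = throughput_gips(4.0, 1)  # 4 GHz, 1 instruction per cycle
    slow_wide = throughput_gips(1.0, 4)    # 1 GHz, 4 instructions per cycle

    print(fast_narrow, slow_wide)  # 4.0 4.0 -- identical work per second

Same work per second either way, which is why the raw GHz number
by itself doesn't settle anything.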
http://en.wikipedia.org/wiki/File:Transistor_Count_and_Moore's_Law_-_2008.svg
A chart of transistor counts over time (it stops at 2008), which
invites a question: will twice the transistors at a given clock
speed actually be utilized twice as efficiently, getting the same
work done in half the time?
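For what it's worth, the trend in that chart is easy to extrapolate
yourself. The 2-billion-transistor starting point and the 2-year
doubling period below are my assumptions, not figures from the chart:

    # Project transistor counts assuming the count doubles every two years.
    # base_count and base_year are assumed values, roughly where the chart ends.

    def projected_count(base_count, base_year, year, doubling_years=2.0):
        return base_count * 2 ** ((year - base_year) / doubling_years)

    for year in (2008, 2010, 2012, 2014):
        print(year, "%.1e" % projected_count(2e9, 2008, year))
    # 2008 2.0e+09, 2010 4.0e+09, 2012 8.0e+09, 2014 1.6e+10

Whether software actually puts all those extra transistors to work
is the open question.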
Six years ago, an experimental chip in a research lab was run at
500 GHz. Speculation has been put forth that "atomic-level"
miniaturization will be the final limit, perhaps 20 years out.
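That 20-year guess is easy to sanity-check with crude assumptions.
The three numbers below (a ~22 nm process today, a ~0.2 nm silicon
atom, and feature size shrinking by sqrt(2) every two years, i.e.
density doubling every two years) are mine, not from any source:

    import math

    # Years until linear feature size reaches atomic scale.
    feature_nm = 22.0   # assumed current process node
    atom_nm = 0.2       # rough diameter of a silicon atom
    steps = math.log(feature_nm / atom_nm, math.sqrt(2))
    print("~%.0f years" % (steps * 2))  # ~27 years

About 27 years under these assumptions, so the 20-year figure is at
least in the right ballpark.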
Some theorists speculate further about a "technological
singularity": a point at which progress in technology becomes
effectively instantaneous.
However sweet that would be, presumably you would then see only
what you thought you saw across so many advancements. A focal
imperative, possibly, for events to follow: recognizably
futuristic technology arriving through rapidly succeeding design
implementations toward some axiomatic end. Entropy, randomness,
and evolution aside, one hopes the machinery stays on this side of
a conscious event horizon and doesn't go awry, thinking
preposterous silly thoughts such as taking us mere mortals over.