Hi!
I have seen a lot of posts about the so-called end of Moore's Law, or at
least the seeming end of the continuous Intel/AMD drive to release
faster chips. I have found surprisingly little direct speculation on
what this will mean for new systems over the next couple of years.
For example, is this merely a speed bump that they'll get around? Is
the issue mostly heat, or is quantum tunnelling of electrons really
beginning to limit further down-scaling? Will multi-core architectures
allow effective speed to keep ramping up even if clock speed doesn't?
Or are the old days of fast-doubling gains gone forever?
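(To make the multi-core question concrete, here is the rough
back-of-the-envelope I keep running, a few lines of Python. It just
assumes Amdahl's law with a guessed parallel fraction, nothing
measured, so treat the numbers as illustration only:)

    # Amdahl's law: if a fraction p of the work can run in parallel
    # across n cores, best-case speedup = 1 / ((1 - p) + p / n).
    def amdahl_speedup(p, n):
        return 1.0 / ((1.0 - p) + p / n)

    # Guessing that 60% of a game's per-frame work parallelizes:
    for n in (1, 2, 4, 8):
        print(n, "cores:", round(amdahl_speedup(0.6, n), 2), "x")

    # Prints roughly 1.0x, 1.43x, 1.82x, 2.11x, i.e. doubling the core
    # count doesn't double effective speed the way clock doublings did.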
All this just invites speculation, but I would like to see what people
think. The practical reason is that I have "always" bought a new
gaming computer every two years since 1994. Except I didn't in 2004,
because it hardly seemed necessary. Now, in 2005, I might just go
ahead and upgrade, but it again seems barely necessary (I did spring
for a new video card). I don't want to buy a system at the end of a
tech-generation if great things are around the corner. But if the pace
of change has merely slowed, that might be an excuse to spend a bit
more and assume the system lasts 3 or 4 years instead of 18 or 24
months. Just looking for thoughts and comments...
Thanks,
Craig