Yousuf Khan said:
Carlo Razzeto said:
Interesting article, I honestly don't think they're way too far off
base... I wouldn't be surprised to see the vast majority of diversity
in CPU architecture disappear over the next few years.
George Macdonald said:So now we have Sun, post-Ed (Zander that is). This could kill them off....
or?? I guess they could always buy up Gateway.<guffaw>
jack said:I feel like I'm having a major case of Deja Vu! Way back in 1983, I was
working at Zilog in Cupertino, California, and I remember the head of
the Engineering department saying almost exactly the same thing (can't
remember his name). I mean, this is almost word-for-word (the part
about "brain damaged" memory management and register allocation =
Russian roulette)! The reason I remember this is that I was SO struck
by his comments (he was ex-Intel) and couldn't believe he actually
confided in me, a fresh out-of-college puke.
Man, what a trip down memory lane to read this article. As I said,
tooooooo funny!
Yousuf said:What do you mean? I thought Sun was claiming Opteron to be their saviour?
Rob Stow said:No. Sun is merely seeing 10K Opteron server sales per quarter -
and growing - and they have decided that they want to be part of
that market. Sun is a big enough company that even if they had had
*all* of that market it wouldn't have saved them from bleeding
red ink.
In comp.sys.ibm.pc.hardware.chips Yousuf Khan said:I can't fault them for any flaws in logic either. It makes sense that x86
descendants will take over the world, especially as they get expanded and
cleaned up through natural evolutionary processes. They've taken the
time to explain what the remaining advantages were in proprietary
architectures over x86, and how they are now mostly disappearing too.
Mike Tomlinson said:"At least there will be less fundamental change in the industry a decade
from now, the x86-128 will be much less of an upset."
Aaaaarrrghhh! -- already talking of x86-128..
LOL so we go 65 bit
It's probably not going to be necessary until 2050 though.
Somebody pointed out that you can apply that highly convenient and highly
overused Moore's Law, where you take whatever computer metric is convenient
and double it every year-and-a-half. Every additional bit on top of 32-bit
would be another doubling of memory; 33 bits is double of 32 bits, 34 bits
is double of 33, etc. So going from 32-bit to 64-bit is 32 doublings, or
about 50 years, to completely use up the full 64-bit address space.
Yousuf Khan
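As a quick sanity check on that arithmetic, here is a small Python sketch (mine, not from the thread) under the same loose assumption of one doubling every 18 months:

    # One extra address bit is one more doubling of addressable memory, and
    # the loose Moore's Law reading above gives one doubling every 1.5 years.
    DOUBLING_PERIOD_YEARS = 1.5

    extra_bits = 64 - 32                        # 32 doublings from 32-bit to 64-bit
    years = extra_bits * DOUBLING_PERIOD_YEARS

    print(extra_bits, "doublings ->", years, "years")  # 32 doublings -> 48.0 years

Forty-eight years from the early 2000s is where the rough 2050 figure comes from.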
David Schwartz said:This is one way to look at it, but there's another way to look at it. If
you're trying to perform certain types of computations, say multiply two
large numbers, a 32-bit processor will be able to do it faster than a 16-bit
processor at the same instruction rate.
If there's no other way to make processors keep getting faster, adding
more bits will allow them to perform at least some types of computations
more rapidly. If we have to go to 128 bits to do this, we will. And that
could happen long before 2050.
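A rough way to see David's point (my own sketch, not anything from the thread): with schoolbook long multiplication, two n-bit numbers split into w-bit words take on the order of (n/w)^2 single-word multiplies, so doubling the word size cuts the work for a big multiply by roughly a factor of four at the same instruction rate.

    # Count of single-word multiplies for a schoolbook n-bit product when the
    # operands are split into word_bits-sized limbs (illustrative only).
    def limb_multiplies(n_bits, word_bits):
        limbs = -(-n_bits // word_bits)   # ceil(n_bits / word_bits)
        return limbs * limbs

    for w in (16, 32, 64, 128):
        print(f"{w:>3}-bit words: {limb_multiplies(1024, w):5d} multiplies for a 1024-bit product")
    # 16-bit: 4096, 32-bit: 1024, 64-bit: 256, 128-bit: 64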
Yousuf Khan said:If Intel and AMD and the rest of the x86 field are smart, they will set up a
consortium or a committee to drive x86 development, much like Sparc
International does for Sparc, MIPS International does for MIPS, or Arm
Holdings does for ARM. The time is right to turn x86 from a de facto standard
into a true de jure standard.
Yousuf Khan