Intel says no to 64-bit until MS Longhorn arrives?

  • Thread starter: Black Jack
Eric said:
Not foolish, smart! Do you think Intel is just sitting there? What do you
think the initial costs are in marketing, developing compilers, 64-bit
apps, OSes, drivers, etc.? Who's funding that? Not Intel; AMD! And once
that is all done, Intel releases their "32 bit extended to 64 bit" CPU to
the world (the one (or more) they are testing and refining now), and who's
ahead then? Remember, it's about money, that's all, nothing more, just money,
and in the end Intel will come out on top in this saga.
Eric

I have to agree. Carl is way off on this one. The support simply IS NOT
there. No beta OS can be considered support and isn't MS releasing 64-bit
Windows for the Advanced Server product? I haven't heard anything lately on
Windows XP Home edition for 64-bit. If they are only supporting 64-bit on
the server OS for now, then Intel is smart for only doing 64-bit on Xeon
while waiting for the desktop support. Why support something that doesn't
exist? It's a waste of money. Good move by Intel on this one. The server
market is where 64-bit is needed for the most part. Enabling a feature that
won't be used due to non-support doesn't make much sense.
 
In the sense that you take something that has a capability and you
deliberately disable that capability.


No, I think you're missing the point. I don't buy crippled chips. I want
the best technology that exists, not something that's deliberately had its
capabilities reduced for marketing reasons.


Exactly, in other words, a crippled chip.


How does crippling a chip save them money? Are you suggesting they can
sell a crippled chip for more than the corresponding uncrippled chip?

Never really asked anybody who would know for sure, but I'm assuming
that disabling part of the cache improves the yield. That is to say
that, where a smaller cache is the discriminator, they must be able to
sell some chips as "value" chips that would otherwise be unsalable.
That's only a slight generalization of binning.

Not so obvious that a similar strategy is worth the bother for the
transistors that make up the 64-bit extensions, but I wouldn't know
that it isn't.

Whatever the effects of such strategies on cost of goods sold, that
cost is not what drives the decision to sell chips that have part of
their capability deliberately disabled.

Such decisions are driven by a desire to segment the market and sell
essentially the same product to different market segments at different
price points. If done properly, the net effect is a win all around:
more chips are sold at an average lower price and the manufacturer
gets increased gross margin in absolute dollars if not necessarily in
percentage terms.
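The win-all-around claim is easy to see with made-up numbers (none of these figures are Intel's; they are purely illustrative):

```python
# Illustrative price-discrimination arithmetic (invented numbers, not
# Intel's actual costs or prices). One die design, two market segments.
unit_cost = 40          # marginal cost per die, same either way

# Single-price strategy: one uncrippled part at $300.
single_buyers = 100     # only the high end of the market buys
single_margin = single_buyers * (300 - unit_cost)

# Segmented strategy: the same die sold as a $300 part and a $150
# "value" part with a feature disabled.
high_buyers, low_buyers = 100, 150
seg_margin = high_buyers * (300 - unit_cost) + low_buyers * (150 - unit_cost)

avg_price = (high_buyers * 300 + low_buyers * 150) / (high_buyers + low_buyers)
print(single_margin)    # 26000
print(seg_margin)       # 42500
print(avg_price)        # 210.0
```

With these numbers the segmented strategy sells more chips at a lower average price ($210 vs. $300) and still grows gross margin in absolute dollars ($42,500 vs. $26,000), even though margin as a percentage of revenue falls, exactly the trade described above.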

"What is it with Intel and crippled chips?" They're the master of
this particular game, that's all.

While you, as a consumer, can choose not to buy what you regard as
crippled chips, consumers would be the net losers if manufacturers
weren't smart enough to segment markets effectively.

RM
 
I have to agree. Carl is way off on this one. The support simply IS NOT
there. No beta OS can be considered support and isn't MS releasing 64-bit
Windows for the Advanced Server product? I haven't heard anything lately on
Windows XP Home edition for 64-bit.

Microsoft has released two x86-64 betas so far. The first is Win2K3
Server, the second is WinXP Professional. As of right now, there will
be no WinXP Home Edition.

Of course, outside of Microsoft-land, x86-64 support is coming along
very well. SuSE, Redhat and Mandrake have had their "enterprise"
Linux distributions out for close to a year now, and their "desktop"
Linux distributions for x86-64 all shipped last fall. Gentoo and
Fedora have full x86-64 support now and Debian is coming along too,
taking care of the main community-supported Linux distros. On the BSD
front, FreeBSD has a fully supported port while NetBSD has a
finished-but-not-yet-released port. And finally, Sun has announced that
they plan to release Solaris for x86-64 sometime around mid-year.

In short, there is rather broad operating system support for the chip,
nearly everything except for WinXP Home.
If they are only supporting 64-bit on
the server OS for now, then Intel is smart for only doing 64-bit on Xeon
while waiting for the desktop support. Why support something that doesn't
exist? It's a waste of money. Good move by Intel on this one. The server
market is where 64-bit is needed for the most part. Enabling a feature that
won't be used due to non-support doesn't make much sense.

MANY desktop users will be able to use x86-64 support. Not only all
the people running Linux, but also all of those running WinXP
Professional. Microsoft recommends WinXP Pro for all business desktop
users, and while I'm sure many don't follow that recommendation, I'm
sure that there are still LOTS of people running that version of
Windows.
 
Not foolish, smart! Do you think Intel is just sitting there? What do you
think the initial costs are in marketing, developing compilers, 64-bit
apps, OSes, drivers, etc.? Who's funding that? Not Intel; AMD! And once
that is all done, Intel releases their "32 bit extended to 64 bit" CPU to
the world (the one (or more) they are testing and refining now), and who's
ahead then? Remember, it's about money, that's all, nothing more, just money,
and in the end Intel will come out on top in this saga.
Eric
The Intel CPU will have these bugs... oh, I mean errata sheet...
No one is gonna buy; everyone will wait for the next mask.
By the time they get it right, AMD will be at 128 bits ;-)
JP
 
Jan said:
The Intel CPU will have these bugs... oh, I mean errata sheet...
No one is gonna buy; everyone will wait for the next mask.
By the time they get it right, AMD will be at 128 bits ;-)
JP

Well, no one is going to go to 128 bits any time soon.
There is no reason to. Actually, there is really no need for 64 bits
on the home desktop. In the server world (think big servers, i.e. Nasdaq,
banking, etc.), 64 bits is badly needed, and Itanium (IA-64) fills that bill
very nicely. x86-64 won't change that. The only reason x86-64 is even being
worked on is because it sits well with the marketing folks.
I am willing to bet that you will NEVER see AMD or Intel
produce a mainstream 128-bit processor (at the most, maybe a "contracted
for" specialty CPU for some kind of gov scientific use) in your lifetime.
As for bugs, surely you don't suggest that AMD is bug-free?
Both companies work very hard to produce bug-free chips, but
with the current chip complexity (in both camps) that is nearly
impossible.
Eric
 
Well, no one is going to go to 128 bits any time soon.
There is no reason to. Actually there is really no need for 64 bits
on the home desktop.

I'd tend to strongly disagree with this statement. As soon as you get
beyond about 2GB of memory in a desktop system you start running into
some real problems on a 32-bit system. Yes, you can get around them
with dirty hacks, but switching to 64-bit is the *proper* solution.
It is neat, tidy and it works well with less complexity, both on the
hardware side and especially on the software side.

2GB of memory isn't really all that much these days. The current
standard is 1GB for desktops (has been for almost a year), and that
number doubles roughly every 18 to 24 months. So by 2005, 64-bit
desktops will be rather badly needed, and by 2007 it will be rather
critical.

So when do you implement it? Ideally about 3-5 years ago! Software
is always a ways behind the hardware, so you need a fair bit of time
to get the ball rolling. AMD's introduction of 64-bit desktop chips
last year is, IMO, really rather late. Intel's plan to introduce
64-bit workstation/server chips this year is VERY late, though they
are fortunate that AMD has done a lot of the leg-work for them.
Intel's plan to only release 64-bit desktop processors in 2007 is
simply stupid if you ask me!
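That doubling arithmetic can be sketched out, taking the post's assumptions at face value (1 GiB as the 2004 desktop standard, a doubling every 18-24 months, and trouble starting around the 2 GiB mark on 32-bit systems):

```python
# Rough projection of when the desktop RAM standard outgrows what a
# 32-bit system handles comfortably (~2 GiB with a 2 GiB/2 GiB split).
# Assumptions from the post: 1 GiB standard in 2004, doubling every
# 18-24 months. Illustrative arithmetic only.
GIB = 2**30
COMFORT_LIMIT = 2 * GIB

for months_per_doubling in (18, 24):
    ram, year = GIB, 2004.0
    while ram <= COMFORT_LIMIT:
        ram *= 2
        year += months_per_doubling / 12
    print(f"{months_per_doubling} mo/doubling: {ram // GIB} GiB standard by {year}")
```

Both assumptions put the standard desktop past the 2 GiB comfort zone somewhere in the 2007-2008 window, which is why getting 64-bit software rolling well before then matters.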
In the server world (think big servers, i.e. Nasdaq, banking,
etc.), 64 bits is badly needed and Itanium (IA-64) fills that bill very
nicely. x86-64 won't change that. The only reason x86-64 is even being
worked on is because it sits well with the marketing folks.

You can buy an Opteron system for $3000 and get a machine that will
outperform a $12,000 Itanium system for many applications, while using
less power but otherwise having all of the same reliability features.

Which system do you think most companies will choose?

Also don't forget the workstation market, many people need 4GB+
*TODAY*, but the limits of 32-bit software make it completely useless
to have more than 3GB of memory. These people have had to go for
really expensive but comparatively slow Unix machines, simply for the
large memory access. There are tons of workstation users that are
ecstatic about the Opteron.
I am willing to bet that you will NEVER see AMD or Intel
produce a mainstream 128 bit processor (at the most, maybe a "contracted
for" specialty cpu for some kind of gov scientific use) in your lifetime.

You are probably correct, or at the very least any 128-bit processors
are unlikely to resemble any chips we know today. Of course, I don't
think ANYONE would believe back in 1978 when the 8086 was introduced
that we would still be producing new 8086 compatible chips 26 years
later, but here we are!

FWIW 64-bits would presumably start to become a limiting factor
somewhere between 2045 and 2075 if memory capacity continues to
increase at anything like the exponential rate we've seen over the
past 20 years. I certainly hope to still be kicking around in 2045
and maybe even until 2075 if medical science keeps it up.
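The back-of-the-envelope behind a range like that, again assuming ~1 GiB (2^30 bytes) as the 2004 baseline and an 18-24 month doubling period, looks like:

```python
# Doublings needed to go from ~1 GiB (2^30 bytes, the 2004 desktop
# standard) to a full 2^64-byte address space, and the years that
# implies at 18- and 24-month doubling periods. Illustrative only.
doublings = 64 - 30            # 34 doublings from 2^30 to 2^64
for months in (18, 24):
    years = doublings * months / 12
    print(2004 + years)        # 2055.0, then 2072.0
```

This lands at roughly 2055-2072, in the same ballpark as the 2045-2075 range above; the exact endpoints depend on the baseline and doubling rate you assume.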
As for bugs, surely you dont suggest that AMD is bug free?
Both companies work very hard to produce bug free chips, but
with the current chip complexity (in both camps) that is nearly
impossible.

Certainly all chips have bugs. For the last while AMD has had
noticeably fewer bugs than Intel if one simply counts the "errata"
sheets, though as I've mentioned previously in this newsgroup, that
could be either because AMD makes less buggy chips or simply that
Intel documents their chips better. Either way these bugs are almost
always rather inconsequential.
 
In comp.sys.ibm.pc.hardware.chips Eric said:
Well, no one is going to go to 128 bits any time soon.
There is no reason to. Actually there is really no need for 64 bits
on the home desktop.

32 bits is getting pretty tight with a single gigabyte of physical RAM, so
you're just wrong there. If you didn't use any redundant virtual addressing,
32 bits would carry you to 4GB of physical RAM, but Linux has issues with 1GB
on 32 bits, and even with a VAX/NT-style 2GB/2GB split you'll start hitting
problems after just under 2GB of physical RAM.

At today's prices, 1GB is not that unusual for home systems, and the next
time we get a serious price drop -- probably a year or two, since the shift
to DDR2 will probably keep prices up for a bit -- seeing 2GB on home systems
won't be that unusual.
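The split arithmetic works out as follows (the kernel/user split is an OS build-time choice; the 2/2 and 3/1 figures are the usual NT and Linux defaults, not the only possibilities):

```python
# How much linearly-addressable user memory each common 32-bit
# kernel/user virtual-address split leaves. The kernel reserves the
# upper part of every process's 4 GiB virtual space for itself.
GIB = 2**30
total_va = 2**32                        # 4 GiB of 32-bit virtual space

splits = {"2/2 (NT default)": 2 * GIB,  # kernel takes the upper 2 GiB
          "3/1 (Linux default)": 1 * GIB}
for name, kernel_part in splits.items():
    user_part = total_va - kernel_part
    print(name, user_part // GIB, "GiB for user space")
```

So even before physical RAM reaches 4GB, a single process tops out at 2-3GB of usable address space, which is why problems show up "after just under 2gb" on NT-style systems.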
In the server (think big servers, ie nasdaq, banking etc) world, 64 bits
is badly needed and itanium (ia64) fills that bill very nicely.

Underperforming and overheated for a lot of workloads, not to mention
basically useless on legacy x86 software.

And >2GB physical RAM, or even >4GB physical RAM, is no longer the domain of
really big servers. x86-64 gives that option, and I expect to see it taken
advantage of.
x86-64 wont change that. The only reason x86-64 is even being worked on is
because it sits well with the marketing folks.
I am willing to bet that you will NEVER see AMD or Intel produce a
mainstream 128 bit processor (at the most, maybe a "contracted for"
specialty cpu for some kind of gov scientific use) in your lifetime.

Certainly not in the near future. A real 64 bit virtual address is enough
for SASOS stuff, and the current 40-48 bit "64-bit" systems will be enough
for physical memory for many more doublings yet -- even if things keep going
at their most optimistic, we've got a decade or two.

My lifetime may be about another 50 years; one of the few uses for a 128-bit
virtual address space I can think of is to push SASOS stuff out to a
bigger-than-today's internet. 64 bits should be fine for it for now, but
increasingly it's not sounding THAT big given the need to avoid
predictability or collisions.
 
I'd tend to strongly disagree with this statement. As soon as you get
beyond about 2GB of memory in a desktop system you start running into
some real problems on a 32-bit system.

There's a BIG difference between 32 bit processing vs 64 bit,
and how much memory you can address.

Memory addressing is NOT tied directly to the processing bits.





To reply by email, remove the XYZ.

Lumber Cartel (tinlc) #2063. Spam this account at your own risk.

This sig censored by the Office of Home and Land Insecurity....
 
Never anonymous Bud said:
There's a BIG difference between 32 bit processing vs 64 bit,
and how much memory you can address.

If you can't do arithmetic on >32-bit pointers, you're going to have extra
trouble using more than 32 bits of linearly-addressable virtual memory. PAE
is a hack.
Memory addressing is NOT tied directly to the processing bits.

"Processing bits" is an ambiguous term. It can mean the ALU width, the width
of certain registers (general purpose or not) or the abstract virtual
address width.

The three need not be the same. But they often are, and I can't think of any
major non-embedded architectures left where they aren't.
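The PAE case illustrates the gap between the widths (PAE extends the P6 family's physical addresses to 36 bits while pointers stay 32-bit):

```python
# Why PAE is a workaround rather than a fix: it widens *physical*
# addresses to 36 bits, but a process's pointers are still 32-bit,
# so each process still sees at most 4 GiB of virtual address space.
GIB = 2**30
physical_with_pae = 2**36 // GIB       # machine can hold 64 GiB of RAM
per_process_virtual = 2**32 // GIB     # but a pointer still spans 4 GiB
print(physical_with_pae, per_process_virtual)   # 64 4
```

The machine as a whole can use the extra memory, but no single program can address it linearly, which is exactly the "extra trouble" with pointer arithmetic described above.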
 
You can buy an Opteron system for $3000 and get a machine that will
outperform a $12,000 Itanium system for many applications while using
less power but otherwise having all of them same reliability features.

Which system do you think most companies will chose?

I think quite a number of companies might still be choosing the
Itanium, because it's an Intel (music) :P Especially if the guy making
the recommendation knows his boss only knows what's Intel, and he's
liable to get royally screwed if even one thing goes wrong on the
Opteron (whether it's actually the hardware or not).

FWIW 64-bits would presumably start to become a limiting factor
somewhere between 2045 and 2075 if memory capacity continues to
increase at anything like the exponential rate we've seen over the
past 20 years. I certainly hope to still be kicking around in 2045
and maybe even until 2075 if medical science keeps it up.

Off topic, but why would anybody want to live until like 90+? Or even 70+?
We'll look more like raisins and prunes than human by then, and even the
best plastic surgery has its limits. Not to mention the idea of being
able to do nothing except sit around the whole day, unable to see
what's really on the screen, requiring a nurse to take me to the
bathroom, and all that really turns me off this whole longevity and forever
living stuff.
--
L.Angel: I'm looking for web design work.
If you need basic to med complexity webpages at affordable rates, email me :)
Standard HTML, SHTML, MySQL + PHP or ASP, Javascript.
If you really want, FrontPage & DreamWeaver too.
But keep in mind you pay extra bandwidth for their bloated code
 
There's a BIG difference between 32 bit processing vs 64 bit,
and how much memory you can address.

Memory addressing is NOT tied directly to the processing bits.

Actually, these days it really is. While we used to define "bitness"
to be the size of the registers or even the width of the data bus,
when it comes to 32 vs. 64 bit CPUs, the real definition is how much
memory the chips can address without resorting to ugly hacks.

Most 64-bit CPUs, including the Opteron, will usually use 32-bit (or
smaller) integers anyway, even though they can handle up to 64-bit
integers natively. The fact that they have 64-bit wide registers
really isn't all that crucial as compared to the fact that they can
handle 64-bit pointers.
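That distinction is visible in the 64-bit C data models, where `int` typically stays 32-bit while pointers grow to 64-bit. A quick check with Python's ctypes (the pointer size reported depends on whether the interpreter itself is a 32- or 64-bit build):

```python
import ctypes

# On 64-bit builds, plain C ints are still 4 bytes while pointers are 8:
# "64-bit" mostly means 64-bit addresses, not 64-bit integers everywhere.
int_size = ctypes.sizeof(ctypes.c_int)       # 4 on all mainstream ABIs
ptr_size = ctypes.sizeof(ctypes.c_void_p)    # 8 on a 64-bit build, 4 on 32-bit
print(int_size, ptr_size)
```

This is the LP64-style convention most 64-bit systems adopted: widen the pointers, keep `int` at 32 bits, so most integer data stays the same size.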
 
I think quite a number of companies might still be choosing the
Itanium, because it's an Intel (music) :P Especially if the guy making
the recommendation knows his boss only knows what's Intel, and he's
liable to get royally screwed if even one thing goes wrong on the
Opteron (whether it's actually the hardware or not).

That's a possibility, though more and more I think it's a question of
Dell vs. HP rather than Intel vs. AMD.
Off topic, but why would anybody want to live until like 90+? Or even 70+?
We'll look more like raisins and prunes than human by then, and even the
best plastic surgery has its limits. Not to mention the idea of being
able to do nothing except sit around the whole day, unable to see
what's really on the screen, requiring a nurse to take me to the
bathroom, and all that really turns me off this whole longevity and forever
living stuff.

I don't know about you, but I know several people in their 70s still
living rather active lifestyles, with very little pruning going on there.
Heck, my uncle is in his 70s now and he is only just starting to get his
first few gray hairs. I'm not one for the whole sitting around and
needing help for the basic necessities of life, but I think it's
entirely reasonable to expect that I'll be enjoying my life well into my
70s and perhaps beyond.
 
nicely. x86-64 won't change that. The only reason x86-64 is even being
worked on is because it sits well with the marketing folks.
I am willing to bet that you will NEVER see AMD or Intel
produce a mainstream 128-bit processor (at the most, maybe a "contracted
for" specialty CPU for some kind of gov scientific use) in your lifetime.
As for bugs, surely you don't suggest that AMD is bug-free?
Both companies work very hard to produce bug-free chips, but
with the current chip complexity (in both camps) that is nearly
impossible.
Eric
One point: 64-bit IS needed. What we need is a factor-of-100-or-more
increase in speed for desktop video processing.
And that will only increase with HDTV.
Already people are asking for on-the-fly encoding of HDTV on the PC.
That would require a terahertz clock or so.
64 bits helps.
JP
 
Crippled, how so? Did you not get what was advertised and what you paid for?
All Intel is doing is adding some transistors to a chip, testing them
internally and disabling them before selling the chip on the market, they
never claimed that part to be even available. Taking the costs of spinning
silicon into account, I'd say they have a rather innovative way to save
money, wouldn't you?

Many computer manufacturers have done that. But while they are saving
some money on design, still, those poor transistors are going to
waste. Why can't they just make only chips with those transistors
enabled, and sell them at the price of a chip with them disabled,
since the marginal cost of making them is the same either way?

You mean it actually costs money to design these chips? (Well, of
course it does, but we assumed they sold so many that it added up to
1/10 of a cent per chip sold or something like that.)

John Savard
http://home.ecn.ab.ca/~jsavard/index.html
 
Eric said:
Well, no one is going to go to 128 bits any time soon.
There is no reason to.

There is at least one: marketing.
In the server (think big servers, ie
nasdaq, banking etc) world, 64 bits is badly needed and itanium
(ia64) fills that bill very nicely. x86-64 wont change that. The
only reason x86-64 is even being worked on is because it sits well
with the marketing folks.

No, AMD64 is being worked on because x86 works OK for most jobs an
average server does. In fact it works BEST from the cost-per-job
(TCO) standpoint.
I am willing to bet that you will NEVER
see AMD or Intel produce a mainstream 128 bit processor

You got it. I shall give you my bank account number when that happens.
(at the
most, maybe a "contracted for" specialty cpu for some kind of gov
scientific use) in your lifetime.

More like for the new game console.


Regards.
 
Many computer manufacturers have done that. But while they are saving
some money on design, still, those poor transistors are going to
waste. Why can't they just make only chips with those transistors
enabled, and sell them at the price of a chip with them disabled,
since the marginal cost of making them is the same either way?

Probably because it costs a lot more (IIRC from past discussions, it's
in the millions) to make another set of photolithography masks to build
them entirely without these transistors. Thus it makes more sense to
disable them rather than make a totally different chip.

Not to mention, with disable-able parts, a chip that fails partially
(i.e. one of the HT controllers, or part of the cache) could well be
ok as a crippled part.


 
I don't know about you, but I know several people in their 70s still
living rather active lifestyles, very little pruning going on there.
Heck, my uncle is in his 70s now he is only just starting to get his
first few gray hairs. I'm not one for the whole sitting around and
needing help for the basic necessities of life, but I think it's
entirely reasonable to feel that I'll be enjoying my life well into my
70s and perhaps beyond.

*looks around locally* Darn, I think it must be the air here and the
lifestyle... I should migrate to a colder and slower paced country...

Generally speaking, it seems to me here that by the time we hit 60~70
here, there isn't much of a 'life' so to speak, unless you are oodles
rich. :(


 
Well what's the point of disabling the feature at the chip level?

Marketing? Perhaps it's not tested so they can't guarantee that
function/compatibility? They're not selling the feature and
there is a significant risk of not "getting it right", so it
makes sense not to market the feature until it is well tested
(and there is a market).
 
In the sense that you take something that has a capability and you
deliberately disable that capability.

Oh, my. Pretty much all RAM widgets have bits that you can't
see. You paid fer 'em, you insist on using them? Please. You
paid for a defined function, and they delivered it. If they had
to put a few transistors in there to make themselves happy,
what's your bitch?
No, I think you're missing the point. I don't buy crippled chips. I want
the best technology that exists, not something that's deliberately had its
capabilities reduced for marketing reasons.

Then pay for it. This is *nothing* new. Intel's 80186 and 80188
were the same chips, bonded out differently. The 80386 and
80386SX were a similar deal. Get over it. You *got* what you
paid for.
Exactly, in other words, a crippled chip.

Boohoo. You paid for what was advertised. Did it do what was
advertised? Yes or no.
How does crippling a chip save them money? Are you suggesting they can
sell a crippled chip for more than the corresponding uncrippled chip?

Certainly. They can also sell the "uncrippled" chip for more
money, while keeping the cost the same.

You're likely to be really pissed to know that IBM sells its Regatta
servers (and the 'z' stuff) with all processors present and disables the
ones you didn't "pay" for. At least with the 'z' stuff you can
call up and have it enabled for a given time, for a price.

Oh, the horrors of capitalism!
 
L.Angel said:
Probably because it costs a lot more (IIRC from past discussions, it's
in the millions) to make another set of photolith masks to make them
without these transistors totally. Thus it does make more sense to
disable them rather than make a totally different chip.

...and what about the transistors that are there only for
test/debug/performance monitoring? They're *never* going to be
used again. They're sooo lonely, boohooooo.

***FREE THE CAPTIVE TRANSISTORS***
Not to mention, with disable-able parts, a chip that fails partially
(i.e. one of the HT controllers, or part of the cache) could well be
ok as a crippled part.

Sure, but I don't think that's the primary issue. The point is
that you got what you paid for. How the manufacturer delivers
that advertised function shouldn't be of your concern.
 