Opteron gives a thrashing to Xeon...once again

  • Thread starter: nobody
The little lost angel said:
Certainly so if the high ASP parts consume all their capacity. My
point about why should they bother with dog food was on the pretext
that the high ASP stuff doesn't consume all their capacity. Especially
since I doubt the demand for their high ASP parts would max out their
capacity except for unexpected spikes. Does anybody have figures to say
otherwise?

Say the high ASP only uses up 80% of capacity, then it makes sense to
use whatever's left to make dog food parts even if it makes them only
$1 per piece (profit, not priced at $1)

After all, $1 million in extra profit from dog food parts is still
$1 million, compared to letting 20% of the plant's capacity sit idle
just because they sold all the high end parts the market will take.


Erm tell Dell that? :PpPpP
I guess Tigerdirect would be considered dogfood.
But then they seem to be doing alright with AMD supplies for DIY'ers.
Never bought a ready made or see the point of doing so.
 
Arnold Walker said:
I guess Tigerdirect would be considered dogfood.
But then they seem to be doing alright with AMD supplies for DIY'ers.
Never bought a ready made or see the point of doing so.
I have to admit they (the parts supplied from TigerDirect) are all single-
and dual-244-processor motherboard-based 2U rack mounts, with a light touch
of single 820 and 16? processor-based motherboards. I prefer the Systemax
version barebones when building for the engineering computer guys.
AMD-based computers offer a budget-priced alternative to Silicon
Graphics, etc., the 64-bit "ole timers". Them "Mormon country" guys know
what they are doing there in Salt Lake.
Unsure of Microsoft, since Engineer Pro (the software was 64-bit 15 or 20
years ago, much less now) still runs better in UNIX/Linux.
AMD had less trouble than Intel with OS compatibility issues.
Same story on the lower-level AMD Athlon/Sempron vs. Intel
(Dell-computer-level processing...).
Same story on Nvidia graphics.
 
The little lost angel said:
Certainly so if the high ASP parts consume all their capacity. My
point about why should they bother with dog food was on the pretext
that the high ASP stuff doesn't consume all their capacity. Especially
since I doubt the demand for their high ASP parts would max out their
capacity except for unexpected spikes.

Remember that there is a significant delay between "hmm, we need
some dog food to sell" and the dog getting fed. OTOH, what if we
make one chip and, um, back-end customize it for the target market?
Does anybody have figures to say otherwise?

Certainly. AMD execs. ;-)
Say the high ASP only uses up 80% of capacity, then it makes sense to
use whatever's left to make dog food parts even if it makes them only
$1 per piece (profit, not priced at $1)

That's assuming that you can predict what the optimum mix will be
in, say, four months.
After all, $1 million in extra profit from dog food parts is still
$1 million, compared to letting 20% of the plant's capacity sit idle
just because they sold all the high end parts the market will take.

I don't see a glut (with commensurate price crash) of AMD64s on the
market either.
Erm tell Dell that? :PpPpP
Ok. ;-) Now if you want to stuff Mikey in a garbage can somewhere
would you do it in the dog food aisle of the supermarket?
 
I guess Tigerdirect would be considered dogfood.

They are.
But then they seem to be doing alright with AMD supplies for DIY'ers.
Never bought a ready made or see the point of doing so.


I have, a decade ago. I'd never do it today (other than a laptop,
of course).
 
Say the high ASP only uses up 80% of capacity, then it makes sense to
use whatever's left to make dog food parts even if it makes them only
$1 per piece (profit, not priced at $1)

After all, $1 million in extra profit from dog food parts is still
$1 million, compared to letting 20% of the plant's capacity sit idle
just because they sold all the high end parts the market will take.

That only makes sense if you can't save at least $1M by shutting down
that idle 20% of the plant. Keeping a modern fab up and running is an
expensive proposition. You need to be making enough profit on the
low-end to at least pay for all the costs involved. You also should
maybe consider what else you could be doing with that 20% fab
capacity, i.e., can you become a contract fab for other companies? Fab
space of the quality that AMD and Intel can offer could be worth quite
a premium.
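The trade-off described above (fill idle capacity with $1-profit parts vs. shut it down) can be sketched with toy numbers. Everything here is hypothetical and purely illustrative; real fab cost structures are not public:

```python
# Toy model of the idle-capacity trade-off described above.
# All numbers are made up, for illustration only.

def low_end_profit(idle_wafer_starts, chips_per_wafer, profit_per_chip):
    """Profit from filling otherwise-idle capacity with low-end parts."""
    return idle_wafer_starts * chips_per_wafer * profit_per_chip

def shutdown_savings(idle_fraction, annual_fab_cost, variable_share):
    """Savings from idling that capacity instead.

    Only the variable share of fab cost is saved; most of the cost
    (depreciation, cleanroom upkeep) is fixed whether the lines run or not.
    """
    return idle_fraction * annual_fab_cost * variable_share

fill = low_end_profit(idle_wafer_starts=100_000,
                      chips_per_wafer=200,
                      profit_per_chip=1.0)        # $20M
idle = shutdown_savings(idle_fraction=0.20,
                        annual_fab_cost=500_000_000,
                        variable_share=0.15)      # $15M

print(f"fill with low-end: ${fill / 1e6:.0f}M, idle instead: ${idle / 1e6:.0f}M")
# With these made-up numbers, filling the capacity wins; flip the
# assumptions and shutting down wins -- which is exactly the point above.
```

The deciding variable is how much of the fab's cost is actually avoidable by idling a line, which is the "can't save at least $1M by shutting down" condition in the post.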
Erm tell Dell that? :PpPpP

Dell is in a slightly different position than AMD in this regard.
Dell doesn't make dog-food parts, they just sell them. This means
that Dell's overhead and capital costs are MUCH lower. They do have a
certain per-unit price that they pay to purchase pre-made computers
from China, and then they have the whole cost of sales and support of
those computers, but they don't have to actually design or build
anything.

While Dell has gone around talking the talk about keeping jobs in the
US and opening new manufacturing facilities in North Carolina, I am
99% certain that most of what you'll find coming out of that site is
NOT low-end stuff. Things like workstations and servers (which are
almost always config-to-order builds) are MUCH cheaper to assemble
close to where they will be sold, since the volume of each exact
configuration is so small. But when it comes to the $299 PC, that is
a high-volume, one-size-fits-all type of PC, and THAT is what you hire
the likes of Foxconn to manufacture.
 
I would think so. Given the commoditized nature of PCs nowadays,
even the junk end is worth something. They already have the fabs, too
much capacity just to produce high ASP stuff. If they don't use the
full capacity, they are effectively losing money on the investment.
Even $1 per junk CPU will add up to quite a lot given the volume. At
the very least, it could help pay for the next fab or upgrade.

$1 profit per CPU that Dell sells translates to about $40M a
year (~35M computers per year with only a small handful having
multiple processors). The cost to operate a modern fab is quite a bit
more than that. Intel or AMD could get by with selling, *maybe* 10%
of their volume with only $1 profit if they can offset that with a lot
of chips at $10-$50 profit and at least 10% of the chips with $100+
profit, but that's about it.

At a *VERY* rough guess I would say that the break-even point for AMD
would be somewhere around $10 profit/CPU as an average. Intel's
break-even point should be much lower (higher volume lets them offset
some R&D and capital costs more easily), but they have several other
unprofitable businesses to support as well as MUCH higher expectations
from their shareholders (if AMD makes 10% profit on CPUs it's seen as
a big success for the company, if Intel makes 40% profit it's seen as
a HUGE disappointment).
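As a sanity check on the arithmetic above, here is a quick sketch. The unit count and per-tier profit figures are the post's own rough estimates; the exact split across tiers is my assumption:

```python
# Rough profit-mix arithmetic from the post above.
# Volumes and per-chip profits are illustrative guesses, not real data.

units = 40_000_000  # ~35M Dell PCs/year plus some multi-CPU boxes

# $1 profit on every chip at Dell-style volume:
print(f"${units * 1 / 1e6:.0f}M/year")  # roughly the $40M cited above

# A mix: 10% of chips at $1, 80% mid-range ($30, midpoint of the
# $10-$50 band), 10% at $100+ profit.
mix = [(0.10, 1), (0.80, 30), (0.10, 100)]
avg = sum(share * profit for share, profit in mix)
print(f"average profit per chip: ${avg:.2f}")
# Comfortably above the ~$10/CPU break-even guessed for AMD above,
# which is why a small slice of $1 parts is tolerable in the mix.
```

The point of the sketch is that the average is dominated by the mid- and high-profit tiers, so a thin layer of near-zero-margin parts barely moves it.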
Also, if you capture a sizable share of the market, I think in some
ways it gives more credibility as well as brand awareness. Honestly,
if AMD still held only 4% of the market like they did in the early/mid
'90s, would big companies trust them to stick around with the parts for
the big ticket items in 5 years time, regardless of how good the
processor is?

Now this is very true, and I think that's a lot of the reason why
Intel is willing to put up with the junk end of the market. In fact,
I think this is one lesson that Intel actually has learned since
1997. Back then they tried to pretty much eliminate the low-end stuff
when they brought out the PII and tried to ditch the PentiumMMX. The
market responded by jumping all over AMD's K6 chips and all of a
sudden the near-dead AMD was a competitor once again. AMD wasn't
making any money, but at least people knew who they were.

Right now AMD has the luxury of not needing to be in this market. The
low-end of computer parts is a TOUGH place to compete in, especially
for North American or European companies (Indian and especially
Chinese companies tend to do much better here, for a variety of
reasons). However, sometimes it's a necessary evil that companies must
face up to. Intel is in that position now, AMD might be in the
not-too-distant future.
 
Now this is very true, and I think that's a lot of the reason why
Intel is willing to put up with the junk end of the market. In fact,
I think this is one lesson that Intel actually has learned since
1997. Back then they tried to pretty much eliminate the low-end stuff
when they brought out the PII and tried to ditch the PentiumMMX. The
market responded by jumping all over AMD's K6 chips and all of a
sudden the near-dead AMD was a competitor once again. AMD wasn't
making any money, but at least people knew who they were.

In fact, both Intel consumer processor brands came to life as the
result of Intel being forced to defend against AMD. The reason the
Celeron brand came to life is pretty much described by Tony. And the
soon-to-be-buried Pentium was meant to be the 586 - the natural
continuation of the x86 line of CPUs. But Intel's problem was that as a
result of lengthy legal battle (OK, only one of many implications
thereof) AMD retained the right to reverse-engineer and produce any
x86 Intel CPU, as long as it was sold not as i486 (for instance) but as
Am486. AMD became quite good at this - their 486DX2-80 were faster
than i486DX2-66 by a good margin and about by the same margin cheaper,
and were drop-in replacements. So Intel didn't want this to repeat
with 586. Therefore, "Pentium" - Greek for "Five" with Latin suffix
added.
These were only two of many Intel moves forced by AMD competition,
including the premature birth of the Pentium in Socket 4 - Intel's first
attempt at a space heater, the whole Rambust saga, SSE - the belated
answer to 3DNow!, and so on. Though admittedly the playing field was not
level, in many senses. Intel always had to hold back the final deadly
blow, because Intel needed AMD - as a near-dead weakling, but not quite
dead. Quite a few times AMD was so cheap that Intel could easily have
bought them outright; they probably had enough cash on hand to do a
hostile takeover even when AMD was at its all-time high in 2000. But the
very day AMD gave up the ghost, the sights of the trust busters in
Washington DC and all around the world - the EU and Japan for sure, and
probably some smaller markets as well - would be trained firmly on Intel,
with possible consequences on the Standard Oil scale.

NNN
 
In fact, both Intel consumer processor brands came to life as the
result of Intel being forced to defend against AMD. The reason the
Celeron brand came to life is pretty much described by Tony. And the
soon-to-be-buried Pentium was meant to be the 586 - the natural
continuation of the x86 line of CPUs. But Intel's problem was that as a
result of lengthy legal battle (OK, only one of many implications
thereof) AMD retained the right to reverse-engineer and produce any
x86 Intel CPU, as long as it was sold not as i486 (for instance) but as
Am486. AMD became quite good at this - their 486DX2-80 were faster
than i486DX2-66 by a good margin and about by the same margin cheaper,
and were drop-in replacements. So Intel didn't want this to repeat
with 586. Therefore, "Pentium" - Greek for "Five" with Latin suffix
added.

That's not 100% accurate. AMD was specifically forbidden from reverse
engineering ANY future Intel CPUs, regardless of what they were
called. They were given the right to continue selling their 486 chips
(though by the time the settlement was finished AMD had completed
their own 486 design). It didn't matter what Intel was going to call
their next generation processor, AMD was definitely NOT going to be
able to copy it. They were granted the right to use the Pentium (but
not PentiumPro) bus, but the processor cores had to be of AMD's own
design.

The 'Pentium' name stemmed from a separate but related argument in
the courts over trademarks, not patents or reverse engineering or
anything like that. The courts ruled that Intel (or any other
company) could NOT trademark a number. So while they could (and did)
trademark i486, there was no legal means that Intel had to prevent AMD
from selling chips labeled Am486. Words, on the other hand, could be
trademarked, hence the "Pentium" name. No one other than Intel is
allowed to sell a processor with the name "Pentium" or some
sufficiently close derivation of that.
These were only 2 of many Intel moves forced by AMD competition,
including premature birth of Pentium in Socket4 - the first Intel
attempt on space heater,

While the Socket 4 Pentiums weren't exactly great chips, I don't think
AMD did much of anything to rush this. That chip arrived as expected
and more or less on schedule.
the whole Rambust saga,

Rambus is a whole other deal and didn't really involve AMD all that
much. In fact, if anything it was more VIA that forced Intel to drop
their RDRAM plans, not AMD. AMD's market share grew by 1 or 2
percentage points as a result of the Rambus affair. VIA, on the other
hand, went from something like 10% of the chipset business up to 50%
of the business in about a year.
SSE - belated answer to 3dnow, and so on.

SSE and 3DNow! started out at about the same time and both were kind
of natural extensions to MMX. AMD just happened to reach the finish
line quicker with a much simpler design. Intel's answer, SSE, took
longer and was more complicated, though also slightly more complete
and with more room to grow.
 
On Sun, 01 Jan 2006 00:16:42 GMT, "(e-mail address removed)" wrote:


That's not 100% accurate. AMD was specifically forbidden from reverse
engineering ANY future Intel CPUs, regardless of what they were
called. They were given the right to continue selling their 486 chips
(though by the time the settlement was finished AMD had completed
their own 486 design). It didn't matter what Intel was going to call
their next generation processor, AMD was definitely NOT going to be
able to copy it. They were granted the right to use the Pentium (but
not PentiumPro) bus, but the processor cores had to be of AMD's own
design.
Possibly two independent legal processes from that remote past got mixed
into one in my memory, but the main idea still holds true: if there had
been no viable competition in the x86 universe, the word "Pentium" would
never have been invented.
The 'Pentium' name stemmed from a separate but related argument in
the courts over trademarks, not patents or reverse engineering or
anything like that. The courts ruled that Intel (or any other
company) could NOT trademark a number. So while they could (and did)
trademark i486, there was no legal means that Intel had to prevent AMD
from selling chips labeled Am486. Words, on the other hand, could be
trademarked, hence the "Pentium" name. No one other than Intel is
allowed to sell a processor with the name "Pentium" or some
sufficiently close derivation of that.


While the Socket 4 Pentiums weren't exactly great chips, I don't think
AMD did much of anything to rush this. That chip arrived as expected
and more or less on schedule.
I read someplace that Socket 4 was invented as a proof of concept and
supposed to be used with engineering samples, or something to that
effect. The real Pentium was meant to be Socket 5 and supposed to come
out only after another process shrink. Yet because the competition at
the 486 level became too heated, Intel decided to rush the release of
the Pentium as it was: huge, clumsy, expensive, and hot (not as hot as
Prescott, but still hotter than anything else on the market back
then). One piece of indirect evidence of the rush job was the FDIV bug.
Probably, if not for competition from AMD and, to a lesser extent,
Cyrix, Intel would've milked the 486 market for much longer before
releasing the Pentium as it was meant to be.
Rambus is a whole other deal and didn't really involve AMD all that
much. In fact, if anything it was more VIA that forced Intel to drop
their RDRAM plans, not AMD. AMD's market share grew by 1 or 2
percentage points as a result of the Rambus affair. VIA, on the other
hand, went from something like 10% of the chipset business up to 50%
of the business in about a year.
However, the lower price of PC133 vs. RDRAM added noticeably to the
Athlon's price/performance advantage. AMD licensed Rambus technology but
decided to go with SDRAM for a reason. IIRC, most benchmarks of that
time compared the Athlon to the P3 on Intel's own chipsets, not on the
cheaper, lower-performing VIA ones, and the prices of the complete
systems being benchmarked were provided most of the time as well.
The real RDRAM killer - DDR SDRAM - came out on an AMD chipset first.
And the first VIA DDR chipset was also for the Athlon, followed much
later by a Pentium counterpart. Again, my memory may not be perfect.

NNN
 