Opteron 939 Price-Trends?

  • Thread starter: andrew.gullans

andrew.gullans

A month back, I read that AMD was slashing its Opteron prices to let go
of inventory before releasing socket M2. Lately, I've been watching
AMD CPU prices rather hawkishly (the darling being the 3200+ Venice).
Generally, we're watching stable prices with occasional dips, but
lately I've seen something rather alarming:

It appears that Dual-Core Socket-939 Opterons (Denmark) are slowly
going up in price. Recently, they've gone up about $17 (according to
Anandtech's pricegrabber and my own memory of pricewatch.com)....

Opteron 165 = $400
Opteron 170 = $500
Opteron 175 = $600
Opteron 180 = $700

.....or thereabouts. Are prices being inflated to reflect increased
demand in the shadow of reduced (or discontinued) supply, or is this
just resellers realizing that Opteron Denmarks are more
cost-effective than Athlon 64 X2s? ....or, is there something more
sinister afoot?

Has anyone else noticed a slight price increase, and should I lock in
an Opteron 175 or 180 now before the price skyrockets as these
discontinued superchips become rarer, or should I wait for prices to
start dropping again?
 

A few weeks ago AMD actually raised prices on all Opterons. Late last year,
rumor claimed that AMD was tightening up the retail channel for everyone
except legit server builders. Rumor also has it that AMD wanted to slow down
people buying 144s/148s and 165s and overclocking them to the freakin' moon,
getting the performance of CPUs costing a grand on the cheap! Or just paying
slightly less and getting the same or a better CPU with no overclocking.
Basically it was to "encourage" the enthusiasts to buy Athlon 64s rather than
Opterons.

I got lucky buying a 148 before the price went up. I got it to 2860MHz
Prime95 stable, faster than an FX-57, and I am quite happy with my purchase.
Today I would most likely get a 3700+ with 1MB of L2 and overclock that.
 
I'm watching the AMD Athlon 64 3800+. With a Gigabyte mobo
and 1 gig of RAM, that bundle is about $440. I think that will be about
optimum for gaming and multimedia. My tests (at work) are
showing me some ominous results with the so-called high-end
systems using SLI, HDR, and SM3.0. The games look good as slide
shows, but the in-game performance is not good at all. I see
crippled AI and constant screen hiccups and lagging. I have to
play with settings for days just to get back to what I can do on
an AMD Athlon 64 3000+ using an ATI 9800 Pro. Actually, the
9800 system is still my top gaming box. I'm guessing that the
most effective upgrade will be a small one to gain CPU speed,
lower latency, PCI-E, and a video card that stays compatible
with in-game standards and settings. Add in that most of us
will be moving to Vista fairly soon... meaning a year or more
of crashing and patching. I'm seeing that most of the big
guns are hardware-incompatible. The Opterons may be fast,
but they are running on crap mobos with high latency. My
latest X2 build (4800+) kicks the pants off a dual Opteron
I have that is only about a year old. Still, neither system, X2
or Opteron, runs games as well as my AMD Athlon 64 3000+
with an ATI 9800 Pro 128??? Explain that?? I have an
nVidia 7800 GT in the X2. It is OK, but that is about all. It is
faster than the 3000+, but it does not outperform the 3000+
in games. I'm looking at an ATI X850 PCI-E on a single AMD
Athlon 64 3800+. I suspect that will be optimum under the present
state of the art in games... and the move to Vista.

johns
 

Have you got the latest drivers for the Nvidia card? How about the
latest CPU speed governors? (That's the Linux term; I don't know what
Microsoft calls their Cool'n'Quiet drivers.) An X2 4800+ should never, ever
be slower than a 3000+. The 4800+ has a much faster clock rate, a bigger
cache for each core, and an identical memory system. Even if you are
using only one of the cores on the 4800+, your system will be faster. The
only thing I can think of that would cause the 4800+ to be slower
than a 3000+ is power management that hasn't been set correctly.
If the X2 4800+ were running at 1GHz and the 3000+ was running at its full
speed of 1.8GHz, then the 3000+ would be much faster. XP has a power
management control panel; make sure that you've set your processors to run
at full speed.
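On Linux the governor check above can be scripted by reading the standard
cpufreq sysfs files (a minimal sketch; the paths are the stock sysfs cpufreq
locations and may simply be absent on kernels without cpufreq support, which
the code handles by returning None):

```python
from pathlib import Path

def read_cpufreq(field, cpu=0):
    """Return a cpufreq sysfs value as a string, or None if absent."""
    p = Path(f"/sys/devices/system/cpu/cpu{cpu}/cpufreq/{field}")
    return p.read_text().strip() if p.exists() else None

gov = read_cpufreq("scaling_governor")   # e.g. "ondemand" or "performance"
khz = read_cpufreq("scaling_cur_freq")   # current clock, in kHz
print("governor:", gov or "cpufreq not available")
print("clock:", f"{int(khz) / 1e6:.2f} GHz" if khz else "unknown")
```

On Windows XP the equivalent check is the Power Options control panel: a
scheme like "Minimal Power Management" lets Cool'n'Quiet throttle the clock,
while "Always On" keeps the processor at full speed.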
 
When you say your 4800+ beats your dual Opteron setup, I'm wondering what
speed Opterons you are using, what video card is in the dual Opteron box, and
how much you are overclocking your 4800+, if any. How much RAM does each
system have, and are you running an even number of sticks so you get the
full data rate? What OSes? What motherboards are you running these on?
And when you say "they (meaning Opterons) are running on crap mobos with high
latency," I wonder what motherboards you mean?

BTW, my dual Opteron 252 setup is stable in every game and every piece of
software I've thrown at it. Running Windows XP Professional. It's fast, too,
with pretty much everything turned up to max in every game I play, and I play
at 1600x1200 in most games. But then a lot of this playability is due to my
video card being a 7800 GTX so we can't really compare systems equally since
you are running a 7800 GT.

As far as latency goes, have you run SiSoft Sandra's memory speed tests on
all your systems to compare memory bandwidth and latency? I'm getting a memory
bandwidth score of over 11,000, and I can guarantee you there is no X2 system
out there that will touch this, because at the moment, as far as I know, only
a dual- or quad- (or eight-way) Opteron system can do that kind of memory
bandwidth, using NUMA. So yeah, I'm hindered some by my RAM having to be
ECC/registered, but I make up for that with such massive bandwidth and each
processor having its own four banks of RAM.
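For a rough cross-check without Sandra, a crude copy benchmark gives a
ballpark bandwidth figure (an illustrative sketch only: it times a
single-threaded read-plus-write pass over a big buffer, nothing like Sandra's
optimized stream tests, so the absolute numbers will come out much lower):

```python
import time

def copy_bandwidth_mb_s(mb=64, reps=5):
    """Time copying a large buffer; return the best-case MB/s over reps runs."""
    buf = bytearray(mb * 1024 * 1024)
    best = float("inf")
    for _ in range(reps):
        t0 = time.perf_counter()
        _copy = bytes(buf)               # one full read + one full write
        best = min(best, time.perf_counter() - t0)
    # each pass moves 2 * mb megabytes (read the source, write the copy)
    return 2 * mb / best

print(f"~{copy_bandwidth_mb_s():.0f} MB/s")
```

Running the same script on each box at least makes the comparison
apples-to-apples, which is the point of the Sandra numbers above.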

It sounds like you have had problems with your dual Opteron setup, probably
due to some configuration issue, as General Schvantzkoph guessed. So please
do not be so quick to judge the inferiority of some type of part because it
didn't work for you. Easy to make that judgement (we all do it) but not
necessarily accurate.

Regarding your thoughts on getting an ATI X850 video card, I'd recommend
against that because it is one generation backwards. Or, if you consider
the jump from the X1800 to the X1900 a generation, then the X850 is TWO
generations old. IMHO, ATI's X850 was the card to compete with the 6800 Ultra.
So if I were you, I'd keep that 7800 GT you have, which is a great card!
Definitely superior to an X850! The reason your X2 with the 7800 GT is not
performing well is not the video card's or the CPU's fault. There is something
else going on.
--
Scotter
Tyan Thunder K8WE
Dual Opteron 252s (2.6GHz)
6 gig DDR400 RAM
XFX 7800 GTX 256 w/VGAsilencerV3
500 gig SATA2 Hitachi
Dual 24" Dell LCDs
550W power supply
-
 
I haven't looked at the power management settings for the X2.
However, it is not speed that I am talking about. It is overall
graphics quality and playability. My 9800 system is just beating
the crap out of the X2 with the 7800 card. I'm using the 81.98
driver and Coolbits to attempt to optimize settings, but the
best I can do is try to get back to what the 9800 does with
little or no tweaking except setting AGP to 4x and Fast Writes
OFF. The X2/7800 system glitches at door openings in the
game Far Cry, and it will also corrupt and crash if I set anything
to Very High. I've determined that is caused by auto-enabling of
HDR and SM3.0, which conflicts with AA and AF and also
limits the AI, much of which simply disappears. I know the AI
is crippled, because it is not crippled on my 9800, and I see
it. With this X2/7800, I should by default be able to run Far
Cry much better than on the 9800. Instead, I have a research
project going because of bad drivers which auto-enable
incompatible hardware features that the game does not support.
I switched to COD2, which is one of the latest games, to see
if I could get a better idea of what is going on, and what I
discovered is that the X2/7800 tries to run COD2 in a higher
mode, but does not do a good job. And simply putting the
9800 in DX7 mode totally passes up the 7800 in both graphics
and playability. I don't care what benchmarks are saying
about these new systems. They are not optimized, and
they run like crap as a result. That is a lot of money for
nothing as far as I'm concerned. Also, I'm now convinced
that the reviewers out there are just lying. They've been
paid off or something, and they weasel-word their reviews
to be highly misleading. Tom's Hardware needs its ass
kicked. I've listened to those jerks for the last time. From
now on, I'm going to conduct my own tests, and
then come out on the net and kick their butts. They need
to stop it with the benchmarks from crappy software
vendors, and just run the game and use their own eyes.
That is the only benchmark that works these days. Their
tests claim my 9800 is 5 times slower than the 6800 in
nearly every benchmark. That is total garbage.
A 6800 has to be babied to death to get it to run Far Cry
at all. I saw that myself, on two different versions of the 6800,
until I was just disgusted and sent both cards back for
restocking. Cost me about $50 each time, but no way
am I going to put up with that. The 9800 just murdered
both of those pigs in graphics quality and gameplay. There's
something rotten going on with nVidia.

johns
 
The something else is in the games themselves. The X2 / 7800
is trying to run incompatible rendering and shading that the
game for the most part simply doesn't support ... and the
7800 driver is not smart enough to back away from that, and
run the game properly. The dual Opteron I'm referring to
belongs to a research group that supposedly set it up
to run engineering calculations under Linux. Another group
could not afford a similar system to run the same software,
so I told them I could build them an X2 that might do what
they needed. The X2 beat the Opteron so bad it is not
funny. The software is ALGOR and Fluent. I'm guessing the
Opteron's problem is trying to send data all over the mobo,
and the latency is killing it. And then, I have an engineering
lab running AMD Athlon 64 3200+ systems. ALGOR runs
fine on those systems, and essentially matches the
performance of the Opterons. The X2 beats both by more
than a factor of 2. I just read that if I go to the single 3800+,
I am likely going to get the same or better performance
in these calcs as the X2. Benchmarks on the 3800+
are already topping any of the comparable X2 systems
in games. But, when nVidia designs hardware that
simply won't run in the games, and conflicts with game
settings, it doesn't matter what I put that 7800 in. I'm
still going to go back to my 9800 if I want to enjoy
myself. I suspect the X850 is the last of a compatible
generation that is designed to actually do well in the
games ... and not just be a bragging rights video card
that can't get out of its own way like I'm seeing with
the 7800.

johns
 