2 socket/4 core AMD FX system coming

  • Thread starter: Yousuf Khan
In comp.sys.ibm.pc.hardware.chips Mark said:
Unfortunately, this doesn't answer the question we are discussing. The issue
is whether a 4-core AMD FX system would be preferable to a dual-core Conroe
processor for home users.

For an *average* home user, either the 4-core FX or an Extreme Edition
Conroe is likely to be pissing money away (good only for bragging rights.)

That said, for typical home workloads, any decent 4 cores will likely
outperform 2 (even ridiculously high-end ones) for everyone but hardcore
gamers, because the big consumers of CPU cycles will be (A) multimedia,
(B) multitasking, and (C) general Windows responsiveness.

For hardcore gamers, it will totally depend on whether games take advantage
of multithreading beyond two threads. My guess is that once you're at two,
scaling to n isn't that hard, but I'm not a game programmer.
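The guess above — that going from two threads to n is the easier part — can be sketched in Python. This is a toy illustration of the structure, not game code: the `simulate_chunk` work is made up, and with CPython's GIL a thread pool shows the shape of the scaling rather than a real speedup.

```python
import os
from concurrent.futures import ThreadPoolExecutor

def simulate_chunk(entities):
    # Stand-in for per-chunk game work (AI, physics, etc.)
    return sum(e * e for e in entities)

def simulate(entities, workers=None):
    """Split one frame's work across however many cores exist,
    rather than hard-coding exactly two threads."""
    workers = workers or os.cpu_count() or 1
    size = max(1, len(entities) // workers)
    chunks = [entities[i:i + size] for i in range(0, len(entities), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(simulate_chunk, chunks))

if __name__ == "__main__":
    print(simulate(list(range(1000))))
```

The point is that once the work is expressed as chunks fed to a pool, the worker count is just a parameter — "exactly 2" would be the special case.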
 
In comp.sys.ibm.pc.hardware.chips Yousuf Khan said:
There's also the added benefit that if this technology works for FX's,
they'll work for other lower-ranked AM2 processors too, such as X2's and
even Semprons.

While in theory it should still work, who's to say which ones it will or won't
be enabled for? Have they actually said?
Of course, the first crop of 4X4 motherboards will be aimed at the
high-end, but perhaps in a few months, they'll have economy 4X4
motherboards too?

Perhaps! We can certainly hope.
 
Nate said:
While in theory it should still work, who's to say which ones it will or won't
be enabled for? Have they actually said?

I think an AMD executive has let slip that it will work with X2 CPUs
too. If it's working on X2 and FX, then what's to prevent it from being
enabled on single-core A64 and Sempron, all of which are running on AM2
sockets too?

Yousuf Khan
 
In comp.sys.ibm.pc.hardware.chips Yousuf Khan said:
I think an AMD executive has let slip that it will work with X2 CPUs
too. If it's working on X2 and FX, then what's to prevent it from being
enabled on single-core A64 and Sempron, all of which are running on AM2
sockets too?

Well, that's very good news, if there's no slip in the other direction before
such things actually become available... although dual Sempron/A64 will, I
think, remain a novelty, given the price cuts for X2s.

Now, two bottom-of-the-line X2 chips (3800+ or whatever's then-current)
sound like an exceptionally attractive option.
 
Nate said:
For hardcore gamers, it will totally depend on whether games take advantage
of multithreading beyond two threads. My guess is that once you're at two,
scaling to n isn't that hard, but I'm not a game programmer.

Those poor game designers. Now they'll not only not know whether
you've got built-in Intel graphics or a pair of 512MB PCIe cards
running SLI, they'll also not know if you've got 1, 2, or 4 CPUs in
your system. 8)

Would make me go straight to game-console land. 8)
 
chrisv said:
Those poor game designers. Now they'll not only not know whether
you've got built-in Intel graphics or a pair of 512MB PCIe cards
running SLI, they'll also not know if you've got 1, 2, or 4 CPUs in
your system. 8)

Would make me go straight to game-console land. 8)

It is not any easier there, considering the coming 8-CPU PlayStation...

Regards,
Yevgen
 
Evgenij said:
It is not any easier there, considering the coming 8-CPU PlayStation...

Certainly it is easier, though, since all PlayStation 3's will have the
same hardware.
 
For serious home power users, though - either work from home or hobbyist
multimedia, etc - a multicore can make a big difference. DVD encoding
while doing serious work can slow down the machine a lot, for one example,
and letting one core crank on that can really help with both
time-to-complete and performance of your other applications.

So, let's say you're running the usual suspects under WinXP - a few
browsers, Agent, an MP3 player, burning a DVD, and video encoding in
the background.

Does the system recognize the relative loads and balance them,
dedicating one core to your encoding while leaving the other to handle
everything else, or do different apps contend for the cores? Does the
user have any control over such things at all?

Sorry if this is a stupid question, but I have no insight into how
WinXP handles issues like this.

max
 
So, let's say you're running the usual suspects under WinXP - a few
browsers, Agent, an MP3 player, burning a DVD, and video encoding in
the background.

Does the system recognize the relative loads and balance them,
dedicating one core to your encoding while leaving the other to handle
everything else, or do different apps contend for the cores? Does the
user have any control over such things at all?

The user doesn't have much control at all over such things, but
WinXP (or Linux, or any other modern OS) is not just randomly sending
processes all over the place. All modern OSes use a certain degree of
"processor affinity" to keep a process (and threads as well) to a
particular logical CPU.

It's important to have some degree of processor affinity in any
multi-CPU setup (including a single P4 chip with Hyperthreading),
otherwise you'll tend to spend a lot of time thrashing data around in
and out of cache. In current CPUs at least 90% (often over 95%) of
memory accesses are handled by the L1 cache, so if a task is being
tossed back and forth between two processors, a lot of time gets
wasted on cache snoops and the like.

Now, of course, just HOW you decide what task gets loaded onto what
CPU is a whole other matter. A lot of time and effort is spent in
optimizing the task scheduler in an OS to tune this sort of thing. Of
course, every different hardware and software setup requires different
tuning here. A system with a single dual-core Core 2 Duo chip, with
its large shared L2 cache and common path to main memory, is going to
be a bit different from a pair of single-core Opterons, each with
separate L2 cache and separate local memory, and both are going to be
VERY different from a P4 with hyperthreading.

Similarly there are a lot of software situations. In your example
above you've got one single background task taking up a LOT of
processing time with a variety of other tasks with low CPU
requirements. This is quite different from something like a webserver
where you'll have MANY tasks all with a similar level of processing
requirements.

FWIW it is possible to set a particular application to "stick" to a
single logical processor in WinXP through Task Manager. Just
right-click on the process in question and select "Set Affinity".
Note that this option is only visible if you have a multiprocessor
system.
 
In comp.sys.ibm.pc.hardware.chips max said:
So, let's say you're running the usual suspects under WinXP - a few
browsers, Agent, an MP3 player, burning a DVD, and video encoding in
the background.

Does the system recognize the relative loads and balance them,
dedicating one core to your encoding while leaving the other to handle
everything else, or do different apps contend for the cores?

It tries to. Fairly well on the 2P, 1P/2-core, and 2P/4-thread systems I've
seen Windows run on. Fairly well on the first two for Linux, although my
experience with Linux and hyperthreading is a bit more mixed.

It looks likely we may get a 2P, 2-core-per-processor, hyperthreaded system
(dual Bensleys, to eval a platform prior to Woodcrest availability,
followed by a bunch of the same machine with dual Woodcrests in a couple of
months when available, if we like it) and I am very curious how 8x virtual
processors will do. This is for development/QA workloads, though.
Does the user have any control over such things at all?

Yes. It should not, normally, be necessary but can at times be useful
nonetheless. On a system with 2+ processors (real or virtual), right-clicking on a
process will let you "set affinity" which is the set of processors a given
process is allowed to run on.

One case where it's very handy is if you have a multithreaded application
that is otherwise prone to taking over the machine. The video encoding app
would be the likely suspect (although setting it to low priority would
likely also work; many such apps put long cpu-intensive processes on low to
begin with) ... at my work, we run local copies of sql server developer
edition, and setting it to run only on one core/virtual processor seems to
help avoid it "taking over the whole machine" at times.
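The low-priority route mentioned above can be sketched in Python. A minimal illustration, not a real encoder: the squares loop is a stand-in for the CPU-heavy work, the `encode` helper is invented for the example, and `os.nice` is Unix-only.

```python
import os
import multiprocessing

def encode(niceness: int) -> int:
    """Stand-in for a CPU-heavy background job (e.g. video encoding),
    run at reduced priority so interactive apps stay responsive."""
    os.nice(niceness)  # raising niceness lowers priority (Unix)
    return sum(i * i for i in range(100_000))

if __name__ == "__main__":
    with multiprocessing.Pool(processes=1) as pool:
        # Niceness 10: the encoder mostly gets CPU time the
        # foreground tasks aren't using.
        print(pool.apply(encode, (10,)))
```

Running the hog in its own low-priority process gets much the same effect as pinning it to one core: the scheduler hands the interactive tasks the CPU first.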
Sorry if this is a stupid question, but I have no insight into how
WinXP handles issues like this.

Nope, good question!
 
In comp.sys.ibm.pc.hardware.chips Tony Hill said:
FWIW it is possible to set a particular application to "stick" to a
single logical processor in WinXP through Task Manager. Just
right-click on the process in question and select "Set Affinity".
Note that this option is only visible if you have a multiprocessor
system.

Real or virtual. I believe there's also a cmd.exe command line option for
it, and there are various APIs (setting sql server affinity is done through
the sql server configuration, for example.)

I'm not aware of any easy user tool to do it on Linux, but there must be;
there are certainly APIs to do process affinity.
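For what it's worth, the usual Linux command-line tool for this is taskset(1) from util-linux, and Python exposes the same underlying sched_setaffinity(2) call directly. A quick Linux-only sketch:

```python
import os

# Which logical CPUs may this process run on right now?
before = os.sched_getaffinity(0)  # 0 = the calling process
print("allowed CPUs:", sorted(before))

# Pin ourselves to a single logical CPU -- the programmatic
# equivalent of Task Manager's "Set Affinity" on Windows.
one_cpu = {min(before)}
os.sched_setaffinity(0, one_cpu)
assert os.sched_getaffinity(0) == one_cpu

# Restore the original mask.
os.sched_setaffinity(0, before)
```

The same calls take any process ID, so a small script can pin a runaway background process just as Task Manager would.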
 
In comp.sys.ibm.pc.hardware.chips chrisv said:
Those poor game designers. Now they'll not only not know whether
you've got built-in Intel graphics or a pair of 512MB PCIe cards
running SLI, they'll also not know if you've got 1, 2, or 4 CPUs in
your system. 8)

My impression, as a programmer but not a game programmer, is that it's
harder to write things for "exactly 2" than it is for "2 or more." Threads
are light, and once you start using 'em, it's often for a lot of them.

Obviously, being able to depend on a single fast CPU is easier, but I doubt
we'll see the kind of rapid improvements in CPU speed we've seen in the past
few years for at least the next several years.
Would make me go straight to game-console land. 8)

Would you prefer 8 threads on the PS/3, 6 on the 360, or dunno-how-many on
the Wii?
 
The people who don't strain their home computers hard enough to require
multiprocessing won't need multiprocessing. Those who do strain them
hard enough to require multiprocessing need multiprocessing. Simple as
that! There's no single requirement that describes all home users.

This product is targeted at gamers. How many games can use 4 cores,
let alone need them?
Those that do require multiprocessing will initially go for single
multi-cores, then if they find even that's not enough, they will go one
step beyond and go for multi-processing & multi-cores. There's also the
added benefit that if this technology works for FX's, they'll work for
other lower-ranked AM2 processors too

You think so? I rather doubt it.
, such as X2's and even Semprons.
Who knows maybe a dual-Sempron might have some cost advantages over a
single X2 in some cases? Gives the consumer some level of flexibility.
Of course, the first crop of 4X4 motherboards will be aimed at the
high-end, but perhaps in a few months, they'll have economy 4X4
motherboards too?

Yes, right after those economy 16 socket motherboards show up...

Wake up Yousuf...

DK
 
My impression, as a programmer but not a game programmer, is that it's
harder to write things for "exactly 2" than it is for "2 or more." Threads
are light, and once you start using 'em, it's often for a lot of them.

That depends entirely on the algorithm(s). From my POV it's easier to get
something tangibly useful with up to 2 cores than it is with >2. Sure
there are many multi-threaded programs out there but how many with
substantially *simultaneous* parallel threads are there? Most of the
threads you see in an average Windows app are nothing to do with processing
efficiency.
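That last point can be made concrete: most threads in a typical desktop app exist for responsiveness, not simultaneous computation. A small Python sketch, where the background thread spends its time waiting on (simulated) I/O while the "UI" thread stays free — the file names and the `background_loader` helper are invented for illustration:

```python
import queue
import threading
import time

def background_loader(jobs: queue.Queue, results: queue.Queue):
    """Worker that blocks on I/O-like waits -- concurrency for
    responsiveness, not parallel number-crunching."""
    while True:
        job = jobs.get()
        if job is None:      # sentinel: shut down
            break
        time.sleep(0.01)     # stand-in for a disk/network wait
        results.put(job.upper())

jobs, results = queue.Queue(), queue.Queue()
t = threading.Thread(target=background_loader, args=(jobs, results))
t.start()
for name in ("track1.mp3", "track2.mp3"):
    jobs.put(name)           # the "UI" thread is never blocked
jobs.put(None)
t.join()
print(results.get(), results.get())
```

A second core buys little here; the worker is asleep most of the time. It's only the substantially simultaneous, CPU-bound threads that actually scale with core count.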
 