George said:
Depends how Intel allocates its chipset production, but nVidia has no
"Mainstream PC" chipset for Intel systems. I believe that, though there
was a small amount of Crossfire stuff, ATI's chipsets were used mostly in
lower-cost integrated graphics Intel systems... though there's nothing to
stop the OEMs from continuing there: they don't have to buy Intel "mfr'd"
mbrds, which are sub-contracted anyway... as long as Intel doesn't pull
the FSB license. ;-)
I don't see Intel allocating its chipset production to low-cost
integrated graphics chipsets and, umm, leaving the high end to nVidia.
Even though integrated chipset motherboards have a reputation for being
cheap, I don't think the cost savings are achieved by reduced margins on
the chipsets; they're achieved by reducing the number of discrete chips
in the chipset. An integrated chipset is most likely more expensive to
make, and costs more, than a non-integrated one, but the overall
motherboard cost goes down due to fewer components. Also, the lower
overall price makes these boards more attractive to buyers, so more of
them are sold.
So for that reason, I don't see Intel being too unhappy about making
just integrated chipsets for its own platforms. However, I also think
that once AMD integrates a GPU into its CPU, all of its motherboards
will be integrated motherboards by definition, and the chipsets used
will become less important. You could then use cheaper non-integrated
chipsets, which will drive the cost of those motherboards down even
further. If Intel does likewise (which will probably only happen once
it has an integrated memory controller), then integrated chipsets will
disappear off the face of the planet.
Yousuf Khan