John said:... the REAL scoop...
TheSmokingGnu said: Proprietary architectures make me a sad panda.
Let's just hope it's better than AMD's atrocious attempts at "quad"
core processing, else it'll be the end of both companies. Can't
compete in the market WITHOUT A FLANGING PRODUCT, guys!
TheSmokingGnu
chainbreaker said: You would think that the market would have learned something about
proprietary from Packard Bell. Guess not.
TheSmokingGnu said: Proprietary architectures make me a sad panda.
Let's just hope it's better than AMD's atrocious attempts at "quad" core
processing, else it'll be the end of both companies.
Anyone can fly, any time. Landings are a bitch though. ;^)
Before doing something radical, AMD needs to do better in the conventional
CPU market. AMD is trying to fly when it cannot even run fast.
John Lewis said:
TheSmokingGnu said: Proprietary architectures make me a sad panda.
HockeyTownUSA said: Guys, you didn't take the red pill, did you?
remember: 0104
Ham said: Can you explain how this architecture is any more proprietary than
the existing ones that are currently being used?
Or is Intel now considered a standards body?
The said: April fools!
TheSmokingGnu said: If, say, I purchase an Intel chip now, it's just an Intel chip. It
only does the CPU job, and when I buy a motherboard, I'm free to use
essentially any other competing product (even the motherboard!) with
it.
Same thing with existing AMD chips. I can buy that chip and /only/
that chip. Any other component is freely choosable by the consumer.
This architecture locks the consumer into a choice of video hardware,
and further subjugates them by forcing their upgrade path into,
surprise surprise, more of the same. If they wanted to switch
mid-year from an ATI setup to an nVidia one, they're SOL, thus
proprietary.
J. Clarke said: And having video integrated on the chip prevents you from choosing a
"competing motherboard" how?
Nope, you are forced to buy AMD's memory controller. On Intel it's
separate.
So you're saying that that architecture would make it physically
impossible to stick an nvidia video board into a PCI Express slot?
If the integrated video gives you more performance than a separate board
can, where's the problem? And if it doesn't, then how long do you think
it's going to last in the market?
TheSmokingGnu said: Nuh uh!
http://www.dailytech.com/article.aspx?newsid=3471
Posted very /not/ on April Fool's.
TheSmokingGnu
TheSmokingGnu said: It doesn't, that's not my point. This architecture forces the pairing
of CPU and GPU, that's the sticky bit. The point was that an
existing-architecture CPU purchase is just that, a CPU, not a CPU/GPU
combination.
Oh, well consider the nits picked. Next I suppose we can argue about
why I'm forced to use Samsung L2 cache, or why I can only use an
Award BIOS on my Giga-byte mobo.
My impression from the article was that the boards were radically
redesigned, and that the only upgrade path available was then either
to plug in another Fusion processor or else use a RenderX chip, and
that the board would not carry the necessary PCIe connections for
video cards.
If it is faster (and if they're using eDRAM, it may well be),
To some of you, I apologise for the posting date, but I needed to
cater for reception in a bunch of different time zones.
John Lewis