AMD 64 webcast

  • Thread starter: methylenedioxy

methylenedioxy

Just watched the first hour of this and have to say that I am totally
unconvinced by it. The launch started off as being about opening up the
imagination and how 64-bit would do that, but then showed things like
wireless DivX players, digital media "improving" (without showing how),
games having faster fps etc. (with 128-bit and 256-bit buses on graphics
cards, how on earth is a 64-bit CPU going to help? Is it going to start
rendering instead of the GPU? :s). All AMD did there was show a really
long, bad advert trying to repackage 64-bit computing as something beyond
32-bit, and doing it badly.
With DDR2 almost here, CPUs that still have a long way to go before
reaching breaking point, faster and faster FSBs etc. on 32-bit platforms
still to come, do we really need 64-bit computing? I don't think so, but
that may just be me. Another 2 or 3 years and I might move up, but we still
have to hit the end of the road for 32-bit first :)

Bad launch if you ask me anyway... Anyone else watch it?
 
methylenedioxy said:
Just watched the first hour of this and have to say that I am totally
unconvinced by it. [snip]

Bad launch if you ask me anyway... Anyone else watch it?

Hmmm... sounds familiar... didn't the same thing happen when we went from
16-bit to the 32-bit Pentiums?
 
methylenedioxy said:
Just watched the first hour of this and have to say that I am totally
unconvinced by it. [snip]

Bad launch if you ask me anyway... Anyone else watch it?

No, but I've always known there's no magic about 64-bit. I have a 300MHz
64-bit UltraSPARC standing here, doing no more revolutions than my dual
32-bit Celeron =)

64-bit is just an evolution, not a revolution.

/M
 
Hmmm... sounds familiar... didn't the same thing happen when we went from
16-bit to the 32-bit Pentiums?

My, aren't we behind the times? :)

32-bit in the x86 era started with the 80386, a great many years before the
Pentium premiered.

Of course, due to Microsoft being a slouch, it took until 1995 with Win95
before 32-bit computing started to come into its own in the PC universe
(and it took until the launch of WinXP, some seven years later, before it
became totally embraced across all segments of the market).
 
do we really need 64 bit
computing? I don't think so,

Then you meant "you" instead of "we", didn't you?

Of course, I'm really sure you didn't mean to massively crosspost
off-topic, either - right?
 
do we really need 64 bit
computing? I don't think so,

Then you meant "you" instead of "we", didn't you?

Of course, I'm really sure you didn't mean to massively crosspost
off-topic, either - right?

No, it was a rhetorical question which I was answering myself. Maybe you
should get a better grasp of the English language?
As for it being off-topic and crossposted, not really, as Nvidia are pushing
this technology along with ATI for their graphics cards, so not off-topic at
all.
 
still to come, do we really need 64 bit computing? I don't think so, but
that may just be me, another 2 or 3 years and I might move up, but we still
have to hit the end road for 32 bit first :)
As the Opteron 2GHz is faster than just about any current PC processor
in plain 32-bit mode, I am quite looking forward to seeing its gaming
performance on a 64-bit OS, and to the future faster models.
 
methylenedioxy said:
Just watched the first hour of this and have to say that I am totally
unconvinced by it. [snip]

Bad launch if you ask me anyway... Anyone else watch it?

I'm sure those who run very large databases disagree with you. 64-bit
helps those users out by speeding up database operations (and by letting
them address far more than 4GB of memory).

It's true that 64-bit is not really something for the desktop... not yet.

But for servers, it's a very welcome future :-)

Ah, I would love to have a CS server on a dual Opteron... but ah
well... money...
 
Here's a stupid question:

What is 64-bit computing?

People talk about it as being the next big thing, and with the G5 and
Opteron, it's on its way.

I was under the impression that it's the size of the instruction which
is executed by the CPU. The longer the instruction, the more complex
and varied instructions/actions can be executed in one clock cycle.

Is my understanding wrong?

Just curious,
path0021
 
methylenedioxy said:
Then you meant "you" instead of "we", didn't you?

Of course, I'm really sure you really didn't mean to massively offtopic
crosspost, either - right?

No, it was a rhetorical question which I was answering myself. Maybe you
should get a better grasp of the English language?

Yeah, and maybe YOU should note what newsgroup you're posting in.


As for it being off-topic and crossposted, not really, as Nvidia are pushing
this technology along with ATI for their graphics cards, so not off-topic at
all.

This newsgroup is about nVidia video cards, it's not about their nForce
chipset or what AMD is doing. Your post is off topic here.

TMC
 
I didn't watch the launch, so I can't comment much on it except for what you
are saying. It should be noted, though, that in 64-bit mode this CPU offers
much more than just wider integers: the main thing you get is access to 8
extra registers. The beauty is that all of these extra registers are
general purpose, i.e. the programmer can use them for whatever they want.

Intel CPUs (and compatible ones as well) have always had basically two
problems as I recall:

1) The instruction length could be variable.
2) The Intel CPU doesn't have all that many general-purpose registers.

#1 has been dealt with over the years and is basically a moot point now.
However, to maintain compatibility, the number of registers on the Intel
CPUs hasn't changed. Yes, they've added registers if I recall correctly,
but those have been for new instructions such as SSE2, meaning only SSE2
instructions can access the new registers. The other registers which have
always been there are fairly numerous, but only a precious few are general
purpose. Since AMD came out with a 64-bit CPU, they [wisely] decided to go
a bit further: they added 8 extra general-purpose registers that can be
used, and only used, in 64-bit mode.

Programs converted to 64-bit which can use these extra registers (let's
face it, Wordpad isn't getting any faster hehehe) have shown tremendous
potential for performance gains. The Counter-Strike server (but not the
client) and UT2003 have been converted to AMD 64-bit. Both have shown
improvements of 20-45% compared to their 32-bit counterparts. I think the
press releases are on AMD's website; check them out if you can find them.
A 20% gain in performance isn't something to gloss over in a game. I think
some other games have been converted as well, not sure though.

Not only have games been converted, but several apps have as well. Of
course databases have been converted and have shown improvement. But,
several animation applications have been converted (the kind of stuff used
to make the Star Wars and Lord of the Rings movies) and have shown large
improvements in performance, both from a memory-management standpoint and
from the use of the extra registers.

Something else to consider: Windows 64-bit. Because Windows is able to take
advantage of the extra registers, Windows itself will run faster. When
running 32-bit apps on Windows 64-bit, they should in turn run a bit
faster/smoother - probably not by much, though.

I guess the launch didn't talk much about the extra registers and such,
which would've left knowledgeable users wondering how 64-bit was going to
improve their games. However, 64-bit mode has some surprises to it that
64-bit software should be able to take advantage of. The future is quite
exciting.

Chris Smith
 
Here's a stupid question: What is 64-bit computing?

I was under the impression that it's the size of the instruction which
is executed by the CPU. [snip]

path0021

Not the instructions - the registers. In a 64-bit processor, the
general-purpose registers are 64 bits wide.

There would, of course, be new instructions to deal with the increased
size of the registers, but the instructions themselves can be any
width - at least on CISC processors such as the x86 family.
 
methylenedioxy said:
digital media "improving" (without showing how), games having faster fps
etc. (with 128-bit and 256-bit buses on graphics cards, how on earth is a
64-bit CPU going to help? Is it going to start rendering instead of the
GPU? :s)

A good CPU will do more than most people think... Try putting a Radeon 9800
Pro on an old Duron and expecting to get 200fps on anything!!!
 
Sounds like an Intel ad on TV here some months back. They like to advertise
their processors with all sorts of chicks dancing around to music, fast
moving graphical eq displays, then have some voice over crapping on about
unleashing the power etc. Seems to me all they showed was PC software and
some ass, no Intel processors, everything shown can be achieved just the
same without Intel.

But I guess the stupid masses don't see computers as a series of
interconnected components running a plethora of different applications; they
just think INTEL because that's what the flashy ad said. The same people
that dash down to their local department store and buy some Compaq, Dell or
Gateway with a 2.4GHz Intel and 128MB of PC133 on some integrated board,
then wonder why it runs like shite and can't be upgraded 12 months later.
Must be good, right? It has the latest 2.4GHz Intel!
 
spamtrap@localhost said:
As the Opteron 2Ghz is faster than just about any current PC processor
in just 32bit mode, I am quite looking forward to seeing its gaming
performance on a 64bit OS and the future faster models.

Also, I'm sure AMD left plenty of room for ramping up their CPU speeds.
If 2GHz is their launch speed and it's already in the ballpark of its 3+GHz
competitors, I can see a big future for them.
 
My, aren't we behind the times? :)

32-bit in the x86 era started with the 80386, a great many years before the
Pentium premiered.

Of course, due to Microsoft being a slouch, it took until 1995 with Win95
before 32-bit computing started to come into its own in the PC universe
(and it took until the launch of WinXP, some seven years later, before it
became totally embraced across all segments of the market).

Win3.1 had 32-bit disk access for the swap file :-)

Oh, and there was Win32s, so some 32-bit programs could run. Win numpty
five was still hybrid 16/32-bit.
 
JAD said:
http://english.aopen.com.tw/products/mb/

here comes some of the first offerings...
There's loads out already - Asus make them, Gigabyte make them, etc. (and
no, I am not getting confused with Opterons, which are an entirely different
socket!). They have also already been benchmarked and there is no difference
whatsoever from the current 32-bit CPUs; indeed the Intel P4 3.2GHz beats it
hands down (despite the crap on the AMD website with "their"
benchmarks)... here's a link to the benchmarks. Certainly not worth paying
the £500+ for the CPUs anyway - ridiculous prices, and for what? 1MB of
cache, whoopee :s
Can't find the link, but it's a French site...
 
< here comes some of the first offerings >

Left out "from AOpen".

Yeah, hype hype hype... not gonna be the guinea pig or the first one to
fork out the £500+.
 