Hullooo, Folks!
A wee query regarding graphics cards:
Anyone know if the big power-hungry NVIDIA / ATI beasties scale back
their power consumption when outputting in 2D mode? Or do they just
carry on watt-gobbling regardless?
For example: many hardware review sites have graphs showing system power consumption in these cards' 3D 'idle' and 'load' states; say, for a powerful gaming card like the 'GeForce GTX 280' they may show 160 watts (idle) and 330 watts (full load) when running a top-notch game - but reviewers never seem to state the power draw of such cards when, say, you are using the computer for a more mundane '2D task' - writing a letter in MS Word, for instance. (Unless the idle state **is** the '2D' state - in which case I've just made a right public plonker of myself...) Also, the graphs do state 'system power consumption' - but these big cards account for the lion's share of that consumption when gaming or outputting HD movies, etc.
Yes, I know what you're thinking: "Tony, ya don't buy a friggin' GTX 280 to load up Word on the screen!"
But, when the gaming is done, what about those everyday chores we all have to do with our PCs... y'know, the grocery lists, letters, e-mails, payin' bills, porn - ohmygoddidisaythat-where'sthedeletekey! ;-)
Do these monster cards 'sense' the lighter visual requirement and
dramatically scale back their wattage appetites?
This scenario will become increasingly important as computers and
their graphics cards become more powerful, our tastes for digital
visual candy on our PCs grow, and as household electricity costs climb
relentlessly up the wattage ladder. Bet'cha it'll be the next 'big
thing' feature-wise.
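To put a rough number on it - using the 160 W idle figure from the reviews above and an electricity rate I've plucked out of the air purely for illustration - here's a wee back-of-the-envelope sum in Python, just as a sketch:

    # Back-of-the-envelope annual cost of the whole-system idle draw,
    # using the 160 W idle figure quoted above and a purely assumed
    # electricity tariff of 0.15 per kWh (adjust to your own bill).
    IDLE_WATTS = 160
    HOURS_PER_DAY = 8      # desk work, not gaming
    RATE_PER_KWH = 0.15    # assumed tariff - pick your own

    kwh_per_year = IDLE_WATTS / 1000 * HOURS_PER_DAY * 365
    print(f"{kwh_per_year:.0f} kWh/year, roughly "
          f"{kwh_per_year * RATE_PER_KWH:.0f} a year at that rate")

Even at 'idle' that's a fair few quid a year, which is exactly why I'd like to know how much lower the true '2D' draw actually goes.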
The boffins are rising to meet the challenge: there are cards like the new 'Asus EN9600GT MATRIX', which has 'built-in intelligence' that senses the visual task at hand and scales wattage up or down to suit; and some NVIDIA cards offer a similar 'Hybrid Power' capability in conjunction with GeForce mobos. That said, I've always thought that the big NVIDIA / ATI cards scaled their power consumption to suit the task anyway, but lately I've had my doubts about that assumption - hence, this post.
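For what it's worth, one rough way I suppose I could check this myself is to poll the card's reported clocks and power draw while the desktop sits idle, then again with a game running, and see whether the numbers drop back. A wee Python sketch along those lines - assuming a driver recent enough to ship NVIDIA's 'nvidia-smi' tool, which I'm not at all sure the GTX 280-era drivers do:

    # Poll the card's reported power draw and core/memory clocks a few
    # times; run it once sitting at the Word/desktop and once mid-game.
    # If the idle readings are much lower, the card is down-clocking
    # for '2D' work. Requires a driver that provides nvidia-smi.
    import subprocess, time

    QUERY = ["nvidia-smi",
             "--query-gpu=power.draw,clocks.gr,clocks.mem",
             "--format=csv,noheader"]

    for _ in range(5):
        print(subprocess.check_output(QUERY, text=True).strip())
        time.sleep(2)

If anyone has actually measured this at the wall, though, I'd much rather hear real numbers.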
Any advice appreciated.
Cheers, Tony McKee