If a video card's specifications say that it needs a minimum of 250 watts
and my power supply's maximum output is 250 watts, does that mean the card
might not run at full speed, or that it might put a strain on the computer?
You'll get better answers if you specify what video card you're
looking at. Not knowing that, I'd say the card itself probably uses
only about 20 watts; the 250 W figure covers the whole system, which
is why they don't recommend a larger power supply. People grossly
overestimate the power consumption of computers. Here are a couple of
computers and the measured input power to the power supply:
Pentium 2.4C Northwood
AOpen AX4C Max II
512 MB RAM
eVGA FX5500 256MB AGP
MyHD MDP-120
FusionHDTV II
WinTV 401
1 Seagate ST3200822A 200 GB Hard Drive
2 Seagate ST3300831A 300 GB Hard Drives
NEC ND-2500A DVD+/-RW Drive
Floppy Drive
Power Supply: Works W64BF-SBL 400W
Input Power (CPU 0 - 1 %): 152 Watts; PF: .68
(HT CPU1+2 95 %): 170 Watts; PF: .69
Athlon XP 2500+ Barton
Epox EP-8RDA+
512 MB RAM
ATI AIW 7500 Radeon AGP
FusionHDTV II
2 160 GB Hard Drives
CD-RW Drive (48X)
Floppy Drive
Power Supply: Enhance ENP-2120H 200W
Input Power (CPU 0 - 1 %): 123 Watts; PF: .66
(CPU 100 %): 138 Watts; PF: .69
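To put those wall readings in perspective, here's a rough sketch in Python of how input watts relate to what the components actually draw. The 70% supply efficiency is an assumption on my part (typical for supplies of that era, but it varies with load); the 170 W / 0.69 PF figures are the full-load numbers from the Pentium box above.

```python
def dc_output_watts(input_watts, efficiency=0.70):
    """Estimate DC power delivered to the components from measured
    wall input, assuming ~70% supply efficiency (an assumption)."""
    return input_watts * efficiency

def apparent_power_va(input_watts, power_factor):
    """Apparent power (VA) drawn from the wall, from real watts and
    the measured power factor."""
    return input_watts / power_factor

# Pentium 4 box at full load: 170 W input, PF 0.69
print(round(dc_output_watts(170)))          # ~119 W actually reaching the parts
print(round(apparent_power_va(170, 0.69)))  # ~246 VA apparent draw
```

So even the loaded Pentium system is only pulling around 120 W of DC power, well under its 400 W supply's rating, which is the point: a "250 W minimum" on a video card box is sized for the whole system with plenty of margin.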