Do we (nearly) all use totally oversized power supply units?

  • Thread starter: Jason Stacy

Jason Stacy

Recently I bought a power meter for measuring the real power consumed by a computer.
I put this measuring device between the computer's power cable and the wall socket.

After some days of measuring I was really surprised.

My computer (AMD Athlon XP 64 + PCI Express) needs on average 82 watts!
The maximum during these days was 112 watts.

So why do I need a 350W power supply unit in my computer?
Moreover, 350W-400W seems to be the standard.

When one takes into account that a power supply unit operates most efficiently
only when it is near full capacity, it seems to me that all these 350W power
supply units are completely oversized.

150W would be sufficient.

All these gamers who run high-end video/graphics cards could buy a 400W PSU.

But for the "normal" office user this is bullshit.

Am I right? Or what is the reason for these high-capacity PSUs?

Jason
 
> Recently I bought a power meter for measuring the real power consumed
> by a computer. I put this measuring device between the computer's
> power cable and the wall socket.
>
> After some days of measuring I was really surprised.
>
> My computer (AMD Athlon XP 64 + PCI Express) needs on average 82 watts!
> The maximum during these days was 112 watts.
>
> So why do I need a 350W power supply unit in my computer? Moreover,
> 350W-400W seems to be the standard.
>
> When one takes into account that a power supply unit operates most
> efficiently only when it is near full capacity, it seems to me
> that all these 350W power supply units are completely oversized.
>
> 150W would be sufficient.
>
> All these gamers who run high-end video/graphics cards could buy
> a 400W PSU.
>
> But for the "normal" office user this is bullshit.
>
> Am I right? Or what is the reason for these high-capacity PSUs?

The rated power on a power supply is the total of all the outputs. It
is quite easy to reach the limit of, say, the 3.3V output while having
plenty to spare on -12V.

Replacing a ~300W PSU with a 450W one (I think) stopped a machine I had
from crashing fairly reliably any time the CD drive and two or more
hard drives were accessed at the same time.

The peak power used by any part in a computer is *much* higher than the
average. You want some headroom in your power supply to survive if
several parts should happen to peak at the same time. It is quite
possible that your machine never reached its peak power usage during
your test period.
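
To make the per-rail point concrete, here is a minimal Python sketch; all
rail limits and loads below are made-up illustrative figures, not taken
from any real PSU label:

    # A PSU's headline rating is roughly the sum of its per-rail limits,
    # but each rail has its own ceiling. All figures here are invented.
    rail_limit_w = {"+3.3V": 60, "+5V": 100, "+12V": 180, "-12V": 10}
    rail_load_w  = {"+3.3V": 65, "+5V": 40,  "+12V": 90,  "-12V": 1}

    print(f"total: {sum(rail_load_w.values())}W of {sum(rail_limit_w.values())}W")
    for rail, load in rail_load_w.items():
        if load > rail_limit_w[rail]:
            # well under the total rating, yet this one rail is overloaded
            print(f"{rail} over its limit: {load}W > {rail_limit_w[rail]}W")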
 
In comp.sys.ibm.pc.hardware.misc Jason Stacy said:
> Recently I bought a power meter for measuring the real power consumed
> by a computer. I put this measuring device between the computer's
> power cable and the wall socket.
> After some days of measuring I was really surprised.
> My computer (AMD Athlon XP 64 + PCI Express) needs on average 82 watts!
> The maximum during these days was 112 watts.
> So why do I need a 350W power supply unit in my computer?
> Moreover, 350W-400W seems to be the standard.
> When one takes into account that a power supply unit operates most
> efficiently only when it is near full capacity

Actually most are at peak efficiency around 70% load.
> then it seems to me that all these 350W power supply units are
> completely oversized. 150W would be sufficient.

Not at all.
> All these gamers who run high-end video/graphics cards could
> buy a 400W PSU.
> But for the "normal" office user this is bullshit.
> Am I right? Or what is the reason for these high-capacity PSUs?

For example, the startup current of HDDs. For each HDD you have to have
about 30W of reserves. The same goes for peak CPU load. And then, the full
wattage cannot be drawn from each output: the typical 350W PSU gives you
maybe 180W on +12V. If you take into account that most of the HDD
startup current comes from there and that most modern CPUs also draw
their power from there, then with 2 HDDs and a current CPU you can
already reach something like 150W when both disks are pulled from
sleep mode during high CPU activity. Add the CD-ROM with another 20W on
+12V and the mainboard with another 10W, and you have zero reserves
for a graphics card.
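
As a back-of-envelope check in Python, using the rough per-device figures
above (they are estimates for this scenario, not measurements):

    # +12V budget for the scenario above; all per-device watts are the
    # rough estimates from this post, not measured values.
    rail_12v_w   = 180   # what a typical 350W PSU offers on +12V
    hdd_spinup_w = 30    # reserve per disk during spin-up
    cpu_peak_w   = 90    # assumed peak for a then-current CPU
    cdrom_w      = 20
    mainboard_w  = 10

    worst_case = 2 * hdd_spinup_w + cpu_peak_w + cdrom_w + mainboard_w
    print(f"worst case: {worst_case}W of {rail_12v_w}W on +12V")
    print(f"left for a graphics card: {rail_12v_w - worst_case}W")  # 0W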

Arno
 
> Recently I bought a power meter for measuring the real power consumed by a
> computer. I put this measuring device between the computer's power
> cable and the wall socket.
>
> After some days of measuring I was really surprised.
>
> My computer (AMD Athlon XP 64 + PCI Express) needs on average 82 watts!
> The maximum during these days was 112 watts.
>
> So why do I need a 350W power supply unit in my computer?
> Moreover, 350W-400W seems to be the standard.

You have simply looked at the tip of a big iceberg. Most computer
assemblers have little idea how electricity works. Therefore they
hype only two numbers - watts and dollars. A power supply
marketed to the naive may be sold as 400 watts. But that same supply, if
provided by a responsible computer manufacturer, might be rated 280 watts. Did
the manufacturer lie? No. He is simply selling to a market where the
consumer does not do numbers. So he used a different watts number.

Meanwhile, a system also has other 'actions'. For example, CPU
specs say it might go from 1 amp to many tens of amps - and make that
demand in microseconds. The power supply must be able to feed the CPU's
on-board power circuitry with that sudden large current.

And then we have the many voltages. Which voltage needs how many
watts? We don't really know. So every voltage must be rated
sufficiently higher. Some voltages may end up providing few watts
whereas another provides many more. Therefore the supply is made even
larger so that any possible load is covered on every voltage.

Having posted this - well, sizing a power supply is only about one
watts number when the naive are doing engineering. In reality, the
current from each voltage is more important. The power supply
must provide sufficient current on each of its voltages. How do
computer assemblers solve this? Rather than learn which voltage is
drawing too much current, they shotgun it - "More Power". They recommend
more watts - bigger supplies.

I could continue. But the point should be obvious. We tend to
install "more power" rather than learn and solve a problem. That is just
another reason why brand-name computers 'appear' to have smaller power
supplies: the manufacturer is using honest numbers, and no single
voltage is massively overpowered.

I assume your measuring device also reports power factor - another
issue that makes a computer consume power less efficiently.
Meanwhile, European standards for power supplies also address that
power factor problem. In this past decade, as more "computer experts"
have less knowledge (eyes glaze over as soon as the numbers arrive),
many new standards have been created in Europe. American power supplies
do not address this power factor problem. It is not (yet) a serious
problem, but it is another reason why a supply may consume more power. So
many little things contribute to this 'iceberg'.
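
On the power factor point, the arithmetic is simply apparent power = real
power / power factor. A quick Python sketch, where the 0.65 power factor
is an assumed figure for an older supply without PFC, not a measurement:

    # Real vs. apparent power. The power factor below is an assumption
    # for an older non-PFC supply; measured values will differ.
    real_w = 112                 # the original poster's measured maximum
    power_factor = 0.65          # assumed, not measured
    apparent_va = real_w / power_factor
    print(f"{real_w}W real -> about {apparent_va:.0f}VA apparent")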
 
Jason said:
> Am I right? Or what is the reason for these high-capacity PSUs?

Some power units may seem oversized. But did you measure the current
drain during start-up of the machine?

Disks may require, during spin-up, from 4 to 6 times the power they use
during normal read/write operations, and from 10 to 15 times the power
they use when idle.

The same is true for CD/DVD units. Writing requires a lot of power.

And the drain of the CPU itself also varies over a wide range depending
on the load.
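
A quick Python sketch applying those multipliers; the 6W idle figure is a
made-up example for a desktop disk, not a drive spec:

    # Rough peak estimates from the multipliers above (all illustrative).
    idle_w = 6.0                         # assumed idle draw of one disk
    spinup_lo, spinup_hi = 10 * idle_w, 15 * idle_w
    print(f"a disk idling at {idle_w:.0f}W may briefly pull "
          f"{spinup_lo:.0f}-{spinup_hi:.0f}W during spin-up")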

Ciao
Giovanni
 
> Recently I bought a power meter for measuring the real power consumed by a computer.
> I put this measuring device between the computer's power cable and the wall socket.
>
> After some days of measuring I was really surprised.
>
> My computer (AMD Athlon XP 64 + PCI Express) needs on average 82 watts!
> The maximum during these days was 112 watts.
>
> So why do I need a 350W power supply unit in my computer?

Maybe you don't. But if you expand your computer, you might.
> When one takes into account that a power supply unit operates most
> efficiently only when it is near full capacity

That is not generally true. Looking at the power supply test in c't
24/2006, I see that the tested power supplies have 71%-83% efficiency
at 20% load, 76%-86% efficiency at 50% load, and 72%-82% at full load.

If you strive for efficiency, it's more important to get an efficient
power supply than one that is more than 50% loaded. BTW, if you shop
around for a new power supply (especially a tightly sized one), look
up at which voltage your motherboard consumes the most power, and
check how much power the PSU can deliver at that voltage.
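
To see what that difference means at the wall, a small Python sketch: the
70W DC-side load is an assumed figure for illustration, and the two
efficiencies are the 20%-load extremes quoted above from the c't test:

    # Wall draw for a fixed DC load at the worst and best 20%-load
    # efficiencies from the c't 24/2006 test. The 70W load is assumed.
    dc_load_w = 70
    for eff in (0.71, 0.83):
        print(f"at {eff:.0%} efficiency: {dc_load_w / eff:.0f}W from the wall")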
> 150W would be sufficient.

Maybe. A few stories:

- We recently bought a Dual-Xeon server. Our vendor could not get it
to boot with a 600W (IIRC) power supply, so this machine got an 800W
PSU. The highest power consumption that we measured on this machine
is 423W.

- My computer has a 350W power supply, like yours. It used to consume
up to 180W, and its PSU was sufficient for that. So for your 112W,
you probably can use something smaller, if you can get it.

- Several years ago a friend of mine bought a broken Elitegroup
motherboard for an Athlon or Duron system. He's the kind of person
who never does returns; instead, he tinkered around with various
combinations of CPUs, RAMs, and PSUs, and he did get this board to
run with exactly one PSU, a ridiculously small one (IIRC 125W); with
the stronger power supplies it failed.

Followups set to comp.os.linux.hardware

- anton
 
Power supplies are designed at 3x the total power consumption. In other
words, if the max draw is 3 amps, then the power supply is designed to supply
3 x 3, or 9, amps. I don't know who told you that power supplies work well at max
demand. That is not true. You want the max draw to be at about 1/3 of the
capability of the power supply.
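
In Python, applied to the original poster's numbers (the rule of thumb
above is the only input here; nothing else is assumed):

    # The 3x rule of thumb applied to the measured peak from the first post.
    max_draw_w = 112
    print(f"{max_draw_w}W peak -> about {3 * max_draw_w}W supply by this rule")

Which, interestingly, lands right around the 350W that the original poster
calls standard.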

Also, the power supplies in computers are designed for all PCI slots in
use, etc. In other words, maxed out with hardware and all of it working to the
max. If you are not using all the devices then the current draw is less.

BSEE
 
Jason Stacy said:
> Recently I bought a power meter for measuring the real power consumed by a
> computer.
> I put this measuring device between the computer's power cable and the
> wall socket.
>
> After some days of measuring I was really surprised.
>
> My computer (AMD Athlon XP 64 + PCI Express) needs on average 82 watts!
> The maximum during these days was 112 watts.
>
> So why do I need a 350W power supply unit in my computer?
> Moreover, 350W-400W seems to be the standard.
>
> When one takes into account that a power supply unit operates most
> efficiently only when it is near full capacity, it seems to me that all
> these 350W power supply units are completely oversized.
>
> 150W would be sufficient.
>
> All these gamers who run high-end video/graphics cards could buy a
> 400W PSU.
>
> But for the "normal" office user this is bullshit.
>
> Am I right? Or what is the reason for these high-capacity PSUs?

There is a big difference between average power (which is what you are
measuring) and transient power. Most drives that involve mechanically
rotating parts take a brief transient burst of power to get the disk
spinning. The power supply has to be rated such that it can handle all of
this at the same time, along with transients from the rest of the system
(main processor, graphics processor and so on and so forth).

It was such a large problem with SCSI hard disk drives that the system was
rigged so that a drive didn't spin up until it was told to. Drives could
then be started one at a time (bearing in mind that SCSI bus systems
supported up to 7 drives - 15 in later configurations).
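
A small Python sketch of why that staggering helps; the per-disk spin-up
and running figures are assumed for illustration, not drive specs:

    # Peak draw: all disks spinning up at once vs. started one at a time.
    # Both per-disk figures are assumptions for illustration.
    n_disks  = 7      # a full narrow SCSI bus
    spinup_w = 30     # assumed draw while a disk spins up
    run_w    = 8      # assumed draw once a disk is spinning
    all_at_once = n_disks * spinup_w
    staggered   = spinup_w + (n_disks - 1) * run_w
    print(f"simultaneous: {all_at_once}W peak; staggered: {staggered}W peak")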
 