Wayne
Probably a dumb question, won't be my first. But other than disk drive
motors and fans, what computer component needs all the 12V amperage in
new high power supplies?
I have an older Antec TruePower 380 watt supply, which furnishes
28A 3.3V, 30A 5V, and 18A 12V. Runs fine with four disks and a DVD. I
do have to disable Cool&Quiet to prevent hangs with the X2 chip; I don't
know if that is power-supply related or not. It runs fine without C&Q, but
it started me thinking about power, and whether I need more.
My AMD X2 4800+ CPU runs on 1.35V. I always assumed this comes from the
3.3V rail, but I don't know how they increase the current to 67A. Could
only be done with a transformer, but I am not aware of one. A toroid, I
suppose? But 3.3V must be wrong; it must come from 12V, to get roughly a
12/1.35 = 8.9x current multiplier... so 67A is closer to 67/8.9 = 7.5 amps
at 12V. Is that conceptually right?
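Here's my back-of-the-envelope check in Python, assuming an ideal
(lossless) step-down converter feeding the CPU from the 12V rail; the 85%
efficiency figure at the end is just a guess on my part:

cpu_voltage  = 1.35   # volts at the CPU core
cpu_current  = 67.0   # amps, the worst-case number quoted above
rail_voltage = 12.0   # volts on the input rail

cpu_power = cpu_voltage * cpu_current            # about 90 W at the core
rail_current_ideal = cpu_power / rail_voltage    # about 7.5 A from the 12V rail

efficiency = 0.85                                # guessed converter efficiency
rail_current_real = rail_current_ideal / efficiency   # about 8.9 A

print(f"CPU power:           {cpu_power:.1f} W")
print(f"12V draw (ideal):    {rail_current_ideal:.1f} A")
print(f"12V draw (85% eff.): {rail_current_real:.1f} A")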
SDRAM memory runs at 2.6V or thereabouts.
My Nvidia video board chip (7600GS) is spec'd at 1.1V. Video boards
have extra power cables now, but I thought those were 12V?
Big new power supplies, like the 720 watt Enermax, typically provide
25A 3.3V, 30A 5V, and 80A or 90A in three or four 12V sources.
That's fewer 3.3V amps and the same 5V amps as my puny 380 watt unit, at
roughly half the power.
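Multiplying volts by rated amps for each rail gives a rough comparison (I
realize real supplies also have combined-rail limits, so the 12V maximum
can't all be drawn at once on top of everything else):

rails_antec   = {"3.3V": (3.3, 28), "5V": (5.0, 30), "12V": (12.0, 18)}  # my TruePower 380
rails_enermax = {"3.3V": (3.3, 25), "5V": (5.0, 30), "12V": (12.0, 80)}  # the 720 W Enermax, 12V rails combined

for name, rails in (("Antec 380 W", rails_antec), ("Enermax 720 W", rails_enermax)):
    print(name)
    for rail, (volts, amps) in rails.items():
        print(f"  {rail:>4}: {volts * amps:6.1f} W max")

So nearly all of the extra capacity in the big supply is on the 12V side,
which is what got me wondering.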
I can understand the isolation of multiple windings, but who uses all
that 12V amperage? Is it the source for the lower voltages? Maybe that
higher step-down ratio is how they get greater current out of a
transformer? But then what are the lower voltages used for?
What am I missing?
Generally, how are the 3.3V, 5V, and 12V allocated out to resources?
Thanks