Asus p5a w/ amd 450; Ok to run a little overvoltage??

  • Thread starter: Mel

Mel

Hi, I've got an

Asus p5a(atx) with bios 1007.a
Amd k6-2 450mhz (4.5 x 100)
256mb pc100 mem
Matrox g200 video
Buslogic bt948 scsi
Dec de500 ethernet
Ensoniq soundscape/vivo90

I just started using it last week, before that I had something really
old :-)

I use dos, win nt4, sometimes linux, but mostly Os/2.
I seem to be having trouble with Os/2.

With the cpu chip jumpered to 2.1v, the power management only shows
it at 2.0v, with a cruising temp of about 40 degrees. But while the
board and chip are still warming up, I have a very hard time trying
to get Os/2 to boot at all. (It says 2.1v on the chip, btw)

If I jumper the cpu chip to 2.2v, power management shows the voltage
flipping between 2.1v and 2.2v, with a cruising temp of about 36 degrees
(which doesn't sound very logical, that it should be lower ...)
If I do this, Os/2 seems to boot just great.

But I was wondering if running the chip just a little bit over voltage
was going to be a Really Bad Idea (tm). I'm not over-clocking, and I
want this pc to last me a few years.

So, is this a safe thing to do??

Thanks.

--
 
If I understand you correctly, I think you would be fine; you're not really doing much at all. But I would be concerned, sort of, about the PSU.
 
Mel astounded us with:
[original post snipped]

I recall a setting in the BIOS that was only for OS/2.
IIRC, something to do with a 64 meg boundary on the RAM
You set it for more than, or less than...
 
Mel said:
[original post snipped]


First off, regardless of what any tech manual says, those AMD 450s are
usually marked with the vcore voltage (and I believe that 2.2 volts is
pretty normal). It can actually vary a bit. I did some experimenting and
found that changing the voltage by plus or minus 0.1 volt did not harm the
chip. I ran some benchmark tests and found the cpu ran best at the
manufacturer's rated voltage, but a variation of 0.1 volt made little real
difference.

Since a cpu draws a fixed amount of power for any specified task, and power
is the product of current and voltage, if you reduce the voltage the
current will have to increase. Without getting into additional electronic
theory, suffice it to say that increased current will result in increased
temperature.

BTW: I once forgot to change the vcore jumper for a machine with an AMD
cpu and ran it at 2.8 volts. A few weeks later the cpu burned out, but that
was really a substantial over-voltage.
 
Since a cpu draws a fixed amount of power for any specified task, and power
is the product of current and voltage, if you reduce the voltage the
current will have to increase. Without getting into additional electronic
theory, suffice it to say that increased current will result in increased
temperature.

Most of what you said was fine, but this is totally wrong. The cpu does not
draw a fixed amount of power for a specific task. If you reduce the voltage,
the current will also be lower. That reduces the amount of power the
processor will use, and the heat is lower. The speed is partly determined by
how fast the processor can switch states. A higher voltage will help that.
If the voltage is too high, the power dissipated in the chip is too much and
the insides of the chip melt or break down.

To overclock some processors it is helpful to raise the voltage, but you run
the risk of melting down the processor. A bigger heatsink/fan helps out in
this case.
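
(To put rough numbers on this, here is a small Python sketch using the usual first-order CMOS dynamic-power approximation, P ~ activity * C * V^2 * f. The capacitance and activity figures are invented for illustration; nothing here was measured on a K6-2.)

def dynamic_power(v_core, f_hz, c_switched=60e-9, activity=0.2):
    """Approximate CMOS dynamic power in watts (illustrative figures only)."""
    return activity * c_switched * v_core**2 * f_hz

for v in (2.0, 2.1, 2.2, 2.3):
    # at a fixed 450 MHz clock, power falls roughly with the square of vcore
    print(f"Vcore {v:.1f} V at 450 MHz -> ~{dynamic_power(v, 450e6):.1f} W")

The point being made: at a fixed clock, power falls roughly with the square of the core voltage, rather than staying fixed.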
 

I recall a setting in the BIOS that was only for OS/2.
IIRC, something to do with a 64 meg boundary on the RAM
You set it for more than, or less than...

Oh, I know what you're talking about. I've been given 2
completely different answers as to what that's all about:
(1) That it only applies to the really ancient versions of
Os/2 (like 2.1 or before)
(2) That it only applies to the latest version, when used
with certain BIOSes.

I'm not sure which is true. I've got a memory check utility
that tells how much my Os/2 can see. And it's all there, whether
or not I have that bios switch set.

Thanks

--
 
[previous reply snipped]

Hi, I just thought of one idea for why power management looks like it's
running cooler with a slightly higher voltage. I recall hearing that the
way temperature sensors work is by increasing resistance as the heat rises.
But for a single-function circuit like that, not getting enough voltage
might be registered as an apparent increase in resistance, and so
interpreted by the bios as the cpu being hotter.

Or maybe that's not it, I don't know.

Another question I had was whether I should trust the board's voltage
sensors. It started out by saying the voltage was low, but what if the
sensor is wrong, rather than the power regulator being a little low?

Any thoughts?

Thanks
--
 
Ralph Mowery said:
The cpu does not draw a fixed amount of power for a specific task. [snip]

My statement was based purely on my knowledge of ohm's law...
and not on my knowledge of how a cpu works...so I certainly could have been
wrong.

At any rate...it is heat which will kill a cpu
 
philo said:

My statement was based purely on my knowledge of ohm's law...
and not on my knowledge of how a cpu works...so I certainly could have been
wrong.

At any rate...it is heat which will kill a cpu

Even your knowledge of ohm's law is faulty. Unless you are dealing with some
oddball semiconductors that have a 'negative resistance' region, if the
voltage is reduced the current will be reduced also. The device may not
work if this happens. Computers work by switching on and off. The faster
the switching speed, the more power is used in the chip. Think of it as a
light bulb (forgetting about the hot versus cold resistance). If you switch
it on and off at a rate of once per second it will stay cooler than if you
switch it off and on 100 times per second. The apparent light output will
also be greater. You still use the same voltage and current, but the amount
of power used is greater due to the greater amount of time the light is on.

As far as processor chips go, the switching speed is one thing that
determines the amount of power used, all other things being equal. If you
keep the core voltage the same, start off at a low processor speed, and
then increase the clock speed, the power used will go up and so will the
heat. At some point the dv/dt (switching speed for a given voltage) will
not be able to keep up. If the voltage is then raised, the dv/dt will be
satisfied. This can be raised to the point where the amount of power
dissipated in the device starts to 'melt' the internal workings of the
processor. If you could keep the processor in liquid nitrogen, the speed
could be very fast before the heat was too much for the processor.
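
(The clock-speed half of that argument, in the same rough first-order model as the earlier sketch; the capacitance and activity values are again made up purely for illustration.)

ACTIVITY = 0.2      # fraction of gates switching per cycle (made up)
C_SWITCHED = 60e-9  # effective switched capacitance in farads (made up)
V_CORE = 2.2        # volts, held constant

for f_mhz in (300, 350, 400, 450, 500):
    # P ~ activity * C * V^2 * f: at fixed voltage, power grows with clock speed
    watts = ACTIVITY * C_SWITCHED * V_CORE**2 * (f_mhz * 1e6)
    print(f"{f_mhz} MHz at {V_CORE} V -> ~{watts:.1f} W")

With the voltage held constant, the modelled power scales linearly with the switching frequency, which is the "faster switching, more power" point being made here.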
 
Even your knowledge of ohm's law is faulty. [snip]

My knowledge of ohm's law is not faulty, but my command of the English
language leaves a bit to be desired... Let me start over:

Let's say (just for an example) that a cpu is performing a task, and that
during that task we measure overall power consumption and find it to be
15 watts. If the voltage was dropped slightly but the cpu was performing a
task that still consumed 15 watts, then the total current would have to be
higher. That is ohm's law. Higher current for the same resistance will
result in greater heating. Again, ohm's law. But the error I made was
*assuming* that if the voltage was dropped, the cpu would still draw the
same amount of power per given task.
 
My knowledge of ohm's law is not faulty... [snip] If the voltage was dropped
slightly but the cpu was performing a task that still consumed 15 watts,
then the total current would have to be higher.

As I mentioned, unless the device is operating in a 'negative resistance'
region, how can the same device use more current if the voltage is lowered?
The resistance is the same. The power will also be lower. There is no way
to make it draw more current at the lower voltage.

I will agree that if you are using a heater that produces 15 watts and cut
the voltage, you have to raise the current if you still want 15 watts of
heat. The only way I know of to do that is to lower the resistance, but you
cannot do that with the processor. As the voltage is lowered, the power
used will be lower. At some point the voltage will be low enough that the
processor cannot switch fast enough.
 
Ralph Mowery said:
As I mentioned, unless the device is operating in a 'negative resistance'
region, how can the same device use more current if the voltage is lowered?
[snip]

First off, what you said is quite true: if, for example, you lowered the
voltage across a resistor, the current would drop.

However, as I did a poor job of explaining, for a given amount of power,
the lower the voltage, the higher the current:

i.e.: 15 watts = 5 volts x 3 amps

15 watts = 4.5 volts x 3.333 amps

So if the cpu is using the *same* amount of power, the lower the applied
voltage, the *higher* the current draw.

Now here is another example: let's say we have a motor that lifts a 150#
weight 10 feet.

Whether the motor is supplied with 100 volts or with 120 volts, to lift the
150# weight 10 feet it does exactly the same amount of work, but it would
have to draw more current at the lower voltage!

So what I did was compare the cpu to a motor. Instead of lifting the
weight, the cpu would have to perform some equivalent task at two different
applied voltages. Of course, a cpu is considerably more complex than a
motor, so my analogy could certainly be wrong.
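
(Philo's arithmetic, spelled out in a few lines of Python. This only illustrates the *assumption* that the load draws a fixed amount of power, which is exactly what is in dispute for a CPU; the 15-watt figure is his example number.)

FIXED_POWER = 15.0  # watts, the example figure

for volts in (5.0, 4.5, 4.0):
    amps = FIXED_POWER / volts      # I = P / V when power is held fixed
    print(f"{FIXED_POWER:.0f} W at {volts:.1f} V -> {amps:.3f} A")

5 V gives 3 A and 4.5 V gives 3.333 A, matching the figures above.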
 
So if the cpu is using the *same* amount of power, the lower the applied
voltage, the *higher* the current draw. [snip]

How can it draw more current at a lower voltage? I am not saying that it
would not have to, just what reasoning are you using to get it to use the
same amount of power at a lower voltage?

I can see that you still do not grasp the power part. You will not have 15
watts in the processor if you reduce the voltage. The current will drop and
you will have less than 15 watts. If you take a DC motor and apply a higher
voltage to it, it will probably run faster. It will also draw more current
doing it. You use more power for a shorter time to do the same amount of
work.

Semiconductors, when used as switches, work in a different manner. In
simple terms, at a higher voltage they can switch from an off to an on
state faster. Any excess voltage at the same clock speed is just wasted in
the form of more heat for no more work being done. If a processor is using
15 watts to run at 1 GHz, then raising the voltage a few tenths of a volt
will not cause it to do any more work. That is, it will not run faster
until the clock speed is also changed. The excess power is going to produce
heat without any more work being done.
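
(The two competing models in the thread, side by side, as a small Python sketch. The 2.2 V / 15 W starting point and the simple P ~ V^2 scaling at fixed clock are illustrative assumptions, not measurements of any real chip.)

V_NOMINAL, P_NOMINAL = 2.2, 15.0   # volts, watts (example figures)

def constant_power_current(v):
    # philo's assumption: the chip draws the same power regardless of voltage
    return P_NOMINAL / v

def cmos_model_current(v):
    # the behaviour Ralph describes: at a fixed clock, power scales ~ V^2
    p = P_NOMINAL * (v / V_NOMINAL) ** 2
    return p / v

for v in (2.0, 2.1, 2.2):
    print(f"{v:.1f} V: constant-power model {constant_power_current(v):.2f} A, "
          f"fixed-clock CMOS model {cmos_model_current(v):.2f} A")

Under the first model the current rises as the voltage drops; under the second, both the power and the current fall.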
 
The above should be self-explanatory: to do 15 watts' worth of work with a
lower voltage, a *higher* current is required.

How can it draw more current at a lower voltage? I am not saying that it
would not have to, just what reasoning are you using to get it to use the
same amount of power at a lower voltage?

To do the *same* amount of work it has to draw more current at a lower
voltage; or if the voltage were raised, to do the *same* amount of work,
less current is required.

I can see that you still do not grasp the power part. You will not have 15
watts in the processor if you reduce the voltage. The current will drop and
you will have less than 15 watts.

What you said would be true of a resistor, which is a *passive* device;
however a cpu is an *active* device and a motor is a *dynamic* device.
Again, I am not an expert on cpu's by any means, so I do not know if my
analogy is necessarily true.

If you take a DC motor and apply a higher voltage to it, it will probably
run faster. It will also draw more current doing it.

If the voltage to a motor is raised *for the same amount of work*, the
current will be less. Believe me, I work with DC motors on a daily basis.

You use more power for a shorter time to do the same amount of work.

Power is strictly a matter of voltage times current, or watts. When you
bring time into the equation you are talking about watt-hours. Work, in the
mechanical sense, is a product of force times distance; time is not used
when talking about work.

In other words, if you lift that 150# weight a distance of ten feet, the
amount of work is identical whether you do it in an hour or in a minute.
That's simple physics (semester 1).

Semiconductors, when used as switches, work in a different manner. In
simple terms, at a higher voltage they can switch from an off to an on
state faster. Any excess voltage at the same clock speed is just wasted in
the form of more heat for no more work being done. If a processor is using
15 watts to run at 1 GHz, then raising the voltage a few tenths of a volt
will not cause it to do any more work.

If you raise the voltage and do not do any more work, then the current draw
would have to be less... at least with motors. But again, comparing a cpu
to a motor is not necessarily correct.

That is, it will not run faster until the clock speed is also changed. The
excess power is going to produce heat without any more work being done.

That makes sense... I think you are right there, especially if you go way
over the cpu's rating. However, that said, I think that if you vary the
voltage *slightly*, within the general vicinity of the cpu's rated voltage,
there can be changes in the cpu's overall efficiency.

I have run benchmark tests on an AMD 450 with a vcore of 2.2 volts and
found (as should not be surprising) that it benchmarked best right at the
factory rated vcore voltage.

But of course I did not take current readings... it would be interesting to
see what actually happened.

Put it this way: think of me as a dc motor expert and a cpu dummy... let's
talk about cooling fans, ok <G>
 
My knowledge of ohm's law is not faulty... [snip] Higher current for the
same resistance will result in greater heating. Again, ohm's law.
Your knowledge of Watt's law is faulty. Heat loss is directly related to
watts, not current. The heat produced by 2 devices drawing 15 watts is the
same regardless of whether it is a device pulling 15 amps at 1 volt or
1 amp at 15 volts. Basic physics and electronics.

JT
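
(JT's point as bare arithmetic, in Python: the wattage is the same either way.)

for volts, amps in ((1.0, 15.0), (15.0, 1.0)):
    print(f"{amps:g} A at {volts:g} V -> {volts * amps:g} W dissipated")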
 
Your knowledge of Watt's law is faulty. Heat loss is directly related to
watts, not current. The heat produced by 2 devices drawing 15 watts is the
same regardless of whether it is a device pulling 15 amps at 1 volt or
1 amp at 15 volts. Basic physics and electronics.


Yes, 15 watts is 15 watts, and therefore of course that would produce the
same amount of heat in total.

But look at it this way:

In any electrical circuit, you have a power source, the device, and the
*wiring*.

Simply put, you have an equivalent circuit consisting of two resistors: one
resistor is the device itself, the second is the wiring supplying the
device.

So, for example, you have a total dissipation of 15 watts. In an ideal
world you'd want your device to be consuming the full 15 watts and the
wiring to consume 0 watts.

Now going to this example:

3 amps x 5 volts = 15 watts

Since E = I R, the *total* resistance is

5 volts / 3 amps = 1.66 ohms

Now, just for the sake of this example, to keep the math easy, let me
assume that the device has a resistance of 0.66 ohms and the wiring has a
resistance of 1 ohm.

Since power = current squared x resistance, it can be seen that at 1 ohm
power is equal to current *squared*; thus as resistance increases, power
dissipation increases exponentially.

Thus, to minimize heating losses, it's always best to maximize voltage and
minimize current. That's the reason the electric company uses such
extremely high voltage lines to transmit power across the country.

Here is an old story. I can't verify the truth of it, but it should make
the point.

Way back in the old days, Thomas Edison and Henry Ford were good friends.
Edison was willing to supply DC power to the Ford plant, and since DC
voltage cannot be stepped up by transformers, if a DC generator were to
supply 240 volts (for example) over a distance of several miles, the copper
wires would have to be *extremely* heavy gauge.

On the other hand, Westinghouse, who was supplying AC, which can be stepped
up and down easily by the use of transformers, could raise the voltage to
thousands of volts to send power through relatively small wires, then step
it back down.

As the story goes, Ford was all set to go with his buddy Edison until they
realized there was not enough copper on earth to supply a huge amount of
power at a low voltage over a long distance.

Probably just a fable, but hopefully I made the point. That's one of the
reasons the automotive industry went from 6 volts to 12 volts many years
ago.
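
(A variation on the wiring example above, worked in Python: the same 15 watts delivered to the device, at different supply voltages. The 1-ohm wiring resistance is the deliberately exaggerated figure from the example, not a realistic cable.)

R_WIRING = 1.0    # ohms, exaggerated example figure
P_DEVICE = 15.0   # watts we want dissipated in the device itself

for v_device in (5.0, 15.0, 50.0):
    i = P_DEVICE / v_device        # current the device draws at that voltage
    p_wire = i**2 * R_WIRING       # I^2 * R loss heating the wiring
    print(f"{v_device:4.0f} V at the device: {i:.2f} A, {p_wire:.2f} W lost in the wiring")

Same delivered power, but the I-squared-R loss in the wiring falls from 9 W to 0.09 W as the voltage goes up, which is the transmission-line point.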
 
[previous reply snipped]

Your story is interesting, but an attempt to weasel out of what you said.
You still made the statement that higher current equaled higher heat. You
said nothing about supply wires. You also ignore that any circuit with
properly sized supply wires will have an insignificant loss in the supply
wires. If your supply wires are getting hot, they are too small. Basic ohms
law and engineering.

JT
 
Your story is interesting, but an attempt to weasel out of what you said.
[snip]

I'm glad you liked the story, at any rate. I had a thought in there
somewhere, but it seems to have been lost.

At any rate, let me say that at this point let's forget about the cpu
entirely. It's obvious my analogy is no good, so I shall withdraw from
beating that dead horse.

But if I may get back to motors for a minute, at least that is something I
understand:

For a motor doing an identical amount of work, due to I squared R losses,
there will be more heat generated at a lower voltage because the current
will have to be higher. As I said earlier, I work with DC motors every day.

The ones I work with are powered by large industrial batteries, and I have
empirical results, as I have to take a lot of voltage, temperature, and
current readings.

Let's start out with a fully charged 36 volt battery. The open circuit
voltage is about 38 volts when fully charged; under load it may drop to 35
volts with a current draw of 150 amps.

Now, as the motor gets used, the voltage on the battery drops, but for the
motor to lift the *same amount of weight the same distance*, it takes the
same amount of power. Since the voltage drops, the current rises. As the
battery nears discharge, the voltage under load drops to maybe 31 volts or
so, and the current rises to about 170 amps.

This is what's commonly known as a "vicious cycle", because the more the
voltage drops, the higher the current, and of course, the faster the
voltage will drop. If the voltage were to drop any lower than about 31
volts under load, the current would rise so high that the extreme rise in
temperature would burn the motor out! Again, that's due to I squared R
losses: as the voltage drops, the current (and heat) rise exponentially.

Now, the analogy I had used before merely equated 'work' done by a cpu to
'work' done by a motor, and since I am not in a position to be able to
analyze cpu current draw, I will have to keep my big mouth shut on that one
and say I probably had no business commenting.

However, my comments about power, current draw, heat, etc. are all true;
not only does it follow the textbooks, I've had plenty of hands-on
experience.

If you still do not agree, I think I'm going to just stop arguing, as it
does not seem to be getting anywhere <g>
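
(Philo's forklift-battery numbers, under his constant-power assumption for the same lift, in Python. The winding resistance is a made-up value purely to show the trend; the 35 V / 150 A reading is his.)

R_WINDING = 0.01            # ohms, illustrative only
P_LIFT = 35.0 * 150.0       # watts, from the "35 V at 150 A" reading above

for v_under_load in (35.0, 33.0, 31.0):
    amps = P_LIFT / v_under_load
    heating = amps**2 * R_WINDING
    print(f"{v_under_load:.0f} V under load -> {amps:.0f} A, ~{heating:.0f} W of I^2*R heating")

The 31 V case works out to about 170 A, matching the figure quoted, and the resistive heating rises with the square of that current.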
 
[JT's reply snipped]
You are wasting your time with this thread. I have given up on this. He
does not understand the simple concept that if you reduce the voltage of a
device that is rated at a given wattage, the current will be reduced along
with the wattage. That is, if you have a 100 watt light bulb rated at 120
volts, it will draw a certain amount of current. If the voltage is reduced,
the current will also be reduced, and the light bulb will not be able to
draw any more current to maintain the 100 watt rating. It will become less
than a 100 watt bulb.
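
(The light-bulb example in numbers, treating the bulb as a fixed resistance and ignoring the hot-versus-cold resistance change mentioned earlier; the figures follow directly from the 100 W / 120 V rating.)

V_RATED, P_RATED = 120.0, 100.0       # the bulb's rating
R_BULB = V_RATED**2 / P_RATED         # about 144 ohms, assuming fixed resistance

for v in (120.0, 110.0, 100.0):
    i = v / R_BULB                    # Ohm's law: the current falls with the voltage
    print(f"{v:.0f} V -> {i:.2f} A, {v * i:.1f} W")

Reduce the voltage and both the current and the wattage drop; the bulb cannot pull extra current to stay at 100 W.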
 
Ralph Mowery said:
He does not understand the simple concept that if you reduce the voltage of
a device that is rated at a given wattage, the current will be reduced
along with the wattage. [snip]
Yes, I totally understand that, but a light bulb or resistor is a *passive*
device.

I fully know that if you reduce the voltage across a resistor, the current
will drop!

However, I am *not* talking about a passive device.

I am specifically talking about *work*.

To do a specified amount of *work*, to get the same amount of *power*, if
you reduce the voltage, the current drawn by the device must increase!

I will repeat this one more time:

If a motor lifts a 150# load 10 feet, it will require a specific amount of
power. Regardless of the voltage or current, it takes X amount of power to
lift the 150# weight 10 feet.

If the voltage is high, the current will be low; or if the voltage is low,
the current is high.

P = IE

I hope I got my point across this time.

If not, just open any physics or basic electronics book.
 