Power and Heat


MedPR
If you step up the voltage in a wire by a factor of 100, how is heat affected?

A. No change
B. Heat loss increases by a factor of 10000
C. Heat loss decreases by a factor of 10000
D. Not enough information to tell

Answer (and my question) are in white: C, heat loss decreases by a factor of 10,000 because the Joule heat generated is I^2R, and since P=IV, a 100-fold increase in V means a 100-fold decrease in I. How do you know to use I^2R? I know that P=I^2R=IV, but if you just used P=IV you would get that the heat doesn't change, since the 100-fold increase in V and decrease in I cancel each other out.
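A quick numeric sketch of the intended reasoning, in Python. It assumes the delivered power P and the wire resistance R stay constant (the hidden assumptions the answer relies on); both values below are made-up placeholders, not from the question:

```python
# Transmission-line sketch: delivered power P and wire resistance R are
# assumed constant (both values are illustrative, not from the question).
P = 1_000_000.0   # watts delivered to the load (assumed)
R = 0.1           # wire resistance in ohms (assumed)

for V in (1_000.0, 100_000.0):   # step the line voltage up by 100x
    I = P / V                    # same power drawn at higher voltage -> less current
    loss = I**2 * R              # Joule heating dissipated in the wire
    print(f"V = {V:>9,.0f} V   I = {I:>7,.1f} A   loss = {loss:>11,.1f} W")

# Output: loss drops from 100,000.0 W to 10.0 W, i.e. by 100^2 = 10,000x (answer C).
```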

 
Why would current decrease in the wire if voltage increases? They're proportional.

And why isn't D correct? What if there is 0 resistance?

 
It IS true that all three equations are mathematically equivalent, but conceptually they are not. If you plug numbers into them consistently, you'll get the same results. But in a conceptual problem like this, treating them as interchangeable will get you the wrong answer.

This source explained it to me quite well:
http://www.physicsclassroom.com/class/circuits/u9l3d.cfm

Sorry to send you off to another site, but it's a complex question. Did it help?

Edit: But in case you don't have the time... I think this question really focuses on making you realize the inverse relationship between voltage and current at constant power: voltage goes up, current goes down proportionally. Although, as chiddler points out, you're making a lot of assumptions in this scenario that I think the AAMC would spell out. (I think they would add "given that resistance remains constant.")
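To make "mathematically equivalent but conceptually different" concrete, here's a tiny sketch (all numbers are assumed, not from the question) of how naively plugging the stepped-up supply voltage into P=V^2/R points the wrong way, while tracking the current does not:

```python
# 'Mathematically equivalent, conceptually different': the formulas only
# agree if every symbol refers to the same circuit element. Illustrative values.
P = 1_000_000.0               # constant delivered power in watts (assumed)
R = 0.1                       # wire resistance in ohms (assumed)
V1, V2 = 1_000.0, 100_000.0   # supply voltage before/after the 100x step-up

# Correct: the wire carries I = P/V, so its loss is I^2 * R.
loss1, loss2 = (P / V1) ** 2 * R, (P / V2) ** 2 * R
print(loss2 / loss1)                # 0.0001 -> loss falls 10,000x

# Tempting but wrong: plugging the SUPPLY voltage into V^2/R treats the
# whole supply voltage as if it dropped across the wire.
print((V2**2 / R) / (V1**2 / R))    # 10000.0 -> points the opposite way
```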
 
Not really. I already knew everything that link is talking about. I'm not sure exactly why P=IV doesn't work. :(
 
I think it's because you can't assume that the current will stay the same once you up the voltage. The one thing you know will stay the same is the resistance, because that's an intrinsic property of the wire. So you have to use P=V^2/R.
 
Yes, but in this situation the power loss is less when you step up the voltage, right? Isn't that why power plants use step-up transformers, to reduce the power lost to resistance?

What I am confused about is this: when you step up the voltage while keeping the power the same, how does the current decrease? V=IR says they should rise together.

And in that case, if you step up the voltage and the current went up proportionally, why wouldn't the power loss increase?
 
You can't increase the voltage through a wire without also increasing the current. The wire's resistance doesn't change unless you add a resistor or change the wire.

Power plants use step-up voltages to reduce the amount of heat lost in the wire. Lots of heat loss = less efficient transmission = it costs the electric company more money to send out the same amount of power.
 
Okay, I think I had it wrong. I did some reading. V=IR you really only use when you're concerned with parts of a circuit and voltage drops. In this case P=IV is king, and the two are not necessarily 100% connected to each other, I think.

So in P=IV, if you want to keep the power through the wire the same and you increase the voltage, the current will decrease. That is the key fact, apparently. This is the power sent THROUGH the wire.

When you want to consider the power LOST BY the wire, you can combine it with V=IR to factor in the resistance of the wire.

In P=I^2R, since your increase in voltage decreases the current (to keep the power through the wire the same), the loss is proportional to the square of the current. So if you step up the voltage by a factor of 2, you decrease the current by a factor of 2 and thus decrease the power LOSS by a factor of 4.

I'm still fuzzy as to why you can't apply that to P=V^2/R, since an increase in voltage would seem to mean an increase in power loss.

EDIT: I think P=V^2/R cannot be used with the supply voltage because the V in that equation is the voltage drop across the wire, not the voltage supplied. The voltage drop depends on the whole circuit (or on a known voltage at the end of the wire, before it hits a transformer). We don't know that, and it isn't the same voltage as the one the power plant steps up to decrease the current.
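If it helps, here is a small Python sketch of that EDIT. It treats the supply voltage and the wire's own voltage drop as separate quantities; all the numbers are illustrative assumptions, not anything from the thread:

```python
# Distinguish the SUPPLY voltage from the voltage DROP across the wire.
# Delivered power P and wire resistance R are assumed constant; values
# are illustrative.
P = 1_000_000.0   # watts pushed through the line (assumed)
R = 0.1           # wire resistance in ohms (assumed)

for V_supply in (10_000.0, 20_000.0):   # double the supply voltage
    I = P / V_supply                     # current needed for the same power
    V_drop = I * R                       # Ohm's law across the wire itself
    loss = V_drop**2 / R                 # identical to I**2 * R
    print(f"V_supply = {V_supply:,.0f} V   V_drop = {V_drop:.1f} V   loss = {loss:,.1f} W")

# Doubling V_supply halves I and therefore halves V_drop, so V_drop**2/R
# falls by 4x. P=V^2/R is fine -- but only if V is the drop, not the supply.
```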
 
> So in P=IV, if you want to keep the power through the wire the same and you increase the voltage, the current will decrease. That is the key fact, apparently. This is the power sent THROUGH the wire.

I don't understand the point you're making here. Are you just saying this for contrast?

> In P=I^2R, since your increase in voltage decreases the current (to keep the power through the wire the same), the loss is proportional to the square of the current.

I'm not understanding this either. You're saying that if power is to remain the same with increasing voltage, current must decrease. How does current decrease? By increasing resistance. But increasing resistance increases power loss.

> EDIT: I think P=V^2/R cannot be used with the supply voltage because the V in that equation is the voltage drop across the wire, not the voltage supplied.

The edit is interesting. I didn't think of that.
 
> I'm not understanding this either. You're saying that if power is to remain the same with increasing voltage, current must decrease. How does current decrease?

Well, I am basing this off the fact that power stations pump up the voltage on power lines in order to reduce the current through the line as much as possible, so that the power lost as heat via the internal resistance of the power line is as small as possible. So instead of supplying the line with a high current and low voltage to keep the same power output, they make it high voltage and low current.
 
I found this http://www.bsharp.org/physics/transmission

It might help you, as I am almost understanding it myself.

"This means the current drawn by the substation is I=P/V and the higher the transmission line voltage, the smaller the current. The line loss is given by Ploss=I²R, or, substituting for I,

Ploss = P²R/V²"
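In case the substitution in that quote reads too fast, written out (with P the power sent down the line, V the transmission voltage, and R the line's resistance) it is just:

```latex
P_{\text{loss}} = I^2 R
  \quad\text{and}\quad I = \frac{P}{V}
  \quad\Longrightarrow\quad
P_{\text{loss}} = \left(\frac{P}{V}\right)^{\!2} R = \frac{P^2 R}{V^2}
```

So for fixed P and R, multiplying V by 100 divides the line loss by 100^2 = 10,000.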
 
I figured out this disgusting question. The hidden assumption, which the answer reveals, is that power should remain constant.

So. Power source and wire. Imagine powering a city. Power should remain constant. But you increase voltage. What must necessarily happen?

By P=IV, this tells us that current must decrease.

So now we have a bunch of wire with decreased current because power must remain the same, but the voltage has been amped up 100x.

The voltage source has increased in voltage 100x, but the wire's current has decreased by 100x.* Therefore, the voltage drop across the wire must also decrease 100x (V=IR with R constant).

Now we can use any of the equations. P=IV, P=V^2/R, P=I^2*R. All give the same result. 10000x decrease!

*This is what sucks. The question specifically says to "step up the voltage in a wire." But screw them. This is the only circumstance in which this works.

As always, correct me if I'm wrong.
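For what it's worth, a short Python check of that argument. All values are made up for illustration; the point is only that the three formulas agree once V means the wire's own voltage drop:

```python
# Check that P=IV, P=V^2/R, and P=I^2*R agree for the wire itself when
# V is the wire's own voltage drop. All values are illustrative.
P = 1_000_000.0   # delivered power in watts, the hidden constant (assumed)
R = 0.1           # wire resistance in ohms (assumed)

for V_source in (1_000.0, 100_000.0):   # the 100x step-up
    I = P / V_source                     # current falls 100x
    V_wire = I * R                       # drop across the wire also falls 100x
    losses = (I * V_wire, V_wire**2 / R, I**2 * R)
    print(f"{V_source:>9,.0f} V -> losses: {losses}")

# Each line prints three identical numbers, and the second line's loss is
# 10,000x smaller than the first -- the 10000x decrease from the post above.
```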
 