# WTF is a "state function"?


#### DrMattOglesby

##### Grand Master
Moderator Emeritus
15+ Year Member
What is a "state function"?
In the context of a question I ran into while studying:

"Name the two state functions related to energy transformation in chemical reactions."

The answer says that energy and enthalpy are path independent and, as such, are state functions.

What does this mean in relation to entropy?
I think I'm fine understanding it as it relates to enthalpy (because the enthalpy value is the same regardless of whether a catalyst was used or not).

Can someone explain how entropy is path independent (i.e., a state function)?

Entropy is, in simple terms, a measure of disorder. So, let's say you add heat to water at 25°C until it gains enough kinetic energy to enter the gas phase, ending up as steam at 110°C. Its entropy has increased, because it has gone from more orderly (liquid) to less orderly (gas).

The change in entropy between water at 25°C and dihydrogen monoxide (H2O) vapor at 110°C doesn't depend on "how" the sample was heated. The difference is always the same.

E.g. if you heat 25 → 98 → 32 → 46 → 40 → 110, your change in entropy is the same as if you went straight from 25 → 110; thus it's "path independent."
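Here's a quick numeric sketch of that telescoping. Assume a constant heat capacity and ignore the liquid-gas phase change entirely (a real simplification, just to illustrate path independence); then dS = m·c·dT/T integrates to m·c·ln(T2/T1), and the entropy changes along the legs of any path sum to the same value as the direct route. The function name and default values below are made up for illustration.

```python
import math

def delta_S(T1_C, T2_C, m=1.0, c=4184.0):
    """Entropy change (J/K) heating m kg of water from T1_C to T2_C (Celsius),
    assuming constant heat capacity and no phase change (a simplification)."""
    T1, T2 = T1_C + 273.15, T2_C + 273.15
    return m * c * math.log(T2 / T1)

direct = delta_S(25, 110)
winding = [25, 98, 32, 46, 40, 110]
stepwise = sum(delta_S(a, b) for a, b in zip(winding, winding[1:]))

# The logs telescope, so both routes give the same entropy change.
print(abs(direct - stepwise) < 1e-6)  # True
```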

sweet.
easy to understand!
muchas gracias.
By the way, don't forget to sign that famous petition going around, "Ban Dihydrogen Monoxide."

Will do, I hear that stuff can kill when inhaled!

lol. good stuff

The terms are confusing as hell. I've never seen a state vs. non-state function problem on the MCAT. Don't worry about it; you'll just get confused.

I don't get it, and I've never gotten it from other people explaining it to me. If you're having difficulty understanding it fully, I wouldn't bother.

A state function depends only on the current state of the system; it is independent of the pathway taken to get there.

As an analogy, you can have ten different roller coasters travel down ten different routes, but if they all start at the same height and end at ground level, the change in potential energy is the same in every case, so potential energy is a state function. In contrast, the time each ride took might vary, so time is not a state function.

Entropy is a state function. However, unlike energy, entropy is not conserved: in an isolated system, entropy can be created but never destroyed.

Entropy is actually best described by its statistical definition, S = k·ln(W), where k is the Boltzmann constant (a scaling factor) and W is the total number of microstates the system could be in (note the natural log, not log base 10).

Now, for two identical systems (say, a mole of nitrogen gas in a 1-liter box at identical pressure and identical temperature), they have the same entropy regardless of how they got there. Why? Because the total number of ways the particles could arrange themselves in the box is the same for both gases.

So what do I mean by states? Well, let's start with a simple example. Let's say I have 10 coins that can each be in one of two states: heads or tails. If I have two boxes with ten coins in them and shake the boxes vigorously, then the state of each coin could be either heads or tails. When you count the total possible number of ways your coins could read when you check them (coin1-up, coin2-up, coin3-down, coin4-up, coin5-down, etc.), that's the 'W' I discussed above. The technical term for this is the number of microstates. You could look it up on Wikipedia for more info.
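The coin picture above is small enough to brute-force. Here's a toy sketch (variable names are ad hoc): enumerate every microstate of 10 two-state coins, count W, group microstates by macrostate (number of heads), and plug W into S = k·ln(W).

```python
from itertools import product
from math import comb, log

# Brute-force every microstate of 10 two-state coins.
n_coins = 10
microstates = list(product("HT", repeat=n_coins))
W = len(microstates)                       # 2**10 = 1024 microstates in total

# Microstates grouped by macrostate (how many heads you see):
heads_count = {k: comb(n_coins, k) for k in range(n_coins + 1)}

# Boltzmann's formula, S = k*ln(W), with k in J/K.
k_B = 1.380649e-23
S = k_B * log(W)

print(W, heads_count[5])  # 1024 total; 252 of them are the 5H/5T macrostate
```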

So you see, entropy is a measure of the number of possible ways you can organize things, or, moreover, find them if you make an observation of the system. This is why people describe entropy as disorder. Systems always tend toward states with the greatest number of possible arrangements.

One step further that's not really necessary. Say you have a box split into two sides (left and right) with 100 gas molecules floating around. The odds that you find all the molecules on one side are very slim: the same odds as flipping 100 coins and getting 100 heads. The odds of finding 50 heads and 50 tails are very, very high in comparison. The second law of thermodynamics is then more accurately seen as a statement of probability: systems tend to be found in their most probable configuration. Maybe I didn't explain that enough, but the entropy stuff should help.

statistical thermodynamics

I understand that work is path dependent: W = Fd.

I don't quite understand heat though... it takes the same heat to go from one temp to another.

Edit: OK, got it. The net change is the same, but we can take many different paths, and we have to add or remove heat accordingly along each one.
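A standard textbook contrast makes that edit concrete. Take one mole of ideal gas from (V1, T) to (V2, T) by two routes; for an ideal gas at constant temperature, ΔU = 0, so any heat absorbed equals the work done. The numbers below are arbitrary illustrative values.

```python
from math import log

R, n, T = 8.314, 1.0, 300.0   # gas constant (J/mol·K), moles, temperature (K)
V1, V2 = 1.0, 2.0             # initial and final volumes (arbitrary units)

# Path 1: slow reversible isothermal expansion (gas does work, absorbs heat).
W1 = n * R * T * log(V2 / V1)
Q1 = W1                        # dU = 0 for an isothermal ideal gas, so Q = W
dU1 = Q1 - W1                  # 0

# Path 2: free expansion into vacuum (no work done, no heat exchanged).
W2 = 0.0
Q2 = 0.0
dU2 = Q2 - W2                  # 0

# Q and W differ between the paths, but the state function U changes identically.
print(Q1 != Q2, dU1 == dU2)  # True True
```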
