Could somebody please dumb down what is entropy?


xoxo111

I understand it has something to do with disorder, but I find it really difficult to apply to MCAT answers. Thank you.

Imagine 1 ping pong ball in your cupped hands. Now toss it up real high and let it land. How many ways can you imagine it going? Pretty easy, because if you tossed up one ball, it'll come back down and you can track it pretty easily.

Now repeat that thought experiment with 10 ping pong balls. How many ways can you imagine those 10 ping pong balls going? Almost impossible right?

Entropy is a measure of the randomness/disorder represented by the 10 ping pong balls. The number of ways the ping pong balls can be arranged is how we measure the randomness of a system.
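The "number of ways the balls can be arranged" idea is exactly Boltzmann's statistical definition of entropy, S = k_B·ln(W), where W is the number of microstates. Here's a quick illustrative sketch (the ping-pong-ball microstate counts are made up for the example):

```python
import math

# Boltzmann's formula: S = k_B * ln(W), where W is the number of
# microstates (distinct arrangements) available to the system.
k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(num_microstates):
    return k_B * math.log(num_microstates)

# 1 ball, 1 possible arrangement: only one way -> zero entropy
print(boltzmann_entropy(1))  # 0.0

# 10 distinguishable balls scattered over 10 spots: 10! arrangements
print(boltzmann_entropy(math.factorial(10)))  # ~2.09e-22 J/K
```

The point isn't the tiny numbers; it's that entropy grows with the logarithm of the number of arrangements, so more ways to scatter means more entropy.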

Now consider two bonded atoms. Say, two hydrogen atoms (H-H). These two hydrogen atoms are bonded, and as a result we can somewhat predict the positions of the atoms, right? If we know where one is, we can be sure the other one will be right next to it because they are connected.

Now let's break that bond. The two hydrogen atoms are now free floating in space, and we can't as easily predict the position of the atoms anymore. If we know where one is, all we can be sure is that the other atom does not somehow occupy the exact same position in three dimensional space. In that sense, we have increased the randomness/disorder of the system, aka entropy, by breaking that bond.

Now let's form that bond again. (Careful with the energetics here: forming a bond actually releases energy; it's breaking a bond that requires an energy input.) Once the bond forms, we have decreased the randomness/entropy of the system, because now we can predict where one hydrogen is based on knowing where the other one exists.

Hopefully that helps you better understand both entropy and the Gibbs free energy equation (deltaG = deltaH - Temp*deltaS). Forming a bond decreases S (negative deltaS), so the -T*deltaS term becomes positive and works against spontaneity. Breaking bonds increases S (positive deltaS), which makes -T*deltaS negative and favors spontaneity, more and more so as temperature rises.
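To make the Gibbs equation concrete, here's a sketch for breaking the H-H bond. The bond enthalpy (~436 kJ/mol) is a standard textbook value; the deltaS value is just an illustrative number I picked, not a measured one:

```python
# Spontaneity check with Gibbs free energy: dG = dH - T*dS.
def delta_g(delta_h, temp_k, delta_s):
    return delta_h - temp_k * delta_s

dH = 436e3  # J/mol, roughly the H-H bond dissociation enthalpy
dS = 98.7   # J/(mol*K), illustrative entropy gain of 2 H atoms vs H2

for T in (298.0, 5000.0):
    dG = delta_g(dH, T, dS)
    verdict = "spontaneous" if dG < 0 else "non-spontaneous"
    print(f"T = {T} K: dG = {dG:.0f} J/mol -> {verdict}")
```

At room temperature the positive deltaH dominates and dG > 0 (bond breaking is non-spontaneous), but at a high enough temperature the -T*dS term wins and dG flips negative. That sign flip with temperature is a classic MCAT point.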
 
OP, @Zenabi90 hit the nail on the head. My suggestion is this: do a simple Google search of entropy to get a "real world" feel of it. Entropy did NOT make sense to me at all until I found a picture of a clean room and a dirty room. The clean room represents low entropy (less chaos), and the messy room represents high entropy (more chaos).
 
If you are having trouble with the word, think of the sign of the entropy change as describing the scenario. A positive value means the final scenario has more disorder (more entropy) than the original scenario. A negative value means the final scenario has less disorder (less entropy) than the original scenario. Once you understand this, you can see how it applies to what was described earlier with atoms, molecules, etc.
 
Also worth noting is the 2nd law of thermo: for any real (irreversible) process in an isolated system, the total entropy change is greater than zero. It equals zero only in the idealized reversible limit, and it never decreases.
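A quick worked example of the 2nd law (numbers are arbitrary, just for illustration): let heat Q flow from a hot reservoir to a cold one. The hot side loses entropy Q/T_hot, the cold side gains Q/T_cold, and because T_cold < T_hot the total always comes out positive:

```python
# Second-law sketch: heat Q flowing spontaneously from hot to cold.
# Each reservoir's entropy changes by +/- Q/T; the total is positive.
def total_entropy_change(q, t_hot, t_cold):
    dS_hot = -q / t_hot   # hot reservoir loses heat Q
    dS_cold = q / t_cold  # cold reservoir gains heat Q
    return dS_hot + dS_cold

print(total_entropy_change(1000.0, 400.0, 300.0))  # ~0.833 J/K > 0
```

Heat flowing the "wrong" way (cold to hot) would make this sum negative, which is exactly what the 2nd law forbids in an isolated system.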
 
Here are my notes on Entropy:

Entropy is a measure of the spontaneous dispersal of energy at a specific temperature: how much energy is spread out, or how widely spread out energy becomes in a process.

The equation for calculating the change in entropy is:
∆S = Qrev/T
where ∆S is the change in entropy, Qrev is the heat that is gained or lost in a reversible process, and T is the temperature in kelvin. The units of entropy are usually J/mol•K.
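A standard worked example of ∆S = Qrev/T is melting ice at its melting point (the heat of fusion of ice, ~6010 J/mol, is a textbook value):

```python
# Entropy change for a reversible phase change at constant T: dS = Q_rev / T.
# Melting 1 mol of ice at its melting point.
q_rev = 6010.0  # J/mol, molar heat of fusion of ice (heat absorbed)
T = 273.15      # K, melting point of ice

dS = q_rev / T
print(dS)  # ~22.0 J/(mol*K): heat flows in, entropy goes up
```

Note the sign convention: heat absorbed (Q > 0) gives positive ∆S, and heat released gives negative ∆S, which matches the paragraph below.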

When energy is distributed into a system at a given temperature, its entropy increases. When energy is distributed out of a system at a given temperature, its entropy decreases.
 