Entropy is a state function. However, unlike energy, entropy is not conserved: in an isolated system you can create entropy, but you can't destroy it.

Entropy is actually best described by Boltzmann's definition S = k*ln(W), where k is the Boltzmann constant (just a scaling factor) and W is the total number of possible states the system could be in.
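To make the formula concrete, here's a tiny sketch of S = k*ln(W) in Python (the `entropy` helper is just something I made up for illustration; the value of k is the exact SI definition):

```python
import math

# Boltzmann constant in J/K (exact value by SI definition)
k_B = 1.380649e-23

def entropy(W):
    """Entropy of a system with W equally likely microstates: S = k*ln(W)."""
    return k_B * math.log(W)

# A system with only one possible state has zero entropy...
print(entropy(1))                    # 0.0
# ...and entropy grows as the number of possible states grows.
print(entropy(2) < entropy(1000))    # True
```

Note that W shows up inside a logarithm, which is why entropy stays a manageable number even when W is astronomically large.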

Now, two identical systems (say, a mole of nitrogen gas in a 1-liter box at identical pressure and identical temperature) have the same entropy regardless of how they got there. Why? Because the total number of ways the particles could arrange themselves in the box is the same for both gases.

So what do I mean by states? Let's start with a simple example. Say I have a box with 10 coins in it, and each coin can be in one of two states: heads or tails. If I shake the box vigorously and then look, each coin could read either heads or tails. Count the total number of possible ways the coins could read when you check them (coin 1 heads, coin 2 heads, coin 3 tails, coin 4 heads, coin 5 tails, etc.): that count is the 'W' I discussed above. The technical term for this is the number of microstates; you can look it up on Wikipedia for more info.
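You can actually count those microstates by brute force. A quick sketch for the 10-coin box (here I set k = 1, a common convention that just changes the units of entropy):

```python
import itertools
import math

# Enumerate every possible reading of 10 coins, each heads ('H') or tails ('T').
microstates = list(itertools.product('HT', repeat=10))
W = len(microstates)
print(W)    # 1024, i.e. 2**10

# Boltzmann's formula S = k*ln(W), with k = 1 for simplicity:
S = math.log(W)
print(S)    # 10 * ln(2), about 6.93
```

Each coin doubles the count, which is why W = 2^10 = 1024 for ten coins.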

So entropy is a measure of the number of possible ways you can arrange things, or rather, the number of ways you could find them if you make an observation of the system. This is why people describe entropy as disorder. Systems always tend toward states that can be arranged in the largest number of ways.

One step further that's not strictly necessary. Say you have a box split into two sides, left and right, with 100 gas molecules floating around. The odds of finding all the molecules on one side are very slim: the same as flipping 100 coins and getting 100 heads. The odds of finding 50 heads and 50 tails are very, very high in comparison. The second law of thermodynamics is then more accurately seen as a statement of probability: systems tend to be found in their most probable configuration. Maybe I didn't explain that enough. But the entropy stuff should help.
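You can put numbers on just how lopsided those odds are with a short calculation (the variable names here are mine, and I'm treating the 100 molecules as 100 fair coin flips, as in the analogy above):

```python
import math

N = 100  # number of coins / gas molecules

# Probability that all 100 land heads (all molecules on one side):
p_all = 1 / 2**N
print(p_all)     # about 7.9e-31

# Probability of exactly 50 heads and 50 tails (the most probable split):
p_half = math.comb(N, 50) / 2**N
print(p_half)    # about 0.0796
```

So the even 50/50 split is roughly 10^29 times more likely than the all-on-one-side arrangement, which is why you never see a gas spontaneously crowd into half its container.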

The field that studies all of this is called statistical thermodynamics, if you want to read further.