As a measure of a system's randomness, entropy has a specific scientific meaning. Last month, we saw how entropy can be defined in terms of heat. Now, we explore how statistics can also express the phenomenon.

Spread, sprinkled, or expanded systems in general possess higher relative entropy: Mustard in a neat line on a hotdog is less entropic than mustard splattered on a shirt; similarly, the carbon in a diamond is more orderly and less entropic than that in graphite.

Speaking of allotropes: if systems tend toward increasing entropy, then why do materials freeze into orderly crystal lattices? Even here, the Second Law of Thermodynamics holds. Entropy coexists with forces that drive atoms and molecules to lower energy states; although a crystal itself is lower in both entropy and energy, it forms in a non-isolated system, and the heat released during freezing raises the entropy of the surroundings, so the total entropy of system plus environment still increases.

The densest lattices (such as the face-centered cubic shown here) pack spherical atoms most efficiently. Researchers at the University of Pennsylvania led by Randall D. Kamien have shown that even systems of oddly shaped molecules freeze into efficient and predictable lattices that maximize overall system entropy.
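The efficiency of a lattice can be made concrete as a packing fraction: the share of a unit cell's volume occupied by spheres. A minimal sketch (the cell geometries below are standard crystallography, not from the article) compares a simple cubic cell with the face-centered cubic cell mentioned above:

```python
import math

# Packing fraction = (volume of spheres per unit cell) / (cell volume).

# Simple cubic: 1 sphere per cell; spheres touch along the edge, so a = 2r.
simple_cubic = (4 / 3) * math.pi / 2**3  # pi/6, about 0.524

# Face-centered cubic: 4 spheres per cell; spheres touch along the face
# diagonal, so a = 2*sqrt(2)*r.
fcc = 4 * (4 / 3) * math.pi / (2 * math.sqrt(2)) ** 3  # pi/(3*sqrt(2)), about 0.740

print(f"simple cubic packing fraction:      {simple_cubic:.3f}")
print(f"face-centered cubic packing fraction: {fcc:.3f}")
```

The face-centered cubic arrangement fills roughly 74% of space, the densest possible for identical spheres, versus about 52% for a simple cubic stacking.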

In a paper published in the American Journal of Physics, Daniel F. Styer proposes that the hard association of entropy with notions of disorder is troublesome, and that the scientific definitions deserve sharper distinction. He also proposes the simile "entropy as freedom."

In this case, two suddenly joined (and possibly less “organized”) systems might also be called less bounded.

Odds are even

The first way to describe entropy is in terms of thermodynamics or heat. The second way to describe entropy is with statistics.

Assume that we now have a single-chamber insulated tank that contains one dozen identical molecules of an ideal gas.

At any instant, each of our molecules has a 50/50 chance of being in either half of the chamber. However, the number of position permutations in which all 12 molecules sit on one side at once is minuscule compared to the total; rather, at any given moment, the molecules are most likely to be spread evenly throughout the chamber. This unfurled arrangement has higher entropy and is expressed by a relationship developed by the Austrian physicist Ludwig Boltzmann: S = k ln W, where S is entropy, k is Boltzmann's constant, and W is the number of microscopic arrangements consistent with a given macroscopic state.
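These odds are easy to count directly. A short sketch (the variable names are illustrative, not from the article) tallies the 2^12 equally likely arrangements of 12 molecules over two chamber halves:

```python
from math import comb

N = 12                       # molecules in the chamber
total = 2 ** N               # each molecule independently picks a half: 4,096 arrangements

# Exactly one arrangement puts all 12 molecules in, say, the left half.
p_all_left = 1 / total

# An even 6/6 split can happen in C(12, 6) = 924 different ways.
p_even_split = comb(N, 6) / total

print(f"all on one chosen side: {p_all_left:.6f}")   # about 0.000244
print(f"even 6/6 split:         {p_even_split:.4f}") # about 0.2256
```

The even split is nearly a thousand times more probable than the fully one-sided arrangement, and splits near 6/6 together account for the overwhelming majority of outcomes.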

Note that this statistical definition is related to the thermal one: molecules must have energy to spread through the chamber, and the resulting entropy arises from their interactions.

Murphy's Law: More correlates and functions

• Nature always sides with the hidden flaw.
• The amount of damage something sustains will be in direct proportion to its value.
• If there is a possibility that several things will go wrong, the one that will cause the most damage will be the first to go wrong.

The common tendency toward entropy

Configurations with higher multiplicity, and therefore a higher probability of occurring (such as the arrangements below, in which the molecules are spread equally over the chamber halves), also have higher entropy, as there are more microscopic arrangements through which the system's energy can be distributed.
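The link between multiplicity and entropy can be put in numbers using Boltzmann's relationship S = k ln W. A minimal sketch (the helper name is an assumption for illustration) compares the one-sided and evenly split configurations of the 12-molecule chamber:

```python
from math import comb, log

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact SI value)

def boltzmann_entropy(multiplicity):
    """Statistical entropy S = k ln W for a configuration with W microstates."""
    return k_B * log(multiplicity)

W_clustered = comb(12, 0)  # all 12 molecules in one half: W = 1
W_spread = comb(12, 6)     # even 6/6 split: W = 924

print(f"S (clustered) = {boltzmann_entropy(W_clustered):.3e} J/K")  # 0: one microstate
print(f"S (spread)    = {boltzmann_entropy(W_spread):.3e} J/K")
```

With only one microstate, the clustered configuration has zero statistical entropy; the evenly spread configuration, with 924 microstates, has strictly more, which is why the spread state is the one we observe.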

Visit motionsystemdesign.com for the first of this two-part series on entropy, where its definition in terms of heat is outlined.