Zach
Why is the Arrow of Time Asymmetric?
DYO Abstract Video:
Common Examples of Entropy Increasing:
Woman Aging Time Lapse
Food Coloring Diffusing into Water
Ice Melting at Room Temperature
Introduction:
The science of classical thermodynamics is closely tied to the industrial revolution. Engineers of the 18th and 19th centuries laid its foundations while attempting to improve steam engines and explore the uses of heat. Sadi Carnot, a French military engineer, pioneered the field, introducing the concept of the heat-engine cycle and the principle of reversibility in 1824. His work dealt with the limit on the maximum amount of work a steam engine could produce from a high-temperature heat transfer as its driving force. The science of thermodynamics quickly spread throughout Europe, eventually reaching Rudolf Clausius, who developed the theory in greater detail. Thermodynamics can be called a “phenomenal science,” meaning that its variables range over macroscopic parameters such as temperature, pressure, and volume, and that it deals with the relationships between heat and other forms of energy. Certain laws, founded upon observed relationships between these macroscopic parameters, govern the field. Traditionally, two classic thermodynamic laws stand out: the first and the second law of thermodynamics. This paper focuses on the second law of thermodynamics, which states that the total entropy of an isolated system either increases or remains constant in any spontaneous process; it can never decrease. In other words, an isolated system’s entropy tends to increase over time.
Formally defined as the measure of a system’s thermal energy per unit temperature that is unavailable for doing useful work, entropy was introduced by the German physicist Rudolf Clausius in the 1850s, representing a breakthrough in 19th-century physics. The amount of entropy is also a measure of the molecular disorder, or randomness, of a system, because work is obtained from ordered molecular motion. Rather than unpacking each event or phenomenon individually with abstract reasoning, entropy opens the door to a more general mathematical approach, providing insight into the direction of spontaneous change in everyday phenomena. Seemingly unrelated occurrences, such as ice melting at room temperature and workplaces getting messier over time, can be generalized and related using the concept of entropy. The goal of this investigation is to explain the asymmetric nature of the arrow of time through a detailed investigation of entropy.
Theoretical Work:
The Coin Toss
The asymmetric nature of entropy can be explained in simpler terms using a standard coin toss. Consider an isolated toss of five fair coins, none of which is weighted to favor one side over the other. The possible outcomes can be represented using the table below:
Macrostate           # of microstates   Probability
5 heads, 0 tails     1                  1/32
4 heads, 1 tail      5                  5/32
3 heads, 2 tails     10                 10/32
2 heads, 3 tails     10                 10/32
1 head, 4 tails      5                  5/32
0 heads, 5 tails     1                  1/32
Figure 1: Table representing the probable microstates corresponding to each macrostate for a 5-coin toss. A total of 32 separate microstates correspond to 6 macrostates.
In Figure 1 above, the macrostates represent the overall outcomes of the system: the # of heads and tails that come up in a given toss. The microstates represent each individual outcome of the system: the exact sequence of heads and tails across the five coins, taking order into account. In larger systems, millions of microstates can correspond to a single macrostate. In this coin toss, the table shows us the least and most probable macrostates based on the # of corresponding microstates. The least probable macrostates, each with only 1 corresponding microstate, are 5 heads, 0 tails and 0 heads, 5 tails; their probabilities equal 1/32, or 3.125%. At the other end of the spectrum, the most probable macrostates, each with 10 corresponding microstates, are 3 heads, 2 tails and 2 heads, 3 tails; their probabilities equal 10/32, or 31.25%. Interestingly, the two most orderly macrostates (5 heads, 0 tails and 0 heads, 5 tails) are the least probable, while the two most disorderly macrostates (3 heads, 2 tails and 2 heads, 3 tails) are the most probable. This simple example scales entropy’s tendency to increase over time down to a small, concrete system. The coin toss shows us that in an isolated system the most probable macrostate is the one with the greatest entropy, or disorder; it therefore makes mathematical sense for the overall entropy to increase over long periods of time.
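This counting can be checked directly. The short Python sketch below (my own illustration, not part of the original experiment) enumerates all 32 microstates of a 5-coin toss and tallies how many fall into each macrostate:

```python
from itertools import product
from collections import Counter

# Enumerate every microstate: each coin is either 'H' or 'T',
# so 5 coins give 2**5 = 32 ordered sequences.
microstates = list(product("HT", repeat=5))

# The macrostate ignores order and keeps only the number of heads.
macrostates = Counter(seq.count("H") for seq in microstates)

for heads in sorted(macrostates):
    count = macrostates[heads]
    print(f"{heads} heads, {5 - heads} tails: "
          f"{count} microstates, probability {count}/{len(microstates)}")
```

Running this prints the same counts as Figure 1, with the middle macrostates (2 or 3 heads) dominating the distribution.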
Figure 2: # of microstates corresponding to each macrostate in a 10-coin toss. Each microstate is one recorded toss, and each macrostate is the total # of heads. 10 coins were tossed 600 times, recording every microstate; the 600 recorded microstates fell into 10 separate macrostates.
Figure 2 displays my own coin toss using 10 coins. In order to better understand entropy’s tendency to increase over time, I wanted to model a real coin toss after studying a theoretical one. The 10-coin graph displays a shape similar to a bell curve: the probabilities near the middle (where most of the area under the curve lies) are much higher than those at the far left and right (where the least area lies). Although the curve is not perfectly symmetrical, since a real coin toss carries sampling noise, the pattern matches the one explained in the theoretical coin toss. The macrostate with the greatest number of microstates (5 heads, 5 tails) is the most disordered distribution, the one with the greatest entropy. The macrostates with the fewest microstates (1 head, 9 tails and 9 heads, 1 tail) are the most ordered distributions, the ones with the least entropy. This real model leads us to the same conclusion as the theoretical model: in an isolated system, the most probable macrostate is the one with the greatest entropy, or disorder.
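An experiment like this can also be reproduced without physical coins. The sketch below is a hypothetical Monte Carlo version of the same procedure, using Python’s random module rather than my actual tosses; it flips 10 fair coins 600 times and prints a rough text histogram of the resulting macrostates:

```python
import random
from collections import Counter

random.seed(0)  # fixed seed so the run is repeatable

TOSSES, COINS = 600, 10

# Each trial is one microstate; we record only its macrostate (# of heads).
results = Counter(
    sum(random.choice((0, 1)) for _ in range(COINS))
    for _ in range(TOSSES)
)

for heads in range(COINS + 1):
    count = results.get(heads, 0)
    print(f"{heads:2d} heads: {'*' * count} ({count})")
```

As in Figure 2, the counts pile up around 5 heads and thin out toward the extremes, and the curve is only approximately symmetrical because of the finite number of trials.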
“Boltzmann Entropy”
While the coin toss is an effective way to see entropy’s tendency to increase over time, the idea can still feel abstract. In order to demystify and explain entropy’s nature in greater detail, we can unpack Boltzmann’s entropy equation:

S = k_B ln(W)
Named after the great physicist Ludwig Boltzmann, this equation can be understood using an isolated system with its own macrostates and microstates. Imagine an isolated gas in a box with some number of freely moving particles randomly colliding with each other. The macrostate represents the measurable quantities of the gas, such as volume, pressure, and temperature. The microstates represent the positions and momenta of each particle in the box. While the system has an astronomically large number of microstates, since even a small box of gas contains an enormous number of particles, the microstates collectively exhibit an average configuration which makes up the macrostate, with each individual microstate’s impact being negligibly small. The equation above relates the value of entropy to the # of possible microstates in the system: S represents the entropy of the system, k_B represents the Boltzmann constant (1.38 * 10^-23 J/K), a fundamental constant of physics, and W represents the # of microstates available to the macrostate in question. The equation implies that a system whose macrostate is not in equilibrium will overwhelmingly tend to evolve toward macrostates with more microstates, increasing its entropy over time. Since W must be a natural number (1, 2, 3, ...), the entropy S will either be 0 (W = 1, ln(W) = 0) or positive (W > 1).
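As a rough numerical illustration (my own sketch, reusing the 5-coin microstate counts as stand-in values for W), the equation can be evaluated directly. Although the absolute values are tiny because of the scale of k_B, S grows monotonically with the number of microstates:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, in J/K

def boltzmann_entropy(w: int) -> float:
    """S = k_B * ln(W) for a macrostate with W microstates."""
    return K_B * math.log(w)

# W values taken from the 5-coin example: 1, 5, and 10 microstates.
for w in (1, 5, 10):
    print(f"W = {w:2d}  ->  S = {boltzmann_entropy(w):.3e} J/K")
```

The W = 1 case prints an entropy of exactly zero, matching the observation above that a macrostate with a single microstate is perfectly ordered.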
Conclusion:
Entropy’s tendency to move in only one direction relates directly to the arrow of time’s asymmetric nature. Time possessing asymmetry means it runs in only one direction, which is exactly how we experience it. As of now, the concept of entropy is our main way to distinguish between the past and the future, giving time a direction: an increase in overall entropy points toward the future, while a decrease in overall entropy points back toward the past.
Bibliography:
“Entropy,” Encyclopaedia Britannica: https://www.britannica.com/science/entropy-physics
“Thermodynamic Asymmetry in Time,” Stanford Encyclopedia of Philosophy: https://plato.stanford.edu/entries/time-thermo/
“Time’s arrow and Boltzmann’s entropy,” Scholarpedia: http://www.scholarpedia.org/article/Time%27s_arrow_and_Boltzmann%27s_entropy