Entropy
There is also a property of physical systems, governed by the 2nd law of thermodynamics, called entropy: the entropy of an isolated system can never decrease with time. Entropy is a measure of disorder, of randomness, in a physical system: as a system's entropy increases, it becomes more and more disordered. Concretely, entropy counts the number of different microscopic configurations of a physical system that lead to the same macroscopic properties. Another way to think about it is as the number of ways you can rearrange the atoms and molecules in a region of space without changing the macroscopic properties of the system as a whole. The larger the entropy, the more ways the atoms and molecules can be configured and reconfigured without changing the macroscopic appearance of the physical system. When a system has the maximum possible amount of entropy, it is said to be in thermal equilibrium.
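To make this counting concrete, here is a minimal sketch in Python of a hypothetical toy system of N two-state particles (each "up" or "down"). The macrostate is fixed by the total number of "up" particles; the entropy is the logarithm of the number of microstates compatible with that macrostate. The toy model and function names are assumptions for illustration, not part of the original text.

```python
from math import comb, log

def multiplicity(n_particles: int, n_up: int) -> int:
    """Number of microstates (distinct arrangements of up/down particles)
    that share the same macrostate (the total count of 'up' particles)."""
    return comb(n_particles, n_up)

def entropy(n_particles: int, n_up: int, k_b: float = 1.0) -> float:
    """Boltzmann entropy S = k_B * ln(W), where W is the multiplicity.
    k_B is set to 1 here for simplicity."""
    return k_b * log(multiplicity(n_particles, n_up))

# The entropy is largest for the most 'disordered' macrostate
# (half up, half down), where the most rearrangements are possible:
for n_up in (0, 25, 50, 75, 100):
    w = multiplicity(100, n_up)
    print(f"n_up={n_up:3d}  W={float(w):.3e}  S={entropy(100, n_up):.2f}")
```

Running this shows W growing from 1 (a single all-down arrangement, zero entropy) to about 1e29 at the half-and-half macrostate, which is exactly the "many microscopic configurations, one macroscopic appearance" picture described above.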
Ludwig Boltzmann
Ludwig Boltzmann, an Austrian physicist who contributed greatly to the development of statistical mechanics, examined this concept carefully in the 1870s. What Boltzmann demonstrated was that the thermodynamic properties of a gas could be understood by averaging properties of its constituent molecules. This process of "averaging" makes it possible to understand macroscopic quantities such as temperature and pressure.
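As a rough illustration of this averaging (a minimal sketch, not Boltzmann's actual derivation), temperature can be recovered from the mean kinetic energy of an ideal monatomic gas via ⟨E⟩ = (3/2) k_B T. The choice of argon and the Gaussian sampling of velocity components are assumptions made for the example.

```python
import random

K_B = 1.380649e-23  # Boltzmann constant, J/K
MASS = 6.63e-26     # mass of one argon atom, kg (assumed example gas)

def sample_speeds(n: int, sigma: float) -> list[float]:
    """Stand-in for molecular speeds: draw each velocity component from a
    Gaussian with std-dev sigma, as in a Maxwell-Boltzmann distribution."""
    speeds = []
    for _ in range(n):
        vx, vy, vz = (random.gauss(0.0, sigma) for _ in range(3))
        speeds.append((vx**2 + vy**2 + vz**2) ** 0.5)
    return speeds

def temperature_from_speeds(speeds: list[float]) -> float:
    """Average kinetic energy <E> = (3/2) k_B T, so T = 2<E> / (3 k_B)."""
    mean_energy = sum(0.5 * MASS * v**2 for v in speeds) / len(speeds)
    return 2.0 * mean_energy / (3.0 * K_B)

# For T = 300 K, each velocity component has std-dev sqrt(k_B T / m);
# averaging over many molecules should recover roughly 300 K:
sigma = (K_B * 300.0 / MASS) ** 0.5
print(f"Estimated T = {temperature_from_speeds(sample_speeds(100_000, sigma)):.1f} K")
```

No single molecule has a temperature; the number only emerges from the average over the whole population, which is the point of Boltzmann's statistical approach.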
Newton's laws of motion, however, say that the laws of physics are reversible, so there cannot be a quantity that always increases. What, then, is entropy? What is the meaning of the 2nd law of thermodynamics? Entropy, according to Boltzmann, is hidden information: information contained in degrees of freedom that are too small and too numerous to track individually.