Entropy Statistical Limitations

Entropy: Statistical Approach Advantages and Limitations by M. Kostic


Entropy, the thermal-displacement property, dS=dQrev/T with the unit J/K, is a measure of thermal dynamic disorder or thermal randomness. It may be expressed as related to the logarithm of the number of all thermal, dynamic microstates, or to their logarithmic probability or uncertainty, that correspond to, or are consistent with, the given thermodynamic macrostate. Note that the meanings of all the relevant adjectives are deeply important if the description is to reflect reality; for real systems it remains a metaphoric description.

Every piece of research and scholarly writing should be based on facts and objective reasoning, and it should be correlated (or put in perspective) with existing knowledge. Therefore, along with the merits of the outcome, the limitations, shortcomings and unresolved issues should be objectively and responsibly presented. Simplified simulations (analytical, statistical, numerical, etc.) should not take precedence over phenomenological reality and reliable observations, but the contrary. Extreme judgments based on simulations are usually risky, particularly if detached from reality checks or if they attempt to suppress reality.

    There is a "strange propensity" of some authors involved with simplified statistical interpretations of complex, random natural phenomena to make unjustified claims that their analyses are true descriptions of the natural phenomena, that the phenomenological definitions are deficient and misleading, or, even worse, that the natural phenomena are a subset of a more general statistical theory: for example, that information entropy is more general than thermodynamic entropy, the latter being a subset of the former. Some "promoters" of statistical descriptions of entropy become so detached from physical reality that it is as if they were not aware of it.

    Since entropy is directly related to the random thermal motion of a system's micro (atomic and molecular) structure, it lends itself to statistical analysis, particularly for simple system structures, like ideal gases, consisting of completely randomized particle motion in thermal equilibrium, with no particle interactions other than elastic, random collisions of material, point-like particles. For more complex, and thus all real, systems the thermal motion and interactions are much more involved, so the statistical analysis is metaphorical only and cannot be quantitatively reduced to physical entropy, the latter being well defined and measured in the laboratory for all substances of practical interest.

    Just because we can scale entropy using a statistical description of the random thermal motion of a simple particulate system structure, the latter related to both the thermal energy and the thermodynamic temperature, and thus to entropy, it does not mean that entropy is a simple statistical concept and not a physical quantity in its own right. Actually, the statistical representation is so simple and so limited that, without knowing the result up front, the scaling would be impossible for anything but the trivially simple and fully randomized monatomic ideal-gas structure. Interpretation of the statistical analysis sometimes goes so far as to forget the phenomena it is trying to describe, presenting entropy as a spatial particle arrangement, or as simplified statistics of particle positions and momenta without other realistic interactions, as if entropy were a measure of statistical randomness without reference to thermal energy, or to energy in general; both are physically inappropriate. The real entropy, as defined and measured, is related to the thermal energy and thermodynamic temperature, dS=dQ/T, and not to other internal energies.

    Boltzmann's metaphorical entropy description, S=k*log(W), refers to a logarithmic measure of the number of possible microscopic states (microstates), W, of a system in thermodynamic equilibrium that are consistent with its macroscopic state (thus the number of thermal, dynamic microstates). This is a rather far-fetched, qualitative description that transfers all the real complexity into W (the number of relevant thermal, dynamic microstates), with the deep meaning of the relevant adjectives: the equivalent number of microstates consistent with the well-defined macrostate. This is not the number of all possible spatial distributions of micro-particles within the system volume, as often graphically depicted. For example, microstates with all molecules in one half or one quarter of the system volume, and the like, are inappropriate to count, since they are not consistent with the macrostate, nor is it physically possible for all molecules to self-confine into one half of the volume with vacuum in the other half. That would be quite a different macrostate, with null probability!
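As an illustration only (not a claim that real entropy reduces to such counting), the scaling S=k*log(W) can be sketched for a toy model of N two-level particles; the model and numbers below are assumptions chosen for demonstration:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

def boltzmann_entropy(W: int) -> float:
    """S = k * ln(W): entropy from the number of microstates W."""
    return K_B * math.log(W)

# Toy model: N distinguishable two-level particles, n of them excited.
# The number of microstates consistent with the macrostate "n excited"
# is the binomial coefficient C(N, n).
N = 100
entropies = {n: boltzmann_entropy(math.comb(N, n)) for n in (0, 10, 25, 50)}

# Entropy is largest for the most randomized macrostate (n = N/2),
# and zero for the perfectly ordered one (n = 0, since W = 1).
for n, S in entropies.items():
    print(f"n = {n:3d}  S = {S:.3e} J/K")
```

The point of the sketch is only the counting itself: even here, W is the number of microstates consistent with a given macrostate, not all conceivable arrangements.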

    The macrostate of a very simple, ideal system could be described by the positions and momenta of all its atoms; in principle, all the physical properties of the system are determined by its microstates. The Gibbs, von Neumann (quantum), Shannon, and other probabilistic entropy descriptions are statistical, like Boltzmann's. Actually, they all reduce to the latter for a fully randomized large system in equilibrium, since the logarithmic probability of equiprobable discrete microstates, where p_i=1/W, results in Boltzmann's logarithmic value, i.e.,

                                 -Sum(p_i*log(p_i)) = log(W)

    Again, a statistical interpretation is important as a metaphor only: the sum runs over the probabilities, p_i, of the possible discrete microstates that could occur during the "random fluctuations" of a given macrostate. The qualifiers (possible, could occur, consistent random and thus thermal fluctuations) and the statement as a whole have deep meanings, and the sum could not be evaluated for any real system, but only scaled for the trivial one(s).
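The collapse of the probabilistic sum to Boltzmann's log(W) for equiprobable microstates can be checked numerically; the distributions below are arbitrary illustrations, not physical data:

```python
import math

def gibbs_entropy(p):
    """Dimensionless Gibbs/Shannon form: -sum p_i * ln(p_i)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

W = 8
uniform = [1.0 / W] * W  # fully randomized: every microstate has p_i = 1/W
skewed = [0.5, 0.3, 0.1, 0.05, 0.03, 0.01, 0.005, 0.005]  # same W, non-uniform

# For equiprobable microstates the Gibbs sum collapses to Boltzmann's log(W):
print(gibbs_entropy(uniform), math.log(W))

# Any non-uniform distribution over the same W microstates gives less:
print(gibbs_entropy(skewed))
```

The equality holds only in the fully randomized, equilibrium limit; that is exactly the condition under which the probabilistic forms reduce to Boltzmann's.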

    Granted, there are some benefits from simplified statistical descriptions for better understanding the randomness of thermal motion and the related physical quantities, but the limitations should be stated so that the real physics is not overlooked or, worse, discredited. Phenomenological thermodynamics has supremacy due to its logical reasoning based on the fundamental laws, without regard to the system's complex dynamic structure and even more complex interactions.

    The fundamental laws and physical phenomena cannot be caused and governed by the outcomes of mathematical modeling and calculation, as suggested by some, but the other way around. Energy, temperature and entropy are subtle and elusive, but they are well defined and precisely measured as physical quantities, and used as such. They should be further refined and explained for what they are, and not misrepresented as something they might be or, even worse, something they are not. Any new approach should be correlated with existing knowledge, and its limitations clearly and objectively presented.


                                                       *****


ADDITIONAL NOTES (more at > Comments on...):


  1. Importance of Sadi Carnot's treatise on reversible heat-engine cycles for the Entropy and 2nd Law definitions:

Carnot's ingenious reasoning about limiting, reversible engine cycles allowed others to prove that entropy is conserved in ideal cycles (the Clausius equality, the definition of entropy), and that entropy could not be destroyed, since that would imply spontaneous heat flow from a colder to a hotter body against natural forcing, i.e., hyper-ideal cycles more efficient than the ideal, reversible ones. Therefore, entropy is always generated (increased overall) due to dissipation of any work potential to heat and generation of entropy (the Clausius inequality) in irreversible cycles. These results are straightforwardly extended to all reversible and irreversible processes and generalized as the 2nd Law of thermodynamics.
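A minimal numeric sketch of the Clausius equality and inequality, via the entropy balance of the two reservoirs; the reservoir temperatures, heat input, and irreversible efficiency are assumed example values:

```python
# Heat-engine cycle between two reservoirs (example values assumed).
T_H, T_C = 600.0, 300.0   # hot and cold reservoir temperatures, K
Q_H = 1000.0              # heat taken from the hot reservoir per cycle, J

# Reversible (Carnot) cycle: efficiency 1 - T_C/T_H, so Q_C = Q_H * T_C/T_H.
eta_carnot = 1.0 - T_C / T_H
Q_C_rev = Q_H * T_C / T_H
S_gen_rev = Q_C_rev / T_C - Q_H / T_H   # net entropy change of the reservoirs

# Irreversible cycle: any efficiency below Carnot's means more heat rejected.
eta_irr = 0.3
Q_C_irr = Q_H * (1.0 - eta_irr)
S_gen_irr = Q_C_irr / T_C - Q_H / T_H

print(f"Carnot efficiency:  {eta_carnot:.2f}")
print(f"S_gen reversible:   {S_gen_rev:.4f} J/K (Clausius equality: zero)")
print(f"S_gen irreversible: {S_gen_irr:.4f} J/K (Clausius inequality: positive)")
```

A hypothetical cycle with efficiency above Carnot's would make S_gen negative, i.e., entropy destruction, which is exactly the impossibility the reasoning above rules out.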


  2. Thermal energy versus Internal energy concepts in Thermodynamics:

Entropy is thermal displacement, related to internal thermal energy or "stored heat" (either transferred and/or generated heat, which is obvious for incompressible substances), but it is more subtle for compressible gases due to the coupling of internal thermal energy (transferred as heat, TdS) and internal elastic-mechanical energy (transferred as work, PdV). Entropy is NOT related to any internal energy type other than thermal (unless the former is converted to thermal in a dissipative process).
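The TdS/PdV coupling can be made concrete for an ideal gas, where integrating the Gibbs relation T dS = dU + P dV gives the standard result ΔS = n*cv*ln(T2/T1) + n*R*ln(V2/V1); the state values below are assumed for illustration:

```python
import math

R = 8.314462618  # universal gas constant, J/(mol K)

def delta_S_ideal_gas(n, cv, T1, T2, V1, V2):
    """ΔS for an ideal gas, from T dS = dU + P dV with dU = n*cv*dT
    and P = n*R*T/V: ΔS = n*cv*ln(T2/T1) + n*R*ln(V2/V1)."""
    return n * cv * math.log(T2 / T1) + n * R * math.log(V2 / V1)

cv = 1.5 * R  # monatomic ideal gas

# Reversible isothermal expansion: internal (thermal) energy is unchanged,
# yet entropy increases, because heat TdS flows in to supply the work PdV.
dS = delta_S_ideal_gas(n=1.0, cv=cv, T1=300.0, T2=300.0, V1=0.01, V2=0.02)
print(f"dS = {dS:.3f} J/K")  # equals R*ln(2) for a volume doubling
```

The isothermal case shows the subtlety named above: the entropy change comes entirely from the coupled volume (work) term, even though the thermal energy of the gas is unchanged.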


  3. Disorder versus Spreading/Dispersal as statistical metaphorical-concepts of entropy:

The three terms are qualitative concepts related to each other, and they have to relate to the random, complex thermal motion and complex thermal interactions of the material structure, reflected in its heat capacity and temperature, among others. Only for simple ideal gases (with all internal energy consisting of random thermal motion and elastic collisions) could entropy be correlated with statistical and probabilistic modeling; it has to be, and is, measured for any and all real substances (regardless of their structure) as phenomenologically defined by Clausius (dS=dQrev/Tabs). Thus entropy and the Second Law are well defined in classical Thermodynamics.
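The Clausius definition can be integrated directly for an incompressible substance with constant specific heat, dS = dQrev/T = m*c*dT/T; the water specific heat and temperatures below are assumed textbook-style values:

```python
import math

def entropy_change_incompressible(m, c, T1, T2):
    """Integrate dS = dQ_rev/T = m*c*dT/T for constant specific heat c:
    ΔS = m*c*ln(T2/T1), with T in kelvin (absolute temperature)."""
    return m * c * math.log(T2 / T1)

# Heating 1 kg of water from 20 C to 80 C (c ≈ 4186 J/(kg K), assumed value).
dS = entropy_change_incompressible(m=1.0, c=4186.0, T1=293.15, T2=353.15)
print(f"dS ≈ {dS:+.1f} J/K")
```

This is the phenomenological route the note refers to: entropy changes of real substances are evaluated from measured heats and absolute temperatures, with no statistical model of the microstructure needed.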
