Entropy, probability and equilibrium in thermodynamic systems.

Statistical mechanics series, lecture — I, II

These lectures were delivered on 22 November 2017.



Topics covered

The current lectures ( I and II ) are intended as an introduction to the statistical mechanics paper of a physics honors degree.

i. Micro-states and macro-states.

ii. Entropy, thermodynamic probability and thermal equilibrium.

Thermodynamic limit

Let us consider a physical system composed of N identical particles in a volume V. N is an extremely large number, typically on the order of 10^{23}.

Let us confine ourselves to the “thermodynamic limit“, i.e. N\to \infty , V \to \infty such that n = \frac{N}{V} remains fixed at a chosen value.

Important note: The ratio n is known as the number density or particle number density. The term concentration is sometimes used instead of density, in which case one distinguishes mass concentration from number concentration. In a similar way one must distinguish the number density from the related but distinct parameter known as the mass density.
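The distinction can be made concrete with a short sketch, using assumed values for argon near STP (the gas and the numbers are illustrative, not from the lecture):

```python
# A small illustration (with assumed values for argon at STP) of number
# density n = N/V versus mass density rho = M/V.

N_A = 6.022e23      # Avogadro's number, per mol
V_molar = 22.4e-3   # molar volume of an ideal gas at STP, m^3/mol
M_ar = 39.95e-3     # molar mass of argon, kg/mol

n = N_A / V_molar       # number density, particles per m^3
rho = M_ar / V_molar    # mass density, kg/m^3

print(f"n   = {n:.3e} m^-3")      # about 2.7e25 particles per m^3
print(f"rho = {rho:.3f} kg/m^3")  # about 1.78 kg/m^3
# The two are linked by the single-particle mass m = M_ar / N_A: rho = n * m.
```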

Extensive properties

In the thermodynamic limit, the “extensive properties” of the system such as energy E and entropy S are directly proportional to the size of the system, viz. N or V.

Intensive properties

Similarly the “intensive properties” such as temperature T, pressure P and chemical potential \mu are independent of the size.

If the particles of the system do not interact among themselves, the total energy E of the system is given by the sum of the energies of the individual particles, i.e. E=\sum_i  n_i \epsilon_i, where n_i is the number of particles each with energy \epsilon_i . Also, N = \sum_i n_i.
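The two sums can be sketched directly; the energy levels and occupation numbers below are made up for illustration:

```python
# A minimal sketch of E = sum_i n_i * eps_i and N = sum_i n_i for a
# non-interacting system; the levels and occupation numbers are assumed.

eps = [0.0, 1.0, 2.0, 3.0]   # single-particle energies eps_i (arbitrary units)
occ = [4, 3, 2, 1]           # occupation numbers n_i

N = sum(occ)                                       # total particle number
E = sum(n_i * e_i for n_i, e_i in zip(occ, eps))   # total energy

print(N, E)   # 10 10.0  (E = 4*0 + 3*1 + 2*2 + 1*3)
```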

note i. If the system were quantum mechanical, the energies \epsilon_i  and thus E would be discrete.

note ii. For large V, the spacing between successive energy levels \epsilon_i would be very small compared to the total energy E. As a result E can be treated as continuous; this would hold even when the particles interact among themselves.


A physical system consisting of N particles occupying volume V and having energy E constitutes a “macro-state“.

But looking deeper, corresponding to each macro-state specified by N, E, V there are a large number of ways in which that state can be realized by the system. This is because of the microscopic nature of the physical system.

E.g. there are a large number of ways in which N = \sum_i n_i and E = \sum_i n_i \epsilon_i  can be satisfied by the particles of the system.


Given a particular macro-state, there are a large number of ways in which the microscopic system can be arranged to achieve it; e.g. there are a large number of ways in which a given energy E can be distributed among the particles. Each of these possible arrangements corresponds to a “micro-state” or “complexion” of the system.

E.g. the micro-states are the independent solutions \psi\,(\vec{r}_1, ...,\, \vec{r}_N) of the Schrödinger equation of an N-{particle} system, corresponding to a given eigenvalue E of the Hamiltonian of the system.

Fundamental postulate of statistical mechanics

At any instant of time t, the system is equally likely to be found in any one of the micro-states of a given macro-state. This is referred to as the “postulate of equal a priori probabilities“.

The actual number of micro-states depends upon the parameters N, V and E, which define the macro-state; this number is denoted by \Omega\, (N,\, V,\, E). All thermodynamic behavior of a system can be derived from the magnitude of \Omega and its dependence on N,\,V,\, E .
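For a small system, \Omega can be counted explicitly. A minimal sketch, using a toy “Einstein solid” of N oscillators sharing q indistinguishable energy quanta (so E is q in units of one quantum); this model is an illustrative assumption, not from the lecture:

```python
from math import comb

# Counting micro-states Omega(N, E) for a toy "Einstein solid":
# N oscillators sharing q indistinguishable energy quanta.

def omega(N, q):
    # stars-and-bars count of the ways to distribute q quanta among N oscillators
    return comb(q + N - 1, q)

def omega_brute(N, q):
    # brute-force check: give the first oscillator k quanta, recurse on the rest
    if N == 1:
        return 1
    return sum(omega_brute(N - 1, q - k) for k in range(q + 1))

print(omega(3, 4), omega_brute(3, 4))   # 15 15
print(omega(50, 100))                   # already astronomically large
```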

Entropy; relation between thermodynamics and statistical mechanics

Consider two physical systems A_1 and A_2, and let us assume they are in equilibrium separately. Let A_1 be represented by a macro-state (N_1, \,V_1, \,E_1) and A_2 be represented by a macro-state (N_2, \,V_2, \,E_2). Let us also assume the macro-state of A_1 has \Omega_1 \,(N_1, \,V_1, \,E_1)  possible micro-states and the macro-state of A_2 has \Omega_2 \,(N_2, \,V_2, \,E_2) possible micro-states.

Entropy: two physical systems in "selective" thermal contact. Photo Credit mdashf.org

When the two systems A_1 and A_2 are brought into thermal contact with each other, there is a possibility of exchange of energy between the two — e.g. imagine sliding in a conducting wall and removing the insulating one. If the two systems are separated by a rigid impenetrable wall, the volumes V_1,\, V_2 and the particle numbers N_1, \,N_2 remain unaltered. But the energies E_1 and E_2 vary, so that the total energy given by the following restriction remains unaltered.

\boxed{E^{(0)} = E_1 + E_2 = constant}

Here E^{(0)} is the energy of the composite system A^{(0)} \equiv A_1+A_2 and we have neglected the interaction between A_1 and A_2.

At time t, A_1 is equally likely to be found in any one of the \Omega_1\,(E_1) micro-states. Similarly A_2 is equally likely to be found in any one of the \Omega_2\, (E_2) micro-states. As a result the composite system A^{(0)} is equally likely to be found in any one of the following number of micro-states.

\boxed{\Omega^{(0)}\, (E^{(0)}, \,E_1) = \Omega_1\, (E_1) \,\Omega_2\,(E_2) = \Omega_1\,(E_1)\,\Omega_2\,(E^{(0)}-E_1)}.

Thus \Omega^{(0)} varies with E_1. But at what value of E_1 does \Omega^{(0)} attain its maximum, i.e. at what value does the composite system attain equilibrium?

— That is, how far will the exchange of energy continue before subsystems A_1 and A_2 are brought into thermal equilibrium?

Thermal equilibrium from statistical principles

We make the assertion that the value E_1 = \bar{E_1}  which maximizes the number \Omega^{(0)}\, (E^{(0)},\,E_1)  is the energy at which the composite system A^{(0)} will attain equilibrium. The idea is that a physical system left to itself proceeds in a direction that enables it to assume more and more micro-states, until it settles into a macro-state having the maximum number of possible micro-states.

Thus, statistically, a macro-state with a larger number of micro-states is thermodynamically a more probable state, and the macro-state with the largest number of micro-states is thermodynamically the most probable macro-state. The thermodynamically most probable macro-state is the one in which the system spends an “overwhelmingly” large fraction of its time, and this defines “thermodynamic equilibrium“.
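This sharp peak of \Omega^{(0)} at the equilibrium split can be seen numerically. A minimal sketch using two toy “Einstein solids” (N oscillators sharing q indistinguishable quanta, \Omega = C(q+N-1, q)); the model and the numbers N_1, N_2, q_{total} are illustrative assumptions, not from the lecture:

```python
from math import comb

# Omega^(0)(E1) = Omega1(E1) * Omega2(E^(0) - E1) for two toy Einstein solids
# exchanging q_total energy quanta; all parameters are illustrative.

def omega(N, q):
    return comb(q + N - 1, q)

N1, N2, q_total = 30, 70, 100

mult = [omega(N1, q1) * omega(N2, q_total - q1) for q1 in range(q_total + 1)]
q1_bar = max(range(q_total + 1), key=lambda q1: mult[q1])

# The peak sits close to the equal-temperature split q1/N1 = q2/N2, i.e. q1 near 30.
print("most probable q1:", q1_bar)
print("fraction of all micro-states at the peak:", mult[q1_bar] / sum(mult))
```

Even for these tiny systems the distribution is strongly peaked; in the thermodynamic limit the peak becomes overwhelmingly sharp.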

To maximize \Omega^{(0)} we differentiate it with respect to E_1 (with E_2 = E^{(0)} - E_1) and set the derivative to zero at the equilibrium energies E_1 = \bar{E_1} and E_2 = \bar{E_2}.

We have:

\big(\frac{\partial \Omega_1\,(E_1)}{\partial E_1}\big)_{E_1 = \bar{E_1}}\Omega_2\,(\bar{E_2}) + \Omega_1\,(\bar{E_1})\big(\frac{\partial \Omega_2\,(E_2)}{\partial E_2}\big)_{E_2 = \bar{E_2}}\cdot\frac{\partial E_2}{\partial E_1}= 0

But \frac{\partial E_2}{\partial E_1} = -1 since: E_1 + E_2 = constant.

So, substituting this and dividing through by \Omega_1\,(\bar{E_1})\,\Omega_2\,(\bar{E_2}), we obtain the condition for equilibrium:

\big(\frac{\partial\, \ln \,\Omega_1 \,(E_1)}{\partial E_1}\big)_{E_1 = \bar{E_1}} = \big (\frac{\partial \,\ln\, \Omega_2\, (E_2)}{\partial E_2}\big)_{E_2 = \bar{E_2}}

If we define a parameter \beta by \beta = \big(\frac{\partial \,\ln \,\Omega \,(N,\, V,\, E)}{\partial E}\big)_{N,\, V,\, E = \bar{E}},

then the condition of equilibrium is stated by the equality of the parameters \beta_1 and \beta_2 of the two individual systems A_1 and A_2.

Thus setting \beta_1 = \beta_2 is equivalent to the zeroth law of thermodynamics, which stipulates the existence of a common parameter, called temperature T, for multiple physical systems to be in equilibrium.

Thus our parameter \beta must be related to thermodynamic temperature T.

From our knowledge of thermodynamics: \big(\frac{\partial S}{\partial E}\big)_{N, V}= \frac{1}{T} where S is the entropy of the system.

Thus there exists a deep relationship between a thermodynamic quantity, S — the entropy and a statistical quantity, \Omega — the thermodynamic probability.

\boxed{\frac{\Delta S}{\Delta\,(\ln \,\Omega)}=\frac{1}{\beta T}=\text{constant}}

Planck thus wrote the following (boxed) solution without any constant of integration S_0.

\boxed{S= k \,\ln \,\Omega}

This determines the absolute value of entropy of a given physical system in terms of total number of micro-states accessible to the system associated with a given macro-state.

Note that the qualifier “accessible” simply stands for consistency with the specification of the macro-state and any given thermodynamic constraints on the system, i.e. only the subset of micro-states which satisfy the macro-state and the other thermodynamic constraints of the system is termed the accessible micro-states of the system.

Zero entropy corresponds to \Omega = 1 , a special state for which there is only one accessible micro-state. Thus \Omega = 1  with S=0  is known as the “unique configuration“.
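Both cases can be illustrated with a short sketch, again using a toy “Einstein solid” (N oscillators, q quanta, \Omega = C(q+N-1, q)); the model and numbers are assumptions for illustration:

```python
from math import comb, log

# S = k * ln(Omega) for a toy Einstein solid, including the
# "unique configuration" Omega = 1, S = 0 reached at q = 0.

k_B = 1.38e-23   # Boltzmann constant, J/K

def omega(N, q):
    return comb(q + N - 1, q)

print(k_B * log(omega(50, 100)))   # many accessible micro-states: S > 0
print(k_B * log(omega(50, 0)))     # Omega = 1, the unique configuration: S = 0.0
```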

The entropy and micro-state relation stated by Planck thus relates the micro-states with the macro-state. It is a fundamentally important equation in physics.

According to the second law of thermodynamics, the energy content of the universe becomes less and less available for the performance of useful work, as entropy naturally increases.

Entropy is thus a measure of the chaos or disorder in a given system. When the number of micro-states of a system is larger, the associated entropy is correspondingly higher.

This is because when the choice of micro-states is greater, the degree of predictability is lower; this corresponds to an increase in the level of disorder, and hence in entropy.

But when a system is in its unique configuration, there are no choices to be made and no scope for unpredictability or disorder. Thus this corresponds to “zero entropy“.

Also, from the last two equations we have: \boxed{\beta = \frac{1}{kT}}

where k is known as the Boltzmann constant, also denoted k_B.

k = \frac{R}{N_A}=\frac{8.31}{6.02 \times 10^{23}}\; (\text{S.I. units}) = 1.38\times 10^{-23}\ \mathrm{J\,K^{-1}} per molecule
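The arithmetic can be verified directly, using the values quoted above:

```python
# Checking k = R / N_A with the values quoted in the text.
R = 8.31        # universal gas constant, J / (K mol)
N_A = 6.02e23   # Avogadro's number, per mol

k = R / N_A
print(f"k = {k:.2e} J/K")   # k = 1.38e-23 J/K
```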
