Statistical mechanics series, lecture — I, II
These lectures were delivered on 22 November 2017.
“Entropy, probability and equilibrium in thermodynamic systems”
The current lecture (I and II) is intended as an introduction to the statistical mechanics paper of a physics honors degree.
i. Micro and macro state.
ii. Entropy, thermodynamic probability and thermal equilibrium.
Let us consider a physical system composed of $N$ identical particles, confined to a volume $V$. $N$ is an extremely large number, typically of the order of $10^{23}$.
Let us confine ourselves to the "thermodynamic limit", i.e. $N \to \infty$ and $V \to \infty$ such that the density $n = N/V$ is fixed at a chosen value.
Important note: The ratio $N/V$ is known as the number density or particle number density; "concentration" is sometimes used instead of "density". One can distinguish the two usages by referring to mass concentration vs number concentration. In a similar way, one must distinguish the number density from the related but distinct parameter known as the mass density.
In the thermodynamic limit, the "extensive properties" of the system, such as the energy $E$ and the entropy $S$, are directly proportional to the size of the system, viz. $E \propto N$ or $S \propto N$.
Similarly, the "intensive properties", such as the temperature $T$, the pressure $P$ and the chemical potential $\mu$, are independent of the size.
If the particles of the system do not interact among themselves, the total energy of the system is given by the sum of the energies of the individual particles, i.e. $E = \sum_i n_i \varepsilon_i$, where $n_i$ is the number of particles each with energy $\varepsilon_i$. Also, $N = \sum_i n_i$.
note i. If the system is quantum mechanical, the energies $\varepsilon_i$, and thus $E$, would all be discrete.
note ii. For large $V$, the spacings between the different energy levels would be very small compared to the total energy $E$. As a result, $E$ can be considered continuous; this would hold even when the particles interact among themselves.
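A toy numerical check of the non-interacting decomposition $E = \sum_i n_i \varepsilon_i$, $N = \sum_i n_i$ (the level energies below are arbitrary illustrative numbers, not a real spectrum):

```python
# Toy check of E = sum_i n_i * eps_i and N = sum_i n_i for a
# non-interacting system. The level energies eps_i are arbitrary
# illustrative values, not taken from any real spectrum.
eps = [0.0, 1.0, 2.0, 3.0]   # single-particle energy levels eps_i
n   = [5,   3,   1,   1]     # occupation numbers n_i

N = sum(n)                                   # total particle number
E = sum(ni * ei for ni, ei in zip(n, eps))   # total energy

print(N, E)  # 10 particles, total energy 8.0
```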
A physical system consisting of $N$ particles, occupying a volume $V$ and having an energy $E$, constitutes a "macro-state".
But looking deeper, corresponding to each macro-state specified by $(N, V, E)$, there are a large number of ways in which this macro-state can be achieved by the system. This is because of the microscopic nature of the physical system.
E.g. there are a large number of ways in which the conditions $N = \sum_i n_i$ and $E = \sum_i n_i \varepsilon_i$ can be satisfied by the particles of the system.
Given a particular macro-state, there are a large number of ways in which the microscopic system can be arranged to achieve that macro-state; e.g. there are a large number of ways in which a given energy $E$ can be distributed among the $N$ particles. Each of these possible arrangements corresponds to a "micro-state" or "complexion" of the system.
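How fast the number of such arrangements grows can be seen in a standard counting model, the Einstein solid, used here purely as an illustration: the number of ways to distribute $q$ indistinguishable energy quanta among $N$ distinguishable oscillators is $\binom{q+N-1}{q}$ (the stars-and-bars count).

```python
from math import comb

def omega(N, q):
    """Micro-state count of an Einstein solid: q indistinguishable
    energy quanta distributed among N distinguishable oscillators
    (the stars-and-bars combinatorial count)."""
    return comb(q + N - 1, q)

print(omega(3, 4))    # 4 quanta among 3 oscillators: 15 micro-states
print(omega(50, 50))  # already an astronomically large count for a tiny system
```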
E.g. the micro-states are the independent solutions $\psi$ of the Schrodinger equation of the system corresponding to a given eigenvalue $E$ of the Hamiltonian of the system: $\hat{H}\psi = E\psi$.
Fundamental postulate of statistical mechanics
At any instant of time $t$, the system is equally likely to be found in any one of the micro-states of a given macro-state; the probability is the same for all micro-states of the same macro-state. This is referred to as the "postulate of equal a priori probabilities".
The actual number of micro-states depends upon the parameters $N$, $V$ and $E$ which define the macro-state, and this number is denoted by $\Omega(N, V, E)$. All thermodynamic behavior of a system can be derived from the magnitude of $\Omega$ and its dependence on $N$, $V$ and $E$.
Entropy; relation between thermodynamics and statistical mechanics
Consider two physical systems $A_1$ and $A_2$. Let us assume they are separately in equilibrium. Let $A_1$ be represented by a macro-state $(N_1, V_1, E_1)$ and $A_2$ by a macro-state $(N_2, V_2, E_2)$. Let us also assume the macro-state of $A_1$ has $\Omega_1(N_1, V_1, E_1)$ possible micro-states and the macro-state of $A_2$ has $\Omega_2(N_2, V_2, E_2)$ possible micro-states.
When the two systems $A_1$ and $A_2$ are brought into thermal contact with each other, there is a possibility of exchange of energy between the two. E.g. imagine sliding in a conducting wall and removing the insulating one. If the two systems are separated by a rigid, impenetrable wall, the volumes $V_1, V_2$ and the particle numbers $N_1, N_2$ remain unaltered. But the energies $E_1$ and $E_2$ can vary, subject to the restriction that the total energy remains unaltered: $E^{(0)} = E_1 + E_2 = \text{const.}$
Here $E^{(0)}$ is the energy of the composite system $A^{(0)} = A_1 + A_2$, and we have neglected the interaction energy between $A_1$ and $A_2$.
At time $t$, $A_1$ is equally likely to be found in any one of its $\Omega_1(E_1)$ micro-states. Similarly, $A_2$ is equally likely to be found in any one of its $\Omega_2(E_2)$ micro-states. As a result, the composite system $A^{(0)}$ is equally likely to be found in any one of the $\Omega^{(0)}(E_1, E_2) = \Omega_1(E_1)\,\Omega_2(E_2)$ micro-states.
Thus $\Omega^{(0)}$ varies with $E_1$, since $E_2 = E^{(0)} - E_1$. But at what value of $E_1$ will $\Omega^{(0)}$ stop varying, i.e. at what value of $E_1$ will the composite system attain equilibrium?
That is, how far will the exchange of energy continue before the subsystems $A_1$ and $A_2$ are brought into thermal equilibrium?
Thermal equilibrium from statistical principles
We make the assertion that the value $\bar{E}_1$ of $E_1$ which maximizes the number $\Omega^{(0)}(E_1)$ is the energy at which the composite system attains equilibrium. We expect that a physical system, left to itself, proceeds in a direction which enables it to assume more and more micro-states, until it settles into a macro-state having the maximum number of possible micro-states.
Thus, statistically speaking, a macro-state with a larger number of micro-states is a thermodynamically more probable state, and the macro-state with the largest number of micro-states is the thermodynamically most probable macro-state. The thermodynamically most probable macro-state is the one in which the system spends an "overwhelmingly" large fraction of its time, and this defines "thermodynamic equilibrium".
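A small numerical sketch makes this concrete. Assuming the toy Einstein-solid count $\Omega(N, q) = \binom{q+N-1}{q}$ for $q$ energy quanta among $N$ oscillators (illustrative sizes only, not a real system), one can tabulate $\Omega^{(0)}(E_1) = \Omega_1(E_1)\,\Omega_2(E^{(0)} - E_1)$ and locate the most probable energy split:

```python
from math import comb

def omega(N, q):
    # Einstein-solid micro-state count: q quanta among N oscillators
    return comb(q + N - 1, q)

N1, N2, q_total = 300, 200, 100  # toy sizes, chosen only for illustration

# Omega of the composite system as a function of q1 (energy of subsystem 1)
omega0 = [omega(N1, q1) * omega(N2, q_total - q1) for q1 in range(q_total + 1)]

q1_star = max(range(q_total + 1), key=lambda q1: omega0[q1])  # most probable split
peak_fraction = omega0[q1_star] / sum(omega0)

print(q1_star)        # most probable energy of subsystem 1 (in quanta)
print(peak_fraction)  # fraction of all composite micro-states at that single value
```

Even for these tiny sizes the peak sits where the quanta divide in proportion to the subsystem sizes; for $N \sim 10^{23}$ the peak becomes so sharp that the system is essentially always found there.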
To maximize $\Omega^{(0)}$ we differentiate it with respect to $E_1$ and set the derivative to zero at the equilibrium energies $E_1 = \bar{E}_1$ and $E_2 = \bar{E}_2$: $$\frac{\partial \Omega_1}{\partial E_1}\,\Omega_2(\bar{E}_2) + \Omega_1(\bar{E}_1)\,\frac{\partial \Omega_2}{\partial E_2}\cdot\frac{\partial E_2}{\partial E_1} = 0.$$
But since $E_2 = E^{(0)} - E_1$, we have $\partial E_2 / \partial E_1 = -1$.
So, for equilibrium we have: $$\frac{1}{\Omega_1}\left.\frac{\partial \Omega_1}{\partial E_1}\right|_{E_1 = \bar{E}_1} = \frac{1}{\Omega_2}\left.\frac{\partial \Omega_2}{\partial E_2}\right|_{E_2 = \bar{E}_2}, \quad\text{i.e.}\quad \left.\frac{\partial \ln \Omega_1}{\partial E_1}\right|_{E_1 = \bar{E}_1} = \left.\frac{\partial \ln \Omega_2}{\partial E_2}\right|_{E_2 = \bar{E}_2}.$$
If we define a parameter $\beta$ by $$\beta \equiv \left.\frac{\partial \ln \Omega(N, V, E)}{\partial E}\right|_{E = \bar{E}},$$
the condition of equilibrium is stated by the equality of the parameters $\beta_1$ and $\beta_2$ of the two individual systems $A_1$ and $A_2$: $\beta_1 = \beta_2$.
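This equality can be checked numerically in a toy model. A minimal sketch, assuming the Einstein-solid count $\Omega(N, q) = \binom{q+N-1}{q}$ and illustrative sizes, estimates $\beta = \partial \ln\Omega / \partial E$ by finite differences at the energy split that maximizes $\Omega_1 \Omega_2$:

```python
from math import comb, log

def omega(N, q):
    # Einstein-solid micro-state count: q quanta among N oscillators
    return comb(q + N - 1, q)

def beta(N, q):
    # Central finite-difference estimate of d(ln Omega)/dE,
    # in units where one energy quantum equals 1
    return (log(omega(N, q + 1)) - log(omega(N, q - 1))) / 2

N1, N2 = 300, 200   # toy subsystem sizes (illustrative only)
q1, q2 = 60, 40     # the split of 100 quanta that maximizes Omega1 * Omega2

b1, b2 = beta(N1, q1), beta(N2, q2)
print(b1, b2)       # nearly equal: the statistical statement beta_1 = beta_2
```

The two values agree to within the finite-size error; the agreement becomes exact in the thermodynamic limit.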
Thus setting $\beta_1 = \beta_2$ is equivalent to the zeroth law of thermodynamics, which stipulates the existence of a common parameter, called the temperature $T$, for multiple physical systems in mutual equilibrium.
Thus our parameter $\beta$ must be related to the thermodynamic temperature $T$.
From our knowledge of thermodynamics: $\left(\frac{\partial S}{\partial E}\right)_{N,V} = \frac{1}{T}$, where $S$ is the entropy of the system.
Thus there exists a deep relationship between a thermodynamic quantity, the entropy $S$, and a statistical quantity, the thermodynamic probability $\Omega$.
Planck thus wrote the following (boxed) solution, without any constant of integration $S_0$: $$S = k \ln \Omega.$$
This determines the absolute value of the entropy of a given physical system in terms of the total number of micro-states accessible to the system in a given macro-state.
Note that the qualifier "accessible" simply stands for consistency with the specification of the macro-state and any given thermodynamic constraints on the system; i.e. only that subset of the micro-states which satisfies the macro-state and the other thermodynamic constraints of the system is termed the accessible micro-states of the system.
Zero entropy, $S = 0$, corresponds to $\Omega = 1$, a special state for which there is only one accessible micro-state. Thus the state with $\Omega = 1$ is known as the "unique configuration".
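A minimal numerical sketch of $S = k \ln \Omega$, using the SI value of the Boltzmann constant and toy values of $\Omega$:

```python
from math import log

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def entropy(omega):
    """Boltzmann-Planck entropy S = k ln(Omega), with no integration constant."""
    return k_B * log(omega)

print(entropy(1))        # unique configuration: S = 0 exactly
print(entropy(10**23))   # more accessible micro-states, correspondingly higher S
```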
The entropy and micro-state relation stated by Planck thus relates the micro-states with the macro-state. It is a fundamentally important equation in physics.
According to the second law of thermodynamics, the energy content of the universe becomes less and less available for the performance of useful work, as entropy naturally tends to increase.
Entropy is thus a measure of the chaos or disorder in a given system. When the number of micro-states of a system is larger, the associated entropy is correspondingly higher.
This is because when the choice of micro-states is greater, the degree of predictability is lower; this corresponds to an increase in the level of disorder, and hence of entropy.
But when a system is in its unique configuration, there are no choices to be made, and no scope for unpredictability or disorder. Thus this corresponds to "zero entropy".
Also, from the last two equations, we have: $$\beta = \frac{\partial \ln \Omega}{\partial E} = \frac{1}{k}\frac{\partial S}{\partial E} = \frac{1}{kT},$$
where $k$ is known as the Boltzmann constant, also denoted $k_B$.
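As a quick numerical illustration of $\beta = 1/(kT)$ (room temperature chosen arbitrarily):

```python
k_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

def beta(T):
    """beta = 1/(k T): the statistical parameter identified with inverse temperature."""
    return 1.0 / (k_B * T)

print(beta(300.0))  # at T = 300 K, roughly 2.4e20 per joule
```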