Lectures on statistical mechanics, Lecture — IV.
This lecture, the 4th in the series of statistical mechanics lectures (a paper for the physics honors degree class), was delivered on the 10th of January, 2018.
You can find the previous lectures here ( Lecture — I, II ) and here ( Lecture — III ).
Topics covered in this lecture
a. Recapitulation of some previous ideas and important remarks
b. Microcanonical ensemble — definition and properties
c. Some basic parameters and formalism
Recapitulation and remarks
In our previous lecture we defined the phase space density or distribution function for a classical statistical system with an aim to connect it to a thermodynamic system.
We saw that an ensemble would be stationary if its density does not have any explicit time dependence, i.e. if \(\frac{\partial \rho}{\partial t} = 0\).
- The type of general ensemble we defined as mental copies of actual system occupying each possible microstate can be called a Gibbsian Ensemble.
- The above condition of statistical density as a stationary or time-independent variable would represent conditions of equilibrium.
- We defined the ensemble average of a physically measurable quantity \(f(q, p)\) as: \[\langle f \rangle = \frac{\int f(q, p)\, \rho(q, p; t)\, d^{3N}q\, d^{3N}p}{\int \rho(q, p; t)\, d^{3N}q\, d^{3N}p}.\] One can also define a "most probable" value of \(f\), namely the value of \(f\) at which the density \(\rho\) attains its maximum.
A. The two types of average mentioned here are nearly equal if the mean-square fluctuation is small. Remember that the mean-square fluctuation is also known as the dispersion, and it is the square of the standard deviation. This is the condition imposed for the meaningfulness of a statistical construct for thermodynamic prediction. Mathematically: \[\langle f^2 \rangle - \langle f \rangle^2 \ll \langle f \rangle^2 \;\implies\; \langle f \rangle \approx f_{\text{most probable}}.\]
B. In a stationary ensemble, i.e. in equilibrium, \(\langle f \rangle\) is independent of time: \[\langle f \rangle = \frac{1}{\Gamma}\int_{\Gamma} f(q, p)\, d^{3N}q\, d^{3N}p,\] where \(\Gamma\) is the phase space volume constrained by a given macrostate. We will meet this condition again, later. Note that we will redefine such relevant phase space volume with other symbols. One should keep in mind the various symbols and avoid confusion, as they keep changing from one textbook to another. The important thing to remember: the meaning is important, the symbol not as much.
- We derived a general condition on a statistical system, described by specific macrostate variables and evolving through the canonical equations of motion. It is known as Liouville's theorem. See lecture — III for this theorem and its derivation.
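Remark A above says that the ensemble average and the most probable value nearly coincide when the dispersion is small. A small numerical sketch of this (my own toy example, not part of the lecture; a narrow Gaussian stands in for a sharply peaked distribution of \(f\)):

```python
# Toy check: for a sharply peaked distribution, the ensemble average <f>
# nearly equals the most probable value f_mp, and the dispersion
# <f^2> - <f>^2 is small. (Illustrative values, not from the lecture.)
import random

random.seed(42)

N = 100_000
f_mp = 1.0          # hypothetical most probable value (peak of the distribution)
sigma = 1e-3        # sharp peak => small fluctuations

# draw f from a narrow Gaussian centred at f_mp
samples = [random.gauss(f_mp, sigma) for _ in range(N)]

f_avg = sum(samples) / N
dispersion = sum(s * s for s in samples) / N - f_avg ** 2   # <f^2> - <f>^2

print(f"<f>        = {f_avg:.6f}")
print(f"f_mp       = {f_mp:.6f}")
print(f"dispersion = {dispersion:.3e}")   # about sigma^2, i.e. tiny
```

The two averages agree to within the statistical error, which is exactly the regime in which the statistical description makes thermodynamic sense.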
Now we are ready to discuss what a microcanonical ensemble is.
Let us rewrite Liouville's theorem: \[\frac{\partial \rho}{\partial t} = -\{\rho, H\},\] where the Poisson bracket is given by \[\{\rho, H\} = \sum_{i=1}^{3N}\left(\frac{\partial \rho}{\partial q_i}\frac{\partial H}{\partial p_i} - \frac{\partial \rho}{\partial p_i}\frac{\partial H}{\partial q_i}\right).\]
Let us note that Liouville's theorem is a general statement on a statistical system, within the precinct of the reasonable assumptions we made in its definition and in the subsequent derivation of the theorem. (By "it" we mean the statistical system here.)
But the equilibrium condition \(\frac{\partial \rho}{\partial t} = 0\) is not sufficiently general; it may or may not be achieved by a system. The condition only suggests what equilibrium might look like. With these qualifications, a system will be in equilibrium (\(\frac{\partial \rho}{\partial t} = 0\)) if it is simultaneously amenable to Liouville's theorem, that is, if \(\{\rho, H\} = 0\).
One possible way of satisfying the theorem while the system is found in equilibrium is for \(\rho\) to be independent of \(q\) and \(p\), the canonical coordinates and canonical momenta, respectively.
In that case the Poisson bracket vanishes identically, as \(\frac{\partial \rho}{\partial q_i} = \frac{\partial \rho}{\partial p_i} = 0\); hence \(\{\rho, H\} = 0\) and \(\frac{\partial \rho}{\partial t} = 0\).
A system of ensembles which satisfies the above conditions, i.e. whose phase space density is given as follows, is known as a "microcanonical ensemble": \[\rho(q, p) = \begin{cases} \text{const.} & \text{if } E \le H(q, p) \le E + \Delta, \\ 0 & \text{otherwise.} \end{cases}\]
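In code, the microcanonical density is simply an indicator function on the energy shell. A minimal sketch (the 1D toy Hamiltonian and the parameter values are my own illustrative assumptions):

```python
# Microcanonical density as an indicator on the shell E <= H <= E + Delta.
import math

def H(q, p):
    return 0.5 * p * p + 0.5 * q * q   # toy 1D Hamiltonian, m = k = 1

def rho_microcanonical(q, p, E=1.0, delta=0.01, const=1.0):
    """Constant inside the energy shell, zero outside."""
    return const if E <= H(q, p) <= E + delta else 0.0

print(rho_microcanonical(q=0.0, p=math.sqrt(2.0)))  # H = 1.0: on the shell
print(rho_microcanonical(q=0.0, p=0.0))             # H = 0.0: off the shell
```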
A. We have already discussed that the macrostates are defined well enough by only 3 independent parameters: \((N, V, E)\).
B. For the system to be isolated we do not let its energy be altered. We make a small concession, though, by allowing the energy to vary within \((E, E + \Delta)\), or equivalently \((E - \frac{\Delta}{2}, E + \frac{\Delta}{2})\), where \(\Delta\) is an arbitrarily small window in the energy of the system. Usually we take \(\Delta \ll E\).
C. For isolation, \(N\) and \(V\) should be strictly invariant. But \(E\) can be allowed to vary, within the small window \(\Delta\).
D. For a "microcanonical ensemble" \((N, V, E)\), the average total momentum of the system is zero.
E. Let \(\Gamma(E)\) be the phase space volume occupied by the microcanonical ensemble. (Another text defines it as \(\omega(E)\), but we reserve \(\omega\) for a different purpose here: CAUTION, "wet floor".)
i.e. the phase space volume constrained by the energy \(E\) of the macrostate: \[\Gamma(E) = \int_{E \le H(q, p) \le E + \Delta} d^{3N}q\, d^{3N}p.\] \(\Gamma\) thus depends on \(E\) and \(\Delta\).
F. Let us define \(\Sigma(E)\), the volume in phase space enclosed by the hypersurface constrained by \(H(q, p) = E\): \[\Sigma(E) = \int_{H(q, p) < E} d^{3N}q\, d^{3N}p.\]
G. Thus \(\Gamma(E) = \Sigma(E + \Delta) - \Sigma(E)\).
If \(\Delta \ll E\), then \(\Gamma(E) = \omega(E)\, \Delta\), where \(\omega(E) = \frac{\partial \Sigma(E)}{\partial E}\).
\(\omega(E)\) is called the density of states, that is, the number of states (here microstates, or phase points) per unit range of energy.
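The relation \(\Gamma(E) = \omega(E)\,\Delta\) for \(\Delta \ll E\) can be checked on a toy phase space volume. The sketch below assumes the ideal-gas form \(\Sigma(E) \propto E^{3N/2}\), with the prefactor set to 1 for illustration (my own assumption, not a formula from the lecture):

```python
# Numerical check that Gamma(E) = Sigma(E + Delta) - Sigma(E) ~ omega(E) * Delta
# for Delta << E, using the toy form Sigma(E) = E**(3N/2).
N = 10                       # hypothetical small particle number
n = 3 * N / 2                # exponent of Sigma(E) for an ideal gas

def Sigma(E):
    return E ** n            # enclosed phase-space volume (prefactor dropped)

def omega(E):
    return n * E ** (n - 1)  # density of states, dSigma/dE

E, Delta = 1.0, 1e-6         # Delta << E
Gamma = Sigma(E + Delta) - Sigma(E)
approx = omega(E) * Delta

rel_err = abs(Gamma - approx) / Gamma
print(f"Gamma       = {Gamma:.6e}")
print(f"omega*Delta = {approx:.6e}")
print(f"rel. error  = {rel_err:.1e}")   # shrinks as Delta/E -> 0
```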
H. Entropy is defined by \(S = k \ln \Gamma(E)\); we already derived this relation in lectures — I and — II of this lecture series. Entropy defined this way must be consonant with its thermodynamic definition.
- Entropy is extensive; for an interesting proof of this, see Kerson Huang (textbook).
- The statistical definition of entropy is in line with the 2nd law of thermodynamics; we already discussed this in our first two lectures, linked above.
I. The following 3 definitions of entropy, based on the 3 different definitions of the relevant phase space volume (which is the same as saying accessible microstates), are equivalent definitions of entropy, valid within an error of order \(\ln N\) or smaller: \[S = k \ln \Gamma(E), \qquad S = k \ln \Sigma(E), \qquad S = k \ln \omega(E).\]
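Why the three definitions agree: with the toy form \(\Sigma(E) \propto E^{3N/2}\) used above (my own illustrative assumption), \(\ln \Sigma\) is of order \(N\), while \(\ln \Sigma - \ln \omega\) is at most of order \(\ln N\), and \(\ln \Gamma = \ln \omega + \ln \Delta\) differs from \(\ln \omega\) only by a constant. A quick numerical check:

```python
# With Sigma(E) = E**(3N/2), the entropies k ln Sigma, k ln Gamma and
# k ln omega differ only by terms of order ln N, negligible against the
# leading O(N) part for large N. (Toy model, prefactors dropped.)
import math

N = 10_000                   # hypothetical particle number
n = 3 * N / 2
E = float(N)                 # extensive energy, E ~ N

log_Sigma = n * math.log(E)                       # ln Sigma(E) = n ln E
log_omega = math.log(n) + (n - 1) * math.log(E)   # ln omega(E) = ln(n E^(n-1))

diff = abs(log_Sigma - log_omega)                 # = |ln E - ln n|
print(f"ln Sigma              = {log_Sigma:.1f}  (order N)")
print(f"|ln Sigma - ln omega| = {diff:.3f}  (at most order ln N)")
```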