Lectures on statistical mechanics. Lecture — IV. Microcanonical ensemble.


Microcanonical ensemble

This lecture, the 4th in the series of statistical mechanics lectures (a paper for the physics honors degree class), was delivered on 10 January 2018.

You can find the previous lectures here ( Lecture — I, II ) and here ( Lecture — III ).


Topics covered in this lecture

a. Recapitulation of some previous ideas, with important remarks

b. Microcanonical ensemble — definition and properties 

c. Some basic parameters and formalism 


Recapitulation and remarks


In our previous lecture we defined the phase space density or distribution function \rho \hspace{2pt}(q, \hspace{2pt} p; \hspace{3pt}t) for a classical statistical system with an aim to connect it to a thermodynamic system.

We saw that an ensemble system would be stationary if \rho does not have any explicit time dependence, i.e. if \frac{\partial \rho}{\partial t} =0.

Remarks

  1. The type of general ensemble we defined, as mental copies of the actual system occupying each possible microstate, can be called a Gibbsian ensemble.
  2. The condition that the statistical density is stationary, i.e. time-independent, represents equilibrium.
  3. We defined the ensemble average of a physically measurable quantity f(q, p)  as: <f> \hspace{3pt} = \hspace{3pt} \frac{\int f(q, \hspace{2pt}p) \rho (q, \hspace{2pt}p; \hspace{3pt}t) \big( d^{3N}q \big) \hspace{3pt} \big( d^{3N}p \big)}{\int \rho (q, \hspace{2pt}p;\hspace{3pt} t) \big( d^{3N}q \big) \hspace{3pt} \big( d^{3N} p \big)}    One can also define a “most probable” value of f(q, p) : the value assumed by the largest number of systems in the ensemble.

    A. The two types of average mentioned here are nearly equal if the mean-square fluctuation is small. Recall that the mean-square fluctuation is also known as the dispersion; it is the square of the standard deviation. This is the condition imposed for a statistical construct to be meaningful for thermodynamic prediction. Mathematically: \frac{<f^2>-< f > ^2}{< f > ^2} \ll 1 .


    B. In a stationary ensemble, i.e. in equilibrium, <f(q,p)>  is independent of time: <f> = \frac{1}{\tau} \int\limits_{\tau}f(q, p) d\tau , where \tau  is the phase space volume constrained by a given macrostate. We will meet this condition again later. Note that we will later denote such relevant phase space volumes by other symbols. One should keep the various symbols in mind and avoid confusion, as they change from one textbook to another. The important thing to remember: the meaning is important, the symbol not so much.


  4. We derived a general condition on a statistical system, described by specific macrostate variables and evolving through the canonical equations of motion: Liouville’s theorem. See lecture — III for this theorem and its derivation.
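The criterion in remark 3A can be illustrated numerically. Below is a toy sketch (the Gaussian “momenta” and all parameters are illustrative assumptions, not a system from the lecture): for an extensive quantity f, the relative dispersion falls off as 1/N, so the ensemble average becomes sharp for large systems.

```python
import numpy as np

# Toy ensemble: M copies of a system of N independent Gaussian degrees of
# freedom; f is an extensive quantity, here the sum of squared "momenta".
rng = np.random.default_rng(0)
M, N = 5000, 400
p = rng.normal(size=(M, N))
f = (p**2).sum(axis=1)

mean_f = f.mean()
dispersion = (f**2).mean() - mean_f**2   # mean-square fluctuation <f^2> - <f>^2
rel = dispersion / mean_f**2             # the criterion: should be << 1, here ~ 2/N

print(rel)   # of order 1/N, so the ensemble average is thermodynamically meaningful
```

Making f a sum over many independent parts is what drives the relative dispersion down; a non-extensive quantity would not sharpen this way.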

Now we are ready to discuss what a microcanonical ensemble is.


Microcanonical ensemble


Let us rewrite Liouville’s theorem:

\boxed{ \frac{d\rho}{dt} = \frac{\partial \rho}{\partial t} + [\rho, H] = 0} 

where the Poisson bracket is given by; [\rho, H] = \sum \limits_{i=1}^{3N} \big( \frac{\partial \rho}{\partial q_i} \frac{\partial H}{\partial p_i} - \frac{\partial \rho}{\partial p_i}  \frac{\partial H}{\partial q_i} \big)= \sum \limits_{i=1}^{3N} \big( \frac{\partial \rho}{\partial q_i} \dot{q}_i + \frac{\partial \rho}{\partial p_i}  \dot{p}_i \big)

Let us note that Liouville’s theorem is a general statement about a statistical system, within the reasonable assumptions we made in defining the system and subsequently in deriving the theorem.

But the equilibrium condition \frac{\partial \rho}{\partial t} = 0 is not sufficiently general: it may or may not be achieved by a system. The condition only suggests what equilibrium looks like. With these qualifications, a system will be in equilibrium ( \frac{\partial \rho}{\partial t} = 0 ) if it simultaneously satisfies Liouville’s theorem.

One possible way of satisfying the theorem while the system is in equilibrium is for \rho (q, p) to be independent of q and p, the canonical coordinates and canonical momenta, respectively.

In that case [\rho, H]=0, since \frac{\partial \rho}{\partial q_i} = \frac{\partial \rho}{\partial p_i}=0.

An ensemble which satisfies the above conditions, i.e. whose phase space density is given as follows, is known as a “microcanonical ensemble“.

Microcanonical ensemble: \rho (q, p) = \begin{cases} \text{const}, & \text{if } E < H(q, p) < E + \Delta \\ 0, & \text{otherwise} \end{cases}
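As a minimal sketch of this definition (assuming a toy 1D harmonic oscillator with unit mass and frequency as the system; the values of E, \Delta and the constant are arbitrary choices for illustration):

```python
def H(q, p):
    """Hamiltonian of a 1D harmonic oscillator with unit mass and frequency (a toy system)."""
    return 0.5 * (p**2 + q**2)

def rho(q, p, E=2.0, delta=0.1, const=1.0):
    """Microcanonical density: constant inside the shell E < H < E + delta, zero outside."""
    return const if E < H(q, p) < E + delta else 0.0

print(rho(2.02, 0.0))  # H = 2.0402, inside the shell (2, 2.1): density is const
print(rho(0.0, 0.0))   # H = 0, outside the shell: density is 0
```

The density is uniform over the thin energy shell and vanishes everywhere else; the normalization constant is fixed by the shell volume \Gamma (E), discussed below.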

Remarks


A. We have already discussed that the macrostates are defined well enough by only 3 independent parameters: N, V and E.


B. For the system to be isolated we do not allow its energy E to be altered. We make a small concession, though, by allowing it to vary within (E, E+\Delta),  or equivalently (E-\frac{\Delta}{2}, E+\frac{\Delta}{2}), where \Delta is an arbitrarily small window of energy of the system. Usually we take \Delta \ll E.


C. For isolation, N should be strictly invariant, but V can be allowed to change.


D. For a “microcanonical ensemble” ( \rho = constant ), the average total momentum of the system is zero.


E. Let \Gamma (E) be the phase space volume occupied by the microcanonical ensemble. Another text defines it as \omega , but we reserve \omega for a different purpose here, so take caution with the notation.


\Gamma (E)=\int\limits_{E<H(q,p)<E+\Delta} d^{3N}q d^{3N}p

i.e. the phase space volume constrained by the energy of the macrostate. \Gamma (E) thus also depends on N, V and \Delta .


F. Let us define \Sigma (E), the volume in phase space enclosed by the hypersurface constrained by H (q, p) = E: \Sigma (E)=\int\limits_{H(q,p)<E} d^{3N}q \hspace{2pt} d^{3N}p


G. Thus \Gamma (E)=\Sigma (E+\Delta) - \Sigma (E).

If \Delta \ll E, then \Gamma (E)=\lim_{\Delta \to 0} \frac{\Sigma(E+\Delta) - \Sigma (E)}{\Delta}\Delta=\omega(E)\Delta, where \omega(E)=\frac{\partial \Sigma (E)}{\partial E}.
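This relation can be checked numerically. A toy Monte Carlo sketch (assuming a single 1D harmonic oscillator with H = (q^2 + p^2)/2, for which the area enclosed by H = E is a disk of radius \sqrt{2E}, i.e. 2\pi E, so the density of states is exactly 2\pi; all parameters are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(42)
E, delta = 2.0, 0.05
L = 3.0                 # half-width of a sampling box large enough to contain the shell
M = 1_000_000           # number of Monte Carlo samples

# Sample phase-space points uniformly in the box and count those in the shell.
q = rng.uniform(-L, L, M)
p = rng.uniform(-L, L, M)
H = 0.5 * (q**2 + p**2)
box_area = (2 * L)**2

gamma = box_area * np.mean((E < H) & (H < E + delta))   # estimate of Gamma(E)
omega = 2 * np.pi                                       # exact dSigma/dE for this system

print(gamma, omega * delta)   # both close to 0.314
```

For this oscillator \Sigma (E) is linear in E, so \Gamma (E) = \omega (E) \Delta holds exactly; the only discrepancy is Monte Carlo noise.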

\omega (E) is called the density of states — that is, the number of states (here microstates, or phase points) per unit range of energy.
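As a standard reference example (the classical ideal gas, a textbook result quoted here for orientation; m is the particle mass, and there are no factors of h since we work with bare phase-space volume): for N free particles in a volume V, the region H(q, p) < E is a 3N-dimensional sphere of radius \sqrt{2mE} in momentum space, so

\Sigma (E) = V^N \hspace{2pt} \frac{(2\pi m E)^{3N/2}}{\Gamma \big( \frac{3N}{2} + 1 \big)}, \hspace{10pt} \omega (E) = \frac{\partial \Sigma (E)}{\partial E} = \frac{3N}{2E} \hspace{2pt} \Sigma (E)

where \Gamma \big( \frac{3N}{2} + 1 \big) in the denominator is the Euler gamma function, not the shell volume \Gamma (E). Again, caution with symbols.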


H. Entropy is defined by S(E, V) = k_B \hspace{2pt} \ln \hspace{2pt} \Gamma (E) ; we already derived this relation in lectures — I and — II of this lecture series. Entropy defined this way must be consonant with its thermodynamic definition.

  1. Entropy is extensive; for an interesting proof of this see Kerson Huang’s textbook.
  2. The statistical definition of entropy is in line with the 2nd law of thermodynamics; we already discussed this in our first two lectures, linked above.

I. The following 3 definitions of entropy, based on 3 different definitions of the relevant phase space volume (which is the same as saying the accessible microstates), are equivalent definitions of entropy, valid within an error of O\hspace{2pt}(\ln N) or smaller.

a. S = k_B \hspace{2pt} \ln \hspace{2pt} \Gamma (E) 

b. S = k_B \hspace{2pt} \ln \hspace{2pt} \omega (E) 

c. S = k_B \hspace{2pt} \ln \hspace{2pt} \Sigma (E).
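The equivalence can be made concrete with a small numeric sketch (assuming the ideal-gas scaling \Sigma (E) \propto E^{3N/2} with N-independent constants dropped; N, E and \Delta below are arbitrary illustrative choices, not from the lecture):

```python
import math

N = 10**6
E = math.e          # energy in arbitrary units, chosen so that ln E = 1
delta = 1e-3        # small energy window, Delta << E

# The three entropies per k_B, using Sigma(E) ~ E^(3N/2):
ln_sigma = 1.5 * N * math.log(E)             # ln Sigma(E)
ln_omega = math.log(1.5 * N / E) + ln_sigma  # omega(E) = (3N / 2E) * Sigma(E)
ln_gamma = math.log(delta) + ln_omega        # Gamma(E) = omega(E) * Delta

# ln Sigma is O(N), while the three differ only by terms of order ln N:
print(ln_sigma, ln_omega - ln_sigma, ln_gamma - ln_sigma)
```

The leading term is of order N, while switching between \Gamma , \omega and \Sigma shifts the entropy only by terms of order \ln N (and \ln \Delta ), which is why all three definitions are thermodynamically equivalent.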


 



Categories: basic physics, Courses I developed, Statistical Mechanics, Teaching
