Friday, August 15, 2014

statistical mechanics - Boltzmann entropy



The Boltzmann entropy is defined as the logarithm of the phase space volume $\Omega(E)$. Is there a reference (a book or paper) that shows where this definition comes from and why it equals the logarithm of the phase space volume?



Answer



I am not very sure about this, but here is an attempt.


Well, there is a connection you can make with the Shannon entropy. By the equal a priori probability principle of microstates, the probability of each state is
$$ p = \frac{1}{\Omega} $$ In information theory, given a set of events $ \{X_1,\dots,X_n\} $ with probabilities $ \{P_1,\dots,P_n\} $,


the Shannon information of the $i^{th}$ event is defined by $$ I_i = -\log_2 P_i $$


From the definition you can see that the less probable an event is, the more information it carries. This is the motivation for the definition.
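A minimal sketch of this definition in Python (the function name is my own choice, not from the original text):

```python
import math

def shannon_information(p):
    """Information content, in bits, of an event with probability p."""
    return -math.log2(p)

# A rare event carries more information than a common one.
print(shannon_information(0.5))    # 1.0 bit
print(shannon_information(0.125))  # 3.0 bits
```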


Now the Shannon entropy is defined as the average information over the given set of events.



For a probability distribution, the average of a quantity $Q$ is defined by $$ \langle Q \rangle = \sum\limits_{i=1}^n P_iQ_i $$


So the average information (the entropy) is given by


$$ S_{\text{Shannon}} = \langle I \rangle = -\sum\limits_{i=1}^n P_i\log_2 P_i $$


As an exercise you can verify that this average information is maximised when $$ P_i = \frac{1}{n} \quad \forall \: i = 1,\dots,n $$ which is exactly the content of the a priori principle: entropy is maximised when all states are equally likely. (Begin by setting $ \delta S = 0 $ subject to the constraint $ \sum_i P_i = 1 $.)
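You can also check the maximisation claim numerically. A small sketch (the distributions here are illustrative, not from the original):

```python
import math

def shannon_entropy(probs):
    """Average information S = -sum p_i log2(p_i); terms with p = 0 contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

n = 4
uniform = [1.0 / n] * n
skewed = [0.7, 0.1, 0.1, 0.1]

# The uniform distribution maximises the entropy: log2(4) = 2 bits.
print(shannon_entropy(uniform))  # 2.0
print(shannon_entropy(skewed))   # smaller than 2.0
```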


The base-2 logarithm is a convenient choice in information theory.


However, we can carry this idea over to statistical mechanics by switching to the natural logarithm and using the first equation, where $ \Omega $ is the total number of microstates (the phase space volume divided by $ h $, the unit volume element of phase space).


$$ S_{\text{Boltzmann}} \propto \ln \Omega \implies S = k_B \ln\Omega $$


where $ k_B $ is the proportionality constant; it absorbs the conversion from base-2 to natural logarithm and carries the physical units of entropy.
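The base change is just a constant prefactor, which you can check numerically (the value of $\Omega$ below is an arbitrary illustration):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)

omega = 10**20  # hypothetical number of microstates

# The entropy is the same physical quantity whether we count in nats or bits;
# only the prefactor changes: k_B * ln(omega) == (k_B * ln 2) * log2(omega).
S_natural = k_B * math.log(omega)
S_from_bits = (k_B * math.log(2)) * math.log2(omega)
print(S_natural, S_from_bits)  # equal up to floating-point rounding
```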


[EDIT 1]


A detailed explanation is quite involved, but as a first pass: a microstate is a particular set of values $(p,q)$ of momentum and position in phase space. One set $(p,q)$ describes one physical state of the system. Now, since phase space is continuous, we cannot count every point (there are infinitely many). From quantum mechanics, we have



$$ \Delta x\Delta p \ge \hbar $$ From this we deduce that the smallest area element in phase space (i.e. $\Delta x \, \Delta p$) is of order $h$ (Planck's constant). Having discretised the space, we can count states by counting the number of smallest boxes (cells of area $h$) inside the region allowed by the energy condition.


In proper mathematical language (for a 2D phase space):


$$ \Omega = \int_{H\le E} \frac{dp\, dq}{h} $$ where $E$ is the energy of the system and $H$ the Hamiltonian. The integration runs over the region of phase space enclosed by the constant-energy surface $H = E$.
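As a concrete sketch of this counting, here is a Monte Carlo estimate of the phase-space area $\int_{H\le E} dp\,dq$ for a 1D harmonic oscillator (my own example; the parameters and units are illustrative, and the exact area of the ellipse $H \le E$ is $2\pi E/\omega$):

```python
import math
import random

# Hypothetical 1D harmonic oscillator: H(p, q) = p^2/(2m) + m w^2 q^2 / 2.
m, w, E = 1.0, 1.0, 1.0  # illustrative units

# The region H <= E is an ellipse with semi-axes p_max and q_max.
p_max = math.sqrt(2 * m * E)
q_max = math.sqrt(2 * E / (m * w**2))

# Monte Carlo: sample the bounding box and count points with H <= E.
random.seed(0)
N = 100_000
hits = 0
for _ in range(N):
    p = random.uniform(-p_max, p_max)
    q = random.uniform(-q_max, q_max)
    if p**2 / (2 * m) + 0.5 * m * w**2 * q**2 <= E:
        hits += 1

area = (2 * p_max) * (2 * q_max) * hits / N
print(area)                 # ~ 2*pi*E/w, the exact ellipse area
print(2 * math.pi * E / w)  # exact value for comparison

# Dividing this area by h would then count the cells of size h inside H <= E,
# i.e. Omega.
```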


In simpler language, this connects to the familiar information-theoretic picture via $ n \rightarrow \Omega $, which also gives you the a priori principle of maximum entropy.


