Thursday, November 10, 2016

statistical mechanics - Entropy definition, additivity, laws in different ensembles


I started studying statistical mechanics, specifically the microcanonical ensemble, which consists of defining quantities such as temperature and entropy and deriving some thermodynamic relations. After I finished it, I moved on to the canonical ensemble, and many questions came to my mind. I will ask several of them here; if that is inappropriate, please tell me and I will split them into separate posts.


To better explain what concerns me, I will write each question together with the thoughts I have about it.



In the microcanonical ensemble, many books consider an isolated system with some fixed energy and then divide it into two subsystems (system 1 and system 2) separated by a diathermal wall, i.e. a wall through which they can exchange energy as heat.



  1. If energy of the isolated system is $E$, volume is $V$ and number of particles is $N$, then entropy $S$ is defined as $S(E,V,N) = k_{B}\ln{\Omega(E,V,N)}$. Here, $k_{B}$ is Boltzmann's constant and ${\Omega}$ is the number of microstates that correspond to the specific macroscopic parameters mentioned previously. For which of the three systems is entropy defined this way?


I am fairly sure that entropy is defined this way for the two subsystems separated by the diathermal wall, but I am much less sure about the isolated system as a whole. On physical grounds I would assume that the definition applies to the isolated system too, since no system should be privileged over another. On the other hand, if I assume this is true, it leads me to a contradiction in the later questions.
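To make sure I understand what the definition even says, here is a minimal sketch I wrote for myself (a toy model of my own choosing, not taken from any textbook): $N$ distinguishable two-level particles with single-particle energies $0$ or $\varepsilon$, so that a macrostate with total energy $E = m\varepsilon$ has $\Omega = \binom{N}{m}$ microstates.

```python
# Toy check of S = k_B * ln(Omega) for N two-level particles (energies 0 or eps),
# where a macrostate with total energy E = m*eps has Omega = C(N, m) microstates.
from math import comb, log

k_B = 1.380649e-23  # Boltzmann's constant, J/K

def entropy(N, m):
    """S(E, N) = k_B * ln(Omega(E, N)) with E = m*eps in the toy model."""
    omega = comb(N, m)      # number of microstates compatible with this macrostate
    return k_B * log(omega)

print(entropy(100, 30))     # entropy of an isolated system of 100 particles, E = 30*eps
```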



  2. Why is the entropy of the isolated system equal to the sum of the entropies of its subsystems, $S = S_1 + S_2$?


I think I can derive this when the subsystems are isolated from each other (see the derivation sketched below), and I think I can derive it as an approximation when both subsystems are in equilibrium, but I have no idea how to handle the transition to equilibrium. I am also not sure whether it should hold at all: the macroscopic parameters of an isolated system do not change, so its entropy should not change either, yet it is somehow "derived" that the entropy of an isolated system increases. Isn't that a contradiction? On the other hand, if $S$ denotes the entropy of the isolated system given that I know subsystem 1 has, say, energy $E_1$, volume $V_1$ and number of particles $N_1$, then I understand the statement. But wouldn't that mean that to know the entropy of a system I have to know everything about its internal structure, i.e. every subsystem, which can in turn be divided into further subsystems, and so on?
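For completeness, this is the derivation of additivity I have in mind for the only case I am comfortable with, namely subsystems that do not interact at all (fixed $E_1, V_1, N_1$ and $E_2, V_2, N_2$):

$$\Omega = \Omega_1(E_1,V_1,N_1)\,\Omega_2(E_2,V_2,N_2)
\quad\Longrightarrow\quad
S = k_B\ln\big(\Omega_1\Omega_2\big) = k_B\ln\Omega_1 + k_B\ln\Omega_2 = S_1 + S_2,$$

since every microstate of the composite system is just a pair (microstate of 1, microstate of 2). With the diathermal wall present, I only see how to get this approximately in equilibrium, and I do not see what happens during the transition.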




  3. Why is $dS_{\text{isolated system}} \geq 0$?


I am aware that this is an experimental fact, that it has a probabilistic nature, and that for some systems it does not hold. Is there a theoretical model that can motivate this inequality? Or is there a set of conditions (known from experiments, for example) that must hold for it to be true?
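To illustrate the kind of probabilistic motivation I am hoping for, here is a toy numerical sketch I put together myself (two "Einstein solids" sharing energy quanta; this is my own construction, not from a specific textbook): the composite multiplicity is so sharply peaked that a macrostate far from the peak is overwhelmingly likely to evolve toward it, which I read as a statistical version of $dS \geq 0$.

```python
# Two "Einstein solids" with N1 and N2 oscillators sharing q_total energy quanta.
# Omega_i(q) = C(q + N_i - 1, q) counts the microstates of solid i holding q quanta.
from math import comb, log

N1, N2, q_total = 300, 300, 600

def ln_omega(N, q):
    """ln of the number of ways to distribute q quanta over N oscillators."""
    return log(comb(q + N - 1, q))

# Total entropy (in units of k_B) as a function of how many quanta sit in solid 1.
S = [ln_omega(N1, q1) + ln_omega(N2, q_total - q1) for q1 in range(q_total + 1)]

q1_start = 50                                            # a lopsided initial split
q1_eq = max(range(q_total + 1), key=lambda q1: S[q1])    # the most probable split
print("most probable split q1 =", q1_eq)                 # ~ q_total/2 for equal solids
print("S/k_B at start:", S[q1_start], " at the peak:", S[q1_eq])
```

This only shows that the equilibrium macrostate is vastly more probable; it does not tell me under which conditions the monotonic increase is guaranteed, which is what I am asking.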



  4. Are temperature, pressure, chemical potential, etc., which are defined as derivatives of entropy with respect to its natural variables, also defined out of equilibrium?


From my high-school studies I am fairly sure (which means I don't actually know) that pressure and temperature should be defined even when the subsystems interact and are not in equilibrium. But if that is true, both the average energy and the entropy should be defined out of equilibrium as well. I have no problem with the average energy, just as it has no problem with me. But what about entropy? I understand why we use the formula that connects entropy to the number of microstates in equilibrium (assuming the dominance of one macrostate), but what about non-equilibrium, where I do not even know what energy each subsystem has at each instant of time?


REMARK: I know the answers to all of these questions if I assume that the systems consist of identically distributed particles, take the limit in which the number of particles in each subsystem tends to infinity, and then approximate the equilibrium energy distribution of each subsystem by $\delta(E - C)$, where $C$ depends on the system. I thought this is what actually happens, but then I started reading about the canonical ensemble, where one of the systems is assumed to be microscopic, so the previous arguments cannot be used (the number of particles can be one, which is nowhere near infinity).
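To spell out what I mean by the delta-distribution approximation: for a macroscopic subsystem in equilibrium I expect the relative width of its energy distribution to vanish in the thermodynamic limit, roughly

$$\frac{\sqrt{\langle E^2\rangle - \langle E\rangle^2}}{\langle E\rangle} \sim \frac{1}{\sqrt{N}} \;\longrightarrow\; 0 \quad (N \to \infty),$$

so that its energy distribution is effectively $\delta(E - C)$ with $C = \langle E\rangle$. This is exactly the step that breaks down when one of the systems is microscopic.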



  5. In the canonical ensemble the energy distribution of a subsystem is not a delta distribution, so we talk about average energies. But then how, and why, is entropy defined?



Do I take into account only the microstates whose energy equals the average energy? But then different energy distributions could have the same entropy, which is counterintuitive to me. On the other hand, if entropy is some other function, what is it, and why is it the way it is? To me it would make sense to define the entropy of a system as something like an average of the entropies of the individual energy macrostates (made explicit below), because then I can calculate it and the energy distribution matters. But then how is it connected to the average energy? Surely I have to take a derivative with respect to it.
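To be explicit about the candidate definition I have in mind: with $p(E)$ the probability that the subsystem has energy $E$ in the canonical ensemble, I am imagining something like

$$S \stackrel{?}{=} \sum_{E} p(E)\, k_B \ln \Omega(E),$$

i.e. the average over energy macrostates of the microcanonical entropy of each macrostate, rather than $k_B \ln \Omega(\langle E\rangle)$ alone. I do not know whether this is the definition that is actually used, or how it would connect to the average energy.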


TL;DR: What is entropy? What is the meaning of life?


I thank everyone who has read my post and helped me!



