Distribution and Entropy
Storyboard 
When analyzing the probability of finding the system in a particular state, we observe that the equilibrium condition ($\beta$) is an integral part of the distribution's structure. Furthermore, it becomes evident that the function that best models the system is the logarithm of the number of states, which is associated with what we will term entropy.
ID:(437, 0)
Forming a maximum
Definition 
When we multiply the numbers of states of the two systems, we obtain a function with a very pronounced peak.
The system is more likely to be found at the energy where the peak of the probability curve occurs.
ID:(11543, 0)
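The sharpness of this peak can be illustrated numerically. A minimal sketch, assuming two systems whose state counts grow as a power of the energy ($\Omega\sim E^f$, the free-particle form used later in this section), with arbitrarily chosen degrees of freedom $f$, $f'$ and total energy:

```python
import numpy as np

# Minimal sketch: two systems share a fixed total energy E_tot, with assumed
# state counts Omega ~ E^f and Omega' ~ E'^f' (f, fp, E_tot chosen arbitrarily).
f, fp, E_tot = 200, 300, 1.0

E = np.linspace(1e-4, E_tot - 1e-4, 100001)
# Work with the logarithm of the product to avoid numerical overflow:
log_P = f * np.log(E) + fp * np.log(E_tot - E)

E_peak = E[np.argmax(log_P)]
print(E_peak)                     # numerical location of the peak
print(f / (f + fp) * E_tot)       # analytic maximum at E = f/(f+f') * E_tot
```

Plotting `log_P` shows how narrow the peak becomes as $f$ and $f'$ grow.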
To study the behavior of the number of states function, we can expand it around the equilibrium energy value $\bar{E}$. If we do this expansion in the logarithm of the number of states, we obtain
$\ln\Omega(E)=\ln\Omega(\bar{E})+\displaystyle\frac{\partial\ln\Omega}{\partial E}\eta+\displaystyle\frac{1}{2}\displaystyle\frac{\partial^2\ln\Omega}{\partial E^2}\eta^2+\ldots$
where $\eta=E-\bar{E}$. Using the definition of the equilibrium parameter

$\beta\equiv\displaystyle\frac{\partial\ln\Omega}{\partial E}$

and

$\lambda\equiv-\displaystyle\frac{\partial^2\ln\Omega}{\partial E^2}=-\displaystyle\frac{\partial\beta}{\partial E}$
we obtain the expansion

$\ln\Omega(E)=\ln\Omega(\bar{E})+\beta\eta-\displaystyle\frac{1}{2}\lambda\eta^2+\ldots$

If we apply the same expansion to the number of states of both systems, we can estimate the logarithm of the probability:
$\ln P = \ln[\Omega(E)\Omega'(E')] = \ln\Omega(E) + \ln\Omega'(E') = \ln\Omega(\bar{E}) + \ln\Omega'(\bar{E'}) + (\beta - \beta')\eta - \frac{1}{2}(\lambda + \lambda')\eta^2\ldots$
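As a numerical sanity check of this expansion (again assuming the power-law form $\Omega\sim E^f$, with arbitrary $f$, $f'$), one can compare the exact log-probability with its quadratic approximation near $\bar{E}$:

```python
import numpy as np

# Sketch (assumed Omega ~ E^f): compare the exact log-probability with its
# second-order Taylor expansion around the equilibrium energy E_bar.
f, fp, E_tot = 200, 300, 1.0
E_bar = f / (f + fp) * E_tot          # equilibrium: beta = beta'

def log_P(E):
    return f * np.log(E) + fp * np.log(E_tot - E)

lam  = f / E_bar**2                   # lambda  = -d^2 lnOmega /dE^2
lamp = fp / (E_tot - E_bar)**2        # lambda' for the primed system

eta = 0.01 * E_bar                    # small deviation from equilibrium
exact  = log_P(E_bar + eta) - log_P(E_bar)
approx = -0.5 * (lam + lamp) * eta**2 # the (beta - beta') term vanishes here
print(exact, approx)                  # the two values nearly coincide
```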
For the case of equilibrium, both betas are equal, and the probability that one of the systems has an energy $E$ reduces to a Gaussian distribution,

$P(E) = P(\bar{E})\,e^{-\frac{1}{2}\lambda_0\eta^2}$

where

$\lambda_0 = \lambda + \lambda'$

The factor that defines the width of the probability curve is the quadratic factor in the Taylor series expansion: the distribution has standard deviation $\sigma = 1/\sqrt{\lambda_0}$.
It can be shown that the number $\lambda$ introduced above is always positive. An indication of this comes from the number-of-states function we have already calculated for the case of free particles. In that case, the number of states is proportional to the energy raised to the power of the number of degrees of freedom $f$, so with

$\Omega\sim E^f$

we have

$\lambda\equiv-\displaystyle\frac{\partial^2\ln\Omega}{\partial E^2}=-f\displaystyle\frac{\partial^2\ln E}{\partial E^2}=\displaystyle\frac{f}{E^2}$

which is indeed positive.
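With $\lambda\sim f/E^2$, the Gaussian width $1/\sqrt{\lambda_0}$ becomes negligible relative to $\bar{E}$ as the number of degrees of freedom grows. A sketch, assuming two identical power-law systems with arbitrarily chosen $f$:

```python
import numpy as np

# Sketch (assumed Omega ~ E^f): the Gaussian width 1/sqrt(lambda_0) shrinks
# relative to E_bar as the number of degrees of freedom f grows.
for f in (10, 1000, 100000):
    fp, E_tot = f, 1.0                 # two identical systems for simplicity
    E_bar = E_tot / 2                  # equilibrium energy
    lam0 = f / E_bar**2 + fp / (E_tot - E_bar)**2
    sigma = 1 / np.sqrt(lam0)          # standard deviation of the Gaussian
    print(f, sigma / E_bar)            # relative width = 1/sqrt(2 f)
```

For macroscopic $f\sim 10^{23}$ the relative fluctuation is utterly negligible, which is why the peak behaves like a sharp equilibrium value.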
The key parameter in the study of equilibrium is therefore the logarithm of the number of states. The natural logarithm of the number of states multiplied by the Boltzmann constant $k_B$ is defined as the system's entropy:

$S \equiv k_B\ln\Omega$
Combining the definition of $\beta$, the definition of temperature ($\beta = 1/k_BT$), and the definition of entropy ($S = k_B\ln\Omega$) leads us to a thermodynamic relationship that indicates how the temperature $T$ is related to the variation of the entropy with the energy:

$\displaystyle\frac{1}{T} = \displaystyle\frac{\partial S}{\partial E}$
With the definition of entropy as $S = k_B\ln\Omega$, and considering that in equilibrium the probability, and hence the number of states, is at its maximum, we conclude that in equilibrium the entropy $S$ must always be a maximum.
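This maximization can be checked numerically for the assumed power-law state counts: the total entropy is maximal exactly where the two values of $\beta$ (and hence the temperatures) of the subsystems coincide.

```python
import numpy as np

# Sketch (assumed Omega ~ E^f): the total entropy S + S' is maximal exactly
# where the two beta = d lnOmega/dE values (i.e. the temperatures) coincide.
f, fp, E_tot = 200, 300, 1.0
E = np.linspace(1e-4, E_tot - 1e-4, 100001)

S_tot = f * np.log(E) + fp * np.log(E_tot - E)   # total entropy in units of k_B
E_max = E[np.argmax(S_tot)]

beta  = f / E_max                   # beta  = d lnOmega /dE  at the maximum
betap = fp / (E_tot - E_max)        # beta' for the primed system
print(E_max, beta, betap)           # the betas agree at the entropy maximum
```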
ID:(437, 0)
