Thermodynamic probability of the system state. Entropy and thermodynamic probability

Over time, the molecules of a physical system, interacting with one another, change their energy and move from one energy interval to another. Although the energy of individual particles changes over time, in an equilibrium system the number of particles in each of the selected energy intervals remains on average the same; there is a kind of continuous rearrangement of particles across all possible energy states.

If in some way the molecules could be made distinguishable and assigned serial numbers, then one could see that the same equilibrium state of the system (macrostate) is realized not by one, but by many microstates of the system, each characterized by a certain distribution of particles over energy levels. The number of microstates by which a given macrostate can be realized is called the thermodynamic probability. For the equilibrium state the thermodynamic probability is maximum. The spontaneous transition of a system from a nonequilibrium state to an equilibrium state is explained by the fact that the system tends toward the state with the maximum value of the thermodynamic probability. It should be noted that, unlike mathematical probability, which is always a proper fraction, thermodynamic probability is a whole number (usually a very large one). All theorems established for mathematical probability (the theorems of addition and multiplication of probabilities, etc.) are applicable to thermodynamic probability. The calculation of thermodynamic probability is rather involved.

In classical statistics, the number of microstates is determined as follows. The molecules are distributed over all energy levels, and the number of all possible permutations of these molecules, counting both permutations in which a molecule moves from one energy level to another and those in which molecules at the same energy level change places, is, according to combinatorics, equal to the product of the natural series of numbers from one to N:

N! = 1 · 2 · 3 · … · N. (39.1)

To determine the thermodynamic probability, the rearrangements of molecules within each energy level must be excluded from (39.1). (According to classical statistics, permutations inside energy cells do not give new microstates.) There are N1! such permutations inside the first energy cell, N2! inside the second, and so on. To eliminate them, (39.1) must be divided by the product N1!·N2!·…. This is how thermodynamic probability is determined in classical statistics:

W = N!/∏ Ni!, (39.2)

where ∏ Ni! is the product of the factorials taken over all energy levels.

To illustrate (39.2), consider the distribution of six particles over two cells (Table IV). Without taking the distinguishability of particles into account, in this example there are seven possible ways (macrostates) of distributing them over the two cells. Each of them corresponds to a certain number of microstates (the number of permutations between cells). For the seventh macrostate, with equal numbers of particles in each cell, the thermodynamic probability is highest and equals 20 (when calculating according to (39.2), it should be remembered that 0! = 1).

Table V shows the microstates that implement the macrostate with one and five particles in two cells.
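As a cross-check of Tables IV and V, here is a minimal Python sketch (the particle labels "A"-"F" are chosen purely for illustration) that computes W from (39.2) for all seven macrostates of six particles in two cells and enumerates the microstates of the (5, 1) macrostate:

```python
from math import factorial
from itertools import combinations

def W(occupancies):
    """Thermodynamic probability (39.2): N! divided by the product
    of the factorials of the occupation numbers of the cells."""
    w = factorial(sum(occupancies))
    for n in occupancies:
        w //= factorial(n)
    return w

# Table IV: the seven macrostates of six particles in two cells
for n1 in range(7):
    print((n1, 6 - n1), W((n1, 6 - n1)))   # W peaks at 20 for (3, 3)

# Table V: the six microstates realizing the (5, 1) macrostate,
# one for each choice of the particle that sits alone in cell 2
for lone in combinations("ABCDEF", 1):
    print("cell 2 holds:", lone[0])
```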

The state of the system is characterized not only by the energy distribution of the particles, but also by their distribution in space. Let us find the distribution of ideal gas molecules in a volume V.

Table IV (see scan) Thermodynamic probability for a system of six particles and two energy cells

Table V (see scan) Microstates for six particles and two cells, when one of them contains five particles

Divide the volume into m cells v1, v2, …, vm, so that v1 + v2 + … + vm = V. Let us first assume that all molecules are distinguishable. Then the probability that one given molecule lands in cell i equals pi = vi/V (§ 28). The probability that x labeled particles land in the specified cell is pi^x. The distribution of labeled particles over all the elementary volumes, in which the number of particles in volume vi is Ni, is a complex event, and its probability is determined through the product of the probabilities of filling the individual elementary volumes:

p1^N1 · p2^N2 · … · pm^Nm.

To determine the probability of a particle distribution over the elementary volumes without taking their individual characteristics into account, the above expression should be multiplied by the thermodynamic probability (39.2):

P = (N!/∏ Ni!) · p1^N1 · p2^N2 · … · pm^Nm. (39.3)

Let all cells be identical: v1 = v2 = … = vm = v; then pi = v/V = 1/m. Introducing the average spatial concentration of molecules n0 = N/V and using the relation N̄ = n0·v = N/m, we rewrite (39.3) in the form

P = (N!/∏ Ni!) · m^(−N),

where N̄ is the average number of particles per selected elementary volume of the system. Since N1 + N2 + … + Nm = N, the previous expression can be represented in the form

P = (N!/N^N) · ∏ (N̄^Ni / Ni!).

Using Stirling's formula N! ≈ (N/e)^N, from the last expression we obtain

P ≈ ∏ (N̄^Ni · e^(−N̄) / Ni!). (39.4)

If we record the numbers of particles in the cells (the occupation numbers) at equal, arbitrarily chosen time intervals, then formula (39.4) gives the probability of detecting any one of the predetermined distributions in such experiments. Each of the factors of (39.4), according to the law of multiplication of probabilities (§ 28), should be interpreted as the probability of detecting a given number Ni of particles in an elementary cell of the system when the states are recorded at equal, arbitrary intervals of time:

P(Ni) = N̄^Ni · e^(−N̄) / Ni!. (39.5)

The cell with number i is no different from the others; therefore a formula similar to (39.5) characterizes the averaged picture of the distribution of particles over all the elementary volumes of the system.
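The Poisson form assumed in the reconstruction of (39.5) above can be checked numerically against the exact (binomial) probability that a given cell out of m equal cells holds x of the N molecules. A minimal Python sketch, with N = 600 and m = 100 assumed purely for illustration:

```python
import math

N, m = 600, 100      # molecules and equal cells (illustrative values)
nbar = N / m         # average occupancy per cell, here 6

def exact_pmf(x):
    """Exact probability that a given cell holds x molecules:
    each of N molecules lands in it independently with probability 1/m."""
    return math.comb(N, x) * (1 / m) ** x * (1 - 1 / m) ** (N - x)

def poisson_pmf(x):
    """The Poisson form of (39.5): nbar^x * exp(-nbar) / x!."""
    return nbar ** x * math.exp(-nbar) / math.factorial(x)

for x in range(0, 13, 2):
    print(x, round(exact_pmf(x), 5), round(poisson_pmf(x), 5))
# the two columns agree to within a few per cent, improving as m and N grow
```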

Lecture No. 4. The second law of thermodynamics. Entropy.

The first law of thermodynamics establishes the equivalence of the various kinds of energy and the forms of its transfer. The purpose of the second law of thermodynamics is to establish criteria for the spontaneous occurrence of various processes. The law went through a long evolution and was first formulated as the basic law of the operation of heat engines.

1. Carnot's theorem (1824, "Reflections on the Motive Power of Fire"): the efficiency of a reversible cycle consisting of two isotherms and two adiabats depends only on the temperatures of the heat reservoirs and does not depend on the nature of the working fluid:

η = (Q1 − Q2)/Q1 = (T1 − T2)/T1 = 1 − T2/T1. (4.1)
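A numerical illustration of (4.1), with reservoir temperatures assumed for the example: for T1 = 500 K and T2 = 300 K,

η = 1 − T2/T1 = 1 − 300/500 = 0.4,

i.e. even a reversible engine converts at most 40 % of the heat drawn from the hot reservoir into work.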

2. Thomson (Lord Kelvin) introduced the concept of absolute temperature (1848) and formulated the second law of thermodynamics (1851): it is impossible to build a periodically operating heat engine that would only draw heat from one reservoir and perform mechanical work; that is, a perpetual motion machine of the second kind is impossible.

3. Clausius (1850) gave the first formulation of the second law of thermodynamics: spontaneous transfer of heat from a less heated body to a more heated body is impossible. Both formulations are equivalent; one is impossible without the other. In 1854 he introduced the concept of entropy, [S] = J/K.

4. These formulations are far removed from chemistry; to solve practical problems in chemistry, one uses the formulation of the second law given by Gibbs: a system is in a state of equilibrium if its entropy, for all possible changes, remains constant or decreases, i.e. (ΔS)U,V ≤ 0.

In fact, the concepts of the second law are based on molecular theory and are of a statistical character. Let us consider the second law from this point of view. Statistical thermodynamics began to develop at the end of the 19th century; its basic principles were developed in the works of Boltzmann, Gibbs, Planck, Einstein, Ehrenfest, and others.

Macro- and microstates of the system. Thermodynamic probability

The one-sided nature of the spontaneous occurrence of processes is a natural consequence of the molecular nature of matter, from which, in turn, follows the possibility of defining the state of a system in two different ways.

1. The values of the thermodynamic parameters (T, p, composition, etc.), which are averaged quantities, are specified; i.e., the macrostate of the system is given.

2. The coordinates and velocities of all the particles composing the system are specified; i.e., the microstate of the system is given.

Obviously, the same macrostate can be realized by a large number of microstates. The number of microstates that realize a given macrostate is called the thermodynamic probability (W). Unlike mathematical probability, which ranges from zero to one, W ≥ 1; it is the numerator of the mathematical probability (the ratio of the number of outcomes favoring a given event to the total number of outcomes). Like mathematical probability, the quantity W has the property of multiplicativity: if two systems with thermodynamic probabilities W1 and W2 are combined into one complex system, then for it W = W1·W2.

Thermodynamic probability is a property of the system, which means it does not depend on the path along which the system came to a given state. In general, if the N particles forming the system are distributed over s states so that the i-th state contains Ni particles, then

W = N!/(N1!·N2!·…·Ns!).

Since particles are indistinguishable within any one state, the permutations that do not define new microstates are excluded from the total number N! of permutations of all the particles. From this expression it follows that W is greater the more uniform, and hence the more chaotic, the distribution of particles among the states. Therefore thermodynamic probability can serve as a criterion for the direction of processes in isolated systems. Using this concept, we can give another (statistical) formulation of the second law of thermodynamics.

Second law: left to itself, i.e. isolated, the system moves from a state with a lower thermodynamic probability to a state with a higher value. In a state of equilibrium, the thermodynamic probability is maximum.

The statistical nature of the second law is manifested in the fact that it is satisfied with high accuracy only for a very large number of particles. As the size of the system decreases, the applicability of the law decreases, since the boundary between a more uniform and a less uniform distribution of particles (between order and disorder) becomes blurred. Deviations from the uniform distribution of particles, and consequently from the average properties of matter, exist in large systems as well. As a rule they are short-lived and (or) confined to microscopic volumes. Such deviations are called fluctuations. Their existence in real systems shows that local (in time and space) deviations from the second law are possible, and the law of increasing thermodynamic probability is satisfied only on average, over a relatively long period of time. The role of fluctuations is great in the course of many processes, for example in the crystallization of a liquid.

Entropy

We found that by changing the thermodynamic probability we can judge the direction of the process in an isolated system. However, in practice it is inconvenient to use this criterion for the following reasons:

a) W takes on very large values;

b) W is associated not with the thermal characteristics of the system, but with the mechanical ones - the positions of particles in space and their velocities.

Within the framework of thermodynamics it was necessary to find a criterion for the direction of processes (or for equilibrium) associated with thermal quantities. This problem was solved when one of the founders of thermodynamics, R. Clausius, introduced the concept of entropy (S), which the physicist L. Boltzmann later connected with thermodynamic probability:

S = k ln W. (9)

In this relation k is the Boltzmann constant. Entropy, like thermodynamic probability, is a function of the state of the system. It is expressed in much smaller numbers than W and, in addition, is additive:

S1+2 = k ln W1+2 = k ln(W1·W2) = k ln W1 + k ln W2 = S1 + S2.
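A minimal Python sketch of this additivity, using the W values 20 and 90 that appear in the six-particle examples of this text as hypothetical subsystem probabilities:

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K

W1, W2 = 20, 90            # thermodynamic probabilities of two subsystems
S1 = k * math.log(W1)      # S = k ln W for each subsystem
S2 = k * math.log(W2)

# combining the systems multiplies W, hence the entropies add:
S_combined = k * math.log(W1 * W2)
print(S_combined, S1 + S2)   # both print ~1.03e-22 J/K
```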

Since in an isolated system, as a result of a spontaneous process, the thermodynamic probability increases, and at equilibrium it takes on a maximum value, then entropy behaves in the same way and can also serve as a criterion for the direction of processes. Let us formulate the second law using the concept of entropy.

Second law: left to itself, i.e. isolated, the system moves from a state with less entropy to a state with greater entropy. At equilibrium, the entropy of the system is maximum.

Entropy is a quantitative characteristic of the chaos of a system, or a measure of the disorder in it. The more randomly the particles are arranged, the greater the magnitudes of W and S.

For example, all other things being equal, we can write:

S(crystalline phase) < S(liquid phase) < S(gaseous phase).

Entropy increases in any process that occurs under the influence of the movement of particles tending to a less ordered distribution in space - expansion of gas with decreasing pressure, thermal expansion of bodies, melting, evaporation, dissociation of molecules into free atoms, etc.

The mathematical notation of the second law of thermodynamics is considered to be the relation that was first obtained from purely thermodynamic considerations by Clausius:

ΔS ≥ ∫ δQ/T. (10)

Here the equality sign corresponds to a reversible process and the inequality sign to an irreversible one. In differential form:

dS = δQ/T for reversible processes;

dS > δQ/T for spontaneous (irreversible) processes;

dS < δQ/T for nonspontaneous (forced) processes.

Note that the value ΔS, as the change of a property of the system, does not depend on the path of the process, in particular on whether it is reversible or not. The heat of a process, like the work, is maximal when the process proceeds reversibly; this is what determines the inequality sign in relation (10).

Calculation of entropy changes in various processes.

The change in entropy cannot be measured directly; only heat can be measured. The amount of heat may differ between a reversible and an irreversible process, but the change in entropy must be the same if the initial and final states are the same:

dS = δQrev/T;

S2 − S1 = ∫ δQrev/T > ∫ δQirr/T ⇒ Qrev > Qirr. (4.5)

Like work, the heat of a reversible process is always greater than the heat of an irreversible process for given initial and final states.
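A small Python sketch of this point for the isothermal expansion of one mole of ideal gas (the temperature and volumes are assumed for illustration): ΔS is fixed by the end states, while the heat depends on the path (reversible expansion versus free expansion into vacuum):

```python
import math

R = 8.314            # gas constant, J/(mol*K)
n, T = 1.0, 298.0    # one mole, isothermal (illustrative values)
V1, V2 = 1.0, 2.0    # the volume doubles

# On the reversible path dQ_rev = nRT dV/V, so integrating dS = dQ_rev/T:
dS = n * R * math.log(V2 / V1)   # ~5.76 J/K, set by the end states only
Q_rev = T * dS                   # ~1717 J absorbed on the reversible path
Q_irr = 0.0                      # free expansion: no work, no heat exchanged

print(dS, Q_rev, Q_irr)          # same dS on both paths, Q_rev > Q_irr
```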

Helmholtz energy.

Physical meaning

Let's transform the expression so that the state functions fall on one side of the inequality - to the left:

dU − TdS ≤ −δW.

Let us consider an isothermal process and integrate this inequality:

∫(dU − TdS) ≤ −WT ⇒ ΔU − TΔS ≤ −WT ⇒ U2 − U1 − T(S2 − S1) ≤ −WT.

(U2 − TS2) − (U1 − TS1) = ΔA ≤ −WT, (7.8)

where A ≡ U − TS is the Helmholtz energy.
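Read as a bound on work, (7.8) states (a sketch of the physical meaning):

WT ≤ −ΔA = A1 − A2,

i.e. in an isothermal process the work performed by the system cannot exceed the decrease in its Helmholtz energy, and the maximum (reversible) work equals that decrease.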

The physical meaning of entropy is revealed by statistical thermodynamics, which describes the behavior of large groups of particles based on the laws of probability theory from the standpoint of the molecular kinetic theory of matter and the concepts of quantum mechanics.

The state of any collection of particles can be characterized as a macrostate by the values of statistically averaged, directly measured characteristics (temperature, pressure, volume). Another characteristic of the state is the microstate, which is determined by the values of the instantaneous parameters of each particle (position in space, speed, direction of motion, energy, etc.). One macrostate can be realized by a huge number of variants of the distribution of particles in space, over velocities, over energies, etc., i.e., by a huge number of microstates. The number of microstates by which a macrostate can be realized is called the thermodynamic probability of that macrostate. The thermodynamic probability of real macrostates is so great that it has no familiar numerical analogues: even for a system of only 1000 gas molecules under standard conditions it is an enormously large number, and for 1 mol it is immeasurably larger.

Thermodynamic probability W is a function of state, like the internal energy U and the enthalpy H. It could serve as a criterion for the spontaneous occurrence of processes, since, undoubtedly, the most probable macrostates are the ones practically realized. However, unlike U and H, thermodynamic probability is non-additive: if two systems with W1 and W2 form a complex system, then its thermodynamic probability is equal not to the sum W1 + W2 but to their product, W1+2 = W1·W2. Also inconvenient for thermodynamic calculations are the huge order of magnitude of W and its lack of dimension. Therefore it is more convenient to characterize the macrostate by a quantity proportional to the logarithm of the thermodynamic probability, which is entropy:

Expression (4.13) is called the Boltzmann equation, and the proportionality coefficient k the Boltzmann constant:

k = R/NA,

where R is the universal gas constant and NA is Avogadro's number.

Unlike thermodynamic probability, entropy is additive: if two systems with S1 and S2 (and, respectively, W1 and W2) form a complex system, then its entropy is S = k ln(W1·W2) = k(ln W1 + ln W2) = k ln W1 + k ln W2 = S1 + S2. Entropy has the dimension of the Boltzmann constant (J/K) and a convenient order of magnitude.

So, the most probable macrostate corresponds to the maximum entropy. If ideal order corresponds to a single way of realizing a macrostate, and chaos to the maximum number of such ways, then entropy can be considered a measure of molecular chaos, of the disorder in a system.

Entropy can be represented as a sum of components describing the various forms of particle motion. Typically one considers the entropy of translational motion S(trans) and of rotational motion S(rot) of the molecules, the entropy of internal rotation S(int.rot) (the rotational motion of atoms and atomic groups within a molecule), the entropy of vibrational motion S(vib) of atoms and atomic groups in a molecule, and the entropy of electronic motion S(el), so that

S = S(trans) + S(rot) + S(int.rot) + S(vib) + S(el).

(There are other components, but they do not change in chemical transformations and therefore need not be taken into account.)

The higher the value of S, the more probable and, in general, the more disordered the state. With increasing temperature entropy always increases, since the intensity of particle motion increases and hence the number of ways of arranging the particles grows. It also increases when a substance passes from the crystalline to the liquid state and, especially, from the liquid to the gaseous state. Thus heating, melting, boiling, expansion and dissolution lead to an increase in entropy, while cooling, condensation, compression and precipitation lead to a decrease.

Entropy also changes during chemical reactions. These changes are usually especially large in reactions that change the number of gas molecules: an increase in the number of gas molecules increases the entropy, a decrease lowers it. Like internal energy and enthalpy, entropy is a function of state, i.e., the change in entropy (ΔS) depends only on the initial (S1) and final (S2) states of the system and does not depend on the process path: ΔS = S2 − S1.

The entropy of a substance at a given temperature T (in the absence of phase transformations on heating from absolute zero to T) can be calculated from the heat capacity at constant pressure:

δQ = Cp dT. (4.15)

Expressing δQ from (4.15) and substituting the result into (4.11), we obtain:

dS = Cp dT/T.

After integration over the temperature range from absolute zero to T we get:

S_T = S_0 + ∫₀ᵀ (Cp/T) dT, (4.17)

where S_T is the entropy at the given temperature and S_0 the entropy at absolute zero. The integral in (4.17) can be calculated, since the dependence of Cp on temperature is usually known, and Cp at absolute zero is zero.
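A numerical sketch of (4.17) under an assumed low-temperature heat capacity. Taking the Debye law Cp = aT³ (with a made-up coefficient a), Cp vanishes at absolute zero and the integral converges; the analytic answer for comparison is S_T = aT³/3 = Cp(T)/3:

```python
def C_p(T, a=5.0e-4):        # Debye T^3 law; a is an assumed coefficient,
    return a * T**3          # giving C_p in J/(mol*K)

T_max, steps = 15.0, 200_000            # integrate from ~0 K up to 15 K
dT = T_max / steps
# midpoint rule for S_T = integral of C_p/T dT, with S_0 = 0 (third law)
S_T = sum(C_p((i + 0.5) * dT) / ((i + 0.5) * dT) * dT for i in range(steps))

print(S_T)                    # ~0.5625 J/(mol*K)
print(5.0e-4 * T_max**3 / 3)  # analytic check: a*T^3/3, the same value
```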

It remains to find S_0. As absolute zero is approached, all particles in the crystal lattice pass to the lowest energy level. This is the only microstate, i.e., W0 = 1 and S0 = 0. This proposition, formulated by Planck in 1911, is called the third law of thermodynamics: the entropy of a pure crystalline substance tends to zero as the temperature tends to absolute zero. It follows that one can calculate not only the change in entropy but also its absolute value.

To compare the entropy values of various substances with one another, the concept of standard entropy S°298 was introduced: the entropy of 1 mole of a substance in the standard state. The dimension of standard entropy is J/(mol·K). Standard entropies are known for many substances; for some of them they are given in Table 4.2.

Table 4.2 (see scan). Standard entropies S°298 of some substances at 298 K (25 °C), in J·mol⁻¹·K⁻¹. The substances listed include Al2O3 (corundum), NO2 (g), C2H4 (g), N2O (g) and OF2 (g), among others; the numerical values are given in the scanned table.

Since entropy, like enthalpy, is a function of state, to calculate the entropy of a reaction aA + bB = dD + eE an expression similar in form to the third corollary of Hess's law (4.8) is applicable:

ΔS° = (d·S°D + e·S°E) − (a·S°A + b·S°B).

Example. Calculate ΔS°298 of the reaction Fe2O3 (cr) + 3H2 (g) = 2Fe (cr) + 3H2O (g).

Solution. From Table 4.2 we find the standard entropies of the substances involved:
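A sketch of the arithmetic in Python. The S°298 values below are approximate literature figures inserted here for illustration only (the exact values should be taken from Table 4.2):

```python
# approximate standard entropies, J/(mol*K) -- assumed, not from Table 4.2
S0 = {"Fe2O3(cr)": 87.4, "H2(g)": 130.6, "Fe(cr)": 27.2, "H2O(g)": 188.7}

dS = (2 * S0["Fe(cr)"] + 3 * S0["H2O(g)"]) \
     - (S0["Fe2O3(cr)"] + 3 * S0["H2(g)"])
print(dS)   # ~ +141 J/K: entropy rises, mainly because S°(H2O,g) > S°(H2,g)
```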

Definition 1

Thermodynamic probability is the number of ways in which any given state of a macroscopic physical system can be realized.

Figure 1. Entropy and probability.

In thermodynamics, the state of a system is characterized by specific values of density, temperature, pressure and other measurable quantities. The listed parameters determine the state of the system as a whole, but at the same density the elementary particles can be located in different places of its volume and have completely different values of momentum or energy.

Definition 2

Each state of a thermodynamic system with a certain distribution of its particles over the possible quantum or classical positions is called a microstate in physics.

The thermodynamic probability is equal to the number of microstates that realize the existing macrostate. It is not a probability in the mathematical sense; it is used in statistical physics to determine the properties of systems in thermodynamic equilibrium.

For the accurate calculation of probability in thermodynamics, it is important whether identical elements of a system are considered indistinguishable or different. Therefore, quantum and classical mechanics lead to completely different expressions for thermodynamic probability.

Features of probability in thermodynamics

Figure 2. Thermodynamic probability.

Note 1

The main advantage of thermodynamics is that it makes it possible to consider the general properties of systems at equilibrium and the general laws they obey, and to obtain important information about a substance itself without full knowledge of its internal structure.

Its laws and methods are applicable to any material bodies and to any systems that include magnetic and electric fields, so they have become basic in the following areas:

  • the physics of gases and condensed media;
  • chemistry and technology;
  • the physics of the Universe and geophysics;
  • biology and the control of physical processes.

Boltzmann considered the atomic theory to be completely justified. A huge, practically infinite number of particles makes a mechanical description impossible and requires a statistical one. The mathematical tool of modern statistics is the calculus of probabilities. Boltzmann showed that, since thermodynamic processes rest on reversible kinetic processes, the irreversibility expressed by thermodynamically measured entropy cannot be absolute; therefore entropy must be directly related to the probability of realization of a given microstate.

The concept of probability, implicitly used by Maxwell, was employed by Boltzmann to overcome the difficulties connected with understanding the second law of thermodynamics and with the theory of the "heat death of the Universe." The pinnacle of Boltzmann's scientific work was the establishment of the relationship between thermodynamic probability and entropy. Planck made this connection explicit by introducing the constant $k = R/N_A$, which is called the Boltzmann constant.
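Numerically, with R ≈ 8.314 J/(mol·K) and $N_A$ ≈ 6.022·10²³ mol⁻¹:

$k = R/N_A$ ≈ 8.314 / 6.022·10²³ ≈ 1.38·10⁻²³ J/K.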

Thus, an irreversible physical process is a transition from a less probable state to a more probable one, and the logarithm of the probability of a state, up to a constant factor, coincides with the entropy. Boltzmann demonstrated this for an ideal gas.

The greater the disorder in the velocities and coordinates of the particles of a system, the greater the probability that the system is in a chaotic state. Boltzmann's formula can be considered the basic definition of entropy.

Calculation of probability in systems

If the system is very large and its initial state is not too close to equilibrium, then transitions into less probable states will be practically impossible and of no significance in practice. The law of increasing entropy is then justified experimentally with almost absolute certainty.

Let us estimate the probability of such physical processes. Let there be only one molecule in a certain vessel, mentally divided into two identical parts. Then, in the absence of external force fields, the molecule can with equal probability end up either in part 1 or in part 2; the probabilities of these events are P1 = P2 = 1/2.

After a second molecule enters the vessel, the placements of the molecules are independent events, since the molecules of an ideal gas do not interact with each other. If one photographs the distribution of, say, ten molecules in the vessel at equal time intervals for a long time, then for every 1000 frames there will be on average approximately one frame in which all the molecules are recorded only in part 1 of the vessel. A similar result holds for part 2.

By the theorem of addition of probabilities, we get on average 2 frames per thousand with the molecules concentrated in one part of the system. All this is not only possible in principle but actually accessible to ordinary observation. For large numbers of molecules, however, there is practically no chance of recording the corresponding fluctuation: when N equals Avogadro's number, the probability turns out to be so small that such possibilities and the events corresponding to them can be completely ignored.

Difference between thermodynamic and mathematical probability

Today, two main kinds of probability are distinguished in thermodynamics:

  • mathematical;
  • thermodynamic.

Thermodynamic probability is the number of microstates through which the required macrostate of the system can be realized. To find the thermodynamic probability of a state, one should count the number of combinations that realize a given spatial distribution of the elementary particles.

The mathematical probability of a state is equal to the ratio of the thermodynamic probability to the total number of possible microstates. Mathematical probability is always less than one, while thermodynamic probability is expressed in large numbers. Thermodynamic probability is not additive and is directly related not to the thermal properties of the system but to the mechanical ones, for example the positions and velocities of the molecules.

One and the same macrostate can correspond to a great many different microstates. According to L. Boltzmann, the greater the number of microstates by which a specific macrostate can be realized, the more probable it is in practice. The thermodynamic probability of a state of a system is the number of microstates that ultimately realize that macrostate.

When using these methods, it must be kept in mind that conclusions based on them are only the most probable ones in the thermodynamic sense and indicate only the possibility or impossibility of a particular physical process. In real conditions small deviations from these conclusions cannot be excluded, and the phenomena that occur may, under certain circumstances, differ from those expected on the basis of general thermodynamic considerations.

The state of a system determined by the thermodynamic parameters (p, V, T) is called a macrostate; it is what is observed experimentally. In statistical physics, the number of microstates that realize a given macrostate of the system is called the thermodynamic probability.

Let's explain this with an example. Let there be six gas molecules in a vessel. Mentally divide the vessel into three equal parts.

Moving chaotically, the molecules create certain macrodistributions, some of which are shown in Fig. 43a. Any distribution, for example the first one, can be realized by a number of microstates; some of the possible microstates giving rise to the first macrostate are shown in Fig. 43b.

In theoretical physics it is proved that the thermodynamic probability, i.e., the number of possible distributions of N particles over n states (six particles over three parts of the vessel), is determined by the formula

W = N!/(N1!·N2!·N3!·…),

where N1 is the number of particles in the first state (the first part of the vessel), N2 the number in the second state (the second part of the vessel), N3 the number in the third state (the third part of the vessel), and so on.

Fig. 43. On the concept of thermodynamic probability

Let us calculate the thermodynamic probabilities of macrostates 1-5 shown in Fig. 43a:

W = 6!/(2!·2!·2!) = 90 for state 1,

W = 6!/(3!·2!·1!) = 60 for state 2,

W = 6!/(3!·3!·0!) = 20 for state 3,

W = 6!/(4!·2!·0!) = 15 for state 4,

W = 6!/(6!·0!·0!) = 1 for state 5.

The uniform distribution has the highest thermodynamic probability; it can be realized in the greatest number of ways.
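The five values above are easy to verify in Python; the occupation numbers of the three parts of the vessel are those assumed in the calculation:

```python
from math import factorial

def W(*occ):
    """Multinomial count: the number of ways to place sum(occ) labeled
    particles so that part i of the vessel holds occ[i] of them."""
    w = factorial(sum(occ))
    for n in occ:
        w //= factorial(n)
    return w

print(W(2, 2, 2))   # 90 - state 1, uniform distribution
print(W(3, 2, 1))   # 60 - state 2
print(W(3, 3, 0))   # 20 - state 3
print(W(4, 2, 0))   # 15 - state 4
print(W(6, 0, 0))   # 1  - state 5, all particles in one part
```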

The connection between entropy and thermodynamic probability was established by Boltzmann - entropy is proportional to the logarithm of thermodynamic probability:

S = k ln W, (148)

where k is the Boltzmann constant.

Formula (148) is called the Boltzmann formula.

The statistical meaning of the concept of entropy is that an increase in the entropy of an isolated system is associated with the transition of this system from a less probable state to a more probable one.

One of the formulations of the second law of thermodynamics, which reveals the statistical nature of this law, is Boltzmann’s formulation: all processes in nature proceed in a direction leading to an increase in the probability of a state.

For example, the process of diffusion in gases occurs because a uniform distribution of molecules throughout the volume will be statistically more likely. The second law of thermodynamics is a statistical law that holds for closed systems consisting of a large number of particles. The second law is not applicable for systems consisting of an infinite number of particles, since for such systems all states are equally probable.

Entropy is an additive quantity. This means that the entropy of a system is equal to the sum of the entropies of its parts. Obviously, the Boltzmann equation satisfies this property of entropy. Indeed, the probability of a complex event is the product of the probabilities of the states:


W = W1 · W2,

ln W = ln W1 + ln W2.

In the thermodynamic definition of entropy, we encountered the difficulty of extending this concept to the case of thermodynamically nonequilibrium states. Boltzmann's formula (148) provides a fundamental way to overcome this difficulty. We must look at it as a definition of entropy. True, in order for this definition to receive specific content, it is necessary to supplement it with methods for calculating the probabilities of states in all required cases. But even without this, it is clear that with such an understanding of entropy, the law of its increase radically changes its character. It loses its absoluteness and turns into a statistical law. The entropy of a closed system can not only increase, but also decrease.

And it will indeed decrease if you wait long enough. However, the process of decrease will again be replaced in the future by a process of increase. What remains in this case from the second law of thermodynamics? What is its physical content? And the fact is that any given state of the system will be followed by even more probable states, if not necessarily, then in the overwhelming majority of cases. If the system is large, and its initial state is not very close to the equilibrium state, then transitions of the system to less probable states will be so unlikely that in practice they have no significance at all. Then the law of increasing entropy is justified with almost absolute certainty.

Let us calculate the probability of such processes. Let there be only one molecule in the vessel. Then, if there are no external force fields, the molecule can get into either part 1 or part 2 with equal probability; the probabilities of getting into these identical parts are P1 = P2 = 1/2. Let us introduce a second molecule into the vessel. Since the molecules of an ideal gas do not interact with each other, their entry into one or another part of the vessel are independent events. The probability that both end up in part 1 can be found by the probability multiplication theorem and equals P1 = (1/2)·(1/2) = 1/4.

If there are N molecules in the vessel, then, reasoning similarly, we find that the probability of all of them gathering in part 1 is P1 = (1/2)^N. At N = 10 we get P1 = (1/2)^10 = 1/1024 ≈ 0.001. If one photographs the distribution of molecules in the vessel at regular intervals for a long (in the limit, infinitely long) time, then for every 1000 frames there will on average be approximately one frame in which all 10 molecules are recorded only in part 1 of the vessel. The same can be said about part 2.

According to the theorem of addition of probabilities, one gets on average 2 frames per thousand with the molecules concentrated either in part 1 or in part 2 (it does not matter which). All this is not only possible in principle but actually observable. However, at N = 100 we get P1 = (1/2)^100 ≈ 10^(−30), and there is practically no chance of observing the corresponding fluctuation. When N equals Avogadro's number, the probability obtained is so monstrously small that such probabilities and the events corresponding to them can be completely ignored.
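A short Python sketch of these estimates; for N of the order of Avogadro's number the probability underflows ordinary floating point, so its base-10 logarithm is printed instead:

```python
import math

def P(N):
    """Probability that all N molecules sit in one chosen half."""
    return 0.5 ** N

print(P(10))    # ~0.000977 = 1/1024: about one frame per thousand
print(P(100))   # ~7.9e-31: practically never observed

N_A = 6.022e23                    # Avogadro's number
log10_P = -N_A * math.log10(2)    # log10 of (1/2)**N_A
print(log10_P)                    # ~ -1.8e23, i.e. P ~ 10**(-1.8e23)
```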

Let us look again at Fig. 43a. The entropy of the system is maximal in state 1. This state is the most chaotic, or the most disordered; in it the system is completely homogeneous. States 2, 3, 4 and 5 are characterized by the system becoming inhomogeneous, with order appearing in the arrangement of the particles. In passing from state 1 to state 5 there is more and more order in the arrangement of the particles (the orderliness grows, i.e., the inhomogeneity of the system increases), and at the same time the entropy decreases. State 5, in which all the particles are gathered in one place, has the lowest entropy; in this case the system is maximally inhomogeneous (maximally ordered).


