
Again consider a system constrained to constant total volume and to a uniform temperature. Then, as before, we can write the total change in entropy that might result from a change in volume of either part in the form

(T, V)		dS = (∂S1/∂V1) dV1 + (∂S2/∂V2) dV2

and, because total volume is constant, dV1 = -dV2. Substitution of equation 12 gives⁵

dS = [(P1 - P2)/T] dV1 ≥ 0		(13)

This proves that two bodies in equilibrium must have not only the same temperature but also the same pressure. It also proves that, when two bodies in contact at the same temperature have different pressures, the body at the higher pressure will tend to expand and compress the body at the lower pressure.

⁵ It is assumed that T1 = T2, and thus the total entropy is unchanged by a transfer of thermal energy between the two parts. The constant-energy restriction of equation 12 is satisfied because any work done by one part on the other can be compensated by such a transfer of thermal energy.

ISOTHERMAL ENTROPY CHANGES. Entropy changes at a single temperature follow directly from equation 5. (Calculations involving a temperature difference or temperature change will be treated later.) A change of phase, under equilibrium conditions, occurs at constant temperature and constant pressure. The thermal energy transfer, Q = Qrev, is then ΔH. Therefore,

ΔS = ΔH/T		(14)

For example, the entropy change for the melting of ice was found to be 1.22 J/g·K, or 22.0 J/mol·K. The entropy of vaporization of water at 100°C is

ΔSvap = (40,657 J/mol)/(373.15 K) = 108.96 J/mol·K

More typical liquids, not involving such strong intermolecular forces, have ΔSvap values of about 92 J/mol·K at their normal boiling points. This is known as Trouton's rule.

The work done in a reversible, isothermal expansion of an ideal gas was found to be (equation 10, Chapter 1) Wrev = -nRT ln(V2/V1). For an ideal gas at constant temperature ΔE = 0, and therefore Qrev = -Wrev. The entropy change is thus

(T, I.G., rev.)		ΔS = Qrev/T = nR ln(V2/V1)		(15)

If the gas expands through a pinhole or stopcock (an effusion-controlled leak) into an evacuated container (Figure 2), no work is done, because the gas exerts forces only on the immovable walls of the container. By insulating the containers we can ensure that Q = 0. The energy is unchanged and therefore, if the gas is ideal, the temperature remains constant. This serves as an experimental demonstration (performed by Joule, in 1844) of the dependence of energy only on the temperature, although it is not as sensitive as a later method by Joule and Thomson.

The entropy change of an ideal gas in such an irreversible, adiabatic⁶ expansion can be calculated as follows. Entropy is a state function, so the entropy change depends only on the initial and final states and is independent of the path taken between them. When an ideal gas expands at constant temperature from V1 to V2, the final state is the same whether the process is reversible or irreversible. Therefore the entropy change for the adiabatic expansion of the ideal gas into a vacuum is

(I.G., T)		ΔS = nR ln(V2/V1)		(16)

even though Q = -W = 0 in this process.

⁶ Diabatic, like its better-known twin diabetic, implies that something “passes through” (thermal energy or sugar). Thus adiabatic tells us there is no thermal energy transfer; Q = 0. In practice we achieve adiabatic conditions by insulating the system, by maintaining system and surroundings at the same temperature, or by carrying out the process very rapidly. Most gas expansions are fast enough to be nearly adiabatic.

The example just cited illustrates a general method for evaluating entropy changes in irreversible processes. Having determined the initial and final states, we find a path between those states that is entirely reversible. Then equation 5 tells us ΔS = Qrev/T.⁷

⁷ One sometimes sees the statement that ΔS = Q/T for a reversible process. This is not really wrong, but it is doubly misleading. It de-emphasizes the need to find Qrev (which is often different from Q), and it provides no clue to finding ΔS for an irreversible process. Equation 5, ΔS = Qrev/T, covers both irreversible and reversible processes.

It is of interest to calculate the entropy change of the surroundings in the reversible and irreversible expansions above. In the reversible expansion, Q = Qrev = -Qsurr = -(Qrev)surr. Therefore,

(rev)		ΔSsurr = -nR ln(V2/V1)

The adiabatic expansion into a vacuum produces no change at all in the surroundings, and therefore

ΔSsurr = 0

Adding together the entropy changes for system and surroundings, we obtain, for the reversible expansion,

(ΔS)system+surroundings = 0

and for the irreversible expansion into a vacuum,

(ΔS)system+surroundings = nR ln(V2/V1) > 0

This result agrees with the requirements of the second law. The particular irreversible path chosen is an extreme case. If the gas had been allowed to expand against a constant external pressure (as in Figure 1, Chapter 1), the entropy change for system plus surroundings would have been positive, but less than that for the expansion into a vacuum. The magnitude of the total entropy change (system plus surroundings) can be taken as a measure of the degree of thermodynamic irreversibility of any process.
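These results are easy to check numerically. The following is a minimal sketch (ours, not part of the original text) of equations 14-16 and of the system-plus-surroundings bookkeeping just described; the function names are our own, and the constant-external-pressure case assumes the surroundings act as a thermal reservoir at the gas temperature T.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def dS_phase_change(dH, T):
    """Equation 14: entropy of an equilibrium phase change, dS = dH/T."""
    return dH / T

def dS_isothermal(n, V1, V2):
    """Equations 15 and 16: dS = nR ln(V2/V1) for an ideal gas at constant T,
    whether the path is reversible or an expansion into a vacuum."""
    return n * R * math.log(V2 / V1)

# Vaporization of water at 373.15 K (values from the text):
print(dS_phase_change(40657.0, 373.15))   # 108.96 J/(mol K); Trouton's rule ~92

# Doubling the volume of 1 mol of ideal gas at 298.15 K: entropy changes of
# system, surroundings, and total for the three paths discussed above.
n, T, V1, V2 = 1.0, 298.15, 1.0, 2.0
dS_sys = dS_isothermal(n, V1, V2)      # the same for every path (state function)

P_ext = n * R * T / V2                 # expansion against the final pressure
Q = P_ext * (V2 - V1)                  # dE = 0 at constant T, so Q = -W
paths = {"reversible": -dS_sys,        # Q_surr = -Q_rev
         "constant P_ext": -Q / T,     # reservoir at temperature T
         "into vacuum": 0.0}           # Q = W = 0
for label, dS_surr in paths.items():
    print(f"{label:15s} total dS = {dS_sys + dS_surr:6.3f} J/K")
# reversible: 0.000, constant P_ext: 1.606, into vacuum: 5.763 (all J/K)
```

Note the ordering of the totals: zero in the reversible limit, largest for the expansion into a vacuum, and the constant-pressure expansion in between, exactly as argued above.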
INTERPRETATION OF ENTROPY. We have seen that we can measure the increase in entropy of a system by measuring the energy transferred to it, as thermal energy transfer, Q, divided by the (absolute) temperature, provided we measure Q along a path between initial and final states that is completely reversible. To know whether a process will actually occur, we must make a similar measurement for changes in the surroundings, as we will emphasize below.

Without getting involved in the detailed calculations of statistical mechanics, it is important to look more carefully at the meaning of entropy as a spread function. In particular, there are two ways of measuring the spreading, or the randomness, of a state. We have seen that entropy is appropriately called the spread function because it is a measure of the amount of spreading of energy among the molecules of a substance. However, in addition to considering where the packets of energy are located, we should look also at where the molecules are (like the yellow and white grains of rice we mixed earlier). Remarkably, the same measurement of Qrev/T gives information on this second type of spreading.

Consider a sample of an ideal gas occupying an initial volume Vi (e.g., 1 m³). Assume, quite arbitrarily for the moment, that each molecule requires some very small volume, Vo (e.g., 2 Å³; most of these volume elements are unoccupied at any given instant). Then we can say there are spaces for Vi/Vo molecules (5 × 10²⁹ in our example). If we let the gas expand to a larger volume, Vf, then by the same reasoning there are now spaces for Vf/Vo molecules. The important thing is that we have increased the number of available spaces by the ratio Vf/Vi. But we already know that if we expand an ideal gas, the entropy increases by ΔS = nR ln(Vf/Vi). In other words, when we increased the number of available, or accessible, spaces by the ratio Vf/Vi, the entropy increased by the logarithm of this same ratio.
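A short numerical sketch of this counting argument (again ours, not the text's; it assumes Boltzmann's relation that entropy grows as k times the logarithm of the number of accessible states, which the text has not introduced explicitly):

```python
import math

k  = 1.380649e-23    # Boltzmann constant, J/K
NA = 6.02214076e23   # Avogadro's number, 1/mol

# Spaces available in the initial volume: Vi = 1 m^3, Vo = 2 Angstrom^3.
Vi, Vo = 1.0, 2.0e-30             # both in m^3
print(f"{Vi / Vo:.0e} spaces")    # 5e+29, matching the text's count

# Expansion multiplies each molecule's number of accessible spaces by
# Vf/Vi; for NA molecules (1 mol) the entropy increase is then
# NA * k * ln(Vf/Vi), which is numerically nR ln(Vf/Vi) with n = 1.
Vf = 2.0 * Vi
print(f"{NA * k * math.log(Vf / Vi):.3f} J/(mol K)")   # 5.763
print(f"{8.314 * math.log(Vf / Vi):.3f} J/(mol K)")    # 5.763: R = NA * k
```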
We could equally well make the same kind of argument for molecules in a crystal, where the number of available spaces is more readily counted. If some number of “stray” molecules are introduced and given an opportunity to diffuse throughout the crystal, we would obtain a substantial increase in entropy per molecule. Or we may simply drop a crystal of salt into water. When we return, very likely the salt molecules will have diffused throughout the water. The salt molecules, or their constituent ions, are attracted to each other and roughly equally attracted to water molecules, so there is negligible energy difference; but there is a very high probability of the salt diffusing into the water, never to return to the crystal unless external conditions are changed. It is like a drop of water on the pavement. Even though energetically the water drop would prefer to remain with its neighbors, the lure of wide open spaces is enough to make the drop evaporate. The process is driven by an increase in entropy, or in probability.

The crystal is highly ordered; the solution is highly disordered, or random. There are more states accessible to the salt when it is dissolved in water, or accessible to gas molecules when a larger volume is available to them. More available, or accessible, states can be equated to a greater probability. Thus entropy is variously described as measuring disorder, or randomness, or probability.

We can always organize molecules, forming crystals, condensing a gas to a liquid, or arranging conditions such that molecules will form living structures. To do so always has a cost, specifically the cost of increased randomness in the surroundings. The second law of thermodynamics does not tell us that we cannot decrease entropy in any given sample. It tells us only that, for the universe as a whole (system plus surroundings), entropy will always increase.

A study of how signals may be communicated from one place to another gave rise to a new understanding of probability under the heading of information theory. Information theory turned out to be of major importance in understanding the meaning of entropy, as well as in discovering better ways to organize the storage and distribution of information over telephone lines, as radio or television signals, or in computers.

Gibbs Free Energy

When investigating equilibrium or reversibility, it was necessary to find the change not only for the system but also for the surroundings. Usually we would prefer to ignore the surroundings. The free energy functions provide a means of doing just that. We began with the variables P, V, and T, to which we added energy, E; subsequently we have added enthalpy, H, entropy, S, and now the two free energy functions, F and G, as shown in equations 17-19 and in Figure 3.

H = E + PV		(17)
F = E - TS		(18)
G = H - TS		(19)
G = F + PV		(19a)
G = E + PV - TS		(19b)

[Figure 3: A becomes F.]

We introduced the Helmholtz free energy⁸ through the relationship (equation 4) ΔA = ΔF = Wrev and the observation that a change in F represents the minimum amount of work that must be done on the system, at constant temperature, to get from the initial to the final state. For example, you cannot put a box on a shelf without doing some work on the box (and the gravitational field), nor can you compress a gas without doing work on the gas.

⁸ For many years the Gibbs free energy and the Helmholtz free energy were not clearly distinguished, and thus both were designated by the symbol F. In recent decades the symbol F has been selected for the Helmholtz free energy and the symbol G for the Gibbs free energy. Unless specifically indicated otherwise, the term “free energy” will always mean the Gibbs free energy in this book.
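To make the gas example concrete, here is a small sketch (ours) of the minimum work required to compress one mole of an ideal gas to half its volume at room temperature, using ΔF = Wrev = -nRT ln(V2/V1) from equation 4 together with equation 10 of Chapter 1:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def minimum_work(n, T, V1, V2):
    """dF = W_rev = -nRT ln(V2/V1) at constant temperature.
    For a compression (V2 < V1) this is positive: work done ON the gas."""
    return -n * R * T * math.log(V2 / V1)

print(f"{minimum_work(1.0, 298.15, 2.0, 1.0):.0f} J")
# ~1718 J; any real (irreversible) compression path costs more than this.
```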
There is good reason for picking the right function. Recall that we found enthalpy, H, to be much more convenient than energy, E, when measuring thermal energy transfers under conditions of constant pressure. For much the same reasons, we will find G to be generally more convenient than F. The Helmholtz free energy is more appropriate at constant volume; the Gibbs free energy is better suited to constant pressure.

FREE ENERGY AND EQUILIBRIUM. Assume for the moment that the only work done is work of expansion or compression, and that the composition of the system does not change. Then, from equation 19b,

dG = dE + d(PV) - d(TS)

Because E is a state function, dE is independent of path, so dE = q + w = qrev + wrev regardless of path. We may therefore substitute qrev + wrev for dE, to obtain

dG = qrev + wrev + P dV + V dP - T dS - S dT		(20)

But qrev = T dS and wrev = -P dV. Therefore, quite generally,

dG = V dP - S dT		(21)

We effectively split this equation into two parts by holding constant first the temperature, then the pressure:

(∂G/∂P)T = V		(22)

(∂G/∂T)P = -S		(23)
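Equation 22 can be checked numerically for an ideal gas. Integrating equation 21 at constant temperature with V = nRT/P gives G = Gref + nRT ln(P/Pref); a finite-difference slope of this G with respect to P should then reproduce V. (A sketch of ours; the reference values Gref and Pref are arbitrary and drop out of the slope.)

```python
import math

R = 8.314  # gas constant, J/(mol K)

def G_ideal(n, T, P, G_ref=0.0, P_ref=1.0e5):
    """G(T, P) of an ideal gas at fixed T, from integrating dG = V dP
    (equation 21) with V = nRT/P: G = G_ref + nRT ln(P/P_ref)."""
    return G_ref + n * R * T * math.log(P / P_ref)

n, T, P = 1.0, 298.15, 1.0e5   # 1 mol at 25 C and 1 bar
dP = 1.0                       # small pressure step, Pa

# Central-difference estimate of (dG/dP) at constant T (equation 22):
slope = (G_ideal(n, T, P + dP) - G_ideal(n, T, P - dP)) / (2 * dP)
V = n * R * T / P              # ideal-gas volume, m^3
print(slope, V)                # both ~0.02479 m^3: (dG/dP)_T = V
```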