Question
Two blocks of the same metal having the same mass and at temperatures $T_1$ and $T_2$, respectively, are brought in contact with each other and allowed to attain thermal equilibrium at constant pressure. The change in entropy, $\Delta S$, for this process is:
A. $2C_p\ln\left(\dfrac{T_1+T_2}{4T_1T_2}\right)$
B. $2C_p\ln\left(\dfrac{(T_1+T_2)^{1/2}}{T_1T_2}\right)$
C. $C_p\ln\left(\dfrac{(T_1+T_2)^2}{4T_1T_2}\right)$
D. $2C_p\ln\left(\dfrac{T_1+T_2}{2T_1T_2}\right)$
Solution
Entropy is a state function, so the change in entropy of a system is determined only by its initial and final states. In the idealization that a process is reversible, the total entropy does not change, while irreversible processes always increase the total entropy. Entropy is thus a measure of the randomness of a system.
Complete step-by-step answer:
According to the question, the two blocks are placed in contact with each other. As there is a difference in temperature between the two blocks, heat will flow from the block at the higher temperature to the block at the lower temperature. Since the blocks are of the same metal and the same mass, their heat capacities are equal, so the final temperature ($T_f$) of the system will be:
$T_f = \dfrac{T_1 + T_2}{2}$
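This value follows from a simple heat balance (a quick sketch, not part of the original solution): the heat lost by the hotter block equals the heat gained by the colder one, and the equal heat capacities $nC_p$ cancel from both sides:
$nC_p\,(T_1 - T_f) = nC_p\,(T_f - T_2) \implies T_f = \dfrac{T_1 + T_2}{2}$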
Now, as we know from the definition of the change in entropy,
$\Delta S_{\text{system}} = \displaystyle\int \frac{dq_{\text{rev}}}{T} = nC_p\int \frac{dT}{T}$
Where, $dq_{\text{rev}}$ = small amount of heat exchanged reversibly
$C_p$ = molar heat capacity at constant pressure
$dT$ = small change in the temperature
$n$ = number of moles in each block
(The second equality uses the fact that, at constant pressure, $dq_{\text{rev}} = nC_p\,dT$.)
Thus, the change in entropy for the first block will be given as:
$\Delta S_1 = nC_p\displaystyle\int_{T_1}^{T_f} \frac{dT}{T} = nC_p\ln\frac{T_f}{T_1}$
Similarly, the change in entropy for the second block will be given as:
$\Delta S_2 = nC_p\displaystyle\int_{T_2}^{T_f} \frac{dT}{T} = nC_p\ln\frac{T_f}{T_2}$
Now, the total change in entropy is equal to the sum of the changes in entropy of the individual blocks. Combining the two logarithms:
$\Delta S_1 + \Delta S_2 = nC_p\ln\dfrac{T_f}{T_1} + nC_p\ln\dfrac{T_f}{T_2} = nC_p\ln\dfrac{T_f^2}{T_1 T_2}$
But, we know that $T_f = \dfrac{T_1 + T_2}{2}$.
Thus, substituting the value of $T_f$ in the above equation, the total change in entropy of the combined system is:
$\Delta S_1 + \Delta S_2 = nC_p\ln\dfrac{\left(\frac{T_1+T_2}{2}\right)^2}{T_1 T_2}$
Thus, the total change in entropy of the system is $\Delta S = nC_p\ln\left(\dfrac{(T_1+T_2)^2}{4T_1T_2}\right)$
For $n = 1$, the change in entropy will be $\Delta S = C_p\ln\left(\dfrac{(T_1+T_2)^2}{4T_1T_2}\right)$
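As a quick numerical sanity check (a minimal sketch, not part of the original solution; the temperature values and $C_p$ below are arbitrary assumptions), summing the per-block entropy changes reproduces the closed-form answer and gives a positive $\Delta S$, as the second law requires for this irreversible process:

```python
import math

# Arbitrary illustrative values (assumptions, not from the problem):
T1, T2 = 400.0, 300.0   # initial temperatures, in kelvin
Cp = 25.0               # molar heat capacity at constant P, J/(mol K); n = 1

Tf = (T1 + T2) / 2      # equilibrium temperature

# Per-block entropy changes: dS_i = Cp * ln(Tf / Ti)
dS1 = Cp * math.log(Tf / T1)
dS2 = Cp * math.log(Tf / T2)

# Closed-form answer from the solution (option C)
dS_closed = Cp * math.log((T1 + T2) ** 2 / (4 * T1 * T2))

print(dS1 + dS2)     # ~0.515 J/K
print(dS_closed)     # matches the sum above
assert math.isclose(dS1 + dS2, dS_closed)
assert dS1 + dS2 > 0  # positive, since (T1+T2)^2 >= 4*T1*T2 by AM-GM
```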
So, the correct answer is Option C.
Note:
Because it is determined by the number of microstates available to a system, entropy is related to the amount of additional information needed to specify the exact physical state of a system, given its macroscopic specification. For this reason, it is often said that entropy is an expression of the disorder, or randomness, of a system.