I had saved this for the last phase of the Month of Equations series this year. I want you to understand this law fundamentally and appreciate its beauty. Everyone appreciates the Schrödinger wave equation, mass-energy equivalence, the uncertainty principle, and so on. These are the popular-science equations, the flagship equations of physics. Today, we'll learn about an equation that is even more fundamental: the second law of thermodynamics.
The Second law of thermodynamics
The entropy of an isolated system can never decrease over time.
This law deals with an important property of physical systems: their entropy. But I don't want to attack readers with the hardcore concepts and terms of thermodynamics. So I will first explain what entropy actually means, and then, step by step, we'll develop the second law.
Entropy in thermodynamics
There was a scientist named Sadi Carnot. He built a theoretical model of an engine (the Carnot engine) that converts heat energy into mechanical work. For the heat to flow, two reservoirs at different temperatures are used. The greater the temperature difference between the two reservoirs, the higher the efficiency of the engine, i.e., the more of the heat energy gets converted into mechanical work. It is important to note that the Carnot cycle is an idealized heat engine: the temperature of its reservoirs does not change as heat is added or extracted (they have infinite heat capacity). But in reality, as the engine does work, the temperature difference between the reservoirs decreases and ultimately becomes so small that no more work can be extracted. This happens when thermal equilibrium is established in the system.
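To make the relationship between temperature difference and efficiency concrete, here is a small Python sketch. The function name `carnot_efficiency` is my own label, not something from the article; the formula itself is the standard Carnot result, η = 1 − T_cold/T_hot (temperatures in kelvin).

```python
def carnot_efficiency(t_hot, t_cold):
    """Maximum efficiency of an ideal heat engine running between
    two reservoirs at absolute temperatures t_hot > t_cold (kelvin)."""
    if t_cold >= t_hot:
        raise ValueError("the hot reservoir must be hotter than the cold one")
    return 1.0 - t_cold / t_hot

# A larger temperature difference gives a higher efficiency:
print(carnot_efficiency(500.0, 300.0))   # 40% of the heat can become work
print(carnot_efficiency(1000.0, 300.0))  # 70% with a bigger gradient
```

Note that as `t_hot` approaches `t_cold`, the efficiency approaches zero, which is exactly the "no work at thermal equilibrium" situation described above.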
Carnot did not explain any further. A few decades later, another scientist, Rudolf Clausius, introduced the concept of entropy. He defined entropy as an internal property of the system that changes as heat energy moves around in it. Mathematically, it is written as ΔS = ΔQ/T. For an ideal (reversible) Carnot cycle, the total change in entropy is zero, but for any real, irreversible process, it is positive. An increase in entropy actually means that the reservoirs are approaching the same temperature. This was the origin of the term entropy. But I don't want you to stop here. The story is not yet complete.
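The Clausius formula can be checked with a quick sketch. Suppose some heat leaks directly from the hot reservoir to the cold one (an irreversible process). Applying ΔS = ΔQ/T to each reservoir shows that the total entropy goes up; the numbers and the helper name `entropy_change` are my own illustration.

```python
def entropy_change(heat, temperature):
    """Clausius entropy change dS = dQ / T for heat dQ (joules)
    exchanged at absolute temperature T (kelvin)."""
    return heat / temperature

# 1000 J of heat flows from a hot reservoir (500 K) to a cold one (300 K).
q = 1000.0
ds_hot = entropy_change(-q, 500.0)   # hot reservoir loses heat: -2.0 J/K
ds_cold = entropy_change(+q, 300.0)  # cold reservoir gains heat: about +3.33 J/K
total = ds_hot + ds_cold
print(total > 0)  # the total entropy of the pair increases
```

The cold reservoir gains more entropy than the hot one loses because the same ΔQ is divided by a smaller T, which is why heat flowing "downhill" always raises the total entropy.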
Entropy in statistical mechanics
Carnot and Clausius taught us about entropy in terms of the heat engine. But this isn't enough to understand the true nature of entropy. The concept of entropy is best explained by one of the most important branches of physics: statistical mechanics, formulated largely by a genius named Ludwig Boltzmann. Statistical mechanics is the bridge between the microscopic world of particles and the macroscopic world of thermodynamics. Let us see how it successfully explains entropy.
I will explain the concepts of statistical mechanics briefly and simply. Suppose you have a huge collection of particles in a box; there can be trillions of them. Each particle has its own position, velocity, momentum, spin, etc. A complete specification of all these particle-level details is known as a microstate, and the system has access to every possible microstate: each particle can have any speed, momentum, and so on. The collection of particles, which we call a gas, will also have a particular temperature, volume, pressure, etc. These bulk properties define the macrostate. Statistical mechanics assumes that, at equilibrium, all the microstates compatible with a given macrostate are equally likely.
One last fact should be mentioned: if you leave a system undisturbed, the particles will explore all the possible microstates, and if you observe the system at any time, you'll almost always find a configuration that is evenly spread out. All combinations are possible. For instance, it is possible for all the trillion particles to occupy one half of the box while the other half remains empty. However, statistics tells us that the probability of such a configuration is so low that it will effectively never happen in an isolated system. But what does entropy have to do with microstates? Well, Boltzmann figured that out too.
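Just how low is that probability? If each particle is equally likely to be in either half of the box, the chance that all N of them land in the left half is (1/2)^N. A short sketch (my own illustration, assuming independent particles) shows why this never happens for macroscopic N:

```python
from math import log10

def prob_all_in_one_half(n_particles):
    """Probability that n independent particles, each equally likely
    to be in either half of the box, all sit in the left half."""
    return 0.5 ** n_particles

print(prob_all_in_one_half(10))  # roughly 1 in 1000 -- plausible for 10 particles

# For macroscopic numbers the probability underflows ordinary floats,
# so we work with its base-10 logarithm instead:
n = 10**12                # a trillion particles
log10_p = -n * log10(2)   # about -3.0e11
print(log10_p)            # a 1-in-10^(300 billion) chance
```

For ten particles the event is merely unlikely; for a trillion, the exponent alone has twelve digits, which is what "it will effectively never happen" means here.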
Link between entropy and microstates
The Boltzmann equation says that if a macrostate corresponds to W microstates, then the entropy of the system is S = k log(W), where k is Boltzmann's constant. This explains why the configuration with all the particles in one half of the box is so improbable: we have restricted the number of microstates (positions, in this case) that the system can access, thereby reducing the entropy.
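The formula also lets us put a number on that restriction. Confining each of N particles to one half of the box halves the positions available to each, dividing W by 2^N and lowering S by N·k·ln 2. A minimal sketch of Boltzmann's formula (the function name is my own; the natural logarithm is the conventional choice):

```python
from math import log

k_B = 1.380649e-23  # Boltzmann's constant, in joules per kelvin

def boltzmann_entropy(n_microstates):
    """Boltzmann's formula S = k * log(W) for W accessible microstates."""
    return k_B * log(n_microstates)

# Pushing N particles into one half of the box divides W by 2**N,
# so the entropy drops by N * k_B * ln(2):
n = 100
s_drop = n * k_B * log(2)
print(s_drop)  # entropy lost by confining 100 particles to half the box
```

Note that a single accessible microstate (W = 1) gives S = 0: perfect order carries zero entropy, which is the other end of the scale from the evenly spread-out gas.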
The second law of thermodynamics should be clear now: over time, the entropy of an isolated system increases or, at most, remains constant. Remember, the word isolated is important. You should not add heat energy from outside to increase the amount of work that can be extracted from the heat engine, and you should not use a vacuum cleaner to push all the particles in the box into one half. Now that we know this law, I will finish this article by answering the question: why is it more fundamental than any other law of physics?
Importance of second law of thermodynamics
Take any law of classical or quantum mechanics. It doesn't really care about the direction of time. The second law of thermodynamics adds this information: it defines the arrow of time, which points in the direction of increasing entropy. This explains why time always moves in the forward direction, and it is why this law is of fundamental importance. Einstein once remarked that the second law of thermodynamics is the only physical theory of universal content that he was convinced would never be overthrown. And Eddington, one of the greatest astrophysicists, commented on this law, "If your theory is found to be against the second law of thermodynamics, I can give you no hope; there is nothing for it but to collapse in deepest humiliation." Now you know the power of this law!
The second law of thermodynamics predicts the fate of the universe too. Isn't it beautiful how one law applies to everything, from the smallest of systems to the entire universe? If you haven't fully understood the statistical picture, don't worry. I have an article dedicated to the concept of microstates and macrostates, where we will learn about the statistical picture of entropy in more detail.