Entropy and the Second Law of Thermodynamics

What Is The Real Meaning Of Entropy And How To Understand It Completely?

We live in the 21st century. Science has become so advanced that today we are smashing atoms in particle accelerators, launching interplanetary space probes, imaging deep space across the entire electromagnetic spectrum and so on. Today the hot topics of research are particle physics, nuclear physics, relativity, gravitational waves, quantum mechanics etc. But in order to understand the term entropy, we need to travel about 150 years back in time, to an era with no relativity, no Einstein, no quantum mechanics, no particle physics and no nuclear physics. What was the main topic of research then? The answer is Thermodynamics.

There are two main interpretations of entropy. The first is provided by Thermodynamics and the second by Statistical Mechanics. In order to understand entropy completely, both interpretations are important. However, the interpretation from Statistical Mechanics is the more fundamental one.

The Thermodynamic Picture (The Roots of Entropy)

There was a scientist named Sadi Carnot. He built a theoretical model of an engine (the Carnot engine) that converts heat energy to mechanical work. For the heat to flow, two reservoirs at different temperatures are used. The greater the temperature difference between the two reservoirs, the higher the efficiency of the engine, i.e., the more of the heat energy is converted to mechanical work. It is important to note that the Carnot cycle is an idealised heat engine: the temperatures of the reservoirs do not change as heat is added or extracted (they have infinite heat capacity). In reality, as time progresses and the engine does work, the temperature difference between the reservoirs decreases and ultimately becomes so small that no work can be extracted from the engine. This happens when thermal equilibrium is established in the system.

The amount of work W that can be extracted from the hot reservoir depends on the temperature difference between the two reservoirs. As time progresses, the heat energy redistributes itself, entropy increases, and W shrinks.
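To make the dependence on the temperature difference concrete, here is a minimal sketch in Python of the Carnot efficiency formula η = 1 − T_cold/T_hot; the reservoir temperatures and the heat input below are invented purely for illustration:

```python
def carnot_efficiency(t_hot: float, t_cold: float) -> float:
    """Maximum possible efficiency of a heat engine running between
    two reservoirs at absolute temperatures t_hot and t_cold (Kelvin)."""
    return 1.0 - t_cold / t_hot

q_in = 1000.0  # heat drawn from the hot reservoir per cycle, in joules
for t_hot in (600.0, 450.0, 310.0):
    eta = carnot_efficiency(t_hot, t_cold=300.0)
    print(f"T_hot = {t_hot:.0f} K: efficiency = {eta:.1%}, work W = {eta * q_in:.0f} J")
```

As T_hot approaches T_cold, the efficiency, and with it the extractable work W, falls to zero; that is the thermal-equilibrium dead end described above.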

Carnot did not take the idea any further. A few decades later, another scientist, Rudolf Clausius, introduced the concept of entropy. He defined entropy as an internal property of the system that changes as heat energy moves around in the system. Mathematically, it is written as ΔS = ΔQ/T, where ΔQ is the heat transferred at absolute temperature T. For an ideal (reversible) Carnot cycle, the total change in entropy over a cycle is zero, but for any irreversible process, the total entropy increases. An increase in entropy actually means that the reservoirs are approaching the same temperature. So when we say that entropy is a measure of disorder in a system, we pretty much mean this. Just imagine: earlier, the heat was concentrated at one end of the system (the hot reservoir). There was a strong temperature gradient and maximum work was being extracted. But as time progressed, the heat energy redistributed itself through the system. It was no longer concentrated in just one part; it was spread all over, and that is why the amount of work that could be extracted from the system decreased. The “internal” property that increased and caused this decrease in extractable work is known as entropy.
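As a quick sanity check on ΔS = ΔQ/T, here is a sketch (with invented numbers) of what happens when heat leaks directly from a hot reservoir to a cold one:

```python
q = 100.0       # heat transferred, J (illustrative value)
t_hot = 500.0   # temperature of the hot reservoir, K
t_cold = 300.0  # temperature of the cold reservoir, K

ds_hot = -q / t_hot    # the hot reservoir loses heat, so its entropy drops
ds_cold = q / t_cold   # the cold reservoir gains heat, so its entropy rises
print(f"dS_hot = {ds_hot:+.3f} J/K, dS_cold = {ds_cold:+.3f} J/K")
print(f"dS_total = {ds_hot + ds_cold:+.3f} J/K")  # always > 0 when t_hot > t_cold
```

The total is positive because the same amount of heat is divided by a smaller temperature on the cold side; that asymmetry is the second law in miniature.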

Now I don’t want you to stop here. The thermodynamic picture is a purely classical one. In order to understand the concept of entropy fundamentally, you must read the interpretation offered by one of the most important branches of physics, the bridge that connects quantum physics and classical physics: Statistical Mechanics.

The Statistical Picture (The Real Meaning of Entropy)

This picture was developed about half a century later by a genius named Ludwig Boltzmann. I will explain the concepts of statistical mechanics in a brief and easy way. Suppose you have a huge collection of particles in a box; there can be trillions of them. Each particle has its own position, velocity, momentum, spin etc. A complete specification of these quantities for every particle is known as a microstate of the system. The system has access to an enormous number of possible microstates: each particle can have any allowed position, speed, momentum and so on. Now the collection of particles, which we call a gas, will have a particular temperature, volume, pressure etc. as a whole. Such a bulk description is known as a macrostate. Statistical mechanics assumes that, at equilibrium, all the microstates compatible with a given macrostate are equally likely.


If I am talking about the position, momentum etc. of each particle, I am referring to the microstate of the system. But if I am talking about the volume, temperature etc. of the system as a whole, I am referring to the macrostate.
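In code, the distinction might look like the following rough sketch: the microstate is the full per-particle list, while the macrostate is a handful of aggregates computed from it (the particle count, mass, and velocity spread are all invented for illustration):

```python
import random

random.seed(0)
n, mass = 100_000, 6.6e-27   # particle count and mass (kg), illustrative values
k_B = 1.380649e-23           # Boltzmann's constant, J/K

# Microstate: one velocity (1-D, m/s) per particle -- a huge list of numbers.
velocities = [random.gauss(0.0, 400.0) for _ in range(n)]

# Macrostate: a few aggregate numbers summarising the whole list.
mean_ke = sum(0.5 * mass * v * v for v in velocities) / n
temperature = 2.0 * mean_ke / k_B   # 1-D equipartition: <KE> = (1/2) k_B T
print(f"Temperature of this macrostate: {temperature:.1f} K")
```

Countless different velocity lists (microstates) would yield the same temperature (macrostate); counting how many is exactly Boltzmann's trick.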

The last fact that should be mentioned is that if you leave a system undisturbed, the particles will wander through all the accessible microstates, and if you observe the system at any time, you’ll almost always find a random-looking configuration that is evenly spread out. All combinations are possible: say, it is possible that all the trillion particles occupy one half of the box while the other half remains empty. However, statistics tells us that the probability of such a configuration is so low that it will essentially never happen in an isolated system. But what does entropy have to do with microstates? Well, Boltzmann figured that out too.
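Here is a back-of-the-envelope sketch of just how unlikely the all-in-one-half configuration is, under the toy assumption that each particle independently sits in either half with probability 1/2:

```python
import math

# Probability that all N particles happen to occupy the same chosen half.
for n in (10, 100, 1000):
    log10_p = n * math.log10(0.5)   # log10 of (1/2)**n, avoids float underflow
    print(f"N = {n:>4}: P(all in one half) = 10^{log10_p:.0f}")
```

For a realistic particle number of order 10^23, the exponent is around −3 × 10^22, so “essentially never” is a very safe bet.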

The Boltzmann equation says that if a macrostate can be realised by W microstates, then its entropy is S = k ln(W), where k is Boltzmann’s constant and ln is the natural logarithm. This explains why the configuration with all the particles in one half of the box has such a low probability: by confining the particles, we restricted the number of microstates (positions, in this case) that the system could access, thereby reducing the entropy.
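A minimal sketch of S = k ln(W) for the box-of-particles toy model: if n of the N particles sit in the left half, the number of microstates is the binomial coefficient W = C(N, n) (the value N = 100 is invented for illustration):

```python
import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K

N = 100  # total number of particles (toy value)
for n_left in (0, 25, 50):
    w = math.comb(N, n_left)   # microstates with n_left particles on the left
    s = k_B * math.log(w)      # Boltzmann entropy S = k ln(W); ln(1) = 0
    print(f"{n_left:>2}/{N} on the left: W = {w:.3e}, S = {s:.3e} J/K")
```

The evenly spread macrostate (50/50) has by far the most microstates and hence the highest entropy, which is why an undisturbed gas is overwhelmingly likely to be found near it.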

Relation With The Second Law of Thermodynamics

The second law of thermodynamics is one of the most fundamental laws of physics. It captures something that no other fundamental equation does: a preferred direction for natural processes. The second law says, “Over the course of time, the entropy of an isolated system increases or, at most, remains constant.” Remember, the word isolated is important. You cannot add heat energy from outside to restore the amount of work that can be extracted from the heat engine, and you cannot use a vacuum cleaner to push all the particles in the box into one half, without increasing the entropy somewhere else.

I hope the meaning of the term entropy is clear by now. It is a very important concept in the physical and chemical sciences, and the second law starts making more sense with the definition of entropy in hand. The concept of entropy also has a lot to do with the arrow of time. In fact, the thermodynamic arrow of time points in the direction in which the entropy of the universe increases.

Also Read: Why Maxwell’s Equations are so important and what do they really mean?


3 comments

    1. Joule/Kelvin (J/K) is the SI unit for entropy, and I think, as the third law suggests, the entropy of a pure crystalline substance at absolute zero (0 Kelvin) is zero. But it can never be negative.


  1. Awesome explanation! That’s what an astronomer needs. I have some other doubts in thermal physics, like: consider a room at a temperature of 0 °C, and we have a hot body (A) at 50 °C and a cold body (B) at −50 °C. Which one reaches 0 °C first, A or B? Simply, I am asking: is the rate of heat loss by A equal to the rate of heat gain by B or not?
    Hope you’ll reply soon.
    Here’s my email: bhashitshinde@gmail.com
