
I am the admin and founder of ‘The Secrets Of The Universe’ and a former intern at the Indian Institute of Astrophysics, Bangalore. I am a science student pursuing a Master’s in Physics in India, and I love to study and write about stellar astrophysics, relativity, and quantum mechanics.
We live in the 21st century. Science has become so advanced that today we are smashing atoms in particle accelerators, launching interplanetary space probes, and imaging deep space across the entire electromagnetic spectrum. The hot topics of research today are particle physics, nuclear physics, relativity, quantum mechanics, condensed matter physics, and so on. But to understand the term entropy, we need to travel about 150 years back in time. Back then, there was no relativity, no Einstein, no quantum mechanics, no particle physics, and no nuclear physics. What was the main topic of research? The answer is thermodynamics.

There are two main interpretations of entropy. The first one is provided by Thermodynamics and the second one by Statistical Mechanics. To completely understand entropy, both interpretations are important. However, the interpretation from Statistical Mechanics is more fundamental.
The Thermodynamic Picture – The Roots of Entropy
Carnot And His Heat Engine
There was a scientist named Sadi Carnot. He built a theoretical model of an engine (the Carnot engine) to convert heat energy into mechanical work. For heat to flow, two reservoirs at different temperatures are used. The greater the temperature difference between the two reservoirs, the higher the engine’s efficiency, i.e., the larger the fraction of heat energy converted into mechanical work. It is important to note that the Carnot engine is an idealized heat engine: the reservoirs’ temperatures do not change as heat is added or extracted (they have infinite heat capacity).
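To make the efficiency claim concrete: for an ideal Carnot engine, the maximum efficiency depends only on the two reservoir temperatures, η = 1 − T_cold/T_hot (temperatures in kelvin). Here is a minimal Python sketch of that relationship; the particular temperatures are purely illustrative:

```python
def carnot_efficiency(t_hot: float, t_cold: float) -> float:
    """Maximum fraction of input heat an ideal engine converts to work."""
    if t_cold >= t_hot:
        return 0.0  # no temperature gradient: no work can be extracted
    return 1.0 - t_cold / t_hot

# Illustrative reservoir temperatures in kelvin (assumed values).
for t_hot in (400.0, 600.0, 1000.0):
    eta = carnot_efficiency(t_hot, 300.0)
    print(f"T_hot = {t_hot:6.0f} K, T_cold = 300 K -> efficiency = {eta:.2f}")
```

The larger the gap between the reservoirs, the larger the fraction of heat that can become work; when the temperatures meet, the efficiency drops to zero.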

But in reality, as time progresses and the engine does work, the temperature gradient between the reservoirs decreases and ultimately becomes so small that no work can be extracted from the engine. This happens when thermal equilibrium is established in the system.
The amount of work W extracted from the heat flowing out of the hot reservoir depends on the temperature difference between the two reservoirs. As time progresses, the heat energy redistributes itself and entropy increases, thereby reducing W.
Clausius And His Explanation
Carnot did not explain any further. A few years later, another scientist, Rudolf Clausius, introduced the concept of entropy. He defined entropy as an internal property of the system that changes as heat energy moves around in the system. Mathematically, the change in entropy is written as ΔS = ΔQ/T. For an ideal (reversible) Carnot cycle, the total change in entropy is zero, but for any irreversible, i.e., real, process, it is positive. An increase in entropy actually means that the reservoirs are approaching the same temperature. So when we say that entropy is a measure of disorder in a system, this is pretty much what we mean.
Just imagine: earlier, the heat was concentrated at one end of the system (the hot reservoir). There was a strong temperature gradient, and maximum work was being extracted. But as time progressed, the heat energy redistributed itself in the system. It was no longer concentrated in just one part; it was spread all over the system, which is why the amount of work that could be extracted from the system decreased. The “internal” property that increased and caused this decrease in work is known as entropy.
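To see Clausius’s formula in action, here is a small Python sketch (with assumed, illustrative numbers) of what happens when a lump of heat Q flows from a hot reservoir to a cold one: the hot side loses entropy Q/T_hot, the cold side gains Q/T_cold, and since T_cold < T_hot, the total always goes up:

```python
# ΔS = ΔQ/T applied to heat Q flowing from a hot reservoir to a cold one.
# The numbers below are illustrative, not from any real engine.
Q = 1000.0      # heat transferred, joules
T_HOT = 500.0   # kelvin
T_COLD = 300.0  # kelvin

dS_hot = -Q / T_HOT   # the hot reservoir loses entropy...
dS_cold = Q / T_COLD  # ...but the cold one gains more
dS_total = dS_hot + dS_cold

print(f"dS_hot   = {dS_hot:+.2f} J/K")
print(f"dS_cold  = {dS_cold:+.2f} J/K")
print(f"dS_total = {dS_total:+.2f} J/K  (positive: entropy increased)")
```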
Now I don’t want you to stop here. The thermodynamic picture is purely classical. To understand the concept of entropy fundamentally, you must read the interpretation offered by one of the most important branches of physics, the bridge that connects quantum physics and classical physics: statistical mechanics.
The Statistical Picture (The Real Meaning of Entropy)
Microstates And Macrostates
This picture was developed about half a century later by a genius named Ludwig Boltzmann. I will explain the concepts of statistical mechanics briefly and simply. Suppose you have a huge collection of particles in a box; there can be trillions of them. Each particle has its own position, velocity, momentum, spin, etc. A complete specification of all these quantities for every particle is known as a microstate.
The system has access to every possible microstate: each particle can have any allowed speed, momentum, and so on. The collection of particles, which we call a gas, will have a particular temperature, volume, pressure, etc. These bulk quantities define the macrostate. Statistical mechanics assumes that, at equilibrium, all microstates consistent with a given macrostate are equally likely.
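A toy example may help here. The following Python sketch (my own illustration, not Boltzmann’s construction) uses just four particles, each of which can sit in the left or right half of a box: a microstate records every particle’s side, a macrostate only records how many are on the left, and the evenly split macrostates turn out to own most of the microstates:

```python
from collections import Counter
from itertools import product

# Toy model: each of N particles sits in the Left or Right half of a box.
# A microstate lists every particle's side; the macrostate only records
# how many sit on the left. At equilibrium all 2**N microstates are
# equally likely.
N = 4
microstates = list(product("LR", repeat=N))
macrostates = Counter(state.count("L") for state in microstates)

print(f"{len(microstates)} equally likely microstates")
for n_left in sorted(macrostates):
    print(f"macrostate '{n_left} particles on the left': "
          f"{macrostates[n_left]} microstates")
```

With N = 4 the effect is mild; with trillions of particles, the evenly spread macrostates dominate so overwhelmingly that nothing else is ever observed.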

The last fact worth mentioning is that if you leave a system undisturbed, it will wander through all the microstates accessible to it. If you observe the system at any time, you’ll almost always find a configuration that’s evenly spread out. All combinations are possible; all the trillion particles may occupy one half of the box while the other half remains empty. However, statistics tells us that the probability of such a configuration is so low that it will essentially never happen in an isolated system. But what does entropy have to do with microstates? Well, Boltzmann figured that out too.
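How low is that probability? Assuming independent particles with no preference for either half, it is roughly 2 × (1/2)^N, which the following sketch evaluates for a few illustrative values of N:

```python
import math

# Probability that all N particles are found in the same half of the box:
# p = 2 * (1/2)**N (the factor of 2 because either half would do). Printed
# as a power of ten, since for large N the value underflows ordinary floats.
for N in (10, 100, 10**6):
    log10_p = (1 - N) * math.log10(2)
    print(f"N = {N:>9,}: p ~ 10^{log10_p:,.0f}")
```

Already for a million particles the probability is of order 10^(-300,000), which is why such configurations never occur spontaneously.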
Boltzmann’s entropy formula says that if a macrostate corresponds to W microstates, then its entropy is S = k log W, where k is Boltzmann’s constant. This explains why the configuration in which all the particles are in one half of the box has such a low probability: confining the particles restricts the number of microstates (positions, here) that the system can access, and a macrostate with fewer microstates has lower entropy and is far less likely to occur.
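Continuing the toy box model from above (and reading log as the natural logarithm), here is a short sketch comparing the Boltzmann entropy of the confined configuration with that of the unconstrained one:

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

# S = k ln W for the toy box model. Confining all N particles to one half
# leaves a single spatial arrangement (W = 1, so S = 0), while letting them
# choose either half gives W = 2**N. Using ln(2**N) = N ln 2 avoids
# computing the huge number 2**N directly.
N = 100
S_confined = K_B * math.log(1)  # = 0 J/K: perfectly "ordered"
S_free = K_B * N * math.log(2)

print(f"S (all in one half) = {S_confined:.3e} J/K")
print(f"S (free to roam)    = {S_free:.3e} J/K")
```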
Relation of Entropy With The Second Law of Thermodynamics
The second law of thermodynamics is one of the most fundamental laws of physics: it captures something that no other law does, a preferred direction in which processes unfold. The second law says, “Over time, the entropy of an isolated system increases or, at most, remains constant.” Remember, the word isolated is important. You must not add heat energy from outside to increase the amount of work extracted from the heat engine, nor use a vacuum cleaner to push all the particles in the box into one half.
I hope the meaning of the term entropy is clear by now. It is a fundamental concept in the physical and chemical sciences, and the second law makes much more sense once entropy is defined. The concept of entropy also has a lot to do with the arrow of time: in fact, the thermodynamic arrow of time points in the direction in which the entropy of the universe increases.
Hi
Pardon me if my question is irrelevant to this topic, but in passing I read somewhere that ultimately the heat in the universe will settle down to zero. If I got it right, then where does entropy come in?
The second law of thermodynamics predicts that eventually there will be no heat left that can be converted into work. That is when all the heat energy has been redistributed uniformly, and this is the state of maximum entropy.
Hello,
I was wondering how the entropy of a living system changes with time?
Regards
Mubashir,
Kashmir.
Living systems are not isolated systems, so there’s no straight answer. A living organism can keep its own entropy low by exporting entropy (as heat and waste) to its surroundings; the total entropy of organism plus environment still increases.
Man, it was amazing and helpful, and now I think I should go for the debate. Thank you ❤❤❤