“Nor public flame, nor private, dares to shine;
Nor human spark is left, nor glimpse divine!
Lo! thy dread empire, Chaos! is restored;
Light dies before thy uncreating word:
Thy hand, great Anarch! lets the curtain fall;
And universal darkness buries all.”
― Alexander Pope, The Dunciad
The second law of thermodynamics states that “as one goes forward in time, the net entropy (degree of disorder) of any isolated or closed system will always increase (or at least stay the same).” That is a long way of saying that all things tend towards disorder. This is one of the basic laws of the universe and is something we can observe in our lives. Entropy is simply a measure of disorder. You can think of it as nature’s tax.
Uncontrolled disorder increases over time. Energy disperses and systems dissolve into chaos. The more disordered something is, the more entropic we consider it. In short, we can define entropy as a measure of the disorder of the universe, on both a macro and a microscopic level. The Greek root of the word translates to “a turning towards transformation” — with that transformation being chaos.
As you read this article, entropy is all around you. Cells within your body are dying and degrading, an employee or coworker is making a mistake, the floor is getting dusty, and the heat from your coffee is spreading out. Zoom out a little, and businesses are failing, crimes and revolutions are occurring, and relationships are ending. Zoom out a lot further and we see the entire universe marching towards a collapse.
Let’s take a look at what entropy is, why it occurs, and whether or not we can prevent it.
The Discovery of Entropy
The identification of entropy is attributed to Rudolf Clausius (1822–1888), a German mathematician and physicist. I say attributed because a young French engineer, Sadi Carnot (1796–1832), had earlier hit on the idea of thermodynamic efficiency; the idea was so foreign to people at the time, however, that it had little impact. Clausius, unaware of Carnot’s work, arrived at the same ideas independently.
Clausius studied the conversion of heat into work. He recognized that heat from a body at a high temperature would flow to one at a lower temperature. This is how your coffee cools down the longer it’s left out — the heat from the coffee flows into the room. This happens naturally. But if you want to heat cold water to make the coffee, you need to do work — you need a power source to heat the water.
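The cooling coffee can be sketched numerically. A minimal simulation using Newton’s law of cooling (the coefficient and temperatures below are invented for illustration, not measured values) shows the temperature decaying towards that of the room, never the reverse:

```python
# Newton's law of cooling: dT/dt = -k * (T - T_room)
# The coefficient k and the temperatures are illustrative values only.

def cool(temp: float, room: float, k: float = 0.05, minutes: int = 60) -> float:
    """Step the coffee's temperature forward one minute at a time."""
    for _ in range(minutes):
        temp -= k * (temp - room)  # heat flows from hot coffee to cooler room
    return temp

coffee = cool(temp=90.0, room=20.0)
print(round(coffee, 1))  # after an hour, close to room temperature
```

Run it for longer and the coffee only gets closer to room temperature; nothing in the dynamics lets the heat flow back without work being done.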
From this idea comes Clausius’s statement of the second law of thermodynamics: “heat does not pass from a body at low temperature to one at high temperature without an accompanying change elsewhere.”
Clausius also observed that heat-powered devices worked in an unexpected manner: Only a percentage of the energy was converted into actual work. Nature was exerting a tax. Perplexed, scientists asked, where did the rest of the heat go and why?
Clausius solved the riddle by observing a steam engine and calculating that energy spread out and left the system. In The Mechanical Theory of Heat, Clausius explains his findings:
… the quantities of heat which must be imparted to, or withdrawn from a changeable body are not the same, when these changes occur in a non-reversible manner, as they are when the same changes occur reversibly. In the second place, with each non-reversible change is associated an uncompensated transformation…
… I propose to call the magnitude S the entropy of the body… I have intentionally formed the word entropy so as to be as similar as possible to the word energy….
The second fundamental theorem [the second law of thermodynamics], in the form which I have given to it, asserts that all transformations occurring in nature may take place in a certain direction, which I have assumed as positive, by themselves, that is, without compensation… [T]he entire condition of the universe must always continue to change in that first direction, and the universe must consequently approach incessantly a limiting condition.
… For every body two magnitudes have thereby presented themselves—the transformation value of its thermal content [the amount of inputted energy that is converted to “work”], and its disgregation [separation or disintegration]; the sum of which constitutes its entropy.
Clausius summarized the concept of entropy in simple terms: “The energy of the universe is constant. The entropy of the universe tends to a maximum.”
“The increase of disorder or entropy is what distinguishes the past from the future, giving a direction to time.”
— Stephen Hawking, A Brief History of Time
Entropy and Time
Entropy is one of the few concepts that provide evidence for the existence of time. The “Arrow of Time” is the name given to the idea that time is asymmetrical and flows in only one direction: forward. It points in the direction of the non-reversible process wherein entropy increases.
Astronomer Arthur Eddington pioneered the concept of the Arrow of Time in 1927, writing:
Let us draw an arrow arbitrarily. If as we follow the arrow[,] we find more and more of the random element in the state of the world, then the arrow is pointing towards the future; if the random element decreases[,] the arrow points towards the past. That is the only distinction known to physics.
In a segment of Wonders of the Universe, produced for BBC Two, physicist Brian Cox explains:
The Arrow of Time dictates that as each moment passes, things change, and once these changes have happened, they are never undone. Permanent change is a fundamental part of what it means to be human. We all age as the years pass by — people are born, they live, and they die. I suppose it’s part of the joy and tragedy of our lives, but out there in the universe, those grand and epic cycles appear eternal and unchanging. But that’s an illusion. See, in the life of the universe, just as in our lives, everything is irreversibly changing.
In his play Arcadia, Tom Stoppard uses a novel metaphor for the non-reversible nature of entropy:
When you stir your rice pudding, Septimus, the spoonful of jam spreads itself round making red trails like the picture of a meteor in my astronomical atlas. But if you stir backwards, the jam will not come together again. Indeed, the pudding does not notice and continues to turn pink just as before. Do you think this is odd?
(If you want to dig deeper on time, I recommend the excellent book by John Gribbin, The Time Illusion.)
“As a student of business administration, I know that there is a law of evolution for organizations as stringent and inevitable as anything in life. The longer one exists, the more it grinds out restrictions that slow its own functions. It reaches entropy in a state of total narcissism. Only the people sufficiently far out in the field get anything done, and every time they do they are breaking half a dozen rules in the process.”
— Roger Zelazny, Doorways in the Sand
Entropy in Business and Economics
Most businesses fail—by some estimates, as many as 80% within the first 18 months alone. One way to understand this is with an analogy to entropy.
Entropy is fundamentally a probabilistic idea: For every possible “usefully ordered” state of molecules, there are many, many more possible “disordered” states. Just as energy tends towards a less useful, more disordered state, so do businesses and organizations in general. Rearranging the molecules — or business systems and people — into an “ordered” state requires an injection of outside energy.
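The counting argument above can be made concrete. A short sketch (the 20-particle setup is just an illustration) counts the ways N gas particles can be split between the two halves of a box: the perfectly “ordered” state, with every particle on one side, can happen in exactly one way, while the “disordered” even split can happen in vastly more ways:

```python
from math import comb

N = 20  # illustrative number of particles

# Number of microstates with exactly k particles in the left half of the box
# is "N choose k"; the more ways, the more probable the macrostate.
all_left = comb(N, 0)         # the single "ordered" arrangement
even_split = comb(N, N // 2)  # the most "disordered" macrostate

print(all_left)    # 1
print(even_split)  # 184756
```

With 20 particles, disorder is nearly 200,000 times more likely than perfect order; with realistic particle counts the ratio becomes astronomical, which is why order never arises by accident.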
Let’s imagine that we start a company by sticking 20 people in an office with an ill-defined but ambitious goal and no further leadership. We tell them we’ll pay them as long as they’re there, working. We come back two months later to find that five of them have quit, five are sleeping with each other, and the other ten have no idea how to solve the litany of problems that have arisen. The employees are certainly not much closer to the goal laid out for them. The whole enterprise just sort of falls apart.
It reminds one distinctly of entropy: For every useful arrangement of affairs towards a common business goal, there are many orders of magnitude more arrangements that will get us nowhere. For progress to be made, everything needs to be arranged and managed in a certain way; we have to input a lot of energy to keep things in an ordered state.
Of course, it’s not a perfect analogy: We have to consider the phenomenon of self-organization that happens in many systems, up to and including human organizations. Given a strong enough goal, a good enough team, and the right incentives, perhaps that group wouldn’t need a lot “outside ordering” — they would manage themselves.
“The … ultimate purpose of life, mind, and human striving: to deploy energy and information to fight back the tide of entropy and carve out refuges of beneficial order.”
— Steven Pinker
In practice, both models seem to be useful at different times. Any startup entrepreneur who has stayed long enough to see a company thrive in unexpected ways knows this. The amount of diligent management needed will vary. In physics, entropy is a law; in social systems, it’s a mere tendency — though a strong one, to be sure.
Entropy occurs in every aspect of a business. Employees may forget training, lose enthusiasm, cut corners, and ignore rules. Equipment may break down, become inefficient, or be subject to improper use. Products may become outdated or be in less demand. Even the best of intentions cannot prevent an entropic slide towards chaos.
Successful businesses invest time and money to minimize entropy. For example, they provide regular staff training, good reporting of any issues, inspections, detailed files, and monitoring reports of successes and failures. Anything less will mean almost inevitable problems and loss of potential revenue. Without the necessary effort, a business will reach the point of maximum entropy: bankruptcy.
Fortunately, unlike thermodynamic systems, a business can reverse the impact of entropy. A balance must be struck between creativity and control, though. Too little autonomy for employees results in disinterest, while too much leads to poor decisions.
Entropy in Sociology
Without constant maintenance from individuals and dominant institutions, societies tend towards chaos. Deviant behavior escalates, a dynamic captured by the “broken windows” theory: visible signs of disorder invite further disorder.
Sociologist Kenneth Bailey writes:
When I began studying the notion of entropy it became clear to me that thermodynamic entropy was merely one instance of a concept with much broader applications … I became convinced that entropy applied to social phenomena as well.
One example of what happens when entropy increases unchecked occurred in the Kowloon Walled City. After the British took control of Hong Kong, Kowloon was abandoned by the government for a substantial length of time. At one point, an estimated 33,000 residents were crammed into 300 buildings over 6.4 acres, making Kowloon the most densely populated place on earth. With no space for new construction, stories were added to the existing buildings. Because of minimal water supplies and a lack of ventilation (no sunlight or fresh air reached lower levels), the health of residents suffered. A community of unlicensed medical professionals flourished, alongside brothels and gambling dens.
With no one controlling the city, organized crime gangs took over. It became a haven for lawlessness. Though police were too scared to make any attempts to restore order, residents did make desperate attempts to reduce the entropy themselves. Groups formed to improve the quality of life, creating charities, places for religious practices, nurseries, and businesses to provide income.
In 1987, the Hong Kong government acknowledged the state of Kowloon. The government demolished and rebuilt the city, evicting residents and destroying all but a couple of historic buildings. Although reasonable compensation was provided for ex-residents, many were unhappy with the rebuilding project.
Looking at pictures and hearing stories from Kowloon, we have to wonder if all cities would be that way without consistent control. Was Kowloon an isolated instance of a few bad apples giving an otherwise peaceful place a terrible reputation? Or is chaos our natural state?
Needless to say, Kowloon was not an isolated incident. We saw chaos and brutality unleashed during the Vietnam War, when many young men with too much ammunition and too few orders set about murdering and torturing every living thing they encountered. We see it across the world right now, where places with no law enforcement (including Somalia and Western Sahara) face incessant civil wars, famine, and high crime rates.
Sociologists use an intuitive term for this phenomenon: social entropy. Societies must expend constant effort to stem the inevitable march towards dangerous chaos. The reduction of social entropy tends to require a stable government, active law enforcement, an organized economy, meaningful employment for a high percentage of people, infrastructure, and education.
However, the line between controlling entropy and suppressing people’s liberty is a thin one. Excessive control can lead to a situation akin to Foucault’s panopticon, wherein people are under constant surveillance, lack freedom of speech and movement, are denied other rights as well, and are subject to overzealous law enforcement. This approach is counterproductive and leads to eventual rebellion once a critical mass of dissenters forms.
“Everything that comes together falls apart. Everything. The chair I’m sitting on. It was built, and so it will fall apart. I’m going to fall apart, probably before this chair. And you’re going to fall apart. The cells and organs and systems that make you you—they came together, grew together, and so must fall apart. The Buddha knew one thing science didn’t prove for millennia after his death: Entropy increases. Things fall apart.”
— John Green, Looking for Alaska
Entropy in Our Everyday Lives
We have all observed entropy in our everyday lives. Everything tends towards disorder. Life always seems to get more complicated. Once-tidy rooms become cluttered and dusty. Strong relationships grow fractured and end. Formerly youthful faces wrinkle and hair turns grey. Complex skills are forgotten. Buildings degrade as brickwork cracks, paint chips, and tiles loosen.
Entropy is an important mental model because it applies to every part of our lives. It is inescapable, and even if we try to ignore it, the result is a collapse of some sort. Truly understanding entropy leads to a radical change in the way we see the world. Ignorance of it is responsible for many of our biggest mistakes and failures. We cannot expect anything to stay the way we leave it. To maintain our health, relationships, careers, skills, knowledge, societies, and possessions requires never-ending effort and vigilance. Disorder is not a mistake; it is our default. Order is always artificial and temporary.
Does that seem sad or pointless? It’s not. Imagine a world with no entropy — everything stays the way we leave it, no one ages or gets ill, nothing breaks or fails, everything remains pristine. Arguably, that would also be a world without innovation or creativity, a world without urgency or a need for progress.
Many people cite improving the world for future generations as their purpose in life. They hold protests, make new laws, create new forms of technology, work to alleviate poverty, and pursue other noble goals. Each of us makes our own efforts to reduce disorder. The existence of entropy is what keeps us on our toes.
Mental models are powerful because they enable us to make sense of the disorder that surrounds us. They provide us with a shortcut to understanding a chaotic world and exercising some control over it.
In The Information: A History, a Theory, a Flood, James Gleick writes,
Organisms organize. … We sort the mail, build sand castles, solve jigsaw puzzles, separate wheat from chaff, rearrange chess pieces, collect stamps, alphabetize books, create symmetry, compose sonnets and sonatas, and put our rooms in order… We propagate structure (not just we humans but we who are alive). We disturb the tendency toward equilibrium. It would be absurd to attempt a thermodynamic accounting for such processes, but it is not absurd to say we are reducing entropy, piece by piece. Bit by bit … Not only do living things lessen the disorder in their environments; they are in themselves, their skeletons and their flesh, vesicles and membranes, shells, and carapaces, leaves, and blossoms, circulatory systems and metabolic pathways—miracles of pattern and structure. It sometimes seems as if curbing entropy is our quixotic purpose in the universe.
The question is not whether we can prevent entropy (we can’t), but how we can curb, control, work with, and understand it. As we saw at the start of this post, entropy is all around us. Now it’s probably time to fix whatever mistake an employee or coworker just made, clear up your messy desk, and reheat your cold coffee.
How Can I Use Entropy to My Advantage?
This is where things get interesting.
Whether you’re starting a business or trying to bring about change in your organization, understanding the abstraction of entropy as a mental model will help you accomplish your goals in a more effective manner.
Because things naturally move to disorder over time, we can position ourselves to create stability. There are two types of stability: active and passive. Consider a ship, which, if designed well, should be able to sail through a storm without intervention. This is passive stability. A fighter jet, in contrast, requires active stability. The plane can’t fly for more than a few seconds without having to adjust its wings. This adjustment happens so fast that it’s controlled by software. There is no inherent stability here: if you cut the power, the plane crashes.
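The fighter jet’s software is, at heart, a feedback loop: measure the deviation, apply a correction, repeat. A toy proportional controller (the gain, drift, and noise values are invented for illustration) shows active stability in action; with the correction switched off, the “plane” drifts steadily away from level flight:

```python
import random

def fly(correct: bool, steps: int = 1000, gain: float = 0.5) -> float:
    """Simulate pitch drifting under random disturbances, with or without control."""
    random.seed(42)      # fixed seed so each run sees the same disturbances
    pitch = 0.0          # 0.0 = level flight
    for _ in range(steps):
        pitch += random.uniform(-1.0, 1.0) + 0.1  # noise plus a steady drift
        if correct:
            pitch -= gain * pitch  # active stability: push back towards level
    return abs(pitch)

print(fly(correct=True) < fly(correct=False))  # True: control keeps pitch near level
```

The controlled run stays within a degree or two of level; the uncontrolled run ends up far from it. Cut the power to the loop and the stability disappears with it.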
People get in trouble when they confuse the two types of stability. Relationships, for example, require attention and care. If you assume that your relationship is passively stable, you’ll wake up one day to divorce papers. Your house is also not passively stable. If not cleaned on a regular basis, it will continue to get messier and messier.
Organizations require stability as well. If your company relies on debt, it is actively, not passively, stable. If you want a margin of safety, the people extending you the credit should themselves be passively stable. If you’re both actively stable, then when the power gets cut, you’re likely to be in a position of weakness, not strength.
With active stability, you’re applying energy to a system in order to maintain some advantage (keeping the plane from crashing, the relationship going, the house clean). If we move a little further down the rabbit hole, we can see how applying the same amount of energy can yield totally different results.
Let’s use the analogy of coughing. Coughing is the transfer of energy as heat. If you cough in a quiet coffee shop, which you can think of as a system with low entropy, you cause a big change. Your cough is disruptive. On the other hand, if you cough in Times Square, a system with a lot of entropy, that same cough will have no impact. While you change the entropy in both cases, the impact you have with the same cough is proportional to the existing entropy.
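The analogy roughly tracks Clausius’s own relation, ΔS = Q/T: the same injection of energy produces a larger entropy change in a cold (quiet, ordered) system than in a hot (noisy, disordered) one. A quick sketch, with invented numbers standing in for the coffee shop and Times Square:

```python
def entropy_change(heat_joules: float, temp_kelvin: float) -> float:
    """Clausius relation for a small reversible transfer: dS = dQ / T."""
    return heat_joules / temp_kelvin

cough = 1.0  # the same "cough" of energy each time, in joules (illustrative)

quiet_shop = entropy_change(cough, temp_kelvin=100.0)     # cold / ordered system
times_square = entropy_change(cough, temp_kelvin=1000.0)  # hot / disordered system

print(quiet_shop > times_square)  # True: same energy, bigger relative impact
```

The denominator is doing the work: the “hotter” (more disordered) the system already is, the less any one contribution moves it.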
Now think of this example in relation to your organization. You’re applying energy to get something done. The higher the entropy in the system, the less efficient the energy you apply will be. The same person applying 20 units of energy in a big bureaucracy is going to see less impact than someone applying the same 20 units in a small startup.
You can think about this idea in a competitive sense, too. If you’re starting a business and you’re competing against very effective and efficient people, a lot of effort will get absorbed. It’s not going to be very efficient. If, on the other hand, you compete against less efficient and effective people, the same amount of energy will be more efficient in its conversion.
In essence, for a change to occur, you must apply more energy to the system than is extracted by the system.