Here we're talking about the thermodynamic definition of entropy. Colloquially the word entropy is taken to mean chaos, but in thermodynamics, that's not quite accurate. In its most basic sense, entropy is a measure of disorder, randomness, or how spread out energy is in a system. It describes the unavailability of a system's energy to do work.
Entropy is inversely related to energy "quality." If a system's energy is high quality, it has low entropy and therefore has more potential to do work. Thermal energy can do work in what's called a heat engine, whose theoretical maximum efficiency is set by the Carnot cycle. This is how almost all of our electricity is made in power plants and how our car engines work. The maximum efficiency of a heat engine depends on the temperature difference between the hot source and the cold sink. Keeping the cold sink temperature constant, a hotter hot source will convert thermal energy into mechanical energy much more efficiently. This is because a hotter hot source supplies higher-quality, lower-entropy energy.
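The Carnot limit can be written as efficiency = 1 − (cold temperature ÷ hot temperature), with temperatures in kelvin. Here is a minimal sketch, with hot-source temperatures I've picked purely for illustration:

```python
def carnot_efficiency(t_hot_k: float, t_cold_k: float) -> float:
    """Maximum fraction of heat a heat engine can turn into work
    (Carnot limit: 1 - T_cold / T_hot, temperatures in kelvin)."""
    return 1.0 - t_cold_k / t_hot_k

# Ambient air (~300 K) as the cold sink, two different hot sources.
print(carnot_efficiency(600.0, 300.0))  # 0.5   -> at most half the heat can become work
print(carnot_efficiency(900.0, 300.0))  # ~0.67 -> a hotter source raises the ceiling
```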
We burn fossil fuels mostly to create very hot heat sources and use the ambient air as the cold sink in heat engines. The main thing limiting us from making hot sources even hotter is the melting point and other properties of the materials used in heat engines. So by burning fossil fuels in a heat engine we create high-temperature, relatively low-entropy thermal energy. One might conclude that we can lower the entropy of energy in this manner. It's true that energy is conserved when it is converted from one form to another; that's what the First Law of Thermodynamics says. But there is a cost that the universe puts on this conversion of energy. That cost is entropy.
The Second Law of Thermodynamics says that any spontaneous process is accompanied by an increase in the entropy of the universe.
Let's take an example: producing electricity. We start with a lump of coal. Coal is basically a reservoir of chemical energy. When we ignite coal in the presence of oxygen (combustion), a chemical reaction takes place that releases a lot of heat. We use that heat to boil water and produce steam. So we have converted chemical energy in coal to thermal energy in steam, and that high-temperature steam carries relatively low-entropy thermal energy. But here's the catch. The combustion itself, with its hot exhaust gases and the heat lost to the surroundings, has increased entropy by an even greater amount, resulting in a net increase in the entropy of the universe. We then take that high-temperature, low-entropy steam and run it through a turbine that spins a generator, converting thermal energy into electrical energy. The electrical energy is very low entropy, but we have created even more entropy, since the temperature of the steam has decreased and moved closer to the surrounding temperature. Besides that, there is friction in the turbine's bearings and there are eddy currents in the generator that end up as heat. Crucially, this is low-quality, high-entropy heat from which little to no usable energy is available to do work.
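To see why letting heat flow from hot to cold always raises total entropy, here is a minimal sketch using the textbook relation ΔS = Q ÷ T; the heat quantity and temperatures are round numbers I've chosen for illustration.

```python
# Entropy bookkeeping when heat Q leaves a hot source and ends up in cold surroundings.
# Textbook relation: change in entropy = Q / T, with temperatures in kelvin.
Q = 1000.0       # joules of heat transferred (illustrative)
T_hot = 800.0    # hot combustion gases / steam (illustrative)
T_cold = 300.0   # ambient surroundings

dS_hot = -Q / T_hot      # the hot side loses entropy:  -1.25 J/K
dS_cold = Q / T_cold     # the cold side gains entropy: +3.33 J/K

print(dS_hot + dS_cold)  # ~ +2.08 J/K -> the entropy of the universe went up
```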
In our electricity generation process we have lowered the entropy of parts of the system, but at the expense of increasing the entropy of other parts. No process in the universe can occur such that overall entropy decreases. Local decreases in entropy are always accompanied by even greater increases in entropy elsewhere.
This idea is critical to civilization, and its far-reaching implications are, I feel, significantly underappreciated.
If two things are at the same temperature, they are said to be in thermal equilibrium. This is the point of maximum entropy, and their energy is not available to do useful work between them. But temperature isn't the only thing entropy applies to; it shows up in other forms of energy too. A weight raised above the ground has lower entropy than a weight resting on the ground, because as the weight is lowered its energy is available for conversion into work. A pressurized air tank has low entropy, whereas an empty tank (that is, one at the same pressure as the surroundings) is at its maximum entropy. Entropy degrades the usefulness of energy.
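To put rough numbers on those two pictures, here is a minimal sketch; the mass, height, gas amount, and pressures are all assumptions picked for illustration, using the standard potential-energy and isothermal-expansion formulas.

```python
import math

# 1) A raised weight: its potential energy m*g*h is available as work on the way down.
m, g, h = 10.0, 9.81, 5.0                  # a 10 kg weight raised 5 m (illustrative)
print(m * g * h)                           # ~490 J of recoverable work

# 2) A pressurized air tank: for an ideal gas released isothermally, the work limit
#    is n*R*T*ln(P_tank / P_ambient). Once the pressures equalize, ln(1) = 0 and
#    no work is available -- that is the maximum-entropy state described above.
n, R, T = 40.0, 8.314, 293.0               # ~40 mol of air at room temperature (illustrative)
print(n * R * T * math.log(10.0 / 1.0))    # ~224,000 J while the tank is at 10x ambient pressure
print(n * R * T * math.log(1.0 / 1.0))     # 0 J once the tank is "empty" (pressures equalized)
```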
Why is that important? Because we as a civilization need energy. Actually, we need "free energy." Without getting into the details, free energy is the energy available to do work. Ultimately we humans need free energy in order to do work to manufacture goods, move around, fuel our bodies, and everything else that we do. As entropy goes up, free energy goes down. If there is no free energy, everything grinds to a halt. Imagine a large box. Now imagine what a state of maximum entropy and no free energy looks like inside that box. Matter is uniform, everything has the same chemical composition, it's flat, it's all the same temperature and pressure and moisture content. It's not a place we would want to live – or could live.
Given enough time, that's what the Earth might look like. But fortunately there is a source of free energy constantly being delivered to the Earth system. It comes in the form of electromagnetic radiation from a nuclear fusion reactor about 93 million miles away – the sun. The low-entropy energy from the sun provides the free energy on Earth that allows life to exist. Each time that energy is transformed, its entropy increases and its free energy goes down.
The most useful forms of energy for humans are those with the lowest entropy. The trouble is, creating low-entropy sources of energy often requires creating a lot of entropy somewhere else. Burning gasoline in a car, for example, to send you zooming down the interstate at 70 mph gives you and the car ordered, low-entropy kinetic energy, but it creates a massive quantity of entropy elsewhere. Most of the fuel's energy has gone to heat: heat in the products of combustion coming out of the exhaust pipe, heat being shed from the engine through the radiator, heat from friction in the drivetrain, and even heat in the surrounding air molecules from drag. Now you need to get off the freeway and slow down for the exit, so you hit the brakes. Braking converts your forward kinetic energy into high-entropy heat in your brake pads and rotors. That energy can never be recovered to do useful work. The energy has not gone away; its entropy has simply become too high.
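For a sense of scale, here is a minimal sketch of the kinetic energy the brakes must dump as heat at that exit; the car's mass is an assumed, typical figure, not something from the article.

```python
# Kinetic energy turned into brake heat when stopping from interstate speed.
mass_kg = 1500.0                      # assumed mass of a typical car
speed_m_s = 70.0 * 0.44704            # 70 mph converted to m/s (~31.3 m/s)

kinetic_energy_j = 0.5 * mass_kg * speed_m_s ** 2
print(round(kinetic_energy_j))        # ~734,000 J, all of it ending up as
                                      # high-entropy heat in the pads and rotors
```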
Take another example. Freezers work by using a refrigeration system that moves heat from inside the freezer to outside. Once the temperature inside drops below 32°F, water freezes and we get ice. Moving heat from inside the freezer to outside takes electricity (low entropy) to run the compressor, so we end up with more heat outside the freezer than we removed from inside. That's because the electricity has also become heat: it has gone from low-entropy electrical energy to high-entropy thermal energy. So the process of making low-entropy ice creates more entropy overall.
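In energy-balance terms, the heat dumped outside equals the heat pulled from inside plus the electricity that ran the compressor. A minimal sketch, with an assumed coefficient of performance (COP) for a household freezer:

```python
# First-law balance for a refrigeration cycle:
#   heat rejected outside = heat removed inside + electrical work put in
cop = 2.0                           # assumed: joules of heat moved per joule of electricity
heat_removed_inside_j = 100_000.0   # illustrative amount pulled from the water and the air

work_in_j = heat_removed_inside_j / cop          # 50,000 J of low-entropy electricity
heat_rejected_outside_j = heat_removed_inside_j + work_in_j

print(heat_rejected_outside_j)      # 150,000 J of high-entropy heat in the kitchen --
                                    # more heat outside than was removed from inside
```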
What can we do with this knowledge?
Humans use energy in order to make things and do things to improve our lives. Is all energy use the same? Does it always produce the same amount of entropy? Is using 1000 joules (a unit of energy) to heat our house the same as using 1000 joules to drive to the store? What about using 1000 joules to make a cold drink? Are there different ways of using that 1000 joules for the same end goal that generate different amounts of entropy?
Let's say the goal of making ice is so we can have a cold drink, and we use a freezer with some water in an ice tray to make ice cubes. The longer the freezer runs, the more electricity it uses and the more entropy it generates. After running for a while, it will have moved enough heat out of the water, and out of the air inside the freezer, to freeze the ice. But it doesn't just have to remove the heat that started inside. As soon as heat starts to be removed, a temperature difference is created between inside and outside. That means thermal energy will begin to flow into the freezer through its walls at a rate that is proportional to the temperature difference and the surface area, and inversely proportional to the walls' resistance to heat flow (R-value). So in order for the water in the ice tray to ever have a chance of freezing, the freezer must remove more heat than what is flowing in through the walls. The more heat that flows into the freezer, the more electricity is required to remove it, and the more entropy is generated.
Now, in order to generate less entropy in the creation of this ice, we need to use less electricity. To do that, we need to limit the amount of heat that flows into the freezer. We could use a smaller freezer (reducing surface area), or we could increase the resistance to heat flow of the walls by insulating them better. Either change gets us the same end result we desire, ice for our drink, while generating less entropy in the process, as the sketch below shows. Same utility value for humans, less entropy.
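Here is a minimal sketch of that heat-leak relation, heat flow = area × temperature difference ÷ R-value, comparing a baseline freezer with a smaller, better-insulated one; every number is an assumption chosen for illustration.

```python
def heat_leak_watts(area_m2: float, delta_t_k: float, r_value_si: float) -> float:
    """Steady heat flow through the walls: Q = area * temperature difference / R.
    R is in SI units (m^2*K/W); a bigger R means better insulation."""
    return area_m2 * delta_t_k / r_value_si

delta_t = 40.0  # e.g. -18 degC inside, 22 degC in the kitchen

baseline = heat_leak_watts(area_m2=4.0, delta_t_k=delta_t, r_value_si=1.0)  # 160 W leaking in
improved = heat_leak_watts(area_m2=2.5, delta_t_k=delta_t, r_value_si=2.0)  #  50 W leaking in

print(baseline, improved)  # less heat leaking in -> less electricity -> less entropy
```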
Remember, entropy describes the unavailability of energy to do work. So if we generate less entropy in the process of making ice for a cold drink, we are left with more electricity to do other useful work we also desire.
Modern humans have been gifted with pockets of very, very low-entropy energy on Earth. We call these fossil fuels and burn them, converting their energy into other forms to improve our lives. In the process, we generate lots of entropy. These low-entropy fossil fuels were created over millions of years, so on any human timescale we can consider them finite.
Much like the freezer example, not all fossil fuel use is the same in terms of the utility value to humans and the entropy produced in converting it to other forms. Let's take natural gas, for example. If we burn a cubic foot of natural gas outside in the open, we convert its chemical energy into heat that simply dissipates into the surrounding air. We have not extracted anything useful, so its utility value is zero.
Now we take that cubic foot of natural gas and burn it inside the combustion chamber of a power plant to spin a generator and make electricity. 40% of the energy in the gas ends up as electricity we can use, and 60% is dissipated to the surroundings. That electricity then goes to our homes for us to use. Let's say we use this energy to power our television to watch a documentary on CuriosityStream about Order and Disorder hosted by Dr Jim Al-Khalili. That electrical energy powers the electronics in the television to produce light and sound in a way that we value. Some of that energy is converted to heat in the television's circuitry, and even the light and sound are converted into small amounts of heat when they hit things in the room. Of course, you can't feel this heat. The thermal energy is so dispersed, its entropy so high, that the temperature rise in any one location is imperceptible. That thermal energy can never be recovered to do useful work. So in this case of burning natural gas to produce electricity to power our television, we have extracted a human utility value of a little under 40% of the gas's energy before all of it is eventually converted to high-entropy dispersed heat.
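In round numbers, a cubic foot of natural gas carries roughly 1,000 BTU (about 1.06 MJ) of chemical energy; that figure is a typical value, not one from the article. A minimal sketch of where it goes under the 40/60 split above:

```python
# Where the energy in one cubic foot of natural gas goes at a 40%-efficient plant.
gas_energy_j = 1000 * 1055             # ~1.06 MJ; ~1,000 BTU per cubic foot is a typical figure

electricity_j = 0.40 * gas_energy_j    # ~422,000 J delivered as low-entropy electricity
waste_heat_j = 0.60 * gas_energy_j     # ~633,000 J dispersed as high-entropy heat

print(round(electricity_j), round(waste_heat_j))
# The electricity eventually becomes dispersed heat too; the ~40% is the
# utility value we managed to extract from the gas along the way.
```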
Here's the interesting part. In both examples we generated the same amount of entropy. The energy in the natural gas eventually spread out into high-entropy thermal energy that is incapable of doing work. In one case we got no utility value; in the other we got about 40% utility value – for the same cost in entropy.
Let's take it a step further. Let's burn that cubic foot of natural gas in a power plant, make electricity, and watch Dr Jim Al-Khalili again, but with a twist. Say it's winter. It's cold. You want to be warm. You could plug in an electric heater and convert some electrical energy to thermal energy, but that would take away some of the electricity you were planning on watching Order and Disorder with. What if, instead of dispersing the heat from the power plant's heat engine to the surroundings, we pumped that heat to your house through a heat network? If half of that heat makes it to your home, then 30% more of the total energy in that cubic foot of natural gas is contributing to your utility value. We've generated the same amount of entropy as in the first two examples, but our utility value is now around 70%.
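Putting the three natural-gas scenarios side by side, using the article's round percentages, a minimal sketch:

```python
# Utility value extracted from the same cubic foot of natural gas in each scenario.
open_burn = 0.0                       # burned in the open: nothing useful captured
power_plant = 0.40                    # 40% becomes electricity for the television
plant_plus_heat = 0.40 + 0.5 * 0.60   # electricity plus half the waste heat piped home

for name, fraction in [("open burn", open_burn),
                       ("power plant", power_plant),
                       ("plant + heat network", plant_plus_heat)]:
    print(f"{name}: {fraction:.0%} of the gas's energy delivers utility")

# The entropy generated is ultimately the same in all three cases;
# what differs is how much utility we capture on the way to high-entropy heat.
```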
Couldn't we just send that extra 30% of energy we used as heat in the third example to the house as electricity instead? No, because the thermal energy left over after the power plant's heat engine has done its work is high-entropy and lacks the free energy needed to drive the turbine and make more electricity. So we either disperse it to the environment, or use it for some other purpose humans value that doesn't require a low-entropy energy source like electricity.
If every conversion of energy involves an increase in entropy overall, then eventually won't the entropy of the universe keep going up and reach a maximum? Yes. This is called the "heat death" of the universe – when entropy is at maximum and there is no more free energy.
I believe that one of humanity's key goals should be to maximize the ratio Utility Value ÷ Entropy. In so doing, we can improve the lives of humans and other species while doing the least damage to the environment we live in. Do you agree? Why or why not?
P.S. I apologize to you physicists and engineers for oversimplifying some of these concepts. This article is intended for someone without a heavy science background.