
INSIGHT

The concept of entropy comes from thermodynamics and describes the tendency of isolated systems towards disorder. The word derives from the Greek en, "inside", and tropé, "change" or "transformation".

Briefly, and in simplified terms, entropy is nothing more than a measure of the relative disorder of a system, such as the Universe, which, as far as we can tell, is the system that contains all other systems. It is measured in joules, the unit of heat, energy and work, divided by kelvin, the unit of temperature (J/K).
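To get a feel for those units, here is a minimal worked example in Python, using the approximate, round-number latent heat of fusion of ice (the variable names are our own): when heat Q flows at a constant temperature T, the entropy change is ΔS = Q/T.

```python
# Entropy change for heat flow at (roughly) constant temperature:
#   delta_S = Q / T   (joules per kelvin)
# Worked example with assumed, approximate values:
# melting 10 g of ice at its melting point.
latent_heat_of_fusion = 334.0   # J per gram of ice (approximate)
mass_g = 10.0                   # grams of ice melted
T_melt = 273.15                 # melting point of ice, in kelvin

Q = latent_heat_of_fusion * mass_g   # heat absorbed, in joules
delta_S = Q / T_melt                 # entropy gained by the system
# delta_S ≈ 12.2 J/K
```

The ice gains roughly 12 J/K of entropy as it turns into water, matching the ice-to-water example discussed below.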

Entropy increases when we pass from an ordered system, like ice, composed of crystalline blocks with a precise geometric shape, to a disordered one, liquid water. A system tends to increase its entropy spontaneously: disordered configurations are the most probable, while creating order requires effort. For example, if a drop of ink falls into a glass of water, it disperses and ceases to be a drop, an ordered system, until the water and ink molecules form a uniform mixture (a disordered system). While this process occurs spontaneously, the reverse one, separating the ink from the water, requires work and energy from outside the system in question.
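The ink-drop picture can be sketched with a toy simulation: particles start bunched in one corner of a box (the "drop"), random-walk freely, and a coarse-grained, Shannon-style entropy of their positions rises as they spread out. Everything here (function names, box size, step counts) is illustrative, not a physical model.

```python
import math
import random

def coarse_entropy(positions, n_bins=10, size=100.0):
    """Entropy of particle positions binned into equal cells:
    -sum(p * ln p) over occupied bins. Zero when all particles
    share one bin; maximal when they are spread uniformly."""
    counts = [0] * n_bins
    for x in positions:
        counts[min(int(x / (size / n_bins)), n_bins - 1)] += 1
    total = len(positions)
    return -sum((c / total) * math.log(c / total)
                for c in counts if c > 0)

random.seed(42)
# All "ink" particles start in one corner of a box [0, 100]: the ordered drop.
positions = [random.uniform(0.0, 9.9) for _ in range(500)]
s_start = coarse_entropy(positions)   # 0.0: everything in one bin

# Random walk: each particle takes small random steps, clamped to the walls.
for _ in range(2000):
    positions = [min(max(x + random.uniform(-1.0, 1.0), 0.0), 100.0)
                 for x in positions]
s_end = coarse_entropy(positions)
# s_end > s_start: the coarse-grained disorder has grown spontaneously.
```

Reversing the walk to reassemble the drop would require singling out each particle and pushing it back, which is exactly the "work and energy external to the system" the text describes.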

Considering the Universe as an isolated system that exchanges neither matter nor energy with anything outside itself, then according to the Laws of Thermodynamics, which we will see later, the energy inside it is constant, but the total entropy keeps increasing until it reaches equilibrium. This means energy can be neither created nor destroyed in the Universe, only transformed, dissipating part of it as heat and increasing the entropy: this is the price of any kind of transformation and work.

For example, if we burn a piece of coal, its energy is converted into other forms of energy and heat. The combustion process cannot be reversed, and therefore the original piece of coal cannot be reconstituted. And even if it could be, the system would have changed forever: entropy increases with every transformation, unless the system becomes part of a larger system whose entropy is lower.

Gradually, therefore, the entropy of a system reaches equilibrium, the state in which it has its greatest value and no free energy remains, because it has all been dissipated as heat: for the Universe, supposing it has a finite mass, this means a uniform temperature at every point, and its Heat Death.

[Figure: a diagram which explains what an increase in entropy means]

Out of pure curiosity, we point out that the term entropy is also used in fields beyond physics proper:

Information: introduced by Claude Shannon's Theory of Information; in this context, entropy is the amount of uncertainty in a message or information source

Economics: Nicholas Georgescu-Roegen applied the concept of entropy to economics, arguing that every production process either increases or leaves unchanged the entropy of the Earth-system, making energy increasingly unavailable and subtracting it from future generations, who will have to deal with the disorder poured into the environment by the increased entropy
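Shannon's information entropy, mentioned in the first item above, has a compact formula: H = -Σ p·log2(p), the average uncertainty per symbol in bits. A minimal sketch in Python (the function name is our own):

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Average information per symbol, in bits: H = -sum(p * log2 p),
    where p is the frequency of each distinct symbol in the message."""
    counts = Counter(message)
    total = len(message)
    return -sum((c / total) * math.log2(c / total)
                for c in counts.values())

shannon_entropy("aaaa")  # 0.0: one symbol, no uncertainty at all
shannon_entropy("abab")  # 1.0: two equally likely symbols, one bit each
```

A perfectly predictable message carries zero entropy; the more uniform the symbol frequencies, the higher the entropy, mirroring the thermodynamic intuition that uniform mixtures are the most disordered.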


PIC CREDITS

Entropy – Assignment Point