Rebellion — an introduction

I don't have imagination

I am a nerd. A complete, absolute, full über-nerd. I love learning about everything. I read everything, listen to everything, think about everything. This doesn’t mean I’m smart (hint: I’m most definitely not), only that I’m your typical nerd. It also means I waste a lot of time digging up hidden curiosities on the internet (iconic photographs, WWI poetry, news from Run II at the LHC, Islamic theology, Westerosi history, obscure blogs on metaphysics…), looking at lists, feeding the insane infatuation I have with some of my favorite actresses, learning how to use Mathematica like a pro, or writing useless posts like this one.

Look at her. LOOK. AT. HER.

In other words, I’m the full package. Videogames, TV series? Check. Tolkien, Lewis, Star Wars, Harry Potter, ASOIAF? Check. Indie, hipster movies? Ugh, sadly check. Cynicism, sarcasm, caustic comments? Check. Nerdy comics? Check. Classical music, or perhaps Southern Gothic country music? Check. Science, Math, Philology, Art, History, Philosophy, Religion? Check, check, check, check, and so much check.

I like to always have a book in my hands. I had just re-read, for the third time, a book on apologetics, “Orthodoxy”, by the massive Chesterton, before re-reading the vampire-history-epistolary-novel-travelogue “The Historian” and then a history of the Fall of Constantinople, immediately followed by the popular science book “Facts and Mysteries in Elementary Particle Physics”, by Nobel Prize physicist Martinus Veltman.

I’m saying all of this to put into context the fact that, two weeks ago, I finished reading yet another book. On science. For fun. But not popular science. Hardcore science. With equations and plots and shit. As you can see, I’m not very well in the head*.

Have you looked at her yet? I SAID LOOK AT HER.

It was the classic “Thermodynamics”, by the great Enrico Fermi. Ahh, nothing sweeter than going back to the basics, to those years of old when you were just starting to learn fundamental physics and were fascinated by the simple yet astonishing truths that pretty little theories such as Classical Thermodynamics explain.

Although it’s a basic book, I learned many interesting things from it. For example, some of you may know that glaciers (those massive ice conglomerations on top of mountains or at the poles) actually move. Well, it turns out that a combination of the fact that the density of ice is smaller than that of water (something unusual among substances, but that’s why ice floats!) and the famous Clapeyron equation is behind this. Really cool stuff; if one day I write about something other than philosophical musings or bad poetry in this blog of mine (like, I don’t know, PHYSICS, MY ACTUAL JOB?) I’d love to talk about this.

In his little book Fermi does a wonderful job explaining the eternally famous Three Laws of Thermodynamics (the same ones some nutcases try to disprove from time to time). The First Law, about energy conservation. The Second Law, about entropy. The Third Law, about absolute zero. Some pedantic guys say there’s a Zeroth Law, but that’s just putting a fancy name on a definition, in my humble opinion.

Anyway, perhaps the most famous of these is the First, which in layman’s terms states that:

The total energy in an isolated system is conserved.

There. Everyone has heard about this law, in one version or another, maybe in the form “Energy isn’t created or destroyed, it only gets transformed”. You see this every day: the Sun’s energy, which is light, gets transformed into heat; plants also transform light into their “food” (chemical energy), which animals eat and then transform into motion, heat, and famous deeds like going to war or making babies, etc., etc.

But by far, the one that most captivates people’s minds is the Second Law which, in its most common formulation, states that:

The total entropy in an isolated system either increases or stays the same.

The subject of this post is to dig a little deeper into this law and, in subsequent posts, to discuss what philosophical conclusions (if any) we can extract from it. In particular, in a series of three posts, of which this is the first instance, I’ll discuss that shiny little word that crowns the present text and that modern culture, in its tireless effort to make everything boring, pitiful, and generally shitty, has misused so often it’s just not funny anymore.

So brace yourselves, another of Manuel’s famous rants is coming.


I’d like to start with the (not very intuitive) thermodynamic definition of entropy, in its PG-13 version:

Entropy is a property of a system, of which only the changes matter.
The change in the entropy of a system is the amount of heat per unit temperature that the system absorbs:

change in entropy = heat absorbed divided by temperature, or mathematically,

ΔS = Q/T,

where ‘Δ’ (read “delta”) means change, ‘S’ means entropy (because its inventor, Rudolf Clausius, decided to honor the engineer Sadi Carnot, whose books Clausius had been studying for over 15 years), ‘Q’ means heat (maybe because it has the same “sound” as ‘c’, as in “caloris”, which is Latin for heat, and people already used ‘c’ for constants), ‘T’ means temperature, and ‘/’ means divided by. The temperature here is actually the absolute temperature and is measured in kelvins; absolute temperatures are called like that because they’re always positive. Thus, 27 Celsius, which in the primitive usage of the Americans is about 80 Fahrenheit, is equivalent to roughly 300 kelvins. Here’s a nice interactive webpage to make temperature conversions. Réaumur and Rankine are other weirdo temperature units that some pedantic bastards wanted to use. No one really cares about them.
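If, like me, you’d rather poke at formulas with a computer, here’s a minimal Python sketch of the conversions and of ΔS = Q/T (the function names and the 600 J of heat are mine, purely for illustration):

```python
def celsius_to_kelvin(t_c):
    """Absolute temperature: 27 C -> 300.15 K (the text rounds to 300 K)."""
    return t_c + 273.15

def celsius_to_fahrenheit(t_c):
    """For the primitive usage of the Americans."""
    return t_c * 9.0 / 5.0 + 32.0

def entropy_change(q_joules, t_kelvin):
    """Delta S = Q / T, for heat Q absorbed at (roughly) constant temperature T."""
    return q_joules / t_kelvin

print(celsius_to_kelvin(27.0))        # 300.15 kelvins
print(celsius_to_fahrenheit(27.0))    # 80.6 Fahrenheit
print(entropy_change(600.0, 300.0))   # 2.0 J/K for 600 J absorbed at 300 K
```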

Clausius, a.k.a. “Lincoln is pissed”.

In summary: never mind (for now) what entropy actually measures; what matters are the changes in it, and these changes are related to how much heat an object with a certain temperature absorbs. Those of you (all two of my readers) who already know some thermodynamics will of course understand my reluctance to write the precise mathematical definitions (differentials, reversibility, integrals and such); I don’t want to complicate things, and it’s not the purpose of this post to give a lecture.

Carnot, in his good-boy-from-the-École-Polytechnique uniform.

Now, I want to make the distinction between heat and temperature. Heat is the amount of energy that “flows” from one object to another in a disordered way. That is, not by pushing, pulling, punching or biting (energy transfer by these mechanisms is called work), all of which involve energy being added or extracted in an orderly fashion, via applied forces. For example, when you rub yourself against another body, energy in the form of heat gets transferred via friction, by contact with it/him/her. Temperature, on the other hand, is a measure of the average energy that an object already has. Something that has a lot of energy in it is called hot (it has a high temperature); something that doesn’t possess as much energy is called cold (low temperature). So temperature is owned by an object, whereas heat is not: heat is only given or taken, surrendered or absorbed.

With this in mind, let us think of an experiment to confirm the Second Law of Thermodynamics:

Imagine I have two systems (for example, two gases), A and B, with temperatures TA and TB, TA being colder than TB: TA<TB (‘<‘ means “smaller than”, while ‘>’ means “greater than” §). A and B are isolated systems, that is, they don’t exchange energy with their environment or with each other, in any way. This can be approximately achieved by enclosing each system with very thick, solid, insulating walls. Wood is really good at this, and some types of plastic too. If you have ever touched a pan on a stove you know metals don’t work. The configuration is labeled “before” in the following picture:


Two isolated systems, A and B, one colder than the other, enter into contact to give a new system C, with a temperature between the previous two.

Now, imagine my two systems, which in the picture are represented by boxes, sharing an insulating wall that is movable. If I now remove the shared wall I’m allowing the two systems to be in contact. Then, the hotter system, B, is going to spontaneously give up some energy to the colder system A in the form of heat Q, and A is going to absorb that same heat.

At the end of the process I’m gonna end up with a system C, composed of the systems A and B put together, at a temperature TC that is in between TA and TB: TA<TC<TB. This is labeled “after” in the picture.

If I use the formula for the change of entropy we discussed above, I can now write:

ΔSA ≈ Q/TA,
ΔSB ≈ – Q/TB;

where I’m calling SA and SB the entropy of the systems A and B respectively. A few words of caution. Notice that there is a negative sign ‘−’ in the equation for the entropy SB. The reason is that I called Q the heat absorbed by A; but B does not absorb heat, it gives it up (to A), which is the opposite (negative) of absorbing. Therefore the change in the entropy of B is negative (its entropy decreased) while that of A is positive (its entropy increased).

Then, the total change of entropy is given by:

ΔSC = ΔSA + ΔSB ≈ Q/TA – Q/TB,

where I’m calling ΔSC the total change in the entropy of my two-system configuration. Notice that, because TA<TB, then 1/TA > 1/TB (try it by putting some numbers!) and therefore Q/TA – Q/TB > 0 !!! The total entropy has increased! The Second Law holds!
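In case you don’t want to put in the numbers by hand, here’s a tiny Python check; the two temperatures and the heat are completely made up, just to see the sign come out right:

```python
# Made-up numbers: A starts at 280 K, B at 320 K, and 100 J flow from B to A.
T_A, T_B = 280.0, 320.0   # absolute temperatures in kelvins, T_A < T_B
Q = 100.0                 # joules absorbed by A (and given up by B)

dS_A = Q / T_A            # ~ +0.357 J/K: the cold system gains entropy
dS_B = -Q / T_B           # -0.3125 J/K: the hot system loses some
dS_C = dS_A + dS_B        # ~ +0.045 J/K: the total still goes up

assert dS_C > 0           # the Second Law survives these particular numbers
print(dS_A, dS_B, dS_C)
```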

Those of you who have read this far without setting your computer on fire, and who have been paying attention, will have noticed that I used the “squiggly” equality sign ‘≈’ in the equations above. This is because I have been somewhat sloppy. Indeed, the temperatures of A and B do not remain constant when I lift the shared wall and allow the systems to be in contact: they change! The temperature of A, TA, increases until it reaches TC, while the temperature TB decreases to TC. In this way, both A and B now form a whole new system C with a new temperature TC.

But my imprecision doesn’t matter, because the conclusion stays the same. Even though TA and TB are changing, TA is always smaller than TB during the whole process, until the very end when they are both equal to TC; and thus Q/TA is always bigger than Q/TB during the whole process, thus keeping ΔSC > 0, that is, the total entropy still increases.
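For the truly skeptical, we can even simulate the sloppiness away. The sketch below is my own toy model (both systems are assumed to have the same heat capacity of 1 J/K, and this time the heat transferred is whatever it takes to equalize the temperatures): it moves the heat in tiny chunks, updating the temperatures as it goes, and the total change in entropy still comes out positive:

```python
# Toy model: two bodies with equal (assumed) heat capacity exchange heat
# in small chunks until their temperatures meet at T_C.
T_A, T_B = 280.0, 320.0   # kelvins; T_A stays below T_B the whole time
C = 1.0                   # J/K, heat capacity of each body (my assumption)
dQ = 0.001                # joules transferred per step
dS_total = 0.0

while T_B - T_A > 1e-6:
    dS_total += dQ / T_A - dQ / T_B  # entropy gained by A minus entropy lost by B
    T_A += dQ / C                    # A warms up a little...
    T_B -= dQ / C                    # ...while B cools down a little

print(T_A, T_B)    # both ~300 K: the common final temperature T_C
print(dS_total)    # ~ +0.0045 J/K: still positive, as promised
```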

Of course, if I had started with my systems A and B at the same temperature then, after removing the wall, nothing would have happened and I would have TA = TB = TC. Therefore, from the above equation, ΔSC = 0. This is precisely what the Second Law of Thermodynamics says: in an isolated system (C, which is A and B together) the entropy either increases or stays the same.

Awesome! Physics! So sexy!

But… what does this have to do with anything? Where’s the philosophy, where’s the rebellion? Where, O Manuel, are those outrageously wonderful statements from your “brilliant” mind, which appear to be the result of too much free time on your hands and a more than liberal consumption of plants of dubious legal status?

For that, my dear people, we need the help of another historical character, one most famous (among scientists): Ludwig Boltzmann.

I can’t stop praising the genius of this guy. Really, he was brilliant. Just look at him. A bearded nerd, so handsome.

I wish I could grow a beard like that :(

Many things carry the name of this Austrian physicist: the Maxwell-Boltzmann distribution for the velocities of particles in a gas, which comes from the Maxwell-Boltzmann version of Statistical Mechanics; the Boltzmann equation for the dynamics of thermodynamic fluids; the Boltzmann Energy Equipartition Theorem; the Stefan-Boltzmann law for black bodies… here’s a more complete list in case you’re curious. As you can see, I love the guy.

But right now I’m concerned with his equation for the entropy of a system:

S = k Log W,

where S is the entropy of the system under consideration, k is just a constant number called… the Boltzmann constant, Log is something called the logarithm, and W is the number of microstates of the system. I’ll explain these in a second.

This equation is of such great relevance that careers are built around it, money is made, fame is obtained, and hot Mexican guys write blog posts about it. It is also written on Boltzmann’s tomb.

You can see the equation for the entropy engraved at the top of Boltzmann’s grave.

The story of Boltzmann’s death is a sad one. He was a staunch supporter of the atomic theory: the idea that everything we see is made of atoms. We now know he was right, but in his time there was little (although, in my opinion, compelling) evidence for the existence of atoms, and thus many, many people mocked him; among them various famous positivist philosophers, like Ernst Mach.

He apparently suffered from undiagnosed bipolar disorder and used to fall into periods of depression, some say made more frequent and more severe by the continuous ridicule his colleagues subjected him to.

One day it was too much. He hanged himself in Duino, near Trieste (at the time part of Austria-Hungary), on September 5, 1906.

As usual, philosophers ruining lives.

But let us honor Boltzmann’s memory by discussing his work. As I said above, he discovered a formula for the entropy of a system. In order to understand it, I want to explain what a logarithm (Log) is, and what a microstate is.

First, the Log of a number is simply how many times you have to multiply a special number called ‘e’, equal to e = 2.718…, by itself to get that number. Never mind what’s so special about it; it’s sort of like the less famous cousin of π = 3.14159…. Thus, for example, Log of 7.389… is 2, because e × e = 7.389…. Log of 20.085… is 3, because e × e × e = 20.085…, and so on. You can also calculate the Log of numbers that are not the result of an integer number of e multiplications. For example, the Log of 10 is between 2 and 3, because Log 20.085 = 3, while Log 7.389 = 2. If you go to Wolfram Alpha (something you should do often) and type Log 10 you’ll get something close to 2.3. What you should take home with you is that the larger the number, the larger its Log. Easy.
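If you don’t feel like bothering Wolfram Alpha every time, Python’s math.log computes exactly this base-e logarithm, so you can check the numbers above yourself:

```python
import math

print(math.e)              # 2.718281828..., the special number itself
print(math.log(7.389056))  # ~2.0, because e x e = 7.389...
print(math.log(20.085537)) # ~3.0, because e x e x e = 20.085...
print(math.log(10))        # 2.302..., between 2 and 3, close to 2.3
print(math.log(1))         # 0.0, which will matter below when W = 1
```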

Now, what is a microstate? It is the configuration of all the parts that make up a system, that is, all the information about the components that compose it. For example, if my system is made of a single particle I need to say where it is, whether it’s moving or not, how fast and in what direction, what its mass is, whether it has any electric charge, etc. If I have two particles I need to give the same information for the two of them and, in addition, specify any interactions between them.

As we all know and Boltzmann believed, everything is made of small particles called atoms (which in turn are made up of more stuff, but we don’t care about that now); if we could count all the atoms of a system, say of my little finger, and give all the information about them, then we would have described one possible microstate of the system.

But actually, that is being too meticulous. Most systems do not care if their parts are in this or that microstate: they look exactly the same. In the case of my little finger, if I move it to the right, to the left, in circles or just leave it on the table, the atoms that compose it have different positions, velocities and orientations (and let’s not forget the electrons within those atoms are always moving!), and yet my little finger still looks the same. This is called a macrostate. A single macrostate can be reproduced by many microstates, whose number we call W. Allow me to illustrate with another experiment.

Two macrostate configurations of a system. The one with more possible microstates is the one with higher entropy.

In the picture I start with a very simple system, composed of 8 cells divided into two regions, left and right, and with 4 identical red balls and 4 identical blue balls.

In the top figure, I have one particular microstate: the red and blue balls are arranged in one specific way. This particular microstate gives rise to the macrostate in which the left side of the system has only red balls while the right side has only blue balls.

By changing the position of my balls (stop giggling, you filthy teenager) I can arrive at different microstates. But because all the red balls look the same and so do all the blue ones, there’s actually only one microstate that gives me the system with all the red balls to the left and all the blue ones to the right, and it is the one pictured. We say that the shown macrostate of my system has only one possible microstate. Notice how neat this macrostate looks, very well arranged. Ordered.

Now, in the lower figure I’ve rearranged the balls, 2 blue and 2 red in each side. I can rearrange the balls again, within each side, to give a different microstate, but I would still get the same macrostate: namely, the one with 2 red and 2 blue balls to the left, and 2 red and 2 blue ones to the right.

If I exchange identical red balls I still get the same microstate, because the configuration is exactly the same. If I exchange a red ball on the left with a blue ball on the right I do get a different microstate, but also a different macrostate: one with more blue balls on one side than in the other.

If you count them, there are 6 possible configurations on each side that still give me the same macrostate (listed in the picture for your convenience). Therefore, because I have two sides, I end up with 6 × 6 = 36 different microstates that give the very same macrostate: 2 blue and 2 red balls on each side. Notice how messy this macrostate is, with the balls all mixed up. It is disordered.

In summary, we have W = 1 microstate for the well-ordered macrostate with red balls to the left and blue balls to the right, whereas we have W = 36 microstates for the disordered macrostate with equal numbers of red and blue balls on each side.
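If you distrust my counting, here’s a brute-force check in a few lines of Python, plus the entropies from Boltzmann’s formula; this is just a sketch where a side’s microstate is specified by which of its 4 cells hold red balls:

```python
import math
from itertools import combinations

k = 1.380649e-23  # Boltzmann's constant, in joules per kelvin

def side_microstates(n_cells, n_red):
    """Ways to choose which cells hold the identical red balls;
    the remaining cells get the identical blue balls."""
    return len(list(combinations(range(n_cells), n_red)))

W_ordered = side_microstates(4, 4) * side_microstates(4, 0)  # all red left, all blue right
W_mixed = side_microstates(4, 2) * side_microstates(4, 2)    # 2 red + 2 blue per side

print(W_ordered, W_mixed)       # 1 36, just as counted by hand
print(k * math.log(W_ordered))  # 0.0: the tidy macrostate has the lowest entropy
print(k * math.log(W_mixed))    # ~4.9e-23 J/K: the messy one has more
```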

Usually, real-life systems possess more than 100,000,000,000,000,000,000,000 particles (that’s a 1 with twenty-three 0s), and so the number of microstates that a given macrostate can have is obscenely huge, but the same idea I just explained still applies.

And, according to Boltzmann’s equation for the entropy and what we learned about logarithms, the larger the number of microstates W a system has, the larger its entropy is. But, as we have seen, the larger W is, the messier and more disordered the system is.

Therefore, the Second Law of Thermodynamics, which states that entropy always increases (or stays the same), can be translated into our new language as:

A system tends to go to the macrostate with a larger number of microstates.
or
A system tends towards disorder.

If you want me to make this clearer, I’ll ask you to go to the kitchen, take an egg, and let it roll towards the edge of the table. Something like this might happen:

Entropy is such a pain in the ass when it comes to cleaning.

Now, please wait until the egg, spontaneously, rearranges itself to its previous, unbroken state. Just a warning: it might take a while.

What just happened can be explained in the language of macrostates and microstates. There are many microstates in which an egg is unbroken: you could put it here or there, you can shake it up, the yolk inside might be in this orientation or that, whatever. All these microstates give the same macrostate: an unbroken egg. But there are many, many more microstates for a broken egg: the egg white could splash all over your kitchen while the yolk ends up close to the fridge, the shell can be broken in two, five or a thousand different pieces, and each could have almost any shape… and every single one of these microstates yields the same macrostate: a broken egg. Therefore a broken egg, having a larger number of microstates W, has a larger entropy S and thus is messier. And because the Second Law of Thermodynamics tells us that systems tend to increase their entropy and not to decrease it, we can conclude that unbroken eggs tend to break, and not the other way around.

In other words, we can rewrite the Second Law of Thermodynamics as follows:

EVERYTHING GOES TO SHIT.

This is the terrible truth. Nature tends to thwart all of our attempts at order and organization. She tends to destroy.

Of course, I know I’m being somewhat careless with my examples and thought experiments. Those of you with some wits about yourselves might argue that this law applies only to isolated systems, and you would be right. The Earth, of course, is not an isolated system (there’s the Sun out there, giving up heat to us), and thus we can expect entropy to have its weird moments and actually decrease. This is true, and that’s the reason why we can do anything at all! That’s why we can take disordered, scattered materials on the Earth’s surface and make order out of them by building houses and computers, brewing and drinking beer, writing poetry, and making great movies. We can diminish entropy. We can fight.

But only locally. Only temporarily. Because, as we saw in the experiment of the boxes of gas, you’re allowed to have a part of your system to decrease its entropy as long as another part increases its own, and by a larger amount. Entropy ends up winning. And it takes everything with it ¶.

Allow me then to conclude. What I want you to get out of this freakishly long post is, among other things, that Paulo Coelho is an idiot, and that he would love nothing more than you buying his feel-good, pat-on-the-shoulder nonsensical books. He famously wrote “When you want something, the whole universe conspires to make it happen” ‡. Bullshit. The Universe wants to screw you over. Everything works against you, in an unending sequence of senseless waste. Disorder and Death rule the world. And nothing can escape from Them.

But the intention of this blog entry was to be more an excuse to talk about science, and to present a tiny aspect of how messed up the Universe can be, than anything else. As its title says, this post is simply an introduction, a motivation for what is to come, something bigger and more monstrous that I want to discuss.

For there is something worse than a Universe with an ever-increasing entropy. Something that can chill you to the bone and make you go mad, madder than anything else. For Disorder has a Father, a cruel ghost whose empty sockets and sewn mouth see nothing and say nothing. There’s something worse than a corpse. And its name is the Absurd.

And this is what I want to talk about next time.


* While this is undoubtedly true, I still find it funny that the average person’s bookshelf contains novels, biographies or even monographs about art, but hardly a single book of science, even though its pursuit is one of the most defining characteristics of humanity. My undergrad adviser, the mythical Fefo, used to tell me how his artist/writer friends invariably congratulated him on his possession of some obscure book about art or literature but, when he asked them if they had read any book on science, they would look away. People should read science. Period. Also, this is a very long annotation/asterisk. I should stop.

§ Am I being too anal? I am being too anal, am I not? I’m sure I am.

¶ At this point, a very interesting question can be asked: what’s the entropy of the whole Universe? Is it increasing? People have been tempted to answer affirmatively, saying that a (finite) Universe ought to be isolated (well, because the Universe is everything there is!) and thus the Second Law should apply. The answer is much more complicated than that. Scientists have called into question the application of elementary thermodynamics to the whole Universe, doubting that a meaningful definition of entropy for the Universe even exists. Also, at scales as big as those of the Universe gravity becomes crucial, and no one actually knows what the “entropy of gravity” is, or if there’s anything like that at all. There are currently a lot of exciting theories that discuss these issues, and black holes seem to play a very important role. For some baby-level review on this, look here. Incidentally, the increase of entropy seems to give a direction to time. This is called the Arrow of Time. You can find some new discussions and radical ideas about it and its relation to Quantum Mechanics in a non-technical article here. There’s much more information about this interesting topic everywhere on the internet; look for it! In any case, I’m not aware of any indication whatsoever of things looking any better for us, at least in our local patch of the Universe. Stuff does seem to become more and more disordered: objects give up heat and are rendered unusable for work, stars run out of gas, people, in spite of our best efforts, grow old and die; etc.

‡ Here. Read and laugh.
