Seven varieties of metaphysics

I was having a discussion with someone recently on metaphysics, so I thought I would blog about it. Here are seven varieties of metaphysics, describing three “layers” of reality (and yes, I am oversimplifying for brevity).

The first is Platonism. Plato believed that there was a hierarchy of Forms (Ideals), of which the highest was The One (Plato’s version of God). These Forms or Ideals were the true reality, and the physical objects we touched, saw, and tasted were only shadows of that true reality (that is the point of the allegory of the cave). The physical orange which we see and eat reflects Ideals such as “Fruit,” “Sphere,” and “Orange.” Neoplatonism continues and extends this point of view.

Saint Augustine and many later Christians held to a Christianised Platonism, in which the Ideals were thoughts in the mind of God (the Christian God, naturally). The physical objects we touched, saw, and tasted had a greater importance in Christian Platonism than they did for Plato – after all, when God created those objects, “God saw that it was good.” Much as with Platonism, the regularities that people see in the physical universe are explained by the fact that God created the universe in accordance with regularities in the Divine thoughts. However, Christian Platonism does not have the metaphysical hierarchy that Platonism or Neoplatonism have – in Christian Platonism, God makes direct contact with the physical universe.

Aristotle also reacted to Plato by increasing the importance of the bottom layer, and Aristotle’s thought was Christianised by Thomas Aquinas as Thomism. However, in Thomism the all-important bottom layer does very little except to exist, to have identity, and to have properties assigned to it. It is also not observable in any way. This can be seen in the Catholic doctrine of transubstantiation. According to the Tridentine Catechism of 1566, the bread and the wine of the Eucharist lose their bottom (“substance”) layer (“All the accidents of bread and wine we can see, but they inhere in no substance, and exist independently of any; for the substance of the bread and wine is so changed into the body and blood of our Lord that they altogether cease to be the substance of bread and wine”), while the bottom (“substance”) layer of the body and blood of Christ becomes metaphysically present instead.

Idealism denies that the physical universe exists at all. The followers of Mary Baker Eddy take this view, for example, as did George Berkeley. Only thought exists. To quote a famous movie line, “there is no spoon.” These thoughts may be independent of whatever God people believe in or, as in monistic Hinduism, they may actually be the thoughts of God (in which case, only God exists).

The last three kinds of metaphysics deny the existence of any kind of God. In Platonist Materialism, this denial is combined with a Platonist approach to mathematics, about which I have written before. Mathematics exists independently of the physical universe, and controls the physical universe, in the sense that the physical universe follows mathematical laws. Roger Penrose is one of many scientists holding this view.

In what I am calling Extreme Materialism, the existence of an independent mathematical world is also denied, i.e. there is an empiricist approach to mathematics (mathematics simply describes observed regularities in nature). This view seems to be increasing in popularity among non-religious people, although it causes philosophical problems for mathematics.

Finally, the concept of the Mathematical Universe holds that the so-called “physical universe” is itself composed only of mathematical objects – only mathematics exists (which makes this, in fact, a kind of Idealism).


World Population

Some feedback on my last post expressed surprise that Ptolemy’s specification of the Oikoumene now holds 80.6% of the world’s population. Above (click to zoom), I have redrawn the classic bar charts of world population which explain this fact. Africa, Asia, and Europe contain about 86% of the world’s population. Ptolemy excluded what we now know to be Southern Africa (which only drops the total to 85%) and didn’t extend his Oikoumene quite far enough to the east.

The chart below shows the same thing, but using NASA’s image of the Earth at night. It can be seen that the spikes on the bar chart correspond to major cities.


The Oikoumene of Ptolemy

I was reading recently about the Geographia of Ptolemy (written around 150 AD). This classic book applied Greek mathematical skills to mapping and map projection – and if there was one thing the Greeks were good at, it was mathematics. According to Neugebauer, Ptolemy believed the Oikoumene, the inhabited portion of the world, to range from Thule (63° North) to 16°25′ South, and 90 degrees East and West of Syene in Egypt.

The map above illustrates this Oikoumene, with a modern population overlay in red (data from SEDAC). Ptolemy was not too far wrong – today this region holds 80.6% of the world’s population, and the percentage would have been greater in antiquity.

Also shown on the map are some of the many cities listed in the Geographia. Open circles show Ptolemy’s coordinates (from here, adjusted to a Syene meridian), and filled circles show true positions. Ptolemy had reasonably good latitude values (an average error of 1.2° for the sample shown on the map), but much worse longitude values (an average error of 6.8°). The longitude error is mostly systematic – Ptolemy’s estimate of 18,000 miles or 29,000 km for the circumference of the earth was only 72% of the true value (several centuries earlier, Eratosthenes had come up with a much better estimate). If Ptolemy’s longitudes are adjusted for this, the average error is only 1.5°.
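The systematic part of the longitude error can be corrected with a one-line rescaling. A minimal sketch of that adjustment (the circumference figures are the ones quoted above; the sample offset of 40° is an arbitrary illustration, not a city from the Geographia):

```python
# Sketch: Ptolemy underestimated the Earth's circumference, so a fixed
# east-west distance spans too many of his degrees. Scaling his longitude
# offsets (measured from the Syene meridian) by the circumference ratio
# removes the systematic part of the error.

PTOLEMY_CIRCUMFERENCE_KM = 29_000   # Ptolemy's estimate, as quoted above
TRUE_CIRCUMFERENCE_KM = 40_075      # modern equatorial value

def adjust_longitude(ptolemy_offset_deg: float) -> float:
    """Rescale a longitude offset from Syene by the circumference ratio."""
    return ptolemy_offset_deg * (PTOLEMY_CIRCUMFERENCE_KM / TRUE_CIRCUMFERENCE_KM)

# A place Ptolemy put 40 degrees east of Syene rescales to about 28.9 degrees:
print(round(adjust_longitude(40), 1))  # → 28.9
```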

However, Ptolemy’s book deserves considerable respect – it is not surprising that it was used for more than a thousand years.

A Medieval Calendar

The beautiful image above (click to zoom) represents the month of September in the Très Riches Heures du Duc de Berry, a book of hours from the 1400s. In the background of the main picture is the Château de Saumur, with its height exaggerated (almost doubled). For comparison, below is a modern photograph of the château (by Kamel15) stretched vertically ×2:

The foreground of the main picture shows the grape harvest. At the top is a complex calendar. On the inner track, around the chariot of the sun, in red and black numerals, are the days of the month. On the outer track, in red and blue numerals, is a zodiacal calendar, showing the last days of Virgo and the beginning of Libra. Adjacent to the inner track are blue letters which relate to the 19-year Metonic cycle. Combining those letters with an appropriate table will show the phases of the moon for a given year.
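The Metonic letters are decoded via a year’s “golden number” – its position in the 19-year cycle – which is then looked up in the accompanying table. A minimal sketch of that calculation (the sample year is just an illustrative year from the early 1400s):

```python
def golden_number(year: int) -> int:
    """Position of a year in the 19-year Metonic cycle (its 'golden number'),
    used with a table like the calendar's to find the phases of the moon."""
    return year % 19 + 1

print(golden_number(1412))  # → 7
```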

The manuscript uses the Hindu-Arabic numerals popularised in Europe by Fibonacci in his Liber Abaci of 1202. They are not quite the same as the ones we use today:

It is interesting to compare those digits with the ones in this German manuscript of 1459 by Hans Talhoffer (although Talhoffer actually mixes two different styles of 5). Then again, the letters of the alphabet have also changed since that time.


In praise of the humble flagellum

The bacterial flagellum (above) is a fascinating device. It contains a molecular motor which rapidly rotates the filament. Whipping around, the filament drives the bacterium forward. In some bacteria, running the motor in reverse causes a random tumble. Amazingly, the combination of forward motion, random tumbling, and a simple sensor allows a bacterium to “home in” on a target (see four simulated example runs below). The idea is to do a random tumble whenever the sensor shows the bacterium heading in the wrong direction. An actual steering mechanism is not necessary – the bacterium gets to the target in the end.
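The run-and-tumble strategy is simple enough to simulate in a few lines. This is a toy sketch, not a biophysical model: the “sensor” just compares successive distance readings, and every parameter (step size, tolerance, starting point) is an arbitrary assumption:

```python
import math
import random

def run_and_tumble(target=(0.0, 0.0), start=(50.0, 50.0),
                   step=1.0, max_steps=50_000, tolerance=3.0, seed=1):
    """Toy chemotaxis: run straight while the sensed distance to the target
    improves, tumble to a random new heading when it worsens. No steering."""
    rng = random.Random(seed)
    x, y = start
    heading = rng.uniform(0.0, 2.0 * math.pi)
    previous = math.hypot(x - target[0], y - target[1])
    for t in range(max_steps):
        x += step * math.cos(heading)
        y += step * math.sin(heading)
        dist = math.hypot(x - target[0], y - target[1])
        if dist < tolerance:
            return t                    # reached the target
        if dist > previous:             # sensor says "wrong direction": tumble
            heading = rng.uniform(0.0, 2.0 * math.pi)
        previous = dist
    return None                         # did not arrive within max_steps

print(run_and_tumble() is not None)     # the bacterium gets there in the end
```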

William Dembski and Michael Behe famously argued (via the somewhat informally articulated concepts of specified complexity and irreducible complexity) that the flagellum was too complex to have evolved. Their argument fell apart with the discovery that the flagellum shares components with other bacterial gadgets, such as the injectisome, and thus could potentially have evolved in stages (although in fact the injectisome seems to have evolved as a simplification of the flagellum, and the evolutionary history of the flagellum remains a mystery).

The fundamental point that Dembski and Behe were attempting to make can be illustrated by the simple experiment summarised in the chart above. This experiment presupposes three genes (A, B, and C) all created by single point mutations on copies of existing genes, such that the combination of all three genes creates a useful widget. In the “flat landscape” case, this combination must arise entirely by chance. This takes a very long time (on the experimental assumptions used, an average of almost 14,000 generations). Dembski and Behe were probably right to suggest that, if the bacterial flagellum had to arise that way, it could not have evolved in the time available since the earth was formed.

In the “parallel evolution” case, however, each of the genes A, B, and C is assumed to be independently beneficial. The A-B-C combination then evolves very quickly. Evolution of the bacterial flagellum may have included aspects of parallel evolution, if components of multiple older widgets were “co-opted” for the flagellum.
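The contrast between the two cases can be reproduced with a toy simulation. This is a sketch under my own simplifying assumptions (not the experiment behind the chart): each generation, each missing gene arises by point mutation with some probability; in the “flat landscape” case nothing is retained until all three arise together, while in the “parallel evolution” case each gene is kept as soon as it appears:

```python
import random

def generations_until_widget(p=0.04, parallel=True, rng=random):
    """Toy model: each generation, each missing gene (A, B, C) arises by
    point mutation with probability p. 'Parallel': a gene is beneficial on
    its own, so it is retained once gained. 'Flat landscape': nothing is
    retained until all three arise in the same generation."""
    have = set()
    generation = 0
    while len(have) < 3:
        generation += 1
        gained = {g for g in "ABC" if rng.random() < p}
        if parallel:
            have |= gained
        elif gained == {"A", "B", "C"}:
            have = gained
    return generation

rng = random.Random(0)
flat = sum(generations_until_widget(parallel=False, rng=rng) for _ in range(50)) / 50
par = sum(generations_until_widget(parallel=True, rng=rng) for _ in range(50)) / 50
print(par < flat)   # parallel evolution finds the combination far sooner
```

With p = 0.04, the flat-landscape waiting time averages around 1/p³ ≈ 15,600 generations, while the parallel case needs only a few dozen – the same qualitative gap as in the chart.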

The evolution of the bacterial flagellum is generally assumed to have instead been a case of “sequential evolution” (gene A is beneficial on its own, gene B is beneficial in the presence of gene A, gene C is beneficial in the presence of genes A and B, etc.). However, it is not at all clear what the sequence of genes producing the bacterial flagellum might have been (suggestions on this topic by Liu and Ochman have been criticised), nor is it clear what the sequence of intermediate benefits might have been (given that the injectisome was not an intermediate stage). Further research on the humble, but amazing, bacterial flagellum is clearly still required.


Complexity and Randomness revisited

I have posted before (post 1 and post 2) about order, complexity, and randomness. The image above shows the spectrum from organised order to random disorder, with structured complexity somewhere in between. The three textual examples below illustrate the same idea.

Regular:
AAAAAAAAAA AAAAAAAAAA AAAAAAAAAA AAAAAAAAAA AAAAAAAAAA AAAAAAAAAA AAAAAAAAAA AAAAAAAAAA AAAAAAAAAA AAAAAAAAAA AAAAAAAAAA AAAAAAAAAA AAAAAAAAAA AAAAAAAAAA AAAAAAAAAA AAAAAAAAAA AAAAAAAAAA AAAAAAAAAA AAAAAAAAAA AAAAAAAAAA AAAAAAAAAA …

Complex:
It was the best of times, it was the worst of times, it was the age of wisdom, it was the age of foolishness, it was the epoch of belief, it was the epoch of incredulity, it was the season of Light, it was the season of Darkness, it was the spring of hope, it was the winter of despair, …

Random:
ShrfT e6IJ5 eRU5s nNcat qnI8N m-cm5 seZ6v 5GeYc w2jpg Vp5Lx V4fR7 hhoc- 81ZHi 5qntn ErQ2- uv3UE MnFpy rLD0Y DI3GW p23UF FQwl1 BgP36 RK6Gb 6lpzR nV03H W5X3z 2f1u8 OgpXy tY-6H HkwEU s0xLN 9W8H …

These three examples, and many intermediate cases, can be distinguished by the amount of information they contain. The leading way of measuring that information is with Kolmogorov complexity. The Kolmogorov complexity of a block of text is the length of the shortest program producing that text. Kolmogorov complexity is difficult to calculate in practice, but an approximation is the size of the compressed file produced by good compression software, such as 7-Zip. The chart below shows the number of bytes (a byte is 8 bits) for a compressed version of A Tale of Two Cities, a block of the letter ‘A’ repeated to the same length, and a block of random characters of the same length:

The random characters are chosen to have 64 possible options, which need 6 bits to describe, so a compression to about 75% of the original size is as expected. The novel by Dickens compresses to 31% of its original size.
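The compression experiment is easy to repeat. A minimal sketch using Python’s zlib (DEFLATE) in place of 7-Zip, on synthetic stand-ins for the three text blocks – the 64-symbol alphabet here is my own arbitrary choice:

```python
import random
import string
import zlib

n = 100_000
regular = b"A" * n

rng = random.Random(0)
symbols = string.ascii_letters + string.digits + "-="   # 64 symbols = 6 bits each
random_text = "".join(rng.choice(symbols) for _ in range(n)).encode()

def ratio(data: bytes) -> float:
    """Compressed size as a fraction of the original size."""
    return len(zlib.compress(data, 9)) / len(data)

print(f"regular: {ratio(regular):.3f}")       # near zero
print(f"random:  {ratio(random_text):.3f}")   # roughly 0.75, i.e. 6 of 8 bits
```

The repeated-letter block collapses to almost nothing, while the random block stays near the 6/8 = 75% floor set by its entropy, matching the chart.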

But does this chart show information? Grassberger notes that Kolmogorov complexity is essentially just a measure of randomness. On this definition, random-number generators would be the best source of new information – but that’s not what most people mean by “information.”

An improvement is to introduce an equivalence relation “means the same.” We write X ≈ Y if X and Y have the same meaning. In particular, versions of A Tale of Two Cities with different capitalisation have the same meaning. Likewise, all meaningless random sequences have the same meaning. The complexity of a block of text is then the length of the shortest program producing something with the same meaning as that text (i.e. the complexity of X is the length of the shortest program producing some Y with X ≈ Y).

In particular, the complexity of a specific block of random text is the length of the shortest program producing random text (my R program for random text is 263 bytes), and we can approximate the complexity of A Tale of Two Cities by compressing an uppercase version of the novel. This definition of complexity starts to look a lot more like what we normally mean by “information.” The novel contains a large amount of information, while random sequences or “AAAAA…” contain almost none:
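The difference between the two definitions can be made concrete. In this sketch, the naive estimate is the compressed size of one particular random block, while the meaning-aware estimate is just the length of a short random-text generator (263 bytes, the size quoted above for my R program; the block itself is a synthetic stand-in):

```python
import random
import string
import zlib

rng = random.Random(0)
symbols = string.ascii_letters + string.digits + "-="
sample = "".join(rng.choice(symbols) for _ in range(100_000)).encode()

# Plain Kolmogorov-style estimate: compressed size of this particular block.
naive = len(zlib.compress(sample, 9))   # tens of thousands of bytes

# Meaning-aware estimate: all meaningless random blocks "mean the same",
# so the cost is only the length of a short random-text generator.
generator_length = 263                  # the R program's size, quoted above

print(generator_length < naive)         # the random block carries almost no information
```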

Those who hold that information satisfies the rule ex nihilo nihil fit can thus be reassured that random-number generators cannot create new information out of nothing. However, if we combine random-number generators with a selection procedure which filters out anything that “means the same” as meaningless sequences, we can indeed create new information, as genetic algorithms and genetic programming have demonstrated – although Stuart Kauffman and others believe that the evolution of biological complexity also requires additional principles, like self-organisation.