Happy New Year, everybody!
To usher in the new year, here’s a chart of some people of the past that I was trying to keep straight in my head.
I was having a discussion with someone recently on metaphysics, so I thought I would blog about it. Here are seven varieties of metaphysics, describing three “layers” of reality (and yes, I am oversimplifying for brevity).
The first is Platonism. Plato believed that there was a hierarchy of Forms (Ideals), of which the highest was The One (Plato’s version of God). These Forms or Ideals were the true reality, and the physical objects we touched, saw, and tasted were only shadows of that true reality (that is the point of the allegory of the cave). The physical orange which we see and eat reflects Ideals such as “Fruit,” “Sphere,” and “Orange.” Neoplatonism continues and extends this point of view.
Saint Augustine and many later Christians held to a Christianised Platonism, in which the Ideals were thoughts in the mind of God (the Christian God, naturally). The physical objects we touched, saw, and tasted had a greater importance in Christian Platonism than they did for Plato – after all, when God created those objects, “God saw that it was good.” Much as with Platonism, the regularities that people see in the physical universe are explained by the fact that God created the universe in accordance with regularities in the Divine thoughts. However, Christian Platonism does not have the metaphysical hierarchy that Platonism or Neoplatonism has – in Christian Platonism, God makes direct contact with the physical universe.
Aristotle also reacted to Plato by increasing the importance of the bottom layer, and Aristotle’s thought was Christianised by Thomas Aquinas as Thomism. However, in Thomism the all-important bottom layer does very little except to exist, to have identity, and to have properties assigned to it. It is also not observable in any way. This can be seen in the Catholic doctrine of transubstantiation. According to the Tridentine Catechism of 1566, the bread and the wine of the Eucharist lose their bottom (“substance”) layer (“All the accidents of bread and wine we can see, but they inhere in no substance, and exist independently of any; for the substance of the bread and wine is so changed into the body and blood of our Lord that they altogether cease to be the substance of bread and wine”), while the bottom (“substance”) layer of the body and blood of Christ becomes metaphysically present instead.
Idealism denies that the physical universe exists at all. The followers of Mary Baker Eddy take this view, for example, as did George Berkeley. Only thought exists. To quote a famous movie line, “there is no spoon.” These thoughts may be independent of whatever God people believe in or, as in monistic Hinduism, they may actually be the thoughts of God (in which case, only God exists).
The last three kinds of metaphysics deny the existence of any kind of God. In Platonist Materialism, this denial is combined with a Platonist approach to mathematics, about which I have written before. Mathematics exists independently of the physical universe, and controls the physical universe, in the sense that the physical universe follows mathematical laws. Roger Penrose is one of many scientists holding this view.
In what I am calling Extreme Materialism, the existence of an independent mathematical world is also denied, i.e. there is an empiricist approach to mathematics (mathematics simply describes observed regularities in nature). This view seems to be increasing in popularity among non-religious people, although it causes philosophical problems for mathematics.
Finally, the concept of the Mathematical Universe holds that the so-called “physical universe” is itself composed only of mathematical objects – only mathematics exists (which makes this, in fact, a kind of Idealism).
“About once every hundred years some wiseacre gets up and tries to banish the fairy tale,” C.S. Lewis wrote in 1952. The wiseacre of our time seems to be Richard Dawkins who, two years ago, told the world that fairy tales could be harmful because they “inculcate a view of the world which includes supernaturalism” (he had said similar things in 2008). In a later clarification, he added that fairy tales could “be wonderful” and that they “are part of childhood, they are stretching the imagination of children” – provided some helpful adult emphasises: “Do frogs turn into princes? No they don’t.”
But many scientists grew up with, and were inspired by, fantasy literature. For example, Jane Goodall tells of growing up with the novel The Story of Doctor Dolittle (as I did!). In fact, many science students and professional scientists avidly read fantasy literature even as adults (as they should). The booksthatmakeyoudumb website lists, among the top 10 novels read at Caltech and MIT, Harry Potter, Dune, and The Lord of the Rings. And Alice in Wonderland was written by a mathematician.
This is a science blog, so I have a strong emphasis on scientific truth, which tells us many important ecological and physiological facts about, for example, frogs. Without science, we’d all still be struggling subsistence farmers. But there is actually more than scientific truth out there.
There is also mathematical truth. Are the links in this frog network all equivalent? Yes, they are – but that is decided by mathematical proof, not by scientific experiment. It is in fact a purely abstract mathematical question – the background picture of the frog is actually irrelevant.
And there is ethical truth. Is it OK to eat frogs’ legs? Science does not give us the answer to this (although logic can help us decide if our answer is consistent with our other beliefs), but fantasy literature often helps us to explore such ethical questions. Tolkien’s The Lord of the Rings is one superb example. Would you “snare an orc with a falsehood”? Would you attempt to take the One Ring and “go forth to victory”?
There is metaphorical truth. A frog may, in spite of what Dawkins says, be a handsome prince – there’s more to the universe than can be seen at first glance. Or, as Antoine de Saint-Exupéry put it, “What is essential is invisible to the eye.” Children often learn this important fact from fairy tales.
And there is even religious and philosophical truth. Does the frog-goddess Heqet exist, for example? Does the universe exist? Is there a spoon? The methods of philosophy are different from the methods of science, and some amateur philosophers simply state their beliefs without actually justifying them, but philosophy is actually very important. Science itself is based on certain philosophical beliefs about reality.
I recently read James Hannam’s God’s Philosophers, which is the story of the Medieval ideas that led up to modern science, told largely through short biographies of major and minor figures (this relates to my previous two posts about when and why science began, as well as to my three posts about science and Dante).
The early Middle Ages were, to a large extent, a struggle to build a more productive agricultural system (since Europe had lost access to the rich grain-fields of North Africa that had fed the Roman Empire). The later Middle Ages, however, saw an explosion of new ideas. Some of these ideas came from the Muslim world, but many were entirely original.
Hannam briefly surveys Medieval mathematics, logic, medicine, astronomy, astrology, alchemy, and engineering. Roger Bacon (1214–1292) and Richard of Wallingford (1292–1336) are discussed in some detail. The former wrote on optics and the theory of science, while the latter did work in trigonometry and designed an elaborate astronomical clock. Clocks were to replace living things as metaphors for the operation of the Universe.
Hannam also has a chapter on the Merton Calculators – Thomas Bradwardine (c. 1290–1349), Richard Swineshead (fl. 1340–1355), and William Heytesbury (c. 1313–1373). As well as contributing to logic, these scholars anticipated Galileo’s application of mathematics to physics, proving the mean speed theorem. In France, Nicole Oresme (c. 1325–1382) developed an elegant graphical proof of this theorem, as well as doing work in astronomy and introducing the bar graph. Ironically, it was the later Humanists who, inspired by the glories of ancient Greece and Rome, discarded some of these advances (the same source of inspiration also led to a decline in women’s rights, as Régine Pernoud has pointed out).
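The mean speed theorem that the Merton Calculators proved states that a body accelerating uniformly from speed v₀ to v₁ covers the same distance as a body travelling for the same time at the constant mean speed (v₀ + v₁)/2. The Medievals, and Oresme in particular, argued geometrically; a modern numerical check (illustrative only) looks like this:

```python
def distance_accelerated(v0, v1, t, steps=100_000):
    """Integrate velocity numerically under uniform acceleration from v0 to v1 over time t."""
    dt = t / steps
    a = (v1 - v0) / t
    # Midpoint rule: exact for a linearly increasing velocity.
    return sum((v0 + a * (i + 0.5) * dt) * dt for i in range(steps))

v0, v1, t = 0.0, 10.0, 4.0
uniform = (v0 + v1) / 2 * t          # distance at the constant mean speed
accelerated = distance_accelerated(v0, v1, t)
print(uniform, accelerated)          # both ≈ 20.0
```

Oresme’s graphical proof amounts to the same observation: the area under the sloping velocity line (a triangle plus rectangle) equals the area of the rectangle at the mean height.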
Hannam finishes his book with the stories of Kepler and Galileo. These are better known than those of the Medievals, but the myths surrounding Galileo seem to be as persistent as those about the so-called “Dark Ages.” Hannam’s treatment is necessarily brief and simplified, but he does point out Galileo’s debt to Oresme and the Merton Calculators. For readers specifically interested in Galileo, the best introductory book is probably Galileo’s Daughter by Dava Sobel, with Finocchiaro for follow-up.
Hannam concludes “It would be wrong to romanticise the period and we should be very grateful that we do not have to live in it. But the hard life that people had to bear only makes their progress in science and many other fields all the more impressive. We should not write them off as superstitious primitives. They deserve our gratitude.”
See also this review in Nature of Hannam’s book (“God’s Philosophers condenses six hundred years of history and brings to life the key players who pushed forward philosophy and reason”), this review by a Christian blogger (“In God’s Philosophers James Hannam traces medieval natural philosophy—and some of the other disciplines we’ve come to think of as scientific, such as medicine—through the reign of Plato and Aristotle to the discoveries of Kepler and Galileo”), and this excellent review by an atheist historian (“… the myth that the Catholic Church caused the Dark Ages and the Medieval Period was a scientific wasteland is regularly wheeled, creaking, into the sunlight for another trundle around the arena. … Hannam sketches how polemicists like Thomas Huxley, John William Draper, and Andrew Dickson White, all with their own anti-Christian axes to grind, managed to shape the still current idea that the Middle Ages was devoid of science and reason.”). Hannam has also responded comprehensively to this negative review by Charles Freeman. I disagree with Freeman, and am giving Hannam’s well-researched and readable book four stars. My only real quibble is Hannam’s somewhat biased view of the Protestant Reformation.
This first philosophical view is familiar through the slogan “there is no spoon.” The only true reality, it says, is spiritual. Nothing physical actually exists. This view has been taught by some (though not all) schools of Hinduism. In Europe, it is associated with George Berkeley. The difficulty with this perspective is that the laws of science must, in some sense, be emergent from the spiritual reality. But how?
This view implies a time t0 at which the Universe “began” – in the sense that nothing (not even time) existed before then. The immediate response is: why? Some kind of explanation for the existence of the Universe seems necessary (although Stephen Hawking argues not). Given that there could be no event before t0, a purely scientific explanation seems impossible, leaving religion or philosophy to supply one. The traditional explanation – from Plato, Christianity, and other religions – is some form of divine creation. Such an explanation is not everybody’s cup of tea, of course.
Alternatively, the Universe has always existed. If the number of possible states in the Universe is finite, this means that the present state of the Universe must have occurred infinitely often in the past (down to the position of every atom), and must occur infinitely often in the future. Previous analogues of me have blogged this comment infinitely often in the past, and infinitely many future analogues will do so again. This is true whether the Universe is deterministic or random. The Stoics were one group who believed in such a (rather depressing) cyclic Universe, but it seems difficult to swallow.
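The recurrence argument here is essentially the pigeonhole principle: a deterministic system with finitely many states must eventually revisit a state, after which its entire history repeats exactly. A toy illustration (the 100-state “universe” and its update rule are invented for the example):

```python
def trajectory_cycle(step, start):
    """Follow a deterministic map on a finite state space until a state repeats.

    Returns (time of first visit to the repeated state, cycle length).
    """
    seen = {}
    state, time = start, 0
    while state not in seen:
        seen[state] = time
        state = step(state)
        time += 1
    return seen[state], time - seen[state]

# A toy 'universe' with 100 states: by the pigeonhole principle some state
# must recur within 100 steps, after which the history repeats forever.
first, period = trajectory_cycle(lambda s: (7 * s + 3) % 100, start=0)
print(first, period)  # 0 4  (the sequence 0, 3, 24, 71 repeats forever)
```

A random (rather than deterministic) universe with finitely many states recurs too, in the probabilistic sense: any state that occurs once will, with probability one, occur infinitely often.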
The prospect of “eternal recurrence” can be eliminated if the Universe has an infinite number of states, but this seems to require some kind of eternal expansion. The Steady State model was once proposed as a way of achieving this. A modern alternate suggestion is that new sub-Universes are constantly “popping into existence” as a result of quantum fluctuations in older sub-Universes, thus forming an infinite branching tree.
It is not quite clear how this branching would work, however, and Paul Davies points out that there are philosophical problems too: “For a start, how is the existence of the other universes to be tested? To be sure, all cosmologists accept that there are some regions of the universe that lie beyond the reach of our telescopes, but somewhere on the slippery slope between that and the idea that there are an infinite number of universes, credibility reaches a limit. As one slips down that slope, more and more must be accepted on faith, and less and less is open to scientific verification. Extreme multiverse explanations are therefore reminiscent of theological discussions. Indeed, invoking an infinity of unseen universes to explain the unusual features of the one we do see is just as ad hoc as invoking an unseen Creator. The multiverse theory may be dressed up in scientific language, but in essence it requires the same leap of faith.”
So there you have it. Four views, some of which have been around for millennia, and all of which have adherents and opponents. View 2a is the most commonly accepted. Which one do you think is correct?
I recently read the classic Evolution as a Religion: Strange Hopes and Stranger Fears by English philosopher Mary Midgley. In the introduction to the revised (2002) edition, Midgley explains the motivation for the book as follows: “I had been struck for some time by certain remarkable prophetic and metaphysical passages that appeared suddenly in scientific books about evolution, often in their last chapters. Though these passages were detached from the official reasoning of the books, they seemed still to be presented as science. But they made startling suggestions about vast themes such as immortality, human destiny and the meaning of life.” (p. viii). As an example, she quotes the molecular biologist William Day: “He [man] will splinter into types of humans with differing mental faculties that will lead to diversification and separate species. From among these types, a new species, Omega man, will emerge … as much beyond our imagination as our world was to the emerging eucaryotes.” (p. 36).
Such “prophetic and metaphysical passages” are also familiar from the fictional works of Olaf Stapledon and H. G. Wells. Midgley argues that they represent bad science, twisted to have characteristics of a religion, such as assigning meaning to life (pp. 15, 71). The myth of the “Evolutionary Escalator,” extrapolated to some glorious imaginary future, is one example. Nothing in evolutionary science justifies this view, comforting though it may seem (p. 38). Furthermore, past attempts to accelerate the process by breeding an “Übermensch” have not ended at all well (p. 9), and more recent proposals are also disturbing (pp. 48–49).
Midgley claims that prophecies based on the “Evolutionary Escalator” myth “are quite simply exaltations of particular ideals within human life at their own epoch, projected on to the screen of a vague and vast ‘future’ – a term which, since Nietzsche and Wells, is not a name for what is particularly likely to happen, but for a fantasy realm devoted to the staging of visionary dramas. In their content, these dramas plainly depend on the moral convictions of their author and of his age, not on scientific theories of any kind.” (pp. 81–82).
In contrast, Midgley quotes a more pessimistic perspective from the physicist Steven Weinberg: “The more the universe seems comprehensible, the more it also seems pointless. But if there is no solace in the fruits of our research, there is at least some consolation in the research itself. Men and women are not content to comfort themselves with tales of gods and giants, or to confine their thoughts to the daily affairs of life; they also build telescopes and satellites and accelerators, and sit at their desks for endless hours working out the meaning of the data they gather. The effort to understand the universe is one of the very few things that lifts human life a little above the level of farce, and gives it some of the grace of tragedy.” (pp. 86–87). This perspective is very different from that of the evolutionary optimists, but it does share a certain scientist-centric bias.
Are God and Nature then at strife,
That Nature lends such evil dreams?
So careful of the type she seems,
So careless of the single life;
‘So careful of the type?’ but no.
From scarped cliff and quarried stone
She cries, ‘A thousand types are gone:
I care for nothing, all shall go.’
Midgley is particularly negative about the “red in tooth and claw” view of evolution, which emphasises competition as against cooperation. She sees the “selfish gene” concept popularised by Richard Dawkins as an example of this. In a 2007 interview with The Independent, she claimed “The ideology Dawkins is selling is the worship of competition. It is projecting a Thatcherite take on economics on to evolution. It’s not an impartial scientific view; it’s a political drama.”
Indeed, when an organism succeeds by occupying a new ecological niche (as, for example, urban coyotes do), there need not be any competition at all (at least, not initially).
Extreme forms of sociobiology come in for particular criticism from Midgley. They produce, she claims, bad science: “Environmental causes are neglected without any justification being given, and so are causes which flow from an individual itself during its lifetime … In human affairs, both these areas are of course of the first importance, since they cover the whole range of culture and individual action.” (p. 151). She is far from being the only scholar to make such criticisms.
Less common is the way in which she blames the growth of creationism on the rhetoric of sociobiologists themselves: “The project of treating the time scale of the Genesis story literally, as a piece of history, is an amazing one, which serious biblical scholars at least as far back as Origen (AD 200) have seen to be unworkable and unnecessary. The reason why people turn to it now seems to be that the only obvious alternative story – evolution – has become linked with a view of human psychology which they rightly think both false and immoral.” (p. 172).
See also this talk by Midgley, related to a more recent book on a similar topic.
In essence, Mary Midgley strongly supports scientists when they do science, but does not always accept the results of scientists doing philosophy (and especially moral philosophy). This little book sounds a helpful note of caution for those scientists who have become interested in philosophical speculation.
The debate last February between creationist Ken Ham and science educator Bill Nye has been widely discussed (see also the video). Both sides were rather an embarrassment, but one interesting aspect was Ham’s distinction between “observational science” and “historical science.” This has been called an “inane and baseless fallacy” – but is it?
In fact, all watchers of the CSI franchise know that there is a clear distinction between (on the one hand) applying known science to the past – in order to decide who did what – and (on the other hand) developing new knowledge of scientific principles. There is, of course, an interplay between the two. For example, forensic entomology draws on experimental work in a specific aspect of insect ecology. Experimental work in ballistics (popularised by the MythBusters) is used to decide what conclusions can be drawn from bullets and bullet wounds.
Observational science tends to be restricted to the here-and-now, where confounding factors can be dealt with. NASA and ESA justifiably spend a lot of money sending probes around the solar system (e.g. the probe above) so that the reach of observational science can be extended to objects which humans cannot visit. Events which are outside the solar system, or are distant in time, are outside the scope of direct observation altogether, which means that some degree of inference is inevitable.
Of course, this does not mean that scientists throw up their hands in despair, and say “we’ll never know.” Astronomers routinely investigate the same phenomenon at multiple wavelengths (e.g. radio waves and visible light), in order to get a clearer picture of what’s been going on. The supernova of last April (see image below) is one example, having been investigated at gamma-ray and optical wavelengths.
Carbon dating involves several assumptions about the past – but from the very beginning those assumptions were cross-checked using other dating techniques, such as tree rings and historical methods (the diagram below is redrawn from the Arnold & Libby paper of 1949). In practice, carbon dating is adjusted for multiple confounding factors, and provides a moderately accurate dating method for carbon-containing objects with ages up to tens of thousands of years.
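The basic calculation behind a conventional (uncalibrated) radiocarbon date is a simple application of exponential decay; the sketch below uses the standard Libby convention (mean life 8033 years, from the Libby half-life of 5568 years), and real dating then applies the calibration and corrections for the confounding factors just mentioned:

```python
import math

LIBBY_MEAN_LIFE = 8033  # years; from the conventional Libby half-life of 5568 y

def radiocarbon_age(remaining_fraction):
    """Conventional (uncalibrated) radiocarbon age from the surviving C-14 fraction."""
    return -LIBBY_MEAN_LIFE * math.log(remaining_fraction)

# Half the original C-14 remaining corresponds to one Libby half-life.
print(round(radiocarbon_age(0.5)))   # 5568
```

Cross-checking such raw ages against tree rings and historical records, as Arnold and Libby did from the start, is exactly the kind of independent corroboration that makes inference about the past trustworthy.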
In summary, then, a distinction can indeed be drawn between “observational science” and “historical science.” The latter draws on the scientific principles established by the former. Scientists tackle the problem of not being able to directly observe the past by using multiple independent methods to infer what happened, and this can allow very solid conclusions to be drawn. That’s precisely what makes books, films, and television shows about forensic science so compelling.
Update: see also this 2008 post from the National Center for Science Education on the topic.