Gender and glaciers?

There has been some controversy about the 2016 NSF-funded paper “Glaciers, gender, and science: A feminist glaciology framework for global environmental change research” (see here for a detailed analysis). The paper refers, inter alia, to the Forbes/Tyndall debate of the century before last (although I believe it misinterprets that saga). But, interesting as that episode was in the history of science, it has little to say about the epistemology of modern glaciology. In the 1800s, observing glaciers required extensive (perhaps even “heroic”) mountain climbing. Today, remote sensing and computer modelling are at least as important as fieldwork, and we understand glaciers far better than either Forbes or Tyndall did.

I don’t think that the gender studies lens adds anything to our understanding of glaciers. And I suspect that Elisabeth Isaksson, Moira Dunbar, Helen Fricker, Julie Palais, Kumiko Goto-Azuma, and Jemma Wadham would not think so either. Nor are race relations particularly important in studying ice. And as to “alternative ways of knowing,” I would prefer to stick with the scientific method – it’s worked very well so far (didn’t we just have a march against “alternative facts”?). Indeed, to subordinate science to the modern politicised humanities would be to abandon the concept of scientific truth, and to make it impossible to gain widespread agreement on the crises currently facing humanity.


Looking back: 1978

In 1978 I started senior high school (years 11 and 12). That was a year of terrorism – a bomb planted by the Ananda Marga group exploded outside the Sydney Hilton Hotel (apparently in an attempt to kill Indian prime minister Morarji Desai), and former Italian prime minister Aldo Moro (below) was kidnapped and murdered by the Red Brigades. On a more positive note, John Paul II became the first Polish pope and helped to chip away at the power of the Soviet Union.

That year also marked the debut of the soap opera Dallas and the comic strip Garfield. In science, James Christy at the United States Naval Observatory discovered Pluto’s moon Charon. We finally got a good look at it in 2015:

In computing, the Turing Award went to Robert Floyd for his work in programming languages and algorithms. Intel introduced the 8086, the first of the x86 microprocessors, which are still the most common CPUs in personal computers and laptops today. The game Space Invaders also made its debut:

The year 1978 also saw the release of the unsatisfactory animated version of The Lord of the Rings, and a number of interesting albums, including The Kick Inside by Kate Bush, Pyramid by The Alan Parsons Project, Dire Straits by the band of the same name, the electronic Équinoxe by Jean-Michel Jarre, and Jeff Wayne’s Musical Version of The War of the Worlds:

Of the books published that year, The Road Less Traveled by M. Scott Peck, the exceedingly dark The House of God by Samuel Shem, and A Swiftly Tilting Planet by Madeleine L’Engle (below) stand out.


Social Media, Marketing, and the Fyre Festival

In traditional Christian theology, Satan is the ultimate marketing genius. Not being able to create, Satan has no actual product to sell – merely illusions. However, being a fallen angel, he does have supernatural intelligence. He also has a large crowd of “influencers” willing to endorse the nonexistent product. The book and film of Stephen King’s Needful Things illustrate the concept brilliantly, as the main character (played to perfection by Max von Sydow) uses his supernatural marketing genius to con people into trading their souls for useless bits of junk:

Of course, that kind of marketing is an ideal that mere human beings cannot achieve. Behind the ridiculous Kendall Jenner advertisement, Pepsi has an actual product to sell. It may only be flavoured sugar-water, but that’s not a product to be sneered at – I remember a hot day in rural Thailand some decades ago when it was the only safe thing to drink.

Yet we may be closing in on what Max von Sydow could do. Browser history analysis and sophisticated predictive algorithms can stand in for the supernatural intelligence. YouTube helps to sell the illusion. And Instagram provides influencers galore. The recent Fyre Festival is perhaps the closest approach ever to the ideal. The musicians, accommodation, and food promised to the paying clientele do not appear ever to have been organised (although there apparently were a few waterlogged tents and cheese sandwiches). But the promo was great.


In praise of the codex


Charles Emmanuel Biset, Still life with Books, a Letter and a Tulip

The codex (book with pages) has been with us for about 2,000 years now. Because of advantages like rapid access to specific pages, it gradually replaced the older technology of the scroll:

Christians seem to have been early adopters of the codex technology. The oldest known fragment of the Christian New Testament, papyrus P52, dated to around the year 130, is a small fragment of a codex of the Gospel according to John (with parts of verses 18:31–33 on one side of the page, and parts of verses 18:37–38 on the other):

In 2010, Google estimated that the total number of published books had reached 130 million. At times it seems that e-books are taking over from the printed codex format, but there is a friendliness to the printed book that would make me sorry to see it go. I am not the only one.

Robert Darnton, in The Case for Books: Past, Present, and Future, writes: “Consider the book. It has extraordinary staying power. Ever since the invention of the codex sometime close to the birth of Christ, it has proven to be a marvelous machine – great for packaging information, convenient to thumb through, comfortable to curl up with, superb for storage, and remarkably resistant to damage. It does not need to be upgraded or downloaded, accessed or booted, plugged into circuits or extracted from webs. Its design makes it a delight to the eye. Its shape makes it a pleasure to hold in the hand.”

How true that is!


I ♥ science books!


A fable about science and climate change

This post will tell a simple fable. The characters are fictional, although the scenario is based on reality. At the end of the fable are some questions that puzzle me.


The smelter at Davy before it closed (photo: Jmchugh)

Billy-Bob Smith lives in the small town of Davy in the US South. He worked in the aluminium smelter there, until pressure from environmentalists closed it down. He is now unemployed (and rather bitter).

Aluminium production is very energy-intensive (the metal has been called “crystallised electricity”), and the smelter at Davy was fed by coal-fired power. Its demise is part of the general decline in US aluminium smelting (see the chart below, produced from this data).

Of course, demand for aluminium doesn’t just go away – world aluminium production is actually increasing. The plant at Davy was replaced within the year by a new plant in China, also fed by coal-fired power. In fact, in 2015 about two-thirds of Chinese electricity was generated from coal, drawing on roughly 900 GW of coal-fired capacity. That coal-fired capacity is projected to grow by about 20%, to 1100 GW, by 2020 (making up about 55% of overall Chinese electricity capacity in that year, given the non-coal power plants that will also be coming on line). For comparison, the new coal-fired capacity added in China each year is roughly equal to the entire generation capacity of Australia.
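
As a back-of-the-envelope check on those figures (using only the numbers quoted above):

\[
900\ \mathrm{GW} \times 1.2 = 1080\ \mathrm{GW} \approx 1100\ \mathrm{GW},
\]

which is consistent with the projected 20% increase. And reading the 55% figure as a share of capacity (an assumption on my part – shares of capacity and of actual generation differ), the implied total Chinese capacity in 2020 would be

\[
\frac{1100\ \mathrm{GW}}{0.55} \approx 2000\ \mathrm{GW}.
\]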

Billy-Bob Smith is very cynical about the environmentalists who effectively outsourced his job to China, with (as he correctly points out) no net benefit to the planet, and no net reduction in carbon emissions. In fact, Billy-Bob believes that the environmental activists in his state were funded by the Chinese government to destroy American jobs. Needless to say, he voted for Donald Trump in the recent US election.


Coal-fired power plant in Shuozhou, China (photo: Kleineolive)

Alicia Jones is a professor of atmospheric physics at a university not far from Davy. She has made significant advances in climate modelling, improving the way that radiative forcing is handled in computer models. There is even talk of nominating her for a Nobel Prize one day. Outside of her university work, she regularly gives talks to schoolchildren on the threat of climate change and the need to address the problem before it’s too late. She also frequently appears on local television. She was part of the group that lobbied to close down the smelter at Davy; in the recent US election she voted for Jill Stein; and she has marched several times in Washington, DC.


US Green Party presidential candidate Jill Stein (photo: Tar Sands Blockade)

My questions are these:

  1. What makes an intelligent person like Alicia Jones believe that simply moving carbon emissions to China actually addresses climate change?
  2. Being fully aware of the usefulness of computer modelling, why did Alicia Jones not do any economic modelling on the expected follow-on effects of closing the Davy plant?
  3. Which is the best ethical framework for handling questions of this kind: virtue ethics, deontological ethics, or consequentialism?
  4. In general, does the expertise of scientists lend any credibility to their economic, political, or philosophical pronouncements? Should it do so?
  5. What does it say about Alicia Jones’ ability to communicate scientific issues that over 50% of people in her state (people like Billy-Bob) do not believe in anthropogenic climate change at all? What does it say about scientific communication in general?
  6. Do problems with peer review affect the public perception of science?
  7. What does it say about the education system in the USA that Billy-Bob does not even believe that the earth is warming? After all, many US cities have temperature records going back over a century. Mean temperatures for Newport, RI, for example, show a 1.7°C rise between 1893 and 2016 (see chart below – the blue line is a cubic interpolation, while the red line is the result of loess smoothing; a minimal code sketch of this kind of smoothing follows the list).
  8. What can be done to improve this particular debate?
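
For readers unfamiliar with the loess smoothing mentioned in question 7, here is a minimal Python sketch of the idea, using the lowess implementation from statsmodels. The file name and column names (“newport_annual_mean.csv”, “year”, “temp_c”) are hypothetical stand-ins for the actual data, and the original chart may well have been produced with different tools (loess smoothing is also a standard feature of R, for example).

```python
# A minimal sketch of loess/lowess smoothing of an annual temperature series.
# The CSV file and its column names are hypothetical stand-ins.
import pandas as pd
import matplotlib.pyplot as plt
from statsmodels.nonparametric.smoothers_lowess import lowess

df = pd.read_csv("newport_annual_mean.csv")

# lowess() fits a locally weighted regression; frac controls how much of the
# data is used for each local fit (a larger value gives a smoother curve).
smoothed = lowess(df["temp_c"], df["year"], frac=0.3)

plt.plot(df["year"], df["temp_c"], ".", alpha=0.5, label="annual mean")
plt.plot(smoothed[:, 0], smoothed[:, 1], "r-", label="loess smoothing")
plt.xlabel("Year")
plt.ylabel("Mean temperature (°C)")
plt.legend()
plt.show()
```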


Looking back: 1989

In 1989, I started my first lecturing job, at Griffith University, Brisbane, Queensland. My PhD was all but finished and – more importantly – my scholarship money had run out. That was the year that Stanley Pons and Martin Fleischmann announced that they had discovered cold fusion. They had not. I’m glad that I was being more careful in my own work.


Griffith University’s bushland setting (photo: Tate Johnson)

Konrad Lorenz, William Shockley, and Andrei Sakharov all died in 1989, while Isamu Akasaki developed the now-ubiquitous GaN-based blue LED. Tim Berners-Lee designed the World Wide Web, the Tiananmen Square protests took place, the Berlin Wall came down, George H. W. Bush became President of the USA, and the Soviet–Afghan War ended (his son George W. Bush was to start his own Afghan war in 2001).


William Shockley in 1975 (photo: Chuck Painter / Stanford News Service)

The space probe Voyager 2 (launched in 1977) visited Neptune in 1989 and took some lovely photographs.


Neptune, as seen by Voyager 2 in 1989

In the world of cinema, Batman, Indiana Jones and the Last Crusade, and The Fabulous Baker Boys were released. Books of 1989 included The Remains of the Day by Kazuo Ishiguro, An Acceptable Time by Madeleine L’Engle, The Writing Life by Annie Dillard, and Wonderful Life by Stephen Jay Gould. An interesting year, on the whole.


Poster for Indiana Jones and the Last Crusade


Blue Jeans and Culture

An earlier post touched on the concept of “cultural appropriation.” This label is often applied inappropriately, because the world is more interconnected than most people realise. It has been that way for longer than most people realise (for example, some 4,000 years ago, tin from England was being traded across the Mediterranean Sea for use in making bronze). And ideas go back further than most people realise.

As Michael Crichton says in his excellent novel Timeline, “Yet the truth was that the modern world was invented in the Middle Ages. Everything from the legal system, to nation-states, to reliance on technology, to the concept of romantic love had first been established in medieval times. These stockbrokers owed the very notion of the market economy to the Middle Ages. And if they didn’t know that, then they didn’t know the basic facts of who they were. Why they did what they did. Where they had come from.”

Consider blue jeans, for example.

Blue jeans are dyed with indigotin, a chemical derived from the indigo plant, which has long been grown in India. But before someone says “cultural appropriation from India,” indigotin was traditionally derived in Europe from the woad plant (northern Britons painted their skins blue with woad). In China, a different plant was used. Essentially, the use of indigotin was a cultural universal. In Germany, where a culture of excellence in organic chemistry grew up during the 19th century, a practical method for making synthetic indigotin was developed at the BASF company in 1897, and the choice of plant became moot.


A cake of indigo dye (photo: David Stroe)

Blue jeans are made from denim, a fabric named after Nîmes in France. During the California gold rush, Levi Strauss, a Jewish-American businessman of German origin, teamed up with Jacob Davis, a Jewish-American tailor of Latvian origin, to make denim work clothing for miners. These blue jeans were strengthened by metal rivets – an idea due to Davis, patented in 1873.

So which culture produced blue jeans – Indian? French? German? Latvian? Jewish? American? One can only say that blue jeans were produced by human culture.


Illustration from the patent application