The eclipse

Above, a NASA photo of the solar eclipse of August 21.

“The second before the sun went out we saw a wall of dark shadow come speeding at us. We no sooner saw it than it was upon us, like thunder. It roared up the valley. It slammed our hill and knocked us out. It was the monstrous swift shadow cone of the moon. I have since read that this wave of shadow moves 1,800 miles an hour. Language can give no sense of this sort of speed – 1,800 miles an hour. It was 195 miles wide. No end was in sight – you saw only the edge. It rolled at you across the land at 1,800 miles an hour, hauling darkness like plague behind it. Seeing it, and knowing it was coming straight for you, was like feeling a slug of anesthetic shoot up your arm. If you think very fast, you may have time to think, ‘Soon it will hit my brain.’ You can feel the deadness race up your arm; you can feel the appalling, inhuman speed of your own blood. We saw the wall of shadow coming, and screamed before it hit.” — Annie Dillard, 1982

I wish I had been there to see it.

Killer robots: it’s not the AI that’s the problem

In a recent open letter, Tesla’s Elon Musk and others called for a ban on autonomous weapons, saying: “Lethal autonomous weapons threaten to become the third revolution in warfare. Once developed, they will permit armed conflict to be fought at a scale greater than ever, and at timescales faster than humans can comprehend. These can be weapons of terror, weapons that despots and terrorists use against innocent populations, and weapons hacked to behave in undesirable ways. We do not have long to act. Once this Pandora’s box is opened, it will be hard to close.”

Yet autonomous weapons are already with us, after a fashion. And artificial intelligence isn’t actually the biggest problem.

A bullet, during the second or so that it is in flight, autonomously follows the laws of physics. But the world is not likely to have changed much during that time. If shooting the bullet was appropriate, that will still be true when it hits. A cruise missile can fly for several hours, and home in on a precise spot, specified by GPS coordinates – although things may have changed during those hours of flight.

Smarter again is a heat-seeking or radar-guided missile, which can home in on an aircraft, even one doing its best to evade the threat – yet it cannot distinguish a passenger aircraft from a military one. The next step up is a system equipped with IFF, which can distinguish friend from foe. After that comes the kind of AI that Elon Musk is talking about.

The ultimate extreme is the “Menschenjäger” of Cordwainer Smith’s 1957 short story “Mark Elf.” The Menschenjägers were built by the “Sixth German Reich” to seek out and kill their non-German enemies (whom they could infallibly detect by their non-German thoughts). Being virtually indestructible, the last Menschenjäger had travelled around the planet on this mission 2328 times by the time the story is set. Since no Germans were alive at that point, there was nobody left to shut it down.

The real problem with the Menschenjägers was not their AI, but their persistence in time. A similar problem arises with that most stupid of autonomous weapons, the landmine. Sown in their tens of millions, landmines continue to kill and maim for decades after the war that buried them is over.

It isn’t really a matter of whether the weapon has AI or not – it’s whether the weapon has an off switch or a self-destruct mechanism. No weapon should keep on pointlessly killing people.