Two excerpts from Annalee Newitz's column for Metro Active (10/27/07).

"In South Africa, a widely used anti-aircraft cannon called the Oerlikon GDF-005 suffered from what many observers believe was a computer malfunction that killed nine soldiers and maimed 15 in a training exercise. Its computer-controlled sighting mechanism went haywire, and the gun automatically turned its barrel to face the trainees next to it, spraying bullets from magazines that it automatically reloaded until it was out of ammunition. Many compared the incident to science fiction fare like RoboCop or Terminator, where military bots turn on their masters."

"The tragedy in South Africa could have been avoided if the engineers who designed that cannon had simply refused to computerize its sight. With a big gun, computer error can be far worse than human error. Any decent engineer would have known that failure in computer systems is inevitable and come to the conclusion that weapons should not be programmed to function autonomously."

This story brings a few thoughts to mind.

1. Some have advocated for a Hippocratic Oath for scientists. Determining where the lines should be drawn would be a challenge, but we manage equally difficult tasks in other fields; why not the sciences?

2. The creation of autonomous machines with the ability to kill will be justified by the idea that they will save ___ lives [fill in your nation of choice]. The decision to send humans or robots into harm's way is no real decision at all (who wouldn't choose robots?), but that framing obscures a more fundamental question: is the violence justified in the first place? When a government (or any group) can kill without risking lives on its own side, how much more likely is it to turn to violence?

3. Our current organizational/legal structures insulate individuals at the top and bottom from blame during war. Soldiers simply follow orders, and leaders claim not to be responsible for the actions of "bad apples". (It can take years to peel back the layers of responsibility when bad apples are charged with wrongdoing.) Adding another layer, computer error, confuses the picture further. As more of these machines are deployed, clarity about who is responsible for what they do becomes increasingly important. If they are put in the field by the U.S. government, then we (U.S. readers) share responsibility for what these machines do.
