Killer robots must be stopped, say campaigners


The Observer

A new global campaign to persuade nations to ban “killer robots” before they reach the production stage is to be launched in the United Kingdom by a group of academics, pressure groups and Nobel peace prize laureates.

Robot warfare and autonomous weapons, the next step from unmanned drones, are already being worked on by scientists and will be available within the decade, said Dr. Noel Sharkey, a leading robotics and artificial intelligence expert and professor at Sheffield University in northern England. He believes that development of the weapons is taking place in an effectively unregulated environment, with little attention being paid to moral implications and international law.

The Stop the Killer Robots campaign will be launched in April at the House of Commons and includes many of the groups that successfully campaigned to have international action taken against cluster bombs and landmines. They hope to get a similar global treaty against autonomous weapons.

“These things are not science fiction, they are well into development,” said Sharkey. “The research wing of the Pentagon in the United States is working on the X-47B [unmanned plane] which has supersonic twists and turns with a G-force that no human being could manage, a craft which would take autonomous armed combat anywhere on the planet.

“In America they are already training more drone pilots than real aircraft pilots, looking for young men who are very good at computer games. They are looking at swarms of robots, with perhaps one person watching what they do.”

Sharkey insists he is not antiwar but deeply concerned about how quickly science is moving ahead of the presumptions underlying the Geneva Convention and the international laws of war.

“There are a lot of people very excited about this technology, in the U.S., at BAE Systems, in China, Israel and Russia, very excited at what is set to become a multibillion-dollar industry. This is going to be big, big money. But actually there is no transparency, no legal process. The laws of war allow for rights of surrender, for prisoner of war rights, for a human face to take judgments on collateral damage. Humans are thinking, sentient beings. If a robot goes wrong, who is accountable? Certainly not the robot.”

He disputes the justification that deploying robot soldiers would potentially save lives of real soldiers. “Autonomous robotic weapons won’t get tired, they won’t seek revenge if their colleague is killed, but neither will my washing machine. No one on your side might get killed, but what effect will you be having on the other side, not just in lives but in attitudes and anger.

“The public is not being invited to have a view on the morals of all of this. We won’t hear about it until China has sold theirs to Iran. That’s why we are forming this campaign to look at a pre-emptive ban.

“The idea is that it’s a machine that will find a target, decide if it is the right target and then kill it. No human involvement. Article 36 in the Geneva Convention says that any new weapon has to take into account whether it can distinguish and discriminate between combatant and civilian, but the problem here is that an autonomous robot is not a weapon until you clip on the gun.”

At present, Sharkey says, there is no mechanism to distinguish in a robot’s mind between a child holding up a sweet and an adult pointing a gun. “We are struggling to get them to distinguish between a human being and a car. We have already seen the utter incompetence in the use of drones, operators making a lot of mistakes, not being properly supervised.”

Last November the international campaign group Human Rights Watch produced a 50-page report, Losing Humanity: the Case Against Killer Robots, outlining concerns about fully autonomous weapons.

“Giving machines the power to decide who lives and dies on the battlefield would take technology too far,” said Steve Goose, arms division director at Human Rights Watch. “Human control of robotic warfare is essential to minimising civilian deaths and injuries.”

U.S. political activist Jody Williams, who won a Nobel peace prize for her work at the International Campaign to Ban Landmines, is expected to join Sharkey at the launch at the House of Commons. Williams said she was confident that a pre-emptive ban on autonomous weapons could be achieved in the same way as the international embargo on anti-personnel landmines.

“I know we can do the same thing with killer robots, I know we can stop them before they hit the battlefield,” said Williams, who chairs the Nobel Women’s Initiative.

“Killer robots loom over our future if we do not take action to ban them now,” she said. “The six Nobel peace laureates involved in the Nobel Women’s Initiative fully support the call for an international treaty to ban fully autonomous weaponised robots.”


    We’ve already crossed into the dark place. We won’t turn back until we have destroyed the world.

    • 秋中 赳

      Destroyed the world with drones that deliver AGM-65 and AGM-114 missiles? Really?

  • The inability of people to see where drone warfare is leading us is incomprehensible.

    • 秋中 赳

      And where, if I may ask, is it leading us? All I see is the minimized risk for the operators. A fighter plane doing the same mission risks at least one pilot’s life, some planes even more.

  • Mouluuuu

    Terminator…Rise of the Machines

    • 秋中 赳

      No. Just no. There is no software capable of that. Sci-fi flicks like Terminator always oversimplify these things.

  • 秋中 赳

    They’re not autonomous. They all have ground control personnel. A Predator drone can’t even identify its target without the operators on the ground, let alone engage it. Same applies for the X-47B. The human brain can identify and react faster than any processor and software on the planet (there’s a reason all operators for drones like the Predator and the 47 are certified fighter pilots.)

But then again the people against it are Nobel peace laureates, a completely worthless prize by the way, so I’m not in the least surprised these people don’t understand that the drones are not actually autonomous and that we don’t have the software and processor power to make them so.

Personally, if I were a general, I’d rather attack the enemy with a drone than with a real airplane: it still hits the target and minimizes the risks for my own soldiers. It’s a perfectly feasible option in warfare.

    • johnny cassidy

Retired US Brig. Gen. Craig Nixon says otherwise, noting a world of difference between drone warfare rhetoric and reality.

  • Bilge Pump

    What the people who design, build and operate these systems don’t seem to understand is that they are creating a vacuum where moral judgments about what is acceptable to stop THEM could wind up boomeranging. If you are designing a system that kills innocent civilians and the machine is not culpable, it is not a stretch to see that you, your family and those close to you could become targets for those who want to stop you at all costs. It’s not just the folks on the other side of the planet in danger. With no moral high ground, it could be right here in America.