'Machines can't make life & death decisions': Nobel laureate Jody Williams on new-age weapons
Jody Williams speaks of why some countries object to these weapons, her expectations from the meetings and why LAWS could change the way wars are fought.
Jody Williams received the Nobel Peace Prize in 1997 together with the International Campaign to Ban Landmines for their central role in establishing the 1997 Mine Ban Treaty. The US-based political activist is known across the world for her efforts to deepen understanding of security and related issues in the world today. She is also the chair of the Nobel Women's Initiative, which she founded in 2006 together with five other women Nobel Peace laureates.
She and 20 of her fellow Nobel Peace laureates have called for a preemptive ban on Lethal Autonomous Weapons Systems (LAWS): weapons that, once activated, could operate without human supervision, even in decisions to kill human beings. The UN's Convention on Certain Conventional Weapons (CCW) held its third informal meeting of experts on LAWS in Geneva from 11-15 April. Williams speaks of why she and her fellow campaigners object to these weapons, her expectations from the meetings and why LAWS could change the way wars are fought.
What are your objections to the deployment of LAWS in warfare?
There is a whole range of issues. There are international humanitarian law (IHL) issues: the firm belief that autonomous weapons that can target and kill on their own cannot possibly comply with the laws of war, whose norms have been developed over many decades, over generations and generations. Suddenly you would have weapons systems that would not be able to comply with IHL, undercutting generations of work trying to bring some sanity to war. And that's an oxymoron anyway.
So in terms of international law, ethics and morality are, to me, huge with regard to this, just as they are with nuclear weapons. It blows my mind to think how any country believes it has the right to vaporise people in another country. Of course, my country (the US) is the only country that has done it, and it did it twice because it tested two different types of nuclear bombs. The morality of allowing machines to make fundamental decisions about the life and death of a human being is beyond my comprehension. How can somebody be involved in this and feel good about it? That is why, in the artificial intelligence community, 3,100 members have signed a letter calling for a ban on killer robots. That's a big percentage of those scientists, because it is morally reprehensible.
And those who like the idea of killer robots (my country, the US, as well as Israel, Russia and China; I think Brazil makes some small riot-control robots) would, if they were to proceed and field fully autonomous weapons, be programming their weapons. Of course, nobody would tell the other countries their programmes. When they talk of transparency and Article 36, it makes me want to scream! If I am programming my swarm of autonomous jets, for example, and Russia is programming its swarm, there is absolutely no way to predict the consequences of those swarms meeting in battle. None.
In addition, if they can be built, they can be hacked. Just imagine: somebody hacks the central programme of the swarms and sends them somewhere else. One of the men in the artificial intelligence community who led the movement to get scientists to sign said that when people claim there is any level of predictability in this, they do not know what they are talking about. There is no way to predict how systems let loose on their own will behave. For all of those reasons, to me, it's insanity. Plus, it would be a complete revolution in warfare. First there was gunpowder, which transformed warfare; then nuclear weapons, beyond the pale; and if it moves to fully autonomous weapons systems, that, obviously, is a complete change in how wars would be fought. We worry that it would make the threshold for going to war much lower, because then you don't have to worry about dead bodies coming home to upset your public.
Delegations here seem to be working on the presumption that LAWS do not exist. Do we know of any countries that possess or are developing such weapons?
Part of the reason countries say that they do not exist is that they do not want a treaty to encompass existing defensive systems that are autonomous. We have indicated that the sole purpose of defensive autonomous weapons systems is that, if your country is suddenly faced with a barrage of incoming missiles, there is no way a human being can respond. So these are specific to responding to ballistic missiles. One could argue about that too. But they are afraid that admitting some already exist would complicate the systems they already have. For example, South Korea has a sentry system on the border that could be made autonomous with the flick of a switch. These are machines that just sit there, and if something comes into view across the DMZ (demilitarized zone), they would shoot it.
There are many systems already in the testing phase. We call them precursors; they are on the cusp. When we first started the campaign (to stop killer robots), we used to say they were a couple of decades in the future. Roboticists said, "You are crazy. They are essentially real." They are frightening. Israel has roboticized weapons, not necessarily fully autonomous; the US, of course; there's South Korea; Brazil makes an anti-riot autonomous robot. We know that China and Russia are doing research and development, though we don't know at what stage. I wouldn't be surprised if India is also doing research and development. Pakistan, of course, is calling for a ban, so that's interesting.
Lethal drones are now in the hands of 40 nations. If these things (LAWS) come into being, they will proliferate. Do we really want a bunch of autonomous weapons flying around the world? I know I don't.
What are the broad divisions that you perceive among groups of countries regarding LAWS?
The broadest divide is between the countries that have them in development or have precursors, which are not interested in any prohibitions on their continued research, development, production and so on, and all the other countries. Some haven't spoken up yet, but it's a huge technological divide: there are countries that have the capability, and there is the rest of the world, which doesn't. That is a fundamental divide, and one can see it in the discussions and in the possible recommendations to come out this week for the CCW review conference. Some want the word 'prohibition' somewhere in there. Countries that have those weapons in process do not want the word 'prohibition' in the recommendations.
Would you consider the use of LAWS for defensive purposes legitimate?
It would depend. How could one predict it now? Too often things that are supposedly being developed for defense are used for offense.
You received the Nobel Peace Prize for your work on clearing and banning landmines. Was it an uphill task?
It was shockingly fast. We launched the campaign in October of 1992, and by September of 1997 we had a treaty banning the use, production, trade and stockpiling of the weapon. In diplomatic terms, that is lightning speed. I mean, consider here: this is the third year, a week each year, of informal discussions, which have no real status within the treaty. So the fact that we went from zero to treaty in five years (for a ban on landmines) is breathtaking.
How optimistic are you about a preemptive ban on LAWS?
The reason we succeeded with landmines, and with banning cluster munitions in 2008, was that it wasn't done in here (the UN). When we attempted, in the earliest days of the campaign in 1992, to have the CCW amend its existing protocol on landmines (Protocol II) to ban them, it went nowhere. At the end of the whole review, expert groups and all of that business, they actually ended up making the Convention itself weaker. There was also an attempt to create a new protocol on cluster bombs. Nowhere. So the countries that were interested in the ban went outside (the UN) and created a treaty banning cluster bombs, in stand-alone negotiations. That's what we did with the landmine treaty.
All a treaty is, is a contract. The countries that supported the ban on landmines decided to move out of here because, as we know, consensus really means dictatorship by one, and they had a series of meetings over a year to deal with different aspects. I mean, theoretically, if I had a big enough house I could invite the countries of the world to come and sit in my living room and negotiate a treaty. It doesn't have to happen at the UN. But countries that didn't want the ban on landmines were absolutely horrified, because if you go out of the UN, they lose control. I think it is appalling that any one country could hold the rest of the world hostage. If it had not been for the campaign to stop killer robots, this would never have happened. You know how drones suddenly snuck onto the scene: first there were surveillance drones, which didn't seem too horrifying; then they were weaponised, and then used for murder. I think countries like my own expected that we would go from drone to drone to drone without people really knowing, but we found out about it and ended up creating a campaign. They are aiming low and moving slow, purposely.
Do you see a lot of haggling over the definition of LAWS?
The final definition of a weapon is never done until the end of negotiations. These are just ploys to procrastinate. For three years they have kept saying, "Oh, it's such a complex thing", "We don't even know what we are talking about". I do not have to build a killer robot to understand the difference between a drone and LAWS. Even if the drone flies autonomously, there's a human being looking at the computer and deciding to push the button; that's clearly controlled by a human being. When you take the human being out of the drone, that's the difference between human-controlled weapons and a fully autonomous weapon. Of course, there are complexities within that, but that's a working definition. As I said in my opening statement (to the CCW), the obfuscation and confusion are purposeful. I think they thought this issue would go away, and now they really understand that we are not going away, even if they kill it in the UN. We are guardedly optimistic that there might be a group of experts. The recommendations from these meetings are only recommendations to the CCW Review Conference in December. It could all get totally trashed between now and then.
What are your observations on the outcomes of this week's informal meetings on LAWS?
"Go slow and aim low" was the motto of some governments. I am glad that they took it up after we launched the campaign. If they cannot agree on a group of governmental experts and instead go on for a year more of informal consultations, I don't know what we will do. Even if it doesn't lead to what we want, countries are being organized; they would not otherwise have come out talking about these weapons at all.