The Army Lawyer | 2019 Issue 1


SGT Samantha Merryfield launches an RQ-11 Raven as part of an unmanned aerial system operator’s course at Joint Base McGuire-Dix-Lakehurst, N.J. (Credit: U.S. Army)

No. 1

Worldwide Wrap Up 2018

This Year’s WWCLE Focused on Technology and an Army in Transition.


Lethal Autonomous Weapon Systems

An Overview


At the 2018 Worldwide CLE, the National Security Law Department at TJAGLCS discussed some of the rules that apply to lethal autonomous weapon systems. We share them here for further thought and discussion.1 These weapons are too new for a comprehensive treatment of every rule that could apply, but three more modest goals are within reach: identifying what autonomous weapon systems are; describing what we already know about the rules that govern them; and identifying, cautiously, some of the future legal problems that must be solved to get autonomous weapon systems right.

Defining Autonomous Weapon Systems

First, we want to define autonomous weapon systems. The Department of Defense (DoD) has done this in DoD Directive 3000.09, Autonomy in Weapon Systems. In general, DoD has defined two categories. First, there are semi-autonomous weapon systems, defined as a “weapon system that, once activated, is intended to only engage individual targets or specific target groups that have been selected by a human operator.”2 In other words, a human is the one who chooses the target. Second, there are autonomous weapon systems, defined as weapon systems “that, once activated, can select and engage targets without further intervention by a human operator.”3
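For readers who want the distinction in concrete form, the sketch below models the directive’s categories as a simple Python type. It is purely illustrative: the names are ours, not DoD’s, and the human-supervised variant, which the directive also defines and which we discuss below, is included only for completeness.

```python
from enum import Enum, auto

class AutonomyCategory(Enum):
    """Weapon system categories adapted from DoD Directive 3000.09 (simplified)."""

    # A human operator selects the individual targets or specific
    # target groups; once activated, the system engages only those.
    SEMI_AUTONOMOUS = auto()

    # Once activated, the system can select and engage targets
    # without further intervention by a human operator.
    AUTONOMOUS = auto()

    # Autonomous, but designed so a human can intervene and
    # terminate an engagement before unacceptable damage occurs.
    HUMAN_SUPERVISED_AUTONOMOUS = auto()

def human_selects_each_target(category: AutonomyCategory) -> bool:
    """The legally salient question: who chooses the target?"""
    return category is AutonomyCategory.SEMI_AUTONOMOUS
```

The value of the sketch is the boundary it draws: much of what follows in this article turns on whether a human or the machine selects the target.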

The focus of our talk during the Worldwide CLE was on these two types of weapon systems as opposed to artificial intelligence more generally. While there are many possible uses of artificial intelligence in warfare, most of them will not pose exactly the same challenges that we face with autonomous weapon systems.

The Current Rules for Autonomous Weapon Systems

Because artificial intelligence is still in its early stages, the eventual capabilities and legal implications of autonomous weapon systems remain unknown. However, some very important issues have already been settled.

The Law of War Governs the Use of Autonomous Weapon Systems

The Law of War applies to the use of autonomous weapon systems. The U.S. reiterated this basic point in its submission to the Group of Governmental Experts, which meets under the framework of the Convention on Certain Conventional Weapons (CCW).4 Since the Law of War applies, basic Law of War principles and rules will govern the use of autonomous weapon systems during armed conflicts.

People—not Machines—Apply the Law of War

The U.S. has begun to clarify how the Law of War will apply to autonomous weapon systems. In its submissions to the Group of Governmental Experts, the U.S. has stated that “[i]t is not the case that the law of war requires that a weapon, even a semi-autonomous or autonomous weapon, make legal determinations.”5 Instead, “it is persons who must comply with the law of war by employing weapons in a discriminate and proportionate manner.”6 This is a critical point because it means that judge advocates’ Law of War analysis must remain focused on commanders and human operators.

For example, when an autonomous weapon system selects and engages a target, we do not pretend that the machine applied the principle of distinction or that the machine somehow made an assessment as to whether its actions were proportionate. Instead, when a human commander decides to send machines to conduct an attack, that commander must ensure that only enemy combatants and military objectives are the object of the attack.7 Also, the commander must be satisfied that any harm to civilians will not be excessive in light of the concrete and direct military advantage expected to be gained, and the commander must have taken feasible precautions to protect civilians and other protected persons and objects.8 In this analysis, the machine’s code is best viewed as “an additional feature that improves the ability of human beings to implement legal requirements rather than as an effort to replace a human being’s responsibility and judgment under the law.”9 To put it more simply, the machine’s algorithms and code may be viewed as some of the feasible precautions the commander uses when conducting an attack.

This does not mean that the weapon system cannot be fully autonomous. The system need not have a human selecting each target. This does mean, however, that when a commander sends that weapon system to do a job and sets the parameters for how it will operate, that commander’s choices are governed by the Law of War.
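To illustrate what “setting the parameters” might involve, the following sketch imagines the kind of pre-activation constraints a commander could fix. Every field name here is hypothetical and drawn from no actual system; the point is that the commander’s Law of War judgments attach to concrete, reviewable choices made before the machine operates on its own.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MissionParameters:
    """Hypothetical constraints a commander sets before activation.

    Illustrative only; these names describe no actual system's
    interface. Freezing the dataclass reflects the idea that the
    parameters are fixed once the system is sent to do its job.
    """
    search_area: tuple[float, float, float, float]  # lat/lon bounding box
    authorized_target_classes: frozenset[str]       # e.g., {"enemy warship"}
    engagement_window_minutes: int                  # abort when this expires
    min_classification_confidence: float            # required before engaging
    break_off_near_protected_objects: bool          # e.g., hospital ships

# Example: a narrowly scoped maritime tasking.
params = MissionParameters(
    search_area=(12.0, 45.0, 12.5, 45.5),
    authorized_target_classes=frozenset({"enemy warship"}),
    engagement_window_minutes=90,
    min_classification_confidence=0.95,
    break_off_near_protected_objects=True,
)
```

Each of these choices is the commander’s, and each is the kind of decision the Law of War governs.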

The Department of Defense Has a Policy Governing Lethal Autonomous Weapon Systems

As mentioned earlier, DoD Directive 3000.09, Autonomy in Weapon Systems, establishes definitions and creates a policy framework for the acquisition of weapons that have autonomous features. Without diving into all the details of that policy, it is helpful to understand its general framework.

The policy begins by requiring that “[a]utonomous and semi-autonomous weapon systems shall be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force.”10 The term “human judgment over the use of force” is an important one, because the U.S. favors it over the “human control” standard advanced by some states11 and some non-governmental organizations.12 As the U.S. has pointed out, “an operator might be able to exercise meaningful control over every aspect of a weapon system, but if the operator is only reflexively pressing a button to approve strikes recommended by the weapon system, the operator would be exercising little, if any, judgment over the use of force.”13

After requiring that the weapon system allow appropriate levels of human judgment, the policy establishes technical requirements for the systems and then sets approval thresholds depending on the type of system. In general, DoD will follow its standard weapons approval process for three types of weapons: semi-autonomous weapon systems; human-supervised autonomous weapon systems (autonomous weapon systems in which a human can terminate an engagement before unacceptable damage occurs) in limited circumstances; and autonomous weapon systems applying non-lethal, non-kinetic force (“such as some forms of electronic attack”).14 This means that most lethal autonomous weapon systems must be approved by the most senior officials within DoD.15
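The directive’s approval routing can be restated as a short decision function. The sketch below is our simplification of paragraphs 4(c) and 4(d); the string labels and boolean flags are ours, not the directive’s operative language.

```python
def approval_path(category: str, lethal: bool, kinetic: bool,
                  limited_use_case: bool) -> str:
    """Simplified sketch of the routing in DoDD 3000.09, paras. 4(c)-(d)."""
    # The three carve-outs that follow the standard weapons review.
    if category == "semi-autonomous":
        return "standard weapons review"
    if category == "human-supervised autonomous" and limited_use_case:
        return "standard weapons review"
    if category == "autonomous" and not lethal and not kinetic:
        return "standard weapons review"  # e.g., some electronic attack
    # Everything else requires the Under Secretary of Defense for Policy;
    # the Under Secretary of Defense for Acquisition, Technology, and
    # Logistics; and the Chairman of the Joint Chiefs of Staff to approve,
    # before formal development and again before fielding.
    return "senior-official approval required"

# A lethal, fully autonomous system falls outside all three carve-outs.
print(approval_path("autonomous", lethal=True, kinetic=True,
                    limited_use_case=False))
```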

Future Legal Issues

Once we understand the basic Law of War framework and the basic policy framework, we can begin to cautiously sketch out some of the legal issues that will need to be addressed in the future. In our Worldwide CLE discussion, we focused on five such issues. Before discussing the specifics of those issues, it is important to recognize that because of how they operate, autonomous weapon systems will raise legal issues during their development, not merely when they are used.16 This fundamental change from most of our current weapons creates most of the legal issues identified below.

Judge Advocates Involved in Development

Since legal issues are likely to arise in development, not just during the use of the weapon system, judge advocates will need to provide legal advice during the development process. This could require a shift in judge advocate assignments.

Also, judge advocates assigned to work on autonomous weapon system development must be well-versed in the Law of War. A judge advocate’s level of knowledge and training in this area is important because the stakes are incredibly high—if developers create unduly permissive algorithms, a Law of War violation could occur. If algorithms are unduly restrictive, the weapon system might not engage a lawful target important to mission success. This could be catastrophic because the employing commander probably cannot alter the algorithm while operating in the field.

Human-Machine Interface

Recall that the commander, not the weapon system, makes legal determinations. This means that the human-machine interface must enable the commander to comply with the Law of War. From a legal perspective, the human-machine interface must answer at least three questions. First, what precautions can the algorithm take? Second, how good is the algorithm at taking those precautions? Finally, is command input needed to take a precaution?

As an example, suppose a commander wishes to use an autonomous anti-ship missile to attack a group of enemy warships, but a hospital ship is nearby. The first question will be critical—once the missile is in the area, can the algorithm identify the hospital ship and avoid attacking it? The second question is also important: if the algorithm can identify the hospital ship with seventy-five percent accuracy, the commander will need to weigh the risk to the hospital ship in evaluating whether the attack on the warships will comply with the Law of War. This leads to the third question, whether command input is necessary to take a precaution. Here, if the algorithm cannot recognize and respond to the presence of the hospital ship on its own, the commander may need to intervene—for example, by narrowing the area the missile is allowed to search.
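The hospital ship scenario can also be restated as a small decision procedure that tracks the three interface questions. The seventy-five percent figure comes from the example above; the ninety-five percent confidence bar is our hypothetical, and nothing in this sketch substitutes for the commander’s judgment.

```python
def precaution_check(can_identify_protected: bool,
                     identification_accuracy: float,
                     required_confidence: float) -> str:
    """Mirrors the three human-machine interface questions in the text."""
    # Q1: What precautions can the algorithm take at all?
    if not can_identify_protected:
        # Q3: Command input is needed, for example narrowing the
        # search area so the missile cannot reach the hospital ship.
        return "command input required: restrict the search area"
    # Q2: How good is the algorithm at taking the precaution?
    if identification_accuracy < required_confidence:
        # Seventy-five percent accuracy may leave too much residual
        # risk; the commander must weigh it or add other precautions.
        return "commander must weigh residual risk to the hospital ship"
    return "algorithmic identification may serve as a feasible precaution"

# The example above: identification is possible, but only with
# seventy-five percent accuracy against a hypothetical 95% bar.
print(precaution_check(True, 0.75, 0.95))
```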

While simplistic, the above example illustrates the importance of the human-machine interface. This interface must communicate how the weapon will act, and it must do so in a way that allows the commander enough judgment to satisfy Law of War obligations.

Investigations

Things will go wrong during combat operations. When autonomous weapon systems are involved, investigations may be more complex because many of the required resources may not be available at the unit level. For example, details about the algorithm’s development, data about what exactly occurred, and experts who can interpret that data may not be available to the unit using the weapon. While it is too early to know for sure, a centralized investigation process for autonomous weapon systems may be needed. Also, designers must consider future investigations while creating the weapon system, ensuring that data is appropriately preserved.
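What “appropriately preserved” might mean is ultimately a design question, but the sketch below illustrates one possibility: a tamper-evident engagement record that ties each event to the algorithm version and the commander’s parameters. The field names are hypothetical.

```python
import hashlib
import json
from dataclasses import dataclass, asdict

@dataclass(frozen=True)
class EngagementRecord:
    """Hypothetical fields an investigator might later need preserved."""
    timestamp_utc: str        # when the engagement occurred
    algorithm_version: str    # ties the event to development records
    mission_parameters: str   # what the commander authorized (serialized)
    sensor_summary: str       # what the system perceived (serialized)
    action_taken: str         # what the system actually did

def seal(record: EngagementRecord) -> dict:
    """Attach a content hash so later alteration of the record is detectable."""
    body = asdict(record)
    digest = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode("utf-8")).hexdigest()
    return {**body, "sha256": digest}
```

Designing records like these into the system from the start is far easier than trying to reconstruct them after an incident.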

Contract Law and Contact With Industry

Earlier we discussed the need for judge advocates to participate in the autonomous weapon system development process. While simple in theory, that participation must follow proper procedures because many weapon systems are developed by contractors, not the government. Close involvement with contractors during autonomous weapon system development creates a risk that judge advocates (and other military personnel) could run afoul of the laws and policies governing contact with industry. However, this risk can be mitigated. Department of Defense policy actually favors “frequent, fair, even, and transparent dialogue with industry,”17 and both the Office of Federal Procurement Policy and the DoD Standards of Conduct Office have embarked on a “Myth-Busting” campaign to educate government personnel on how to interact successfully with industry.18 While a detailed discussion is outside the scope of this article, experts in these areas cannot be left out of the autonomous weapon system discussion. For a fuller discussion of the acquisition of disruptive technology, consult Maj. Andrew Bowne’s article, which also appears in this issue of The Army Lawyer.

Possible External Constraints

Law, policy, and public opinion on artificial intelligence and lethal autonomous weapon systems are developing rapidly. Several developments could emerge as external constraints on the U.S.’s ability to develop autonomous weapon systems.

First, there is the possibility that States that are parties to the CCW19 may agree on an additional protocol regulating lethal autonomous weapon systems. The CCW provides a framework for States to regulate certain types of weapons. The U.S. is a party to the CCW and to its five20 currently existing protocols. The U.S. is also participating in the Group of Governmental Experts working to determine the way forward for these weapons. There has been little progress, however, toward a new protocol.21 Even if a new protocol were created, it would be binding only on States that consent to it.

Second, States could create an entirely new treaty outside the CCW framework. This would be similar to the way in which States approached cluster munitions and antipersonnel landmines.22 While States seem content with the CCW process at this point, that may change in the future. Of course, such a treaty would only be binding upon States that become a party to it.

Finally, States could be limited by individuals, corporations, or other groups that voluntarily commit not to develop lethal autonomous weapon systems. Many such commitments exist and have received significant press attention. This is perhaps the most significant potential external limitation, as the competition for top talent in artificial intelligence research is fierce.23 For judge advocates, the opportunities for outreach in this area are significant.24 Areas of focus include the need to allow commanders to exercise appropriate human judgment, the need for accountability, and the importance of government cooperation with industry.25

Conclusion

Hopefully this brief overview will be helpful as our Corps prepares for the many ways autonomous weapon systems will affect military operations. As discussed during the Worldwide CLE, there is a lot of work to be done to solve the many legal issues that autonomous weapon systems will create.26 We are confident, however, that judge advocates, applying the fundamental principles of the Law of War, will be able to solve them. TAL

 


MAJ Sleesman is an associate professor and CAPT Huntley is a professor in the National Security Law Department at TJAGLCS.



Notes

1. Many thanks to Maj. Andrew Bowne, Contract and Fiscal Law Department, The Judge Advocate General’s Legal Center and School, without whose assistance this article would not have been possible. Despite his significant assistance, any errors are our own.

2. U.S. Dep’t of Def., Dir. 3000.09, Autonomy in Weapon Systems (21 Nov. 2018) [hereinafter DoD Directive 3000.09].

3. Id.

4. U.S. Working Paper, Autonomy in Weapon Systems, para. 10 (10 Nov. 2017) [hereinafter Autonomy in Weapon Systems]. For background on the Group of Governmental Experts established by the Fifth Review Conference of the High Contracting Parties to the CCW, see Hayley Evans, Lethal Autonomous Weapons Systems at the First and Second U.N. GGE Meetings, Lawfare (Apr. 9, 2018), https://www.lawfareblog.com/lethal-autonomous-weapons-systems-first-and-second-un-gge-meetings.

5. Id. para. 13.

6. Id.

7. U.S. Dep’t of Def., DoD Law of War Manual para. 5.5 (Dec. 2016) [hereinafter Law of War Manual]; see also Autonomy in Weapon Systems, supra note 4, para. 10.

8. Law of War Manual, supra note 7, para. 5.10; see also Autonomy in Weapon Systems, supra note 4, paras. 13-14.

9. Autonomy in Weapon Systems, supra note 4, para. 15.

10. DoD Directive 3000.09, supra note 2, para. 4(a).

11. See Working Paper Submitted by the Bolivarian Republic of Venezuela on Behalf of the Non-Aligned Movement (NAM) and Other States Parties to the Convention on Certain Conventional Weapons (CCW), General Principles on Lethal Autonomous Weapons Systems, https://www.unog.ch/80256EDD006B8954/(httpAssets)/E9BBB3F7ACBE8790C125825F004AA329/$file/CCW_GGE_1_2018_WP.1.pdf.

12. Human Rights Watch, Heed the Call: A Moral and Legal Imperative to Ban Killer Robots (Aug. 21, 2018), https://www.hrw.org/report/2018/08/21/heed-call/moral-and-legal-imperative-ban-killer-robots.

13. U.S. Working Paper, Human-Machine Interaction in the Development, Deployment and Use of Emerging Technologies in the Area of Lethal Autonomous Weapons Systems, para. 11 (Aug. 28, 2018).

14. DoD Directive 3000.09, supra note 2, para. 4(c).

15. Id. para. 4(d). These officials are the Under Secretary of Defense for Policy; the Under Secretary of Defense for Acquisition, Technology, and Logistics; and the Chairman of the Joint Chiefs of Staff. Id.

16. Autonomy in Weapon Systems, supra note 4, para. 29 (“[a]dvanced applications of autonomy in weapon systems can allow for issues that would normally only be presented in the context of the use of the weapon system to be presented in the context of the development of the weapon system.”).

17. Memorandum from Deputy Secretary of Defense to the Secretaries of the Military Departments, et al., subject: Engaging with Industry (Mar. 2, 2018).

18. Id. See also Defense Acquisition University, Contracting Subway Map, https://www.dau.mil/tools/Documents/Contracting%20Subway%20Map/DAU%20Contracting%20Subway%20Map.html.

19. Convention on Prohibitions or Restrictions on the Use of Certain Conventional Weapons Which May Be Deemed to Be Excessively Injurious or to Have Indiscriminate Effects, Oct. 10, 1980, 1342 U.N.T.S. 137, https://treaties.un.org/pages/viewdetails.aspx?src=treaty&mtdsg_no=xxvi-2&chapter=26&lang=en [hereinafter CCW].

20. There are currently Protocols I-V, but Protocol II has been amended.

21. Details about the Group of Governmental Experts can be found at: https://www.unog.ch/80256EE600585943/(httpPages)/8FA3C2562A60FF81C1257CE600393DF6?OpenDocument.

22. CCW, supra note 19; Convention on the Prohibition of the Use, Stockpiling, Production and Transfer of Anti-Personnel Mines and on Their Destruction, Mar. 1, 1999, 2056 U.N.T.S. 211, https://treaties.un.org/Pages/ViewDetails.aspx?src=IND&mtdsg_no=XXVI-5&chapter=26&clang=_en.

23. Cade Metz & Adam Satariano, Silicon Valley’s Giants Take Their Talent Hunt to Cambridge, N.Y. Times (July 3, 2018), https://www.nytimes.com/2018/07/03/technology/cambridge-artificial-intelligence.html.

24. Memorandum from The Judge Advocate General to Judge Advocate Legal Services Personnel, subject: Guidance for Strategic Legal Engagements (Sept. 8, 2016).

25. Brigadier General R. Patrick Huston, Future War and Future Law, Lawfire (Dec. 3, 2018), https://sites.duke.edu/lawfire/2018/12/03/guest-post-bg-pat-huston-on-future-war-and-future-law/.

26. Good references already exist, including Michael N. Schmitt & Jeffrey S. Thurnher, “Out of the Loop”: Autonomous Weapon Systems and the Law of Armed Conflict, 4 Harv. Nat’l Sec. J. 231 (2013), and Paul Scharre, Army of None (2018).