Welcome to the first blog post of the U.S. Army Judge Advocate General's Corps Future Concepts Directorate (FCD). We are excited to introduce our directorate and the interesting topics FCD will be discussing over the next year and beyond.
The FCD is the JAG Corps' think tank and is one of four directorates of The Judge Advocate General's Legal Center and School, located on the campus of the University of Virginia in Charlottesville. Its mission is to serve as the JAG Corps' focal point for the study of the law of future armed conflict by assessing the legal requirements of the future operational environment. It also reviews Army doctrine on behalf of the JAG Corps and provides the intellectual foundation to design, develop, and field a globally responsive future JAG Corps.
FCD Mission
The FCD operates along three primary lines of effort: future conflict, doctrine, and strategic initiatives. First, it seeks to be the premier organization within the United States Government for the study of the law of future conflict. We think of this broadly as applying the law of armed conflict to the future operational environment, or LOAC-F. FCD partners or engages with any organization thinking about technology and its application to the future battlefield. Second, FCD provides timely, ethical, responsive, and purposeful support and analysis to the Army's doctrine development organizations. Third, FCD provides the same support to the JAG Corps' own strategic initiatives in order to prepare its legal professionals to support future multi-domain operations.
Resources
Our goal is to make the FCD website a one-stop shop for all matters pertaining to LOAC-F, with news, analysis, and reports from our experts and partners. The site, found here, will contain links to relevant articles from the field and academia, frequent blog posts, links to the FCD podcast, "Battlefield NEXT," and news about technology, law, and future warfare.
We will also be highlighting interesting and useful information on The Judge Advocate General’s Legal Center and School’s Lifelong Learning website. Lifelong Learning can be found here and contains noteworthy news, articles, and resources that can be used for professional development.
The FCD will also be providing expanded reading lists that include examples of what our military leaders are reading about the military profession and strategic environment. However, we will also include other works that might seem unrelated to our dual profession but nevertheless offer different lenses through which we see issues. The objective is to spark creativity and inspiration in order to see the future more clearly. A few examples of works we are reading right now include Ghost Fleet by P.W. Singer and August Cole, The Light Brigade by Kameron Hurley, Army of None by Paul Scharre, and East West Street by Philippe Sands. We have also been listening to the podcasts Revisionist History, Hardcore History, Bombshell, and the podcast of the Modern War Institute at West Point.
Regular Blog Posts
Substantive topics we will attempt to tackle this year include the use of artificial intelligence, offensive cyber operations, space operations (including ground operations in space), autonomous weapons, ultrasonic effects, low-yield tactical nuclear devices, emerging biological threats, deep fakes and their dangers to national security, private special operations-capable organizations in light of Syria and Crimea, and the effects of technology on future civilian populations.
Although there is much discussion about the use of emerging technology on the battlefield, many future conflicts will still bear characteristics similar to present-day conflicts in places like Syria, Libya, and Yemen. Accordingly, we will continue to explore chronic issues in warfare that will likely persist into the future, including the use of explosive ordnance in urban areas as cities grow larger and more densely populated, the continuing unlawful practice of targeting medical personnel and facilities, and accountability mechanisms.
Artificial Intelligence and Autonomous Weapons Systems
Artificial intelligence and autonomous weapons systems offer unique challenges for the future battlefield, largely due to the uncertainty about how they will be employed. Accordingly, we will place extra emphasis on AI and autonomous weapons. Using the four pillars of the law of armed conflict as his analytical foundation – military necessity, distinction, proportionality, and humanity – COL Jeffrey Thurnher argues that the legal risks associated with autonomous weapon systems can be operationally mitigated. He also states correctly that the "lack of a human to hold accountable does not undermine the lawfulness of the weapon system."
Nevertheless, a lawful weapon might be used in an unlawful manner, and although the law of armed conflict does not require accountability (the decision to prosecute a suspect is left to prosecutorial discretion, though prosecution is sometimes required for societal or political reasons), a state may still wish to pursue accountability. Further, the law does require states to be able to control the effects of their weapons. For this reason, methods of war crime accountability must remain an important part of the discussion, lest the lack of an attributable human eliminate the option to prosecute a war crime.
Vulnerability
The use of technology necessarily creates technological vulnerabilities. Consider this scenario. Sometime in the future, an army deploys a lethal autonomous robot. The opponent, suffering from local tactical overmatch, conducts a cyber-attack on the robot, causing it to be unable to distinguish its targets. The robot kills civilians, and the enemy exploits the tragedy by publishing photographs of the aftermath across all manner of social media. Perhaps the enemy even exacerbates the situation by publishing photographs enhanced with deep fake technology, leading the public to believe the victims were children. Public outrage demands accountability, and military leaders conduct an investigation revealing that both the manufacturer and the military knew that this particular weapon system was vulnerable to cyber-attack. Who is responsible? Some of our academic partners, like Dr. Rebecca Crootof at the University of Richmond School of Law, are tackling this very issue.
In 1716, Christopher Bullock wrote in The Cobbler of Preston, "'Tis impossible to be sure of any thing but Death and Taxes." Had he written this in the present day, he might also have included "technology that breaks." It is not a matter of whether technology will malfunction, but when. And when it happens with objects designed to cause destruction, the unintended consequences can be catastrophic. International law lacks a criminal negligence mens rea, and the law of armed conflict certainly contains no products liability provision. Will states in the future demand such a regime? Will there be formal dispute settlement mechanisms such as the one outlined in the United Nations Convention on the Law of the Sea? Will dispute mechanisms be limited to state parties, as with the International Court of Justice, or will private parties also be able to participate? As a matter of national policy, which states will voluntarily provide compensation for victims, and which states will not? What about inadvertent data spillage by government actors resulting in the public disclosure of private information? Will governments voluntarily allow affirmative claims?
Closer to home, American astrophysicist Neil deGrasse Tyson stated, "once you have an innovation culture, even those who are not scientists or engineers – poets, actors, journalists – they, as communities, embrace the meaning of what it is to be scientifically literate." The U.S. Army is in the process of redesigning its acquisition system in order to be more responsive to emerging and future threats. Will American military lawyers need to be more scientifically literate and get involved earlier in the research, development, and acquisition process?
The law of future conflict is full of complexities and uncertainties such as these. The FCD will confront these issues head-on in order to prepare the JAG Corps for future conflict. Whether we are discussing warp drives, robot soldiers, or tactical directed-energy weapons, no topic is off limits to us and there are no bad ideas. If you have a topic you would like to discuss, you can always reach us at usarmy.pentagon.hqda-tjaglcs.list.tjaglcs-doctrine@mail.mil. We look forward to partnering with you and holding interesting discussions.
LTC Matt Krause
Director, Future Concepts Directorate
Charlottesville, Virginia