From remotely piloted surveillance systems that collect and process vast amounts of battlefield imagery to AI-enabled decision support systems that help develop targeting lists, the use of AI in warfare is growing rapidly. International law expert Jessica Dorsey discusses the implications of increasingly high-tech warfare.
The datafication of warfare and the increasing autonomy of weapons systems are shifting the focus on the battlefield to speed and scale. We see it now in Gaza and Ukraine, two armed conflicts with tens of thousands of civilian casualties, and we will continue to see it wherever wars are fought. At the same time, the dual-use nature of AI offers opportunities within warfare to support decision-making, train personnel through simulation, map battlefield infrastructure, and improve the protection of civilians.
How do we govern the use of AI and set boundaries that effectively address the ethical, legal, and practical implications of AI-enabled warfare?
Jessica Dorsey, J.D., LL.M., is an Assistant Professor of International and European Law at Utrecht University. She leads the Realities of Algorithmic Warfare project, directs the Public International Law Clinic, and is an Expert Member of the Global Commission on the Responsible Use of AI in the Military Domain. Her current research focuses on the legitimacy of military targeting operations in the face of increasing autonomy in warfare, with particular attention to how transparency, accountability, and the rule of law contribute to that legitimacy. She also advises the Dutch Ministry of Defense and the US Department of Defense on Protection of Civilians (PoC) policy, including efforts to integrate PoC principles within military AI.
Ticket reservation recommended
To be assured of a seat for this lecture, we recommend reserving a ticket (black "order" button).
SG & USE/ITEC registration
Please register for SG & USE/ITEC by scanning your student ID at the venue prior to the start of the program.