

June 13th 2025

Autonomy and Anticipation: Drone Swarms and the Limits of Jus ad Bellum


By Isabel Funari

Isabel Funari holds a Bachelor's degree in International Relations and Organisations and a recently obtained Master's degree in Public International Law, both from Leiden University. Her research focuses on the intersection of artificial intelligence and international law, drones and the law, and criminal justice and politics. Find Isabel Funari on LinkedIn.

Image by Alessio Soggetti

Historically, technological advances have been among the primary drivers of change in warfare. An illustrative example is the use of single drones, also known as Unmanned Aerial Vehicles (UAVs). Single drones are electrically driven, have finite operational time, and are used (semi-)autonomously (Floreano & Wood, 2015). When states use single drones, they reduce military spending as well as risks to military personnel (Ayamga et al., 2021). However, as UAVs have become central to contemporary warfare, they raise concerns about civilian loss, collateral damage, and diminished human control over lethal decisions (Kreps & Maxey, 2021).


The Threat of Drone Swarms


Recently, focus has shifted towards drone swarms. This shift is visualized in the ‘Slaughterbots’ video by the Future of Life Institute and prominent AI expert Stuart Russell. The video went viral after its release and features swarms of thousands of drones operating in unison (Sugg & Wood, 2025). Drone swarms can add or remove drones, break into smaller groups, adjust to new information, be equipped with different munitions and sensors, and coordinate at different speeds (Zieliński, 2023). Ukraine’s increasing production of AI-enabled swarms, along with China’s aspiration for dominance, evokes US reciprocation, triggering an AI arms race. As states compete to develop this military technology, tensions rise over fears of escalation (Bajak, 2024).


Drones play a pivotal role not only during armed conflict, but also at its onset or escalation. For instance, on October 4, 2023, the US downed a Turkish drone in Syria after it entered a restricted zone close to US forces. The drone was part of Turkish strikes launched in response to a terrorist attack targeting the Turkish Interior Ministry in Ankara. Such incidents raise concerns about escalation, including retaliatory strikes (Hayyar & Grimmig, 2023).


These concerns are especially warranted due to the existence of fully autonomous single drones, capable of carrying out strikes without human oversight (Rickli & Mantellassi, 2024). The creation of fully autonomous drone swarms (FADS), however, proves much more complex, especially in regard to ethical decision-making. Current drone swarms are therefore mainly operated with human oversight. As drone swarm technology advances swiftly, however, widespread FADS usage is expected in the near future (Guitton, 2021). The consequent loss of human control over drone decision-making triggers various operational concerns, such as cybersecurity vulnerabilities and target misidentification risks.


The Autonomy of Drone Swarms


First, it is important to outline precisely what the autonomy of fully autonomous drone swarms (FADS) means. Instead of human control, algorithms control the swarm and are essential to its functioning. The control system of FADS is divided into two stages: perception and planning. In the perception stage, FADS depend entirely on algorithms: each individual UAV is equipped with sensors, and algorithms form a fundamental component because they process the data these sensors collect. In the planning stage, algorithms translate the processed data into instructions for operational tasks. For instance, the Kalman filter is commonly used for navigational purposes (Zhang et al., 2021).
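To give a concrete sense of the kind of algorithmic processing at work here, the following is a minimal, purely illustrative sketch of a one-dimensional Kalman filter that fuses noisy position readings into a single estimate. Real swarm navigation filters track multidimensional state (position, velocity, attitude) and are far more sophisticated; the noise values and variable names below are assumptions for illustration, not drawn from any actual system.

```python
# Minimal 1-D Kalman filter: blends noisy sensor readings into a
# position estimate whose uncertainty shrinks over time.

def kalman_step(x, p, z, q=0.01, r=0.5):
    """One predict/update cycle.
    x: current estimate, p: estimate variance,
    z: new sensor reading, q: process noise, r: sensor noise."""
    # Predict: the platform is assumed stationary here,
    # so only the uncertainty grows by the process noise.
    p_pred = p + q
    # Update: the Kalman gain weighs prediction against measurement.
    k = p_pred / (p_pred + r)
    x_new = x + k * (z - x)
    p_new = (1 - k) * p_pred
    return x_new, p_new

# Noisy readings of a true position of 1.0 (illustrative values).
x, p = 0.0, 1.0  # initial guess and its variance
for z in [1.2, 0.9, 1.1, 1.0]:
    x, p = kalman_step(x, p, z)
# The estimate moves toward 1.0 while the variance shrinks.
```

The point of the sketch is only that such filtering is fully algorithmic: each cycle turns raw sensor data into a refined state estimate without any human judgment in the loop, which is exactly the property that matters for the legal analysis that follows.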


Drone Swarms’ Complications for Anticipatory Self-defense


After 1945, a framework was created within the UN Charter to prohibit the use of force in international relations: “to save succeeding generations from the scourge of war” (UN Charter, Preamble). Article 51 of the UN Charter allows states to use force to defend themselves if an armed attack occurs against them. However, the international security and conflict landscape has changed drastically from the environment in which the Charter provisions regulating the use of force were created. When phrasing this aspirational goal, the founders of the UN Charter had in mind the two preceding World Wars, which had caused grave suffering for humankind during the twentieth century. They could not, however, have accounted for the rapid post-1945 advancements in the means and methods of warfare.


Having outlined how the autonomy of drone swarms works, the technology and the law can be brought together. Anticipatory self-defense means that if an armed attack is imminent, the defending state may intercept the attack instead of waiting until it is launched (Schuller, 2014). According to Webster’s (1841) formula, imminence can be understood as a threat being “instant, overwhelming, leaving no choice of means, and no moment for deliberation.” Regarding FADS, however, their use or activation might be detected before an attack has begun. This raises the question of whether a defending state may pre-emptively neutralize or disable the drone swarm.


Perception Stage
A vital issue concerns how the perception and planning stages outlined above affect the evaluation of imminence. In the perception stage, algorithms process large amounts of information to recognize threats, enabling the drone swarm to assess its environment. The difficulty for a defending state lies in differentiating between data processing and the early stage of an attack. If a swarm only collects data or improves its positioning, can this be considered the first step of an armed attack? The data available to the defending state may be inconclusive, especially given the decentralized nature of the drone swarm’s decision-making.


Planning Stage
Likewise, in the planning stage, the drone swarm's algorithms turn data into actionable strategies. The lack of direct human control means a drone swarm could autonomously switch from reconnaissance to attack on the basis of its algorithms alone. This gives rise to uncertainties, such as whether a defending state may invoke anticipatory self-defense before the drone swarm executes its plan to attack, especially when its decision-making is dynamic and ambiguous. The issue is thus whether a state can lawfully neutralize or disable a drone swarm before it launches an attack, particularly when the exact threshold at which an attack by FADS becomes imminent is unclear.


A Dangerous Road Ahead?


So far, it has become evident that the loss of human control creates significant uncertainty as to whether a state's use of force is lawful in anticipatory self-defense scenarios. Compared to conventional weapons such as missiles, the distinct risk of drone swarms lies in states resorting to force earlier, making the threshold for an imminent armed attack increasingly uncertain. Promising further steps could include substantive dialogue on which sorts of drone swarms provide the most military value for states and which are the most high-risk and should not be allowed under certain circumstances. This substantive dialogue would then form the basis for either the creation of a separate legal framework or the classification of drone swarms as weapons of mass destruction.


A Separate Legal Framework
Further research is needed to evaluate the effectiveness of existing arms control regimes in regulating drone swarms. It may be necessary to establish a separate legal framework, such as a drone swarm treaty. A separate legal regime could enable military powers to distinguish the drone capabilities they fear giving up from the drone swarm technologies whose proliferation weapons regulation advocates fear. This may make it easier to obtain approval from such states. The treaty could outline clear positive and negative obligations, creating clear parameters within which states can lawfully use drone swarms in anticipatory self-defense scenarios.


Weapons of Mass Destruction
Another avenue worth exploring is drawing lessons from arms control regulations for weapons of mass destruction (WMD). These frameworks could accelerate the creation of arms control regulations due to their normative function. Classifying FADS as a type of WMD could lead to stigmatization similar to that of nuclear, chemical, and biological weapons.


A stigma refers to something demeaning or dishonourable. Sociologists Bruce Link and Jo Phelan (2001) conceptualize stigmatization as the co-occurrence of four intertwined mechanisms: (1) labelling, (2) stereotyping, (3) separation, and (4) status loss. Applying this conceptual framework to autonomous drone swarms suggests a potential path toward their stigmatization.

 

First, drone swarms are labelled as ‘flying killer robots’ or ‘killer robot swarms,’ highlighting the absence of human oversight. Second, research on drone swarms reinforces stereotypes about their effects, much as research on nuclear winter did for nuclear weapons. These associations create a perception of drone swarms as uncontrollable and dangerous. Third, this perceived threat contributes to their separation: they are set apart precisely because they appear uncontrollable. Lastly, the negative labelling of autonomy in drone swarms causes status loss: social hierarchies in international security discourse can rank drone swarms low on the scale, much as stigma operates along other social categories such as race. This facilitates regulatory limitations based on perceived dangers rather than on specific actions.


Conclusion


Overall, current state practice is scarce, and many open questions need to be addressed before drone swarms can be used lawfully within the parameters of the jus ad bellum. Promising avenues include creating a separate legal framework or classifying drone swarms as weapons of mass destruction to accelerate international norms and restrictions. What is evident, however, is that drone swarms pose a significant risk to the jus ad bellum, and this risk will only keep increasing as drone technology and artificial intelligence swiftly advance. Finding a solution requires striking a fine balance between collective security and states' self-interested pursuit of military advantage.



References


Ayamga, M., Akaba, S., & Nyaaba, A. A. (2021). Multifaceted applicability of drones: A review. Technological Forecasting and Social Change, 167, 120658.


Bajak, F. (2024, April 12). US — Chinese military planners gear up for new kind of warfare. AP News. https://apnews.com/article/us-china-drone-swarm-development-arms-race-e5808a715415d709f466da00cdeab10f


Floreano, D., & Wood, R. J. (2015). Science, technology and the future of small autonomous drones. Nature, 521(7553), 460–466.


Guitton, M. J. (2021). Fighting the locusts: Implementing military countermeasures against drones and drone swarms. Scandinavian Journal of Military Studies, 4(1), 26–36.


Hayyar, M. E., & Grimmig, A. (2023, October 20). The legality of the downing of the Turkish drone by the US. EJIL: Talk!. https://www.ejiltalk.org/the-legality-of-the-downing-of-the-turkish-drone-by-the-us/


Kreps, S., & Maxey, S. (2021). Chapter 4: Context matters: The transformative nature of drones on the battlefield. In Technology and international relations.


Link, B. G., & Phelan, J. C. (2001). Conceptualizing stigma. Annual Review of Sociology, 27, 363–385.


Rickli, J. M., & Mantellassi, F. (2024). The war in Ukraine: Reality check for emerging technologies and the future of warfare (Geneva Paper No. 34/24). Geneva Centre for Security Policy.


Schuller, A. (2014). Inimical inceptions of imminence: A new approach to anticipatory self-defense under the law of armed conflict. UCLA Journal of International Law and Foreign Affairs, 18, 161–184.


Sugg, S., & Wood, M. (n.d.). Slaughterbots [Video]. YouTube. https://youtu.be/O-2tpwW0kmU?si=anqrznXhdQ-LF6oO


United States Secretary of State Daniel Webster. (1841). Note dated 24 Apr. 1841. British and Foreign State Papers, 29, 1137.


Zieliński, T. (2023). Factors determining a drone swarm employment in military operations. Journal of Security and Strategic Studies, 10, 61–80.


Disclaimer: The International Platform for Crime, Law, and AI is committed to fostering academic freedom and open discourse. The views and opinions expressed in published articles are solely those of the authors and do not necessarily reflect the views of the journal, its editorial team, or its affiliates. We encourage diverse perspectives and critical discussions while upholding academic integrity and respect for all viewpoints.
