AI IN ARMED CONFLICT: APPLICATION OF INTERNATIONAL HUMANITARIAN LAW TO AUTONOMOUS WEAPONS

Published on: 10th October 2025

Authored by: V. Dhamini
Dr. B.R. Ambedkar National Law University, Sonepat, Haryana

Abstract

This article studies the nexus between Artificial Intelligence and armed conflict, with a specific focus on Autonomous Weapons Systems (AWS) and their compliance with International Humanitarian Law (IHL). It examines the core principles of IHL (distinction, proportionality, and accountability) and evaluates how AWS mechanisms measure against them. It also surveys the progress of the Indian defence sector in applying AI to warfare. The article proposes reforms, including the establishment of an international treaty regulating AWS, the creation of a global registry for transparency, and the imposition of legal liability on developers and operators. It concludes that active human involvement in the deployment of autonomous weapon systems is necessary for genuine compliance with International Humanitarian Law.

Keywords: AI, International Humanitarian Law, Autonomous Weapons System

Introduction

Artificial Intelligence is affecting every facet of life, increasing technical and operational efficiency across multiple fields. The defence sector is no exception: it uses AI to improve the effectiveness of warfare and armed conflict, ranging from autonomous drones and predictive maintenance to AI-driven simulations and data analysis. Among the most consequential AI-driven systems is the Autonomous Weapons System, which can engage targets without human assistance. The central concern with these systems is their ethical and technical capability to function independently, and whether their use aligns with the principles of International Humanitarian Law.

This article questions whether AWS can lawfully be deployed under the existing, comprehensive framework of IHL. It then turns to India’s evolving defence strategy and, drawing on recent examples such as Operation Sindoor, examines how AI has been applied in real-time conflicts. Lastly, it analyzes AWS within the IHL framework and proposes reforms necessary to ensure compliance with humanitarian principles.

What is International Humanitarian Law?

Also referred to as the law of armed conflict or law of war, it is a set of rules that aim to limit the effects of armed conflict. It protects persons who are not, or are no longer, directly or actively participating in hostilities, and imposes limits on the means and methods of warfare. IHL is part of public international law, which is made up primarily of treaties, customary international law and general principles of law[1].

Where is it derived from?

The core treaties of IHL comprise the four Geneva Conventions of 1949 (GC I, II, III and IV), which have been universally acceded to or ratified[2], supplemented by Additional Protocols I and II of 1977, relating to the protection of victims of international and non-international armed conflicts respectively. Various international treaties prohibit specific means and methods of warfare and safeguard certain categories of people and objects from the effects of conflict. IHL is also shaped by customary international law, established through consistent state practice and opinio juris, as well as by general principles of law and judicial decisions, including those of the International Court of Justice (ICJ).

Use of AI in the Defence sector in India

In the past few years, India has made significant progress in integrating artificial intelligence into its defence sector. The Ministry of Defence has proactively established institutional frameworks such as the Defence AI Council (DAIC) and the Defence AI Project Agency (DAIPA). As of 2025, DAIPA has supervised 129 AI-related programs, of which 77 have already been completed, demonstrating the rapid adoption of AI technologies. To support ongoing development, the Government has allocated ₹100 crore to each branch of the armed forces for AI-driven initiatives.

India has also focused on the development of autonomous drone platforms, with notable contributions from domestic manufacturers such as IdeaForge, whose SWITCH UAV and NETRA V2 platforms stand out. The Drone Federation of India (DFI), representing over 550 indigenous manufacturers, has played a vital role in establishing India as a leader in global drone technology. The Indian drone market is projected to grow to $11 billion by 2030, potentially capturing 12% of the global market share[3].

Usage of AI in Operation Sindoor

At the forefront of Operation Sindoor, AI-powered drones formed the backbone of India’s offensive capabilities. Drones such as the Harop, Heron and SkyStriker demonstrated autonomous loitering, target identification, and engagement capabilities. Loitering munitions effectively neutralized enemy radars and air defences within 25 minutes, outmanoeuvring Chinese-supplied systems and validating India’s edge in electronic warfare and precision weaponry.[4]

The Army’s AI-based Integrated Mission System (IMS) improved mission efficiency, while the Defence Research and Development Organisation’s (DRDO) Air Defence Control and Reporting System (ADC&RS) integrated sensor data and weapon control to intercept threats before they could enter Indian airspace.

Autonomous Weapons System

AI is often associated with job displacement, but what if there were also AI systems designed to take lives? Autonomous weapons systems, also called lethal autonomous weapons systems (LAWS), select and apply force to targets without human intervention. The level of human involvement can vary: from “human-in-the-loop” systems requiring authorisation for each engagement, to “human-on-the-loop” systems where a human can override autonomous actions, to “human-out-of-the-loop” systems operating without any human involvement after activation.[5]

After initial activation or launch by a person, an autonomous weapon system self-initiates or triggers a strike in response to information from the environment received through sensors and on the basis of a generalized “target profile”.[6] Examples of such automated weapons include defensive systems like the Israeli Iron Dome and the German MANTIS, as well as the Swedish LEDS-150 and the South Korean Super aEgis II.[7]

Are these AWS mechanisms in compliance with IHL?

Two core principles of IHL must be examined in relation to autonomous weapons systems: proportionality and distinction.

Under the principle of distinction, the operator of a weapon system must distinguish between military objectives and civilian objects, between combatants and civilians, and between active combatants and those hors de combat. Under the rule of proportionality, the operator must determine whether an attack may be expected to cause incidental civilian casualties, damage to civilian objects, or a combination thereof, which would be excessive in relation to the concrete and direct military advantage anticipated[8]. Can an autonomous weapon system differentiate between military and civilian objects? Is it capable of determining whether an attack will cause incidental civilian harm?

These IHL rules create obligations for human combatants in the use of weapons to carry out attacks; it is combatants who are responsible for respecting these rules and who will be held accountable for any violations. As with all obligations under international law, these legal obligations, and accountability for them, cannot be transferred to a machine, computer program or weapon system. The abovementioned principles thus presuppose human agency. On the question of accountability, IHL’s mandate has always been to hold human actors accountable for war crimes or grave breaches of the Geneva Conventions[9]. There is no provision for holding an AWS itself accountable. Who would be responsible if an AWS commits an error? There is no real accountability, since IHL assumes that the attack is carried out by an identifiable human actor. Accountability for AWS therefore remains vague and imprecise.[10]

The question remains, however, what limits are needed on autonomy in weapon systems to ensure compliance with IHL?

There is general agreement among Convention on Certain Conventional Weapons (CCW) States Parties that “meaningful” or “effective” human control, or “appropriate levels of human judgement”, must be retained over weapon systems and the use of force.[11] The Chair’s summary of the April 2016 CCW informal meeting of experts states that views on appropriate human involvement with regard to lethal force, and on the issue of delegating its use, are of critical importance to the further consideration of LAWS.[12]

For its part, the ICRC has called for human control to be maintained over weapon systems and the use of force to satisfy legal and ethical requirements. It follows that some degree of human control over the functioning of an autonomous weapon system, translating the intention of the user into the operation of the weapon system, will always be necessary to ensure compliance with IHL, and this may indeed limit the lawful level of autonomy.[13]

Can the use of AI in war be beneficial to humans?

Indeed, AI has negative aspects, but if its use can help spare human lives, why not make full use of it? AI robots can be deployed to perform dangerous tasks, while human soldiers control them remotely from the rear. Further, AI can be trained to avoid designated civilian zones and to assess threats more accurately. There is also a deterrence argument: as AI becomes core to military systems, nations may hesitate to strike each other, because attacking one AI system could cause unpredictable ripple effects across both sides.[14] Therefore, if used responsibly and with the right precautions, AI can indeed be beneficial.

Possible Reforms

Multiple reforms are required at the national and international levels for the safe use of AWS.

  1. An international treaty specifically governing AWS, containing:
  • A clause mandating a human-control fallback, i.e. a reverse mechanism allowing a human to override the system if the AWS makes a wrong decision.
  • A clause establishing a Global AWS Registry that publicly reports all AWS deployments, targeting errors, and system failures.
  • A clause ensuring programmers and manufacturers can be held liable under existing war crimes statutes.
  • A clause mandating the involvement of legal scholars, ethicists, and human rights experts in the system design phase.
  2. National forums in all countries to regulate the development and operation of AWS at the national level.

Conclusion

The increasing role of Artificial Intelligence in armed conflict brings both promise and peril. While AI-driven autonomous weapons systems offer precision, efficiency, and strategic advantages, they also present serious ethical, legal, and accountability concerns under International Humanitarian Law. The principles of distinction, proportionality, and accountability remain cornerstones of IHL, and ensuring compliance with them becomes increasingly challenging as autonomous systems take on greater decision-making authority. The solution lies in maintaining meaningful human control, developing international treaties, and creating transparent monitoring frameworks.

[1] Advisory Service on IHL, *International Humanitarian Law: A Primer* (ICRC 2022) <https://www.icrc.org/sites/default/files/document/file_list/what_is_ihl.pdf> accessed 8 June 2025.

[2] International Committee of the Red Cross, “Geneva Convention Relative to the Protection of Civilian Persons in Time of War of 12 August 1949” (1949) <https://www.un.org/en/genocideprevention/documents/atrocity-crimes/Doc.33_GC-IV-EN.pdf> accessed 8 June 2025.

[3] Ministry of Electronics and Information Technology, *MeitY and Drone Federation India Launch National Innovation Challenge for Drone Research (NIDAR) under SwaYaan Initiative* (Press Release, 2024) <https://www.meity.gov.in/> accessed 9 June 2025.

[4] Government of India, ‘Operation SINDOOR: The Rise of Aatmanirbhar Innovation in National Security’ (2025) <https://static.pib.gov.in/WriteReadData/specificdocs/documents/2025/may/doc2025514554901.pdf> accessed 9 June 2025.

[5] Jaspreet Bindra, ‘AI Warfare Is Here: How Intelligent Drones Harop and Heron Fronted India’s Operation Sindoor’ *The Economic Times* (17 May 2025) <https://economictimes.indiatimes.com/news/defence/ai-warfare-is-here-how-intelligent-drones-harop-and-heron-fronted-indias-operation-sindoor/articleshow/121239839.cms?from=mdr> accessed 9 June 2025.

[6] ‘ICRC Position on Autonomous Weapon Systems’ (International Committee of the Red Cross, 21 January 2022) <https://www.icrc.org/en/document/icrc-position-autonomous-weapon-systems> accessed 9 June 2025.

[7] Yihan Deng, ‘AI & the Future of Conflict | GJIA’ (Georgetown Journal of International Affairs, 10 December 2024) <https://gjia.georgetown.edu/2024/07/12/war-artificial-intelligence-and-the-future-of-conflict/> accessed 9 June 2025.

[8] Neil Davison and International Committee of the Red Cross, ‘Autonomous Weapon Systems under International Humanitarian Law’ (No 30, 2018) <https://www.icrc.org/sites/default/files/document/file_list/autonomous_weapon_systems_under_international_humanitarian_law.pdf> accessed 10 June 2025.

[9] *Prosecutor v Dominic Ongwen* (Judgment) (International Criminal Court, Trial Chamber IX, 4 February 2021) ICC-02/04-01/15.

[10] ‘Autonomy in Weapon Systems: The Unpredictable and Unaccountable Confluence of Mechanization of Weapons and International Humanitarian Law’ (2024) 1 HNLU Student Law Journal 62 <https://hnlu.ac.in/wp-content/uploads/2025/01/Chapter_04.pdf> accessed 10 June 2025.

[11] International Committee of the Red Cross, ‘The Element of Human Control’ (2018) <https://documents.un.org/doc/undoc/gen/g18/344/72/pdf/g1834472.pdf> accessed 10 June 2025.

[12] Chairperson of the Informal Meeting of Experts, ‘Report of the 2016 Informal Meeting of Experts on Lethal Autonomous Weapons Systems (LAWS)’ (2016) <https://unoda-documents-library.s3.amazonaws.com/Convention_on_Certain_Conventional_Weapons_-_Informal_Meeting_of_Experts_(2016)/ReportLAWS_2016_AdvancedVersion.pdf> accessed 10 June 2025.

[13] Davison and ICRC, ‘Autonomous Weapon Systems under International Humanitarian Law’ (n 8).

[14] Bindra, ‘AI Warfare Is Here’ (n 5).
