Autonomous weapons and laws of war: Navigating legality under international humanitarian law with India’s emerging role in AI-driven conflict

Published On: October 11th 2025

Authored By: Safia Shahnawaz
Faculty of Law, Aligarh Muslim University

Abstract

Will robots ever have human intelligence? AI is now widely treated as the future, and this idea evinces one of the most profound shifts in global power: one in which human control is challenged and, if mishandled, the future could be doomed. Robotics is often a fascinating concept, but in the realm of autonomous weapons it is terrifying. On lethal autonomous weapon systems, the GGE as well as the ICJ have demonstrated on various occasions that lethal means lethal, and that unchecked proliferation could end humanity. The world could be doomed; but if the matter is handled under the protocols presented by the CCW and international humanitarian law, the idea of development that also empowers GDP can be locked in. The debate is ongoing, and experts meet regularly to assess the possibilities. Earlier examples such as biotechnology, which enabled mass destruction and the infliction of terror, suggest that these technologies too could lead to the emergence of a dystopian future; anything that falls into the wrong hands reveals human nature. India, for its part, is keen and observant regarding emerging technologies. Whether it can use them with precision, given its current GDP, points to a plausible future in which they become either a great opportunity for humanity or a source of terror.

Keywords – AI, Robots, Human Intelligence, Lethal autonomous weapons, GGE, ICJ, Humanitarian law, terror infliction.

Introduction

International humanitarian law (IHL) is the body of law governing armed conflict. It replaced the older ‘laws of war’, that is jus in bello, which concerned only the application of this body of law to war and armed conflict, whereas IHL rests on a clear value judgement and treats humanity and humanitarian principles as fundamental. Its inception dates to the late nineteenth century, when the Swiss Henry Dunant, having witnessed the Battle of Solferino in 1859, closely observed the suffering and agony of the wounded and founded the Red Cross movement, which acted as custodian of the humanitarian idea and later became the International Committee of the Red Cross (ICRC). Its motto, Inter Arma Caritas (‘In war, charity’), was then adopted by the Geneva Conventions, making apparent that the Conventions protect human values and humanity.

International humanitarian law is devoted to the use of weapons in hostilities, while the Geneva Conventions distinguish between weapons, means and methods of warfare; a Lethal Autonomous Weapon System (LAWS) combines more than one weapon. A nation must adhere to the ‘principle of military necessity’, which permits only the amount of force necessary to realise the purpose of the war, and to the principle of ‘humanity’, which limits the degree of violence; the two operate simultaneously, not in opposition, for the better application of the laws of war. Historically a weapon acted singly and was then delivered via a delivery machine, but the 1999 launch of the Hellfire missile marked an attempt at dehumanisation and a breach of various international regulations; instruments such as the Hague Declarations on asphyxiating gases and dum-dum bullets had been established for this reason. The Hague Conventions also established that parties to a conflict do not have an unlimited right to choose methods and means of warfare. Parties to hostilities agreed reciprocally to restrain themselves under given conditions when waging war, but this idea of limitation was initially confined to European territory; only around the end of the nineteenth century did it extend to the whole world.[1]

What are autonomous weapons?

An autonomous weapon exhibits autonomy in the rigorous sense: without human involvement, the weapon itself locates, tracks, and attacks targets, causing damage and destruction, operating on pre-existing algorithms. The targeting functions that would normally be controlled by humans are taken over by the weapon system itself, using its sensors, software and weaponry, following initial launch or activation by a human operator. Any weapon system that has the ability to choose and strike targets on its own, including some current weapons and possible future systems, falls within this working definition.[2] The definition provides a useful basis for legal analysis by delineating the broad scope of the discussion about autonomous weapon systems without the need to immediately identify the systems that raise legal concerns. Accordingly, the concept is not meant to make assumptions about the degree of autonomy in weapon systems that might or might not be deemed legal. Instead, the ICRC has suggested that states decide where these boundaries should be set by evaluating the kind and level of human supervision needed for weapon systems to carry out attacks in compliance with IHL and, additionally, to meet ethical requirements.[3]

Although international humanitarian law does not specifically define autonomous weapons, they must nonetheless operate in accordance with the principles IHL establishes. Such weapons suffer from a lack of accountability, pose threats to security, and reflect low moral standards with no respect for human dignity. They rely on algorithm-based techniques to operate, whereas with military drones the decision to take a life is made remotely by a human. An autonomous weapon is designed to kill a specific target profile using sensors, whether facial recognition or movement detection. This is regarded as lethal because it draws no distinction between kinds of moving vehicle: faced with a military tank or an ambulance, it can fire and kill either. The ‘godfather of AI’ has also warned how disastrous autonomy can be, as heavy investment now funds AI military defence systems that can target innocent lives, reshaping defence in ways that could prove catastrophic.[4]

Core principles of International Humanitarian Law

When using weapon systems, a commander or operator’s primary legal responsibilities include: ensuring that military objectives and civilian objects, combatants and civilians, and active combatants and those hors de combat are distinguished; determining whether an attack could be expected to cause incidental civilian casualties and damage to civilian objects, or both, that would be excessive in relation to the anticipated concrete and direct military advantage, as required by the rule of proportionality; and canceling or suspending an attack if it becomes clear that the target is not a military objective, is protected, or that the attack could be expected to violate the rule of proportionality as required by the rules on precautions in attack.[5]

Parties engaged in a conflict may choose methods or means of warfare, including weapons systems, but the use of lethal autonomous weapon systems is not unlimited, since such systems raise predictability problems: they cannot anticipate attacks, and they can attack without human intervention. Distinction, proportionality and precautions in attack create obligations for human combatants in the use of weapons to carry out attacks, and it is combatants who are both responsible for observing these rules and who will be held accountable for any transgressions. As with all duties under international law, these legal obligations, and accountability for them, cannot be transferred to a machine, computer program or weapon system.[6]

Human dignity: Relationship between International Humanitarian law and International Human Rights Law

Human dignity is among the most celebrated concepts in recent international and national case law. ‘Human dignity’ appears as a polymorphous concept, tasked with different and quite discrete functions depending on the context in which it is employed; it is understood as a source of State duties towards individuals, but also as a source of duties that individuals owe to themselves, which can be objective. In international human rights law (IHRL), human dignity appears in the Universal Declaration of Human Rights (UDHR), whose Preamble invokes the inherent dignity of all members of the human family and whose Article 1 declares that all human beings are born free and equal in dignity and rights. It follows that dignity is (i) a value that pertains to human beings as such, and (ii) a universal one, as it attaches to membership of a community.[7]

In the debate over the development of LAWS, opponents invoke human dignity as a weapon, shifting the question from whether such weapons can take shape to whether they should. The proposal has been rejected multiple times within the CCW framework, where Germany and countries like Chile, Ecuador, Brazil and Sri Lanka cited human dignity in response to the creation of lethal autonomous weapons. On the contrary, other States have employed ‘human dignity’ only in relation to ethics, or more generally in a sense that discards any possible legal significance. Heyns, the former UN Special Rapporteur, has written extensively on the topic, offering an axiological account of human dignity, which he applies to LAWS to show that autonomous killing reduces an individual ‘to numbers: the zeros and the ones of bits’. More importantly, in the case of LAWS human dignity would be at stake because autonomous killing does not leave open the possibility of hope.

The two bodies of law are considered complementary, on a case-by-case comparative approach aimed at reinforcing, enhancing or better interpreting the applicable rules. The ICJ confirmed as much in its landmark Advisory Opinion in the Nuclear Weapons case, where it stated that the ICCPR (and namely the right to life enshrined therein) continued to operate in time of armed conflict, and that IHL served as lex specialis to interpret the IHRL provision. In other words, what counts as an arbitrary deprivation of life in wartime falls to be determined by IHL, which marks a point of convergence between IHL and IHRL.

Age of AI: Increased capability and swarming techniques

Former U.S. Air Force Chief Scientist Werner Dahm states that ‘by 2030 machine capabilities will have increased to the point that humans will have become the weakest component in a wide array of systems and processes.’ Autonomous technology in weapons has been in use for years and continues to advance. Some current systems, like sensor-fused and loitering munitions, perform limited autonomous functions, mainly using sensors to engage pre-programmed targets within a set area. Known as ‘fire-and-forget’, these weapons operate independently after launch but still require human target selection, placing them at the lower end of Sheridan’s autonomy scale; a fine example is Israel’s Harop, which selects and engages its targets at long range, either by remote control or autonomously. Anti-personnel sentry weapons, both stationary and mobile, are used to patrol specific sites or borders. A notable example is South Korea’s aEgis I and II and Super aEgis systems deployed at the Demilitarised Zone. These turret systems use optical, thermal, and infrared sensors to detect human targets and typically require remote operator authorisation to engage; however, they reportedly have capabilities for autonomous lethal action without human input. Future autonomy will be built on existing technology, with AI taking charge and acquiring a capacity to think comparable to, or greater than, a human’s. Adaptation is made possible by sensorimotor coordination and advanced evolutionary procedures such as genetic algorithms, evolutionary strategies or genetic programming. Human cognition shows different features from AI: while machines can outperform humans in repetitive tasks, human intelligence excels in adaptability and creativity, which implies that the most capable military systems will be those optimised to take advantage of the best of both machine and human cognition.
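To make the ‘evolutionary procedures’ mentioned above concrete, the following is a minimal genetic-algorithm sketch in Python. It is a toy optimisation over bit strings, not related to any actual weapons software; all names, parameters and the fitness function are invented purely for illustration. The core idea is the same as in the text: candidate solutions are repeatedly selected and randomly mutated, and fitter variants survive.

```python
import random

random.seed(0)  # fixed seed so the toy run is reproducible

def fitness(genome):
    """Toy fitness: count of 1-bits (higher is better)."""
    return sum(genome)

def mutate(genome, rate=0.05):
    """Flip each bit independently with a small probability."""
    return [1 - g if random.random() < rate else g for g in genome]

def evolve(pop_size=20, length=30, generations=50):
    """Elitist genetic algorithm: keep the fittest half, refill with mutants."""
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]           # selection (elitism)
        children = [mutate(random.choice(parents))
                    for _ in range(pop_size - len(parents))]  # variation
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

After a few dozen generations the population converges toward the all-ones string, illustrating how adaptation emerges from selection plus random variation rather than from any explicit plan.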

Research in this field is particularly complex, as it involves theoretical cognitive science, neural networks, evolutionary computation, neuroscience, engineering and, of course, robotics. Humans excel in the adaptability and creativity that AI currently lacks; work to close this gap is under way and seems aimed at replacing human presence through the optimal use of robotics engineering, which is disruptive on its face.

A swarm is a group of many separate parts (like birds, drones, or even people) that work together by following simple shared rules. Even though each part acts on its own, together they create a system that moves and responds in a smart and coordinated way. The result is something greater and more powerful than just the individual parts. This idea of swarming is not new: long ago, the military thinker Clausewitz wrote about it in his book On War. He was describing guerrilla fighters, small groups of soldiers who do not fight in the usual way, and said that each fighter in the group was like part of a storm cloud: you never knew when or where the lightning (an attack) would strike. That unpredictability and coordination are what make swarms powerful.

When it comes to decision-making, the most important idea behind swarming techniques is dispersion, meaning that many small parts (like robots or drones) are spread out over an area. There are usually far more robots in a swarm than there are human operators. Because it is not practical for one human to control each robot (especially in military or strategic situations), most of the decisions are made by the machines themselves, not by humans; humans only step in when something really important or unusual happens. This approach is similar to something we have seen before in the military: a human commander gives general orders, and the machines then make the smaller, on-the-ground decisions themselves, based on those orders. Swarming techniques are seen as one of the clearest examples of how new technology might create situations where autonomous systems and lethal actions are closely linked. People often describe human involvement in such systems with terms like ‘in the loop’ (the human is actively making decisions), ‘on the loop’ (the human is supervising and can step in), and ‘out of the loop’ (the system is acting entirely on its own). These categories can be helpful descriptively, but they do not really explain how much control humans actually have when a swarm is making deadly decisions on its own. Even with such terms, they do not truly capture how independent and potentially dangerous swarming systems can become when making life-or-death decisions without clear human control.[8]
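The ‘simple shared rules’ described above can be sketched in a few lines of Python. The toy below is purely illustrative (the rule weights, positions and function names are invented for the example): each agent drifts toward the group’s centre (cohesion) while pushing away from neighbours that come too close (separation), and a tighter coordinated formation emerges with no central controller issuing per-agent commands.

```python
def swarm_step(positions, cohesion=0.05, separation=1.0, repulsion=0.1):
    """One update of a toy swarm: every agent applies the same two local rules."""
    n = len(positions)
    cx = sum(x for x, _ in positions) / n  # swarm centroid, x
    cy = sum(y for _, y in positions) / n  # swarm centroid, y
    updated = []
    for x, y in positions:
        # Rule 1: cohesion, drift toward the group centroid.
        vx, vy = (cx - x) * cohesion, (cy - y) * cohesion
        # Rule 2: separation, push away from neighbours that are too close.
        for ox, oy in positions:
            d2 = (x - ox) ** 2 + (y - oy) ** 2
            if 0 < d2 < separation ** 2:
                vx += (x - ox) * repulsion
                vy += (y - oy) * repulsion
        updated.append((x + vx, y + vy))
    return updated

# Four widely scattered agents drift into a tighter formation over 20 steps.
agents = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]
spread_before = max(abs(x - 5) + abs(y - 5) for x, y in agents)
for _ in range(20):
    agents = swarm_step(agents)
spread_after = max(abs(x - 5) + abs(y - 5) for x, y in agents)
```

In a real swarm each agent would run these rules locally on its own sensor data; a single loop simulates all agents here only for simplicity. The point of the sketch is the one made in the text: coordinated group behaviour arises from individually trivial rules, with no human deciding each movement.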

Need for Human Oversight: Evaluating Existing Mechanisms and compliance measures

The Group of Governmental Experts (GGE) has focused on the compliance of LAWS with the Convention on Certain Conventional Weapons (CCW) and with IHL, referring to a normative and operational framework and bringing together legal, military and technological expertise. The High Contracting Parties, 128 in total, considered the nature of LAWS that can cause superfluous injury or unnecessary suffering or that are inherently indiscriminate, and asked what novel issues and questions arise specifically in the context of LAWS that should be considered in relation to IHL:

  1. Are there types of LAWS the use of which would be inherently problematic or prohibited under IHL?
  2. What functions or effects would render a LAWS to be of a nature to cause superfluous injury or unnecessary suffering, or to be inherently indiscriminate?
  3. Are there any gaps in existing IHL when it comes to ensuring that LAWS are used in accordance with the principles of distinction, proportionality, and precautions in attack?
  4. What measures are needed to prevent an accountability gap under IHL in the context of LAWS?[9]
  5. Is human control needed to ensure that LAWS use is predictable, explainable, and reliable?

The questions raised were discussed, and several measures were proposed concerning weapon reviews, human control, risk mitigation, responsibility and accountability, and limits related to the conduct of hostilities.

Weapon review- States are required to assess whether new weapons or methods of warfare comply with international law. Legal reviews help ensure that Lethal Autonomous Weapon Systems (LAWS) meet International Humanitarian Law (IHL) standards. States are encouraged to share best practices, and any new or significantly modified weapon must be reviewed for IHL compliance, especially in light of the Martens Clause.

Human control- Human control over LAWS is essential to ensure compliance with International Humanitarian Law (IHL), especially the principles of distinction, proportionality, and precautions in attack. These judgments must be made in good faith by human operators through a responsible command structure. Effective human-machine interaction should consider the operational context and the capabilities of the system to uphold IHL standards, ensuring that the weapons are not used in civilian environments.

Risk mitigation- States must ensure IHL compliance throughout the entire lifecycle of LAWS, supported by appropriate training for human operators. At all stages (design, development, testing, and deployment), States must assess and reduce risks such as civilian harm, unintended engagements, loss of control, and terrorist acquisition. Self-destruct, self-deactivation or self-neutralisation mechanisms should be incorporated into munitions or the system. States should also incorporate interdisciplinary input, including ethics reviews, and develop verifiability and certification procedures, sharing best practices when appropriate.

Responsibility and accountability- States bear responsibility under international law for any wrongful acts involving weapons systems, including LAWS. Machines cannot be held accountable; humans and States can. IHL obligations apply to States, individuals, and parties to a conflict, not to machines. Human accountability must be maintained across the entire lifecycle of LAWS, from development to deployment and use. A responsible chain of command and control is essential to ensure lawful decision-making. States must ensure individual responsibility for the use of LAWS in line with IHL. The human role in the use of force must remain central, including through clear human-machine interfaces. Training and investment in human resources are necessary to uphold legal and ethical standards. Ultimately, States are accountable for the deployment and consequences of any weapons system, including lethal outcomes.[10]

India’s stance on autonomous weapons  

Operation Sindoor was more than just a skirmish; it was a turning point in history, as two nuclear-armed competitors entered the era of autonomous warfare, where domination is algorithmic and deterrence is digital. As the smoke clears, one thing is certain: the next conflict will start not with a soldier charging but with the sound of drones flying silently through the air. An unprecedented era of drone-centric warfare in South Asia was heralded by Operation Sindoor, launched after the April 22 Pahalgam terror attack, in which Unmanned Aerial Systems (UAS) were instrumental in direct military confrontation between two nuclear-armed neighbours. Israeli Heron MK-II and domestically built TAPAS-BH-201/Rustom-II Medium-Altitude Long-Endurance (MALE) Intelligence, Surveillance and Reconnaissance (ISR) Unmanned Aerial Vehicles (UAVs) are suspected of having flown deep into Pakistani airspace in the 48 hours prior to Operation Sindoor to collect thermal signatures and electronic and signals intelligence from suspected Islamist terror camps.[11]

India’s military spending remains below that of nations like China, Russia, and the United States, despite defence and national security taking centre stage. According to the Stockholm International Peace Research Institute, China’s military spending has increased at the fastest rate over the last ten years: it rose by over 83%, whereas India’s rose by only 29%. The US, for its part, is still by far the biggest spender in the world, spending more than twice China’s military budget, even though it has cut its spending by roughly 17% over the last decade. The Parliamentary Standing Committee on Defence (2017–18) pointed out that, at 3.3% of GDP, even Pakistan spent more on defence than India. It might be more beneficial for India to use its available funds to create and apply laws in ways that are more appropriate for it. India really cannot afford to invest its resources in a legal rat race that it is unlikely to win anyway. However, in ways specific to its circumstances, it can rationalise and modify LAWS development to fit it. India’s frontiers are among the most militarised in the world, so border security can be better tailored. The entire length of India’s international land border is 15,106.7 km, of which 3,323 km are with Pakistan and 3,488 km are with China. The 1,643-kilometre Indo-Myanmar border and the 4,096.7-kilometre Indo-Bangladesh border are two other noteworthy land borders. With more than 2.5 million soldiers, India has the largest border security force in the world to protect these borders.[12]

Increasing automation, if not autonomy, is already a priority for India. The Comprehensive Integrated Border Management System (CIBMS), described as a modernisation of existing border management and not to be confused with autonomous weapons like the South Korean SGR-A1, involves the “deployment of a range of state-of-the-art surveillance technologies, including thermal imagers, infra-red and laser-based intruder alarms, aerostats for aerial surveillance, unattended ground sensors that can help detect intrusion bids, radars, sonar systems to secure riverine borders, fibre optic sensors, and a command and control system that shall receive data from all surveillance devices in real time.” “Building AI capabilities is essential for three main reasons: to preserve strategic autonomy in a time when military advantage is determined by technological superiority; to contribute significantly to the development of the international regulatory framework; and, lastly, to equip our forces for future conflicts involving such systems,” stated Samir Kumar Sinha, Director-General for Acquisition and Additional Secretary in the Defence Ministry.

Conclusion

Autonomous weaponry, whether lethal or not, is inevitable. A positive development is that States are already seriously considering these issues and expressing their views. Although autonomous weaponry is already in use, it is employed by only a handful of States, and even these have not advanced irrevocably down the autonomy line. Hence, the law may not be too far behind technology in this field, and the legal principles that evolve from these ongoing discussions can help both inform and direct the growth of autonomous weaponry. That is the ultimate goal of the inter-governmental discussions on the subject. The debate shows two sides: one raises the question of morality, asking whether robots can ever have decision-making ability comparable to a human’s; the other asks whether robots can commit crimes such as rape, and the debate is still ongoing. As for India’s stance, its active participation in the Group of Governmental Experts is heartening, but any claim that India plays a major leading role would, for the foreseeable future, be exaggerated. India already lags behind the United States and China in defence expenditure and, more importantly, in defence modernisation.


[1] Diego Mauri, “Lethal Autonomous Weapon Systems in International Humanitarian Law and Human Rights Law” (2019).

[2] International Committee of the Red Cross, Neil Davison, “A Legal Perspective: Autonomous Weapon Systems under International Humanitarian Law”, UNODA Occasional Papers No. 30.

[3] Autonomous Weapons, “What are autonomous weapon systems?” https://autonomousweapons.org/ last accessed 18 August 2025.

[4] Christian Eede, “Spotify’s Daniel Ek leads €600 million investment in AI military defence company” (20 June 2025) https://djmag.com/news/spotifys-daniel-ek-leads-eu600-million-investment-ai-military-defence-company last accessed 18 August 2025.

[5] Protocol Additional to the Geneva Conventions of 12 August 1949, and relating to the Protection of Victims of International Armed Conflicts (Protocol I) (adopted 8 June 1977, entered into force 7 December 1978) 1125 UNTS 3, art 36.

[6] ICRC (n 2) A Legal Perspective, UNODA.

[7] Diego Mauri, (n1) lethal autonomous weapon system.

[8] Ibid

[9] Convention on Certain Conventional Weapons, Measures Needed to Ensure Compliance with International Humanitarian Law and Additional Protocols (Geneva, August 2024).

[10] Ibid

[11] Rahul Bedi, “Autonomous warfare in Operation Sindoor”, The Hindu (India, 30 May 2025).

[12] Sarvjeet Singh and Sarnghan Aravindakshan, “Killer Robots or Soldiers of the Future: Legal Issues and India’s Role in the Lethal Autonomous Weapons Debate” (2020) vol 16 IJLT < https://repository.nls.ac.in/cgi/viewcontent.cgi?article=1031&context=ijlt > accessed last 18th August, 2025.
