As United Nations talks recently failed to reach a consensus on negotiating a treaty to govern the use of Lethal Autonomous Weapons (LAWs), or ‘Killer Robots’, the politics of banning or regulating LAWs has sparked a global debate. LAWs are weapon systems that make their own decisions about whom to kill. With the rise of Artificial Intelligence (AI), the nature and outcome of warfare have changed drastically, as shown by the use of drones in the conflict between Azerbaijan and Armenia, and by government-backed forces in Libya against militia fighters in 2021. LAWs may enable efficient warfare with minimal human damage on the victor’s side, but they could spell disaster for the vanquished.
The United Nations Convention on Certain Conventional Weapons (CCW) has lately emerged as the key forum for nations to discuss the establishment of norms guiding the behaviour of LAWs; within it, the Group of Governmental Experts (GGE) primarily deliberates on such matters. The question of human control over killer robots has drawn the most attention from all quarters, but states differ on whether to pursue a complete ban or to improve the existing regimes of control.
A careful analysis of the positions taken by different countries in the CCW talks over the last few years (except the latest round), drawn from a report prepared by Human Rights Watch, reveals an interesting pattern. The first category of states includes the US, the UK, Russia, Israel and others, which are developing advanced systems and hence do not seek a legally binding treaty banning them. The second category includes countries such as Spain, Sweden, Kuwait and Portugal, which speak of human control and their own unwillingness to possess these lethal systems but do not support a ban.
Romania, Thailand, Tunisia and others have not elaborated their views on a ban, while France and Germany seek a political declaration that is not legally binding. China has called only for banning the use of LAWs, not their development and production. India has taken a distinct position while continuing to develop such systems. New Delhi argues that steps to address this issue should not widen the technology gap among nations, and that the question of human control should not legitimise the proliferation of LAWs. Indian Defence Minister Rajnath Singh spoke in 2019 in favour of adequate human control over LAWs in terms of final decision-making. However, in the latest round of negotiations this month, according to reports in the international media, while 60 countries argued in favour of a ban on killer robots, India, Russia, the UK, Israel and the US were among the prominent countries that stood against any protocol to regulate them.
The pace of technological development related to killer robots is faster than the pace of negotiations, the spectrum of 'autonomy' is wide, the global arms control regime has weakened, and private sector players are excluded from most negotiations. This makes it imperative for stakeholders to push for confidence-building measures on LAWs among nations, backed by Track 2 or Track 1.5 dialogues, or some form of political declaration affirming human control over killer robots, rather than a complete ban.
The CCW negotiations have never trained their gaze on a particular weapon system, given that 'fully' autonomous weapons are not yet in widespread use and the definition of autonomy remains unclear. Civil society organisations and NGOs need to work to raise public awareness across countries, as they did for the Ottawa Convention on landmines during the 1990s. The Ottawa Convention also shows how stigmatisation and the construction of normative barriers are useful tools of arms control, whereby the humanitarian agenda prevails over a military framework. Moreover, a single country, such as New Zealand, or a set of countries committed to the regulation of LAWs, could take the negotiation process outside the CCW, the way Canada did for the ban on landmines and Norway did for the prohibition of cluster munitions. Killer robots are still not prominently used in warfare, which makes it difficult to normalise disapprobation against them.
If the great powers stay away from any commitment, other countries will not hesitate to explore all possibilities, which effectively means rapid vertical and horizontal proliferation of LAWs. Future negotiations should focus on export control regimes for LAWs, wherever possible.
India's strategic choices regarding LAWs are limited, given the way Pakistan and China are collaborating on the use of unmanned aerial vehicles against India, recently exemplified by the dropping of explosives via drones in Punjab. Counter-insurgency operations in India could also face significant challenges because of the leverage killer robots may give non-state actors in asymmetric warfare. India demonstrated offensive drone technology for the first time during the Army Day Parade in 2021. The Defence Research and Development Organisation (DRDO) has already rolled out ‘Muntra’, India’s first unmanned remotely operated tank, which comes in three variants. At the same time, India chaired the GGE on regulating killer robots in 2017-18.
The pragmatic approach for India would be to continue voicing qualified support for an international regime to regulate LAWs while not giving up on developing and exporting killer robot technology.
No negotiation on LAWs will be possible until states arrive at a reflective equilibrium on the ethical issues surrounding the management of artificial intelligence at large, and its future.
(Subhrangshu is an M.Phil. research scholar at the Centre for International Politics, Organization and Disarmament, Jawaharlal Nehru University. He tweets @subhrangshusp. This is an opinion piece and the views expressed are the author’s own. The Quint neither endorses nor is responsible for them.)