
Need for new international guidelines
In a plea for global action, eminent humanitarian figures have rallied nations to initiate negotiations for a new legally binding framework that prohibits autonomous weapon systems from targeting humans.


The deadline for these critical regulations is set for 2026, prompting Member States to take immediate and decisive measures to safeguard the future of humanity.


The Secretary-General of the United Nations, António Guterres, and the President of the International Committee of the Red Cross (ICRC), Mirjana Spoljaric, have jointly issued this appeal at the United Nations headquarters.

They underscore the imperative need for new international guidelines governing autonomous weapon systems, aiming to protect humanity collectively.


“We call on world leaders to launch negotiations of a new legally binding instrument to set clear prohibitions and restrictions on autonomous weapon systems and to conclude such negotiations by 2026.”


Autonomous weapon systems, characterised by their ability to independently select targets and employ force without human intervention, raise profound concerns across the humanitarian, legal, ethical, and security domains.


Over a decade of deliberations within the United Nations, spanning the Human Rights Council, the Convention on Certain Conventional Weapons, and the General Assembly, has laid the groundwork for prohibitions and restrictions.


With a rising number of reports concerning the deployment of autonomous weapons in conflict zones, devoid of human control, the international humanitarian community is growing increasingly concerned about the potential consequences.

The preservation of human control over the use of force is emphasised as an ethical red line that must not be crossed. The autonomous targeting of humans by machines is deemed unacceptable under international law.

The responsibility now falls on States to collaboratively build upon this foundation and formulate new regulations tailored to address the tangible threats posed by these advanced weapon technologies.

The proliferation of sophisticated emerging technologies, such as robotics and artificial intelligence, exacerbates these concerns. Even the scientists and industry leaders responsible for these technological advancements express alarm.

Effectively harnessing these technologies for the betterment of humanity necessitates addressing the most urgent risks and averting irreversible consequences.

This involves the unequivocal prohibition of autonomous weapon systems that operate unpredictably, particularly those controlled by machine learning algorithms.

Furthermore, comprehensive restrictions are deemed essential for all categories of autonomous weapons to ensure compliance with international law and ethical standards.

These restrictions encompass deployment locations, timing, target selection, the scale of force applied, and the facilitation of robust human oversight, prompt intervention, and deactivation.

The new international regulations concerning autonomous weapons are an imperative step forward to clarify and reinforce existing laws. These regulations serve as a preventive measure, offering protection to potential victims and preventing catastrophic repercussions for humanity as a whole.

Several weapons have been banned or restricted in the past. For example, weapons with non-detectable fragments, landmines, booby traps, incendiary weapons, and blinding laser weapons are restricted by the Convention on Certain Conventional Weapons, adopted in 1980 and supplemented by five protocols.

Explosive remnants of war are regulated by Protocol V of the Convention on Certain Conventional Weapons, adopted in 2003.

Since 2018, the United Nations Secretary-General has maintained that lethal autonomous weapon systems are politically unacceptable and morally repugnant, and has called for their prohibition under international law.

In his 2023 New Agenda for Peace, the Secretary-General reiterated this call, recommending that States conclude, by 2026, a legally binding instrument to prohibit lethal autonomous weapon systems that function without human control or oversight, and which cannot be used in compliance with international humanitarian law, and to regulate all other types of autonomous weapons systems.