Technology

At the UN, calls to restrict autonomous killer drones; Russia and China are opposed - NY Times

Rapid progress in the field of artificial intelligence and the intensive use of drones in the wars in Ukraine and the Middle East have made this problem even more urgent. The United States, China, Russia and several other countries are seeking to develop and deploy autonomous AI-powered drones as quickly as possible, weapons that would decide on their own which targets to kill, despite objections from some states and experts.

The New York Times reports this in its article. The prospect of combat robots operating without human involvement so worries the governments of many countries that they are proposing legally binding rules for the use of such equipment. "It is really one of the most important turning points for humanity," Alexander Kuht, Austria's chief representative in the negotiations, said in an interview.

Although the UN provides a platform for discussing the problem, the process is unlikely to lead to significant new legally binding restrictions. The United States, Russia, Australia, Israel and others argue that no new international law is needed at this time, while China wants to define any legal restrictions so narrowly that they would have little practical effect, arms control advocates say, the publication writes.

"We do not see that now a good time," said Konstantin Vorontsov, Deputy Head of the Russian Delegation to the UN, diplomats who have recently gathered in the UN Headquarters in New York.

Members of the American delegation, which includes a Pentagon representative, argue that instead of creating new international law, the UN should clarify that existing international human rights law already prohibits countries from using weapons that target civilians or cause them disproportionate harm.

But the position taken by the great powers has only heightened tensions among smaller countries, which say they are concerned that lethal autonomous weapons could become commonplace on the battlefield before any agreement on rules for their use is reached.

So far, drones generally rely on human operators to carry out lethal missions, but software is being developed that will soon allow them to find and select targets on their own. Critics say such weapons can behave unpredictably and are likely to make mistakes in identifying targets, much as self-driving cars have been involved in accidental crashes.

The new weapons may also make the use of deadly force in war more likely, since a military deploying them would not be putting its own soldiers at immediate risk, and they could lead to faster escalation, opponents say. Last week, the Geneva-based committee agreed, at the insistence of Russia and other major powers, to give itself until the end of 2025 to continue studying the topic, said a diplomat who participated in the debate. "If we wait too long, we will really regret it," said Mr. Kuht.