The first international conference on the responsible use of artificial intelligence (AI) in the military — focused in particular on keeping humans involved in life-and-death decisions — will take place next week in the Netherlands, AFP reports.


The conference will gather about 50 countries in The Hague, including the United States and China. Russia was not invited because of its invasion of Ukraine.

“We really see this as a defining moment for the future of artificial intelligence in the military,” Dutch Foreign Minister Wopke Hoekstra said Thursday.

“In a life-and-death field, you want to make sure that humans, regardless of the flaws in our DNA, are part of the decision-making process,” Hoekstra explained.

From a military perspective, artificial intelligence is already being used for intelligence, surveillance and situational analysis.

Although one of the conference sessions is titled “Regulating Killer Robots,” the prospect of fully autonomous killing machines is still a long way off.

But it may soon be possible to allow artificial intelligence to select targets autonomously — including via swarms of drones — and to use artificial intelligence in nuclear command and control systems.

The conference aims to take the first step toward international rules on “what is acceptable and what is not acceptable” regarding the military use of artificial intelligence, Hoekstra said.

“We are already seeing how artificial intelligence is being used in Russia’s war against Ukraine,” he said.

The minister compared the discussion to the debate surrounding chatbots such as ChatGPT, which have benefits but have also been used by students to cheat.

“This is not something that should scare us,” he said.

According to Dutch officials, China was invited to the conference as a key player in the technology and artificial intelligence sector.

High-ranking ministers and diplomats will attend the summit, called REAIM (Responsible AI in the Military Domain), along with companies and experts.