How AI is heading towards its ‘Oppenheimer moment’ – and why humans must act

Regulators have warned that AI is facing its ‘Oppenheimer moment’ and are urging people to take action before it’s too late.

The statements were made Monday at a conference in Vienna, invoking J. Robert Oppenheimer, who helped develop the atomic bomb in 1945 before advocating for controls on the spread of nuclear weapons.

The event brought together civilian, military and technology officials from more than 100 countries to discuss control over militarized AI.

The US Pentagon has invested millions of dollars in AI startups, the European Union is working on a database to evaluate battlefield targets and the Israeli army has an algorithm that draws up a ‘kill list’.

“This is the Oppenheimer moment of our generation,” said Austrian Foreign Minister Alexander Schallenberg, whose government hosted the two-day conference on limiting AI in war zones. “Now is the time to agree on international rules and standards.”

Above, the Thermonator, an Ohio-based company's flame-throwing robot dog worth $9,420

At this week’s conference, a former AI investor at Google’s parent company warned: “Silicon Valley’s incentives may not be in line with the rest of humanity.”

AI was designed to improve people’s lives, freeing them from mundane tasks to focus on the greater good, but it has since taken a turn that could destroy humanity if left unregulated.

During his opening statement, Schallenberg described AI as the most significant advance in warfare since the invention of gunpowder more than a millennium ago.

The only difference, he continued, is that AI is even more dangerous.

“In any case, let’s ensure that the most profound and far-reaching decision – who lives and who dies – remains in the hands of people and not machines,” Schallenberg said.

The Austrian minister argued that the world needs such rules to ‘ensure human control’, given the disturbing trend of military AI software replacing humans in the decision-making process.

The statements come just weeks after it emerged that the Israeli army is using an AI system to fill its ‘kill list’ of suspected Hamas terrorists, leading to the deaths of women and children.

A report from +972 magazine quoted six Israeli intelligence officers, who admitted to using an AI called “Lavender” to classify as many as 37,000 Palestinians as suspected militants – marking these people and their homes as acceptable targets for airstrikes.

Civilian, military and technological leaders from more than 100 countries gathered in Vienna (above) on Monday in an effort to prevent, as physicist Anthony Aguirre put it, “the future of slaughterbots.”

Above, an American Reaper drone

Lavender was trained using data from Israeli intelligence’s decades-long surveillance of the Palestinian population, using the digital footprints of known militants as a model for what signal to look for in the noise, the report said.

Similar technology has also been added to drones used in the war in Ukraine, helping them identify targets and release munitions without human guidance.

Austria’s top disarmament official, Alexander Kmentt, who led the organization of Monday’s conference, advised that traditional “arms control treaties” would not work for software like AI.

“We are not talking about a single weapon system, but about a combination of dual-use technologies,” Kmentt said. “A classic approach to arms control doesn’t work.”

Kmentt argued that existing legal instruments, such as export controls and humanitarian law, would offer a better and faster response to a crisis that is already underway than waiting for a new ‘magnum opus’ treaty to be drafted.

Costa Rica’s Foreign Minister Arnoldo André Tinoco also expressed concern that AI-powered weapons of war will soon be deployed by terrorists and other non-state actors, a development that will require a new legal framework.

“The easy availability of autonomous weapons removes the constraints that ensured only a few could enter the arms race,” he said.

“Now students with a 3D printer and basic knowledge of programming can build drones capable of causing mass casualties. Autonomous weapon systems have forever changed the concept of international stability.”