Recent advances in artificial intelligence contribute to nuclear risk

Nuclear-armed states’ competition in military AI and premature adoption of AI in nuclear weapons and related capabilities could have a negative impact on strategic stability and increase the likelihood of nuclear weapon use.

Recent advances in artificial intelligence (AI) contribute to nuclear risk, according to a new report from the Stockholm International Peace Research Institute (SIPRI). The authors warn that nuclear-armed states’ competition in military AI and premature adoption of AI in nuclear weapons and related capabilities could have a negative impact on strategic stability and increase the likelihood of nuclear weapon use.

The report indicates that recent advances in AI, specifically machine learning and autonomy, could unlock new and varied possibilities in a wide array of nuclear weapons-related capabilities, ranging from early warning to command and control and weapon delivery.

Machine learning and autonomy are not new, but recent developments in these fields have enabled automated systems that can solve complex problems and tasks which previously required human cognition or intervention.

‘The key question is not if, but when, how and by whom recent advances in AI will be adopted for nuclear-related purposes,’ says Dr Vincent Boulanin, Senior Researcher, SIPRI and lead author of the report. ‘However, at this stage the answers to these questions can only be speculative. Nuclear-armed states have not been transparent about the current and future role of AI in their nuclear forces.’

Nonetheless, research shows that all nuclear-armed states have made the military pursuit of AI a priority, with many determined to be world leaders in the field. The report warns that this competition could negatively affect strategic relations, even before nuclear weapon–related applications are developed or deployed.

Premature adoption of military artificial intelligence could increase nuclear risk  

The authors argue that it would be imprudent for nuclear-armed states to rush their adoption of AI technology for military purposes in general and nuclear-related purposes in particular. Premature adoption of AI could increase the risk that nuclear weapons and related capabilities fail or are misused in ways that trigger an accidental or inadvertent escalation of a crisis or conflict to the nuclear level.

‘However, it is unlikely that AI technologies, which are enablers, will be the trigger for nuclear weapon use,’ says Dr Lora Saalman, Associate Senior Fellow on Armament and Disarmament, SIPRI. ‘Regional trends, geopolitical tensions and misinterpreted signalling must also be factored into understanding how AI technologies may contribute to escalation of a crisis to the nuclear level.’

The report recommends transparency and confidence-building measures on national AI developments as a way to mitigate such risks.

Challenges of artificial intelligence must be addressed in future nuclear risk reduction efforts

According to the report’s authors, the challenges of AI in the nuclear arena must be made a priority in future nuclear risk reduction discussions.

‘It is important that we do not overestimate the danger that AI poses to strategic stability and nuclear risk. However, we also must not underestimate the risk of doing nothing,’ says Dr Petr Topychkanov, Senior Researcher, Nuclear Disarmament, Arms Control and Non-proliferation Programme, SIPRI.

‘While the conversation on AI-related risks is still new and speculative, it is not too early for nuclear-armed states and the international security community to explore solutions to mitigate the risks that applying AI to nuclear weapon systems would pose to peace and stability,’ says Topychkanov.

The report proposes a number of measures for nuclear-armed states, such as collaborating on resolving fundamental AI safety and security problems, jointly exploring the use of AI for arms control and agreeing on concrete limits to the use of AI in nuclear forces.
