
Nuclear and New Technologies
In 2020, the European Leadership Network (ELN), in cooperation with its partners, set out to unpack technological complexity as it affects nuclear decision-making and to propose practical policy approaches for dealing with the related risks.
The challenge we want to address
Nuclear decision-making is complex. Disruptive technologies pose both risks and opportunities for nuclear decision-making, which need to be better explained, understood, gamed, and mitigated. The project focuses on the so far under-examined implications of the technological complexity that emerges when nuclear decision-making is affected by a plethora of new technologies, all evolving rapidly and simultaneously. Building on existing work that looks at the impact of individual technologies on nuclear policy, this project assesses the impact of these technologies in the aggregate, seeks to overcome the related risks, and explores the opportunities that technologies offer to mitigate those risks.
Leveraging the ELN’s deep expertise, convening power, and network of seasoned, high-level practitioners from multiple countries, and drawing on the strengths of the ELN’s partner organisations, we have embarked on a path to study, analyse, and describe the nuclear policy challenges of technological complexity, and to train and advise decision-makers on how to address them.
The project will develop, test-drive, propose, and promote practical policy approaches that governments might pursue to begin to responsibly regulate and steer the weaponisation of potentially disruptive technologies and their use in nuclear decision-making.
The objectives of this multi-year project are to reduce risk in nuclear decision-making, identify mitigation strategies and de-escalation solutions, and manage potential and unintended escalation. We also strive to engage younger-generation experts and raise their voices in the discussion.
To commence the work, the ELN, in partnership with the German Federal Foreign Office, organised and hosted a “Rethinking Arms Control” workshop in March 2021. This closed-door meeting brought together a diverse group of scholars, practitioners, former nuclear weapons decision-makers, and emerging leaders to ideate on and analyse the challenges, opportunities, and pitfalls of technological complexity. The summary of the proceedings and the major takeaways from the workshop are highlighted in the following report: New Technologies, Complexity, Nuclear Decision Making and Arms Control: Workshop Report, June 2021
How we want to achieve the goal
The project is built upon four strands which – like four legs of a stool – support the main goal. These are:
- Baselining Exercise
- Big Data Analysis of Emerging and Disruptive Technologies
- Methodologies to Deal with Multi-tech Complexities
- Mitigation Strategies & Arms Control
We begin by asking what the science (strand 1), practitioners (strand 2), and current policies and tools (strand 3) tell us about the impact of technological complexity on nuclear decision-making and the ways of dealing with it. We then craft policy approaches that governments might pursue (strands 3 and 4).
This comprehensive approach allows us to unpack technological complexity by harnessing some of the brightest minds from around the world, test policy approaches with people who “have been there and done it”, and use our networks to develop and promote solutions with current decision-makers.
Funding from the German Federal Foreign Office, the MacArthur Foundation, the Carnegie Corporation of New York, the Nuclear Threat Initiative, the Heinrich Böll Foundation and in-kind contributions from project partners make this work possible.
Nuclear and New Technology Publications

Existential threats beyond the bomb: emerging disruptive technologies in the age of AI
To better understand emerging technologies, NEVER members Konrad, Anemone, Emil, Arthur, and Joel outline the evolution of the risk landscape around emerging disruptive technologies and draw parallels between the dangers posed by nuclear weapons and those posed by novel biotechnologies. They explore the broader challenge of governing emerging technologies and suggest potential ways forward.

French thinking on AI integration and interaction with nuclear command and control, force structure, and decision-making
Héloïse Fayet analyses the French literature on France’s perception of military AI, especially its consequences for strategic systems, strategic competition, and nuclear deterrence. Fayet offers practical recommendations for France, both domestically and internationally.

UK thinking on AI integration and interaction with nuclear command and control, force structure, and decision-making
Alice Saltini analyses the British literature on the UK’s perception of military and nuclear applications of AI and their impact on strategic stability and NC3. The paper offers recommendations for unilateral measures that the UK can take, as well as multilateral initiatives within the P5 framework, to address the risks associated with AI in nuclear decision-making.

Russian thinking on AI integration and interaction with nuclear command and control, force structure, and decision-making
Oleg Shakirov analyses Russian-language literature on the Russian debate on AI and the nuclear field and offers recommendations for P5 states to advance dialogue on AI integration into nuclear C2, force structure and decision-making.

Chinese thinking on AI integration and interaction with nuclear command and control, force structure, and decision-making
Fei Su and Jingdong Yuan analyse Chinese-language literature to present Chinese perspectives on AI and its military applications. The paper offers recommendations to mitigate the risks associated with the military use of AI in nuclear C2 systems, particularly focusing on the steps that China could consider to enhance its practices.

AI and nuclear command, control and communications: P5 perspectives
The nuclear-weapon states China, France, Russia, the United Kingdom, and the United States are increasingly recognising the implications of integrating AI into nuclear weapons command, control, and communication systems. Exploring the risks inherent to today’s advanced AI systems, this report sheds light on characteristics and risks across different branches of this technology and establishes the basis for a general purpose risk assessment framework.
Project partners
- European Leadership Network (ELN)
- Federal Foreign Office
- The Arms Control Association (ACA)
- The Council on Strategic Risks (CSR)
- The Center for Global Security Research (CGSR) at the Lawrence Livermore National Laboratory (LLNL)
- The Oracle Partnership
- Professor Andrew Futter, University of Leicester
- The British American Security Information Council (BASIC)
- The Heinrich-Böll-Stiftung (HBS)
- The Younger Generation Leaders Network on Euro-Atlantic Security (YGLN)
- Dr Vladimir Kozin (Analytical Agency “Strategic Stability”)