
23 September 2025

War over AI, not with AI: experts warn of emerging security risks

The Digital Futures of Security event brought together experts to explore how artificial intelligence and cybersecurity are reshaping the concept of security in the digital age.

Upper row, left to right: Professor Kate Devlin, Tony Reeves. Lower row: Professor Luca Viganò, Professor Claudia Aradau. Screenshot of the event recording

Chaired by Professor Kate Devlin, Chair-Director of the Digital Futures Institute, the event featured insights from Tony Reeves, Partner and Lead Partner in Digital Defence and AI Strategy at Deloitte, Claudia Aradau, Professor of International Politics at King’s, and Luca Viganò, Professor of Computer Science and Head of the Cybersecurity Group at King’s.

Has security changed?

‘Security is pretty much the same in intent. Security is still that ability for an individual or society to make their own decisions and actions without undue influence or pressure from others,’ says Tony Reeves. The digital landscape, however, has made threats harder to identify.

‘Security has not changed, but the world has,’ says Professor Luca Viganò. ‘30 years ago, cybersecurity was already a very big issue, but it pertained only to a small subset of people: governments, the military and those who had assets they needed to protect online. Nowadays, every single one of us has assets online because our lives are online. The opportunity to attack humans rather than machines or institutions is completely new with respect to 30 years ago.’

Professor Claudia Aradau warned that in the digital age, the vast amounts of data we generate can be repurposed, even if it was never intended for security use: ‘A lot of our language of understanding and approaching social and political problems becomes one of security. This is also reinforced in the digital age through huge amounts of data we produce all the time. I argue that all this data is potentially security data. This way, any private company working with data and using these digital technologies becomes also a security company.’

Is the UK prepared for a large-scale cyberattack?

According to Professor Luca Viganò, the UK is better prepared than other countries to fend off a potential cyberattack, but not fully ready.

One hundred per cent cybersecurity does not exist. There are mathematical reasons for that. We could still get a very high degree of cybersecurity, but it is expensive and very difficult to achieve because of the need for constant monitoring. The UK is better prepared than other countries thanks to the huge investments by the Government Communications Headquarters, National Cyber Security Centre, research councils, the government and different organisations.

Professor Luca Viganò, Professor of Computer Science and Head of the Cybersecurity Group at King’s

According to Professor Viganò, the UK’s main vulnerability – and one that is difficult and expensive to solve – lies in the cohabitation of legacy systems and new technologies.

War over AI, not with AI

According to Tony Reeves, Western militaries have spent the last decade preparing for the AI age to make sure it can be adopted safely, responsibly and sensibly. AI is increasingly used in defence: at a tactical level, it is built into munitions, drones, weapon systems and rifle sights.

‘The one thing I do worry about is not necessarily a war with AI, but a war over AI. We're already seeing AI being deployed effectively as a strategic weapon. Both President Biden and President Trump have put restrictions on GPU sales – the hardware that drives our AI capabilities. There are restrictions and sanctions on rare earths. There will probably be more disputes about power and water to drive AI.’

One of the big causes of conflict is often one country feeling they're falling so far behind another country that a war seems the only option to try and catch up or to peg back their opponent. So, much as I worry about AI on the tactical field being misused, I do worry at a bigger level that we probably will have a war over strategic AI, rather than a war fought with AI.

Tony Reeves, Partner and Lead Partner in Digital Defence and AI Strategy at Deloitte

International society, says Tony Reeves, must reach a consensus and determine when use of AI in warfare constitutes a war crime, and a manual for AI deployments in conflict must be introduced: ‘If you look at drone devices being deployed in Ukraine, what happens when they're being misappropriated or used against civilian targets? That is a deep concern, and this is one area, AI in particular, where lack of consensus from international society is unacceptable.’

What’s next?

Looking ahead, the speakers emphasised the need for anticipatory governance, inclusive regulation and public engagement.

‘When we don’t have regulation, we have disadvantage, unfairness, and behaviours that we don’t want and we don’t encourage. It is a difficult place between encouraging innovation and being regulated, but my biggest concern is that if we rely just on governments to regulate, they take so long to do this,’ says Tony Reeves. ‘It takes too long to agree what we should regulate, and by the time we’ve agreed on that, the technology has moved on so far, it’s irrelevant. Unless we are, oddly, going to use AI to accelerate the regulation process, we’ll always play catch-up and be too late, and, therefore, not actually be relevant to the problem that’s facing society.’


Professor Claudia Aradau emphasised the role of workers in democratic oversight, framing strikes, whistleblowing and civil society campaigns as forms of democratic contestation.

These are democratic processes and forms of trying to institutionalise and bring change to shape the processes of prioritisation and de-prioritisation, what is valued and what becomes devalued. Often AI is introduced and promoted as leading to automated speedier decisions. But all these democratic processes, if you think of campaigns, mobilisations or whistleblowing, are quite slow. The forms of deliberation and contestation take time.

Professor Claudia Aradau, Professor of International Politics

‘We need to change the narrative,’ says Professor Luca Viganò. ‘We need to change the language. We need to make sure that people understand the risks, but also that people can or need to be part of the solution. Companies, the government, the military and all the different institutions all need to be part of the solution, otherwise we'll never reach an acceptable degree of cybersecurity.’

In this story

Kate Devlin, Professor of Artificial Intelligence & Society

Claudia Aradau, Professor of International Politics

Luca Viganò, Vice-Dean (Enterprise and Engagement) and Head of the Cybersecurity Group