2026

State of AI in Security & Development

Our new report captures the voices of 450 security leaders (CISOs or equivalent), developers, and AppSec engineers across Europe and the US. Together, they reveal how AI in cybersecurity and software development is already breaking things, how tool sprawl is making security worse, and how developer experience is directly tied to incident rates. This is where speed and safety collide in 2026.

Key Findings

01

AI Adoption

1 in 5 organizations suffered a serious incident linked to AI code

“Whilst organisations battle to leverage the benefits of AI, there is often a hidden associated battle going on, and that’s to keep their organisation safe from an increase in cyber risk that AI presents. The role of the CISO is to ensure the security posture scales as quickly as the technology does.”

Christelle Heikkilä
Former CIO/CISO, Arsenal FC

53% blame the security team for incidents linked to AI code

Q: If a vulnerability introduced by GenAI code later caused a security incident, who would ultimately be held accountable in your organization? Select all that apply.

“There's clearly a lack of clarity among respondents over where accountability should sit for good risk management”

Andy Boura
CISO, Rothesay

02

Developer Experience

15% of engineering time is lost to triaging alerts

That's $20m per year for a 1000-dev organization
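
As a rough sanity check, the figure works out as a sketch under one assumption: the ~$133k fully loaded annual cost per developer below is our illustrative estimate, not a number from the survey.

```python
# Back-of-envelope check of the "$20m per year for 1,000 developers" figure.
# The fully loaded cost per developer is an assumption for illustration only.
TRIAGE_SHARE = 0.15        # 15% of engineering time lost to triaging alerts
DEVELOPERS = 1_000         # size of the example organization
COST_PER_DEV = 133_000     # assumed fully loaded annual cost per developer (USD)

annual_triage_cost = TRIAGE_SHARE * DEVELOPERS * COST_PER_DEV
print(f"Estimated annual cost of triage: ${annual_triage_cost:,.0f}")
# -> Estimated annual cost of triage: $19,950,000 (roughly $20m)
```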

[Chart: Estimated annual cost of developer time spent on triaging ($720k, $3.6m, $14.4m), split between cost wasted on false positives and the remaining cost of triage]

2/3 of respondents bypass security, dismiss findings or delay fixes

[Chart: top responses, cited by 44%, 37%, 35%, 34%, 32%, and 22% of respondents]

Q: How has dealing with false positives from security tools affected your development practices? Select all that apply.

“When security tools overwhelm developers with noise, they drift into risky workarounds. We aim to restore balance by removing false positives, strengthening guardrails, and improving Developer Experience, so teams can focus on what truly matters”

Darshit Pandya
Senior Principal Engineer - Platform, Serko

Tools built for both dev and security teams saw far fewer incidents

Teams using tools designed for both developers and security teams were more than twice as likely to report zero incidents compared to those using tools made for only one group.

“Giving developers the right security tool that works with existing tools and workflows, allows teams to implement security best practices and improve their posture”

Walid Mahmoud
DevSecOps Lead, UK Cabinet Office

03

Security Reality

Teams using separate AppSec and CloudSec tools are 50% more likely to face incidents

"It's clear that we need to combine our AppSec and Cloud Security programs into a single product security team. For companies where infrastructure is defined as code, cloud security fundamentally is code security, and it drives better results."

James Berthoty
Founder, Latio Tech

04

The Future of AI

96% believe AI will write secure, reliable code

Within 1-2 years: 20%
3-5 years: 44%
6-10 years: 24%
10+ years: 8%
Never: 4%
But only 21% think it will do so without human oversight

9 in 10 organizations expect AI to take over penetration testing, with an average timeline of 5.5 years.


If you work in software security, you need to read this.