Information Safety

Improving technology through lessons from safety.

Interested in applying lessons from safety to security? Learn more at security-differently.com!

SIRAcon 2026

SIRAcon 2026 wrapped up two weeks ago, and as always, it was worth the trip! This year included a bonus: the conference started the day after Patriots’ Day, which gave me the opportunity to fly out a day early and watch the Boston Marathon. It was great fun and quite the event!

The talks were all recorded and are available to attendees on Cvent for a few weeks, and in the SIRA Members Area after that.

Highlights

The talks this year that stood out to me included:

  • Tony Martin-Vegue’s keynote, which (finally) named two different types of risk I’ve been thinking about for some time: aleatory uncertainty (randomness) and epistemic uncertainty (lack of knowledge). This distinction helps us understand how to gain leverage over each type: statistics and math for aleatory uncertainty, and research and evidence for epistemic uncertainty.
  • Stephen Shaffer previewed the Exploit Vector Incident Loss (EVIL) Model, designed to inform vulnerability investment decisions (do I patch a higher-EPSS CVE on 50 servers, update signatures across 24,500 workstations, or patch a low-EPSS CVE across those 24,500 workstations?)
  • Josh Marker gave a fun talk on dimensional analysis - something I haven’t done since high school, yet he convinced me it’s still useful!
  • Jim Lipkis spoke about measuring the risk of rare events using the Cost of Capital - I found this quite interesting, and am planning to revisit my own analysis of the value of cybersecurity risk reduction using Jim’s approach.
  • Finally, one of the students, Chelsea Conard, presented her work on the Cyber Incident Severity Score (CISS), designed to help governments prioritize critical infrastructure security incidents, which takes into account not just financial impact but also individual and operational impact. It’s good to see other researchers looking into the larger social impact of cyber incidents!
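As an aside, the aleatory/epistemic split lends itself to a small simulation. The sketch below is my own illustration, not from the keynote; the die, the coin, its bias, and the Beta prior are all hypothetical choices. It contrasts a fair die, whose per-roll randomness is irreducible no matter how much data we collect, with a coin of unknown bias, where evidence steadily shrinks our uncertainty.

```python
import random
import statistics

random.seed(42)

# Aleatory uncertainty: irreducible randomness. Even with a perfectly
# known fair die, each roll stays unpredictable; more rolls sharpen our
# estimate of the long-run average but never of the next roll.
rolls = [random.randint(1, 6) for _ in range(10_000)]
mean_roll = statistics.mean(rolls)  # converges toward 3.5

# Epistemic uncertainty: lack of knowledge. The coin's true bias is
# fixed but unknown to the analyst; observed flips are evidence.
true_bias = 0.7  # hypothetical value, "unknown" to the analyst
flips = [random.random() < true_bias for _ in range(1_000)]

# A Beta(1, 1) prior updated with the observed heads/tails gives a
# posterior whose spread falls as evidence accumulates - the lever here
# is research and evidence, not more math about the randomness itself.
heads = sum(flips)
tails = len(flips) - heads
a, b = 1 + heads, 1 + tails
posterior_mean = a / (a + b)
posterior_sd = ((a * b) / ((a + b) ** 2 * (a + b + 1))) ** 0.5

print(f"mean roll ~ {mean_roll:.2f}, "
      f"estimated bias ~ {posterior_mean:.2f} +/- {posterior_sd:.3f}")
```

Running this, the die's mean settles near 3.5 while any single roll stays a coin-toss-like unknown; the coin's posterior standard deviation, by contrast, keeps shrinking as flips accumulate.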

My own talk, What can we learn from cybersecurity warnings?, was well received. I had fun presenting both safety and cybersecurity warnings, and was happy to collect some additional examples from other attendees, which I used when I presented at Minnebar 20!

Slides

You can download handouts from my talk with full speaker’s notes and references; links to all of my work can be found at https://jbenninghoff.com.

Abstract

Security warnings are a form of risk communication intended to help users make better decisions and improve security performance. This talk covers examples of good and bad warnings, the factors that lead to better outcomes, and how those lessons can be applied in a broader risk practice.

Tapping Other Fields To Approach Security Differently

Earlier this month I joined Dustin Lehr on the Security Champions Podcast! I enjoyed our conversation, which covered adapting ideas from safety to security, empowering developers, influence, organizational change management, and more. It even included an old phrase I coined, that particularly resonated with Dustin:

“A security amateur knows how to secure things; a security professional knows when you don’t have to.”

Dustin has a full writeup on the Security Journey blog, and you can watch a video version of the podcast on YouTube or listen to it on your favorite podcast app!

Description

John joins the podcast to explore what it means to treat security like other mature safety disciplines. Drawing on safety science, economics, and hands-on AppSec experience, he shares a practical perspective on security as decision support and how empowering developers with the right time and tools leads to stronger security outcomes.

SIRAcon 2025

Last week I made my annual trip to SIRAcon 2025, which was once again held at the Boston Federal Reserve! I had a great time attending the talks and making time to speak with the other attendees, both old friends and new members. If you registered for the conference, either in person or virtually, you can watch all of the talks on Cvent for a few weeks, and after that in the SIRA Members Area (the agenda is publicly available).

The highlights for me included Graeme Keith’s keynote session on a simple approach to quantitative enterprise risk management at scale (a drop-in replacement for heat maps), Tony Martin-Vegue’s talk on LLMs, the Marsh McLennan talks on quantifying cyber risk and security control effectiveness, and the student competition winners, Isaac Teuscher and Philip Akekudaga.

I was quite happy that my own talk, Insecure at any speed: why Secure by Design is not enough, generated good questions from the audience as well as thoughtful and insightful follow-up conversations.

I left with some key insights, both from writing and giving the talk and from the attendees:

  • The Payment Card Industry Data Security Standard (PCI-DSS) has been effective at reducing credit-card related security incidents
  • The insurance industry is doing good work that is starting to identify what works in cybersecurity
  • The IEEE includes security in the Software Engineering Body of Knowledge (SWEBOK v4)

But my biggest insight was that we can improve third-party risk management (TPRM) by replacing long questionnaires, which don’t work, with a simple question about insurance coverage: does the partner have cyber insurance commensurate with their risk? This would shift TPRM to a trusted intermediary - the insurer - that is in a much better position to evaluate and assess security posture in a standardized way. If you’re already trying this, please reach out - I’d love to hear how it’s working!

Slides

You can download handouts with full speaker’s notes and references, and find additional links at the QR Code I shared at the end of my talk.

Description

As a society, should we mandate secure software? CISA’s Secure by Design program calls for voluntary implementation of critical security controls, but safety research, analysis of manager incentives, and the history of auto safety tell us this will not be enough.

In May 2024, the Cybersecurity and Infrastructure Security Agency (CISA) launched the Secure by Design pledge, inspired in part by the 1965 book “Unsafe at Any Speed”. There are remarkable parallels between automotive safety in the early 1960s and cybersecurity today, including a lack of systematic data collection, and customers who are forced to take responsibility for making security go right - and the blame when it goes wrong.

While Secure by Design is a good start, the book and market incentives show that the pledge does not go far enough. Software companies are unlikely to make sufficient investments in security, much like auto manufacturers did with safety in the 1960s. Like pollution, security failures impose costs on society that are not paid by the producer. I present a call to action to address the investment gap, as well as a list of additional practices needed to improve the security of software.

Research shows that safety is not good for business, and my own analysis explains why executives under-invest in cybersecurity. The auto safety movement of the 1960s shows what’s needed to secure our software-based systems.
