I spoke for the second time at Minnebar last Saturday (May 2)! Although it was only my third time attending, this year was special: it was the 20th Minnebar! It’s a wonderful community, and after starting the day by giving my own talk, I attended two sessions on Operation Metro Surge, both of which contained lessons that apply beyond how Minneapolis responded to the ICE operation; a fun Matrix-themed talk on passkeys and the current challenges with using them; a retrospective on tech companies in the Twin Cities; and a couple of high-quality DevOps-related talks.
Beyond the sessions, I also had great conversations at lunch and in the hallway track. I’m looking forward to next year! Details of the talk I gave (along with slides) are included below!
Description
Let’s talk about cybersecurity warnings! Security warnings - and many other computer warnings - are terrible, and there are so many examples… but why is that, and does it really have to be that way? I’m lucky to have worked on a project to explain what makes a good cybersecurity warning for product designers, security professionals, and lawyers, and I can tell you that it doesn’t!
Come join us in mocking a gallery of bad security and not-security warnings, in screenshots and emoji. Along the way, we’ll cover the history of security and traditional product warnings and what we know about making good warnings, celebrate a (small) gallery of good ones, and discuss how we can all get better!
Slides
You can download handouts of my slides, which include my speaker’s notes and links to all references.
SIRAcon 2026 wrapped up 2 weeks ago, and as always, it was worth the trip! This year included a bonus - the conference started the day after Patriots’ Day, which gave me the opportunity to fly out a day early and watch the Boston Marathon, which was great fun and quite the event!
The talks were all recorded and are available to attendees on Cvent for a few weeks, and in the SIRA Members Area after that.
Highlights
The talks this year that stood out to me included:
Tony Martin-Vegue’s keynote, which (finally) named two different types of risk I’ve been thinking about for some time: aleatory uncertainty (randomness) and epistemic uncertainty (lack of knowledge). This distinction helps us understand how to gain leverage over each type: statistics and math for aleatory uncertainty, and research and evidence for epistemic uncertainty.
Stephen Shaffer previewed the Exploit Vector Incident Loss (EVIL) Model, which is used to inform vulnerability investment decisions (do I patch a higher-EPSS CVE on 50 servers, update signatures across 24,500 workstations, or patch a low-EPSS CVE across those 24,500 workstations?)
Josh Marker gave a fun talk on dimensional analysis - something I haven’t done since high school, yet he convinced me it’s still useful!
Jim Lipkis spoke about measuring the risk of rare events using the Cost of Capital - I found this quite interesting, and am planning to revisit my own analysis on the value of cybersecurity risk reduction using Jim’s approach.
Finally, one of the students, Chelsea Conard, presented her work on the Cyber Incident Severity Score (CISS), designed to help governments prioritize critical infrastructure security incidents, which takes into account not just financial impact but also individual and operational impact. It’s good to see other researchers looking into the larger social impact of cyber incidents!
You can download handouts from my talk with full speaker’s notes and references, and links to all of my work can be found at https://jbenninghoff.com.
Abstract
Security warnings are a form of risk communication intended to help users make better decisions and improve security performance. This talk covers examples of good and bad warnings, the factors that lead to better outcomes, and how those lessons can be applied in a broader risk practice.
Earlier this month I joined Dustin Lehr on the Security Champions Podcast! I enjoyed our conversation, which covered adapting ideas from safety to security, empowering developers, influence, organizational change management, and more. It even included an old phrase I coined, that particularly resonated with Dustin:
“A security amateur knows how to secure things; a security professional knows when you don’t have to.”
Dustin has a full writeup on the Security Journey blog, and you can watch a video version of the podcast on YouTube or listen to it on your favorite podcast app!
Description
John joins the podcast to explore what it means to treat security like other mature safety disciplines. Drawing on safety science, economics, and hands-on AppSec experience, he shares a practical perspective on security as decision support and how empowering developers with the right time and tools leads to stronger security outcomes.