You are driving fast, maybe too fast, on a highway at night. Maybe it’s snowing or raining, or your eyes glaze over with the fatigue of a long day, or maybe your phone rings and you glance down for a moment. Suddenly, the car in front of you stops and you hit the brakes. You feel your tires skid and for a second you’re sure you’ve crashed.
But then: Nothing.
You stopped just in time. With your heart racing, you exhale. You’re shaken but also impressed by your quick reflexes. You think to yourself: no harm done.
But harm was almost done. And that’s the problem.
Near misses like this often fade from our minds as quickly as they occur. But they are among the most valuable safety information we have. People, organizations and societies often fail to prevent disasters not because of a lack of warning, but because they do not take near misses seriously.
Safety scientist James Reason saw near misses as “immunizations” for a safety system: opportunities to detect and correct underlying vulnerabilities before real harm occurs. But too often we squander these opportunities. We get lucky and, instead of investigating or analyzing what went wrong, we move on.
My interest in near misses comes from the practice of medicine and from my research on the history of disasters and system failures, work that inspired my book, *Written in Blood*. The study of accidents across fields, from fires to transportation to healthcare, shows that warning signs are often visible long before a catastrophe occurs.
Luck is not a strategy
Take something as mundane as your phone. In late 2025, Apple released iOS 26.1, a routine software update. Except it wasn’t routine. It patched multiple critical vulnerabilities that could have allowed attackers to take control of iPhones. Had hackers succeeded, the data and privacy of millions of users could have been compromised. And although some phones had probably been hacked, for most people the crisis was averted.
In healthcare, near misses are common: a medication is almost given to the wrong patient but caught in time, or a surgical instrument is miscounted but found before the patient’s incision is closed. These are serious warning signs, but too often they go unreported. Most health workers do not report near misses, out of fear of blame, lack of feedback, or the false belief that no harm means no problem.
Often, healthcare personnel do not even realize that a near miss has occurred. If we don’t look for near misses, we are almost guaranteed not to learn from them.
Transport shows the same pattern. Cars nearly collide on icy roads. Trains brake just before passing a signal. Aircraft divert after onboard systems detect a mechanical failure mid-flight. In aviation and rail, these close calls are treated as data. In many other sectors, they are dismissed as background noise. But the data is there.
A recent study by the Canadian Automobile Association (CAA) found that at just 20 monitored intersections, more than 610,000 near misses – close calls between vehicles and pedestrians or cyclists – were recorded from September 2024 to February 2025.
Our systems are sending signals. Every time we get lucky is an opportunity to learn: a chance to build better layers of defense, a chance to prevent the next tragedy. Near misses are not false alarms. They are the most honest feedback a system provides: the future, whispering in the present.
Our brains are not programmed for prevention
So why don’t we learn from close calls?
Psychologists have long understood that the human brain is terrible at processing invisible risks. We overreact to dramatic events but underreact to near misses. We confuse luck with safety. And we dismiss what “almost” happened.
Three psychological traps are especially pernicious:
- Availability bias: We remember the great disasters, but not the hundreds of catastrophes that were narrowly avoided. This distorts our risk radar.
- Confirmation bias: We assume that a system is safe because it has not failed. But many systems survive not because they are strong, but because nothing has come along to break them… yet.
- Optimism bias: We know that bad things happen to other people, but we assume that our skill or luck will protect us.
Reason’s “Swiss cheese” model describes how disasters occur when weaknesses in multiple layers of defense align. A near miss is when the holes almost line up and something, often by chance, blocks the path. But unless we plug those holes, next time we might not be so lucky.
There are exceptions. Aviation, nuclear energy and air traffic control – the so-called “high reliability organizations” – get this. They treat every close call as a data point. They institutionalize reporting. They never forget to be afraid.
These organizations cultivate a chronic unease, a kind of productive paranoia. It is not pessimism; it is realism. They know that systems drift toward failure unless they are constantly corrected. That mentality is why they are among the safest sectors in the world.
Imagine if we brought that mentality to more industries: if every phishing text that almost fooled someone became a reason to improve security, if every minor medical error were reviewed as if it were an actual accident. The price of ignoring near misses always comes due eventually – in insurance claims, infrastructure failures, lawsuits and avoidable grief.
What can you do now?
If near misses are warning signs, the easiest step is to stop ignoring them. When something almost goes wrong, the instinct is usually to dismiss it as luck. But luck is data. It is evidence that a system was about to fail.
The real lesson of near misses is that they allow us to learn without paying the full price of the disaster. Aviation, nuclear energy, and other high-risk industries have built entire safety systems around studying these moments.
We should treat them the same way in everyday life: on the road, at home and at work. Notice them. Talk about them. Fix the conditions that made them possible.
Because the goal is not simply to avoid disaster. The goal is to learn from the moments when things almost went wrong.