“We’ve done this a million times without incident,” they say. Then, when the unthinkable happens, the narrative shifts to a different statistical myth: “What are the odds of that happening again?” or “Lightning never strikes twice.”
But in real life, past success does not prove future invincibility, and lightning often strikes the same location multiple times. New York's Empire State Building, which averages 20-25 strikes per year, provides empirical evidence of this. (Excuse the pun.)
Yet these pseudo-scientific myths persist and spread, often unconsciously, even in modern science labs. Using insights from psychology, sociology, and neuroscience, this blog explores how to keep them from eating away at safety culture.
1. Gambler’s Fallacy
"Lightning never strikes twice in the same place" is a textbook example of the Gambler’s Fallacy, the mistaken belief that if an event happens more frequently than normal during a given period, it will happen less frequently in the future.
Often, the opposite is true. The Laboratory Safety Institute's historical incident database records many lab accidents that occurred immediately after a scientist was released from the hospital following a previous one. Probability has no quota that, once filled by tragedy, protects us from further harm.
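For readers who like to see the math, here is a minimal sketch in Python (the 50/50 odds and the streak length are arbitrary toy numbers, not incident data) showing that independent events have no memory: after five "safe" periods in a row, the chance of trouble in the next period is exactly what it always was.

```python
import random

random.seed(42)

TRIALS = 1_000_000
STREAK = 5                 # length of the "safe streak" we condition on
p_incident = 0.5           # toy 50/50 odds, purely for illustration

after_streak = []
history = []
for _ in range(TRIALS):
    incident = random.random() < p_incident
    # If the last STREAK periods were all incident-free, record what happens next:
    if len(history) >= STREAK and not any(history[-STREAK:]):
        after_streak.append(incident)
    history.append(incident)

# Independent events have no memory: this prints ~0.5, the unchanged base rate.
print(sum(after_streak) / len(after_streak))
```

However long the streak, the coin has no memory, and neither does the chemistry.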
Just this month in India, lightning struck twice again. On February 13, a radiator blast at Brundavan Laboratories set off a massive explosion and fire. The blaze caused significant panic and structural damage, but no deaths or injuries were reported. Everyone could breathe a sigh of relief. After all, lightning never strikes twice, right?
Seven days later, it did. A 35-year-old chemist, Dhara Pawan, was burned to death in a similar incident at VJ Sai Chemical Labs — only a 7-minute drive from Brundavan. Contrary to the Gambler's Fallacy, the first strike actually signaled a higher probability of a second: the two labs likely shared similar regulatory environments and supply chains. Pawan is the latest fatality added to the Laboratory Safety Institute's Memorial Wall and Science Incident Dashboard.
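That conditional logic is easy to demonstrate. The toy simulation below (every probability is invented for illustration and has nothing to do with the incidents described above) gives two nearby labs a shared hazard, such as a bad supplier batch or weak local enforcement, and shows that observing the first strike raises the odds of the second.

```python
import random

random.seed(7)

TRIALS = 1_000_000
first_strikes = both_strikes = 0

for _ in range(TRIALS):
    # Hypothetical shared condition for two nearby labs; all rates are made up.
    shared_hazard = random.random() < 0.10
    p = 0.20 if shared_hazard else 0.01   # per-lab incident probability
    lab_a = random.random() < p
    lab_b = random.random() < p
    if lab_a:
        first_strikes += 1
        if lab_b:
            both_strikes += 1

print("P(B)     =", 0.10 * 0.20 + 0.90 * 0.01)     # unconditional: ~0.029
print("P(B | A) =", both_strikes / first_strikes)  # conditional:   ~0.14
```

In this toy model, the conditional probability is nearly five times the unconditional one, the exact opposite of what the Gambler's Fallacy predicts.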
2. Optimism Bias

Neuroscientist Tali Sharot defines the Optimism Bias as our tendency to overestimate the likelihood of positive events and underestimate the likelihood of negative ones.
In a lab, this manifests as a "personal exemption" from risk. We know, intellectually, that chemicals are volatile. However, the brain whispers, "Accidents happen to 'other' people—people who are careless or poorly trained." Because we view ourselves as competent, we believe our competence acts as a shield.
But no one is immune to the laws of chemistry. In fact, as James Reason's classic Swiss Cheese Model illustrates, an adverse incident results from the alignment of multiple systemic holes, not from any one person's "incompetence." By the same logic, one person's competence cannot, by itself, keep the whole system safe.
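A back-of-the-envelope version of the model makes this concrete. In the sketch below, the layer names and failure rates are pure assumptions for illustration; the structure is what matters: an incident needs every hole to line up, and any one person controls only a single factor in that product.

```python
# Toy Swiss Cheese arithmetic: an incident requires every defensive layer
# to fail at once. Layer names and failure rates are invented for illustration.
layers = {
    "training":    0.05,
    "PPE":         0.10,
    "ventilation": 0.02,
    "supervision": 0.20,
}

p_incident = 1.0
for p_fail in layers.values():
    p_incident *= p_fail

print(f"P(all holes align) = {p_incident:.6f}")   # 0.000020, about 2 in 100,000

# A flawlessly competent individual shrinks only one of these factors;
# the holes in the other layers remain exactly where they were.
```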
3. Normalization of Deviance
Sociologist Diane Vaughan famously coined this term while analyzing the safety gaps that led to the 1986 Challenger space shuttle disaster. Normalization of Deviance occurs when people become so accustomed to a dangerous behavior — skipping a safety check, ignoring a small leak, or "forgetting" PPE — that they no longer see it as deviant.
Each time a rule is broken without a negative consequence, the "perceived risk" of that action drops. Eventually, the dangerous behavior becomes the de facto standard operating procedure.
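A little arithmetic shows why an unpunished shortcut is such a misleading teacher. In the sketch below, the 1% per-use risk is an assumed, illustrative figure; what matters is how repetition compounds it.

```python
# Cumulative odds of at least one incident when a shortcut with a small,
# constant per-use risk is repeated. The 1% figure is an assumption.
p_per_use = 0.01

for n in (1, 10, 50, 100, 300):
    p_at_least_one = 1 - (1 - p_per_use) ** n
    print(f"after {n:>3} uses: P(at least one incident) = {p_at_least_one:.0%}")

# The per-use risk never drops, but every uneventful repetition lowers the
# *perceived* risk, which is exactly how deviance gets normalized.
```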
How Leaders Interrupt Myths
Statistics don’t care about the stories we tell ourselves to stay calm. And safety doesn’t live only in the EHS office. Safety lives wherever decisions are made.
Every leader — formal or informal — shapes whether risk is minimized, normalized, or elevated.
If you approve a process, design a lesson, manage a lab, set a budget, or model a shortcut — you are leading safety. To build a resilient safety culture, leaders must actively dismantle the psychological traps that allow smart, capable people to underestimate danger. Because if leaders don’t interrupt the myths, the myths will lead.
- Cultivate Constructive Tension: Comfort is not the goal of a safe lab — awareness is. When safety feels automatic or unquestioned, complacency quietly takes hold. Effective leaders cultivate constructive tension: the kind that keeps teams alert without making them anxious. At LSI, we use drill cards, realistic scenarios, participatory safety meetings, and even unannounced safety drills to rehearse decision-making before something goes wrong. The goal isn’t fear. It’s disciplined awareness. Yesterday’s success does not guarantee today’s safety — and intentional leaders make sure their teams remember that.
- Practice Pre-Incident Learning: If the first time you examine failure is after something breaks, you’re too late. Strong leaders don’t wait for accidents to teach them. They ask hard questions early: If this failed tomorrow, where would it start? What are we quietly depending on? What are we hoping never happens? These conversations cut through optimism bias without assigning blame and surface weak points while there’s still time to fix them. Intentional safety cultures don’t rely on luck — they pressure-test their systems before reality does.
- Audit the Small Stuff: Normalization of deviance doesn’t announce itself. It creeps in when “nobody does it that way” becomes normal. Skipped steps. Quiet shortcuts. Workarounds that slowly become standard practice. Strong leaders don’t hunt for violations — they look for drift. They review logs, observe real workflows, and ask better questions: What makes this step hard to follow? Where have we adapted the process? What feels fragile here? Because people rarely admit workarounds if they expect blame. Psychologically safe entry points surface reality early — before small deviations align into big failures.
Bottom Line
Lightning doesn’t care about the stories we tell ourselves. It follows the path that’s available. Bias creates those paths. Complacency widens them. Silence protects them. Leadership closes them. If we want stronger safety cultures, we cannot wait for incidents to expose what was already fragile. We have to challenge assumptions early, surface small deviations, rehearse tough scenarios, and create space for honest conversations before something breaks. Safety does not correct itself. It requires leadership.
These principles — and the practical tools to apply them — are central to LSI’s upcoming webinar:
Leadership in Safety
Date: March 13, 2026
Register here
Because culture doesn’t change through policy alone. It changes through leaders who are willing to act before luck runs out.