April 8, 2026
Why Workers Take Safety Shortcuts: The Psychology Behind Rule Violations
By Safety Team
Research shows most safety violations aren't reckless — they're rational responses to production pressure, poor rules, and normalized practice. Learn the science behind shortcuts and what actually works to reduce them.
Picture a pipefitter on a refinery turnaround. He has replaced this gasket more than a thousand times. The procedure says to lock out, tag out, verify zero energy, then pull the flange bolts. He has done exactly that — maybe the first two hundred times. But the line has been de-inventoried for three days. He can see it is empty. The permits are stacking up. The supervisor just told the crew they are behind schedule. So he skips the verify step, pulls the bolts, swaps the gasket, and moves on. Nothing happens. Nothing ever happens.
Until it does.
If you have spent any time in safety leadership, you have watched this scene play out in a hundred variations: the harness left unclipped on a low-height scaffold, the guard removed to clear a jam faster, the pre-use inspection signed but not performed. The instinct is to call it complacency, carelessness, or willful defiance — and to respond with discipline, retraining, or a sterner toolbox talk.
But four decades of research in human factors and organizational psychology tell a different story. Most safety violations are not acts of recklessness. They are rational, often predictable responses to production pressure, poorly designed rules, and workplace cultures that quietly reward the shortcut. Understanding why people take shortcuts is the first step toward building systems where they do not need to.
This article walks through the major research frameworks that explain safety violations, examines what the evidence says about common management responses, and outlines the interventions that actually work.
Not All Violations Are the Same: Reason's Typology
The starting point for any serious discussion of safety shortcuts is the violation typology introduced by James Reason in his landmark book Human Error. Reason drew a clear line between errors (unintended actions) and violations (deliberate deviations from rules or procedures). He then subdivided violations into three types that remain the foundation of the field today (Reason 108–112).
Routine violations are habitual shortcuts that become woven into normal work practice. The worker who never clips the chin strap on a hard hat, the forklift operator who always backs up without a spotter, the electrician who skips the voltage test on a circuit she "knows" is dead — these are routine violations. They persist because they save time and effort, they rarely produce harm, and over time they stop feeling like violations at all. Reason compared them to habitual speeding on a highway: technically illegal, universally practiced, and reinforced every time nothing goes wrong (Reason 109).
Situational violations occur when it is difficult or impossible to follow the rule and still get the job done. Reason called these "necessary" violations to emphasize the absence of choice (Reason 110). A confined-space entry procedure written for a standard vessel may be physically impossible to follow when the vessel has an unusual geometry. A two-person lift rule is violated when only one person is available and the production line is down. These violations are often a signal that the rule itself is flawed, outdated, or written by someone who has never done the task.
Exceptional violations happen in emergencies, when the existing rules clearly are not working and the worker improvises a solution. A crane operator who swings a load over a walkway to avoid a collapsing structure is committing an exceptional violation. These are rare, high-stakes, and usually well-intentioned.
Reason made two observations that deserve emphasis. First, violations do not always produce bad outcomes — in fact, most of the time they produce the desired outcome faster and more easily than compliance would. Second, strict compliance with a bad rule can itself create danger (Reason 112). These points matter because they explain why workers do not experience shortcuts the way safety managers do. From the worker's perspective, the shortcut works.
How Deviance Becomes Normal: Vaughan's Warning
If Reason explains why individual workers cut corners, sociologist Diane Vaughan explains how entire organizations come to accept it. Her concept of the "normalization of deviance," developed through a painstaking analysis of the 1986 Challenger space shuttle disaster, describes a gradual drift in which unacceptable practices become acceptable — not through any single decision, but through the slow accumulation of precedent (Vaughan 62–65).
The Challenger's O-ring seals were never designed to erode during flight. When early flights showed erosion, engineers flagged it as an anomaly. But each time the shuttle flew and returned safely despite the erosion, the anomaly was reinterpreted. What had been a deviation from design specifications was reclassified as an "acceptable risk." Launch after launch, the boundary of what counted as safe shifted outward, until a condition that would have grounded the fleet in 1981 was considered routine by 1986 (Vaughan 119–124).
Vaughan identified three forces that drive normalization of deviance: production pressure (the relentless organizational demand to meet schedules and budgets), structural secrecy (the way large organizations compartmentalize information so that no single person sees the full picture), and cultural beliefs about acceptable risk that are shaped by past success rather than engineering analysis (Vaughan 238–244).
Every safety manager should recognize this pattern on their own site. A fall protection policy is violated once during an urgent repair and no one falls. The next time the same situation arises, the precedent exists: we did it before and it was fine. Within a year, the shortcut has a nickname. Within two years, new hires learn it from their mentors. Within five years, anyone who insists on full compliance is seen as slowing the crew down.
The critical lesson from Vaughan's work is that normalization of deviance is not caused by bad people. It is caused by ordinary people working inside systems that reward production, punish delay, and lack the feedback loops to catch a slow drift toward danger. As Vaughan wrote, the Challenger decision was not an anomaly — it was a product of "the way things were routinely done at NASA" (Vaughan 394).
Safety Climate: The Organizational Thermostat
While Reason and Vaughan focused on the mechanics of rule-breaking and cultural drift, a parallel research stream has examined why some organizations have far fewer violations than others — even in the same industry, doing the same work.
The answer, consistently, is safety climate: the shared perceptions that workers hold about the priority management actually gives to safety. Not what the banners say. Not what the policy manual reads. What people see, hear, and experience every day.
Andrew Neal and Mark Griffin established the causal pathway in a study that tracked workers and accident rates over five years. They found that the average level of safety climate within a work group at one point in time predicted changes in individual workers' safety motivation. That motivation, in turn, predicted changes in safety behavior. And improvements in group-level safety behavior were associated with subsequent reductions in accidents (Neal and Griffin 949–951).
In other words, safety climate does not just correlate with fewer violations — it causes fewer violations, through a specific chain: when workers believe management genuinely prioritizes safety, they become more motivated to work safely, and that motivation translates into behavior change that reduces incidents.
Michael Christian and colleagues confirmed and extended this model in a large-scale meta-analysis that synthesized decades of research. They found that group safety climate had the strongest association with actual accidents and injuries of any factor studied. Safety knowledge and safety motivation were the most powerful proximal predictors of safety performance — meaning they were the direct psychological drivers of whether a worker followed or skipped a procedure. Critically, safety climate operated as the "distal" force that shaped those proximal factors: it determined how much safety knowledge workers acquired and how motivated they were to apply it (Christian et al. 1110–1115).
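To make that chain concrete, here is a minimal sketch of the mediation logic on synthetic data. Everything in it is invented for illustration: the variable names, the effect sizes, and the estimator have no connection to the published datasets, which used multilevel models and objective accident records rather than toy regressions.

```python
# A toy mediation sketch: climate -> motivation -> behavior.
# All numbers are synthetic; this illustrates the *shape* of the
# path model, not the authors' actual analysis.
import numpy as np

rng = np.random.default_rng(0)
n = 500

climate = rng.normal(0, 1, n)                      # group safety climate (standardized)
motivation = 0.5 * climate + rng.normal(0, 1, n)   # climate raises safety motivation
behavior = 0.6 * motivation + rng.normal(0, 1, n)  # motivation raises safe behavior

# Path a: regress motivation on climate (slope of the fit)
a = np.polyfit(climate, motivation, 1)[0]

# Path b: regress behavior on motivation while controlling for climate
X = np.column_stack([np.ones(n), motivation, climate])
coef, *_ = np.linalg.lstsq(X, behavior, rcond=None)
b = coef[1]

# Indirect effect: climate's influence on behavior transmitted through motivation
print(f"a = {a:.2f}, b = {b:.2f}, indirect effect = {a * b:.2f}")
```

The sketch recovers an indirect effect near 0.3: climate moves behavior not directly but by changing motivation first, which is exactly the "distal force shaping proximal factors" structure Christian et al. describe.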
The practical implication is straightforward. If you want workers to follow procedures, the single most powerful lever you can pull is not a new rule, a stiffer penalty, or another training class. It is the daily, visible, consistent demonstration by front-line supervisors and senior leaders that safety is genuinely valued — not just when it is convenient, but when it conflicts with production targets.
The Rules Gap: Written vs. Practiced
One of the most uncomfortable findings in safety science is that many violations are not failures to follow good rules — they are rational adaptations to bad ones.
Andrew Hale and David Borys conducted an exhaustive two-part review of the research literature on safety rules and procedures, published in Safety Science in 2013. They identified two fundamentally different ways that organizations think about rules (Hale and Borys, "Part 1" 208–210).
Model 1 is the top-down, classical approach. Rules are written by experts, imposed on frontline workers, and treated as static, comprehensive limits on behavior. Violations are viewed as negative behavior to be suppressed through training, discipline, or surveillance. Most safety management systems are built on Model 1 assumptions.
Model 2 is the bottom-up, constructivist view. Rules are understood as dynamic, local, and situated. Frontline workers are treated as experts in their own domain, and competence is defined not as blind compliance but as the ability to adapt rules to the diversity of real-world conditions (Hale and Borys, "Part 1" 211–213).
The gap between these two models — between how work is imagined and how work is actually done — is where most violations live. Hale and Borys found that in many organizations, the rules-as-written bear only a passing resemblance to the rules-as-practiced, and that this gap grows wider over time as conditions change but rules do not (Hale and Borys, "Part 2" 223–225).
This gap is not a sign of worker deficiency. It is a sign of organizational failure: a failure to keep rules current, a failure to involve the people who do the work in writing the rules, and a failure to maintain what Hale and Borys called the "regular and explicit dialogue" between frontline workers and supervision that keeps rules alive and relevant (Hale and Borys, "Part 2" 228).
Consider a common example. A lockout/tagout procedure written for a standard motor-driven pump may specify isolation points, lock placement, and energy verification steps that are perfectly clear and perfectly executable — for that pump. But on a turnaround, the crew encounters a pump with a non-standard valve configuration. The procedure does not fit. The worker has three choices: stop work and wait for a new procedure (possibly for hours), improvise an adaptation, or skip the step entirely. In the real world, under production pressure, with a supervisor asking why the job is behind schedule, options two and three are the most common.
Alper and Karsh reinforced this picture in their systematic review of safety violations in industry. They catalogued 57 distinct variables associated with intentional, non-malevolent violations — grouped into individual factors (fatigue, experience, risk perception), organizational factors (production pressure, inadequate staffing, poor communication), rule-related factors (rules that are unclear, impractical, or outdated), and hardware factors (equipment design that makes compliance difficult) (Alper and Karsh 743–748).
Their most important finding was conceptual: the majority of safety violations they reviewed were not malicious or reckless. They were performed by workers who were trying to do their jobs, often under difficult conditions, using the best judgment available to them. Alper and Karsh argued that the field needed to move beyond treating all violations as moral failures and toward understanding them as planned behavior — the product of intentions shaped by attitudes, social norms, and perceived behavioral control (Alper and Karsh 749–750).
Why Punishment Does Not Work
Given the research above, the question of whether punishment reduces violations almost answers itself. But it is worth examining directly, because punitive responses remain the default in many organizations.
The logic of punishment is simple: if a worker is disciplined for skipping a procedure, the pain of the consequence will deter future violations. This logic holds under very specific conditions — the behavior must be fully within the worker's control, the punishment must be immediate, consistent, and certain, and the worker must perceive the punishment as proportionate and fair.
In practice, almost none of these conditions are met.
Punishment is inconsistent. Supervisors vary enormously in how strictly they enforce rules. A worker who is disciplined for a violation he has committed dozens of times without consequence does not learn that the rule matters — he learns that enforcement is arbitrary. As one safety practitioner put it, inconsistent application "sends the message that safety management is sloppy and inefficient" (ProAct Safety).
Punishment targets the wrong behaviors. Routine violations, by definition, are widespread. Disciplining one worker for a practice shared by an entire crew is perceived as scapegoating, not justice. It breeds resentment, not compliance.
Punishment suppresses reporting, not violations. Decades of research in healthcare, aviation, and industrial safety have demonstrated that punitive responses to violations and errors drive reporting underground. Workers learn to hide shortcuts rather than stop taking them. The organization loses its most valuable source of safety intelligence — the frontline workers who know where the real risks are (Dekker 12–15).
Punishment assumes individual control. When a violation is driven by production pressure, understaffing, poor equipment design, or an impractical rule, punishing the worker addresses none of the contributing factors. The system remains unchanged, and the next worker in the same situation will make the same choice (Alper and Karsh 749).
Punishment cannot create desired behavior. Discipline is designed to stop behavior, not start it. Stopping an unsafe shortcut does not automatically produce the desired safe behavior — it creates a void. Without positive reinforcement for the correct practice, without removing the barriers that made the shortcut attractive, the void will be filled by another shortcut (ProAct Safety).
This does not mean accountability has no role. The Just Culture framework, widely adopted in healthcare and aviation, provides a structured approach. Human errors (slips, lapses, and honest mistakes) call for consolation and system redesign. At-risk behaviors (shortcuts taken without full awareness of the risk) call for coaching and removal of the incentives for the shortcut. Only reckless behavior — the conscious, unjustifiable disregard of a known, substantial risk — warrants disciplinary action (Dekker 22–27). The key distinction is between a system that punishes outcomes and a system that addresses behavioral choices in context.
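For readers who think in flowcharts, the triage logic above can be sketched as code. This is a hedged illustration of Dekker's three categories, not an implementable policy: the class, field names, and response strings are invented here, and in practice each judgment comes from investigation and trained calibration, not boolean flags.

```python
# A minimal sketch of Just Culture triage, following the three
# categories described above. All names are invented for illustration.
from dataclasses import dataclass

@dataclass
class BehaviorAssessment:
    deliberate: bool        # was the deviation from procedure intentional?
    understood_risk: bool   # did the person grasp the substantial risk?
    risk_justified: bool    # e.g., an emergency improvisation

def just_culture_response(b: BehaviorAssessment) -> str:
    if not b.deliberate:
        # Slip, lapse, or honest mistake
        return "human error: console the person and redesign the system"
    if not b.understood_risk:
        # Shortcut taken without full awareness of the risk
        return "at-risk behavior: coach, and remove the shortcut's incentives"
    if b.risk_justified:
        return "justified deviation: review the rule that failed the situation"
    # Conscious, unjustifiable disregard of a known, substantial risk
    return "reckless behavior: disciplinary action is warranted"
```

The point of the sketch is the ordering of the tests: intent first, then risk awareness, then justification. Discipline is the residual category, not the default.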
What Actually Reduces Violations
If punishment is the wrong tool for most violations, what is the right one? The research points to a set of mutually reinforcing interventions.
1. Fix the Rules
The single most direct way to reduce violations is to close the gap between rules-as-written and rules-as-practiced. This means involving frontline workers in writing and revising procedures — not as a token consultation, but as a genuine partnership. Hale and Borys found that organizations with the best safety performance treated rule management as a living process, with regular review cycles, explicit dialogue between workers and supervisors, and mechanisms for workers to flag rules that do not fit the work (Hale and Borys, "Part 2" 228–229).
Practically, this means:
- Conducting periodic "procedure walkdowns" where workers perform the task while someone checks each step of the written procedure against reality
- Creating a simple, low-barrier process for workers to submit rule change requests
- Reviewing and updating procedures whenever equipment, personnel, or conditions change
- Writing rules at the right level of detail — specific enough to guide behavior, flexible enough to accommodate real-world variation
2. Strengthen Safety Climate
Neal and Griffin's research demonstrated that safety climate is the upstream driver of safety motivation and behavior. Strengthening safety climate is not a single initiative — it is the cumulative effect of hundreds of leadership decisions.
Key practices supported by the research include:
- Supervisor behavior. Front-line supervisors are the most powerful influence on safety climate. When supervisors visibly prioritize safety, follow rules themselves, and respond to safety concerns promptly, workers internalize the message that safety matters (Christian et al. 1116).
- Management walk-arounds. Senior leaders who are present on the floor, ask genuine questions about hazards, and follow up on what they hear send a signal that cannot be faked.
- Resource allocation. Nothing undermines safety rhetoric faster than under-resourcing. When workers see that the company will spend money on safety equipment, adequate staffing, and proper maintenance, they believe the words.
- Conflict resolution. The moment of truth for safety climate is when safety and production genuinely conflict. Organizations that consistently resolve these conflicts in favor of safety — and communicate that they have done so — build the credibility that drives compliance.
3. Address Production Pressure Directly
Vaughan's normalization of deviance is fueled by production pressure. The most effective way to prevent the drift is to name it and manage it explicitly.
This means:
- Building realistic safety time into project schedules and shift plans
- Training supervisors to recognize when production pressure is creating incentives to cut corners
- Creating formal stop-work authority that is genuinely used — not just a policy that exists on paper
- Tracking leading indicators (near misses, stop-work events, rule change requests) as seriously as lagging indicators (incident rates)
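As a small illustration of that last point, here is a toy sketch of tracking leading indicators alongside lagging ones. The event names, threshold, and summary format are invented; a real program would pull from an incident-management system.

```python
# A toy monthly rollup of leading vs. lagging safety indicators.
# Event names and the summary shape are invented for illustration.
from collections import Counter

LEADING = {"near_miss", "stop_work", "rule_change_request"}
LAGGING = {"recordable_injury", "lost_time_incident"}

def monthly_summary(events: list[str]) -> dict:
    counts = Counter(events)
    leading = sum(counts[e] for e in LEADING)
    lagging = sum(counts[e] for e in LAGGING)
    # A falling leading count with flat lagging numbers can signal
    # suppressed reporting rather than improved safety.
    return {"leading": leading, "lagging": lagging,
            "ratio": leading / max(lagging, 1)}

print(monthly_summary(["near_miss", "near_miss", "stop_work",
                       "recordable_injury"]))
```

The ratio itself is a crude heuristic; the useful signal is the trend over months, watched as closely as the incident rate.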
4. Build Crew-Level Norms
Normalization of deviance operates through social norms — the unwritten rules about "how we really do things here." Reversing it requires building new norms at the crew level.
Research on safety participation — the voluntary, discretionary safety behaviors that go beyond basic compliance — shows that these behaviors are heavily influenced by peer expectations (Christian et al. 1113). Workers are far more likely to speak up about a hazard, remind a coworker to clip in, or stop a job when they see their peers doing the same.
Interventions that build positive crew norms include:
- Peer-led safety observations (not punitive audits, but supportive check-ins)
- Pre-job briefings that explicitly discuss the temptation to take shortcuts and the reasons not to
- Crew-level recognition for safety behaviors, not just individual recognition
- Open discussion of past violations in a learning (not blaming) context
5. Move Up the Safety Culture Ladder
Patrick Hudson's safety culture maturity model provides a useful diagnostic for understanding where your organization sits and what the next step looks like (Hudson 700–705). The model defines five stages:
- Pathological: The organization cares more about not getting caught than about safety. Safety is seen as a regulatory burden. Violations are the norm.
- Reactive: The organization takes safety seriously only after an incident. Investigations focus on blame. Rules exist but are not consistently enforced.
- Calculative: The organization has systems, metrics, and procedures in place. Compliance is tracked. But safety is treated as a technical problem, not a cultural one. Workers follow rules because they are told to, not because they believe in them.
- Proactive: The organization anticipates hazards and acts before incidents occur. Workers are involved in identifying risks. Safety is a shared value, not just a management priority.
- Generative: Safety is fully integrated into how the organization operates. Information flows freely. People at every level feel ownership for safety and act on it. The organization is resilient — it adapts to unexpected conditions rather than relying solely on pre-written rules.
Most organizations that struggle with chronic violations are operating at the Calculative stage or below. They have the systems but not the culture. Moving to the Proactive and Generative stages requires the interventions described above — fixing rules, strengthening climate, addressing pressure, and building norms — sustained over years, not months.
Putting It Together: A Framework for Safety Managers
The research reviewed in this article converges on a consistent picture. Here is a framework for applying it.
When you see a violation, ask five questions before you act:
- Is this routine, situational, or exceptional? (Reason's typology) A routine violation requires a different response than a situational one.
- Has this become normalized? (Vaughan's framework) If multiple workers are doing it, the problem is cultural, not individual.
- Does the rule actually work? (Hale and Borys) Walk the procedure. Do the task. Can the rule be followed as written? If not, the rule is the problem.
- What does the worker's environment reward? (Neal and Griffin; Christian et al.) Is production pressure pushing workers toward shortcuts? Does the safety climate support compliance or undermine it?
- Is this intentional and reckless, or intentional and rational? (Alper and Karsh) If the worker was trying to get the job done under difficult conditions, discipline will not prevent recurrence. Fixing the conditions will.
Then choose your response:
- Fix the rule if it does not match reality
- Coach the worker if they did not fully understand the risk
- Change the system if production pressure, staffing, equipment, or design created the incentive
- Strengthen the climate if the violation reflects a broader cultural pattern
- Discipline only if the behavior was a conscious, unjustifiable disregard of a known and substantial risk
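The response selection can be sketched in code for teams that want a checklist. This is a hedged illustration only: every name below is invented, the flags stand in for judgments that require investigation and dialogue, and several responses can legitimately apply to one event.

```python
# A sketch of the response-selection logic above. All names are
# invented for illustration; the boolean fields stand in for
# judgments made through investigation, not form-filling.
from dataclasses import dataclass

@dataclass
class ViolationDiagnosis:
    rule_matches_reality: bool    # Hale and Borys: can the rule be followed as written?
    worker_understood_risk: bool  # Alper and Karsh: rational shortcut, or reckless?
    system_pressure: bool         # Neal and Griffin; Christian et al.
    widespread: bool              # Vaughan: has the practice been normalized?
    reckless: bool                # conscious, unjustifiable disregard of known risk

def choose_responses(d: ViolationDiagnosis) -> list[str]:
    responses = []
    if not d.rule_matches_reality:
        responses.append("fix the rule")
    if not d.worker_understood_risk:
        responses.append("coach the worker")
    if d.system_pressure:
        responses.append("change the system")
    if d.widespread:
        responses.append("strengthen the climate")
    if d.reckless:
        responses.append("discipline")
    return responses or ["monitor and keep the dialogue open"]
```

Note that the function returns a list, not a single verdict: a normalized shortcut around an unworkable rule calls for both a rule fix and a climate intervention.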
This is not a soft approach. It is a more demanding one. It requires managers to do the difficult work of understanding why a violation happened, rather than the easy work of writing someone up. But it is the approach the evidence supports.
Limitations
This article draws on well-established research in human factors and organizational psychology, but several limitations should be noted.
First, the frameworks described here — Reason's typology, Vaughan's normalization of deviance, the safety climate path model — are simplifications of complex phenomena. Real-world violations often involve elements of more than one category, and the boundaries between routine, situational, and exceptional violations are not always clear.
Second, the research base is weighted toward high-hazard industries (oil and gas, aviation, healthcare, nuclear power). While the core principles are broadly applicable, the specific dynamics of violations may differ in lower-hazard settings such as offices, retail, or light manufacturing.
Third, much of the safety climate research relies on self-reported data (surveys and questionnaires), which is subject to social desirability bias. Workers may overreport compliance and underreport violations. The Neal and Griffin (2006) study is notable for linking self-reports to objective accident data over time, but this level of methodological rigor is not universal in the field.
Fourth, cultural context matters. The research reviewed here was conducted primarily in Western, industrialized settings. Safety norms, attitudes toward authority, and the social dynamics of rule-following vary across national and regional cultures. Managers working in multinational or cross-cultural environments should be cautious about applying these frameworks without adaptation.
Fifth, this article focuses on the psychology of individual and group behavior. It does not address the regulatory, legal, and economic dimensions of safety violations, which also shape organizational responses.
Finally, the Just Culture framework, while widely endorsed, is not without criticism. Some scholars argue that the line between "at-risk behavior" and "reckless behavior" is drawn subjectively and can reproduce the very biases it is designed to eliminate. Implementing Just Culture well requires training, consistency, and ongoing calibration — it is not a policy you can adopt and forget.
Works Cited
Alper, Samuel J., and Ben-Tzion Karsh. "A Systematic Review of Safety Violations in Industry." Accident Analysis & Prevention, vol. 41, no. 4, 2009, pp. 739–754. https://doi.org/10.1016/j.aap.2009.02.027.
Christian, Michael S., et al. "Workplace Safety: A Meta-Analysis of the Roles of Person and Situation Factors." Journal of Applied Psychology, vol. 94, no. 5, 2009, pp. 1103–1127. https://doi.org/10.1037/a0016172.
Dekker, Sidney. Just Culture: Balancing Safety and Accountability. Ashgate, 2007.
Hale, Andrew, and David Borys. "Working to Rule, or Working Safely? Part 1: A State of the Art Review." Safety Science, vol. 55, 2013, pp. 207–221. https://doi.org/10.1016/j.ssci.2012.10.007.
Hale, Andrew, and David Borys. "Working to Rule or Working Safely? Part 2: The Management of Safety Rules and Procedures." Safety Science, vol. 55, 2013, pp. 222–231. https://doi.org/10.1016/j.ssci.2012.10.008.
Hudson, Patrick. "Implementing a Safety Culture in a Major Multi-National." Safety Science, vol. 45, no. 6, 2007, pp. 697–722. https://doi.org/10.1016/j.ssci.2006.07.001.
Neal, Andrew, and Mark A. Griffin. "A Study of the Lagged Relationships among Safety Climate, Safety Motivation, Safety Behavior, and Accidents at the Individual and Group Levels." Journal of Applied Psychology, vol. 91, no. 4, 2006, pp. 946–953. https://doi.org/10.1037/0021-9010.91.4.946.
ProAct Safety. "Can Punishment Improve Safety?" ProAct Safety, n.d., https://proactsafety.com/articles/can-punishment-improve-safety. Accessed 9 Apr. 2026.
Reason, James. Human Error. Cambridge UP, 1990.
Vaughan, Diane. The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. U of Chicago P, 1996.