Why People Violate Safety Requirements, and What’s Really Going On Underneath

Safety rules rarely fail because they are “wrong.” More often, the problem is that the rule competes with how work actually gets done. In many workplaces, the decision to bypass a requirement is not experienced as a moral lapse. It is framed internally as a practical adjustment, a way to keep production moving, help a colleague, or avoid being seen as slow or incapable.

That tension matters. If we treat non-compliance as simply “bad behaviour,” we miss the system conditions that make violations feel reasonable in the moment. A more useful question is this: What makes a safe choice feel costly, unlikely, or unnecessary to the person doing the work?


Violations are not all the same

In human factors and safety science, “violation” typically refers to an intentional departure from a procedure, standard, or rule. Intentional does not mean reckless. It often means someone is using judgement under pressure, with incomplete information, and within constraints they did not design.

A helpful distinction is between errors (unintended actions) and violations (deliberate deviations). That distinction, discussed in classic safety literature such as Reason’s work on human error, shifts the conversation from blaming individuals to examining conditions that shape decisions.


The main reasons people bypass safety requirements

1) Production pressure and competing goals

Many organisations say “safety first,” yet workers are rewarded, praised, or promoted for speed, output, and responsiveness. When priorities conflict, behaviour tends to follow what is measured and reinforced.

Sometimes this pressure is explicit: deadlines, quotas, staffing gaps. In other cases it is cultural: the unspoken expectation to “make it happen.” Under those conditions, skipping a step can look like competence rather than risk.


2) Work-as-imagined versus work-as-done

Procedures are often written for stable conditions. Real work is variable. Equipment availability changes. The job site shifts. Customers call. Weather interferes. When a rule does not fit the situation, workers adapt.

This is a well-known tension in the Safety-I and Safety-II debate: do we focus mainly on preventing what goes wrong, or do we also study how people create safety through everyday adaptation? Different schools emphasise different answers, but most agree on the practical implication: if procedures are not usable on the ground, informal workarounds will fill the gap.


3) Normalisation of deviance

When small departures do not immediately lead to harm, they start to feel acceptable. Over time, “the shortcut” becomes the unofficial standard.

It is rarely announced. It spreads quietly through peer learning: “This is how it’s really done.” The longer a risky workaround persists without visible consequences, the more legitimate it appears.


4) Risk perception, familiarity, and overconfidence

Humans are not neutral risk calculators. Familiar hazards often feel less threatening than novel ones, even when the objective risk is high. People also discount low-probability, high-consequence outcomes because they are hard to imagine vividly.

When someone says, “I’ve done it this way for years,” they may not be dismissing safety. They may be relying on a personal track record that, in their mind, counts as evidence.


5) Usability problems: rules that are hard to follow

If a requirement adds time, requires awkward positioning, creates discomfort, or depends on equipment that is not readily available, compliance becomes harder. Friction matters.

This is where ergonomic design and operational practicality intersect. A rule that cannot be executed smoothly becomes a rule that is negotiated, especially during peak workload.


6) Social dynamics and identity

People take cues from supervisors and peers. If respected workers cut corners, the practice gains status. If new employees want acceptance, they may copy what they see rather than what they read in an induction manual.

There is also pride involved. In some trades and high-pressure environments, pushing through pain, working fast, or improvising is tied to professional identity. Safety requirements can be misread as a signal of weakness unless leaders actively reshape that narrative.


7) Fear of reporting and a blame-oriented climate

If raising concerns leads to ridicule, conflict, or punishment, workers learn to stay silent and “just manage it.” In that climate, people may violate rules to avoid attracting attention, especially if they believe the system will not support them.

This is why “just culture” approaches have become prominent in many industries. They try to hold people accountable without turning every deviation into a personal failure story. Not everyone agrees on where the line should be drawn, but the central point is consistent: psychological safety affects physical safety.


8) Training that is abstract, not job-specific

Training can become a checkbox exercise. When it lacks realism, it does not help workers navigate the messy edge cases where violations occur.

Competence is not only knowing the rule. It is knowing how to apply it under constraints, and how to pause or escalate when the system makes safe work difficult.


What this means for leaders and safety professionals

If violations are shaped by context, the response should focus less on “telling” and more on diagnosing.

A practical way to approach this is to ask three questions during walkthroughs, debriefs, and incident learning:

  1. What made the unsafe option attractive right then?
    Time pressure, tool availability, task design, supervision, fatigue?
  2. What made the safe option difficult or costly?
    Extra steps, missing resources, awkward access, unclear criteria, conflicting instructions?
  3. What signals did the system send about priorities?
    Incentives, informal praise, who gets listened to, how delays are handled?

These questions shift the conversation from policing to problem-solving, while still taking accountability seriously.


Targeted actions that reduce violations without increasing resentment

Improve the fit of procedures

  • Rewrite steps with frontline input, using the language of the job.
  • Specify decision points: “Stop and escalate when X occurs,” rather than vague cautions.
  • Test procedures in real conditions, not just on paper.


Reduce operational friction

  • Ensure PPE, permits, lockout equipment, and tools are accessible where work happens.
  • Remove unnecessary steps that add time but do not add control.
  • Design for the body: awkward posture and excessive force drive shortcuts.


Address incentives honestly

  • If speed is rewarded, acknowledge it and adjust metrics.
  • Recognise safe decisions that cost time, especially when workers stop and ask for support.


Strengthen learning rather than punishment

  • Use near misses and minor deviations as learning data.
  • Treat repeated workarounds as a signal of system strain, not just “poor attitude.”


Build credible supervision

  • Leaders who ask curious questions get better information than leaders who interrogate.
  • Visible follow-through matters: when workers report barriers and nothing changes, trust erodes quickly.


A final thought

Rule breaking often looks like an individual choice. In practice, it is frequently a system story showing up in individual behaviour. That does not remove responsibility, but it does change where the leverage is.

When we treat violations as data about how work is really functioning, we gain options: redesign tasks, improve access to controls, align incentives, and create an environment where speaking up is normal. Those shifts tend to reduce unsafe shortcuts more reliably than reminders alone.


Author: Wilson Ronnie Odoom, ASHEPA President


Contact Us:
Email: support@ashepa.org
Website: ashepa.org