
Rules, Rules, Rule!

Hazardous area engineering, like any technical application, is founded on fundamental principles and rules of practice.

The three main steps of practice are captured by the three-letter acronym, I.E.C.

  • Identification of hazards
  • Evaluation of hazards
  • Control of hazards


Identification

The first step in safety engineering is “hazard identification.” A hazard is anything that has the potential to cause harm.

 

Many system safety techniques have been pioneered to aid in the identification of potential system hazards. None is more basic than “energy analysis.” Here, potential hazards associated with various physical systems and their associated operation, including common industrial and consumer related activities, can be identified (for later evaluation and control) by first recognizing that system and product “hazards” are directly related to various common forms of “energy.” That is, system component or operator “damage” or “injury” cannot occur without the presence of some form of hazardous “energy.”

“Hazard identification” in reality can be viewed as “energy identification,” recognizing that an unanticipated undesirable release or exchange of energy in a system is absolutely necessary to cause an “accident” and subsequent system damage or operator injury. Therefore, an “accident” can now be seen as “an undesired and unexpected, or at least untimely release, exchange, or action of energy, resulting, or having the potential to result in damage or injury.” This approach simplifies the task of hazard identification as it allows the identification of hazards by means of a finite set of search paths, recognizing that the common forms of energy that produce the vast majority of accidents can be placed into only ten descriptive categories.
The goal of this first step in the hazard control process is to prepare a list of potential hazards (energies) in the system under study. No attempt is made at this stage to prioritize potential hazards or to determine the degree of danger associated with them – that will come later. At this first stage, one is merely taking “inventory” of potential hazards (potential hazardous energies).
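To make the idea of an energy “inventory” concrete, the short sketch below groups observed potential hazards by energy category. The category names shown are illustrative placeholders only, not the ten descriptive categories referred to above, and the data structure is an assumption made purely for the example.

    from collections import defaultdict

    # Illustrative placeholder categories (not the definitive list of ten)
    ENERGY_CATEGORIES = ["electrical", "mechanical", "thermal", "chemical", "pressure"]

    def build_inventory(observations):
        """Group observed potential hazards (energies) by energy category.

        observations: iterable of (category, description) pairs.
        No prioritizing happens here - this step is inventory only.
        """
        inventory = defaultdict(list)
        for category, description in observations:
            if category not in ENERGY_CATEGORIES:
                raise ValueError(f"unrecognised energy category: {category}")
            inventory[category].append(description)
        return dict(inventory)

    # Example: two potential hazards noted during a walk-through survey
    print(build_inventory([
        ("electrical", "exposed busbar in switch room"),
        ("pressure", "relief valve vents into an occupied area"),
    ]))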

Evaluation

The evaluation stage of the safety engineering process has as its goal the prioritizing or ordering of the list of potential hazards, whether they arise from system conditions and physical states or from personnel and human factors.

The mere presence of a potential hazard tells us nothing about its potential danger. To know the danger related to a particular hazard, one must first examine associated risk factors.

Risk can be measured as the product of three components: (a) the probability that an injury or damage producing mishap will occur during any one exposure to the hazard; (b) the severity or degree of injury or damage that will likely result should a mishap occur; and (c) the estimated number of times a person or persons will likely be exposed to the hazard over a specific period of time.

That is…

H x R = D, and since
R = P x S x E, then
H x (P x S x E) = D

where:

H = Hazard
R = Risk
D = Danger
P = Probability
S = Severity
E = Exposure
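
As a rough illustration, the R = P x S x E relationship can be expressed in a few lines of Python. The scales and values used below are assumptions chosen purely for illustration; in practice, probability, severity, and exposure figures come from incident data, prediction methods, and exposure surveys.

    def risk_score(probability, severity, exposure):
        """Return R = P x S x E for one hazard.

        probability: chance of a mishap during any single exposure (0..1)
        severity:    relative degree of injury or damage, e.g. 1 (minor) to 10 (fatal)
        exposure:    expected number of exposures over the period considered
        """
        return probability * severity * exposure

    # Example: a 1-in-10,000 mishap chance per exposure, severity rated 8,
    # and 2,000 exposures per year gives a relative risk score of 1.6
    print(risk_score(1e-4, 8, 2000))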

In the evaluation of mishap probability, consideration should be given to historical incident data and reasonable methods of prediction.

Use of this equation must take into account that an accident event having a remote probability of occurrence during any single exposure, or during any finite period of exposure to a particular hazard, IS CERTAIN TO OCCUR if exposure to that hazard is allowed to be repeated over a longer period of time. Therefore, a long term or large sample view should be taken for proper evaluation.
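The point about repeated exposure can be checked directly: for independent exposures, the chance of at least one mishap is 1 - (1 - P)^n, which climbs towards certainty as the number of exposures n grows. The per-exposure probability below is an assumed figure used only to show the trend.

    def cumulative_mishap_probability(p_single, exposures):
        """Probability of at least one mishap in n independent exposures."""
        return 1.0 - (1.0 - p_single) ** exposures

    # An assumed remote probability of 1 in 100,000 per single exposure
    for n in (1, 1_000, 100_000, 1_000_000):
        print(n, round(cumulative_mishap_probability(1e-5, n), 4))
    # prints roughly: 1 0.0 / 1000 0.01 / 100000 0.6321 / 1000000 1.0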

Determination of potential severity should centre on the most likely resulting injury or damage as well as the most severe potential outcome. Severity becomes the controlling factor when severe injury or death is a likely possibility among the several plausible outcomes. That is, even when other risk factors indicate a low probability of mishap over time, if severe injury or death may occur as a result of mishap, the risk associated with such hazards must be considered as being “unacceptable,” and strict attention given to the control of such hazards and related mishaps.

Exposure evaluation should consider the typical life expectancy of the system containing a particular hazard, the number of systems in use, and the number of individuals who will be exposed to these systems over time.

Acceptable vs. Unacceptable Risk

This step in the hazard evaluation process will ultimately serve to divide the list of potential hazards into a group of “acceptable” hazards and a group of “unacceptable” hazards. Acceptable hazards are those associated with acceptable risk factors; unacceptable hazards are those associated with unacceptable risk factors.

An “acceptable risk” can be thought of as a risk that a group of rational, well-informed, ethical individuals would deem acceptable to expose themselves to in order to acquire the clear benefits of such exposure. An “unacceptable risk” can be thought of as a risk that a group of rational, well-informed, ethical individuals would deem unacceptable to expose themselves to in order to acquire the exposure benefits.

Hazards associated with an acceptable risk are traditionally called “safe,” while hazards associated with an unacceptable risk are traditionally called “unsafe.” Therefore, what is called “safe” still contains elements of risk that are judged to be “acceptable.” Once again, the mere presence of a hazard does not automatically mean that the hazard is associated with any real danger; its associated risk must first be evaluated and judged unacceptable.

The result of this evaluation process will be the compiling of a list of hazards (or risks and dangers) that are considered unacceptable. These unacceptable hazards (rendering the system within which they exist “unreasonably dangerous”) are then carried to the third stage of the safety engineering process, called “hazard control.”
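As a sketch of how this split might be recorded, the fragment below assumes each hazard already carries a numeric risk score (as in the earlier risk_score sketch) and that the acceptance threshold is a policy decision; the threshold and hazard names used are illustrative only. It also omits the severity override discussed above, under which any hazard with a credible severe or fatal outcome is treated as unacceptable regardless of its score.

    def partition_hazards(hazards, threshold=1.0):
        """Split (name, score) pairs into acceptable and unacceptable lists."""
        acceptable, unacceptable = [], []
        for name, score in hazards:
            (unacceptable if score >= threshold else acceptable).append(name)
        return acceptable, unacceptable

    ok, not_ok = partition_hazards([("guarded coupling", 0.2), ("open drive belt", 3.5)])
    print("acceptable:", ok)        # ['guarded coupling']
    print("unacceptable:", not_ok)  # ['open drive belt']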


Hazard Control

The primary purpose of engineering and the design of products and facilities is the physical “control” of various materials and processes to produce a specific benefit. The central purpose of safety engineering is the control of system “hazards” which may cause system damage, system user injury, or otherwise decrease system benefits. Current and historic safety engineering references have advocated a specific order or priority in which hazards are best controlled.

For decades, it has been well established by the authoritative safety literature (as well as by logic and sound engineering practice) that, in the order of preference and effectiveness, regardless of the system being examined, hazards are first to be controlled through (a) “hazard removal,” followed by (b) the use of “physical safeguards,” and then, after all reasonable opportunities have been exhausted related to hazard removal and safeguarding, (c) remaining hazards are to be controlled through the development and use of adequate warnings and instructions (to include prescribed work methods and procedures).
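The order of preference itself can be expressed as a simple selection rule. The sketch below assumes each hazard has already been assessed for which control types are actually feasible; the labels are shorthand for the three approaches named above and are not drawn from any standard.

    CONTROL_PRIORITY = ["eliminate", "safeguard", "warn_and_instruct"]

    def select_control(feasible_controls):
        """Return the most preferred control type that is actually feasible."""
        for control in CONTROL_PRIORITY:
            if control in feasible_controls:
                return control
        raise ValueError("no recognised control is feasible; the hazard remains uncontrolled")

    # Example: elimination is not feasible, so safeguarding is chosen
    print(select_control({"safeguard", "warn_and_instruct"}))  # -> safeguard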

Rule 1

The first cardinal rule of hazard control (safe design) is “hazard elimination” or “inherent safety.” That is, if practical, control (eliminate or minimize) potential hazards by designing them out of products and facilities “on the drawing board.” This is accomplished through such interrelated techniques as hazard removal, hazard substitution, hazard attenuation, and/or hazard isolation, applying the principles and techniques of system and product safety engineering, system and product safety management, and human factors engineering from the concept and initial planning stages of the system design process.

Rule 2

The second cardinal rule of hazard control (safe design) is the minimization of system hazards through the use of add-on safety devices or safety features engineered or designed into products or facilities, also “on the drawing board,” to prevent the exposure of product or facility users to inherent potential hazards or dangerous combinations of hazards; this approach is called “extrinsic safety.” Examples of such devices include shields or barriers that guard or enclose hazards, component interlocks, pressure relief valves, stairway handrails, adequate lighting, and passive vehicle occupant restraint and crashworthiness systems.

Passive vs. Active Hazard Controls

A principle that applies equally to the first two cardinal rules of safe design is that of “passive vs. active” hazard control. Simply, a passive control is a control that works without requiring the continuous or periodic involvement or action of system users. An active control, in contrast, requires the system operator or user to “do something” before system use, or continuously or periodically during system operation, in order for the control to work and injury to be avoided. Passive controls are “automatic” controls, whereas active controls can be thought of as “manual” controls. Passive controls are unquestionably more effective than active controls.

Rule 3

The third cardinal rule of hazard control (safe design) is the control of hazards through the development of warnings and instructions; that is, through the development and effective communication of safe system use (and maintenance) methods and procedures that first warn persons of the associated system dangers that may potentially be encountered under reasonably foreseeable conditions of system use, misuse, or service, and then instruct them regarding the precise steps that must be followed to cope with or avoid such dangers. This third approach must only be used after all reasonably feasible design and safeguarding opportunities (first and second rule applications) have been exhausted.

We hope that this edition of Hazardous Engineering Solutions magazine provides you with some interesting products and services that can aid you in the implementation of successful and safe rules of practice.