Saturday, February 4, 2023

The Inverted Iceberg


By IceRoadPilots

The iceberg effect is a variant of the Heinrich Pyramid, which was developed in the 1930s. Herbert W. Heinrich put forward the concept that, in a workplace, for every accident that causes a major injury, there are 29 accidents that cause minor injuries and 300 accidents that cause no injuries. Heinrich's Law was widely accepted by the global aviation industry as a risk analysis tool and was incorporated into the safety management system.

The Heinrich Pyramid, commonly known as the safety pyramid or the safety triangle, indicates a relationship between major injuries, minor injuries, and near-misses. The Heinrich Pyramid concludes that injuries and incidents are caused by a human decision to perform an unsafe action, and that by lowering the number of minor injuries, businesses can reduce the total number of major injuries and incidents. While the most often cited figures suggest an emphasis on human error, Heinrich also suggested that workplaces focus on hazards, not just worker behavior.
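The 1-29-300 relationship is easy to sketch as a quick calculation. The figures below are illustrative only, assuming the classic Heinrich proportions; the function name is mine, not part of any safety standard:

```python
# Heinrich's classic 1-29-300 ratio: for every accident causing a major
# injury, 29 cause minor injuries and 300 cause no injuries.
HEINRICH_RATIO = {"major_injury": 1, "minor_injury": 29, "no_injury": 300}

def expected_events(major_injuries: int) -> dict:
    """Scale the 1-29-300 ratio by an observed count of major-injury accidents."""
    return {level: count * major_injuries for level, count in HEINRICH_RATIO.items()}

print(expected_events(2))
# {'major_injury': 2, 'minor_injury': 58, 'no_injury': 600}
```

Reading the ratio this way makes the pyramid's implicit claim explicit: the counts at each level are assumed to scale together, which is exactly the assumption questioned later in this article.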

A variant of the Heinrich Pyramid principle, commonly known as the iceberg of ignorance, was later designed as the Iceberg Model. In the iceberg model, only four out of every hundred operational problems are known to the accountable executive (AE), while 96% of issues, or hazards, are hidden from the AE. The foundation of the iceberg theory is that 100% of problems are known to the frontline workers, such as flight crew, mechanics, or airport personnel, while only 4% are known to the AE.

The iceberg concept's principle is that all serious problems are the result of several smaller problems that went unnoticed or unmanaged. For every serious incident in the iceberg model, there were 59 smaller incidents and 600 minor conditions. Conventional wisdom is that an accountable executive, as the decision authority, needs to be more aware of minor issues and conditions and initiate actions to stop these issues or conditions before they lead to a serious incident. Preventing minor events from escalating into a full-scale disaster is one reason why the iceberg of ignorance matters to the AE, directors, supervisors, and frontline workers. An accountable executive needs to make a concerted effort to be aware of minor issues and conditions, and overcoming this gap happens primarily through changing the hazard reporting system requirements. There are several valid theories and conditions behind the iceberg principles, but there is a major flaw, or finding, in a system that relies on the assumption that knowledge of every minor event, or prevention of minor events, will stop future disasters from occurring.

When applying the near-miss principles from the Heinrich Pyramid, the Safety Pyramid, or the Iceberg Model, an SMS enterprise builds its SMS platform on a misconception that major accidents only occur after several minor incidents or near-misses have been identified. When operating within a human factors system, organizational system, supervision system, and environmental system, each individual person within each system performs independently of the other persons. Tasks are performed individually in a 3D environment, with tasks measured in time (speed), space (location), and compass (direction). A robot would complete the task within the same timeframe each time, it would initiate the task at the exact same location, and it would follow the exact same process every time. Within a mechanical production system, a process produces the same outcome every time, without learning from past errors. When relying on these two principles to establish SMS reliability, an SMS enterprise places itself in a box that is very difficult to crawl out of.

Several years ago, a flaw was discovered in a compressor turbine (CT) disk by applying non-destructive testing to the disk. This was the first test after the final production stage, after the disk had left the production line. The test discovered a flaw in the material, and the flaw was reported. There was no safety management system reporting avenue at that time, just an inspection report. This was also before the iceberg model was widely known and understood. This material flaw was the very first flaw in a CT disk that management knew of. Applying the safety pyramid as the accepted standard, 300,000 unsafe acts would be required, followed by 600 reports, or near-misses, before 30 incidents would occur, then 10 serious incidents, and finally one major accident. There have been several CT disk failures in turbine engines, with Sioux City, IA as a high-profile accident.
This does not imply that the same or a similar flaw was the cause, but that applying the safety pyramid principles keeps an SMS enterprise inside the box. Another question to answer is how the production of a CT disk creates an unsafe act. Since the safety triangle is based on 300,000 unsafe acts, there must be an astronomical number of unsafe acts daily. At a rate of one unsafe act per hour, 300,000 hours equals 12,500 days, or just over 34 years of systematic, undetected unsafe acts. For unsafe acts to go undetected, they must be few and hidden within a system that only the front-line workers know about. An example could be the speed at which large trucks travel. The opinion of what constitutes an unsafe speed varies from person to person, and it varies between jurisdictions. Each jurisdiction has its own speed limit set for large trucks, with a justification of safety. Applying the logic of the safety pyramid, the lower the speed limit, the faster the number 300,000 is reached, and the shorter the time between major trucking accidents.
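The timescale above can be checked with simple arithmetic. This is an illustrative sketch only, assuming (as the text implies) a rate of one undetected unsafe act per hour of operation:

```python
# Illustrative check of the 300,000-unsafe-acts timescale,
# assuming one undetected unsafe act per hour of operation.
unsafe_acts = 300_000
hours = unsafe_acts        # one act per hour (assumption)
days = hours / 24          # 12,500 days
years = days / 365         # just over 34 years

print(days, round(years, 1))
# 12500.0 34.2
```

The point the arithmetic supports is that, taken literally, the pyramid's base number implies decades of continuous, undetected unsafe acts before a single major accident.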

Another example is the publishing of airport NOTAM (notice to airmen). An airport operator has a tool in their toolbox to publish a NOTAM when there are issues, construction, or events at an airport. When a NOTAM is published, the airport must implement a counteraction to remain within the airport standards. I have seen, time after time, that airports publish a NOTAM but do not action their operations to remain in compliance with the regulations, airport standards, operational processes, or their SMS safety policy. Just recently, this winter, an airport operator operated for four days with a NOTAM stating that the runway was 100% ice-covered, with 1/8 inch of dry snow on top of the ice. Their next NOTAM reported 3 inches of snow on top of compacted snow. Since the temperature was well below freezing for several days and the ice was not removed, ice was still the base surface condition, but it went unreported. In addition to the ice and snow, the runway was not cleared to full width, leaving windrows 10 feet wide and 4 feet tall on each side of the runway. The runway remained open day and night with these conditions present. Again, applying the safety pyramid principle, an unsafe act four days out of the 180 days of the winter season gives an airport operator unlimited opportunities to operate with unsafe conditions. An airport general operating limitation is one-half statute mile visibility. An airport operations manual is a legal reference document between the airport operator and the regulator with respect to level of service. When the visibility is below one-half statute mile, the airport must close to remain compliant with its legal document. Another example of an unsafe airport operations condition is that this same airport remained open, day and night, with visibility below the legal limit.
When operating within a safety management system, an airport operator is required to ensure that their airport is suitable for the operation of aircraft, and sometimes this may include closing the airport. Eventually the evidence goes away, and the airport operator continues as if nothing ever happened. Both examples are current and true stories. True stories are good examples to learn from, but applying safety triangle principles keeps an airport operator, together with the regulator, inside a box that suits their comfort level and that they have no reason to crawl out of.

When applied by an SMS enterprise operating with processes to action unsafe conditions, non-regulatory-compliance processes, and hazards identified as an immediate threat to aviation safety, the Heinrich Pyramid, the Frank Bird Safety Pyramid, or the Edward T. Hall Iceberg Principle are tools to maintain conformance with safety management system principles. When actions are applied to unsafe conditions or non-conforming processes, these three triangles are turned upside down, since each unsafe or non-conforming condition becomes a learning experience to be addressed. When the triangles are inverted, other unsafe conditions and non-conformances, or conditions below the waterline, also become visible. The reason the global aviation industry needs its SMS is to start chipping away at the safety concerns that became visible when the triangles turned.

A compliance guidance document states it beautifully: an SMS enterprise needs to complete a review of the finding, or observation, and clearly identify what happened, how widespread it is within its own organization, where it occurred in the system, and whether it was a policy, process, procedure, or culture issue. It is not the intent for an SMS enterprise to reiterate the finding or observation, but rather that they do a factual review of the observation as it applies to their own organization. Their review includes a description of relevant factual information related to the non-compliance, identification of the enterprise system that led to the non-compliance, and identification of the policy, processes, procedures, practices, or organizational culture involved.

Other operators may learn from observations or findings that are shared with them, but corrective action of events is an internal performance task applicable to a single, specific organization only. In other words, sharing information is sharing information only; it is not sharing of corrective action plans.

OffRoadPilots


