American Journal of Industrial and Business Management
Vol. 09, No. 05 (2019), Article ID: 92478, 11 pages
DOI: 10.4236/ajibm.2019.95078

Risk Management Instruments, Strategies and Impacts in the Complex Organizations

Federico De Andreis, Marco Florio

Università degli Studi Giustino Fortunato, Benevento, Italy

Copyright © 2019 by author(s) and Scientific Research Publishing Inc.

This work is licensed under the Creative Commons Attribution International License (CC BY 4.0).

http://creativecommons.org/licenses/by/4.0/

Received: April 20, 2019; Accepted: May 17, 2019; Published: May 20, 2019

ABSTRACT

Even if they do not realize it, all types of organizations probably employ some kind of risk management. Over time, procedures are developed to make sure that things do not go wrong, and plans are put in place to reduce the organizational impact if they do. This article examines the relevance of risk management in complex organizations. After defining risk, this research uses case studies to identify the instruments of risk analysis in complex organizations, showing why risk management should be treated as a priority regardless of the magnitude of the negative outcomes.

Keywords:

Risk, Risk Analysis, Risk Management, Risk Assessment, Organization

1. Introduction

On the night of 1 July 2002, a Bashkirian Airlines Tupolev Tu-154 and a DHL Boeing 757 cargo jet collided in the skies over Überlingen, a southern German town on Lake Constance, near the border between Switzerland and Germany. Both aircraft crashed to the ground, killing all occupants: 69 passengers and crew aboard the Tupolev and the 2 crew members of the Boeing. The catastrophe would be remembered as the Überlingen air disaster.

Less than a year earlier, on 8 October 2001, another disaster had occurred at Linate Airport in Milan, Italy, when a Scandinavian Airlines McDonnell Douglas MD-87, carrying 110 people bound for Copenhagen, Denmark, collided on take-off with a Cessna Citation CJ2 business jet, carrying four people bound for Paris, France, which had entered the runway without permission from an intermediate taxiway. All 114 people on both aircraft were killed, as well as four people on the ground.

It remains the deadliest accident in Italian aviation history. The investigation revealed that the collision was caused by a number of non-functioning and non-conforming safety systems, standards and procedures at the airport.

Both accidents are considered among the most serious organizational accidents in aviation and among its major failures of safety. They were caused by a chain of operational and latent errors, revealing the precarious state of aviation safety in Europe in those years: in particular, unsafe procedures and practices tolerated for years turned apparently routine operations into two of the most catastrophic accidents in aviation, with dozens of victims.

This paper aims to understand, through these two case studies, the multitude of factors that cause errors and negative outcomes, in order to clarify how risk can be reduced in a complex organization such as aviation.

The goal of the article is to demonstrate the importance of risk management in organizations. Risks can in fact be viewed in two ways: the person approach and the system approach. Each has its own model of error causation, and each model gives rise to quite different philosophies of error management.

Understanding these differences has important practical implications for coping with the ever-present risk of mishaps in organizational practice.

The article is organized as follows. Section 2 analyzes risk, its definition and its meanings, because understanding the components of a risk allows an organization to manage it effectively; this section also distinguishes between objective and subjective risk. Section 3 introduces the role of the human factor and decision making as the most critical aspects of complex organizations. Section 4 then connects aviation safety, and risk prevention in complex organizations more generally, with the substantial body of studies on risk analysis and human factors. Sections 5 and 6 describe in depth the two case studies introduced above, the Überlingen and Linate accidents, examining the dynamics, causes and systemic flaws that led to those disasters. Finally, Section 7 presents our conclusions, explaining the link between the dynamics that can lead to critical events and human reality in every system or organization. In that section we also investigate the role of risk management, with its instruments and strategies, as a priority for all organizations seeking to reduce the probability of an adverse event and to mitigate the consequences of any potential risk.

2. What Is Risk? Definition and Meanings

The term risk denotes the potential of a chosen action or activity, including the choice not to act, to lead to a loss or an unwanted event: the possibility of suffering damage, associated with a condition that is more or less predictable [1].

We can state that an organization’s purpose is to create value by interacting with its environment (customers, suppliers, technology, competition, markets, government, etc.); value is created by providing goods and services that fulfill the needs of the organization’s customers or constituents. We can also affirm that risk is the property that causes value to vary in uncertain ways. The source of risk is change in the environment, since the environment represents a complex set of relationships and interactions among organizations and other elements.

The complexity and fluidity of these interactions create uncertainty: no organization can completely control, influence or foresee all possible changes in its environment, and risk management therefore represents a fundamental issue for all complex organizations. Thus, companies increasingly focus on identifying risks and managing them before they affect the business.

The ability to manage risk helps companies act more confidently on future business decisions; knowledge of the risks they face gives them various options for dealing with potential problems. An important role in risk prevention therefore belongs to risk analysis.

Thus, we can state that risk is the consequence of a specific incident, specified by its severity and by its probability of occurrence. Determining these two factors is not easy, and the result is often affected by the subjectivity of the analyst.
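A compact way to formalize this two-factor definition, offered here as an illustrative convention rather than a formula taken from the text, is to treat risk as the product of probability and severity:

```latex
% Illustrative two-factor risk index (an assumed convention, not ICAO's formula):
% R grows with both the probability P of the event and its severity S.
R = P \times S, \qquad \text{e.g.}\quad P = 10^{-3},\; S = 4 \;\Rightarrow\; R = 4 \times 10^{-3}.
```

In practice, as the following paragraphs show, aviation replaces this continuous product with a discrete matrix of probability and severity classes.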

In aviation, ICAO Doc 9859 (Table 1) defines severity as the extent of damage that could reasonably occur as a consequence of an identified risk; Table 1 presents five levels of severity, from highest to lowest (catastrophic, major, moderate, minor and insignificant), customized according to the system or the events taken into account.

The ICAO documents also give definitions for each category of severity. The probability of occurrence is the more difficult factor to determine, because the events are not only of a technical nature, such as the failure of a switch, for which analytical methods of reliability estimation exist; often the events are related to the behavior of the operators.

Table 1. Levels of severity, ICAO-Doc 9859.

*Customize according to the nature of product or service provider’s operations.

To determine this probability, the opinion of experts in the industry is used; in addition, various methodologies have been developed to assess human behavior in an incident and, therefore, the probability of the occurrence of the actions involved.

Thus, when assessing risk, probability can sometimes be replaced by the frequency of occurrence (Table 2): the former, being a probability, is represented by a number between 0 and 1, while the latter is expressed as a number of occurrences in a given time interval (for example, once a month or three times a year).
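The text leaves the conversion between the two scales implicit. Under the standard, and here assumed, model of a constant average event rate (a Poisson process), a frequency converts to a probability as in this minimal Python sketch:

```python
import math

def frequency_to_probability(rate_per_year: float, years: float = 1.0) -> float:
    """P(at least one occurrence in the interval), assuming a Poisson process
    with a constant average rate; the result is always between 0 and 1."""
    return 1.0 - math.exp(-rate_per_year * years)

# "three times a year" expressed as a probability over one month:
print(frequency_to_probability(3.0, years=1.0 / 12.0))  # ~0.22
```

This also makes the difference concrete: a frequency can exceed 1 (three per year), while the corresponding probability never can.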

Table 2 represents a matrix of risk probability and risk severity. Risk probability is likewise divided into five levels (frequent, occasional, remote, improbable and extremely improbable).

The combination of frequency (probability) and severity leads to the definition of the risk matrix: depending on where the risk of the event is placed, it may be acceptable (1A, 1B, 1C, 1D, 1E, 2D, 2E, 3E), may require intervention (2A, 2B, 2C, 3B, 3C, 3D, 4C, 4D, 4E, 5D, 5E) or may be unacceptable (3A, 4A, 4B, 5A, 5B, 5C). In order to define the risk matrix, a tolerability curve can be used; this curve separates the region in which the risk is tolerable from the region in which it is no longer acceptable. The importance of a tool such as the risk matrix lies in the fact that it is a fundamental means of risk assessment, in both prospective and retrospective analyses.
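The classification just described can be made concrete in a short sketch. The cell groupings below are taken from the paragraph above; the encoding of each cell as a digit plus a letter (5 = frequent down to 1 = extremely improbable; A = catastrophic down to E = insignificant) follows the usual ICAO Doc 9859 convention and is an assumption of this example:

```python
# Risk matrix regions as listed in the text above.
ACCEPTABLE = {"1A", "1B", "1C", "1D", "1E", "2D", "2E", "3E"}
INTERVENTION = {"2A", "2B", "2C", "3B", "3C", "3D", "4C", "4D", "4E", "5D", "5E"}
UNACCEPTABLE = {"3A", "4A", "4B", "5A", "5B", "5C"}

def assess(probability: int, severity: str) -> str:
    """Map a (probability level, severity letter) pair to its tolerability region."""
    cell = f"{probability}{severity}"
    if cell in ACCEPTABLE:
        return "acceptable"
    if cell in INTERVENTION:
        return "requires intervention"
    if cell in UNACCEPTABLE:
        return "unacceptable"
    raise ValueError(f"unknown risk matrix cell: {cell}")

# A remote (3) but catastrophic (A) event falls in the unacceptable region:
print(assess(3, "A"))  # -> unacceptable
```

The three sets cover all 25 cells of the 5 × 5 matrix, a useful sanity check when the matrix is customized for a particular operator.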

Risk management includes, besides calculating the risk, the capability to predict events and to contain their consequences through training, information and organization.

Every individual is exposed to different information, coming, for example, from the senses, from memory, from interpersonal relationships, from age and from social context.

Through the cognitive process, each individual processes this information, constructing a representation of reality. The perception of reality is mediated by a social structure, and the process of perceiving risks works in the same way.

Table 2. Risk probability and severity, ICAO-Doc 9859.

At this point it is necessary to distinguish between objective and subjective risk. An objective risk can be assessed using mathematical calculation (probability), while subjective risk is based on perceptive capabilities, since it tends to focus primarily on specific risks [2].

Thus, risk can be considered a quantifiable concept in the same way as the objective probability of an event: it can change as the objective criteria related to the event itself or to the environment change.

On the contrary, risk becomes subjective, i.e. “uncertainty”, when events are classified not by objective probability but by subjective probability, in the form of an individual “degree of belief”.

In this case, risk expresses the perception that individuals have of changing aspects of the environment: since they do not have access to complete information, they must develop hypotheses and associate each of them with a chance of occurrence using probability theory.
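The text does not commit to a particular probabilistic mechanism; Bayesian updating is one standard way, assumed here purely for illustration, to formalize how such a “degree of belief” is revised as new information arrives:

```latex
% Bayes' rule: the degree of belief in hypothesis H_i, given evidence E,
% is the prior P(H_i) reweighted by how well H_i predicts E.
P(H_i \mid E) = \frac{P(E \mid H_i)\, P(H_i)}{\sum_j P(E \mid H_j)\, P(H_j)}
```

Each hypothesis about the environment starts from a prior degree of belief and is strengthened or weakened according to how well it predicts what is actually observed.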

To summarize, we can define objective risk as a scientific calculation, given by the severity of the damage and the probability of its occurrence; subjective risk, by contrast, is the perception of a possible danger that does not correspond to what is mathematically calculable, but relies on other parameters related to the characteristics, experiences and environment of the individuals [3].

3. The Role of Human Factor

We can state that subjective risk is different for each individual: some may perceive the same risk as negligible, others as acceptable, others as tolerable or as completely unacceptable.

A risk may be accepted or rejected depending on several factors: involuntary exposure to a risk, an inability to keep it under control, unfamiliarity with the situation or the environment, or an analogous recent experience will amplify the perception of a given risk, while familiarity with the environment, the idea that the exposure is voluntary, a possible benefit, or the idea that the risk can be controlled decreases this perception [4].

Studies of risk therefore center on social psychology, on the perception of risk and on voluntary exposure to risk. Risk assessment can thus be seen as a process whose purpose is to provide a formal representation of the possibility of damage related to a given risk.

The evaluation, therefore, aims to allow a decision to be taken on the basis of a clear representation of the damage that a system can cause.

Management is also very important in avoiding risk; it is connected with the public acceptability of the risk itself, which rests on the trust people place in whoever runs the risk and on the observation that a risk becomes more acceptable when it is connected to benefits.

In complex organizations, such as aeronautical organizations, human resources are the most critical factor.

The analysis of the cognitive processes that underlie decision making constitutes one of the principal areas of evaluation in risk management, and operating methods should likewise be considered very carefully. Among the cognitive processes that can result in errors, the “decision” plays a fundamental role: the choice of an action, among several options considered, by an individual or group (the decision maker).

The process leading to the decision is “decision making”, and it becomes vital to have “situational awareness”, achievable through the sharing of the information in the individuals’ possession, which leads them to the best choice or option [5].

Decision making is always connected to the objective to be achieved and is represented by a suitable choice among a range of options. When individuals (the human resources) have to decide, however, they often face problematic situations, i.e. a large range of alternatives.

The differences between the alternatives are expressed not in terms of right or wrong but in terms of probability of success or failure. When choosing an alternative, risk perception comes into play, since an option that would be discarded under normal conditions can become the chosen option (accepting high levels of risk) if the decision makers are subjected to high-stress situations. The way to prevent errors lies in knowledge of these mechanisms and in the ability to confront and observe all the signals, in order to take the most appropriate decision and reduce the chance of error [6].

With these objectives, in professions that involve a high degree of risk and where human error can have serious consequences, such as in aeronautical organizations, procedures have been set up to develop the right attitude toward risk management and to reduce the risk itself.

4. Errors in Complex Organizations

To be effective, aviation safety must be able to prevent the occurrence of any incident or, more realistically, of as many as possible. Aviation safety does not mean “freedom from danger and risk” but, more correctly, a situation in which risk is limited and acceptable.

In fact, the total absence of risk in aviation is virtually impossible, since it would paradoxically mean that “the safest flight is the one that never departs”. This statement calls for an analysis of what can be defined as risk and of who is responsible for its assessment.

If, in fact, we can easily give a definition of risk based on a mathematical function, it is not so simple to determine the threshold below which the risk can be tolerated.

Identifying this threshold falls to political and economic authorities, and guaranteeing people’s safety and health is the end to which they should aspire. However, providing suitable instruments often comes into conflict with economic interests, given the high cost of a highly secure and efficient system.

It is therefore fundamental to run a cultural campaign aimed at raising awareness of all issues relating to safety, for example by highlighting that the economic damage caused by a plane crash is potentially far greater than the costs required for its prevention.

A significant number of studies refer to risk analysis and human factors. For example, Reason (1990), analyzing the human factor as a risk factor in complex systems and organizations, observed that the operational errors of front-line personnel in a disaster were not directly determined by those people themselves but by conditions generated by silent weaknesses, often present in the system for many years [7].

Starting from this observation, he proposed a model of error analysis inspired, metaphorically, by the appearance of Swiss cheese. This model (known as the “Swiss cheese model” or latent-error theory) is based on the search for latent conditions within the random sequence of events.

As we have seen, each incident or accident is generated by the strong interconnection of active and latent errors.

Active errors are mistakes and deliberate violations of various kinds, with immediate effect. They arise in front-line activities, i.e. those functions through which the organization establishes a direct connection with the user or client.

Latent errors reflect decisions and/or actions that remain silent, even for a long time, and become visible only when, combined with local factors, they break through or exceed the system’s defenses and cause the incident.

We can state, therefore, that risks do not result from the actions of an individual but from the accumulation of latent conditions within the managerial and organizational spheres.

Reason thus represented clearly and precisely the meaning of “organizational failure”, understood as a situation in which an error can occur regardless of the person involved, using the Swiss cheese metaphor (Figure 1) [8].

Figure 1 exhibits multiple slices of Swiss cheese, stacked side by side, in which the risk of a threat becoming a reality is mitigated by the different layers and types of defenses “layered” behind each other. In this model, an organization’s defenses against failure are modeled as a series of barriers, represented as slices of the cheese. The holes in the slices represent individual weaknesses in individual parts of the system, continually varying in size and position across all slices. The system as a whole produces failures when the holes in all of the slices momentarily align, permitting “a trajectory of accident opportunity”, so that a hazard passes through the holes in all of the defenses, leading to an error or a failure of the organizational system.

Figure 1. Reason’s Swiss cheese model [9].
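The model’s central claim, that independent layers with small, shifting weaknesses still leave a nonzero accident rate, can be illustrated with a minimal Monte Carlo sketch; the per-layer breach probabilities below are invented for illustration and do not come from Reason or from the accident reports:

```python
import random

def accident_probability(breach_probs, trials=200_000, seed=42):
    """Estimate P(hazard passes every layer) when layers fail independently."""
    rng = random.Random(seed)
    hits = sum(
        all(rng.random() < p for p in breach_probs)  # holes align in every slice
        for _ in range(trials)
    )
    return hits / trials

# Four defensive layers (e.g. procedures, supervision, warnings, crew):
layers = [0.2, 0.1, 0.25, 0.2]
print(accident_probability(layers))   # ~0.001, close to the analytic product
print(0.2 * 0.1 * 0.25 * 0.2)         # 0.001
```

Because the layers are independent in this sketch, the estimate converges to the product of the individual breach probabilities, which is why adding even a modestly reliable extra layer of defense reduces the accident rate multiplicatively.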

5. The Überlingen Accident

Bashkirian Airlines Flight 2937 was a charter flight, operated by a now-defunct Russian airline, from Moscow, Russia, to Barcelona, Spain, carrying sixty passengers and nine crew.

DHL Flight 611, a Boeing 757 cargo flight, was en route from Bergamo, Italy, to Brussels, Belgium. On the night of 1 July 2002, both aircraft were flying at flight level 360 (10,973 meters; 36,000 feet) and were on a collision course.

The airspace was controlled from Zürich, where a single air traffic controller was handling two workstations at the same time. Partly due to the added workload, and partly due to delayed radar data, he did not recognize the problem in time and thus failed to keep the aircraft at a safe distance from each other. Less than a minute before the accident he realized the danger and contacted Flight 2937, instructing the pilot to descend to flight level 350 to avoid collision with crossing traffic (the DHL flight). The Russian crew immediately initiated the descent. Their TCAS (traffic collision avoidance system) then instructed them to climb, while at about the same time the TCAS on Flight 611 instructed the pilots of that aircraft to descend.

We could state that if both aircraft had followed those automated instructions, the collision would not have occurred.

The pilots of the Boeing, Flight 611, followed the TCAS instruction and initiated a descent, but could not immediately inform the Zürich air traffic controller, because he was dealing with Flight 2937. About eight seconds before the collision, Flight 611’s descent rate was not quite as rapid as the range advised by its TCAS; meanwhile, the Bashkirian pilot disregarded his jet’s TCAS instruction to climb, having already commenced the descent instructed by the controller. Thus, both planes were now descending. Unaware of the TCAS-issued alerts, the controller repeated his instruction to Flight 2937 to descend, giving the Tupolev crew incorrect information as to the position of the DHL plane: he told them that the Boeing was to the right of the Tupolev when it was in fact to the left. Eight seconds before the collision, Flight 2937’s crew finally realized the danger when they gained visual contact with Flight 611 incoming from the left; Flight 611, in response, increased its descent rate. Two seconds before the collision, Flight 2937’s pilots attempted to put the aircraft into the climb their TCAS had commanded, but the collision was by then inevitable. The aircraft collided at 23:35:32 local time [10].

6. Linate Airport Disaster

On 8 October 2001, an accident occurred at Linate Airport in Milan, Italy, in thick fog, with visibility reduced to less than 200 meters.

A Cessna Citation business jet was instructed to taxi from the western apron along the northern taxiway and then via the northern apron to the main taxiway, which runs parallel to the main runway, a route that would have kept it clear of the main runway. Instead, disoriented by the fog, the pilot taxied along the southern taxi route, crossing the main runway toward the main taxiway that lay beyond it.

At 08:09:28, the Scandinavian Airlines MD-87 was cleared by a different controller to take off from the main runway. Fifty-three seconds later, the Scandinavian aircraft, traveling at about 150 knots, collided with the Cessna. One of the four people in the Cessna was killed on impact; the remaining three died in the subsequent fire. The MD-87 lost its right engine, and the pilot attempted to take off, reaching an altitude of approximately 12 meters (40 feet).

The remaining engine lost thrust due to debris ingestion, and the plane, having lost its starboard landing gear, came down. The pilot applied the thrust reverser and brakes and tried to steer the plane with its control surfaces, but this was insufficient to halt the jet’s momentum, and it crashed into a baggage hangar near the runway’s end at a speed of approximately 136 knots (252 km/h; 157 mph). All the MD-87’s crew and passengers were killed in the impact. The crash and subsequent fire also killed four ground personnel in the hangar and injured four more.

Linate Airport was operating without a functioning ground radar system at the time, despite having had a new system delivered some years beforehand. The previous system had been decommissioned, but the replacement had not been fully installed; the new system finally came online a few months later. Guidance signs along the taxiways were obscured or badly worn, and were later found not to meet regulations. After the pilots mistakenly turned onto the taxiway that led to the runway, there were no signs by which they could recognize where they were. When they stopped at a taxiway stop-marking and correctly reported its identifier, the ground controller disregarded this identification because it was not on his maps and was unknown to him. Motion-sensing runway incursion alarms, although present, had been deactivated to prevent false alarms from vehicles or animals. The ground controller’s verbal directions used terminology to designate aprons, taxiways and runways that did not match the on-the-ground signage and labels. Lastly, neither pilot of the Cessna was certified for operations with visibility below 550 meters (1800 ft), but they had landed at the airport anyway shortly before, as the fog was thickening [11].

7. Conclusions

This analysis highlights the dynamics that can lead to critical events, as seen in the two case studies, the Überlingen and Linate accidents; it also points out the importance of reducing risk and maintaining it at an acceptable level.

Überlingen and Linate represent two active errors that originated in deeper latent, organizational errors.

In a complex system, as highlighted above, certain factors can create the conditions for accidents with hundreds of victims; yet error is inevitable, because it is a component of human reality, and in every system or organization circumstances favoring its occurrence can arise.

It is therefore essential to create conditions that reduce both the possibility of a mistake and the consequences of an error once it has occurred [12].

Organizational risk is the potential for losses due to uncertainty. It is a term for risk at the top level of an organization, encompassing material, strategic, reputational, legal, security and operational risks.

Managing risk is fundamental for all kinds of organizations. In aviation a risk can generate a fatal accident, but it is also true that, in other types of organizations, a risk can produce losses, failures or economic crises that should not be underestimated.

For an organization, the objective to be achieved is to reduce and/or control risk, so as to lower the probability of an adverse event and to mitigate the consequences of any potential risk [13].

The first step in creating an effective risk-management system is therefore to understand the qualitative distinctions among the types of risk that organizations face: internal, strategic and external.

Internal risks would not cause severe damage to the enterprise, but in general they should be eliminated, since companies gain no strategic benefit from taking them (examples are the risks from employees’ and managers’ unauthorized, illegal, unethical, incorrect or inappropriate actions, and the risks from breakdowns in routine operational processes). Strategic risks, in turn, are risks a company voluntarily accepts in order to generate superior returns from its strategy. Finally, external risks arise from events outside the company and beyond its influence or control; since organizations cannot prevent such events from occurring, their management must focus on identifying them and mitigating their impact. Risk events can be fatal to a company’s strategy and even to its survival in business.
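A minimal sketch, assuming nothing beyond the three categories and responses described above (the identifiers and response strings are paraphrases of the text, not terms from a standard), might encode the taxonomy like this:

```python
from enum import Enum

class RiskCategory(Enum):
    INTERNAL = "internal"    # preventable; no strategic benefit in taking them
    STRATEGIC = "strategic"  # voluntarily accepted for superior returns
    EXTERNAL = "external"    # outside the company's influence or control

# Management response associated with each category, per the text above.
RESPONSE = {
    RiskCategory.INTERNAL: "eliminate (rules, monitoring, compliance)",
    RiskCategory.STRATEGIC: "accept and manage as part of the strategy",
    RiskCategory.EXTERNAL: "identify early and mitigate the impact",
}

for category in RiskCategory:
    print(f"{category.value:>9}: {RESPONSE[category]}")
```

Even this toy structure makes the management point explicit: the appropriate response is a function of the risk’s category, not of its severity alone.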

In order to promote a safety culture, organizations should establish a systematic strategy of communication and training, starting with a preliminary investigation to identify any weakness in the system and to learn which specific aspects should be improved. Risk management thus becomes a priority, and all organizations must dedicate resources and incentives to it, also by creating an environment in which both the responsibility to report errors (a just culture) and learning from mistakes are encouraged.

Training, in an efficient risk-management culture, means analyzing the factors that combine to produce errors and reducing their risk through an education that leads people to accept and discuss them.

Conflicts of Interest

The authors declare no conflicts of interest regarding the publication of this paper.

Cite this paper

De Andreis, F. and Florio, M. (2019) Risk Management Instruments, Strategies and Impacts in the Complex Organizations. American Journal of Industrial and Business Management, 9, 1157-1167. https://doi.org/10.4236/ajibm.2019.95078

References

  1. Adams, J. (1995) Risk. UCL Press, London, 19-24.

  2. De Andreis, F. (2017) Teamwork e risk management nelle professioni sanitarie. Natan Editore, Rome, 1-10.

  3. Boeing (2012) Statistical Summary of Commercial Jet Airplane Accidents. Worldwide Operations 1959-2012, Seattle.

  4. Nickerson, R.S. (1998) Confirmation Bias: A Ubiquitous Phenomenon in Many Guises. Review of General Psychology, 2, 175-220. https://doi.org/10.1037/1089-2680.2.2.175

  5. Endsley, M.R. (1995) Toward a Theory of Situation Awareness in Dynamic Systems. Human Factors, 37, 32-64. https://doi.org/10.1518/001872095779049543

  6. Beck, U. (2003) Risikogesellschaft. Suhrkamp, Berlin, 39-62.

  7. Reason, J. (1990) Human Error. Cambridge University Press, Cambridge. https://doi.org/10.1017/CBO9781139062367

  8. Reason, J. (1997) Managing the Risks of Organizational Accidents. Ashgate, Farnham.

  9. Reason, J. (2000) Human Error: Models and Management. British Medical Journal, 320, 768-770. https://doi.org/10.1136/bmj.320.7237.768

  10. Bundesstelle für Flugunfalluntersuchung (2004) Investigation Report on the Überlingen Accident, 9-17.

  11. Agenzia Nazionale per la Sicurezza del Volo (2004) Final Report. Accident Involving Aircraft Boeing MD-87, Registration SE-DMA, and Cessna 525-A, Registration D-IEVX, Milano Linate Airport, 8 October 2001, 1-77.

  12. Helmreich, R. and Foushee, H. (1993) Why Crew Resource Management? Empirical and Theoretical Bases of Human Factors Training in Aviation. In: Wiener, E., Kanki, B. and Helmreich, R., Eds., Cockpit Resource Management, Academic Press, San Diego, 4-53.

  13. Jenkins, D.P., Walker, G.H., Stanton, N.A. and Salmon, P.M. (2012) Distributed Situational Awareness: Theory, Measurement and Application to Teamwork. Ashgate, Farnham.