American Journal of Industrial and Business Management
Vol.09 No.06(2019), Article ID:93169,31 pages
10.4236/ajibm.2019.96089

A Proposal for a Predictive Performance Assessment Model in Complex Sociotechnical Systems Combining Fuzzy Logic and the Functional Resonance Analysis Method (FRAM)

Hussein Slim, Sylvie Nadeau

Department of Mechanical Engineering, École de technologie supérieure (ÉTS), Montreal, Canada

Copyright © 2019 by author(s) and Scientific Research Publishing Inc.

This work is licensed under the Creative Commons Attribution International License (CC BY 4.0).

http://creativecommons.org/licenses/by/4.0/

Received: May 2, 2019; Accepted: June 21, 2019; Published: June 24, 2019

ABSTRACT

Modern sociotechnical systems exhibit dynamic and complex behavior, which can be difficult to anticipate, model and evaluate. The perpetually evolving nature and the emergent properties of such systems require a continuous re-evaluation of adopted safety and risk analysis methods to comply with arising challenges and ensure successful performance. One of the interesting methods proposed in recent years is the Functional Resonance Analysis Method (FRAM). FRAM adopts a systemic perspective to model sociotechnical systems characterizing non-linear relationships and quality of outcome arising from performance variability and functional resonance. This paper aims to further improve the framework and expand the spectrum of features provided by FRAM through the integration of fuzzy logic. Fuzzy logic offers adequate mathematical tools capable of quantifying qualitative concepts and uncertain information applying comprehensible inference systems based on human judgement. An example of a possible application scenario is included through a simulation of aircraft on-ground deicing operations. The preliminary results of this project present an approach to generate numerical indicators for the quality of outputs, which can allow for a more comprehensible representation of potential performance variability. The presented model, however, requires further validation and optimization work to provide more representative and reliable results.

Keywords:

FRAM, Fuzzy Logic, Aircraft Deicing, Safety Assessment, Performance Variability

1. Introduction

The dominant view in science before and at the beginning of the 20th century was that of mechanistic reductionism, which considered any system to be reducible to its parts and understandable in terms of mechanisms [1] . While this approach might be valid in the case of inanimate objects, it becomes inadequate as soon as living beings are involved [2] . The term “sociotechnical system” refers to a complex operational system, which consists of interactive social and technical components [3] . The social aspect of a sociotechnical system refers to humans as individuals or organizations. The technical aspect, on the other hand, refers to any form of technical element such as technological systems and devices, tools, resources or any equipment needed to execute the systemic functions. As a result of this definition, the concept of a sociotechnical system applies to the majority of complex systems in the world today and cuts across all domains and fields such as education, healthcare, economy, etc. [3] . Such systems are too complex to evaluate with simple methods, and they are hardly treatable by statistical methods since they exhibit organized behavior.

Modern sociotechnical systems are open systems embedded in their environments [4] . They consist of a large number of interactive components, whose collective behavior characterizes the emergent properties of the whole system. The interactions among the system components, and between the whole system and its environment, determine the success or failure of the system’s performance [5] . Those interactions can be linear causal relationships, which are mostly accounted for during design, or non-linear, dynamic and complex relationships, whose implications are difficult to predict at the design stage. Diagnostic problems, lack of information and the inability to determine theoretically how the different system parts would interact can result in overlooking possible risks and adverse implications [6] . The behavior of individuals or groups can change in response to emergent conditions. The emergence of successful performance depends on the level of understanding and the capability to manage variability and uncertainty [6] .

The work environment of aircraft ground deicing/anti-icing operations forms such a complex sociotechnical system, in which man and machine collaborate to perform a specific task. The influential factors that affect the quality of the system’s performance are variable. Operations are conducted in a dynamic and fast-paced environment under strict temporal constraints and in harsh meteorological conditions [7] [8] . The workload can be demanding and might affect workers psychologically and physiologically [9] [10] [11] [12] . Human factors such as age, sex, physical strength, knowledge level and experience significantly characterize the performance of workers [10] . Organizational factors such as providing adequate guidelines and resources are essential for an appropriate execution of the procedures in force [13] . Inadequate equipment and instructions might prevent workers from performing their tasks correctly; workers then tend to adjust their performance to cope with the shortage, e.g. the Scandinavian Airlines flight 751 crash in 1991 [14] . The provision of adequate management and monitoring processes ensures the execution of the procedures as designed. The provided training programs, salary systems, planning of working and rest hours, etc. are all among the factors that might have an impact on the deicing operations [9] . Precision and caution are continuously required. Clear coordination and communication between many parties across different organizational hierarchies and departments are necessary [8] . Imprecise communication can cause loss of time, affect performance and might even cause accidents on the ground, e.g. the accident of Royal Air Maroc in 1995 at the Montreal (Mirabel) International Airport, Quebec [15] . Those and many more factors make the working environment of deicing operations highly dynamic and complex.

As is the case in aviation generally, deicing operations are carried out by high-reliability organizations. Operational procedures are formulated and executed in a strict manner to ensure safe operations. The number of accidents and incidents in aviation is low in comparison to other systems, and the trend over the years shows a continuous improvement in performance and safety measures. In Canada, most deicing operations at large airports nowadays take place in centralized deicing pads [16] . The utilization of centralized deicing pads facilitates the simultaneous deicing of multiple airplanes, which requires precise coordination and clear communication between the flight crew, Air Traffic Control (ATC), the deicing team and the deicing tower [16] . Most incidents related to inadequate ground deicing activities occurred in the takeoff and climb-to-cruising-altitude phases [17] . The most critical period, in which incidents happened, was between December 10 and January 10 [17] . Although Ontario has the largest air traffic volume, only 5.3% of the incidents occurred there [17] ; the highest rates occurred in Quebec and British Columbia (26.3% each) [17] . Smaller airports and smaller aircraft types are more frequently involved in ground deicing accidents or incidents than larger airports and larger aircraft [17] .

Improving a system that already works reliably can be difficult, since the possibilities for things to go wrong are limited and not immediately obvious. High reliability translates into a limited amount of data for analysis and evaluation, because the adverse events from which conclusions could be drawn are rare. Despite the high safety standards and high reliability of such a system, the need to evaluate and improve does not diminish. The continuous development of applied technologies and the evolving nature of complex systems necessitate a continuous evaluation of the state of the system to maintain desired reliability and safety levels. New perspectives become necessary to cope with changes, and relying solely on traditional analysis methods could be insufficient. To the best of our knowledge, research in the area of aircraft deicing from a systemic perspective considering human factors (individual and organizational) is rare [11] [18] [19] . Studies mostly aim at determining optimal operational conditions and technical requirements for maintaining and advancing deicing procedures. Classical safety and risk analysis methods reduce the scope of the analyses to simple basic tasks to identify problems in specific parts of the system and evaluate the reliability of its components. However, understanding that sociotechnical systems are emergent and complex by nature, a more holistic approach becomes necessary to understand the system’s performance in its entirety. Knowing that most accidents are caused by the human factor, this aspect cannot be neglected or assigned less significance than technical and operational aspects.

Adopting a systemic approach would require the consideration of the above-mentioned factors, which is easier said than done. First, the scope of the analysis must be wide enough to allow for a systemic evaluation, which increases the number of variables to consider and thus the complexity of the analyzed context. Secondly, such factors can hardly be measured quantitatively and are best represented in terms of qualitative linguistic values. The high reliability and the low number of accidents and incidents in aviation generally, and in deicing specifically, make the composition of quantitative analyses more difficult. Some evaluation parameters and factors in the deicing context can be difficult to quantify. Linguistic scales present only an approximate evaluation of the observed variables, which results in imprecise and uncertain analysis results. Humans can have different concepts of the same linguistic terms and might therefore evaluate the significance of the measured variables differently.

This study is part of a years-long research program [6] [8] [9] [10] [11] [13] [16] [17] [20] [21] . The Functional Resonance Analysis Method (FRAM) is recommended in this paper as an adequate systemic analysis method, which can provide a fresh perspective in complement to classical analysis tools. In Section 2, we discuss two approaches to safety (Safety-I & Safety-II) and argue why adopting a Safety-II approach is necessary for the assessment of complex systems. The principles and steps of FRAM are shortly presented in Section 3 to illustrate its features and advantages. In Section 4, a review of several recent studies proposing improvements to the framework of FRAM is presented. The focus in this paper is mostly directed to the theoretical aspect and the main objective is to propose a possible approach for the integration of fuzzy logic into FRAM as a means of quantification. A brief overview of fuzzy logic and its features is provided in Section 5 and the proposed methodology is presented in Section 6. In Section 7, an application example is presented by modelling and simulating the aircraft deicing context in the FRAM Model Visualizer (FMV) and MATLAB. The influential factors were evaluated on a scale between 0 and 10 to anticipate possible variability in performance. The results of the simulation presented numerical quantifiers for the quality of functional performance, which can point to possible variability sources in the analyzed system. Finally, the obtained results are discussed to evaluate what conclusions one might draw and reflect on possible future research to improve and validate the proposed model. In the following section, we make the case why adopting a systemic approach is necessary to ensure safety in aircraft deicing or any form of sociotechnical system.

2. From Safety-I to Safety-II: The Case for FRAM

Safety can be defined as “the system property or quality that is necessary and sufficient to ensure that the number of events that could be harmful to workers, the public, or the environment is acceptably low” [22] . Historically, traditional risk and safety management focused on identifying what can go wrong and lead to adverse outcomes. Accident analyses and annual statistical reports concentrate on what went wrong (losses in lives and material, root causes, adverse conditions, etc.), and as a result measures to prevent the occurrence of such events in the future are adopted. The dominant view in safety management, with its focus on adversity, stems from the human need for certainty, to feel free of harm (psychological) and to be free of harm (practical) [22] . This approach has proved successful so far, considering that the number of accidents and fatalities is continuously declining on a yearly basis [22] . However, as the number of accidents and incidents continues to decline, the way ahead to maintain desired safety levels and further reduce that number becomes more difficult (since what goes wrong also declines). Further insights might be needed to further improve the designed systems and maintain acceptable safety levels.

As a consequence of the human need to be free of unacceptable risk, safety was defined as a “dynamic non-event” [22] and was therefore evaluated as a result of its absence rather than as a quality itself [23] . The occurrence of an unwanted event was explained in terms of linear causal relationships, which defined the outcome as the direct result of errors, failures or inadequate circumstances. This philosophy defined what is known as the Safety-I approach.

Safety-I takes a simple-system approach to analyzing systems. Simple systems are characterized by linear causal relationships and predictable behavior [1] . Systems in Safety-I are decomposable into their parts, and the relationships among those parts are well defined and understood [23] [24] . The design process is assumed to account for any type of risk that might occur, and work is usually executed as imagined. However, upon examining the characteristics of complex systems, one inevitably concludes that the above-mentioned characteristics do not apply.

Complex systems are self-organized, distributed systems, which are open to their environments [1] . The behavior of a complex system as a whole is non-deterministic and can be difficult to anticipate [25] . The relationships among the system components are mostly non-linear, which can cause the outcomes to be disproportionate to the inputs [1] . Complex systems are irreducible to their parts without losing their functional properties [25] . The elements or parts of a complex system are interdependent, and any change caused by one part of the system can have effects on other parts. The elements of a complex system, taken separately, do not show the same properties as the complex system as a whole. Only when they are put together do the collective behavior and properties of the whole system emerge, due to the interactions of those elements with or without external influence (self-organization). Due to this dynamic and fluctuating nature, the function of the whole system cannot be understood from simply and solely understanding the functions of its sub-parts [25] . The result is the inability to precisely predict and analyze the behavior of complex systems, which presents a barrier to system management and development.

Successful systemic performance depends on the level of understanding of the relationships among the components and the capability to manage variability. Focusing on one aspect, or analyzing each aspect separately without considering the emergent and complex properties of a sociotechnical system, would not provide a complete picture of the system status. Systems have become so complex nowadays that only domain experts are still capable of understanding their aspects and behavior [26] . The large scale and increased complexity of modern systems, along with the introduction of new types of hazards and risks, have also added to the severity and cost of failures [26] . Relying on traditional analysis methods, in which accidents and adverse outcomes are explained in terms of single errors, component failures or root causes, is not sufficient to explain the behavior of complex sociotechnical systems entirely [26] [27] . The scope of such an analysis would be limited to evaluating causal and linear relationships, while the emergent properties of a complex system and the resonance of dynamic non-linear relationships would not be covered. New systemic and holistic approaches are required that consider social factors in addition to mechanistic relationships. Those approaches should consider non-linear and dynamic relationships in addition to linear and sequential ones. Only by considering the properties of a complex sociotechnical system and looking at it holistically can a complete and comprehensive evaluation be provided. Adopting such an approach would allow for a better understanding of the characteristics and behavior of the system in question, which is necessary whether for design and development purposes, performance evaluation or safety and risk management.

An alternative approach would be to focus additionally on “what goes right”, i.e. the conditions of the system in question that ensure risk-free and optimal outcomes [22] . By changing the definition of what composes an event, safety can be defined as a dynamic event as well [22] . Just as an adverse outcome is an event, a successful outcome is an event as well. Things that go right to ensure that a task is carried out as intended should be considered. Upon examining actual performance and evaluating why things work out in practical applications, one eventually notices that the actual execution deviates from the foreseen procedures. This is the difference between “work-as-done” and “work-as-imagined” [22] . The task specifications from the theoretical or procedural end prescribe what should be done and how, while actual applications differ depending on the context of the application. Local adjustments are necessary each time to ensure that a function is executed successfully. This is due to the underspecified and partial understanding of reality at the procedural end [28] . Human thinking approximates and summarizes information in the form of labels, words and sentences to extract relevant information for the intended purposes [29] . The approximation of reality misrepresents its true nature; however, for human purposes, such approximations are sufficient to perform most of the tasks and functions that do not require a high degree of precision [29] . The human brain exploits this tolerance for imprecision and acquires only relevant information, from which it can construct a model that resembles the true nature of the phenomenon in question and describes the features that are necessary to perform the required tasks [29] . The brain thus limits the amount of information received through the human senses to a level at which it can process the acquired information. Traditional analysis methods fail to capture the fuzziness of human reasoning and behavior. They are therefore inadequate for analyzing humanistic systems [29] . A shift in perspective is necessary, and adopting a more holistic and systemic approach is required. The flexibility and local adjustments in performance are essential factors for the success of applications. Performance variability is natural and even required to comply with real-world conditions that were not covered in the procedures. The shift in perspective therefore requires looking proactively at what goes right in addition to what goes wrong. This approach is known as Safety-II [22] .

In summary, to avoid falling behind and cope with the growing complexity of modern sociotechnical systems, a shift in perspective is needed. Complex systems have to be considered as a whole to better understand the functional relationships of the systems in question. In addition to looking at what goes wrong (Safety-I), one should look at what goes right as well (Safety-II), especially in the case of highly reliable systems and the absence of statistics and sufficient data.

3. The Functional Resonance Analysis Method (FRAM)

FRAM was introduced by Erik Hollnagel in 2004 as a systemic accident investigation method. “The Functional Resonance Analysis Method describes system failures (adverse events) as the outcome of functional resonance arising from the variability of normal performance” [30] . The performance of a sociotechnical system is never carried out in reality as imagined or designed. Operational deviations from procedures are normal and are sometimes required to perform successfully. The variability of performance depends on the contextual conditions present at the time of execution, which results in altering the application each time the same procedure is carried out. FRAM analyzes systems in terms of functions and examines how the functional variability can resonate within the system to produce successful or failed outputs. The advantage in contrast to classical analysis methods is the capability to analyze dynamic nonlinear relationships and provide a more holistic approach. FRAM relies on four principles:

• Equivalence of success and failure;

• Inevitability of approximate adjustments;

• Emergence of consequences;

• Functional resonance.

The reader is advised to consult the website of FRAM for a more detailed presentation of the features of FRAM (http://www.functionalresonance.com/).

The application of FRAM consists of five steps: Objective, Functions’ Identification, Variability Characterization, Functional Resonance and finally Variability Management. The five steps will be discussed briefly in the following subsections.

3.1. Step Zero: Objective

The objective of the FRAM application has to be determined, whether the objective is to perform an accident investigation (reactive) or a safety and performance assessment (proactive).

3.2. Step One: Identification of Functions

The functions that compose the system have to be defined and characterized. FRAM functions are objectives or tasks to be achieved by the system in question. They are characterized in terms of six aspects: input, preconditions, time, control, resources and output (Figure 1). The characterization of the functional aspects defines the functional couplings and potential variability among the functions through linking the outputs of upstream functions as inputs for the downstream functions.

3.3. Step Two: Variability Characterization

The performance variability of the functional outputs has to be identified. The basic FRAM model characterizes variability in terms of time and precision using a qualitative three-point scale for each attribute (Table 1).

3.4. Step Three: Identification of Functional Resonance

A specific analysis scenario or instantiation can be used to evaluate the influence of variable functions on other functions and the overlapping or resonance of those influences through functional couplings to result in adverse or successful outcomes (Figure 2).

3.5. Step Four: Management of Variability

The final step in FRAM is to identify countermeasures for variability management to design a more resilient system, ensure adequate performance and provide desired outcomes.

Figure 1. A graphical representation of a FRAM Function [27] .

Figure 2. A graphical representation of a FRAM model depicting different function types [27] .

Table 1. Characterization of variability using linguistic labels [27] .

4. FRAM’s Applications and Evolution

Since its introduction, FRAM’s usefulness has been demonstrated through applications in many fields, such as construction [31] , manufacturing [32] , healthcare [33] , railway systems [34] and, mostly, aviation [35] [36] [37] [38] . The early applications of FRAM were mostly conducted in a retroactive manner as an accident investigation method, which was reflected in the original name of FRAM, the “Functional Resonance Accident Model” [39] . In retroactive analyses, real events are usually evaluated, and the parameters and data for the event in question are known [40] . Existing work conditions can be monitored to evaluate the state of the system, and lessons from past events can be learned to improve safety measures [40] . Proactive applications are different insofar as they require creative thinking and imagination to anticipate what might happen and estimate the likelihoods of desired or undesired outcomes [40] . Due to the capability of FRAM to provide an understanding of the evolution of accidents, and therefore the possibility of proactive applications, the acronym “FRAM” was redefined as the “Functional Resonance Analysis Method”.

FRAM is beneficial when dealing with contexts that are of a qualitative nature, which can be difficult to quantify. The main advantage of FRAM remains the ability to account for complexity in the studied systems and to analyze nonlinear dynamic relationships among functions. Precise data for such contexts can be lacking due to their inherent complexity and the nature of the evaluated factors. The reliance on qualitative linguistic scales enables the analyst to evaluate contexts in which data are missing or uncertain or the variables are hardly measurable numerically. However, one issue with this approach is that it does not provide a precise magnitude of the examined variables. The perceptions and definitions of linguistic labels such as “imprecise” or “too late” can differ from one person to another. Adding quantification tools would allow for a more comprehensible representation of variability in terms of numerical values. As remarked by Hollnagel, in order to realize safety objectively and practically, it is important to validate the existence of safety through “intersubjective verification”, i.e. different parties should be able to confirm that their definitions and understanding of safety match [22] . This can be achieved through quantification. People have different interpretations of the meanings of expressions such as “harmful” or “low”, and these differences become significant when it comes to qualitative safety and risk assessments. It is important to define what is meant by the expressions and terminology used to ensure conformity in the understanding of the provided results.

The basic FRAM method has evolved over the years and many improvements have been proposed to provide more precise analysis results. Several studies have addressed limitations of FRAM related to the absence of quantification means. One of the first studies to propose an improvement to the framework of FRAM was conducted by Macchi [41] . Macchi addressed three limitations of FRAM: the representation of variability as a result of local adjustments in performance to comply with requirements; the differentiation between the performance variability of heterogeneous functions according to the MTO (huMan-Technology-Organization) classification method; and finally, the aggregation of the scores of the eleven Common Performance Conditions (CPC) into a single numerical representation [41] . Macchi [41] combined the qualities of the “precision” and “timing” phenotypes to produce a single quality of the functional output. The impact of the nine possible qualities on performance was rated numerically on an ordinal scale from −3 (highly variable) to +3 (highly dampening). A median value is then calculated to generate a single numerical quality value for the output. The improved methodology was then applied to evaluate a landing approach in Stuttgart, examining the impact of introducing the Minimum Safe Altitude Warning (MSAW) system into Air Traffic Control (ATC). As acknowledged by Macchi, applying an ordinal scale and calculating a median value for the output simplifies reality, which can be efficient in practical applications. The proposed model was an important first step for the improvement of FRAM. Another limitation of the proposed methodology is the assumption that all functional aspects have the same impact on the output. We believe that those limitations can be addressed appropriately using fuzzy logic.
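
As a minimal illustration of this aggregation idea, the short Python sketch below computes the median of ordinal impact ratings received by one downstream function. The function names and scores are invented for the example and are not Macchi's data.

```python
import statistics

# Hypothetical ordinal impact ratings (-3 = highly variability-inducing,
# +3 = highly dampening) for the outputs received by one downstream function.
upstream_output_scores = {
    "clearance_from_atc": 2,       # precise and on time
    "deicing_fluid_supply": 0,     # acceptable
    "holdover_time_estimate": -2,  # imprecise and late
}

# A single quality indicator is obtained as the median of the incoming scores.
aggregate_quality = statistics.median(upstream_output_scores.values())
print(aggregate_quality)  # -> 0
```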

Rosa et al. [31] proposed a methodology merging FRAM with the Analytic Hierarchic Process (AHP) relying on experts’ knowledge. Questionnaires were directed at the experts to provide a numerical ranking (ratio scale) based on comparisons between pairs of criteria [31] . The AHP and the pairwise comparison approach have the disadvantage of not handling the vagueness in judgments for transforming linguistic scales into numerical scales very well [42] .

A recent and significant study for the evolution of FRAM was published by Patriarca et al. [23] proposing a different approach. Patriarca et al. proposed a semi-quantitative approach based on the Monte Carlo simulation [23] . Numerical scores were assigned to each performance state of the two criteria, precision and timing, with a higher score indicating higher variability. The variability of the output of a given function was defined as the product of the two scores. To determine the effect of the couplings between upstream and downstream functions, two amplifying factors, one for timing and one for precision, were defined for each coupling separately (a > 1 amplifying; a = 1 neutral; a < 1 dampening). The effect of the performance conditions (abbreviated SPC) for each scenario or instantiation of the analyzed system was also considered by defining a factor on a rating scale between 0 and 1 (b = 0 for no impact; 0 < b < 1 for moderate impact; b = 1 for high impact). A matrix consisting of the set of possible scenarios of the system in question and their respective effects was constructed, and the resulting conditional variability $e_j^z$ of any output was formulated as

$e_j^z = \max\left\{ 1;\ \frac{\sum_{k=1}^{m} SPC_{zk} \cdot b_{jk}}{m} \right\}$

The variability for each coupling ($VPN_{ij}^z$) was therefore calculated as the product of the output's variability (timing variability $V_j^T$ and precision variability $V_j^P$), the amplifying factors for each coupling ($a_{ij}^T$ and $a_{ij}^P$) and the conditional variability $e_j^z$, as follows:

$VPN_{ij}^z = V_j^T \cdot V_j^P \cdot a_{ij}^T \cdot a_{ij}^P \cdot e_j^z$

To avoid misrepresenting the status and behavior of the system by using static scores, discrete probability distributions were utilized instead to provide a better representation of functional variability. Accordingly, the resulting product in the final formula above becomes, through the Monte Carlo simulation, a probability distribution as well. The developed methodology was then showcased through its application to a case study evaluating the Air Traffic Management (ATM) system. The framework proposed by Patriarca et al. marks an important development in the evolution of FRAM towards validation as a complementary tool to classical analysis methods. Rather than providing a single numerical output, probability distributions are provided to assess variability. The applied Monte Carlo method relies on statistical data analysis to generate those distributions, which usually requires large data samples to run a large number of iterations. This makes the generation process of those values unidirectional, since the sampling process is random.
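
To illustrate the structure of this formulation, the sketch below propagates the product defined above through a Monte Carlo loop in Python. All probabilities, coupling factors and impact weights are invented for the example and are not taken from the cited study; only the shape of the computation follows the formulas above.

```python
import numpy as np

rng = np.random.default_rng(0)
n_runs = 10_000

# Illustrative variability scores for one output (higher = more variable).
V_T = rng.choice([1, 2, 3], size=n_runs, p=[0.7, 0.2, 0.1])  # timing
V_P = rng.choice([1, 2, 3], size=n_runs, p=[0.6, 0.3, 0.1])  # precision

# Assumed coupling factors for one upstream-downstream coupling
# (> 1 amplifies, 1 is neutral, < 1 dampens).
a_T, a_P = 1.2, 1.0

# Conditional variability from the performance conditions (SPC): the mean of
# the sampled SPC scores weighted by their impact factors b, floored at 1.
SPC = rng.uniform(0, 3, size=(n_runs, 4))  # four hypothetical conditions
b = np.array([1.0, 0.5, 0.0, 1.0])         # impact factor of each condition
e = np.maximum(1.0, (SPC * b).sum(axis=1) / SPC.shape[1])

# The coupling variability as the product defined above; random sampling turns
# the single score into a distribution rather than a static value.
VPN = V_T * V_P * a_T * a_P * e
print(np.percentile(VPN, [50, 90, 99]))
```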

A different approach to add quantification to FRAM can be achieved through the integration of fuzzy logic and the creation of a rule-based fuzzy inference system. The relationships between inputs and outputs can be characterized through the If-Then rules. Different weights and impacts can be associated with each quality class for each variable. The concept of fuzzy granulation and use of linguistic variables is a unique feature of fuzzy logic [43] [44] . Relying on linguistic variables becomes necessary when the “available information is too imprecise to justify the use of numbers” or there is a tolerance for imprecision, which can be exploited for better outcomes [45] . The reliance on linguistic variables allows for the quantification of qualitative expert knowledge in the form of natural language and consequently the design of comprehensible analysis models.

In this article, we explore a possibility to address this issue through the integration of fuzzy logic into FRAM as proposed by Hollnagel [27] .

5. Fuzzy Logic

Fuzzy Logic is based on the Fuzzy Set Theory [46] , which is a generalization of classical set theory. In classical set theory, elements either belong to a set or do not belong; membership is either true or false [47] . In fuzzy set theory, elements can belong to more than one fuzzy set with a certain degree of membership or truth [47] . The characteristics of fuzzy sets are defined through the generalization of the usual characteristics of classical sets:

Let A be a fuzzy set and $\mu_A$ the membership function characterizing the fuzzy set A.

A can then be defined as: $A = \{(x, \mu_A(x)) \mid x \in X, \mu_A(x) \in [0,1]\}$ with $\mu_A : X \to [0,1]$.

A fuzzy set A is therefore a collection of ordered pairs $(x, \mu_A(x))$, where $\mu_A(x)$ is the degree of membership of x in A.

Some basic operations on fuzzy sets are listed below:

Union of two fuzzy sets: $C = A \cup B \Rightarrow \mu_C(x) = \max[\mu_A(x), \mu_B(x)]$

Intersection of two fuzzy sets: $C = A \cap B \Rightarrow \mu_C(x) = \min[\mu_A(x), \mu_B(x)]$

Complement of a fuzzy set A: $\mu_{\bar{A}}(x) = 1 - \mu_A(x)$
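
To make the notation concrete, the short Python sketch below applies the max, min and complement operators to membership vectors defined over a purely illustrative universe of five elements.

```python
import numpy as np

# Membership degrees of five sample elements in two fuzzy sets A and B
# (illustrative values over a common universe of discourse).
mu_A = np.array([0.0, 0.3, 0.7, 1.0, 0.5])
mu_B = np.array([0.2, 0.6, 0.4, 0.8, 0.1])

mu_union        = np.maximum(mu_A, mu_B)  # union: max operator
mu_intersection = np.minimum(mu_A, mu_B)  # intersection: min operator
mu_complement   = 1.0 - mu_A              # complement of A

print(mu_union, mu_intersection, mu_complement, sep="\n")
```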

The application of the fuzzy logic methodology consists of three steps: Fuzzification, Inference and Defuzzification [48] .

5.1. Fuzzification

A linguistic variable in fuzzy logic can belong with a certain degree of membership to a fuzzy set, which represents a label or a class of objects with specific characteristics [29] . The range of values that a linguistic variable can take is defined as the universe of discourse, which is partitioned into multiple linguistic classes, i.e. fuzzy sets [48] . The first step is to fuzzify the input data through the assignment of membership degrees to the defined linguistic variables [48] . The transition between membership and non-membership is gradual and not abrupt as in classical logic. The degree of membership of an element in a fuzzy set can be any value between zero and one. The degree of membership is determined with the help of a curve called the membership function, which can have many shapes depending on the nature of the variable (triangle, trapezoid, S-shape, etc.) [49] (Figure 3).
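
For illustration, the following sketch fuzzifies a crisp rating on a 0 to 10 universe using trapezoidal membership functions for two classes, "inadequate" and "adequate". The breakpoints are assumptions chosen for the example, not values from the model presented later.

```python
import numpy as np

def trapmf(x, a, b, c, d):
    """Trapezoidal membership function with feet a, d and shoulders b, c."""
    return np.clip(np.minimum((x - a) / (b - a), (d - x) / (d - c)), 0.0, 1.0)

# A 0..10 universe of discourse partitioned into two linguistic classes;
# the breakpoints below are assumptions for illustration only.
def inadequate(x): return trapmf(x, -1.0, 0.0, 3.0, 6.0)
def adequate(x):   return trapmf(x, 4.0, 7.0, 10.0, 11.0)

score = 5.0  # a crisp rating of one performance condition
print(inadequate(score), adequate(score))  # partial membership in both sets
```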

5.2. Inference Process

The most widely used fuzzy inference models are the Mamdani inference model [50] and the Sugeno inference model [51] . The Mamdani model is more interpretable and adequate for handling qualitative knowledge and generating fuzzy rule-based expert systems, while the Sugeno model is more adequate for mathematical analysis. In this study, the Mamdani model is used, since it is more intuitive and suitable for human input. After defining the linguistic variables and the respective membership functions and ranges of values, a rule base has to be generated. The conditional rules (IF-THEN rules) characterize the relationships between the inputs and the outputs. The rules are comprehensible, since they are written in natural language. For example: IF input is precise, THEN output is on time. The input and output are two linguistic variables, which have the values “precise” and “on time” respectively. The two values are labels for two fuzzy sets with the same names. The conditional statement or rule describes a simple relationship between the two variables “input” and “output”. The number of rules depends on the number of variables and their respective classes. Different weights can be assigned to the rules depending on their significance and influence on the output. The inputs or antecedents are linked to each other using fuzzy logical operators such as AND, OR, or NOT. After the formulation of the rules, the implication for each rule is determined. In the implication process, the result of each fuzzy rule is transformed into an area in the membership function of the output. The calculation method for the implication area in the output function depends on the selected operator (AND or OR). In the case of the “OR” operator, the union or maximum operation is used (e.g. $\max[\mu_A(x), \mu_B(x)]$). In the case of the “AND” operator, the intersection or minimum operation is used (e.g. $\min[\mu_A(x), \mu_B(x)]$). The obtained implications are then aggregated to provide one implication for the output in the form of a fuzzy set. For the aggregation process, two calculation methods are mainly utilized: the maximum operation and the summation method. The maximum method collects the highest areas in the fuzzy sets of the rules’ implications, while the summation method simply adds up all the areas received for the implications (Figure 3).
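
A minimal numerical sketch of this inference step is given below, in Python for illustration: two hypothetical rules are evaluated with the min and max operators, their consequents are clipped (min implication), and the clipped sets are aggregated with the max operator. Defuzzification of the aggregated set is covered in the next subsection. All membership shapes and degrees are invented for the example.

```python
import numpy as np

# Fuzzified antecedents for one function (membership degrees obtained in the
# fuzzification step; values are illustrative).
input_precise      = 0.6
resources_adequate = 0.8

# Output variable "output quality" on a 0..1.5 universe, with two linguistic
# classes defined as triangles (hypothetical shapes).
x = np.linspace(0.0, 1.5, 301)
mu_nonvariable = np.clip(1 - np.abs(x - 1.0) / 0.25, 0, 1)
mu_variable    = np.clip(1 - np.abs(x - 0.5) / 0.25, 0, 1)

# Rule 1: IF input is precise AND resources are adequate THEN output is non-variable.
w1 = min(input_precise, resources_adequate)          # AND -> min
impl1 = np.minimum(w1, mu_nonvariable)               # min implication (clipping)

# Rule 2: IF input is NOT precise OR resources are NOT adequate THEN output is variable.
w2 = max(1 - input_precise, 1 - resources_adequate)  # OR -> max, NOT -> 1 - mu
impl2 = np.minimum(w2, mu_variable)

# Aggregation with the max operator yields a single fuzzy set for the output;
# it is defuzzified in the next step to obtain a crisp quality value.
aggregated = np.maximum(impl1, impl2)
```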

5.3. Defuzzification

The final step of the fuzzy methodology is the defuzzification, which means transforming the fuzzy output into a crisp value. Many methods exist for defuzzification, from which one can be selected depending on the characteristics of the needed output (Figure 3) [48] . The center of gravity (COG) method is most common and the output can be determined using the following formula:

Figure 3. The three steps of a fuzzy inference system [48] .

$COG_{\mu_A(x)} = \frac{\int \mu_A(x) \cdot x \, dx}{\int \mu_A(x) \, dx}$ (1)
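
Numerically, the integrals in Equation (1) can be approximated on a discretized universe, as in the short sketch below; the aggregated fuzzy set is rebuilt here as a simple clipped triangle for illustration.

```python
import numpy as np

# Aggregated output fuzzy set from the inference step (rebuilt here as a
# simple clipped triangle for illustration).
x = np.linspace(0.0, 1.5, 301)
mu = np.minimum(0.6, np.clip(1 - np.abs(x - 1.0) / 0.25, 0, 1))

# Center-of-gravity defuzzification, Equation (1), with the integrals
# approximated by the trapezoidal rule on the discretized universe.
crisp_output = np.trapz(mu * x, x) / np.trapz(mu, x)
print(round(crisp_output, 3))  # a single numerical quality indicator
```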

Fuzzy logic can be an appropriate method to quantify uncertain and vague contexts, in which linguistic scales are the only possibility to measure the variables of interest [46] . Human reasoning does not rely primarily on numbers, but rather on linguistic variables, whose possible values are words or sentences in natural language [29] . Zadeh argued that, due to the principle of incompatibility, the application of traditional methods to analyze humanistic and complex systems cannot be as successful as it is with purely technical or mechanistic systems. The principle of incompatibility states that as the complexity of a given system increases, the ability to understand its behavior in a precise manner decreases [29] . In contrast to theoretical, idealistic concepts, realistic processes in real-life situations are characterized by ambiguity and vagueness. Real-life conditions and processes are never as imagined, and even in the most precise applications of procedures and regulations, operations and performance always deviate from the norms and defined standards. This deviation is what Hollnagel defines as the difference between Work-As-Imagined (WAI) and Work-As-Done (WAD) [22] . The deviation from the theoretical procedures is not exceptional or abnormal; rather, it is an inherent characteristic of real-life conditions. The quantification of the qualitative values in FRAM can be achieved through the fuzzification of the functional aspects and the application of a rule-based Fuzzy Inference System (FIS) to produce numerical outputs for the functions.

6. Methodology

The MTO classification of functions in FRAM distinguishes between three categories of functions: huMan, Technological and Organizational (MTO) functions [27] . This classification serves the practical purpose of simplifying things for the analyst and allows the potential functional variability to be defined depending on the type of each function [41] . As noted by Patriarca et al., the implication of this approach is the assumption that functions of the same type have the same variability, since the evaluation of variability occurs in a qualitative manner relying on linguistic scales [23] . In reality, the performance of the different types of functions differs depending on their individual characteristics [23] . For example, assigning an “imprecise” quality to the outputs of two “human” functions does not clearly differentiate how the variability of the two functions differs. There is no obvious distinction between the magnitudes of the two outputs. Additionally, while the expert performing this analysis might understand how the variability of the two functions is manifested in reality, other parties might have different perceptions of the magnitude of the label “imprecise”. The same label might mean different things to different people. A further issue is that most functions in reality are not purely technological, human or organizational. Most functions are a combination of the three aspects. A mostly technological function can still have a human aspect, just as a human function can partly be technological. Assigning a function to one category is a generalization, which might limit the consideration of the factors influencing performance.

Despite the drawbacks of this approach, it remains practical and useful. Capturing the precise nature of complex systems is difficult. Our perception of reality as humans is simplified and fuzzy. Simplifications are necessary for modeling reality and providing means of evaluation and control. Therefore, improvements to the current framework of FRAM could overcome the above-described issues without sacrificing the practical advantages of this approach. Fuzzy logic, as a mathematical approach capable of computing with natural language and quantifying words, can resolve the ambiguity of the outputs and present more comprehensible results. In what follows, we describe in detail the integration of fuzzy logic into FRAM as a possible approach for adding quantification.

The first two steps (step zero and step one) in FRAM remain unchanged: the identification of the analysis purpose and the identification and characterization of the functions. In step two, the performance variability has to be characterized. We can distinguish between two types of variability with respect to the identified functions: an exogenous variability, which is imposed on the function from external sources (other functions) through the functional couplings; and an endogenous or internal variability, which comes from within the function in question and depends on the characteristics and nature of that function [27] . The functional couplings describe the relationships among functions and depict the possible impact of an upstream function on a downstream function. However, there is no clear path to account for how the internal variability manifests itself in the quality of the function’s output. Therefore, the first step is to introduce an internal variability factor (IVF), which shall account for the internal variability of each function. The IVF accounts for the inherent characteristics of the function and its potential to produce variability as affected by the present performance conditions, while the external variability is imposed on the function through the couplings with the other functions. Such factors can be the different human characteristics such as emotional states, personality traits, attitude, knowledge, physiological and psychological factors, as well as technological features and functionality, organizational climate, etc. In our case, to determine the internal variability for each function, the Common Performance Conditions (CPC) are applied to calculate a numerical output. The CPC list can be used to evaluate the influence of the contextual conditions on performance (Table 2).

The MTO classification method can be used here to determine which factors affect which functions. Originally, the quality of the CPC was evaluated on a three-point scale: Adequate, Inadequate and Unpredictable [41] . The impact of adequate CPCs is small, of inadequate CPCs noticeable to high, and of unpredictable CPCs high to very high [41] . In our case, the quality of the factors will be evaluated in two classes as “adequate” or “inadequate”, while the quality “unpredictable” will be represented in the fuzzy rule base. The quality “unpredictable” means that a statement about the status of the CPC in question cannot be presented due to the lack of information or the dynamic nature of the condition itself, i.e. a numerical score cannot be assigned. Since a numerical value cannot be plotted for unpredictable factors, adding the class “unpredictable” in this case is not useful. Rather, in case of dynamic or ambiguous conditions, the rule base can be designed in a manner that accounts for the unpredictability, e.g. the quality “none” in the Fuzzy Logic Designer in MATLAB can be selected to represent an unpredictable variable. Additionally, limiting the number of classes to two limits the number of rules. A numerical scale between zero and ten will be used to assign a quality value for each factor (Figure 4). The IVF is calculated for each function by an internal fuzzy inference system, which produces a numerical output reflecting the quality of the present Common Performance Conditions (CPC).

Table 2. Common Performance Conditions & their influence on different function types [41] .

Figure 4. Membership functions of the IVF function in MATLAB.

The range of the generated IVF will be between 0 and 1.5. Values between 0 and 1 account for negative variability that impairs performance, while values between 1 and 1.5 account for dampening of variability and a performance-enhancing impact (Figure 5). The IVF is then linked to the function as an additional aspect next to the five aspects incoming from upstream functions, and is fuzzified together with them to determine the output’s quality of the function (accounting for internal and external variability).
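
As an illustration, the sketch below shows how such an internal FIS could turn CPC ratings into an IVF value in the 0 to 1.5 range. It is written in Python rather than the MATLAB Fuzzy Logic Designer used in this study, and the membership function breakpoints, CPC names, scores and rules are assumptions made for the example, not the values of the actual model.

```python
import numpy as np

def trapmf(x, a, b, c, d):
    return np.clip(np.minimum((x - a) / (b - a), (d - x) / (d - c)), 0.0, 1.0)

# Hypothetical CPC ratings for one function on the 0..10 scale.
cpc_scores = {"availability_of_resources": 4.0,
              "adequacy_of_training": 3.0,
              "available_time": 7.0}

# Degree to which each CPC is 'adequate' / 'inadequate' (assumed breakpoints).
adequate   = {k: trapmf(v, 4, 7, 10, 11) for k, v in cpc_scores.items()}
inadequate = {k: trapmf(v, -1, 0, 3, 6)  for k, v in cpc_scores.items()}

# IVF universe 0..1.5 with two illustrative output classes.
y = np.linspace(0.0, 1.5, 301)
mu_impairing = np.clip(1 - np.abs(y - 0.4) / 0.4, 0, 1)
mu_dampening = np.clip(1 - np.abs(y - 1.3) / 0.2, 0, 1)

# Two simplified rules of the internal FIS:
# IF any CPC is inadequate THEN the IVF is impairing (OR -> max).
w_impair = max(inadequate.values())
# IF all CPCs are adequate THEN the IVF is dampening (AND -> min).
w_dampen = min(adequate.values())

# Min implication, max aggregation, centroid defuzzification.
agg = np.maximum(np.minimum(w_impair, mu_impairing),
                 np.minimum(w_dampen, mu_dampening))
ivf = np.trapz(agg * y, y) / np.trapz(agg, y)
print(round(ivf, 2))  # an IVF below 1 signals performance-impairing conditions
```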

Macchi [41] addressed the limitations of the CPC methodology, stating that “the use of the CPCs seems to be inadequate” to evaluate performance variability due to local adjustments. The CPCs reflect the influence of the context on performance, and relying solely on them cannot account for the resonance of variability among functions through their couplings. However, the aim here is to anticipate potential sources of internal variability that come from within the functions. The impact of the context here is essential; when it is added to the variability due to local adjustments, both internal and external variability can be represented. The list of influential factors is not necessarily limited to the CPC list. The analyst can adopt any list of factors that he/she deems most relevant for the performance of the function. The list of performance shaping factors in Human Reliability Analysis is long, and depending on the context of the analysis, a suitable set of influential factors can be selected.

The second type of variability is the external variability, which can be characterized through the couplings among functions. The outputs of the background functions are invariable, which means a stable output at 100% or one. The foreground functions that are direct downstream functions of the background functions therefore receive only stable incoming aspects from the background functions.

Figure 5. Membership functions of the function’s output in MATLAB.

The outputs are classified into three classes relying on the classification method of Macchi [41] . Macchi combined the accuracy and timing characteristics to determine nine possible quality classes for the outputs [41] . He then plotted the classes graphically to determine the degree of impact on the functions, whether inducing or dampening variability (Table 3).

Five classes were found to dampen variability (A to E) and four to increase or induce variability (F to I). At this stage of development, we need to limit the number of classes further to avoid the problem of rule explosion and present a simplified and practical model. Since highly controlled environments such as aviation require high accuracy and all functions are to be executed as perfectly as possible, we consider any dampening output as “Non-variable”, which shall account for a positive or neutral impact. The outputs with low and medium variability are combined and classified as “Variable”. The outputs with high variability are classified as “Highly Variable”. This simplifies the classification of the outputs and limits the number of rules for the downstream functions significantly. The simplification is not an issue for the interpretation of the output’s quality, since an accurate numerical value for the output is provided (Table 4).

Note that “Variable” in this context refers to the negative deviation of the output from the desired outcome, which is ideally one. A “Non-variable” label accounts for possibly positive impact on performance (Figure 5).

Then, a second, higher-order fuzzy inference system is designed to produce the numerical value for the output’s quality of the function, relying on a rule base that characterizes the relationships between the incoming functional aspects, the IVF of the function and the output. The number of rules depends on the number of variables and their respective classes.

Table 3. Characterization of the output’s quality [41] .

Table 4. Simplified characterization of the output’s quality.

To keep the number of rules reasonable, many solutions can be adopted, such as hierarchical fuzzy systems or the use of genetic algorithms to design the rule base. These would, however, further complicate the design process and make the application of FRAM difficult and exhausting at this stage. In our case, we tried to simplify the model to a degree that allows for the construction of a helpful model with reasonable effort. The simplification, however, shall not trivialize the model. The rule base is helpful in overcoming another issue of FRAM, which is the assignment of weights to the different functional aspects. Different weights can be assigned to the rules depending on their significance and influence on the output. Additionally, weight scores can be assigned to the different labels in the antecedent part of the rule base to determine the implication of each rule and the respective consequent label. In our case, the applied implication method was the “MIN” method, and for aggregation, both the “MAX” and the “SUM” methods were applied.

The final step is to defuzzify the output to produce a numerical output. The applied defuzzification method in our case was the centroid method. The calculated numerical value presented a quantifier for the quality of the functional output. The fuzzy FRAM model is now ready for the simulation of deicing operations.
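
Before turning to the case study, the sketch below summarizes how such an output FIS could compute the numerical quality of one output using weighted rules, MIN implication, a summation-style aggregation (bounded at one, which is one possible reading of the “SUM” method) and centroid defuzzification. The rule firing strengths, weights and membership shapes are illustrative assumptions, not the rule base of the actual model.

```python
import numpy as np

# Firing strengths, weights and consequent classes for three rules of the
# output FIS of one function (all values are illustrative).
rules = [
    (0.7, 1.0, "variable"),         # e.g. an imprecise incoming aspect
    (0.5, 0.6, "non_variable"),     # e.g. adequate resources, lower weight
    (0.2, 1.0, "highly_variable"),  # e.g. an impairing IVF
]

y = np.linspace(0.0, 1.5, 301)
consequents = {
    "highly_variable": np.clip(1 - np.abs(y - 0.25) / 0.25, 0, 1),
    "variable":        np.clip(1 - np.abs(y - 0.60) / 0.25, 0, 1),
    "non_variable":    np.clip(1 - np.abs(y - 1.10) / 0.30, 0, 1),
}

# Weighted MIN implication, bounded-sum aggregation, centroid defuzzification.
agg = np.zeros_like(y)
for strength, weight, label in rules:
    agg += np.minimum(strength * weight, consequents[label])
agg = np.minimum(agg, 1.0)
quality = np.trapz(agg * y, y) / np.trapz(agg, y)
print(round(quality, 2))  # the numerical quality of the function's output
```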

7. Aircraft Deicing Simulation: A Case Study

Looking at “work-as-imagined”, all performance conditions are optimal and the outputs of the functions are non-variable. To provide an application example in our case, a hypothetical scenario was constructed inspired by two deicing-related accidents, namely the Scandinavian Airlines flight 751 crash in 1991 [14] and the Air Maroc accident in Mirabel in 1995 [15] . For our simulation, we will assume the following:

• An international airline has a flight scheduled to depart from a North American airport on a trans-Atlantic route;

• The pilots of the aircraft to be deiced are not very familiar with deicing procedures;

• The airline’s instructions and guidelines provided to the flight crew do not clearly specify communication protocols and inspection procedures;

• The aircraft is to be taxied from the gate to the deicing pad, where two deicing trucks are positioned to perform the deicing operations;

• The weather conditions: a temperature around 0˚C with snow showers present;

• The flight crew was under temporal constraints: the flight was delayed due to weather conditions;

• The organizational performance conditions are not optimal, especially the provision of adequate training and instructions by the airline to its flight crew;

• The human or individual performance conditions for the flight crew are impaired: availability of resources, airline procedures and plans, competence and time pressure.

The five steps for our FRAM model are then as follows.

7.1. Step Zero: Objective Identification

The first step in FRAM is to identify the purpose of the analysis. Our objective is to present an example of a possible way to construct and run a FRAM model integrating fuzzy logic as a quantification method. The selected context for the analysis is that of aircraft deicing operations. The model is of a predictive nature and does not focus on simple basic activities such as moving from point A to point B. Rather, the focus is on more complex tasks to allow for a wide systemic perspective.

7.2. Step One: Definition of Functions

The functions of the model are to be identified. To keep the number of functions, variables and respective rules reasonable, the scope of the analysis is limited to the deicing activities conducted by the deicing service provider at the deicing pad. The functions were identified based on knowledge gained through a literature review of deicing reports and research work conducted by our team over the previous years. The background functions form the boundaries of the model and provide invariable outputs. The foreground functions are the focus of the analysis and can therefore produce variable outputs. In total, there are four background functions and 13 foreground functions. Table 5 presents a list of the functions and their characteristics.

7.3. Step Two: Variability Characterization

The variability of the functions is to be characterized. We start by characterizing the internal variability for each function using the CPC list as explained above.

Table 5. The list of defined functions that constitute the deicing model.

Each CPC is evaluated on a scale between zero and ten to plot its membership in the fuzzy sets “adequate” or “inadequate”. The detailed assignment of scores to each performance condition is listed in Table 6 & Table 7. In practice, the evaluation of each CPC should be based on expert judgement and in-depth knowledge of the conditions for executing the functions in question. Each CPC can also be viewed as a set of influential factors. The list of influential factors can be selected based on the analysis context to determine the criteria for assigning the numerical score. For example, the CPC “conditions of work” can include a list of factors that define what constitutes adequate or inadequate conditions. The internal FIS is used to produce the IVF for each function.

Table 6. The numerical characterization of internal variability for the organizational functions.

Table 7. The numerical characterization of internal variability for the human functions.

7.4. Step Three: Identification of Functional Resonance

The functional resonance is to be determined. The numerical outputs of the upstream functions will serve as incoming aspects for the downstream functions. The incoming aspects will be fuzzified in addition to the internal IVF and their impact on the downstream functions will be determined through the output’s Fuzzy Inference System (FIS) of each function (Table 8).
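
The propagation step can be pictured with the minimal Python stand-in below. Here, run_fis is only a placeholder for the per-function Mamdani FIS described in Section 6 (it simply averages the IVF and the incoming qualities), and the function names, IVF values and couplings are illustrative rather than those of the actual deicing model.

```python
# 'run_fis' is a stand-in for the per-function fuzzy inference system described
# in Section 6 (here it simply averages the IVF and the incoming qualities).
# Function names, IVF values and couplings are illustrative only.

def run_fis(ivf, incoming):
    values = [ivf] + list(incoming.values())
    return round(sum(values) / len(values), 2)

# Background functions provide invariable outputs (quality of one).
outputs = {"provide_deicing_fluid": 1.0, "provide_weather_info": 1.0}

# Internal variability factors obtained from the internal FIS of each function.
ivf = {"deicing": 0.85, "post_deicing_inspection": 0.60}

# Downstream functions are evaluated in order: the numerical outputs of the
# upstream functions become the incoming aspects of the downstream functions.
outputs["deicing"] = run_fis(
    ivf["deicing"],
    {"resource": outputs["provide_deicing_fluid"],
     "input": outputs["provide_weather_info"]})

outputs["post_deicing_inspection"] = run_fis(
    ivf["post_deicing_inspection"],
    {"input": outputs["deicing"]})

print(outputs)
```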

7.5. Step Four: Variability Management

The final step is to analyze the received results according to the selected scenario and examine what measures can be taken to improve the quality and resilience of the examined system (Figure 6).

The system’s functions are modelled in the FMV in the form of tables characterizing the purpose of the defined functions and their aspects according to the FRAM structure. The FMV enables the generation of a graphical representation of the designed model, depicting a map of the system. This graphical representation illustrates the relationships among functions, which allows for understanding how the functions affect each other and how variability can combine throughout the system. The numerical values of the IVF (representing the potential variability of the functions) and of the outputs (representing the combined impact of internal and external variability on the output’s quality) were plotted in the graphical representation for illustrative purposes.

Table 8. The numerical scores for the output’s quality.

The formulated assumptions in our case present a scenario in which the airliner did not provide adequate training and instructions to its flight crew. The flight was delayed due to weather conditions and a tight flight schedule. This negatively impacted the performance conditions for the functions: Training, Airliner Guidelines and Instructions, Planning, Flight Crew Supervision, Pre-deicing Inspection, Deicing, Post-deicing Inspection, Anti-icing and Taxi to Runway. The functions with an output’s quality of one or higher are not variable in an adverse manner and have the potential to dampen variability in the downstream functions. The maximum output that can be achieved is 1.25 due to the selected defuzzification method, i.e. the center of gravity method; the minimum quality output is 0.25. The numerical outputs showed a negative deviation from the ideal value (one or more) for the above-mentioned functions. The lowest result was obtained for the output of the function “Post-deicing Inspection” due to the principle of resonance of variability.

Figure 6. A graphical representation of the generated FRAM model with numerical outputs.
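
For context, center-of-gravity (centroid) defuzzification computes the crisp output as

y^{*} = \frac{\int_{Y} y \, \mu_{\mathrm{agg}}(y)\, dy}{\int_{Y} \mu_{\mathrm{agg}}(y)\, dy}

where Y is the output universe of discourse and \mu_{\mathrm{agg}} is the aggregated output membership function. Because y^{*} is a weighted average over Y, it is bounded by the centroids of the outermost output sets rather than by the edges of the universe; with the output membership functions used in this simulation, these centroid bounds correspond to the reported extremes of 0.25 and 1.25.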

Based on the characterized functions and performance conditions, the analyst can construct a map of the system in question. The relationships and dependencies between the performance conditions and the quality of the outcome can be described to identify which conditions promote success and which ones impair performance. This map describes how the functions are linked and how they can affect each other’s performance. The numerical outputs can provide a more precise and intersubjective representation of the magnitude of variability. Using this map, the analyst can locate potential sources of variability within the system. It is then possible to propose and implement measures to strengthen weak points and reinforce the conditions that ensure successful outcomes.

8. Discussion

The application of FRAM can provide interesting and helpful results to keep up with the fast pace of technological developments and the dynamic nature of complex sociotechnical systems. This is not to say that FRAM can replace traditional analysis tools; rather, FRAM is complementary to the established methods and can present a different perspective on safety management and performance evaluation [20] . FRAM enables the analyst to examine dynamic and complex relationships to present a holistic perspective of the studied system. Through the evaluation of the contextual conditions, FRAM allows for characterizing operations in terms of coupled functions to determine possibilities for positive or negative performance variability.

In contrast to retrospective analyses, in which events and their consequences can be described more precisely, proactive or predictive studies lack certainty. Through the integration of fuzzy logic into the framework of the classical FRAM, the advantages of both approaches can be combined to provide systemic analyses. Applying probabilistic methods that rely on statistical data analysis may not always be possible. Fuzzy logic can be more suitable in the absence of sufficient quantitative data or in the presence of vagueness and imprecision [45]. The reliance on linguistic scales allows approximate expert judgements to be incorporated in order to handle contexts that are of a qualitative nature and not easily quantifiable. The representation of variability as a result of local adjustments to comply with performance requirements, and the differentiation between the heterogeneous natures of functions as recognized by Macchi [41], can be realized through the fuzzy rule base. The IF-THEN rules describe the relationships between inputs and outputs and facilitate the assignment of different weights and significance to each variable. The addition of fuzzy logic thus produces numerical results that are more comprehensible and precise, without sacrificing the advantages of using linguistic labels.
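
For illustration, weighted IF-THEN rules of the kind referred to here could be represented as follows; the aspect names, linguistic terms and weights are placeholders chosen for demonstration and are not the rule bases used in the deicing model.

rules = [
    # (antecedents, consequent, weight)
    ({"Input": "on_time", "Precondition": "adequate", "IVF": "low"},
     ("OutputQuality", "high"), 1.0),
    ({"Input": "delayed", "Precondition": "adequate", "IVF": "low"},
     ("OutputQuality", "acceptable"), 0.8),
    ({"Input": "delayed", "Precondition": "inadequate", "IVF": "high"},
     ("OutputQuality", "low"), 1.0),
]

def firing_strength(memberships, antecedents, weight):
    """AND the antecedent membership degrees (min operator), scaled by the rule weight."""
    return weight * min(memberships[var][term] for var, term in antecedents.items())

# Example membership degrees obtained from fuzzification
m = {"Input": {"on_time": 0.7, "delayed": 0.3},
     "Precondition": {"adequate": 0.9, "inadequate": 0.1},
     "IVF": {"low": 0.6, "high": 0.4}}
antecedents, consequent, weight = rules[0]
print(firing_strength(m, antecedents, weight))  # 0.6: degree to which the first rule fires

Assigning unequal weights in such a structure is how different aspects or CPCs could be given more or less influence on a function’s output.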

The construction of the simulated model (characterization of functions, relationships, selection of membership functions, etc.) and the analysis were performed based on knowledge gained from studying deicing operations. The characterization of the simulated deicing functions also relied on literature findings, accident reports and technical reports published by governmental agencies around the world. Through the formulation of assumptions about performance conditions, a proactive analysis model was constructed. The simulation was run in the FRAM Model Visualizer and in MATLAB using the Fuzzy Logic Designer to demonstrate a possible approach for realizing a fuzzy-logic-based FRAM model. The evaluation scale was set from zero to ten and can be used either as a discrete or as a continuous scale; however, human judgement can be less accurate on a continuous scale. While different scales may be more suitable for different applications, the test-retest reliability of rating scales with 11 or more response categories tends to decline in comparison to 7-point, 9-point or 10-point scales [52]. Conversely, the reliability of scales with fewer response categories (2, 3 or 4) is also considerably lower than that of scales with 7, 9 or 10 response categories [52]. It is therefore important to select a scale that allows experts’ judgement to be elicited while maintaining valid and reliable results.

The aggregated numerical output does not translate into a definite membership in one class of quality. Rather, the numbers can be seen as indicators of the potential for positive or negative variability based on the designed functions and their respective membership functions, quality classes and performance conditions. The model is flexible: functions can be redefined and re-characterized if needed, new functions can be added or existing ones removed, and relationships can be redefined as deemed appropriate. The influence of the different CPCs and the different functional aspects on the output can be weighted in the rule base. Each function, depending on its nature, can be examined separately to determine the weights in its rule base and account for the different influences on the output. In our case study, the same weights were attributed to the different aspects and to all rules in the rule base of each function, which simplified the construction of the rule base and allowed for a more efficient and feasible execution of the simulation. After all, the objective is to demonstrate how such an application can be executed, and the focus is directed mostly to the theoretical aspect. Applying this approach to a real case study must be done with caution, since the proposed model at this stage is still a prototype in need of further improvement.

Admittedly, the simulation in the proposed case is a simplification of reality. The representation of influential conditions and the characterization of functions were simplified to facilitate the simulation process, which demands considerable computing resources. To avoid the “rules explosion” problem, the number of inputs was limited to a maximum of six. A higher number is possible, of course; however, the size of the rule base grows exponentially with each added variable, which can make the process very laborious. The validity and reliability of the numerical outputs depend greatly on the model characteristics defined for this simulation and on the formulated assumptions. This means that the results are not necessarily generalizable to other contexts, which is not the aim of this simulation anyway.
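
To make the combinatorics behind this limit explicit: a complete rule base over n inputs, each described by m linguistic terms, contains m^n rules. Assuming, for instance, two linguistic terms per input, six inputs already require 2^6 = 64 rules and a seventh input doubles this to 128; with three terms per input, six inputs would require 3^6 = 729 rules.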

The continuous improvement of safety in aviation and the declining number of accidents year after year make it difficult to collect sufficient data to generate meaningful statistics [53]. The establishment of adequate databases and performance indicators for aviation maintenance in general, and deicing in particular, would be very helpful for constructing accurate and meaningful fuzzy inference systems. The model proposed in this paper is a first step and requires further validation and optimization work to provide more representative and reliable results, including the definition of a more realistic model of deicing operations. Nonetheless, the results presented here are promising and offer a possible approach for integrating quantification means into FRAM, which can be beneficial for assessing complex sociotechnical systems.

9. Conclusions

To keep up with the fast pace of evolving modern sociotechnical systems, a continuous re-evaluation of applied safety and risk management tools is advised. A paradigm shift in the way we look at adversity is needed, namely the shift from a Safety-I to a Safety-II perspective. In addition to looking at what goes wrong and simply aiming to identify causes and errors, looking at what goes right becomes necessary, especially when there is a lack of sufficient or precise data. The Functional Resonance Analysis Method (FRAM) is proposed in this paper as an adequate method to address these challenges in addition to classical assessment methods. The principles of FRAM allow for a fresh and different perspective on system analysis, characterizing nonlinearity, complexity and performance variability. The main objective of this paper was to propose a possible improvement to the framework of FRAM through the integration of fuzzy logic as a quantification tool. In an effort to produce more intersubjective results, a fuzzy-FRAM model of aircraft ground deicing operations was constructed, relying on the literature and on the findings of our research team over recent years. The context of deicing operations was simulated in the FRAM Model Visualizer and in MATLAB to present a first application of the proposed model. The preliminary results are promising and allow for a more comprehensible representation of potential performance variability. The presented model is still a prototype at this stage and requires further validation and optimization work to provide more representative and reliable results.

Acknowledgements

The authors thank the Arbour Foundation and the Natural Sciences and Engineering Research Council of Canada (NSERC) for funding this study. The authors also thank Rees Hill, the developer of the software “FRAM Model Visualizer (FMV)”, which was used in this study to develop and visualize the FRAM model.

Conflicts of Interest

The authors declare no conflicts of interest regarding the publication of this paper.

Cite this paper

Slim, H. and Nadeau, S. (2019) A Proposal for a Predictive Performance Assessment Model in Complex Sociotechnical Systems Combining Fuzzy Logic and the Functional Resonance Analysis Method (FRAM). American Journal of Industrial and Business Management, 9, 1345-1375. https://doi.org/10.4236/ajibm.2019.96089

References

1. Érdi, P. (2008) Complexity Explained. Springer, Berlin. https://doi.org/10.1007/978-3-540-35778-0

2. Weaver, W. (1948) Science and Complexity. In: Facets of Systems Science, Springer, Berlin, 449-456. https://doi.org/10.1007/978-1-4899-0718-9_30

3. Hettinger, L.J., Kirlik, A., Goh, Y.M. and Buckle, P. (2015) Modelling and Simulation of Complex Sociotechnical Systems: Envisioning and Analysing Work Environments. Ergonomics, 58, 600-614. https://doi.org/10.1080/00140139.2015.1008586

4. Mumford, E. (2006) The Story of Socio-Technical Design: Reflections on Its Successes, Failures and Potential. Information Systems Journal, 16, 317-342. https://doi.org/10.1111/j.1365-2575.2006.00221.x

5. Carayon, P., Hancock, P., Leveson, N., Noy, I., Sznelwar, L. and Van Hootegem, G. (2015) Advancing a Sociotechnical Systems Approach to Workplace Safety-Developing the Conceptual Framework. Ergonomics, 58, 548-564. https://doi.org/10.1080/00140139.2015.1015623

6. Nadeau, S. (2003) Co-Operation in Health and Safety: A Game Theory Analysis. Pierce Law Review, 1, 219.

7. Transport Canada (2004) TP 10643E When in Doubt... Small and Large Aircraft-Aircraft Critical Surface Training for Aircrew and Ground Crew. 7th Edition, 138 p. http://www.tc.gc.ca/Publications/en/tp10643/pdf/hr/tp10643e.pdf

8. Günebak, S., Nadeau, S., Morency, F. and Sträter, O. (2016) Aircraft Ground Deicing as a Complex Sociotechnical System: Towards a Safer and More Efficient Communication Process for Aircraft Ground Deicing. 62nd CASI Aeronautics Conference and AGM 3rd GARDN Conference, Montreal, 19-21 May 2015, 189-199.

9. Torres, Y., Morency, F. and Nadeau, S. (2013) Factors Influencing Performance of Aircraft Ground Deicing Operations: Perspectives in Ergonomics and Occupational Safety. In: AIHCE Conference, American Industrial Hygiene Association, Montréal.

10. Torres, Y., Nadeau, S. and Morency, F. (2016) Study of Fatigue and Workload among Aircraft De-Icing Technicians. Occupational Ergonomics, 13, 79-90. https://doi.org/10.3233/OER-160240

11. Le Floch, T., Nadeau, S., Landau, K. and Morency, F. (2018) Aircraft Deicing in Open Baskets: Study of the Effects of Activities on Heart Rate Variability.

12. Landau, K., Nadeau, S., Le Floch, T. and Morency, F. (2018) Arbeitsprozesse und Wertschöpfungsbeiträge bei der Flugzeug-Enteisung. Gesellschaft für Arbeitswissenschaft, Frankfurt, 21-23 February 2018.

13. Nadeau, S. and Morency, F. (2017) De-Icing of Aircraft: Incorporating Business Risks and Occupational Health and Safety. International Journal of Safety and Security Engineering, 7, 247-266. https://doi.org/10.2495/SAFE-V7-N2-247-266

14. SHK Board of Accident Investigation (1993) Report C 1993:57 Air Traffic Accident on 27 December 1991 at Gottröra, AB County, Case L-124/91. Stockholm. http://www.havkom.se/assets/reports/English/C1993_57e_Gottrora.pdf

15. Transportation Safety Board of Canada (1995) Aviation Occurrence Report: Collision Royal Air Maroc Boeing 747-400, CN-RGA, Montreal (Mirabel) International Airport, Québec, 21 January 1995. Report Number A95Q0015, TSB Canada. http://www.tsb.gc.ca/eng/rapports-reports/aviation/1995/a95q0015/a95q0015.pdf

16. Günebak, S., Nadeau, S., Morency, F. and Sträter, O. (2015) Towards a Safer and More Efficient Communication Process for Aircraft Ground Deicing: Review of Human Factors and Team Communication Literature. AERO 2015: 62nd CASI Aeronautics Conference and 3rd GARDN Conference, Montréal, 19-21 May 2015.

17. Aventin, A., Morency, F. and Nadeau, S. (2015) Statistical Study of Aircraft Accidents and Incidents Related to De/Anti-Icing Process in Canada between 2009 and 2014. 62nd CASI Aeronautics Conference and AGM 3rd GARDN Conference, Montreal, 19-21 May 2015, 201-209.

18. Eyre, F.W. (2002) Tactile Inspection for Detection of Ice on Aircraft Surfaces—Notes on Current Practice. Transport Canada, Report No. TP 13858E. http://bibvir2.uqac.ca/archivage/17100950.pdf

19. Landau, K., Nadeau, S., Le Floch, T. and Morency, F. (2017) Ergonomic Time and Motion Studies of Aircraft De-Icing Work. Journal of Ergonomics, 7, 204. https://doi.org/10.4172/2165-7556.1000204

20. Melanson, A. and Nadeau, S. (2016) Managing OHS in Complex and Unpredictable Manufacturing Systems: Can FRAM Bring Agility? In: Advances in Ergonomics of Manufacturing: Managing the Enterprise of the Future, Springer, Cham, 341-348. https://doi.org/10.1007/978-3-319-41697-7_30

21. Slim, H., Nadeau, S. and Morency, F. (2018) FRAM: A Complex System’s Approach for the Evaluation of Aircraft On-Ground Deicing Operations. In: Kongress der Gesellschaft für Arbeitswissenschaft, Vol. 64, GFA Press, Dortmund.

22. Hollnagel, E. (2014) Safety-I and Safety-II: The Past and Future of Safety Management. Ashgate Publishing, Ltd., Farnham.

23. Patriarca, R., Di Gravio, G. and Costantino, F. (2017) A Monte Carlo Evolution of the Functional Resonance Analysis Method (FRAM) to Assess Performance Variability in Complex Systems. Safety Science, 91, 49-60. https://doi.org/10.1016/j.ssci.2016.07.016

24. Eurocontrol (2009) A White Paper on Resilience Engineering for ATM. Report of the Project Resilience Engineering for ATM.

25. Pavard, B. and Dugdale, J. (2006) The Contribution of Complexity Theory to the Study of Socio-Technical Cooperative Systems. In: Minai, A.A. and Bar-Yam, Y., Eds., Proceedings of the 3rd International Conference on Unifying Themes in Complex Systems, Springer, Berlin, 39-48. https://doi.org/10.1007/978-3-540-35866-4_4

26. Leveson, N. (2011) Engineering a Safer World: Systems Thinking Applied to Safety. MIT Press, Cambridge. https://doi.org/10.7551/mitpress/8179.001.0001

27. Hollnagel, E. (2012a) FRAM, the Functional Resonance Analysis Method: Modeling Complex Socio-Technical Systems. Ashgate Publishing, Ltd., Farnham.

28. Patriarca, R., Bergström, J., Di Gravio, G. and Costantino, F. (2018) Resilience Engineering: Current Status of the Research and Future Challenges. Safety Science, 102, 79-100. https://doi.org/10.1016/j.ssci.2017.10.005

29. Zadeh, L.A. (1973) Outline of a New Approach to the Analysis of Complex Systems and Decision Processes. IEEE Transactions on Systems, Man and Cybernetics, 1100, 38-45. https://doi.org/10.1109/TSMC.1973.5408575

30. Hollnagel, E. (2012b) An Application of the Functional Resonance Analysis Method (FRAM) to Risk Assessment of Organisational Change (No. SSM-2013-09). Swedish Radiation Safety Authority. https://inis.iaea.org/collection/NCLCollectionStore/_Public/44/057/44057156.pdf

31. Rosa, L.V., Haddad, A.N. and Carvalho, P.V. (2015) Assessing Risk in Sustainable Construction Using the Functional Resonance Analysis Method (FRAM). Cognition, Technology & Work, 17, 559-573. https://doi.org/10.1007/s10111-015-0337-z

32. Albery, S., Borys, D. and Tepe, S. (2016) Advantages for Risk Assessment: Evaluating Learnings from Question Sets Inspired by the FRAM and the Risk Matrix in a Manufacturing Environment. Safety Science, 89, 180-189. https://doi.org/10.1016/j.ssci.2016.06.005

33. Pickup, L., Atkinson, S., Hollnagel, E., Bowie, P., Gray, S., Rawlinson, S. and Forrester, K. (2016) Blood Sampling—Two Sides to the Story. Applied Ergonomics, 59, 234-242. https://doi.org/10.1016/j.apergo.2016.08.027

34. Belmonte, F., Schön, W., Heurley, L. and Capel, R. (2011) Interdisciplinary Safety Analysis of Complex Socio-Technological Systems Based on the Functional Resonance Accident Model: An Application to Railway Traffic Supervision. Reliability Engineering and System Safety, 96, 237-249. https://doi.org/10.1016/j.ress.2010.09.006

35. Sawaragi, T., Horiguchi, Y. and Hina, A. (2006) Safety Analysis of Systemic Accidents Triggered by Performance Deviation. In: SICE-ICASE International Joint Conference, Institute of Electrical and Electronics Engineers, Piscataway, 1778-1781. https://doi.org/10.1109/SICE.2006.315635

36. Nouvel, D., Travadel, S. and Hollnagel, E. (2007) Introduction of the Concept of Functional Resonance in the Analysis of a Near-Accident in Aviation. 33rd ESReDA Seminar: Future Challenges of Accident Investigation, Ispra, 13-14 November 2007, 9 p.

37. Hollnagel, E., Pruchnicki, S., Woltjer, R. and Etcher, S. (2008) Analysis of Comair Flight 5191 with the Functional Resonance Accident Model. 8th International Symposium of the Australian Aviation Psychology Association, Sydney, 8-11 April 2008, 8 p.

38. De Carvalho, P.V.R. (2011) The Use of Functional Resonance Analysis Method (FRAM) in a Mid-Air Collision to Understand Some Characteristics of the Air Traffic Management System Resilience. Reliability Engineering & System Safety, 96, 1482-1498. https://doi.org/10.1016/j.ress.2011.05.009

39. Hollnagel, E. (2004) Barriers and Accident Prevention. Ashgate Publishing, Aldershot.

40. Cacciabue, P.C. (2000) Human Factors Impact on Risk Analysis of Complex Systems. Journal of Hazardous Materials, 71, 101-116. https://doi.org/10.1016/S0304-3894(99)00074-6

41. Macchi, L. (2010) A Resilience Engineering Approach for the Evaluation of Performance Variability: Development and Application of the Functional Resonance Analysis Method for Air Traffic Management Safety Assessment (École Nationale Supérieure des Mines de Paris). https://pastel.archives-ouvertes.fr/pastel-00589633

42. Ishizaka, A. (2014) Comparison of Fuzzy Logic, AHP, FAHP and Hybrid Fuzzy AHP for New Supplier Selection and Its Performance Analysis. International Journal of Integrated Supply Management, 9, 1-22. https://doi.org/10.1504/IJISM.2014.064353

43. Zadeh, L.A. (2015) Fuzzy Logic—A Personal Perspective. Fuzzy Sets and Systems, 281, 4-20. https://doi.org/10.1016/j.fss.2015.05.009

44. Zadeh, L.A. (1997) Toward a Theory of Fuzzy Information Granulation and Its Centrality in Human Reasoning and Fuzzy Logic. Fuzzy Sets and Systems, 90, 111-127. https://doi.org/10.1016/S0165-0114(97)00077-8

45. Zadeh, L.A. (1996) Fuzzy Logic = Computing with Words. IEEE Transactions on Fuzzy Systems, 4, 103-111. https://doi.org/10.1109/91.493904

46. Zadeh, L.A. (1965) Fuzzy Sets. Information and Control, 8, 338-353. https://doi.org/10.1016/S0019-9958(65)90241-X

47. Sivanandam, S.N., Sumathi, S. and Deepa, S.N. (2007) Introduction to Fuzzy Logic Using MATLAB (Vol. 1). Springer, Berlin. https://doi.org/10.1007/978-3-540-35781-0

48. González Dan, J.R., Arnaldos, J. and Darbra, R.M. (2017) Introduction of the Human Factor in the Estimation of Accident Frequencies through Fuzzy Logic. Safety Science, 97, 134-143. https://doi.org/10.1016/j.ssci.2015.08.012

49. Shepard, R.B. (2005) Quantifying Environmental Impact Assessments Using Fuzzy Logic. Springer Science & Business Media, Berlin. https://doi.org/10.1007/0-387-28098-7

50. Mamdani, E.H. and Assilian, S. (1975) An Experiment in Linguistic Synthesis with a Fuzzy Logic Controller. International Journal of Man-Machine Studies, 7, 1-13. https://doi.org/10.1016/S0020-7373(75)80002-2

51. Sugeno, M. (1985) Industrial Applications of Fuzzy Control. Elsevier Science Inc., Hoboken.

52. Preston, C.C. and Colman, A.M. (2000) Optimal Number of Response Categories in Rating Scales: Reliability, Validity, Discriminating Power, and Respondent Preferences. Acta Psychologica, 104, 1-15. https://doi.org/10.1016/S0001-6918(99)00050-5

53. Roelen, A.L.C. and Klompstra, M.B. (2012) The Challenges in Defining Aviation Safety Performance Indicators. PSAM, Helsinki, 11.