Open Journal of Modelling and Simulation
Vol. 03, No. 02 (2015), Article ID: 54926, 7 pages
10.4236/ojmsi.2015.32004

Integrated Testing Environment of Instructor Station for SMART Simulator

Joon Ku Lee1, Geun Ok Park1, Keung Koo Kim1, Woo Seok Huh2, Kwang Young Sohn2*

1Korea Atomic Energy Research Institute, Daejeon City, Korea

2MIRAE Engineering Co., Ltd., Daejeon City, Korea

Email: *kwangyoung.sohn@mirae-en.co.kr

Copyright © 2015 by authors and Scientific Research Publishing Inc.

This work is licensed under the Creative Commons Attribution International License (CC BY).

http://creativecommons.org/licenses/by/4.0/

Received 15 February 2015; accepted 20 March 2015; published 23 March 2015

ABSTRACT

SMART is a reactor that has been under development at the Korea Atomic Energy Research Institute (KAERI) for many years to provide small-to-medium scale power, typically for seawater desalination. KAERI has now issued the Standard Safety Analysis Report (SSAR) and acquired Standard Design Approval (SDA) for SMART. To conduct the design verification and validation required for licensing, an integrated simulation test environment has been regarded as an important asset for operator training and system validation. It is composed of 1) the system-specific simulation codes formerly developed under the name of Nuclear Plant Analyzer (NPA), covering the NSSS and BOP simulation, 2) the Instructor Station (IS), 3) Supervisory Control and Data Acquisition (SCADA), 4) the operator and instructor Human Machine Interface (HMI), and 5) soft-controllers. These sub-components have been designed and implemented to verify and validate the SMART design, to train operators, and to generate backup data for licensing. This paper introduces the structure of the integrated simulation test environment for SMART, explains the efforts made to support the system-specific simulation code interfaces, and addresses the work of implementing and optimizing the test environment while maintaining its simulation functionality and performance so that the simulation results can be reviewed efficiently.

Keywords:

System-Integrated Modular Advanced Reactor (SMART), Instructor Station (IS), Nuclear Plant Analyzer (NPA), Software Development Life Cycle (SDLC)

1. Introduction

SMART technology is a sensible mixture of innovative concepts and conventional technologies intended to improve safety, reliability and economy [1]. Although it is an unprecedented design without operating experience or reference plants, it has acquired Standard Design Approval (SDA) and its Standard Safety Analysis Report (SSAR) has been issued. This means that a test environment for the verification and validation of the SMART design is required, one commensurate with system behavior validation, operator training and support for further optimization [2]. A Nuclear Plant Analyzer (NPA) was previously developed for verifying the behavior of the NSSS and BOP systems. Extending this research, KAERI is now developing a full-scope simulator supported by the Integrated Test Environment (ITE) for the SMART reactor, which is composed of system-specific simulation codes, the Instructor Station, and MMI (Man-Machine Interface) displays for the operators' interfaces.

2. Main Control Room (MCR) Test Facility

The digitalized MCR for the ITE, which closely follows the configuration shown in Figure 1, is composed of the following segments:

・ A Large Display Panel (LDP);

・ A workstation for a Reactor Operator (RO);

・ A workstation for a Turbine Operator (TO);

・ A workstation for a Supervisory RO (SRO);

・ A workstation for safety shutdown;

・ An auxiliary workstation.

Operators' actions in the MCR are accomplished through digital interface means such as the LDP, Flat Panel Displays (FPD) and soft-controllers (SC). The LDP provides overall and key information to the operators in the MCR so that they can share a common understanding of the status of SMART operations.

The information on the LDP should be recognizable by the operators as quickly and clearly as possible. The parameters on the center panel of the LDP are fixed to encompass all the key and overall information of SMART. The four wing panels are variable and provide supporting information such as alarm lists, graphs, and detailed displays; the operator can change the display pages indicated on the wing panels. The FPD provides detailed information down to the component level, and the operators navigate the display pages on the FPD to obtain detailed, hierarchically organized information on SMART operations. The SC provides touch-sensitive interfaces on the FPD to control components such as pumps, valves, motors and fans. The Man Machine Interface System (MMIS) includes the following systems:

Figure 1. Overall configuration of SMART MCR.

・ Information Processing System (IPS);

・ Alarm and Indication System (AIS);

・ Post-Accident Monitoring System (PAMS);

・ Inadequate Core Cooling Monitoring System (ICCMS);

・ Severe Accident Monitoring System (SAMS);

・ Soft controllers for safety and non-safety systems.

3. The Structure of Integrated Simulation Test Environment

3.1. The Implementation of Instructor Station (IS)

The IS is established on a server platform based on the 3 Key Master environment, on which the system-specific simulation codes run. It also provides a mechanism for sharing simulation values among the system-specific simulation codes. Figure 2 and Table 1 show the functional configuration of the IS and the information for each system-specific simulation model. The Instructor Station communicates bi-directionally with the SCADA server to exchange simulation data, and with the hand switches in the Safety Shutdown Control Panel (SSCP) to obtain the operator's control inputs, as diagrammed in Figure 3.

Figure 2. Interface diagram of SMART simulator.

Table 1. Scan cycle for each module.

Note: MARS: Multi-dimensional Analysis of Reactor Safety; MATRA: Multichannel Analyzer for Static and Transient in Rod Array; MASTER: Multi-purpose Analyzer for Static and Transient of Reactors; BOP: Balance of Plant; SCOPS: SMART COre Protection System; SCOMS: SMART COre Monitoring System.

Figure 3. Functions in Instructor Station.

Because the scan cycles of the simulation engines are not identical (Table 1), some care is required to produce a consistent chronological ordering of the interface variables and obtain refined data; this is provided by a software module for simulation test aid in the ITE. The Instructor Station provides the functions for training and design verification indicated in Figure 3, including the injection and deletion of malfunctions, the generation and deletion of snapshots, and various functions for simulator operation. Based on the integrated simulation test environment, the IS provides functional commands such as fast, backtrack, halt and rewind with the support of the 3 Key Master environment.
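To make the role of these commands concrete, the following is a minimal Python sketch of how an instructor-station command layer could be organized around snapshot and malfunction handling; all class, method, and command names here are hypothetical illustrations and do not reflect the actual IS or 3 Key Master implementation.

```python
from dataclasses import dataclass
from typing import Dict

# Hypothetical sketch of an instructor-station command layer; names are illustrative only.

@dataclass
class Snapshot:
    label: str
    state: Dict[str, float]          # interface variable name -> value

class InstructorStation:
    def __init__(self, core):
        # `core` stands in for the simulation host and is assumed to expose
        # read_state(), write_state(), set_speed(), apply_malfunction(), clear_malfunction().
        self.core = core
        self.snapshots: Dict[str, Snapshot] = {}
        self.active_malfunctions: Dict[str, dict] = {}

    # --- simulator operation commands (run / halt / fast) ---
    def halt(self):
        self.core.set_speed(0.0)

    def run(self, speed: float = 1.0):
        self.core.set_speed(speed)   # e.g. 2.0 for fast operation

    # --- snapshot generation and deletion ---
    def take_snapshot(self, label: str):
        self.snapshots[label] = Snapshot(label, dict(self.core.read_state()))

    def restore_snapshot(self, label: str):
        self.core.write_state(self.snapshots[label].state)

    def delete_snapshot(self, label: str):
        self.snapshots.pop(label, None)

    # --- malfunction injection and deletion ---
    def inject_malfunction(self, mal_id: str, target: str, severity: float):
        self.active_malfunctions[mal_id] = {"target": target, "severity": severity}
        self.core.apply_malfunction(mal_id, target, severity)

    def delete_malfunction(self, mal_id: str):
        if mal_id in self.active_malfunctions:
            self.core.clear_malfunction(mal_id)
            del self.active_malfunctions[mal_id]
```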

3.2. SCADA Implementation

The SCADA server is a data archive that interfaces directly with the IS through TCP/IP communication and relays the simulation inputs and process inputs in response to requests from the operator's GUIs, which display those values on the LDP, IPS display, AIS display, PAM/ICCM display, and SC display. It therefore contains a large on-line and off-line database and communicates bi-directionally with the IS and the GUIs through OPC and, optionally, TCP/IP connections. In particular, the interface between SCADA and the MMI systems uses "lazy binding" to enhance the performance of the simulator environment.
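As an illustration of the "lazy binding" idea, the sketch below binds a SCADA tag to a reader only when a display first requests it, instead of subscribing to every tag at start-up; the resolver interface and tag names are assumptions made for this example and are not the actual SCADA API.

```python
from typing import Callable, Dict

class LazyTagBinder:
    """Bind SCADA tags to display readers only on first access (lazy binding sketch).

    `resolve` is assumed to be a hypothetical function that performs the expensive
    lookup/subscription against the SCADA server (e.g. via OPC) and returns a reader.
    """

    def __init__(self, resolve: Callable[[str], Callable[[], float]]):
        self._resolve = resolve
        self._bound: Dict[str, Callable[[], float]] = {}   # tag name -> reader

    def read(self, tag: str) -> float:
        # Bind the tag on first use; later reads reuse the cached reader, so
        # displays that never show a tag never pay its subscription cost.
        if tag not in self._bound:
            self._bound[tag] = self._resolve(tag)
        return self._bound[tag]()

# Usage sketch: only the tags actually shown on a display page get bound.
def fake_resolver(tag: str):
    print(f"subscribing to {tag}")            # expensive step happens once per tag
    return lambda: 0.0                        # stand-in for a live value reader

binder = LazyTagBinder(fake_resolver)
value = binder.read("RCS_PRESSURE")           # triggers the subscription
value = binder.read("RCS_PRESSURE")           # served from the cached binding
```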

3.3. Operator’s HMI Implementation

All data coming from and going to SCADA are displayed in the operator's GUIs. The GUI display pages are designed mainly on the basis of the Process and Instrumentation Diagrams (P&ID) and the user requirements. The GUIs are composed of the IPS display, AIS display, and PAM/ICCM displays for operator use, together with the Severe Accident Monitor (SAM) and Computerized Procedure System (CPS), allocated appropriately to the RO, AO and SRO consoles. The major components described in Section 2 are connected to the interfacing systems through OPC and/or TCP/IP connections [3].

4. Efforts for Simulation Integrity

4.1. Logical Fidelity

The total number of process tags, excluding internal calculation points, amounts to more than 700 channels, which is a fairly large amount of data for simulation. The simulator handles these data in both on-line and off-line formats.

Inherently, the simulator operates on a single source of process input, neglecting the remaining three channels of the safety systems and the remaining one channel of the non-safety systems in SMART. To eliminate this limitation, the SMART simulator dedicates a signal generator for the four safety and two non-safety channels, which generates the variables necessary to simulate SMART and, temporarily, variables that are not directly involved in the simulation itself [4].
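A minimal sketch of such a channel signal generator is shown below, under the assumption that each redundant channel simply receives the single simulated value with a small perturbation; the function and tag names are illustrative only.

```python
import random
from typing import Dict, List

def fan_out_channels(value: float,
                     n_safety: int = 4,
                     n_nonsafety: int = 2,
                     noise_frac: float = 0.001) -> Dict[str, List[float]]:
    """Expand a single simulated process value into redundant channel signals.

    Illustrative sketch only: each channel receives the same source value with a
    small random perturbation, so that downstream logic (e.g. channel voting)
    sees four safety channels and two non-safety channels instead of a single
    simulation point.
    """
    def perturb(v: float) -> float:
        return v * (1.0 + random.uniform(-noise_frac, noise_frac))

    return {
        "safety": [perturb(value) for _ in range(n_safety)],
        "non_safety": [perturb(value) for _ in range(n_nonsafety)],
    }

# Example: one simulated pressurizer pressure expanded to 4 + 2 channel signals.
channels = fan_out_channels(15.0)   # MPa, illustrative value
```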

4.2. Malfunction List

The SMART simulator currently considers operation at 100% of thermal power output. The list of malfunctions is derived from the analysis of the Design Basis Events (DBE) for SMART, as listed in a) through g) below. The malfunctions are categorized into two groups: system-level and component-level malfunctions (a minimal data-structure sketch is given after the list). The major accident types, which form the basis for deriving the malfunction list, are as follows; Table 2 shows some examples.

Table 2. Malfunction list for SMART.

a) Excessive heat removal by the BOP (Balance of Plant);

b) Failed or insufficient heat removal by the BOP;

c) Low reactor coolant flow;

d) Anomaly of reactivity and/or power;

e) High reactor coolant inventory;

f) Low reactor coolant inventory;

g) Radiation release from subsystems or components.
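As a minimal illustration of how the system-level and component-level categorization above could be represented in a malfunction database, the following Python sketch defines a hypothetical malfunction record; the identifiers and example entries are assumptions, not the actual SMART malfunction list.

```python
from dataclasses import dataclass
from enum import Enum

class MalfunctionLevel(Enum):
    SYSTEM = "system"
    COMPONENT = "component"

@dataclass
class Malfunction:
    """Illustrative record for one entry of the malfunction list (names hypothetical)."""
    mal_id: str                 # e.g. "MAL-RCS-001"
    level: MalfunctionLevel     # system-level or component-level malfunction
    category: str               # one of the DBE-derived categories a) through g)
    target: str                 # affected system or component
    severity: float = 1.0       # 0.0 (none) .. 1.0 (full), adjustable by the instructor

# Example entries corresponding to categories c) and f) above (values illustrative).
examples = [
    Malfunction("MAL-RCP-TRIP", MalfunctionLevel.COMPONENT,
                "c) low reactor coolant flow", target="Reactor Coolant Pump 1"),
    Malfunction("MAL-LOCA-SMALL", MalfunctionLevel.SYSTEM,
                "f) low reactor coolant inventory", target="Reactor Coolant System"),
]
```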

5. The Analysis for Simulation Results

5.1. Analysis of Simulation Behavior

At the core of this integrated simulation test environment, the integrator needs to produce the simulation results for the system-specific simulation engines described in Figure 2 and Table 1 so that each simulation engine developer can compare the results against those of the other engines. For this purpose, the ITE for the SMART simulator provides the following capabilities:

1) The capability to verify simulation behavior by inspecting all the interface variables, i.e. those of all modules in Figure 2 and Table 1;

2) The capability to detect divergence patterns.

The software supporting items 1) and 2) above has been developed as a separate program running independently on the IS, which ultimately produces an Excel sheet. With this functionality established in the simulation environment, each system-specific engine designer is able to analyze the execution results of the engines, as addressed in the following sections.

5.2. The Pattern among Multi Engines

Table 3 presents a sample log (with a limited set of variables) that checks the chronological consistency of the interface variables, for example between MARS and MATRA, among up to 700 points of the simulation modules listed in Figure 2 running in the same time frame. Since the scan time of each simulation engine is an important factor in generating the simulation test file, the highest time resolution is selected as the basis timeline.

In Table 3, which is a subset of the ITE test results, the time step denotes the selected timeline and variables A, B and C denote variables belonging to specific simulation engine models. These results are transmitted to the engineer or designer of the specific simulation engine for further analysis and enhancement.
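The sketch below illustrates, under simplified assumptions, how interface-variable logs recorded at different scan cycles can be aligned onto the highest-resolution timeline and exported to an Excel sheet; the column names, mock data, and use of pandas are choices made for this example and do not describe the actual ITE aid software.

```python
from typing import Dict
import pandas as pd

def align_logs(logs: Dict[str, pd.DataFrame]) -> pd.DataFrame:
    """Merge per-engine logs (columns: 'time' plus variables) onto one timeline."""
    # Use the engine with the smallest median scan interval as the basis timeline.
    basis = min(logs, key=lambda name: logs[name]["time"].diff().median())
    merged = logs[basis].sort_values("time")
    for name, df in logs.items():
        if name == basis:
            continue
        # Carry each slower engine's latest value forward onto the basis timeline.
        merged = pd.merge_asof(merged, df.sort_values("time"),
                               on="time", direction="backward")
    return merged

# Mock logs: a fast engine scanned every 0.1 s and a slower one every 0.25 s.
fast = pd.DataFrame({"time": [0.0, 0.1, 0.2, 0.3], "var_A": [1.0, 1.1, 1.2, 1.3]})
slow = pd.DataFrame({"time": [0.0, 0.25], "var_B": [5.0, 5.5]})
table = align_logs({"fast_engine": fast, "slow_engine": slow})
table.to_excel("ite_interface_log.xlsx", index=False)   # requires the openpyxl package
```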

Table 3. Example of ITE result (excerpt).

Table 4. Variables diverging by more than +/−10%.

5.3. The Detection of Divergence

The aid software for simulation result analysis monitors each target variable in order to detect whether any variable diverges abruptly to an extreme, which could eventually cause the simulation engine to stop. The major concern of this function is to identify the divergence pattern so that the engine developer can determine the cause and effect within the system.

The ITE software module for detecting variable divergence monitors the simulation variables that swing from one extreme to another, which may trigger various events or halt the simulation engine.

The divergence criterion is set to +/−10% of the engineering range of each variable and is, of course, adjustable. Table 4 shows an example.
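A minimal sketch of such a divergence check is given below, under the assumption that "divergence" is flagged when a variable changes by more than the threshold fraction of its engineering range between consecutive scans; the variable names and ranges are illustrative, and the 10% default is adjustable as stated above.

```python
from typing import Dict, List, Tuple

def detect_divergence(samples: Dict[str, List[float]],
                      eng_range: Dict[str, Tuple[float, float]],
                      threshold: float = 0.10) -> Dict[str, List[int]]:
    """Flag scan indices where a variable moves by more than `threshold` of its
    engineering range between consecutive scans (illustrative sketch only)."""
    flagged: Dict[str, List[int]] = {}
    for name, values in samples.items():
        lo, hi = eng_range[name]
        span = hi - lo
        steps = [i for i in range(1, len(values))
                 if abs(values[i] - values[i - 1]) > threshold * span]
        if steps:
            flagged[name] = steps
    return flagged

# Example: a pressure trace that jumps abruptly between scans 2 and 3.
trace = {"PRZ_PRESSURE": [15.0, 15.1, 15.2, 18.9]}   # MPa, illustrative values
ranges = {"PRZ_PRESSURE": (0.0, 20.0)}               # engineering range
print(detect_divergence(trace, ranges))              # {'PRZ_PRESSURE': [3]}
```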

The detection of events such as trips, halts and various alarms is a useful aid that enables the instructor or operator to find the causes and their effects during simulation. For this purpose, a detection agent analyzes the event scenarios sampled from each simulation engine, such as the MARS, MATRA, MASTER and BOP models, in order to identify the variables contributing to the various events, so that the simulation engine developers can identify the causes and effects in system behavior for further enhancement; this capability is being developed in the upcoming phase.

6. Conclusions

The selective preparation of design documents and their verification and validation based on IEEE codes and standards have been conducted for the integrated testing environment for the SMART simulator [5]-[7]. In establishing the integrated simulation environment, all the facilities of a control room for real operation have been constructed to enhance the physical fidelity of the simulator and to develop the human interface with the system simulation engines. As a result, the design and operation of the integrated testing environment for the SMART simulator form a solid basis for verifying the system design, training the operators, accumulating quasi-operating data for analysis and licensing, and optimizing the SMART design based on all of these results. This represents the best effort to support the verification of the simulation engines for SMART.

The simulation engines, which currently simulate only 100% power operation, will be expanded to cover the various plant operation modes. The following technical issues should be resolved for a complete simulation:

1) As the integrated simulation environment evolves, more detailed malfunction scenarios, including initial conditions and severe accident scenarios, should be refined for the design verification of the SMART reactor;

2) Some models of the Balance of Plant (BOP) system are still under development; for a completely integrated simulation environment for SMART, those simulation engines should be completed and integrated into the existing environment;

3) In addition, a method for generating the control rod position in both automatic and manual operating modes shall be devised and established in order to verify the system behavior of the SMART reactor.

Acknowledgements

This work has been supported by a National Research Foundation of Korea (NRF) grant funded by the Korean government (MSIP) (No. 2012M2A8A4025979). Special thanks go to Dr. J. K. Lee for supporting this work.

References

  1. Kim, K.K., Lee, W.J., Choi, S., Kim, H.R. and Ha, J.J. (2014) SMART: The First Licensed Advanced Integral Reactor. Journal of Energy and Power Engineering, 8, 94-102.
  2. Regulatory Guide 1.149 (2011) Nuclear Power Plant Simulation Facilities for Use in Operator Training, License Examinations, and Applicant Experience Requirements, Revision 4.
  3. IEEE Std. 1023-2004 (2004) IEEE Guide to the Application of Human Factors Engineering to Systems, Equipment, and Facilities of Nuclear Power Generating Stations.
  4. ANSI/ANS-3.5-2009 (2009) Nuclear Power Plant Simulators for Use in Operator Training and Examination.
  5. IEEE Std. 829-2008 (2008) Standard for Software and System Test Documentation.
  6. IEEE Std. 829-2008 (2008) IEEE Standard for Software and System Test Documentation.
  7. ANSI/IEEE Std. 1008-1987 (1987) IEEE Standard for Software Unit Testing.

NOTES

*Corresponding author.