Open Journal of Nursing
Vol.05 No.12(2015), Article ID:62363,10 pages

Perceptions on the Efficacy of Simulation

Holli Sowerby

Weber State University, Ogden, USA

Copyright © 2015 by author and Scientific Research Publishing Inc.

This work is licensed under the Creative Commons Attribution International License (CC BY).

Received 22 October 2015; accepted 26 December 2015; published 29 December 2015


This qualitative case study addressed recent graduates’ perceptions of the efficacy of simulation in registered nursing education. Finding clinical placement is one of the greatest challenges faced by schools of nursing. One possible solution is to use high-fidelity simulation manikins to substitute for clinical experience. The existing research regarding nursing simulation laboratory experiences focuses on improvement in exam scores and preparation for national board examinations. Very little research explores recently graduated nurses’ perceptions about the efficacy of simulation experiences. The conceptual framework for this study was constructivist theory, a process of experience and reflection; it is a dynamic process that changes as the learner internalizes the experience. This study allowed the researcher to understand how recent graduates perceive the value of simulation experiences. Two research questions were identified: 1) How do recent graduates of registered nurse (RN) education programs view the simulation lab experiences from nursing school? 2) In what ways do the perceptions of simulation experiences differ between associate degree RN program graduates and bachelor’s program RN graduates? A case study research design was used to explore the perceptions of recently graduated RNs about their experience with simulation. By interviewing recent graduates about their experiences with high-fidelity simulation in nursing school and documenting their perceptions of the efficacy of those simulation experiences, information was obtained that might allow schools of nursing to increase the effectiveness of the simulation experience or validate its applicability in the real world setting.


Patient Simulation, Education, Nursing, Nursing Simulation, Nursing Education Research

1. Introduction

Simulation use in nursing education has a lengthy history. As early as 1911, student nurses were practicing on manikins, now known as human patient simulators (HPS), at Hartford Training School. Mrs. Chase, a training manikin, featured jointed hips, elbows and knees, and after a few years of service, she was improved to include a wig, body orifices, and more realistic skin [1] . In 1932, the Indiana University Training School for Nurses operated a skills laboratory that provided student nurses with the opportunity to practice injections on manikins and to practice other treatments on each other [2] . The ability to practice in a controlled environment is essential to the preparation of nurses, and the use of simulation has continued to increase with many improvements since the days of Mrs. Chase in 1911.

The modern era of simulation began in the 1960s when the Laerdal Company developed a manikin to teach cardiopulmonary resuscitation (CPR). Resusci Anne was designed to provide learners with a realistic simulation of CPR. With a spring that mimicked chest compression and inflatable lungs, Resusci Anne was cutting edge [1] . These important developments were just the beginning of the advances that led to the sophisticated world of high-fidelity simulation.

As the technology has advanced, the cost of these training manikins has increased. Today’s HPS are some of the most complex ever seen. Manikins are rated by fidelity, specifically low, medium and high as defined below:

・ Low-fidelity: Mrs. Chase, noted above, would be a low-fidelity HPS, a task trainer suitable for teaching basic psychomotor skills such as the insertion of various medical devices.

・ Medium-fidelity: A more realistic patient on which to learn new and increasingly complex skills that require some feedback from the patient, for instance, lung, cardiac or bowel sounds can be listened to on a medium-fidelity HPS.

・ High-fidelity: These manikins allow for responses from the patient to the student’s actions. These highly complex computer-controlled machines can cry, speak, and exhibit symptoms of disease that require intervention from the student. Some specifically designed high-fidelity HPSs can even give birth. Major advances in technology have made a dramatic difference in how simulation is used today.

There is an increasing use of simulation laboratory experiences to replace clinical experiences, precipitated by the shortage of clinical placements. More research is needed to determine whether this method of education is effective.

Individual one-on-one interviews were conducted for this qualitative case study. Two questions provided the basis for the design of the interview:

1) How do recent graduates of registered nurse (RN) education programs view the simulation lab experiences from nursing school?

2) In what ways do the perceptions of simulation experiences differ between associate-degree RN program graduates and bachelor’s program RN graduates?

2. Materials and Methods

A small group of nine recent graduates were interviewed individually. These participants were volunteers from a group of recent graduates who were part of a residency program for newly graduated RNs at a local hospital system. The interviews were conducted and transcribed, and the data were analyzed and coded to identify themes. Before the interviews were conducted, the Walden University Institutional Review Board granted approval and the participants gave informed consent.

2.1. Subjects

The participants were part of a residency program for newly graduated RNs. This group of participants had recently completed their nursing education at various schools of nursing and thus provided a participant group with a wide variety of experiences. It was anticipated that there would be 8 - 10 participants. Face-to-face interviews were conducted with each of the participants. The education office of the hospital where the residency program was taught facilitated contact with the participants. These new RNs came from varied schools of nursing and had differing experiences with simulation. In total, nine interviews were conducted.

An invitation to participate in this research study was forwarded to current residency program participants and those who had completed the residency program in the past 18 months. This group was selected to ensure that participants in the research had been RNs for two years or less. As the hospital was unwilling to release contact information, the invitation to participate was forwarded by the education office at the hospital. The participants contacted me directly to schedule a mutually convenient time for a face-to-face interview.

The invitation was sent on three separate occasions and there were a total of nine respondents. All nine of the RNs who responded were interviewed. All interviews were recorded and transcribed verbatim. The longest interview held was 25 minutes and the shortest was only 7 minutes long. There were participants from three different nursing programs; seven of the participants were female and two were male. Two of the participants held bachelor’s degrees and the remaining seven were associate degree nurses. All the participants who volunteered were interviewed, and the results are included in this study. Demographics of the participants appear in Table 1.

2.2. Member Checking and Evidence of Quality

One method of ensuring credibility of research is member checking. This step can help to ensure that researcher biases are not influencing data analysis. To accomplish member checking, the participants were given the opportunity to view the written transcript of their individual recorded interview along with the preliminary coding and interpretation. The audio recordings of the interviews were transcribed by me and the initial coding was done. After this step, the transcript of the interview was returned to the participant via email and the participant was asked to reply with any clarifications they felt were necessary. Only one participant returned a clarification, regarding the number of simulations she participated in, stating that she had not included the regular weekly simulation in her total number and, upon reading the transcript, felt that she should clarify. Interviews continued until the responses were consistently repeated, which indicated a saturation of themes that is also desirable in qualitative research.

Another method of ensuring quality in research is providing rich, thick descriptions which allow readers to compare the research to local situations. Transferability is a goal of quality research, and by providing the opportunity for comparison, transferability can be accomplished. The final method used to ensure quality was having a peer reviewer evaluate the research. A confidentiality agreement was signed and the reviewer looked at the research before, during, and after the data gathering process. This person has been a nurse for over 40 years and has spent 22 years in nursing education. She has been an interim dean, a site reviewer for the National League for Nursing Accreditation Commission (now the Accrediting Commission for Education in Nursing or ACEN), and a university program director. These methods serve to ensure validity and accuracy of the research.

3. Analysis

Following each interview, the recording was transcribed by me. This allowed for immersion in the data. After multiple readings of the transcript, the preliminary coding was completed and the transcripts with the preliminary coding were returned to the participant for member checking. The coded data were entered into an electronic matrix which was stored on a personal computer and backed up to an external hard drive, both of which were password protected. Data analysis was an ongoing process. During the multiple readings, the descriptions that would be used in the text were identified. As interviews and transcription progressed, words or phrases that became new codes were identified and themes emerged. When no new codes emerged and codes repeated themselves after several interviews, it was determined that saturation had been reached.

Table 1. Demographics of research participants.

The data were reviewed and examined to assure that nothing was missed or coded incorrectly during the multiple readings and re-readings. When saturation was achieved and no new codes were revealed, the interviewing process was completed and analysis continued. All codes were entered into the previously mentioned electronic matrix and the codes were sorted into emerging themes. It was projected that 8 - 10 interviews would be necessary to reach saturation of themes; saturation was reached after 9 interviews during the course of 6 weeks.

4. Prior Research Results

While simulation has been used for many years, research regarding the efficacy of simulation has not kept pace with the increasing implementation of simulation programs. One area that has received some attention is the effect simulation has on confidence and decision making. Stirling, Smith and Hogg showed that simulation experiences early in a nurse’s career can have positive effects [3] . In a study by Kaddoura, new graduates perceived that the care they gave to critically ill patients was improved by simulation practice, which helped them make good clinical decisions [4] . An increase in self-confidence was mentioned in several studies of students and simulation experiences [5] -[8] . While several studies noted the improvement of confidence and competence for students, application of experiences by graduates has not been studied in depth. The ability to take skills from the simulation practice laboratory to the patient care setting is an obstacle faced by schools of nursing.

Blum, Borglund, and Parcells concluded that more research was needed regarding the transfer of knowledge gained during education to professional practice [5] . Kaplan and Ura found that “the student clinical experience is rich, yet challenges arise in providing experiences where leadership skills can be developed and used in nursing practice” [6] . Another area of difficulty with simulation laboratory education is evaluation of what is learned. Alinier, Hunt, Gordon, and Harwood conducted quantitative research about the effectiveness of simulation as measured by an examination. Their results showed that test scores improved [9] . Learning nursing skills in the clinical setting is vital to nursing education. Using simulation experiences to augment clinical education can improve student performance on testing, but there is no clear evidence that simulation experiences help to improve the application of skills to practice for nursing students.

Goodstone, Goodstone, Cino, Glaser, Kupferman, and Dember-Neal determined that over time students who participated in simulation activities using high-fidelity patient simulators saw an improvement in critical thinking skills as measured by testing which evaluated those skills [10] . Critical thinking is a skill that is extremely important for nurses as there are often many different factors to consider when providing patient care. By practicing critical thinking in the controlled environment of the simulation laboratory, students can gain needed skills to cope with common and even less common patient problems.

It is not possible to provide students with an opportunity to experience every possible event they will encounter as a new nurse. Using simulation to create learning experiences that do not routinely occur in the clinical setting allowed researchers Smith, Klaussen, Witt, Zimmerman and Cheng to implement a learning experience in which students participated in a high-fidelity human simulation (HFHS) scenario and applied the concepts they learned in class to legal and ethical dilemmas presented in the scenario [11] . The research compared the experiences of the group which received simulation education with two other groups of students, a face-to-face group and an online group. Both faculty and students identified HFHS as the best approach for implementing a learning experience regarding legal and ethical content. Another area that students do not have adequate exposure to is the pediatric environment. In research regarding a pediatric simulation, Richard determined that students were highly satisfied with a pediatric asthma simulation and felt that they were able to apply concepts learned in the classroom in a safe learning environment. Flexibility on the part of the student and adequate time and support for faculty were found to be important factors in the effective use of simulation in the nursing curriculum [12] .

It has always been a challenge to bridge what is referred to as the theory-practice gap. Schools of nursing struggle with ensuring that lessons learned in the classroom are applicable to patient care in the clinical setting. Finan, Bismalla, Campbell, LeBlanc, Jefferies, and Whyte compared the success rate of two groups of residents related to infant intubation. One group participated in simulation exercises and the other did not. While the simulation group was more proficient on the post-test, their proficiency diminished quickly. At the end of eight weeks, there was no significant difference in the two groups’ success rates. Conclusions were made that while immediate skill may improve in the simulation environment, these skills may not transfer to the clinical setting [13] . In studies exploring the student perspective, students often say that they like simulation experiences, but more research is needed to validate the transfer of this experience to the clinical setting. Students identified, “…transient feelings of confusion”, but perceived that, hypothetically, they could apply the knowledge from the simulation to actual events [14] . The actual transfer of knowledge gained in simulation to the clinical setting is an area that has not been fully researched.

In the only study located that evaluated student skill on actual patients, Kirkman followed 42 nursing students conducting respiratory assessments. Their skill was evaluated on actual patients before the lecture was given, after the lecture, and again after the simulation experience. Results showed significant improvement at each evaluation. This research concluded that students are able to effectively apply their learning from simulation experiences in the clinical setting [15] .

4.1. Study Results

The five major themes that were identified during the data analysis phase were, in order of frequency: 1) environmental and technical factors; 2) preparation for nursing tasks; 3) human factors; 4) communication; and 5) caliber of the equipment. The questions were divided among four categories: questions one and two (conducting simulation); questions four, five, and six (comparing to real life); questions three, seven, eight, and nine (exploring value and importance); and question ten (additional comments). Instead of using names, the participants were assigned numbers: P1, P2, P3, P4, P5, P6, P7, P8, and P9. The transcripts were coded and 181 total codes were identified, which were then divided among the 5 themes. Environmental and technical factors, such as equipment working and videotaping, generated the most responses, with 39% of the total responses in that theme. All participants stated that they perceived that simulation helped with preparation for nursing tasks, 33% of the total responses. Caliber of the equipment was mentioned by eight of the nine participants. The ability to become engaged in a scenario and have an experience similar to real life is enhanced by higher fidelity manikins. This theme appeared in 7% of the coded responses. Communication, at 8%, was presented as a problem in the simulation environment when trying to communicate with the manikin. It was also mentioned as a valuable tool of simulation when students learned to communicate with peers, patient family members, and doctors. It was mentioned as less of an issue than the human factors, at 13%. The codes that were assigned to the human factors theme were those relating to the importance of the instructor, having other students take the simulation experiences seriously, and working in groups. The data supporting the themes are presented in the following sections.

4.2. Conducting Simulation

The first two questions participants answered were to gather data about the frequency of simulation activities, and how the program conducted simulation activities: 1) How often during a semester did you participate in a simulation laboratory experience involving a human patient simulator (manikin)? 2) Describe how your nursing school carried out simulation laboratory experiences. Please provide examples. After analysis, it was determined that the majority of students participated in simulation laboratory experiences weekly, with six of the nine participants giving the initial response of weekly. One participant clarified, through member checking, that weekly participation in simulation was a correct response rather than the two times stated during the interview. The participant further clarified that the initial response of two referred to midterm and final scenario simulation activities. This brought the total to seven of nine participants who participated in weekly simulation activities. Participants also stated that some of the simulation experiences were just practice time, and others were a more involved scenario based experience.

The majority of participants stated that they were divided into groups, and the simulation laboratory experience began with a demonstration of the skill the participants would be practicing during the simulation. Eight of the nine participants said they were given a scenario, report, or case study to work with, and that simulation experiences lasted 2 or 3 hours. One theme that was mentioned throughout the interviews, in responses to various questions, was caliber of the equipment. “Better manikins made it more realistic” (P2) and “Using the more realistic manikins made it a good experience” (P6) were two such responses to question two. The next group of questions asked about how manikins compare to real life.

4.3. Comparing to Real Life

Questions four, five, and six were asked to explore the participants’ perceptions of how simulation is applied in the real world setting. Question (d): Since graduation, what real life situations have you experienced that simulation prepared you for? Please provide examples. Here again, the theme of caliber of the equipment was mentioned. Three participants responded that assessment skills were learned and polished through simulation. One participant stated that because of simulation, “I’m more thorough in my examination…you need to be thorough in your assessments” (P2). The assessment skills are included in preparation for nursing tasks, a theme that developed from the research. Participants mentioned gaining experience with feeding tubes, Foley catheters, vital sign trending, and blood administration in addition to the previously mentioned assessment, as nursing tasks for which simulation prepared them. By repeating these tasks multiple times in simulation laboratory experiences, they gained confidence and the ability to perform these tasks in real life situations: “Overall, I did learn a lot of my skills in simulation labs” (P1). When responding to this question, participant five said, “Every time a patient goes bad I think, okay, what did we do…okay, I can totally do this” (P5).

Communication is another theme that was identified during data analysis. In simulation, participants practiced communicating with other team members: “It helps you to communicate and learn what to say to each other” (P6). Some participants also learned how to call a doctor, “getting orders from the doctor” (P1), and “I felt like it gave me a lot of practice on how to call doctors” (P4), while others, in response to questions five and six, felt that calling doctors was an area in which they wanted more simulation practice.

After asking what experiences simulation prepared them for, the next question asked participants to consider the following: (e) Since graduation, what experiences have you had that previous simulation experience could have prepared you for? Please provide examples. Here, the theme communication was mentioned again. Two participants felt that they did not get enough practice calling doctors. Participants five and nine respectively stated, “I think the only thing is, I’m not as comfortable calling doctors as I’d like” (P5) and “they don’t really prepare you to call a doctor too much” (P9). Another communication issue that participants would have liked to experience in simulation was communication in psychiatric situations. “Unfortunately I work in psych right now and it’s really just learning as you go, and trying to learn your de-escalation” (P7) was one comment made in response to question five. Another such response was also recorded, “I think maybe a simulation with like a psych patient…there wasn’t a ton of psych stuff” (P3). Of the nine participants interviewed, three could not identify any areas that were not covered by their simulation experiences. Participant nine stated, “I think they were pretty well generalized, that it really helped with a lot of things” (P9). After responding to a question about what real life experiences simulation prepared them for, and what simulation might have prepared them for, the participants were then asked to compare simulation to real life in question number six, which stated, (f) How do simulation experiences compare to the real life experience in a hospital setting? Six of the participants responded that simulation was either different, or very different from real life. Of the items listed that make simulation different, communication issues were mentioned by five of the participants.
Their responses, listed here, help to illustrate some of the communication difficulties experienced in simulation laboratories: “People can tell you things that manikins can’t tell you…in the hospital there is interaction” (P8). “Manikins can make some noises and can look those ways, it’s different…” (P5). “The communication barrier is the most difficult thing to kind of overcome, talking to a manikin and not having them make eye contact and stuff like that is the most awkward” (P4). “They [manikins] don’t say ouch, they don’t get stressed” (P3). Along with communication difficulties, students mentioned that you can control simulation: you can pause, slow down, or even stop to ask questions. This ability is seen as an advantage; “I feel like the most valuable piece of simulation was being able to reflect on what we were doing. So, being able to say, hold on one second, go back, let’s do this again…I liked that” (P1). It was also mentioned as a disadvantage, “It’s a lot more intimidating in real life than it is in there” (P9). This is part of the environmental and technical factors theme that emerged during data analysis. There are several other topics that fall into the environmental and technical factors theme, and they are covered in the next group of questions.

4.4. Exploring Value and Importance

The value and importance of simulation were assessed using questions three, seven, eight, and nine. In this group of questions, the fact that simulation provided an opportunity for hands-on practice was mentioned by five of the participants. This hands-on practice is a part of the environmental and technical factors theme that was mentioned in the previous section. Also mentioned, three times in this group of questions, was the topic of caliber of the equipment. Question three states (c) How would you describe your overall experience with simulation? Please provide examples. All nine participants interviewed stated that they liked simulation, or that it was a good experience. One of the participants did state “Sometimes I thought they were kind of boring if you weren’t the one participating” (P1). This comment, along with a comment from participant seven, “a lot of people weren’t taking it completely serious…they’d do it once or twice…then they pretty much messed around for the rest of it” (P7), and the comment that “If it had been in smaller groups that you had to do more in then I think that would have been more valuable” (P9), highlight the difficulty of keeping students engaged, part of another important theme, human factors. The problem of a participant perceiving that students were not taking simulation seriously was mentioned three times in response to question number three specifically, and five times in response to other questions, for a total of eight occurrences; this also falls into the human factors theme.

Question seven asks (g) What was the most valuable aspect of simulation in your program? Two participants mentioned code simulations as being most valuable, “As a new nurse it’s always scary to think about your patient all of the sudden…having a code and it’s hard to think about what you would do. I liked that simulation because going through the process, helped me think about things that you might not have thought about” (P3). These comments along with comments about, “feeling less awkward with patients” (P7), “Prepares you for patients in a less stressful setting” (P6), and “Practice helped to review the process” (P3), are all examples of the theme preparation for nursing tasks. Participant seven also noted that, “It’s a good starting place, it’s that good foundation, its building that foundation. But, over time it stops becoming real life” (P7). This comment illustrated another issue in the human factors theme. Another theme that was revisited in question seven is the caliber of the equipment. Participant nine stated, “The patient actually had a heartbeat, bowel sounds, lung sounds…plus they had the microphone and everything, and the person could talk like they were the patient. The manikin is very helpful, made it a little bit more real” (P9). Following question seven about the most valuable aspects of simulation, question eight asks about the least valuable: (h) Which was the least valuable aspect of simulation in your program? Once again the theme caliber of the equipment surfaced. Two participants perceived that, “having the manikin that said things to you, or reacted to what you were doing, helped more than just having the [other] manikins in the lab” (P6) and “I remember the baby manikins we had they were, I mean you couldn’t even tell when they were having difficulty breathing” (P4). With higher fidelity manikins, the participant gets a closer to real life experience.
Other comments included that simulations were too long with too many students, both of which are part of the environmental and technical factors theme. Having discussed the most and least valuable, the next question deals with the participant’s perception of importance.

Question nine asked (i) What aspect of simulation in nursing school do you perceive as the most important and why? Responses to this question generated the first mention of the importance of the instructor in simulation. The importance of the instructor was mentioned by two participants in response to this question, and by three different participants in response to the final question. Participant one mentioned the importance of, “a simulation teacher who can really show you what you’re doing is right or wrong” (P1). Another participant stated, “I think the instructor makes a huge difference, because when we had a substitute it just didn’t go over well…I think them being in charge, and knowing what they want from us, and having it all organized well helped a lot. I think that’s probably the most important” (P5). These comments, a total of five, regarding the importance of the instructor are included in the human factors theme. Comments that were categorized into the environmental and technical factors theme were, “the equipment needs to work” (P2), and “videotaping and reviewing in class” (P9). The theme preparation for nursing tasks was represented by the comments “I think just establishing basic nursing skills on the manikin is super important” (P4), “Actually practicing skills is the most important” (P3), and “Going over the things that can really end somebody’s life and the warning signs leading up to that” (P7).

4.5. Additional Comments

The concluding question asked (j) What additional information might you provide to assist in better understanding your experience with simulation as a nurse? This open-ended question allowed students to add any additional information that they felt was pertinent to the topic. All five of the identified themes were represented in the responses to this question. Participants three and six commented on the importance of higher fidelity manikins. Participant three stated, “I think it’s really beneficial and as they’re going to come out with newer and newer manikins, it’s going to become almost like a real life situation” (P3). When talking about simulation at another school, with higher fidelity manikins, participant 6 said, “…it would have been helpful to practice there more often” (P6). When discussing preparation for nursing tasks, participant two responded that being able to practice cardio-pulmonary resuscitation (CPR) and have the chest move was valuable, and participant 6 said it was more helpful to actually do skills than just talk about them. Communication was represented by participant 3, who responded that “the hardest part for me is when they don’t talk back. Some people are just great at pretending but I’m like waiting for a response” (P3). The environmental and technical factors mentioned by the participants were the need for simulation to be more educational and to have written scenarios, and the observation that simulation is less stressful after a participant has passed boards and is not being graded on participation. Human factors were mentioned frequently. Three participants replied that the instructor was important. Participant four stated, “I think having instructors who make you feel comfortable, and confident, is really important instead of making you feel funny and awkward during the whole thing” (P4).
Participant 7 also felt the instructor was an important piece of simulation, “…it seems like depending on the instructor, depending on how much experience they have and depending on experiences, depends on how much they can add to it” (P7). The other human factor mentioned was getting students involved and taking it seriously. Participant seven boldly stated, “…it’s in everybody’s own individual hands. Which is great if you are with people who are equally wanting to participate but if you’re not you kind of get screwed over” (P7). Only one of the participants had no additional comments to add on question ten.

5. Discussion

The purpose of this research was to determine recent graduates’ perceptions of the efficacy of simulation laboratory experiences. Specifically, how did the recent graduates view the simulation experiences of nursing school, and were there differences between associate’s and bachelor’s degree students’ perceptions? The findings from this study showed that graduates found simulation useful and applicable in certain circumstances. The importance of the instructor and of hands-on experience was stressed, and factors that detracted from the experience, such as too many students and not enough structure, were emphasized. No difference was noted between the perceptions of associate’s degree RNs and bachelor’s degree RNs.

A commonality between the literature reviewed and this study was that much of the success of an HPS experience depended on the skill of the instructor. The conclusion is that through careful planning of the experience and thorough debriefing afterward, the learner can experience increased confidence in his or her ability to perform in situations similar to the simulation [16] . Garrett, MacPhee, and Jackson are also proponents of a thorough reflective debriefing [17] . Debriefing is mentioned often in the literature, and it was an area that participants in this study also referred to.

Brewer found that while HPS can be a valuable tool in nursing education, the instructor’s skill and technique was an important factor in the success of the technique as a learning tool. The importance of evaluation was emphasized, and more research was recommended to identify what made up successful simulation [18] . Adamson and Kardong-Edgren also discuss the importance of evaluation in their article. Their quantitative study documented the reliability of three tools designed to evaluate student performance in simulation scenarios; twenty-nine faculty participants from across the United States viewed video archives of nursing students engaged in simulation activities [19] . Analysis of student performance is becoming increasingly important as the use of simulation grows, and it is a topic for further research.

It is important to determine whether the skills learned in the simulation laboratory transfer to the real world of patient care as a nurse. Handley and Dodge highlighted the reality that there is no clear method for incorporating or evaluating the effectiveness of simulation in nursing education. Likewise, there is no established method to evaluate “…its effect on student competency within clinical practice” [20] . Okuda et al. determined that while there was evidence to support the claim that simulation was an effective method for teaching skills, few studies linked simulation with an improvement in patient outcomes [21] . In fact, Finan, Bismilla, Campbell, LeBlanc, Jefferies, and Whyte found that simulation training did not translate to improved performance in the clinical setting when evaluating medical students learning the procedures for intubating a newborn. The authors researched the reasons for this and found that participants felt increased anxiety in the clinical setting versus the simulation laboratory [13] . This contrasts with the results of this study, in which participants felt that simulations helped to ease anxiety in the clinical setting.

Limitations of this study include the small sample size that is characteristic of qualitative research, which limits the generalizability of the findings. Other difficulties encountered were restricted access to the participants and a constrained time frame. The participants were all part of a residency program for newly graduated RNs, and the institution where they worked would not allow access to their contact information. This resulted in delays, as the invitation to participate and reminders were forwarded through the education office instead of directly by the researcher. As is often the case, this phase took longer than expected, which resulted in interviews being scheduled during the busy winter holiday season. These factors may have reduced the number of volunteers willing to participate. Nonetheless, saturation of themes was reached and both research questions were answered during the course of the study.

Lack of clinical placement has required nursing educators to consider substituting simulation laboratory experiences with HPS for clinical time caring for real patients [22] . The ability to transfer the knowledge and skills learned in the simulation laboratory to the clinical setting has not been well researched [20] [21] . The use of simulation continues to increase; however, adequate research to support this increase has not been done [23] . More research should be conducted to establish the effectiveness of simulation laboratory experiences and how well they are applied in the clinical setting. Such research would assist schools of nursing in designing effective simulation programs.


Acknowledgements

H.S. thanks her Walden University doctoral committee for their support during the research process and Sigma Theta Tau International, Nu Nu chapter for the grant that helped to fund this research.

Cite this paper

Holli Sowerby (2015) Perceptions on the Efficacy of Simulation. Open Journal of Nursing, 05, 1123-1132. doi: 10.4236/ojn.2015.512119


References

1. Nickerson, M. and Pollard, M. (2010) Mrs. Chase and Her Descendants: A Historical View of Simulation. Creative Nursing, 16, 101-105.

2. Brewer, E.P. (2011) Successful Techniques for Using Human Patient Simulation in Nursing Education. Journal of Nursing Scholarship, 43, 311-317.

3. Adamson, K.A. and Kardong-Edgren, S. (2012) A Method and Resources for Assessing the Reliability of Simulation Evaluation Instruments. Nursing Education Perspectives, 33, 334-339.

4. Handley, R. and Dodge, N. (2013) Can Simulated Practice Learning Improve Clinical Competence? British Journal of Nursing, 22, 529-535.

5. Okuda, Y., Bryson, E.O., DeMaria, S., Jacobson, L., Quinones, J., Shen, B., Levine, A.I., et al. (2009) The Utility of Simulation in Medical Education: What Is the Evidence? Mount Sinai Journal of Medicine, 76, 330-343.

6. National Council of State Boards of Nursing (2011) 2011 Nurse Licensee Volume and NCLEX Examination Statistics. NCSBN Research Brief, Vol. 37.

7. Schiavenato, M. (2009) Reevaluating Simulation in Nursing Education: Beyond the Human Patient Simulator. Journal of Nursing Education, 48, 388-394.

8. Garrett, B., MacPhee, M. and Jackson, C. (2010) High-Fidelity Patient Simulation: Considerations for Effective Learning. Nursing Education Perspectives, 31, 309-313.

9. Inch, J. (2013) Perioperative Simulation Learning and Post-Registration Development. British Journal of Nursing, 22, 1166-1172.

10. Kirkman, T.R. (2013) High Fidelity Simulation Effectiveness in Nursing Student’s Transfer of Learning. International Journal of Nursing Education Scholarship, 10, 1-6.

11. Wotton, K., Davis, J., Button, D. and Kelton, M. (2010) Third-Year Undergraduate Nursing Students’ Perceptions of High-Fidelity Simulation. Journal of Nursing Education, 49, 632-639.

12. Finan, E., Bismilla, Z., Campbell, C., LeBlanc, V., Jefferies, A. and Whyte, H.E. (2012) Improved Procedural Performance Following a Simulation Training Session May Not Be Transferable to the Clinical Environment. Journal of Perinatology, 32, 539-544.

13. Richard, J.J. (2009) Beginning Experiences in Simulation: Asthma in a Pediatric Patient. Clinical Simulation in Nursing, 5, e5-e8.

14. Smith, K.V., Klaassen, J., Witt, J., Zimmerman, C. and Cheng, A. (2012) High-Fidelity Simulation and Legal/Ethical Concepts: A Transformational Learning Experience. Nursing Ethics, 19, 390-398.

15. Goodstone, L., Goodstone, M.S., Cino, K., Glaser, C.A., Kupferman, K. and Dember-Neal, T. (2013) Effect of Simulation on the Development of Critical Thinking in Associate Degree Nursing Students. Nursing Education Perspectives, 34, 159-162.

16. Alinier, G., Hunt, B., Gordon, R. and Harwood, C. (2006) Effectiveness of Intermediate-Fidelity Simulation Training Technology in Undergraduate Nursing Education. Journal of Advanced Nursing, 54, 359-369.

17. Reid-Searle, K., Eaton, A., Vieth, L. and Happell, B. (2011) The Educator inside the Patient: Student’s Insights into the Use of High Fidelity Silicone Patient Simulation. Journal of Clinical Nursing, 20, 2752-2760.

18. Partin, J.L., Payne, T.A. and Slemmons, M.F. (2011) Students’ Perceptions of Their Learning Experiences Using High-Fidelity Simulation to Teach Concepts Relative to Obstetrics. Nursing Education Perspectives, 32, 186-188.

19. Kaplan, B. and Ura, D. (2010) Use of Multiple Patient Simulators to Enhance Prioritizing and Delegating Skills for Senior Nursing Students. Journal of Nursing Education, 49, 371-377.

20. Blum, C.A., Borglund, S. and Parcells, D. (2010) High-Fidelity Nursing Simulation: Impact on Student Self-Confidence and Clinical Competence. International Journal of Nursing Education Scholarship, 7, 1-14.

21. Kaddoura, M. (2010) New Graduate Nurses’ Perceptions of the Effects of Clinical Simulation on Their Critical Thinking, Learning, and Confidence. Journal of Continuing Education in Nursing, 41, 506-516.

22. Stirling, K., Smith, G. and Hogg, G. (2012) The Benefits of a Ward Simulation Exercise as a Learning Experience. British Journal of Nursing, 21, 116-122.

23. Davis, H.E. (1932) A Workable Nursing Laboratory. American Journal of Nursing, 32, 387-391.