Creative Education
Vol. 9, No. 8 (2018), Article ID: 85707, 16 pages
DOI: 10.4236/ce.2018.98091

Influence of Expert Video Feedback, Peer Video Feedback, Standard Video Feedback and Oral Feedback on Undergraduate Medical Students’ Performance of Basic Surgical Skills

Marieke Lehmann1, Jasmina Sterz1, Maria-Christina Stefanescu2, Julian Zabel1, Kenan Dennis Sakmen1, Miriam Ruesseler1*

1Department of Trauma, Hand and Reconstructive Surgery, University Hospital Frankfurt, Goethe University Frankfurt, Theodor Stern Kai, Frankfurt, Germany

2Centre of Surgery, University Hospital Frankfurt, Goethe University, Theodor Stern Kai, Frankfurt, Germany

Copyright © 2018 by authors and Scientific Research Publishing Inc.

This work is licensed under the Creative Commons Attribution International License (CC BY 4.0).

http://creativecommons.org/licenses/by/4.0/

Received: April 29, 2018; Accepted: June 26, 2018; Published: June 29, 2018

ABSTRACT

Purpose: In daily clinical practice, sterile working conditions, as well as patient safety and self-protection, are essential. Thus, these skills should be taught appropriately during undergraduate training. Receiving constructive feedback can significantly improve future performance, and reviewing one’s own performance using video tools is a further useful approach. This study investigates the impact of different modes of video feedback on the acquisition of practical surgical skills, namely wound management and a bedside test. Methods: Third-year medical students completed a structured training of practical skills as part of their mandatory surgery rotation. All students received the same practical skills training for performing wound management and a bedside test. For feedback on their performance, however, students were assigned to one of four study groups: expert video feedback (feedback from an expert after reviewing the recorded performance), peer video feedback (feedback from a fellow student after reviewing the recorded performance), standard video feedback (giving feedback on a standardized video of the skill), or oral feedback (feedback from an expert without a video recording). Afterwards, students completed two 5-minute OSCE stations in which their acquired competencies were assessed. Effects on long-term retention were measured at two further measurement points. Results: A total of 199 students were included in the study (48 for expert video feedback, 49 for peer video feedback, 52 for standard video feedback, and 50 for oral feedback). All teaching methods were feasible in the given timeframe of 210 minutes per module. There were almost no statistically significant differences among the groups with regard to the technical and non-technical ratings at the three measurement points. Conclusion: In the present study, video-assisted feedback in its various forms offered no significant benefit over oral feedback alone during simulation-based patient encounters.

Keywords:

Oral Feedback, Video Feedback, Expert Video Feedback, Undergraduate Medical Training, Surgery, Prospective Comparative Effectiveness Analysis

1. Introduction

A medical expert should have the technical, procedural, and interpersonal competencies required to maintain patient safety and provide quality medical care (Frank et al., 2015). In particular, sterile working conditions are fundamental to the daily clinical routine, especially in surgery. Even in highly developed medical systems, hospital-acquired infections are a major problem (Brennan et al., 2004). Patient safety and self-protection are further important issues, especially where blood products are concerned. Negligence can lead to severe consequences, for example, graft-versus-host disease in patients and needle-stick injuries in physicians (Gerberding, 2003). To avoid these failures and their consequences in clinical practice, sterile technique and safety practices should be part of undergraduate training prior to clinical rotations. Learning these skills can be accomplished in various ways.

Simulation-based learning provides a safe environment in which mistakes can be evaluated without putting patients at risk (Issenberg, McGaghie, Petrusa, Lee Gordon, & Scalese, 2005). Another important aspect is feedback, which is an essential tool in medical education (Cantillon & Sargeant, 2008; van de Ridder, Stokking, McGaghie, & ten Cate, 2008). Feedback can be distinguished by its quality, quantity, and method of delivery (Fanning & Gaba, 2007). It can be delivered by an expert or by a peer (a person of the same rank or standing). Peer feedback has become a popular tool in medical education and can lead to better comprehension of tasks and increased self-confidence, motivation, and camaraderie (Beard, O’Sullivan, Palmer, Qiu, & Kim, 2012; Yeh et al., 2015). Despite this, the literature is mixed regarding learners’ appreciation of the effectiveness of peer feedback (Burgess & Mellis, 2015; English, Brookes, Avery, Blazeby, & Ben-Shlomo, 2006).

In addition, reviewing one’s performance by video may be a useful supplement. Some studies have demonstrated that video feedback is a potent and efficacious teaching instrument (Birnbach et al., 2002; Farquharson, Cresswell, Beard, & Chan, 2013; Oseni et al., 2017; Ruesseler, Sterz, Bender, Hoefer, & Walcher, 2017), while others found no significant effect (Byrne et al., 2002; Sawyer et al., 2012) or, in some cases, a worse outcome in comparison to oral feedback alone (Savoldelli et al., 2006). However, video feedback combined with peer or expert feedback improved students’ communication skills (Krause, Schmalz, Haak, & Rockenbauch, 2017). Aside from that, Nesbitt et al. (Nesbitt, Phillips, Searle, & Stansby, 2015) found that “unsupervised video feedback” followed by reviewing video of an expert demonstrating a suturing task combined with a video including “hints and tips” was similar to individual “supervised video feedback.”

Another approach might be a debriefing in which the student watches a video of the task being performed by another person in order to improve the student’s own skills, as “in the field of motor learning, visual learning strategies, such as learning by observation or by imitation, as well as by video demonstration, are well-established” (Sigrist, Rauter, Riener, & Wolf, 2013).

Although the literature describes different ways to integrate feedback, and in particular video feedback as outlined above, into the learning process, we were unable to identify a study comparing the effect of these different methodologies, either in general or specifically for the acquisition of surgical skills.

The aim of this study was to assess the influence of different application modes of video feedback (in particular, standardized video versus oral feedback) on the acquisition of competency in sterile working skills and a bedside test that included obtaining informed consent and performing a blood transfusion. Furthermore, we explored whether a standardized video can substitute for individual feedback.

2. Methods

2.1. Study Design

This prospective comparative effectiveness study with four parallel study arms analyzed the influence of individual video feedback by experts or peers, standardized video feedback, and oral feedback on the acquisition of basic surgical skills.

2.2. Participants

Study participants were 3rd-year undergraduate medical students in a 6-year curriculum completing a mandatory 1-week surgical skills training prior to a 2-week traineeship in a surgical clinic. All students were asked to participate voluntarily in the study, and written informed consent was obtained. Using an online survey, we collected basic data on each student’s age, sex, and duration of academic studies.

The study was performed according to the ethical principles of the Declaration of Helsinki (Ethical Principles for Medical Research Involving Human Subjects). According to the Ethics Board of the medical faculty of Goethe University, no ethics approval was necessary for implementing this study.

Prior to each surgical training week, students were allocated to learning groups by the deanery, independently of the authors and of study participation. For the purposes of the study, these learning groups were then randomly assigned to one of the four feedback methods, likewise independently of study participation.

2.3. Interventions

The 1-week surgical training, referred to as “training of practical clinical skills in surgery,” is designed to teach general and basic surgical skills. Students participate in 12 training modules that involve skills labs, role-playing, and simulation (Russeler et al., 2010).

The study was conducted in the modules “Wound Management” and “Puncture and Injection,” which were taught in small groups of students (a maximum of six per group). All students received the same standardized slide presentations for each module explaining the theoretical background of the tasks for wound management and the bedside test.

The module “Wound Management” simulated an entire systematic sequence of wound management on a mannequin with authentic hospital supplies, as a healthcare provider would typically encounter in the clinic. A major learning objective was to employ a correct sterile approach, which involved cleaning the wound, preparing and injecting local anesthesia, preparing a sterile working surface for suturing the wound, putting on sterile gloves, covering the wound with a sterile surgical incise drape, and, finally, after a deep wound inspection (the suturing itself was skipped), covering the wound with a plaster.

The module “Bedside Test” included performing a bedside test as well as obtaining informed consent for and performing a blood transfusion, with particular emphasis on patient safety and self-protection.

Initially, each task in the modules was demonstrated by a specially trained peer tutor. Students then practiced together in pairs: one student played the “doctor,” and the other played either the “assistant” or, in the case of the bedside test, the “patient.” Afterwards, students received feedback using one of the feedback methods described below, based on their randomly assigned study group.

2.4. Feedback Methods

Group 1―Expert video feedback

In the expert video feedback group, each student was videotaped while performing the task, and the recordings were reviewed in student pairs immediately after completion of the task. Feedback was given by an expert using a five-step feedback sheet. These five steps assessed what went well, what could be improved, what went badly, what was missing, and what the take-home message was for each student. The videotape was deleted immediately after review for data privacy reasons.

Group 2―Peer video feedback

In the peer video feedback group, each student was likewise videotaped, and the recordings were reviewed immediately after completion of the task. However, feedback on the performance was given by the peers within the learning group, who used the five-step feedback sheet described above. The videotape was deleted immediately after review.

Group 3―Standard video

For the standard video group, students received no individual feedback after completing the task. Instead, each pair of students was shown a standardized video of an expert performing the respective task. This standard video contained mistakes that represent common difficulties for trainees, e.g., releasing the needle incorrectly into the disposal box. After watching the video, each student was instructed to give feedback on the video shown according to the five-step feedback protocol. Finally, the tutor concluded with additional feedback, adding any points not yet mentioned on the five-step feedback sheet, which ensured that all students received the same feedback on the standard video.

Group 4―Oral feedback

Students in the oral feedback group received feedback from the tutor immediately after completing the task using the five-step feedback sheet. However, the students in this group were not videotaped.

After receiving feedback according to the respective method, all students had the opportunity to practice again while being supervised by the tutor in order to improve their performance.

2.5. Measurements

To assess the students’ competencies, the objective structured clinical examination (OSCE) format was used, which is a valid and reliable instrument for assessing clinical competence (Hodges, 2003; Regehr, MacRae, Reznick, & Szalay, 1998). Students had to complete two stations within 5 minutes each and received brief feedback afterwards. One station evaluated the students’ wound management skills as learned in the module. The second station evaluated the bedside test, which included obtaining informed consent and performing a blood transfusion with a trained simulated patient. The first measurement of the “Bedside Test” task took place immediately after the training; the “Wound Management” task was assessed on the same day or on the morning of the day after the training was completed.

The second measurement was part of a voluntary OSCE training, and it took place 2 weeks before the third measurement, which was part of a mandatory summative 10-Station OSCE assessment at the end of the semester (approx. 6 - 12 weeks after initial training).

Students were rated using checklists at both OSCE stations for each of the three measurements. The checklists were divided into two parts. Part A assessed technical skills, and Part B assessed non-technical skills. Examiners were blinded to the study group and were trained prior to the OSCE in order to gain experience using the checklist.
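To make the reported scores concrete: each checklist rating is expressed as a percentage of the achievable score, separately for the technical and non-technical parts and in total. The following minimal sketch illustrates this conversion; the field names, point values, and the way the two parts are combined into a total are hypothetical, since the actual checklists are not reproduced here.

```python
from dataclasses import dataclass


@dataclass
class StationRating:
    """One student's checklist rating at one OSCE station (hypothetical structure)."""
    part_a_points: int   # technical items achieved
    part_a_max: int      # maximum achievable technical points
    part_b_points: int   # non-technical items achieved
    part_b_max: int      # maximum achievable non-technical points

    def part_a_percent(self) -> float:
        return 100.0 * self.part_a_points / self.part_a_max

    def part_b_percent(self) -> float:
        return 100.0 * self.part_b_points / self.part_b_max

    def total_percent(self) -> float:
        # Assumption: the total score simply pools the points of both parts.
        return (100.0 * (self.part_a_points + self.part_b_points)
                / (self.part_a_max + self.part_b_max))


# Example with hypothetical point values: 18/22 technical and 7/8 non-technical points
rating = StationRating(part_a_points=18, part_a_max=22, part_b_points=7, part_b_max=8)
print(round(rating.total_percent(), 1))  # 83.3
```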

2.6. Statistical Methods

The data were analyzed using Microsoft Excel (Version 2016) and IBM SPSS 19 (IBM Corp., Armonk, NY, USA). Data are presented as means ± standard deviation of the percentage scores, and normality was tested using the Kolmogorov-Smirnov test. If the data for a variable followed a Gaussian distribution, parametric tests were applied; otherwise, non-parametric tests were used. To analyze differences among the groups, the mean scores of all groups were compared using parametric one-way ANOVA or the non-parametric Kruskal-Wallis test. For comparisons between single groups after ANOVA, the Tukey test was used as a post hoc test; in the absence of variance homogeneity, the Games-Howell test was used instead. After the Kruskal-Wallis test, the Dunn-Bonferroni test was used as a post hoc test for comparisons between single groups.
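The group-comparison logic described above (normality check, then one-way ANOVA with a Tukey post hoc test, or the Kruskal-Wallis test) can be summarized in a short analysis sketch. The original analysis was carried out in Excel and SPSS; the Python snippet below, using SciPy and statsmodels, is therefore only a hypothetical illustration of the decision flow, and the function name and data layout are assumptions.

```python
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd


def compare_feedback_groups(scores_by_group, alpha=0.05):
    """Compare checklist scores (% of total) across the four feedback groups.

    scores_by_group: dict mapping a group label to a 1-D array of scores.
    """
    labels = list(scores_by_group)
    groups = [np.asarray(scores_by_group[label], dtype=float) for label in labels]

    # Kolmogorov-Smirnov test against a fitted normal distribution for each group
    normal = all(
        stats.kstest(g, "norm", args=(g.mean(), g.std(ddof=1))).pvalue > alpha
        for g in groups
    )

    if normal:
        # Parametric route: one-way ANOVA with Tukey HSD post hoc.
        # (The Games-Howell alternative for unequal variances is omitted here.)
        stat, p = stats.f_oneway(*groups)
        post_hoc = pairwise_tukeyhsd(
            endog=np.concatenate(groups),
            groups=np.repeat(labels, [len(g) for g in groups]),
            alpha=alpha,
        )
    else:
        # Non-parametric route: Kruskal-Wallis test.
        # (Dunn-Bonferroni post hoc comparisons are omitted here.)
        stat, p = stats.kruskal(*groups)
        post_hoc = None

    return stat, p, post_hoc
```

The Games-Howell and Dunn-Bonferroni post hoc alternatives mentioned above are only noted in the comments and not implemented in this sketch.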

For comparisons among the three measurements, parametric t-tests for dependent samples were used if the data were normally distributed, and non-parametric Wilcoxon tests were used otherwise. For these comparisons between the three measurement points, the calculated p-values were subjected to the Bonferroni correction to account for the multiple-testing problem. Significance was assumed at p < 0.05; for the Bonferroni-corrected comparisons between the three measurement points, the significance level was adjusted to 0.017 (0.05/3). Checklist score reliability was estimated using Cronbach’s alpha coefficient, and educational effect sizes were analyzed using Cohen’s d.
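The within-group comparisons across the three measurement points and the reported reliability and effect-size statistics can be sketched in the same hypothetical way. The helper functions below follow the description above (paired t-test or Wilcoxon test, the Bonferroni-adjusted significance level of 0.017, Cohen’s d with a pooled standard deviation, and the standard Cronbach’s alpha formula); they approximate, but do not reproduce, the authors’ SPSS procedure.

```python
import numpy as np
from scipy import stats

BONFERRONI_ALPHA = 0.05 / 3  # three pairwise comparisons between measurement points


def compare_measurement_points(before, after):
    """Paired comparison of one group's scores at two measurement points."""
    before, after = np.asarray(before, dtype=float), np.asarray(after, dtype=float)
    diff = after - before
    # Normality of the paired differences decides between the two tests
    if stats.kstest(diff, "norm", args=(diff.mean(), diff.std(ddof=1))).pvalue > 0.05:
        stat, p = stats.ttest_rel(after, before)   # parametric test for dependent samples
    else:
        stat, p = stats.wilcoxon(after, before)    # non-parametric alternative
    return stat, p, p < BONFERRONI_ALPHA


def cohens_d(a, b):
    """Educational effect size between two samples, using the pooled standard deviation."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    na, nb = len(a), len(b)
    pooled_sd = np.sqrt(((na - 1) * a.var(ddof=1) + (nb - 1) * b.var(ddof=1)) / (na + nb - 2))
    return (a.mean() - b.mean()) / pooled_sd


def cronbach_alpha(item_scores):
    """Checklist reliability; item_scores is a 2-D array (rows = students, columns = items)."""
    items = np.asarray(item_scores, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)
```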

3. Results

A total of 199 students agreed to participate in the study. The characteristics of the participants and the number of participating students at each measurement point in each group are shown in Table 1.

All feedback methods were feasible within the 210 minutes available for each module. While oral feedback took about 15 minutes during the feedback round, individual video feedback by experts or peers took about 20 to 30 minutes (bedside test) or 30 to 40 minutes (wound management), including the time needed to watch the video of the student’s performance. The standard video feedback took about 15 to 20 minutes, including watching the video.

There were almost no statistically significant differences, and only small educational effect sizes, among the groups with respect to the technical and non-technical ratings at the three measurement points (Table 2(a) and Table 2(b); Table 3(a) and Table 3(b)). When comparing students’ performance across the three measurement points, there were significant improvements in all groups (Table 4(a) and Table 4(b)).

4. Discussion

Receiving individual, well-structured feedback is essential for improving students’ performance (Issenberg et al., 2005; Mahmood & Darzi, 2004).

Table 1. Characteristics of the groups. Group 1: Expert video feedback; Group 2: Peer video feedback; Group 3: Standard video; Group 4: Oral feedback. For the items “age,” “duration of study” and “number of previous OSCE,” the data are presented as the means.


Table 2. (a) Wound management: Results of checklist rating for Measurement points 1 - 3. Results of the checklist rating in total and for each checklist part. Given are the mean scores in % of the total score ± standard deviation. Group 1: Expert video feedback; Group 2: Peer video feedback; Group 3: Standard video; Group 4: Oral feedback; (b) Wound management: Differences and educational effect sizes between study groups for Measurement points 1 - 3. Group 1: Expert video feedback; Group 2: Peer video feedback; Group 3: Standard video; Group 4: Oral feedback.


Table 3. (a) Bedside test: Results of checklist rating for Measurement points 1 - 3. Results of the checklist rating in total and for each checklist part. Given are the mean scores in % of the total score ± standard deviation. Group 1: Expert video feedback; Group 2: Peer video feedback; Group 3: Standard video; Group 4: Oral feedback; (b) Bedside test: Differences and educational effect sizes between study groups for Measurement points 1 - 3. Group 1: Expert video feedback; Group 2: Peer video feedback; Group 3: Standard video; Group 4: Oral feedback.


Table 4. (a) Wound management: Differences and educational effect sizes between Measurement points 1 - 3 for each group. Group 1: Expert video feedback; Group 2: Peer video feedback; Group 3: Standard video; Group 4: Oral feedback; (b) Bedside test: Differences and educational effect sizes between Measurement points 1 - 3 for each group. Group 1: Expert video feedback; Group 2: Peer video feedback; Group 3: Standard video; Group 4: Oral feedback.

Results in the literature are mixed concerning the effectiveness and appreciation of peer oral feedback compared with expert oral feedback (Burgess & Mellis, 2015; English et al., 2006; Krause et al., 2017; Yeh et al., 2015). In addition to the debriefing process, reviewing one’s performance using a video may be a useful instructional supplement, but the literature regarding the effectiveness of video-enhanced learning is inconclusive (Birnbach et al., 2002; Farquharson et al., 2013; Oseni et al., 2017; Ruesseler et al., 2017; Savoldelli et al., 2006; Sawyer et al., 2012).

Ten Cate and Durning stated that peer-assisted learning offers educational value to students at their own cognitive level and creates a comfortable and safe educational environment that enhances the efficiency of the learning process (Ten Cate & Durning, 2007). However, peers should not simply replace an expert; rather, they could complement a professional instructor’s know-how and experience.

Reviewing one’s own performance on video may complement the learning experience through self-scrutiny and self-reflection. In addition, visual learning methods (such as learning by observation or imitation, as well as by video demonstration) are well established with regard to motor learning (Sigrist et al., 2013). Therefore, watching a video of the task being performed by another person may also benefit students’ learning. Although this approach lacks individual feedback, the activity of mirror neurons may contribute to a similar effect: these neurons show the same pattern of activity when a process is observed as when the task is actively performed (Rizzolatti & Sinigaglia, 2007).

In the present study, we did not find any differences between peer video feedback, expert video feedback, oral feedback, and viewing a standard video. However, we were able to demonstrate an increase in the students’ competence after they received feedback. Thus, for undergraduate learners, expert feedback, video feedback, or the use of a standard video might be useful when beginning to learn a complex skill.

Regardless of the outcomes for the different feedback methods, there are clear obstacles related to the resources and time required for video feedback. Additional technical equipment is needed, such as cameras, laptops, memory cards, and tripods. Furthermore, supplementary staff was needed for recording and reviewing the videos with the participants. Admittedly, the standard video took less time during the training itself; however, its production required a non-recurring investment of time, actors, and equipment, as well as extra preparation prior to the surgical training weeks.

A learning method is also characterized by its long-term effects. In the present study, this effect was evaluated at the second and third measurement points. Previous studies suggest that long-term learning improvements are associated with video feedback (Birnbach et al., 2002). In this study, however, all four methods of feedback led to clear improvements. The third measurement was part of a mandatory summative assessment. As a consequence, students studied intensively prior to this assessment, so the results may not reflect actual long-term effects when comparing the feedback methods, and this needs to be considered when interpreting the results (Raupach, Brown, Anders, Hasenfuss, & Harendza, 2013). To minimize this bias, we measured students at a second time point 2 weeks prior to the summative assessment. At this time point, there was no difference among the feedback groups. Keeping this in mind, the learning effect of all four feedback methods seems similar with respect to long-term retention.

In our study, the videos were deleted immediately after review due to data privacy concerns, although a previous study found that taking the video home to watch repeatedly enhanced the learning benefit of video feedback (Farquharson et al., 2013).

This study was conducted as part of the curricular training in surgery, and each training module was limited to 210 minutes. All four feedback methods could be successfully integrated into the structured training; the video feedback groups required the whole allotted time, while the oral feedback groups finished earlier. Reviewing videotapes was more time-consuming, even when two pairs of participants held their feedback rounds concurrently in two separate rooms. In addition, the length of the videos and the repetition of the task (initial demonstration by a tutor, a first practice round, watching two individual videos or one standard video, and a second practice round) may have caused information overload, which in turn might have reduced students’ intrinsic motivation in the second practice round. Furthermore, because all study groups were required to practice twice under tutor supervision in order to ensure a comparable educational standard, the benefit of adding a video (individual or standard) might be understated in the results.

Since this study was conducted within a single cohort of medical students at a single medical school, its explanatory power and transferability to other medical schools might be limited. However, the sample sizes for both tasks and for all three measurements were large, especially considering the available literature. Additionally, the effects observed did not contradict these results.

A notable advantage of our work is that the study was performed within the students’ curriculum, and the mandatory training reflected the conditions of real training scenarios, such as changing tutors and differing motivation among participants. Moreover, the feasibility of the video-enhanced feedback methods could be analyzed in the context of a surgical training curriculum and its defined setting.

The strengths of this study include the large sample size, three measurements for short- and long-term retentiveness, an established training environment, a defined feedback structure, trained tutors, and instructed and blinded OSCE examiners.

5. Conclusion

Within the context of the curricular surgical training in the present study, video-assisted feedback in the form of expert video feedback, peer video feedback, and a standard video offered no significant benefit over oral feedback alone during simulation-based patient encounters.

Declaration of Interest

This work was supported by the German Federal Ministry of Education and Research (Grant 01PL12038A) as part of the joint research project, “Practical clinical competence―A joint program to improve training in surgery.” All authors declare that they have no conflicts of interest.

Cite this paper

Lehmann, M., Sterz, J., Stefanescu, M.-C., Zabel, J., Sakmen, K. D., & Ruesseler, M. (2018). Influence of Expert Video Feedback, Peer Video Feedback, Standard Video Feedback and Oral Feedback on Undergraduate Medical Students’ Performance of Basic Surgical Skills. Creative Education, 9, 1221-1236. https://doi.org/10.4236/ce.2018.98091

References

1. Beard, J. H., et al. (2012). Peer Assisted Learning in Surgical Skills Laboratory Training: A Pilot Study. Medical Teacher, 34, 957-959. https://doi.org/10.3109/0142159X.2012.706340

2. Birnbach, D. J., et al. (2002). The Effectiveness of Video Technology as an Adjunct to Teach and Evaluate Epidural Anesthesia Performance Skills. Anesthesiology, 96, 5-9. https://doi.org/10.1097/00000542-200201000-00007

3. Brennan, T. A., et al. (2004). Incidence of Adverse Events and Negligence in Hospitalized Patients: Results of the Harvard Medical Practice Study I. Quality & Safety in Health Care, 13, 145-151. https://doi.org/10.1136/qshc.2002.003822

4. Burgess, A., & Mellis, C. (2015). Receiving Feedback from Peers: Medical Students’ Perceptions. Clinical Teacher, 12, 203-207. https://doi.org/10.1111/tct.12260

5. Byrne, A. J., et al. (2002). Effect of Videotape Feedback on Anaesthetists’ Performance While Managing Simulated Anaesthetic Crises: A Multicentre Study. Anaesthesia, 57, 176-179. https://doi.org/10.1046/j.1365-2044.2002.02361.x

6. Cantillon, P., & Sargeant, J. (2008). Giving Feedback in Clinical Settings. BMJ, 337, a1961. https://doi.org/10.1136/bmj.a1961

7. English, R., et al. (2006). The Effectiveness and Reliability of Peer-Marking in First-Year Medical Students. Medical Education, 40, 965-972. https://doi.org/10.1111/j.1365-2929.2006.02565.x

8. Fanning, R. M., & Gaba, D. M. (2007). The Role of Debriefing in Simulation-Based Learning. Simulation in Healthcare, 2, 115-125. https://doi.org/10.1097/SIH.0b013e3180315539

9. Farquharson, A. L., et al. (2013). Randomized Trial of the Effect of Video Feedback on the Acquisition of Surgical Skills. British Journal of Surgery, 100, 1448-1453. https://doi.org/10.1002/bjs.9237

10. Frank, J. R., et al. (Eds.) (2015). CanMEDS Physician Competency Framework. Ottawa: Royal College of Physicians and Surgeons of Canada. http://www.royalcollege.ca/rcsite/documents/canmeds/canmeds-full-framework-e.pdf

11. Gerberding, J. L. (2003). Clinical Practice: Occupational Exposure to HIV in Health Care Settings. New England Journal of Medicine, 348, 826-833. https://doi.org/10.1056/NEJMcp020892

12. Hodges, B. (2003). Validity and the OSCE. Medical Teacher, 25, 250-254. https://doi.org/10.1080/01421590310001002836

13. Issenberg, S. B., et al. (2005). Features and Uses of High-Fidelity Medical Simulations That Lead to Effective Learning: A BEME Systematic Review. Medical Teacher, 27, 10-28. https://doi.org/10.1080/01421590500046924

14. Krause, F., et al. (2017). The Impact of Expert and Peer Feedback on Communication Skills of Undergraduate Dental Students: A Single-Blinded, Randomized, Controlled Clinical Trial. Patient Education and Counseling, 100, 2275-2282. https://doi.org/10.1016/j.pec.2017.06.025

15. Mahmood, T., & Darzi, A. (2004). The Learning Curve for a Colonoscopy Simulator in the Absence of Any Feedback: No Feedback, No Learning. Surgical Endoscopy and Other Interventional Techniques, 18, 1224-1230. https://doi.org/10.1007/s00464-003-9143-4

16. Nesbitt, C. I., et al. (2015). Randomized Trial to Assess the Effect of Supervised and Unsupervised Video Feedback on Teaching Practical Skills. Journal of Surgical Education, 72, 697-703. https://doi.org/10.1016/j.jsurg.2014.12.013

17. Oseni, Z., et al. (2017). Video-Based Feedback as a Method for Training Rural Healthcare Workers to Manage Medical Emergencies: A Pilot Study. BMC Medical Education, 17, 149. https://doi.org/10.1186/s12909-017-0975-3

18. Raupach, T., et al. (2013). Summative Assessments Are More Powerful Drivers of Student Learning than Resource Intensive Teaching Formats. BMC Medicine, 11, 61. https://doi.org/10.1186/1741-7015-11-61

19. Regehr, G., et al. (1998). Comparing the Psychometric Properties of Checklists and Global Rating Scales for Assessing Performance on an OSCE-Format Examination. Academic Medicine, 73, 993-997. https://doi.org/10.1097/00001888-199809000-00020

20. Rizzolatti, G., & Sinigaglia, C. (2007). Mirror Neurons and Motor Intentionality. Functional Neurology, 22, 205-210.

21. Ruesseler, M., et al. (2017). The Effect of Video-Assisted Oral Feedback versus Oral Feedback on Surgical Communicative Competences in Undergraduate Training. European Journal of Trauma and Emergency Surgery, 43, 461-466. https://doi.org/10.1007/s00068-016-0734-x

22. Russeler, M., et al. (2010). Training of Practical Clinical Skills in Surgery: A Training Concept for Medical Students. Zentralblatt für Chirurgie, 135, 249-256. https://doi.org/10.1055/s-0030-1247355

23. Savoldelli, G. L., et al. (2006). Value of Debriefing during Simulated Crisis Management: Oral versus Video-Assisted Oral Feedback. Anesthesiology, 105, 279-285. https://doi.org/10.1097/00000542-200608000-00010

24. Sawyer, T., et al. (2012). The Effectiveness of Video-Assisted Debriefing versus Oral Debriefing Alone at Improving Neonatal Resuscitation Performance: A Randomized Trial. Simulation in Healthcare, 7, 213-221. https://doi.org/10.1097/SIH.0b013e3182578eae

25. Sigrist, R., et al. (2013). Augmented Visual, Auditory, Haptic, and Multimodal Feedback in Motor Learning: A Review. Psychonomic Bulletin & Review, 20, 21-53. https://doi.org/10.3758/s13423-012-0333-8

26. Ten Cate, O., & Durning, S. (2007). Peer Teaching in Medical Education: Twelve Reasons to Move from Theory to Practice. Medical Teacher, 29, 591-599. https://doi.org/10.1080/01421590701606799

27. Van de Ridder, J. M., et al. (2008). What Is Feedback in Clinical Education? Medical Education, 42, 189-197. https://doi.org/10.1111/j.1365-2923.2007.02973.x

28. Yeh, D. D., et al. (2015). Peer-to-Peer Physician Feedback Improves Adherence to Blood Transfusion Guidelines in the Surgical Intensive Care Unit. Journal of Trauma and Acute Care Surgery, 79, 65-70. https://doi.org/10.1097/TA.0000000000000683