Korean J Med Educ > Volume 37(1); 2025 > Article
Rhee and Park: Standardized learner simulation for debriefer training through video conference

Abstract

Purpose

Debriefing after simulation-based healthcare education (SBHE) is challenging, and educators' debriefing skills are essential to the success of learning. For debriefing skill training, we designed a simulation course with a standardized learner (SL) conducted through video conference. The purpose of this study was to describe the implementation process of the SL simulation course and to evaluate its effect on educators' self-confidence and debriefing skills.

Methods

This simulation course involved six trainees and two trainers. After watching a 5-minute sample video of SBHE, each trainee acted as the debriefer for the video while the trainer acted as the learner (SL). Following each simulation, the trainer immediately provided individual feedback. To evaluate the course's effectiveness, trainees completed a self-confidence questionnaire, and objective structured assessment of debriefing (OSAD) scores were evaluated.

Results

After five SL simulation sessions over 2 weeks, the trainees' self-confidence (estimate=0.114, standard error=0.020, p<0.001) and OSAD scores (p=0.006) improved significantly.

Conclusion

This debriefer training course using SL simulation via video conference improved trainees' self-confidence and debriefing skills. SL simulation can serve as a new and flexible method for training debriefers.

Introduction

Simulation is a well-known, effective method for training healthcare providers [1], and debriefing techniques and skills are essential for simulation-based healthcare education (SBHE) [2]. Although many studies have examined how to debrief properly, debriefing remains a burden for educators who are new to actual simulation-based education.
Although there have been concerns about the lack of training methods for these skills, a recently reported conceptual framework for developing debriefing skills has proved helpful. Several faculty development strategies have been proposed, such as foundational training, scripts and tools, expert mentorship, peer feedback, self-reflection, providing feedback, additional training, and varying context [3]. Many educators have also learned the relevant knowledge and received feedback from senior colleagues. Nevertheless, many educators still have difficulty conducting simulation-based education.
In this study, we applied simulation techniques to debriefing skill training to provide a safer debriefing experience, just as simulation-based education was introduced to make clinical education safer. Evidence already exists for the effectiveness of standardized patients in simulation for medical education [4]. In line with these strategies, we implemented a debriefer training course for novice trainees using standardized learner (SL) simulation through video conference.
The purpose of this study was to describe the implementation process of SL simulation aimed at improving the self-confidence and debriefing skills of novice debriefers, and to determine its effectiveness.

Methods

This study used a one-group pretest-posttest design to compare trainees' subjective confidence and objective competence before and after training, corresponding to level 2 of the Kirkpatrick model [5].
First, we prepared a 5-minute sample SBHE video as the material for this simulation. After watching the video together, the trainer acted as the learner in the video and each trainee conducted a debriefing as the debriefer. Following each simulation, the trainer gave feedback, as a standardized patient does in SBHE [6,7]. The learner role in this simulated debriefing was therefore called the SL. The SL had to prepare focused feedback according to the promoting excellence and reflective learning in simulation (PEARLS) framework and this course's structure [8]. At the same time, each trainee pre-checked the same sample video and prepared a debriefing according to the objective of the sample SBHE video (as a debriefer) and the focus of the SL simulation (as a trainee). The details of the course are as follows.

1. Beta test

Prior to this training, four beta tests were conducted to identify necessary improvements. Initially, we included a pre-briefing and reviewed all four elements (reaction, description, analysis, summary) of the PEARLS framework in each training session [8]. However, this took too long (over 2 hours) for participants to stay focused. To shorten the training, we decided to skip the pre-briefing and focus each session on a single element of the PEARLS framework (two sessions were assigned to the 'analysis' element, given its diverse characteristics). Second, in response to feedback that individual specific feedback was lacking, we planned for each trainee to conduct the SL simulation and receive individual feedback in a separate Zoom breakout room in turn.

2. Participants

There were two groups of three debriefing trainees each; all six were physicians who were novices at debriefing but were educators responsible for teaching medical students or residents. Prior to the course, didactic lectures, literature reviews, observation of simulation activities, and group discussions on various SBHE topics were conducted.
In this course, two educators who held Certified Healthcare Simulation Educator (CHSE) certification participated as trainers. Their cumulative experience in simulation education was 7 and 15 years, respectively.
The study protocol was approved by the University of Hawaii Human Studies Program as exempt (protocol no., 2020-00414). Informed consent was obtained from all participants before they participated in the study.

3. Preparation

1) Trainee orientation

Prior to the SL simulation course, trainees were provided with a PowerPoint file (Microsoft Corp., Redmond, USA) containing 72 slides of self-learning material for SBHE. The contents were based on the CHSE blueprint and SBHE-related articles. They included educational theories (i.e., adult learning, deliberate practice, experiential learning, mastery learning), definitions and concepts related to simulation (i.e., types, structure, standardization, fidelity, feedback, facilitation, debriefing), and assessment tools for debriefing such as the objective structured assessment of debriefing (OSAD) and the debriefing assessment for simulation in healthcare [8-18]. Trainees were also oriented to the format and focuses of the SL simulation and had an opportunity to prepare a debriefing plan based on the SBHE learning materials.

2) Training course preparation

A 5-minute sample video of SBHE was created by recording a scenario developed by a colleague (Fig. 1). In the video, one learner acting as an intensive care unit (ICU) nurse had to practice approaching a deteriorating patient with a chest tube on a ventilator. Prior to the SL simulation course, trainees watched this sample video and were informed of the objectives of the SBHE scenario in detail.
For the video conference, we chose Zoom (Zoom Video Communications, San Jose, USA), which was already supported by the institution and familiar to all participants. The video conference schedule was arranged and shared with all trainers and trainees.

3) Preparation for evaluation

To evaluate the effectiveness of SL simulation course, trainees’ self-confidence was surveyed, and their debriefing skills were rated with the OSAD (Fig. 2).
Confidence was defined as a belief in one's own ability to do things and be successful [19]. A total of seven surveys were conducted: before and after e-learning, and after each session. The survey consisted of a single item answered on a 5-point scale, phrased as follows: "I am more confident of my ability to manage simulation-based medical education as a result of this session," with response options ranging from 1 (none, scared to death) to 5 (confident, excited). This assessed self-confidence in operating a simulation class [20-22].
The OSAD was developed to provide a best-practice, evidence-based guideline for conducting debriefings in simulation. It demonstrated very good content validity (global OSAD content validity index: 0.94), excellent inter-rater reliability (intraclass correlation coefficient: 0.881), and excellent test-retest reliability (intraclass correlation coefficient: 0.898) [23,24]. The OSAD comprises eight categories, each scored on a scale of 1 (done very poorly) to 5 (done very well); to aid raters, descriptions of observable behaviors are provided for scores 1, 3, and 5. One month after the course was completed, all SL simulations were rated independently by the two trainers in randomized order. Before rating, the trainers reviewed the OSAD handbook. Inter-rater reliability was calculated from the two trainers' ratings of the trainees.
Trainees’ self-confidence questionnaire and OSAD rating results were collected with online-survey software, Google Forms (Google LLC, Mountain View, USA).

4. Debriefer training course design

Over 2 weeks, there were five training sessions of 100 minutes each. Each session's main objective focused on a different element of the PEARLS framework (session 1: reactions; session 2: descriptions; sessions 3–4: analysis; session 5: summary) (Fig. 2).
At the beginning of each session, after all participants joined Zoom at the set time, the sample video was briefly reviewed and the objective of the session was identified.
During SL simulation, the trainer played the role of the SL (ICU nurse), and the trainee played the role of the debriefer in the sample video of SBHE. The trainer responded to the trainee’s debriefing behaviors according to scripted responses or frames of SL as the ICU nurse such as feeling, reasons of behaviors, and non-verbal attitudes. Each SL simulation was video recorded for OSAD rating.
Each trainee conducted about 10 minutes of SL simulation debriefing, immediately followed by about 20 minutes of feedback from the trainer focusing on the session's objective and general skills, as a meta-debriefing (e.g., plus-delta, advocacy and inquiry, useful expressions in specific situations, open questions, intentional silence, honest speaking, making notes of learners' responses). While one trainee was performing the SL simulation, the others waited in another Zoom room, where they could prepare and reflect on their own simulations and share ideas about the challenges. After all individual SL simulations, the training closed with a group debriefing about common shortcomings (e.g., excessive support, perfunctory reaction). Participants could watch all the videos after completing the course.

5. Statistical analysis

Descriptive analysis using graphs was performed for the self-confidence survey results and OSAD scores. To evaluate changes in self-confidence across multiple training sessions, several statistical approaches were considered. The Friedman and Quade tests were initially explored because of the repeated-measures nature of the data, but were deemed inappropriate because the dataset contained missing values, violating the assumption of a complete block design. Ordinal mixed-effects models were also evaluated as an alternative, given the ordinal nature of the self-confidence scores; however, the model residuals deviated from normality, suggesting that this approach might not be suitable. Given these limitations, we employed a generalized estimating equations (GEE) model. GEE is a method for analyzing repeated-measures data, particularly when the data do not meet normality assumptions or contain missing values. The GEE model was constructed with self-confidence as the dependent variable, the number of training sessions as the independent variable, and trainee as the clustering variable. The working correlation structure was set to "independence," which makes no explicit assumption about the correlation between repeated measures within individuals. All analyses were conducted using R ver. 4.0.2 (The R Foundation for Statistical Computing, Vienna, Austria) with the "geepack" package.
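As a rough illustration (not the authors' geepack code): with an independence working correlation and an identity link, the GEE point estimate for the session effect coincides with the ordinary least-squares slope pooled over all trainees; clustering by trainee only changes the robust standard errors. The toy long-format data below are hypothetical.

```python
def pooled_slope(sessions, scores):
    """Least-squares slope of score on session number, pooling all trainees.

    With an independence working correlation and identity link, this equals
    the GEE point estimate for the session coefficient.
    """
    n = len(sessions)
    mx = sum(sessions) / n
    my = sum(scores) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(sessions, scores))
    sxx = sum((x - mx) ** 2 for x in sessions)
    return sxy / sxx

# Hypothetical long-format rows (trainee, session, confidence), mirroring
# the study's structure of repeated measures clustered within trainees.
rows = [
    ("A", 1, 2), ("A", 2, 2), ("A", 3, 3), ("A", 4, 3), ("A", 5, 4),
    ("B", 1, 1), ("B", 2, 2), ("B", 3, 2), ("B", 4, 3), ("B", 5, 3),
]
slope = pooled_slope([r[1] for r in rows], [r[2] for r in rows])
print(slope)  # → 0.5, i.e., a half-point confidence gain per session
```

In geepack this corresponds to `geeglm(confidence ~ session, id = trainee, corstr = "independence")`; the sandwich standard errors reported there are what the toy calculation above does not reproduce.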
To examine changes in OSAD scores across sessions, a linear mixed model was employed. The model included session as a fixed effect, with random effects for trainees and raters to account for repeated measures and inter-rater variability. A two-tailed p-value <0.05 indicated statistical significance, and results are expressed with 95% confidence intervals (CIs).
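Under the stated structure (the paper does not print the model formula, so this is a reconstruction), the mixed model can be written as:

```latex
\mathrm{OSAD}_{ijk} = \beta_0 + \beta_1\,\mathrm{session}_{k}
  + u_i + v_j + \varepsilon_{ijk},
\qquad
u_i \sim N(0,\sigma_u^2),\quad
v_j \sim N(0,\sigma_v^2),\quad
\varepsilon_{ijk} \sim N(0,\sigma^2),
```

where $i$ indexes trainees, $j$ the two raters, and $k$ the session; $u_i$ and $v_j$ are the random intercepts for trainee and rater, and $\beta_1$ is the fixed session effect tested at p=0.006.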

Results

The study plan began in January 2020; after applying for an institutional review board deliberation exemption in May 2020, preliminary preparations, including e-learning materials, the SBHE video, questionnaire forms, and participant orientation, took approximately 2 months. Debriefing training for the first and second groups of trainees was conducted in July 2020 and January 2021, respectively, and OSAD evaluations were conducted in August 2020 and February 2021, respectively. Four of the six trainees were women; the average age of the four whose ages could be verified was 44 years.

1. Trainees’ self-confidence questionnaire results

Trainees' self-confidence questionnaire results in SBHE are shown in Fig. 3. The Shapiro-Wilk test showed that the questionnaire results were not normally distributed. Self-confidence increased significantly with each additional session (estimate=0.114, standard error=0.020, p<0.001), indicating a steady positive trend over time. The detailed GEE results are presented in Table 1.

2. OSAD rating results of trainees’ debriefing

The OSAD scores were normally distributed according to the Shapiro-Wilk test (p=0.429). Inter-rater reliability, assessed using the intraclass correlation coefficient (ICC), was 0.70 (95% CI, 0.45–0.85), indicating moderate to good reliability between the two raters [25]. The ICC was calculated using a two-way random-effects model assessing absolute agreement. Descriptive data and statistical analysis for the OSAD scores are shown in Fig. 4 and Table 2. The linear mixed model revealed a statistically significant effect of the SL simulation course on OSAD scores (p=0.006).
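For illustration, the two-way random-effects, absolute-agreement, single-rater ICC (ICC(2,1) in the Shrout-Fleiss convention) can be computed from the two-way ANOVA mean squares. This is a minimal sketch with hypothetical ratings, not the authors' code or data.

```python
def icc2_1(data):
    """ICC(2,1), absolute agreement: rows = rated debriefings, columns = raters."""
    n = len(data)        # subjects (debriefings)
    k = len(data[0])     # raters
    grand = sum(sum(row) for row in data) / (n * k)
    row_means = [sum(row) / k for row in data]
    col_means = [sum(data[i][j] for i in range(n)) / n for j in range(k)]
    # Two-way ANOVA mean squares: subjects (rows), raters (columns), residual.
    msr = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)
    msc = n * sum((m - grand) ** 2 for m in col_means) / (k - 1)
    sse = sum((data[i][j] - row_means[i] - col_means[j] + grand) ** 2
              for i in range(n) for j in range(k))
    mse = sse / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Two raters in perfect agreement yield an ICC of 1; a constant rater
# offset (a systematic severity difference) pulls absolute agreement down.
print(icc2_1([[1, 1], [2, 2], [3, 3]]))  # → 1.0
```

Because ICC(2,1) measures absolute agreement, a rater who is consistently one point harsher lowers the coefficient even when the rank ordering of debriefings is identical.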

Discussion

To date, debriefer training for simulation-based healthcare educators has been available through workshops, courses, fellowship programs, and peer coaching [26,27]. While these methods provide important didactics and valuable experience, one-time professional development models and opportunistic experiences are not known to significantly improve professional performance, which depends more closely on hands-on engagement with feedback during training [28,29]. Discussion-based debriefing curricula using participant videos are also known to be effective, but even these have the limitation that they cannot be designed around systematically structured learning goals [30].
This study shows that SL simulation for debriefer training improved the debriefers' self-confidence and OSAD scores (Figs. 3, 4). Previous studies mainly assessed self-confidence or OSAD before and after a debriefer training intervention and reported score improvements [27,31,32]. In this study, however, OSAD and self-confidence scores were assessed multiple times during the course, so it is difficult to directly compare our results with those of previous studies. In addition, this study found that repeated sessions led to continuous improvement in OSAD and self-confidence scores.
This SL simulation, containing various strategies, could provide opportunities intentionally designed for experiential learning. Through this course, trainees received a recording of each SL simulation as well as the opportunity to debrief on it. Because this debriefer training method provides these opportunities and materials through video conferencing, learners can have a systematic and deliberate hands-on experience regardless of distance or time.
Furthermore, SL simulation is a flexible training model that can be adapted for trainees at different levels of debriefer expertise (novice, advanced beginner) and for other professions (nursing, pharmacy, medicine, and so forth). During SL simulation, the trainer could trigger trainees' opportunistic debriefing behaviors as the learner and provide structured feedback as a trainer. Through this feedback, trainees could also observe the debriefing skills of experienced trainers as a meta-debriefing. Most valuable of all, they had five opportunities to practice their prepared debriefing skills in a safe and familiar environment and to reflect on their behavior based on the feedback provided by the trainer [3].
Because the quality of debriefing, and ultimately its impact on learning outcomes, depends heavily on the performance of the trainer who facilitates the training, trainees will improve more when trainers understand SL simulation better [26]. However, it is not easy to prepare for SL simulations as a trainer. New trainers in the SL role should have the opportunity to practice step-by-step scripts based on trainees' expected response frames prior to the SL simulation. This experience can become a resource for developing skills in training debriefers through SL simulation. Trainers should also encourage trainees to prepare for SL simulations by sharing learning materials such as the PEARLS framework.
This study has several limitations. First, because this SL simulation was structurally designed only for debriefing after simulation, it could not provide trainees with training on pre-briefing or facilitating. Second, this SL simulation was conducted with one SL using a sample SBHE video featuring one learner, which might not be enough for trainees to prepare for the variety of SBHE involving multiple learners. With additional programs involving multiple learners, including pre-briefing and facilitating, this SL simulation can be developed into a more complete training method for debriefers. Third, this study included a small number of participants. Both self-confidence and OSAD scores improved through this process, but the degree differed among trainees (Figs. 3, 4). Fourth, we did not collect data on the long-term effects of this course. These limitations are expected to be gradually overcome through further research. Finally, as an improvement for further research, providing participants with a clear learning objective checklist for the sample SBHE, which was not provided in this course, would increase the fidelity of the SL simulation.
This study shows that SL simulation via video conference is applicable in the real world and is a useful and effective method for debriefer training in SBHE.

Notes

Acknowledgements
We would like to thank Eri Sato, Ju Ok Park, Sang Hoon Oh, Yuka Eto, Jannet J. Lee-Jayaram, and Benjamin W. Berg for participating in the debriefer training course.
Funding
No funding was obtained for this study.
Conflicts of interest
No potential conflict of interest relevant to this article was reported.
Author contributions
HSP: design of the work, data collection and analysis, and drafting the article. JYR: data interpretation, critical revision of the article, and final approval of the version to be published.

Fig. 1.

An Image of the Simulation-Based Healthcare Education Sample Video for Debriefer Training Course

ICU: Intensive care unit, SBHE: Simulation-based healthcare education.
Fig. 2.

Schedule of the Debriefer Training Course Using Standardized Learner Simulation
Fig. 3.

Trend (A) and Boxplot (B) of Trainees' Self-confidence Survey Results
Fig. 4.

Trend (A) and Boxplot (B) of Objective Structured Assessment of Debriefing (OSAD) Scores through the Standardized Learner Simulation Debriefer Training Course
Table 1.
Results of the Generalized Estimating Equations Analysis of Self-confidence over the Course

Variable    Estimate±SE      z-value    p-value
Intercept   0.6480±0.0200
Session     0.1140±0.0200    5.7300     <0.001

SE: Standard error.

Table 2.
Objective Structured Assessment of Debriefing Mean and Median Scores through the Standardized Learner Simulation Debriefer Training Course

Session   No. of observations   OSAD score,   OSAD score,      p-value for trend
                                mean±SE       median (range)
1         12                    28.2±3.61     29 (26–30)       0.006
2         10                    30.9±6.33     34 (26–36)
3         12                    31.9±2.35     32 (31–34)
4         10                    30.1±3.07     31 (28–32)
5         12                    33.9±3.15     35 (33–36)

OSAD: Objective structured assessment of debriefing, SE: Standard error.

References

1. Cook DA, Hatala R, Brydges R, et al. Technology-enhanced simulation for health professions education: a systematic review and meta-analysis. JAMA. 2011;306(9):978-988.
2. Fanning RM, Gaba DM. The role of debriefing in simulation-based learning. Simul Healthc. 2007;2(2):115-125.
3. Cheng A, Eppich W, Kolbe M, Meguerdichian M, Bajaj K, Grant V. A conceptual framework for the development of debriefing skills: a journey of discovery, growth, and maturity. Simul Healthc. 2020;15(1):55-60.
4. Flanagan OL, Cummings KM. Standardized patients in medical education: a review of the literature. Cureus. 2023;15(7):e42027.
5. Kirkpatrick JD, Kirkpatrick WK. Kirkpatrick's four levels of training evaluation. Alexandria, USA: ATD Press; 2016.
6. Becker KL, Rose LE, Berg JB, Park H, Shatzer JH. The teaching effectiveness of standardized patients. J Nurs Educ. 2006;45(4):103-111.
7. Ainsworth MA, Rogers LP, Markus JF, Dorsey NK, Blackwell TA, Petrusa ER. Standardized patient encounters: a method for teaching and evaluation. JAMA. 1991;266(10):1390-1396.
8. Eppich W, Cheng A. Promoting Excellence and Reflective Learning in Simulation (PEARLS): development and rationale for a blended approach to health care simulation debriefing. Simul Healthc. 2015;10(2):106-115.
9. Yardley S, Teunissen PW, Dornan T. Experiential learning: AMEE guide no. 63. Med Teach. 2012;34(2):e102-115.
10. Wilson LO. Anderson and Krathwohl Bloom's taxonomy revised: understanding the new version of Bloom's taxonomy. https://quincycollege.edu/wp-content/uploads/Anderson-and-Krathwohl_Revised-Blooms-Taxonomy.pdf. Published 2016. Accessed June 5, 2024.
11. Moon JA. A handbook of reflective and experiential learning: theory and practice. Oxfordshire, UK: Routledge; 2013.
12. Wilson L, Wittmann-Price RA. Review manual for the certified healthcare simulation educator exam. 2nd ed. New York, USA: Springer Publishing Company; 2018.
13. Kurtz S, Draper J, Silverman J. Teaching and learning communication skills in medicine. 2nd ed. Boca Raton, USA: CRC Press; 2017.
14. Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach. 2005;27(1):10-28.
15. Steinemann S, Berg B, DiTullio A, et al. Assessing teamwork in the trauma bay: introduction of a modified "NOTECHS" scale for trauma. Am J Surg. 2012;203(1):69-75.
16. King HB, Battles J, Baker DP, et al. TeamSTEPPS: Team Strategies and Tools to Enhance Performance and Patient Safety. In: Henriksen K, Battles JB, Keyes MA, Grady ML, et al., eds. Advances in Patient Safety: New Directions and Alternative Approaches. Rockville, USA: Agency for Healthcare Research and Quality (US); 2008:5-20.
17. Brett-Fleegler M, Rudolph J, Eppich W, et al. Debriefing assessment for simulation in healthcare: development and psychometric properties. Simul Healthc. 2012;7(5):288-294.
18. Rudolph JW, Simon R, Raemer DB, Eppich WJ. Debriefing as formative assessment: closing performance gaps in medical education. Acad Emerg Med. 2008;15(11):1010-1016.
19. Oxford Learner's Dictionaries. Confidence. https://www.oxfordlearnersdictionaries.com/definition/english/confidence?q=Confidence. Accessed November 27, 2024.
20. Sari H, Ekici S, Soyer F, Eskiler E. Does self-confidence link to motivation?: a study in field hockey athletes. J Hum Sport Exerc. 2015;10(1):24-35.
21. Hicks FD, Coke L, Li S. Report of findings from the effect of high-fidelity simulation on nursing students' knowledge and performance: a pilot study. Chicago, USA: National Council of State Boards of Nursing; 2009.
22. Nemoto T, Beglar D. Developing Likert-scale questionnaires. In: Sonda N, Krause A, eds. JALT 2013 Conference Proceedings. https://jalt-publications.org/files/pdf-article/jalt2013_001.pdf. Published 2014. Accessed June 5, 2024.
23. Arora S, Ahmed M, Paige J, et al. Objective structured assessment of debriefing: bringing science to the art of debriefing in surgery. Ann Surg. 2012;256(6):982-988.
24. Arora S, Runnacles J, Ahmed M, et al. The London handbook for debriefing: enhancing performance debriefing in clinical and simulated settings. London, UK: London Deanery; 2012.
25. Koo TK, Li MY. A guideline of selecting and reporting intraclass correlation coefficients for reliability research. J Chiropr Med. 2016;15(2):155-163.
26. Cheng A, Grant V, Huffman J, et al. Coaching the debriefer: peer coaching to improve debriefing quality in simulation programs. Simul Healthc. 2017;12(5):319-325.
27. Abulebda K, Srinivasan S, Maa T, Stormorken A, Chumpitazi CE. Development, implementation, and evaluation of a faculty development workshop to enhance debriefing skills among novice facilitators. Cureus. 2020;12(2):e6942.
28. Lyon AR. Implementation science and practice in the education sector. https://education.uw.edu/sites/default/files/Implementation%20Science%20Issue%20Brief%20072617.pdf. Published 2016. Accessed June 5, 2024.
29. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(10 Suppl):S70-S81.
30. O'Shea CI, Schnieke-Kind C, Pugh D, Picton E. The Meta-Debrief Club: an effective method for debriefing your debrief. BMJ Simul Technol Enhanc Learn. 2020;6(2):118-120.
31. Snelling PJ, Dodson L, Monteagle E, et al. PRE-scripted debriefing for Paediatric Simulation Associated with Resuscitation Education (PREPARED): a multicentre, cluster randomised controlled trial. Resusc Plus. 2022;11:100291.
32. Johnson J, Pointon L, Keyworth C, et al. Evaluation of a training programme for critical incident debrief facilitators. Occup Med (Lond). 2023;73(2):103-108.