Standardized learner simulation for debriefer training through video conference
Abstract
Purpose
Debriefing after simulation-based healthcare education (SBHE) is challenging, and educators’ debriefing skills are essential to the success of learning. For debriefing skill training, we designed a simulation course with a standardized learner (SL) conducted through video conference. The purpose of this study was to describe the implementation process of the SL simulation course and to evaluate its effect on educators’ self-confidence and debriefing skills.
Methods
This simulation course involved six trainees and two trainers. After watching a 5-minute sample video of SBHE, each trainee conducted a debriefing as the debriefer of the video while a trainer acted as the learner in the video (the SL). Following each simulation, the trainer immediately provided individual feedback. To evaluate the course’s effectiveness, a trainee self-confidence questionnaire was collected, and objective structured assessment of debriefing (OSAD) scores were rated.
Results
After completing five SL simulation sessions over 2 weeks, the trainees’ self-confidence level and OSAD scores improved significantly (self-confidence: estimate=0.114, standard error=0.020, p<0.001; OSAD: p=0.006).
Conclusion
This debriefer training course using SL simulation via video conference improved trainees’ self-confidence and debriefing skills. SL simulation can be used as a new and flexible method for training debriefers.
Introduction
Simulation is a well-known, effective method for training healthcare providers [1], and debriefing techniques and skills are essential for simulation-based healthcare education (SBHE) [2]. Although many studies have examined how to debrief properly, debriefing remains a burden for educators who are new to actual simulation-based education.
Although there have been concerns about the lack of methods for training debriefing skills, a recently reported conceptual framework for developing these skills has been of great help. Several faculty development strategies have been proposed, such as foundational training, scripts and tools, expert mentorship, peer feedback, self-reflection, providing feedback, additional training, and varying context [3]. In addition, many educators have acquired the relevant knowledge and received feedback from senior colleagues. Nevertheless, many still have difficulty conducting simulation-based education.
In this study, we applied simulation techniques to debriefing skill training in order to provide a safer debriefing experience, just as simulation-based education was introduced as a way to make clinical education safer. Evidence already exists for the effectiveness of using standardized patients in simulation for medical education [4]. In line with these strategies, we implemented a debriefer training course for novice trainees using standardized learner (SL) simulation through video conference.
The purpose of this study was to describe the implementation process of SL simulation aimed at improving the self-confidence and debriefing skills of novice debriefers, and to determine its effectiveness.
Methods
This study used a one-group pretest-posttest design to compare the subjective confidence and objective competence of trainees before and after education, corresponding to level 2 of the Kirkpatrick model [5].
First, we prepared a 5-minute sample video of SBHE as the material for this simulation. After watching the video together, the trainer acted as the learner in the video and each trainee in turn conducted a debriefing as the debriefer. Following each simulation, the trainer gave feedback, as a standardized patient does in SBHE [6,7]; hence, the learner role in this simulated debriefing was called the SL. The SL had to prepare focused feedback according to the Promoting Excellence And Reflective Learning in Simulation (PEARLS) framework and the course structure [8]. At the same time, each trainee reviewed the same sample video in advance and prepared a debriefing, guided by the objective of the sample SBHE video as a debriefer and by the focus of the SL simulation as a trainee. The details of the course are as follows.
1. Beta test
Prior to this training, four beta tests were conducted to reveal necessary improvements. Initially, we included a pre-briefing and reviewed all four elements (reaction, description, analysis, summary) of the PEARLS framework in each training session [8]. However, each session ran too long (over 2 hours) for participants to stay focused. To shorten the training time, we decided to skip the pre-briefing and to focus on only one specific element of the PEARLS framework per session (two sessions were assigned to the ‘analysis’ element because of its diverse characteristics). Second, in response to the opinion that individual specific feedback was lacking, we planned for each trainee to conduct the SL simulation and receive individual feedback in a separate Zoom breakout room in turn.
2. Participants
There were two groups of three debriefing trainees each. All six of these physicians were novices at debriefing, but they were educators who had to teach medical students or residents. Prior to the course, didactic lectures, literature reviews, observation of simulation activities, and group discussions on various SBHE topics were conducted.
In this course, two educators who held the Certified Healthcare Simulation Educator (CHSE) certification participated as trainers. Their experience in simulation education was 7 years and 15 years, respectively.
The study protocol was approved by the University of Hawaii Human Studies Program as exempt (protocol no., 2020-00414). Informed consent was obtained from all participants before they participated in the study.
3. Preparation
1) Trainee orientation
Prior to the SL simulation course, trainees were provided with a PowerPoint file (Microsoft Corp., Redmond, USA) containing 72 slides of self-learning materials for SBHE. The contents were based on the CHSE blueprint and SBHE-related articles. They included educational theories (i.e., adult learning, deliberate practice, experiential learning, mastery learning); definitions and concepts related to simulation (i.e., types, structure, standardization, fidelity, feedback, facilitation, debriefing); and debriefing assessment tools such as the objective structured assessment of debriefing (OSAD) and the debriefing assessment for simulation in healthcare [8-18]. Trainees were also oriented to the format and focuses of the SL simulation, and they had an opportunity to prepare a debriefing plan based on the SBHE learning materials.
2) Training course preparation
A 5-minute sample video of SBHE was created by recording a scenario developed by a colleague (Fig. 1). In the video, one learner acted as an intensive care unit (ICU) nurse who had to practice how to approach a deteriorating patient with a chest tube on a ventilator. Prior to the SL simulation course, trainees watched this sample video and were informed of the objectives of the SBHE scenario in detail.

Fig. 1. An Image of the Simulation-Based Healthcare Education Sample Video for Debriefer Training Course
ICU: Intensive care unit, SBHE: Simulation-based healthcare education.
For the video conference, we chose Zoom (Zoom Video Communications, San Jose, USA), which was already supported by the institution and familiar to all participants. The video conference schedule was arranged and shared with all trainers and trainees.
3) Preparation for evaluation
To evaluate the effectiveness of SL simulation course, trainees’ self-confidence was surveyed, and their debriefing skills were rated with the OSAD (Fig. 2).
Confidence was defined as a belief in one’s own ability to do things and be successful [19]. A total of seven surveys were conducted: before and after e-learning and after each of the five sessions. The survey used a single-item questionnaire answered on a 5-point scale. The item was phrased as follows: “I am more confident of my ability to manage simulation based medical education as a result of this session,” with response options ranging from 1 (none, scared to death) to 5 (confident, excited). This assessed self-confidence in conducting a simulation class [20-22].
The OSAD was developed to provide a best-practice, evidence-based guideline for conducting debriefings in simulation. It has demonstrated very good content validity (global OSAD content validity index: 0.94), excellent inter-rater reliability (intraclass correlation coefficient: 0.881), and excellent test-retest reliability (intraclass correlation coefficient: 0.898) [23,24]. The OSAD comprises eight categories, on each of which the facilitator is scored on a scale of 1 (done very poorly) to 5 (done very well). To help the rater score, descriptions of observable behaviors for scores of 1, 3, and 5 are provided. One month after the course was completed, all SL simulations were rated independently by the two trainers in randomized order; before rating, the trainers reviewed the OSAD handbook. Inter-rater reliability was calculated from the two trainers’ ratings of the trainees’ debriefings.
Trainees’ self-confidence questionnaire and OSAD rating results were collected with online-survey software, Google Forms (Google LLC, Mountain View, USA).
4. Debriefer training course design
Over 2 weeks, five training sessions of 100 minutes each were held. Each session’s main objective focused on a different element of the PEARLS framework (session 1: reactions; session 2: descriptions; sessions 3–4: analysis; session 5: summary) (Fig. 2).
At the beginning of each session, after all participants had joined Zoom at the set time, the sample video was briefly reviewed and the objective of the session was identified.
During the SL simulation, the trainer played the role of the SL (the ICU nurse) and the trainee played the role of the debriefer for the sample video of SBHE. The trainer responded to the trainee’s debriefing behaviors according to scripted responses or response frames for the SL as the ICU nurse, such as feelings, reasons for behaviors, and non-verbal attitudes. Each SL simulation was video recorded for OSAD rating.
Each trainee conducted about 10 minutes of debriefing in the SL simulation, immediately followed by about 20 minutes of feedback from the trainer focusing on the session objective and on general skills, as a meta-debriefing (e.g., plus-delta, advocacy and inquiry, useful expressions in specific situations, open questions, intentional silence, honest speaking, making notes of learners’ responses). While one trainee was performing the SL simulation, the others waited in a separate Zoom room, where they were able to prepare for and reflect on their own simulation and share ideas about the challenges. After all individual SL simulations, the training closed with a group debriefing on common areas needing improvement (e.g., excessive support, perfunctory reactions). Participants were able to watch all the videos after completing the course.
5. Statistical analysis
Descriptive analysis using graphs was performed for the self-confidence survey results and OSAD scores. To evaluate the changes in self-confidence across multiple training sessions, different statistical approaches were considered. Initially, the Friedman test and Quade test were explored because of the repeated-measures nature of the data. However, these tests were deemed inappropriate because the dataset contained missing values, violating the assumption of a complete block design. Ordinal mixed-effects models were also evaluated as a potential alternative, considering the ordinal nature of the self-confidence scores; however, the model residuals deviated from normality, suggesting that this approach might not be suitable. Given these limitations, we employed a generalized estimating equations (GEE) model. GEE is a method for analyzing repeated-measures data, particularly when the data do not meet the assumption of normality or contain missing values. The GEE model was constructed with self-confidence as the dependent variable, the number of training sessions as the independent variable, and trainee as the clustering variable. The working correlation structure was set to “independence,” assuming no explicit correlation structure between repeated measures within individuals. All analyses were conducted using R ver. 4.0.2 (The R Foundation for Statistical Computing, Vienna, Austria) with the “geepack” package.
To examine changes in OSAD scores across sessions, a linear mixed model was employed. The model included session as a fixed effect, with random effects for trainees and raters to account for repeated measures and inter-rater variability. A two-tailed p-value <0.05 indicated statistical significance, and results are expressed with 95% confidence intervals (CIs). These analyses were likewise performed with R statistical software ver. 4.0.2 for Microsoft Windows (R Foundation for Statistical Computing, 2020).
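The OSAD model can be sketched similarly. This simplified Python/statsmodels version uses synthetic data and a random intercept for trainee only; the study’s model, fitted in R, additionally included a random effect for rater.

```python
# Simplified sketch of the linear mixed model for OSAD scores, on synthetic
# data. Only a random intercept for trainee is modeled here; the study's
# R model also included a random effect for rater.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
rater_effects = rng.normal(0, 0.5, 2)          # two raters, constant offsets
rows = []
for trainee in range(6):
    trainee_effect = rng.normal(0, 1.0)        # per-trainee random intercept
    for session in range(1, 6):                # five training sessions
        for rater in range(2):
            score = (20 + 1.5 * session + trainee_effect
                     + rater_effects[rater] + rng.normal(0, 1.0))
            rows.append((trainee, session, rater, score))
df = pd.DataFrame(rows, columns=["trainee", "session", "rater", "osad"])

# OSAD ~ session (fixed effect), trainee as the random grouping factor.
model = smf.mixedlm("osad ~ session", df, groups="trainee")
result = model.fit()
print(result.fe_params["session"])  # estimated change in OSAD per session
```

The fixed-effect coefficient on `session` is the quantity tested against zero; in the study, its p-value (0.006) indicated a significant improvement across sessions.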
Results
Study planning began in January 2020; after an institutional review board deliberation exemption was applied for in May 2020, preliminary preparations, including e-learning materials, the SBHE video, questionnaire forms, and orientation for participants, were conducted for approximately 2 months. Debriefing training for the first and second groups of trainees was then conducted in July 2020 and January 2021, respectively, and OSAD evaluations for the first and second groups in August 2020 and February 2021, respectively. Four of the six trainees were women. The average age of the four trainees whose ages could be verified was 44 years.
1. Trainees’ self-confidence questionnaire results
Trainees’ self-confidence questionnaire results for SBHE are shown in Fig. 3. The Shapiro-Wilk test showed that the questionnaire results were not normally distributed. Self-confidence increased significantly with each additional session (estimate=0.114, standard error=0.020, p<0.001), indicating a steady positive trend over time. The detailed GEE results are presented in Table 1.

Fig. 3. Trend and Boxplot of Trainees’ Self-confidence Survey Results
Trend (A) and boxplot (B) of trainees’ self-confidence survey results.
2. OSAD rating results of trainees’ debriefing
The OSAD scores were normally distributed according to the Shapiro-Wilk test (p=0.429). Inter-rater reliability, assessed using the intraclass correlation coefficient (ICC), was 0.70 (95% CI, 0.45–0.85), indicating moderate to good reliability between the two raters [25]. The ICC was calculated using a two-way random-effects model assessing absolute agreement. Descriptive data and statistical analysis for the OSAD scores are shown in Fig. 4 and Table 2. The linear mixed model revealed a statistically significant effect of the SL simulation course on OSAD scores (p=0.006).

Fig. 4. Trend and Boxplot of Objective Structured Assessment of Debriefing Scores through Standardized Learner Simulation Debriefer Training Course
Trend (A) and boxplot (B) of objective structured assessment of debriefing (OSAD) scores through standardized learner simulation debriefer training course.
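The two-way random-effects, absolute-agreement, single-rater ICC, i.e., ICC(2,1), follows from the classic Shrout-Fleiss mean-square decomposition. The sketch below uses hypothetical ratings for two raters, not the study’s OSAD data.

```python
# ICC(2,1): two-way random-effects, absolute agreement, single rater,
# computed from the Shrout-Fleiss ANOVA mean squares. The example ratings
# are hypothetical, not the study's OSAD data.
def icc_2_1(scores):
    """scores: one row per rated debriefing, one column per rater."""
    n, k = len(scores), len(scores[0])
    grand = sum(sum(row) for row in scores) / (n * k)
    row_means = [sum(row) / k for row in scores]
    col_means = [sum(scores[i][j] for i in range(n)) / n for j in range(k)]
    ss_total = sum((x - grand) ** 2 for row in scores for x in row)
    ss_rows = k * sum((m - grand) ** 2 for m in row_means)   # subjects
    ss_cols = n * sum((m - grand) ** 2 for m in col_means)   # raters
    ss_error = ss_total - ss_rows - ss_cols                  # residual
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_error / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Perfect agreement between raters gives ICC = 1; a constant offset between
# raters lowers the absolute-agreement ICC even though rankings agree.
print(icc_2_1([[1, 1], [2, 2], [3, 3]]))   # 1.0
print(icc_2_1([[1, 2], [2, 3], [3, 4]]))   # lower, because of the offset
```

Because absolute agreement is assessed, systematic differences between the two raters count against reliability, which is the appropriate choice when two trainers’ OSAD scores are meant to be interchangeable.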
Discussion
To date, debriefer training for simulation-based healthcare educators has been available through workshops, courses, fellowship programs, and peer coaching [26,27]. While these methods provide important didactics and valuable experience, one-time professional development models and opportunistic experiences are not known to significantly improve professional performance, which depends more closely on hands-on engagement with feedback during training [28,29]. A discussion-based debriefing curriculum using participant videos is also known to be effective, but even this approach has the limitation that systematically structured learning goals cannot be designed [30].
This study shows that SL simulation for debriefer training improved debriefers’ self-confidence and OSAD scores (Figs. 3, 4). Previous studies mainly assessed self-confidence or OSAD before and after a debriefer training intervention and reported score improvements [27,31,32]. In this study, however, OSAD and self-confidence scores were assessed multiple times during the course, which makes direct comparison with previous studies difficult. In addition, this study found that repeated sessions led to continuous improvement in OSAD and self-confidence scores.
This SL simulation, incorporating various strategies, could provide opportunities intentionally designed for experiential learning. Through the course, trainees were provided with recordings of the SL simulations as well as the opportunity to debrief on them. Because this debriefer training method delivers these opportunities and materials through video conferencing, learners can have a systematic and deliberate hands-on experience regardless of distance or time.
Furthermore, SL simulation is a flexible training model that can be adapted for trainees at different levels of debriefer expertise (novice, advanced beginner) or from other professions (nursing, pharmacy, medicine, and so forth). During the SL simulation, the trainer could trigger trainees’ opportunistic debriefing behaviors as a learner and provide structured feedback as a trainer. Through the feedback, trainees could also observe the debriefing skills of experienced trainers as a meta-debriefing. Most valuable of all, they had five opportunities to practice their prepared debriefing skills in a safe and familiar environment and to reflect on their behavior based on the feedback provided by the trainer [3].
Because the quality of debriefing, and ultimately its impact on learning outcomes, depends highly on the performance of the trainer who facilitates the training, trainees will improve more as trainers gain a deeper understanding of SL simulation [26]. However, preparing for SL simulations as a trainer is not easy. New trainers in the role of the SL should have the opportunity to practice step-by-step scripts based on trainees’ expected response frames prior to the SL simulation. This experience can become a resource for developing skills in training debriefers through SL simulation. Trainers should also encourage trainees to prepare for SL simulations by sharing learning materials such as the PEARLS framework.
This study has several limitations. First, because this SL simulation was structurally designed only for debriefing after simulation, it could not provide trainees with training on pre-briefing or facilitating. Second, the SL simulation was conducted with one SL using a sample SBHE video featuring one learner, which might not be enough for trainees to prepare for the variety of SBHE involving multiple learners. With additional programs involving multiple learners, including pre-briefing and facilitating, this SL simulation could become a more complete training method for debriefers. Third, this study included a small number of participants. Both self-confidence and OSAD scores improved through the course, but the degree differed among trainees (Figs. 3, 4). Fourth, we did not collect data on the long-term effect of the course. These limitations are expected to be gradually overcome through further research. Finally, as an improvement for further research, providing participants with a clear learning objective checklist for the sample SBHE, which was not provided in this course, would increase the fidelity of SL simulation.
This study shows that SL simulation via video conference is applicable in the real world and is one of the useful and effective methods for debriefer training in SBHE.
Notes
Acknowledgements
We would like to thank Eri Sato, Ju Ok Park, Sang Hoon Oh, Yuka Eto, Jannet J. Lee-Jayaram, and Benjamin W. Berg for participating in the debriefer training course.
Funding
No funding was obtained for this study.
Conflicts of interest
No potential conflict of interest relevant to this article was reported.
Author contributions
HSP: design of the work, data collection and analysis, and drafting the article. JYR: data interpretation, critical revision of the article, and final approval of the version to be published.