Korean Journal of Medical Education 2009;21(3): 287-297. doi: https://doi.org/10.3946/kjme.2009.21.3.287
Faculty Observer and Standardized Patient Accuracy in Recording Examinees' Behaviors Using Checklists in the Clinical Performance Examination
Jaehyun Park, Jinkyung Ko, Sunmi Kim, Hyobin Yoo
Department of Medical Education, Kyung Hee University School of Medicine, Seoul, Korea.
Corresponding Author: Jinkyung Ko, Tel: 02-961-9102, Fax: 02-969-0792, Email: michkay@khu.ac.kr
Received: April 13, 2009;  Accepted: June 25, 2009.
ABSTRACT
PURPOSE: The purpose of this study was to examine the recording accuracy of faculty observers and standardized patients (SPs) on a clinical performance examination (CPX). METHODS: This was a cross-sectional study of a fourth-year medical students' CPX held at a medical school in Seoul, Korea. The CPX consisted of 4 cases and was administered to 118 examinees, with the participation of 52 SPs and 45 faculty observers. For the study we chose 15 examinees per case and analyzed 60 student-SP encounters in total. To determine the level of recording accuracy, 2 SP trainers developed an answer key for each encounter. First, we computed agreement rates (P) and kappa coefficient (K) values between the answer key and the SPs and between the answer key and the faculty observers. Second, we performed a repeated-measures analysis of variance (ANOVA) to determine whether the mean percentage of correct checklist scores differed as a function of the rater, the case, or the interaction between the two factors. RESULTS: Mean P rates ranged from 0.72 to 0.86, while mean K values varied from 0.39 to 0.59. At the item level, SP checklist accuracy was higher than that of the faculty observers. The ANOVA showed no significant difference among the percentages of correct scores by the answer key, the faculty observers, and the SPs, and no significant interaction between the rater and case factors. CONCLUSION: Acceptable levels of recording accuracy were obtained in both rater groups. With thorough preparation, SP raters can replace faculty raters in a large-scale CPX.
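The two accuracy statistics reported in the abstract, the agreement rate (P) and Cohen's kappa coefficient (K), can be sketched as follows for binary (done / not done) checklist items. This is an illustrative sketch only; the rating vectors below are invented and do not come from the study's data.

```python
# Sketch of the two inter-rater agreement statistics named in the abstract.
# Checklist items are coded 1 (behavior recorded as performed) or 0 (not).

def percent_agreement(key, rater):
    """Agreement rate P: fraction of checklist items on which the
    rater's record matches the answer key."""
    matches = sum(k == r for k, r in zip(key, rater))
    return matches / len(key)

def cohens_kappa(key, rater):
    """Cohen's kappa K: agreement corrected for chance.
    K = (P_o - P_e) / (1 - P_e), where P_e is the chance agreement
    expected from each rater's marginal frequencies."""
    n = len(key)
    p_o = percent_agreement(key, rater)
    p_key_1 = sum(key) / n        # proportion of items the key marks 1
    p_rater_1 = sum(rater) / n    # proportion of items the rater marks 1
    p_e = p_key_1 * p_rater_1 + (1 - p_key_1) * (1 - p_rater_1)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical 10-item checklist: answer key vs. one SP rater
answer_key = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
sp_rating  = [1, 1, 0, 1, 1, 1, 0, 0, 1, 1]

print(round(percent_agreement(answer_key, sp_rating), 2))  # 0.8
print(round(cohens_kappa(answer_key, sp_rating), 2))       # 0.52
```

Note how kappa (0.52) is markedly lower than raw agreement (0.80): when most items are marked "performed," two raters agree often by chance alone, which is why the abstract's mean K values (0.39-0.59) sit well below its mean P rates (0.72-0.86).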
Keywords: Clinical competence; Undergraduate medical education; Observer variation; Educational measurement