Influence of a faculty evaluation system on the educational performance of medical school faculty

Article information

Korean J Med Educ. 2016;28(3):289-294
Publication date (electronic): 2016 June 30
doi: https://doi.org/10.3946/kjme.2016.34
1Department of Internal Medicine, Seoul National University College of Medicine, Seoul, Korea
2Office of Medical Education, Seoul National University College of Medicine, Seoul, Korea
3Department of Ophthalmology, Seoul National University College of Medicine, Seoul, Korea
Corresponding Author: Sun Jung Myung (http://orcid.org/0000-0001-7332-0126) Office of Medical Education, Seoul National University College of Medicine, 103 Daehak-ro, Jongno-gu, Seoul 03080, Korea Tel: +82.2.740.8177 Fax: +82.2.741.1187 email: issac73@snu.ac.kr
Received 2016 March 18; Revised 2016 April 21; Accepted 2016 May 23.

Abstract

Purpose:

The promotion of educators is hampered by the lack of accepted standards for evaluating the quality and impact of educational activities; traditionally, promotion is tied to research productivity. This study developed an evaluation tool for the educational performance of medical school faculty using educator portfolios (EPs).

Methods:

Design principles and quantitative items for EPs were developed in a consensus workshop. These principles were tested in a simulation and revised based on feedback. Changes in total educational activity following the introduction of the system were then analyzed.

Results:

A total of 71% of faculty members participated in a simulation of the system, and their scores were widely distributed (mean±standard deviation, 65.43±68.64). The introduction of the new system significantly increased total educational activity, especially among assistant professors.

Conclusion:

The authors offer a comprehensive and practical tool for enhancing the educational participation of faculty members. Further research on the development of qualitative evaluation systems is needed.

Introduction

Traditionally, the role of medical school faculty was compared to a three-legged stool, with the legs representing education, research, and practice. The comparison implies that faculty members should balance the three parts of their duties. Today, a better comparison is a tricycle with an oversized front wheel: intense competition between the roles of clinical practice and research has put pressure on medical school faculty members. Affiliated hospitals seek profits, clinical professors focus on clinical practice, and universities press professors to produce research output such as papers in Science Citation Index (SCI) journals. This is reflected in faculty evaluation systems focused on research activities, which lead many professors to set a low priority on their role as educators [1,2,3]. The situation is no different in Korea [4,5].

A reliable system for evaluating faculty based on their educational activities is needed to improve the recruitment and retention of high-quality educators [1,6] and to achieve the unique mission of educating physicians [7,8]. While the evaluation of research activities has well-designed and widely accepted standards, the evaluation of educational performance remains unsatisfactory despite recent efforts to develop effective assessment tools [4,5,9,10,11,12]. To encourage medical school faculty to maintain a balance between research and educational activity, a reasonable evaluation system for educational performance is necessary.

In this study, we developed a new system for evaluating educational activity as part of the promotion and tenure process, with the goals of increasing participation in educational activities and raising individual faculty members' investment in their role as educators.

Subjects and methods

The new evaluation system was developed and applied to all professors at the Seoul National University College of Medicine (SNUCM).

1. Overview of the process

In 2011, an SNUCM task force on educator evaluation was convened to develop an objective process for evaluating educational performance. The task force comprised 15 faculty members from various fields, including medical education, internal medicine, surgery, pediatrics, psychiatry, neurology, otolaryngology, ophthalmology, dermatology, diagnostic radiology, neurosurgery, anesthesiology, anatomy, and forensic medicine. The task force agreed to use educator portfolios (EPs): a typical curriculum vitae (CV) reflects quantifiable data (numbers of papers and grants, grant dollars), whereas many important aspects of educational activity are not quantifiable and require alternative evaluation measures.

2. Principles for developing the new evaluation system

Through a consensus workshop, the task force worked to develop a reliable evaluation system with clear items and appropriately and fairly weighted values for each item. It focused on five guiding principles: (1) the new system should reinforce a professor's role as an educator; (2) special regard should be paid to the individual characteristics of each professor and to differences among affiliated departments; (3) professors should be encouraged to meet with students individually for mentoring and counseling; (4) evaluation items should be diverse and clear, eliminating any ambiguity; and (5) reliable and definite score differences should reflect faculty members' efforts and the outcomes of their educational activity.

3. New evaluation system for educational performance

The evaluation system was designed based on the time and effort devoted to each educational activity (Table 1). The items were proposed as measurable educational activities and were divided into three categories: teaching & learner assessment, educational leadership/administration & curriculum, and advising/mentoring. These were rearranged from five categories previously described for documenting the quantity and quality of scholarly engagement in educational activities: teaching, curriculum, advising/mentoring, educational leadership/administration, and learner assessment [11]. The educational score required for promotion increased with rank.


4. Development of quantitative items

Various educational activities were proposed as items for the evaluation system. Over the course of developing and selecting items for the evaluation tool, the items were revised based on group discussion. We then developed a scoring system to measure each activity, using a 1-hour lecture as the basic unit of scoring; each item was weighted according to the educator's effort and time spent.
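To make the weighting concrete, the sketch below illustrates the kind of scoring this scheme implies. The unit scores are taken from Table 1, but the data structures, function name, and sample portfolio are hypothetical illustrations, not the published tool.

```python
# A minimal sketch of the weighted scoring scheme described above.
# Unit scores are taken from Table 1; everything else (names, sample
# portfolio) is a hypothetical illustration, not the published tool.

# Points per reported unit (hour, paper, course, advisee).
UNIT_SCORES = {
    "lecture": 1.0,                  # 1-hour lecture is the basic unit
    "clinical_training": 1.0,        # per hour
    "small_group_tutoring": 1.5,     # per hour, weighted above a lecture
    "supervise_dissertation": 10.0,  # per paper
    "academic_advisor": 2.0,         # per advisee
}

def portfolio_score(activities):
    """Sum of (unit score x reported quantity) over documented activities."""
    return sum(UNIT_SCORES[item] * qty for item, qty in activities.items())

# Example: 30 lecture hours, 10 small-group hours, 1 dissertation supervised.
example = {"lecture": 30, "small_group_tutoring": 10, "supervise_dissertation": 1}
print(portfolio_score(example))  # 30*1.0 + 10*1.5 + 1*10.0 = 55.0
```

Weighting small group tutoring at 1.5 per hour, for example, rewards it above a plain lecture hour, consistent with the effort-and-time principle.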

Before the new evaluation system was applied, each professor documented his or her educational activities for 2011 and calculated a score. We then gathered feedback from faculty members on the feasibility and applicability of the new system.

5. Outcome measure: educational activities

All professors at SNUCM were asked to document their educational activities using the new evaluation system for 2011 through 2013, encompassing the periods before (2011 and 2012) and after (2013) the introduction of the system. We then calculated the change in their total educational activity following the system's introduction.

6. Data analysis

Statistical analysis was performed with the SPSS version 20.0 statistical package (IBM Corp., Armonk, USA). Changes in professors' educational activities were analyzed using analysis of covariance (ANCOVA). A p-value <0.05 was taken to indicate a significant difference.
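The published analysis was run in SPSS; purely as an illustration, an equivalent ANCOVA could be set up as below, modeling the post-introduction score on faculty rank with the pre-introduction score as the covariate. The file and column names are hypothetical.

```python
# Illustrative ANCOVA sketch (the study itself used SPSS 20.0).
# Hypothetical columns: score_post = 2013 score, score_pre = mean
# 2011-2012 score (covariate), rank = assistant/associate/professor.
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

df = pd.read_csv("educational_scores.csv")  # hypothetical data file

# Post-introduction score modeled on rank, adjusting for baseline score.
model = ols("score_post ~ score_pre + C(rank)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))  # type II sums of squares, p-values
```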

7. Ethical consideration

The SNUCM Institutional Review Board provided study approval and waived the requirement for written consent.

Results

1. Results of the simulation

A total of 190 faculty members (71%) participated in the simulation of the system. Faculty scores were widely distributed (mean±standard deviation [SD], 65.43±68.64). Scores for lectures, basic experiment practice, and clinical training, which were the main educational activities of faculty members, showed particularly wide disparities (mean±SD: 17.46±36.63, 12.80±43.95, and 14.04±33.70, respectively), with standard deviations exceeding the mean scores. Together, scores from lectures, basic experiments, and clinical training (mean±SD, 44.3±50.4) accounted for 67.7% of the total educational score (44.3/65.43 ≈ 67.7%).

2. Changes in professors' total educational activities following introduction of the new system

One of the primary goals in developing the new evaluation system was to encourage professors' participation in educational activities. As shown in Figs. 1 and 2, total educational activity increased significantly following the introduction of the new system (p<0.001). The score increase in educational leadership/administration & curriculum was significantly higher than the increases in the other categories (p<0.001), and the increase in assistant professors' educational activity was significantly higher than that of associate or full professors (p=0.045). There was no significant difference in scores across affiliated hospitals (p=0.571).

Fig. 1. The Change of Total Educational Scores according to Categories of Items

Fig. 2. The Change of Total Educational Scores according to the Status of Faculty Members

Discussion

In this study, we present the development of a novel tool for evaluating educational activity. Our goal was to provide an instrument that would encourage promotion committees to recognize and reward medical educators on the basis of their educational activity.

EPs are now used in addition to or in combination with CVs to evaluate educators in many institutions [12]. EPs are highly informative, and the number of medical schools using EP documentation in their promotion dossiers has increased enormously [12].

In this study, despite our efforts to develop clear and detailed guidelines for scoring, some rated items were ambiguous. The diversity of educational activities available made it difficult to establish clear-cut and consistent scoring criteria. Furthermore, educational activities are constantly evolving, making it challenging to evaluate them based on fixed items. Evaluation items must be frequently re-evaluated and updated based on new developments in the field.

Scores for lectures, basic experiment practice, and clinical training showed particularly wide disparities, which might result from unequal opportunities for these activities among affiliated departments. The task force therefore decided to establish an upper limit on scores to reduce possible bias in evaluating lectures and clinical training.

As the results show, faculty members' educational activities are still primarily oriented toward lectures, basic experiments, and clinical training (67.7% of total scores). This may result from unfamiliarity with alternative educational activities such as small group teaching.

Recently, many medical schools have favored small group activities over large lectures, and the importance of tailored guidance for students has grown, increasing the need for effective and committed clinician-educators [8,13]. We hope that our new evaluation system will serve as an inducement for such educational activities.

One obstacle to implementing our new evaluation system is faculty members' unfamiliarity with keeping EPs. Despite our straightforward web-based EP system, faculty members still reported discomfort with maintaining EPs. For a valid and reliable evaluation system, faculty members must document their work meticulously; the quality of the evaluation tool depends entirely on the quality of each faculty member's EP.

Our study has several limitations. First, as mentioned above, at this early stage the evaluation relied on quantitative data; after the system is more fully established, we hope to create a more qualitative approach. Second, the evaluation system was developed through a consensus workshop, so continuous review and modification are required to ensure that the items and scores remain adequate for evaluating educational activities and progress. Third, the evaluation system is designed primarily for the promotion and tenure process: senior professors who are exempt from this process are unaffected by the system and have no incentive to participate, which may explain why the score increase was particularly pronounced among assistant professors.

We hope that our study will trigger productive debate that will lead to a consensus on educator evaluation and EP standards. Medical educators deserve a professional environment in which their efforts are valued and rewarded.

Our evaluation system for educational activities significantly increased faculty participation in educational activities. Further research should focus on upgrading and expanding the system.

Acknowledgements

The authors thank the task force members for their efforts in developing the new system for evaluating educational activities.

Notes

Funding

This study was supported by a grant from the Seoul National University College of Medicine Research Fund.

Conflicts of interest

None.

References

1. Thomas PA, Diener-West M, Canto MI, Martin DR, Post WS, Streiff MB. Results of an academic promotion and career path survey of faculty at the Johns Hopkins University School of Medicine. Acad Med 2004;79:258–264.
2. Buckley LM, Sanders K, Shih M, Hampton CL. Attitudes of clinical faculty about career progress, career success and recognition, and commitment to academic medicine: results of a survey. Arch Intern Med 2000;160:2625–2629.
3. Levinson W, Rubenstein A. Mission critical-integrating clinician-educators into academic medical centers. N Engl J Med 1999;341:840–843.
4. Ohrr H, Yang EB, Chung MH, Lee MS. The study on the faculty evaluation system of teaching ability in Korea. Korean J Med Educ 1999;11:297–312.
5. Kim YI, Kim JY. Faculty evaluation in Korean medical schools. Part I: designing of basic guideline for assessment of faculty activities. Korean J Med Educ 2000;12:153–162.
6. Beasley BW, Wright SM, Cofrancesco J Jr, Babbott SF, Thomas PA, Bass EB. Promotion criteria for clinician-educators in the United States and Canada: a survey of promotion committee chairpersons. JAMA 1997;278:723–728.
7. Levinson W, Rubenstein A. Integrating clinician-educators into academic medical centers: challenges and potential solutions. Acad Med 2000;75:906–912.
8. Hafler JP, Lovejoy FH Jr. Scholarly activities recorded in the portfolios of teacher-clinician faculty. Acad Med 2000;75:649–652.
9. Chandran L, Gusic M, Baldwin C, Turner T, Zenni E, Lane JL, Balmer D, Bar-On M, Rauch DA, Indyk D, Gruppen LD. Evaluating the performance of medical educators: a novel analysis tool to demonstrate the quality and impact of educational activities. Acad Med 2009;84:58–66.
10. Gusic ME, Baldwin CD, Chandran L, Rose S, Simpson D, Strobel HW, Timm C, Fincher RM. Evaluating educators using a novel toolbox: applying rigorous criteria flexibly across institutions. Acad Med 2014;89:1006–1011.
11. Simpson D, Fincher RM, Hafler JP, Irby DM, Richards BF, Rosenfeld GC, Viggiano TR. Advancing educators and education by defining the components and evidence associated with educational scholarship. Med Educ 2007;41:1002–1009.
12. Simpson D, Hafler J, Brown D, Wilkerson L. Documentation systems for educators seeking academic promotion in U.S. medical schools. Acad Med 2004;79:783–790.
13. Bunton SA, Mallon WT. The continued evolution of faculty appointment and tenure policies at U.S. medical schools. Acad Med 2007;82:281–289.


Table 1. Educator Portfolios Template Examples

| Category | Education item example | Unit score | Total teaching hour/year | Note |
| --- | --- | --- | --- | --- |
| Teaching & learner assessment | Lecture | 1/hour | | |
| | Clinical training | 1/hour | | |
| | Clinical practice tutoring | 1/hour | | |
| | Supervise dissertation | 10/paper | | |
| | Small group tutoring | 1.5/hour | | |
| | Facilitator in communication training program | 1.5/hour | | |
| | Tutor for dyscompetent student | 2/hour | | |
| | Development of learning material | 10/each | | |
| | Offer research course | 10/course | | |
| | Basic experiment | 1/hour | | |
| Educational leadership/administration & curriculum | Chairman in board of education | 10/year | | |
| | Peer review of lecture | 2/hour | | |
| | Interviewer of admission committee | 1/hour | | |
| | Chairman of curriculum director | 10/year | | |
| | Educational committee member activity | 3/each | | |
| | Chief faculty in clinical clerkship | 10/year | | |
| Advising/mentoring | Academic advisor | 2/each | | |
| | Advisor for students' club activity | 2/each | | |
| | Remedial program tutor | 2/hour | | |
| | Mentor for high-risk student | 2/each | | |
| Grand total | | | | |