Loomba and Jindal: Improving process aspect of oral examination as assessment tool in undergraduate biochemistry by introducing structured oral examination: an observational study in India based on a survey among stakeholders

Abstract

Purpose

The traditional oral examination, though a good tool for assessing the depth and breadth of a student’s knowledge, has its shortcomings. A variable number of questions of variable difficulty, asked by different examiners with different expectations, can introduce bias in scores. The process aspect of the oral examination of first-year undergraduate medical students was improved by structuring it and by creating uniformity in the number, time, and difficulty level of the questions, and feedback was taken regarding its acceptance as an improved assessment tool.

Methods

After the topics were finalized, viva questions covering all aspects of the chosen topics were framed, validated by subject experts, and categorized into three difficulty levels. The number of questions to be asked at each difficulty level, the time allotted, and the marks given to each question were pre-decided. After the students were briefed, the structured viva was conducted, and feedback was taken from students and examiners.

Results

The majority (87%) of first-year undergraduate students undertook the structured viva and filled in the feedback form. Nearly all students felt that the structured oral examination was a fair and unbiased assessment tool with less subjectivity than the traditional viva. Most students (83.9%) felt that the topics were comprehensively covered, and 96.4% felt less stressed. Among examiners, there was 100% agreement on the uniformity of the questions asked, the topics covered, reduced subjectivity, and the absence of a carryover effect.

Conclusion

The examiners have accepted the structured oral examination as a formative assessment tool for future batches of students and are ready to explore its utility as a summative assessment tool.

Introduction

“Assessment” is at the center stage of any form of education. As George E. Miller (1919–1998) aptly put it, assessment drives learning, but for that it needs to be planned and implemented strategically. Additionally, an assessment tool should be reliable and valid; only then can it be used as a tool to enhance learning.
The oral viva voce has been used as an assessment tool for ages and continues to be used in most undergraduate and postgraduate medical colleges across India. This form of assessment often feels appropriate because it gives the examiner a lot of flexibility [1,2]. The student can be scored immediately, as opposed to in a written exam; the depth and breadth of knowledge can be tested; the viva can be steered according to the student’s responses; and it appears to check clinical competence better than written exams do [2,3]. It has the added advantage of testing a student’s reasoning and communication skills, as it offers one-on-one interaction with the examiner [1-4].
Despite all the apparent benefits of the oral viva voce, it comes with many pitfalls. To begin with, different questions with different difficulty levels create an unequal platform on which students are judged. Most of the time, the questions test recall rather than critical thinking and higher-order thinking skills [1]. Because the questions are impromptu and unplanned, they do not cover the syllabus. The bias is further broadened by the differing temperaments of examiners: some are stricter or more lenient in awarding scores than others [5]. All too often, the viva turns into a confrontation between examiner and student, making the examination atmosphere unconducive [6]. The high level of stress experienced by a student during the traditional viva offers no additional educational benefit, as it does not motivate the student to study better. Additionally, the time given to each student is not uniform and often decreases as the day progresses, with the students at the end being scored on only a few questions [7]. The traditional oral exam (TOE) has also been found to be more prone to carryover effects, in which the performance of the previous student affects the score of the next, and to personal biases. A student’s appearance and confidence also greatly affect exam scores, and agreement between examiners has been found to be poor [8]. All in all, these attributes result in a score that is unrelated to the competency of the student.
These shortcomings of oral exams can be minimized by applying a structured oral examination (SOE). In an SOE, the questions and the correct answers, with their scores, are predetermined to ensure a standardized examination process and consistency from one examiner to another and from one examinee to another. It involves pre-deciding the syllabus and the competencies to be assessed and fixing the time spent on each student [9-11]. This is a relatively new method of assessment, tested in small groups, with both faculty and students showing a positive perception [2,3,12].
Because the SOE is a time-consuming and labor-intensive exercise, limited studies are available on its application in medical education in India, especially in the subject of biochemistry. The objective of this study was to evaluate the SOE as an assessment tool in the department of biochemistry via feedback from faculty and students.

Methods

1. Planning

This study was carried out with 57 medical and 30 dental undergraduate students of the first professional year studying at a medical college and hospital in North India. Approval for the implementation of this project was obtained from the Christian Medical College and Hospital, Ludhiana (vide no., CMC/2282).
A faculty meeting was called to sensitize the faculty members to the need to structure the oral examination and to the process of conducting it. Suggestions were taken and incorporated into the format. Topics and dates for the conduct of the SOE were decided. The total number of questions to be asked at each difficulty level, and the time and marks to be allotted per question, were decided, and a checklist was prepared.

2. Sensitizing the students

The students were sensitized to the method of conducting the SOE 2 weeks prior to the actual examination and were encouraged to clear any doubts regarding the methodology. Participation was voluntary, and informed consent was obtained from the students.

3. Preparation

Questions were framed and categorized into three difficulty levels, with a pre-decided color code for each category, covering all aspects of the selected topics. Each category had five questions with a predefined answer key, with specific keywords provided to the examiners. Each student was required to answer at least three questions correctly in a category to advance to the next level: (1) yellow cards, the lowest difficulty level, held recall-type questions corresponding to the first level of Bloom’s taxonomy [13], each carrying two marks; (2) blue cards, the middle difficulty level, held questions on the biochemical basis of disease conditions (asking “why”), corresponding to the second level of Bloom’s taxonomy, each carrying four marks; and (3) pink cards, the highest difficulty level, held questions demanding clinical correlation and application of biochemistry, corresponding to the interpretive or third level of Bloom’s taxonomy, each carrying six marks.
The printed questions were pasted on colored cards corresponding to their difficulty level, and the cards were numbered in order of increasing difficulty. As many sets of cards as there were faculty members were prepared for each difficulty level. These were circulated amongst the faculty, and their suggestions were incorporated. The questions were also validated by subject experts for language, difficulty level, and coverage of the topic. The highest possible score a student could achieve by answering all nine questions correctly was 36.
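For illustration, the progression and marking rules just described can be expressed as a short script. The following is a minimal sketch in Python, with hypothetical names (the actual examination used printed color-coded cards, not software):

```python
# Sketch of the SOE marking scheme described above (hypothetical
# names; the exam itself used printed color-coded cards).
LEVELS = [
    ("yellow", 2),  # recall questions, 2 marks each
    ("blue", 4),    # biochemical basis ("why") questions, 4 marks each
    ("pink", 6),    # clinical correlation/application, 6 marks each
]
ADVANCE_THRESHOLD = 3  # correct answers needed to reach the next level

def total_score(correct_per_level):
    """Total marks given the number of correct answers at each level.

    Scoring stops at the first level where the student answers fewer
    than three questions correctly, since they do not advance further.
    """
    score = 0
    for (color, marks_each), n_correct in zip(LEVELS, correct_per_level):
        score += n_correct * marks_each
        if n_correct < ADVANCE_THRESHOLD:
            break  # did not qualify for the next difficulty level
    return score

# Maximum score: three correct answers at every level,
# re-deriving the article's stated maximum of 36 marks.
assert total_score([3, 3, 3]) == 3 * 2 + 3 * 4 + 3 * 6 == 36
```

Stopping at the first failed level mirrors the rule that a student must answer three questions correctly before advancing to the next category.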
A checklist with the answer key and specific keywords, along with the roll numbers of the students and the number of questions each answered correctly at each difficulty level, was prepared for each examiner to ensure fair and consistent grading for all students. Feedback forms for both faculty and students were prepared, with responses on a 5-point Likert scale.

4. Execution

On the day of the examination, the students were briefed about the new system of oral examination. They were informed of the distribution of marks in each difficulty category, the minimum number of questions to be answered correctly at each level before qualifying for the next, and the color code for each category.
The students were randomly divided into six groups, each assigned to one examiner, and 7 minutes were allotted to each student for the viva. Students were asked questions from each category in succession and, upon answering three questions correctly, advanced to the next category; the number of questions answered by students therefore ranged from 0 to 9. During the examination, the department attendant sat with a buzzer and a stopwatch and was instructed to sound the buzzer every 7 minutes until the viva was over. Two postgraduate students of the department coordinated the movement of students from the demonstration room to each faculty member’s office, and their exit to the departmental library once their 7-minute viva was over. Arrangements were made so that the students could not interact with each other until the examination was over.
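The allocation described above can be sketched as follows; this is a minimal illustration in Python under stated assumptions (hypothetical roll numbers; the article does not specify how the randomization was carried out):

```python
import random

N_STUDENTS = 87          # 57 medical + 30 dental students
N_EXAMINERS = 6          # one group per examiner
MINUTES_PER_STUDENT = 7  # fixed viva time, enforced by the buzzer

# Hypothetical roll numbers 1..87; shuffle, then deal round-robin so
# group sizes differ by at most one (three groups of 15, three of 14).
rolls = list(range(1, N_STUDENTS + 1))
random.shuffle(rolls)
groups = [rolls[i::N_EXAMINERS] for i in range(N_EXAMINERS)]

for i, group in enumerate(groups, start=1):
    print(f"Examiner {i}: {len(group)} students, "
          f"about {len(group) * MINUTES_PER_STUDENT} minutes of viva")
```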
After the viva, feedback was taken from students and faculty. The feedback form contained a Likert-scale questionnaire and two open-ended questions eliciting their views on the overall process, the syllabus covered, uniformity of questions, fairness of the system, subjectivity, time allotment, stress levels, and any other bias they felt (Appendices 1, 2). The feedback obtained from faculty and students was analyzed on a 5-point Likert scale, and descriptive statistics were computed using Microsoft Excel (Microsoft Corp., Redmond, USA).
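The article reports the descriptive analysis as done in Excel; purely as an illustration of the same tabulation, here is a sketch in Python with made-up responses (the counts are shaped like statement 1 of Table 1, but the data are hypothetical):

```python
from collections import Counter

SCALE = ["Strongly disagree", "Disagree", "Neutral", "Agree", "Strongly agree"]

def likert_summary(responses):
    """Count and percentage of each scale point for one statement."""
    counts = Counter(responses)
    n = len(responses)
    return [(opt, counts[opt], round(100 * counts[opt] / n, 1)) for opt in SCALE]

# Hypothetical data mirroring the counts of Table 1, statement 1
# (3 neutral, 28 agree, 56 strongly agree; 87 responses in all).
responses = ["Neutral"] * 3 + ["Agree"] * 28 + ["Strongly agree"] * 56
for option, count, pct in likert_summary(responses):
    print(f"{option}: {count} ({pct}%)")
```

Published percentages may differ from such output by a decimal point, depending on the rounding convention used.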

Results

In total, 87 students (57 medical and 30 dental) appeared for the SOE. They were randomly divided into six groups: three groups of 14 students each and three of 15 students each.
The results of the feedback questionnaire from the students are depicted in Table 1 and Fig. 1. The majority (96.4%) of the students experienced less stress with the SOE than with the traditional viva, with 64% strongly agreeing. A total of 83.9% of students felt that the SOE was comprehensive and covered all aspects of the chosen topics. Nearly all students (98.7%) felt that there was uniformity in the questions asked, with 69% strongly agreeing. Nearly all students (99.9%) also felt that the SOE was a fair, unbiased, and less subjective assessment tool.
Seventy percent of students found the time allotted per question insufficient; only 3.4% found it appropriate. Seventy-eight percent felt that the examiner’s mood did not affect their performance. While 40.2% of students felt that this method would lead them to prepare the topic more thoroughly, 49.4% were neutral. In all, 90.8% of students said they would prefer the SOE over the traditional viva.
Most of the students’ comments on the open-ended questions were positive; they are summarized in Table 2. The students found the SOE to be an exciting and less stressful assessment tool. Only one student commented that the buzzer caused stress. Many students commented that more time should have been given per question.
The perceptions of the faculty conducting the SOE are shown in Table 3 and Fig. 2. There was 100% agreement among examiners that the SOE was comprehensive. They found the questions to be uniform and less subjective than in the traditional viva. The examiners also felt that there was no carryover effect (the performance of the previous student affecting that of the next). Most examiners (83.3%) also felt that the SOE was fair and unbiased and agreed to recommend it as one of the formative assessment tools for future batches. Two-thirds of the examiners (66.6%) preferred this method over the traditional viva.
Most examiners (83.3%) were neutral about whether the time allotted for each question was adequate, while 16.6% found it inadequate, and 33.3% felt that they were not able to assess the depth and breadth of the students’ knowledge.
The specific comments from the examiners, shown in Table 2, indicate that they found it to be a good method but insisted on blueprinting the questions and increasing the time allotted. Half (50%) of the faculty agreed to using it as a summative assessment tool in conjunction with other methods but also found it to be a labor-intensive exercise.

Discussion

For an assessment to be fair, it must be both reliable and valid. Numerous studies affirm that structuring oral exams enhances their reliability and validity in both undergraduate and postgraduate settings [6,9,14-17]. While TOEs have the potential to assess the depth and breadth of a student’s knowledge, they are often criticized for their lack of reliability and susceptibility to subjective bias [1,4-8]. Scoring inconsistencies in TOEs arise from psychological biases, the subjective nature of oral assessments, and variability in examiner expectations. Due to these limitations, TOEs have been largely discontinued as a summative assessment tool in many specialties in the United States [12]. These issues can be somewhat mitigated by employing structured rating scales [4].
In response to these challenges, various alternative assessment methods have been explored, such as multiple-choice questions (MCQs) and direct observation. However, MCQs often fail to measure higher-order thinking, since the options provided can cue the student to recognize correct answers, and direct observation is limited by the availability and time constraints of trained faculty [12].
This study introduced the SOE, a relatively new assessment method, in undergraduate medical biochemistry and evaluated its reception among students and faculty. In our study, a significant majority of students (96.4%) reported lower stress levels with the SOE than with the traditional viva, with 64% in strong agreement. Additionally, 78% of students felt unaffected by the examiner’s mood, a significant improvement over the stress typically associated with TOEs in previous studies [5,7,11]. However, Waseem and Iqbal [18] noted that although fear of the examination was still present in both SOE and TOE, the SOE was more effective in distinguishing between average and high performers. Boulais et al. [19] found that while the SOE was a source of stress, some students considered it less stressful than oral structured clinical examinations. Another study found that the SOE did not significantly reduce stress, likely due to students’ unfamiliarity with the new system [20].
A large proportion of students in our study (83.9%) felt that the SOE provided comprehensive coverage of the syllabus, consistent with the findings of Wang et al. [8], who also noted improvements in communication skills and in the practical application of theoretical knowledge. Furthermore, nearly all our students (98.7%) observed uniformity in the questions, and 99.9% found the SOE to be fair and unbiased, aligning with previous studies highlighting the fairness of the SOE over the TOE in undergraduate medical and dental education [5,10,15].
Faculty feedback was overwhelmingly positive, with 100% agreeing that the SOE was comprehensive, uniform, and less subjective than the TOE. These results are in line with the perspectives observed by other authors, whose faculty noted that the SOE reduced bias, minimized the luck factor, and ensured that each student was assessed on the same criteria [5-7]. In our study, the examiners also noted that there was no carryover effect, meaning that the performance of one student did not influence the assessment of the next; such carryover has been reported as another major drawback of the traditional viva [8].
In our study, 70.2% of students and 16.6% of faculty found the time allotted per question inadequate, while 83.3% of faculty were neutral. We acknowledge that, although concerns about time constraints were raised, extending the time per question could make it challenging to accommodate all students in a single session, especially given the logistical challenges posed by a larger number of students and a limited number of available faculty. Despite similar concerns about time constraints observed by other authors, the SOE was still regarded as a valuable addition to assessment methods [12]. Future research should explore the impact of extended time on overall performance.
Interestingly, nearly half of the students in our study were unsure whether the SOE helped them prepare better, although prior research suggests that the SOE promotes greater engagement with the material [8,19]. One-third of our faculty (33.3%) felt that this method would enhance students’ learning.
Overall, 90.8% of students in our study preferred the SOE over the TOE. This is consistent with the positive student feedback observed by Dhasmana et al. [14], who reported 83% overall acceptability, with 66% of students expressing a preference for the SOE over the TOE. In a study investigating the use of the SOE in pediatric endocrinology fellowship programs, most fellows (94%) and faculty (86%) recommended integrating the SOE into their curriculum, citing its effectiveness in preparing fellows for unsupervised clinical practice [12]. Similarly, in a study by Khan and Mirza [20], 88% of students and 83.3% of examiners preferred the SOE over the traditional viva owing to its perceived fairness, uniformity, and reduced bias.
Despite acknowledging the labor-intensive nature of the SOE, the majority of faculty (66.6%) in our study preferred the SOE to the TOE, and 83.2% recommended it for future use. Concerns about the SOE being labor- and resource-intensive have been voiced by faculty in other studies, who opined that it leads to examiner fatigue [4,6,8,11]. The availability of human resources and time is an important factor in the overall feasibility of using the SOE routinely. Shenwai and Patil [5] similarly concluded that, despite requiring extensive groundwork, the SOE is a more effective assessment tool and can be widely adopted in medical education with some modifications. Another study, by Boulais et al. [19], identified significant logistic challenges with the SOE but concluded that extensive planning can overcome these limitations. We are convinced that advance preparation of the question bank and answer checklists can alleviate some of these concerns. Moreover, once the process has been planned and well conducted for one batch, it may become easier to conduct for subsequent batches.
Numerous studies have compared the SOE with the TOE, yielding varied results regarding the scores students obtain with each method. Khilnani et al. [6] and Muzzin and Hart [7] found the SOE to yield lower marks than the traditional viva, but at the same time students felt that the SOE required in-depth preparation of the topic. Puppalwar et al. [21] observed higher scores with the TOE, indicating examiner leniency, while fewer students passed the SOE, reflecting its stricter and more objective approach. Ganji [10] demonstrated strong test-retest reliability of the SOE in dental education, with improvement in scores after its implementation. Another study observed a significant disparity between scores obtained with the TOE and the SOE and found the latter to be a fairer evaluation of students’ knowledge [14]. Imran et al. [15] found that the SOE had the highest discriminatory power, categorizing more students as appropriately assessed compared with an unstructured viva and structured theory exams.
This study is limited by its single-institution scope and by the lack of a direct comparison between SOE and TOE scores.
This study’s strength is its focus on the use of the SOE as a formative assessment tool in undergraduate medical biochemistry, a relatively underexplored area, thereby contributing valuable insights into its applicability in this discipline. The research provides comprehensive feedback from both students and faculty, offering a well-rounded view of the SOE’s effectiveness in reducing stress, increasing fairness, and ensuring uniformity. Additionally, it addresses practical challenges such as faculty workload and time constraints, offering realistic solutions, such as advance preparation of question banks, for implementing the SOE effectively in educational settings. The study highlights the significant reduction in subjective bias and in the variability of examiner expectations, which has been a major limitation of TOEs, and supports the use of the SOE for consistent and equitable assessment.
In conclusion, the SOE is an effective formative assessment tool, with potential as a summative tool, and can be integrated into undergraduate biochemistry as a fair, objective, and comprehensive method of assessment, aligned with the current trend of competency-based education.

Acknowledgments

We acknowledge the guidance and help provided by the faculty of Biochemistry, who agreed to and participated enthusiastically in the execution of the structured oral examination, and the first-year medical and dental students who agreed to participate.

Notes

Funding
No financial support was received for this study.
Conflicts of interest
No potential conflict of interest relevant to this article was reported.
Author contributions
RL designed the study. RL and NMJ executed the plan, conducted the literature review and statistical analysis, and prepared both the initial and final drafts of the manuscript.

Fig. 1.
Percentage Distribution of Students’ Responses to the Feedback Questionnaire Based on a Likert Scale
Fig. 2.
Percentage Distribution of Examiners’ Responses to the Feedback Questionnaire Based on a Likert Scale
Table 1.
Students’ Responses to the Feedback Questionnaire Based on a Likert Scale
No. Statement Strongly disagree Disagree Neutral Agree Strongly agree
1. I experienced less stress compared to traditional viva 0 0 3 (3.4) 28 (32.1) 56 (64.3)
2. SOE is comprehensive and covers all the aspects of chosen topic. 0 0 14 (16.0) 38 (43.7) 35 (40.2)
3. There was uniformity of questions asked 0 0 1 (1.1) 26 (29.8) 60 (68.9)
4. This system of assessment is fair and unbiased. 0 0 0 29 (33.3) 58 (66.6)
5. There is much less subjectivity than traditional viva. 0 0 0 26 (29.8) 61 (70.1)
6. Time for each question was enough in each set of difficulty level 19 (21.8) 42 (48.4) 23 (26.4) 3 (3.4) 0
7. Examiner’s mood affected my performance 33 (37.9) 35 (40.2) 16 (18.4) 3 (3.4) 0
8. This system of examination will lead me to prepare the topic more thoroughly. 0 9 (10.3) 43 (49.4) 32 (36.8) 3 (3.4)
9. I would prefer SOE over traditional way of taking viva. 0 0 8 (9.2) 44 (50.6) 35 (40.2)

Data are presented as number (%).

SOE: Structured oral examination.

Table 2.
Students’ and Examiners’ Remarks/Comments on the Open-Ended Questions in the Feedback Questionnaire
Question Remarks/comments
Students
 What do you appreciate about this way of taking oral viva? It was fair as compared to traditional viva
I found it to be unbiased
There was equal opportunity and time
Should be implemented routinely
Instils confidence in us
Gives us an idea of where we stand
Very innovative!
Very interesting!
Will look forward to this type of exam
It was very exciting!
I was much less nervous
 What would you want to be changed in this process of taking viva to improve it further? There should have been more time per question
Hearing the bell gave me stress
Students should be given all the questions to assess themselves later
Feedback should be given on the spot to students
Examiners
 How can this process be improved? More number of questions to be added, combine with subjective test
By using for selective topics
Blueprinting of questions and difficulty level should be done
Good way of taking viva
More questions and more time per question
Blueprinting of questions should be done
 Can SOE be used as a summative assessment tool? Yes, as one of the methods, not all in all
Yes, in conjunction with subjective assessment
No
No
No
Yes, but a lot of work in planning and question bank preparation would be needed.
Table 3.
Examiners’ Responses to the Feedback Questionnaire Based on a Likert Scale
No. Statement Strongly disagree Disagree Neutral Agree Strongly agree
1. It enabled me to assess the depth and breadth of student’s knowledge of the topic. 0 2 (33.3) 3 (50.0) 1 (16.6) 0
2. SOE is comprehensive and covers all the aspects of chosen topic. 0 0 0 5 (83.3) 1 (16.6)
3. There was uniformity of questions asked and time allotted to each question and student. 0 0 0 3 (50.0) 3 (50.0)
4. This system of assessment is fair and unbiased. 0 0 1 (16.6) 2 (33.3) 3 (50.0)
5. There is much less subjectivity than traditional viva. 0 0 0 4 (66.6) 2 (33.3)
6. Time for each question was enough in each set of difficulty level 0 1 (16.6) 5 (83.3) 0 0
7. There was no carry over effect. (Performance of previous student affecting that of next.) 0 0 0 4 (66.6) 2 (33.3)
8. This process of structured oral evaluation will enhance student learning. 0 0 4 (66.6) 2 (33.3) 0
9. SOE is preferable over traditional way of taking viva. 0 0 2 (33.3) 4 (66.6) 0
10. Will recommend it as one of the formative assessment tools for future batches. 0 0 1 (16.6) 4 (66.6) 1 (16.6)

Data are presented as number (%).

SOE: Structured oral examination.

References

1. Torke S, Abraham RR, Ramnarayan K, Asha K. The impact of viva-voce examination on students’ performance in theory component of the final summative examination in physiology. J Physiol Pathophysiol 2010;1(1):10-12.

2. Rahman G. Appropriateness of using oral examination as an assessment method in medical or dental education. J Educ Ethics Dent 2011;1(2):46-51.
3. Rangachari PK. The targeted oral. Adv Physiol Educ 2004;28(1-4):213-214.
4. Raymond MR, Webb LC, Houston WM. Correcting performance-rating errors in oral examinations. Eval Health Prof 1991;14(1):100-122.
5. Shenwai MR, Patil KB. Introduction of structured oral examination as a novel assessment tool to first year medical students in physiology. J Clin Diagn Res 2013;7(11):2544-2547.
6. Khilnani AK, Charan J, Thaddanee R, Pathak RR, Makwana S, Khilnani G. Structured oral examination in pharmacology for undergraduate medical students: factors influencing its implementation. Indian J Pharmacol 2015;47(5):546-550.
7. Muzzin L, Hart L. Oral examinations. In: Neufeld V, Norman G, eds. Assessing Clinical Competence. New York, USA: Springer; 1985:71-93.

8. Wang L, Khalaf AT, Lei D, et al. Structured oral examination as an effective assessment tool in lab-based physiology learning sessions. Adv Physiol Educ 2020;44(3):453-458.
9. Alharbi F, Alamer A. Feasibility of radiology online structured oral examination for undergraduate medical students. Insights Imaging 2022;13(1):120.
10. Ganji KK. Evaluation of reliability in structured viva voce as a formative assessment of dental students. J Dent Educ 2017;81(5):590-596.
11. Patel BS, Kubavat A, Piparva K. Correlation of students performance in theory and practical of final summative pharmacology examination in MBBS curriculum: a critical insight. Natl J Physiol Pharm Pharmacol 1970;3(2):171-175.
12. Shenoy RV, Newbern D, Cooke DW, et al. The structured oral examination: a method to improve formative assessment of fellows in pediatric endocrinology. Acad Pediatr 2022;22(7):1091-1096.
13. Adams NE. Bloom’s taxonomy of cognitive learning objectives. J Med Libr Assoc 2015;103(3):152-153.
14. Dhasmana DC, Bala S, Sharma R, et al. Introducing structured viva voce examination in medical undergraduate pharmacology: a pilot study. Indian J Pharmacol 2016;48(Suppl 1):S52-S56.
15. Imran M, Doshi C, Kharadi D. Structured and unstructured viva voce assessment: a double-blind, randomized, comparative evaluation of medical students. Int J Health Sci (Qassim) 2019;13(2):3-9.

16. Schubert A, Tetzlaff JE, Tan M, Ryckman JV, Mascha E. Consistency, inter-rater reliability, and validity of 441 consecutive mock oral examinations in anesthesiology: implications for use as a tool for assessment of residents. Anesthesiology 1999;91(1):288-298.
17. Wass V, Van der Vleuten C, Shatzer J, Jones R. Assessment of clinical competence. Lancet 2001;357(9260):945-949.
18. Waseem N, Iqbal K. Importance of structured viva as an assessment tool in anatomy. J Univ Med Dent Coll 2016;7(2):29-34.

19. Boulais I, Ouellet K, Lachiver EV, et al. Considering the structured oral examinations beyond its psychometrics properties. Med Sci Educ 2023;33(2):345-351.
20. Khan HM, Mirza TM. Perceptions of oral structured examination: a move from subjectivity to objectivity: oral structured examination. Pak Armed Forces Med J 2017;67(1):41-46.

21. Puppalwar PV, Rawekar A, Chalak A, Dhok A, Khapre M. Introduction of objectively structured viva-voce in formative assessment of medical and dental undergraduates in biochemistry. J Res Med Educ Ethics 2014;4(3):321-325.

Appendices

Appendix 1. Student Feedback Questionnaire

Please tick in the box that best describes your response.

No. Statement Response
Strongly disagree Disagree Neutral Agree Strongly agree
1. I experienced less stress compared to traditional viva.
2. SOE is comprehensive and covers all the aspects of chosen topic.
3. There was uniformity of questions asked.
4. This system of assessment is fair and unbiased.
5. There is much less subjectivity than traditional viva.
6. Time for each question was enough in each set of difficulty level.
7. Examiner’s mood affected my performance.
8. This system of examination will lead me to prepare the topic more thoroughly.
9. I would prefer SOE over traditional way of taking viva.

Q1. What do you appreciate about this way of taking oral viva? Q2. What would you want to be changed in this process of taking viva to improve it further?

SOE: Structured oral examination.

Appendix 2. Teacher Feedback Questionnaire

Please tick in the box that best describes your response.

No. Statement Response
Strongly disagree Disagree Neutral Agree Strongly agree
1. It enabled me to assess the depth and breadth of student’s knowledge of the topic.
2. SOE is comprehensive and covers all the aspects of chosen topic.
3. There was uniformity of questions asked and time allotted to each question and student.
4. This system of assessment is fair and unbiased.
5. There is much less subjectivity than traditional viva.
6. Time for each question was enough in each set of difficulty level.
7. There was no carry over effect. (Performance of previous student affecting that of next.)
8. This process of structured oral evaluation will enhance student learning.
9. SOE is preferable over traditional way of taking viva.
10. Will recommend it as one of the formative assessment tools for future batches.

Q1. How can this process be improved? Q2. Can the SOE be used as a summative assessment tool?

SOE: Structured oral examination.
