Comparison of undergraduate dental students' academic performance using structured and unstructured oral examinations



ORIGINAL ARTICLE | Year: 2022 | Volume: 8 | Issue: 3 | Page: 131-136


Nagesh Lakshminarayan1, GV Usha2
1 Department of Public Health Dentistry, Dayananda Sagara Dental College, Bengaluru, Karnataka, India
2 Department of Public Health Dentistry, Bapuji Dental College and Hospital, Davanagere, Karnataka, India

Date of Submission: 13-Nov-2021 | Date of Acceptance: 22-Jun-2022 | Date of Web Publication: 28-Sep-2022

Correspondence Address:
Dr. G V Usha
Department of Public Health Dentistry, Bapuji Dental College and Hospital, Davanagere, Karnataka
India

Source of Support: None, Conflict of Interest: None


DOI: 10.4103/ijam.ijam_147_21



Introduction: Assessment plays a central role in formal education. Unstructured oral examinations are the status quo in dental education in India but have drawbacks, including a lack of inter-rater reliability. It is not known whether structured oral examinations yield different scores, or how they will be accepted by students and examiners.
Materials and Methods: A sample of 20 students was randomly selected from among third-year BDS students who had scored 65% or above in the 2nd-year university examination at Bapuji Dental College and Hospital. They received a total of 4 h of lectures and problem-solving sessions on dental ethics. Four raters (teaching faculty) with similar academic experience and designation were selected and trained to conduct structured oral examinations according to a format specially designed for the purpose. Half the students underwent the structured examination followed by the unstructured examination, while the other half underwent the unstructured examination followed by the structured examination. A paired t-test was applied to test the statistical difference between the structured and unstructured oral examination formats.
Results: Students obtained a mean score of 13.35 ± 3.8 out of a total of 20 in the unstructured oral examination, compared with 14 ± 3.76 in the structured oral examination; the difference was not statistically significant (P = 0.69). Acceptance trended in favor of the structured examination, but the difference was not statistically significant. The structured examination had higher inter-rater reliability than the unstructured examination.
Conclusion: The results showed good reliability and repeatability of the structured oral examination format with an inter-rater reliability of 0.82. Structured viva voce was not found to improve the performance of students when compared to unstructured viva voce.
The following core competencies are addressed in this article: Practice-based learning and improvement, Patient care and procedural skills, Systems-based practice, Medical knowledge, Interpersonal and communication skills, and Professionalism.

Keywords: Dental curriculum, dental education, dental ethics, dental students, structured viva voce


How to cite this article:
Lakshminarayan N, Usha G V. Comparison of undergraduate dental students' academic performance using structured and unstructured oral examinations. Int J Acad Med 2022;8:131-6
How to cite this URL:
Lakshminarayan N, Usha G V. Comparison of undergraduate dental students' academic performance using structured and unstructured oral examinations. Int J Acad Med [serial online] 2022 [cited 2022 Sep 30];8:131-6. Available from: https://www.ijam-web.org/text.asp?2022/8/3/131/357232

  Introduction

Assessment has a central role in formal education. A scientifically developed curriculum, sound learning techniques such as case-based learning, problem-based learning, and simulation, in combination with accurate assessment methods, may solve several problems prevailing in the educational system.[1] Dentistry is a rapidly developing profession and competency-based education is a catchphrase of dental education in the 21st century. Multiple assessment methods[2] are used to assess students at various levels during their learning period in dental schools. Viva voce, or oral examinations, provide a unique opportunity for assessments by bringing together the examiner with the examinee so that many qualities can be assessed. These qualities include depth of knowledge, attitudes, alertness, ability to perform under stress, decision-making, problem-solving, critical reasoning, sensitivity to context, professionalism, communication, and synthesizing skills. Probing questions can help in bringing out the depth and breadth of understanding of the candidate.[3]

Traditional, or unstructured, oral examinations suffer from several drawbacks, including poor reliability between examiners and examiners who do not use the examination to rigorously assess a candidate's abilities. These drawbacks can be controlled by changing the format to a structured examination. A structured examination is organized, consistent, and can assess students with better accuracy. Structured formatting of oral examinations was found to be more valid and reliable in some studies done on diverse groups of students.[2],[4],[5],[6],[7],[8],[9],[10]

Although substantial research has been done on structured oral examination in medical sciences, there is a relative lack of such research in dentistry in India. In India, the structured viva voce is neither popular nor in vogue. It is not known if scores between structured and unstructured oral examinations will differ for dental students. Further, students and the assessors are important stakeholders in the assessment and whether they differ in their acceptance toward the two oral examination formats is unknown. Their acceptance is crucial in the adoption of a new format. Hence, a study was planned with the following objectives:

1. To find out whether there is a difference in the marks scored by the students between structured and unstructured oral examinations using dental ethics as a topic for assessment

2. To assess the degree of acceptance of structured oral examination format among students and the examiners and

3. To find out whether there is a difference in the acceptance level of students and the examiners between the structured and unstructured format of oral examinations.

  Methods

Study population

A sample of 20 students was randomly selected from a group of third-year BDS (Bachelor of Dental Surgery) students who had scored 65% or above in the 2nd-year university examination at Bapuji Dental College and Hospital. Bapuji Dental College and Hospital, located in Davanagere, Karnataka, is a postgraduate institution that teaches dentistry to both BDS and MDS students. All eligible students were given oral and written information about the purpose of the study. The study was conducted in the Department of Public Health Dentistry, Bapuji Dental College and Hospital. A detailed protocol of the planned research was submitted to and approved by the Institutional Review Board of Bapuji Dental College and Hospital (BDC/1208/12-13).

Sample size

The total sample size was 20 (ten students in each group). The sample size was calculated with the type I error (α error) fixed at 5% and the type II error (β error) fixed at 20%, an expected mean difference of 3, and a standard deviation of 2. Based on this calculation, the minimum sample required was 14 students; it was decided to use a sample of 20, which would yield sufficient power to detect a meaningful difference between groups.
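The calculation above is consistent with the standard normal-approximation formula for comparing two group means. A minimal sketch follows; the function name and exact formula are assumptions for illustration, not taken from the paper:

```python
from math import ceil
from statistics import NormalDist

def min_sample_per_group(diff, sd, alpha=0.05, power=0.80):
    """Normal-approximation sample size per group for comparing two means:
    n = 2 * (z_{1-alpha/2} + z_{power})**2 * (sd / diff)**2."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_beta = z.inv_cdf(power)           # ~0.84 for power = 0.80
    return ceil(2 * (z_alpha + z_beta) ** 2 * (sd / diff) ** 2)

n = min_sample_per_group(diff=3, sd=2)
print(n, 2 * n)  # 7 per group, 14 in total
```

With the stated inputs this yields a minimum of 7 students per group (14 in total), matching the reported minimum of 14.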

Study design

The study was a crossover study [Figure 1]. Students were exposed to a total of 4 h of lectures and problem-solving sessions on dental ethics by a single teacher who was not otherwise involved in the study. Students were trained to solve ethical dilemmas presented as case scenarios; some of these scenarios were extracted from a standard book on ethics and tailored to the Indian situation.

Four teaching faculty raters from our department, with similar academic experience and faculty status, were selected and trained to conduct structured oral examinations according to a format specially designed for the purpose. They were also provided literature on how to conduct structured oral examinations. The topic of dental ethics was divided into two parts: "fundamentals of dental ethics" and "ethical dilemmas." The first part was the content area for the unstructured oral examination and the second for the structured oral examination. Blueprinting was done for the structured oral examination; different domains were identified, followed by weighted allocation of marks.

The structured oral examination was standardized with respect to time allocation per student (12–15 min), simple to complex question pattern, number of questions, category of questions (problem-solving, critical thinking, decision-making, analytical, and synthesizing), difficulty level, and rater behavior [Table 1]. A checklist with a rating scale for every item was prepared, pilot tested, and used to assign the scores. The raters underwent a prior calibration session to minimize inter-rater variability. The traditional unstructured oral examination was conducted without any change from previous administrations.

Students were randomly assigned to Group 1 (unstructured oral examination first) and Group 2 (structured oral examination first). Randomization was performed using computer-generated random numbers, and a person not directly involved in the research project carried out the allocation. In the first phase, Group 1 underwent the unstructured oral examination and Group 2 the structured oral examination. After the first-phase assessment, students were kept in isolation, without access to any sources of information on medical and dental ethics, and were not allowed to talk to each other. After completion of Phase 1, the groups were crossed over: Group 1 received the structured oral examination and Group 2 the unstructured oral examination. Raters 1 and 2 conducted both the unstructured and structured oral examinations for Group 1; Raters 3 and 4 did so for Group 2.
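Computer-generated random allocation of the kind described can be sketched as follows; the function name, student labels, and seed are illustrative assumptions, not details from the study:

```python
import random

def allocate(students, seed=None):
    """Randomly split students into two equal groups for the crossover
    design (Group 1: unstructured first; Group 2: structured first)."""
    rng = random.Random(seed)   # seeded generator for a reproducible allocation
    shuffled = list(students)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

group1, group2 = allocate(range(1, 21), seed=42)
print(len(group1), len(group2))  # 10 10
```

Seeding the generator lets the person performing the allocation reproduce it later if the assignment is audited.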

Questionnaires were used to assess the acceptance level of the examiners and students. The students and the examiners answered the questionnaires soon after the examination. The questionnaire for the examiners and students consisted of three questions answered on a modified Likert scale, with provision for also assigning a global score.

Statistical analysis

The data were analyzed using SPSS 20 (SPSS Inc., Chicago, IL, USA). Shapiro–Wilk test was done to assess the normality of data. Paired t-test was applied to find out the statistical difference between structured and unstructured oral examination format scores. The statistical difference between students' acceptance scores for the structured and unstructured format was assessed using Wilcoxon signed-rank test. Inter-rater reliability was assessed using Cronbach's alpha.
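Cronbach's alpha, as used here for inter-rater reliability, treats each rater as an "item" and measures how consistently the raters rank the same students. A minimal sketch follows; the rater scores are hypothetical, not the study data:

```python
from statistics import pvariance

def cronbach_alpha(ratings):
    """Cronbach's alpha over raters: k/(k-1) * (1 - sum of per-rater
    variances / variance of per-student total scores)."""
    k = len(ratings)
    item_var = sum(pvariance(r) for r in ratings)
    totals = [sum(scores) for scores in zip(*ratings)]
    return k / (k - 1) * (1 - item_var / pvariance(totals))

# hypothetical scores from two raters for five students (out of 20)
rater_a = [14, 12, 17, 9, 15]
rater_b = [15, 11, 18, 10, 14]
print(round(cronbach_alpha([rater_a, rater_b]), 2))  # 0.97
```

When the raters largely agree, as in this made-up example, alpha approaches 1; discordant raters drive it toward 0.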

  Results

Students obtained a mean score of 14 out of a total of 20 in the structured oral examination, compared with 13.35 in the unstructured oral examination. Although performance was better in the structured format, the difference was not statistically significant (P = 0.69) [Table 2].
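The paired comparison reported above can be sketched as follows; the scores are hypothetical, and in a real analysis the p-value would come from a t-distribution (e.g. via a statistics package) rather than being computed here:

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(x, y):
    """t statistic for a paired t-test on matched scores (df = n - 1).
    Each student contributes one difference: structured minus unstructured."""
    diffs = [a - b for a, b in zip(x, y)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / sqrt(n)), n - 1

# hypothetical paired scores (structured vs. unstructured, out of 20)
structured = [14, 16, 12, 18, 10]
unstructured = [13, 15, 14, 17, 9]
t, df = paired_t(structured, unstructured)
print(round(t, 3), df)  # 0.667 4
```

Pairing each student with themselves removes between-student variation from the error term, which is why the crossover design needs fewer subjects than a parallel-group comparison.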

Students' acceptance was better for the structured oral examination than for the unstructured on both the Likert scale and the global scale, but only the difference in global scores was statistically significant (P = 0.03) [Table 3].

Raters' acceptance did not show a significant difference between the two examination formats [Table 4]. Inter-rater reliability was 0.82 and 0.26 for structured and unstructured viva formats, respectively.

  Discussion

Competency-based curriculum has gained significant momentum in the present decade. Students should be comprehensively assessed for various competencies before they step into their professional life and deliver services independently. Quality assessment is a key to producing quality products.[11],[12] Innovation in assessment methods is very much sought after by educationists.[11] The proof of competency is the performance of students under assessment. Hence, the oral examination should provide every candidate an equal opportunity for fair, valid, and standardized assessment while testing cognitive knowledge, attitudes, and communication skills.

Our study was aimed at investigating whether students' performance differs between the two formats of the oral examination. Students secured more marks in structured viva voce than in unstructured viva voce, but the difference was not statistically significant. This may be because our students had never been exposed to structured viva voce at any other time in their dental education. Although they were taught problem-solving and decision-making skills through case studies on ethical dilemmas, the training was limited and the format was altogether a new experience for them. Dental literature has only a few similar studies comparing students' clinical reasoning skills in structured versus unstructured oral examinations.[5],[11],[12],[13],[14],[15] A descriptive study by Cobourne emphasizes the drawbacks of traditional viva voce in the assessment of orthodontic education, specifically its lack of inter-rater reliability.[16] In our study, the inter-rater reliability for traditional viva voce was 0.26, compared with 0.82 for the structured format. Multiple studies support our finding of higher inter-rater reliability with structured examinations across various disciplines of medical and dental sciences.[3],[4],[6],[7],[9],[12],[15],[16],[17],[18],[19],[20] Since the structured oral examination is more disciplined and organized and provides a framework, it improves reliability.

The students' acceptance of the structured format was higher than that of the unstructured format on both the Likert scale and the global score. Several studies from the dental discipline are in line with our findings.[5],[13],[14],[15] However, the difference between the two formats was not statistically significant, even though the trend favored the structured examination. Raters' acceptance was also considerably higher for the structured than for the unstructured format, but the differences were not statistically significant.

Structured viva voce may be a better option than unstructured. The most important outcome of the study was that students enjoyed solving the ethical dilemmas in the preparatory classes. They realized that ethical dilemmas are challenging by nature and have multiple answers. Students came to accept that in reality, few problems have simple solutions, often with no right or wrong answers. The learning was more interactive and student-centered. The change was well received based on verbal feedback from the students and raters.

Despite these findings, some limitations are noted in our study. The research focused on a single topic owing to the short study duration, and the students had only brief exposure to problem-solving sessions. Crossover of students from Phase 1 to Phase 2 is usually associated with carryover effects, so a washout period is normally prescribed; in educational research, such carryover effects are a serious threat to validity, which is why crossover trials are usually avoided in the field. In our study, we controlled carryover effects by dividing the topic of dental ethics into two parts: the first part (fundamentals of dental ethics) was the content for the unstructured viva and the second part (ethical dilemmas) was the content for the structured viva. This measure eliminated the need for a washout period. Crossover trials are otherwise considered more valid because the same subjects act as their own controls; subject variation as a source of error is fully controlled, so such studies also require smaller samples. In our study, the training given to examiners was not extensive and the students were not exposed to mock drills; these factors may have influenced our results. Providing multiple training sessions for examiners promotes consistency in scoring and administration of the oral examination, and this consistency can reduce candidate anxiety.[11] Student anxiety could also be reduced through exposure to mock examinations, and this could be done in the future. Finally, only four raters took part in the study, and this small number is a limitation.

Future research should continue to investigate differences in performance between students under structured and unstructured oral examination, taking into account the inter-rater reliability of the two assessment methods. Although the current study found no statistical difference between the two methods in terms of final scores, it is also worth considering whether the use of an assessment method with low reliability, such as an unstructured viva voce, is consistent with assessment standards for high-stakes summative examinations. More systematic and extensive research using larger samples and better designs is needed to settle the question.

  Conclusion

Structured viva voce was not found to improve the scores of students when compared to unstructured viva voce. Acceptance by the raters and students did not differ between structured and unstructured viva voce, although it trended toward a favorable opinion of the structured examination. Reliability was found to be higher for the structured than the unstructured examination.

Acknowledgment

The authors wish to thank Bapuji Dental College and Hospital, Davanagere, for providing the necessary facilities to conduct the research. The authors would also like to thank all the dental students who sincerely participated in the research.

Financial support and sponsorship

Nil.

Conflicts of interest

There are no conflicts of interest.

Research quality and ethic statement

The authors of this manuscript declare that this scientific work complies with reporting quality, formatting and reproducibility guidelines set forth by the EQUATOR Network. The authors also attest that this clinical investigation was reviewed by the Institutional Review Board and the approval number was BDC/1208/12-13.

 

  References
1. Challa KT, Sayed A, Acharya Y. Modern techniques of teaching and learning in medical education: A descriptive literature review. MedEdPublish 2021;10:18. doi: 10.15694/mep.2021.000018.1.
2. Tabish SA. Assessment methods in medical education. Int J Health Sci (Qassim) 2008;2:3-7.
3. Schwartz PL, Kyaw Tun Sein. A different twist to the 'staff: student ratio': Administering medical oral examinations to students in groups. Med Educ 1987;21:265-8.
4. Downing SM, Haladyna TM, editors. Handbook of Test Development. New Jersey: Lawrence Erlbaum Associates; 2006.
5. Guilford JP. Psychometric Methods. New York: McGraw-Hill; 1954.
6. Khan FN, Saeed M, Bari A, Butt KA. Dental students' perceptions about assessment methods. J Pak Dent Assoc 2018;27:202-6.
7. Tutton PJ, Glasgow EF. Reliability and predictive capacity of examinations in anatomy and improvement in the reliability of viva voce (oral) examinations by the use of a structured rating system. Clin Anat 1989;2:29-34.
8. Anastakis DJ, Cohen R, Reznick RK. The structured oral examination as a method for assessing surgical residents. Am J Surg 1991;162:67-70.
9. Howley LD. Performance assessment in medical education: Where we've been and where we're going. Eval Health Prof 2004;27:285-303.
10. Hammond DI. Examining the examination: Canadian versus US certification exam. Can Assoc Radiol J 2000;51:308.
11. Hendricson WD, Kleffner JH. Curricular and instructional implications of competency-based dental education. J Dent Educ 1998;62:183-96.
12. Leach L, Neutze G, Zepke N. Assessment and empowerment: Some critical questions. Assess Eval High Educ 2001;26:293-305.
13. Ryding HA, Murphy HJ. Employing oral examinations (viva voce) in assessing dental students' clinical reasoning skills. J Dent Educ 1999;63:682-7.
14. Ganji KK. Evaluation of reliability in structured viva voce as a formative assessment of dental students. J Dent Educ 2017;81:590-6.
15. Puppalwar P, Rawekar A, Chalak A, Dhok A, Khapre M. Introduction of objectively structured viva-voce in formative assessment of medical and dental undergraduates in biochemistry. JRMEE 2014;4:321-5.
16. Puttalingaiah DV, Agarwal P. Oral assessments: Knowledge and perception of faculty in undergraduate dentistry program. J Contemp Med Edu 2020;10:66-73.
17. Wass V, Wakeford R, Neighbour R, Van der Vleuten C; Royal College of General Practitioners. Achieving acceptable reliability in oral examinations: An analysis of the Royal College of General Practitioners membership examination's oral component. Med Educ 2003;37:126-31.
18. Swanson DB. Performance-based assessment: Lessons learnt from the health professions. Educ Res 1995;24:5-11.
19. Haladyna TM. Developing and Validating Multiple-Choice Test Items. 3rd ed. Mahwah (NJ): Lawrence Erlbaum Associates; 2004.
20. McLean M. Qualities attributed to an ideal educator by medical students: Should faculty take cognizance? Med Teach 2001;23:367-70.
