UNDERGRADUATE MEDICAL STUDENTS’ PITFALLS IN THE EYE EXAMINATION STATION OF THE OBJECTIVE STRUCTURED CLINICAL EXAMINATION

https://doi.org/10.22146/jpki.47167

Widyandana Doni(1*), Angela Nurini Agni(2), Agus Supartoto(3)

(1) Department of Medical Education, Faculty of Medicine, Universitas Gadjah Mada, Yogyakarta, Indonesia
(2) Department of Ophthalmology, Faculty of Medicine, Universitas Gadjah Mada, Yogyakarta, Indonesia
(3) Department of Ophthalmology, Faculty of Medicine, Universitas Gadjah Mada, Yogyakarta, Indonesia
(*) Corresponding Author

Abstract


Background: The high prevalence of eye disorders in Indonesia requires medical doctors to be skillful and well trained in ophthalmologic examination. Undergraduate medical students usually begin their clinical simulation practice and ophthalmology assessment in a safe learning environment. The skills laboratory, as that learning facility, should be evaluated and improved regularly. This study aimed to evaluate students’ pitfalls in the eye examination station of the OSCE.

 

Methods: This descriptive analytic study used Objective Structured Clinical Examination (OSCE) eye examination station scores from 1st- to 4th-year undergraduate medical students of the 2010 cohort at the Faculty of Medicine, Universitas Gadjah Mada, Indonesia (n=516). All checklist scores were analyzed by sub-scale within each examination topic to identify the most common pitfalls made by students in the eye examination station during the OSCE.
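To make this aggregation step concrete, the sketch below shows one way checklist scores of this kind could be summarized per subscale and per year/topic. It is only an illustration, not the authors’ analysis code, and the long-format table layout and column names (student_id, year, topic, subscale, score) are assumptions.

```python
# Minimal sketch (hypothetical column names) of summarizing OSCE checklist scores.
import pandas as pd

PASS_MARK = 70  # threshold used in the study: a station score below 70 counts as a fail


def summarize(scores: pd.DataFrame):
    """Return mean score per subscale and percentage of failed students per year/topic."""
    # Mean score for each checklist subscale (history taking, physical examination, ...)
    subscale_means = (
        scores.groupby("subscale")["score"].mean().sort_values(ascending=False)
    )

    # Each student's station score per year and topic, then the share of students
    # scoring below the pass mark, expressed as a percentage.
    per_student = scores.groupby(["year", "topic", "student_id"])["score"].mean()
    failed_pct = (
        per_student.lt(PASS_MARK)
        .groupby(["year", "topic"])
        .mean()
        .mul(100)
        .round(2)
    )
    return subscale_means, failed_pct
```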

 

Results: The average scores for each OSCE subscale, in descending order, were: doctor-patient interaction (88.42), history-taking skills (82.44), professionalism (76.43), physical examination (74.62), diagnosis (60.68), and pharmacotherapy management (54.70). The percentages of students who failed (score <70) each topic in the 1st- to 4th-year OSCEs were: Year 1: visual field (5.08%), visual acuity (14.21%), anterior segment (2.54%). Year 2: IOP by palpation (24.38%), visual acuity (9.38%), anterior segment (29.38%). Year 3: visual field (4.94%), IOP by palpation (2.47%), visual acuity (12.35%), anterior segment (7.41%), posterior segment (22.22%). Year 4: comprehensive eye examination (17.95%).

 

Conclusions: Students were most challenged by the skills of establishing a diagnosis and pharmacological management. The skill with the highest number of failed students varied from year to year: 1st-year students failed most often at visual acuity examination, 2nd-year students at anterior segment examination, and 3rd-year students at posterior segment examination. These three skills need to be enhanced systematically.

 

Keywords: OSCE, pitfall pattern, eye examination, undergraduate students, skill laboratory

 









Copyright (c) 2019 Widyandana Doni, Angela Nurini Agni, Agus Supartoto

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.
