Abstract
Objectives: This study describes a fast and efficient method that uses a prevalidated videotape of an objective structured clinical examination (OSCE) in a fracture scenario to evaluate raters and to measure the consistency of raters from different subspecialties and with varying levels of seniority.

Study Design: We videotaped clinical scenarios to evaluate residents' communication and clinical assessment skills. All orthopedic staff used prevalidated checklists to assess the residents' videotaped performance at 3 different time points. Cronbach's α was calculated to evaluate the internal consistency of the OSCE checklist construct. Kendall's W and the Kuder-Richardson formula 20 (KR-20) were used to investigate rater agreement. Expert validity was calculated to compare OSCE experts with the present raters.

Results: A high Cronbach's α for the 23-item global assessment scale in all 3 tests confirmed construct validity. Kendall's W showed only moderate interrater reliability. KR-20 was 0.96 for the pretest, 0.968 for the posttest, and 0.892 for the long-term test, indicating high internal consistency. The p value for expert validity was 0.626 (independent t-test, not significant).

Conclusions: This efficient and fast video-based assessment of raters was reliable and yielded satisfactory rater consistency and some evidence of validity.
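The three reliability statistics named in the abstract are standard closed-form measures. The sketch below (not from the study; the data are simulated and all function names are illustrative) shows how Cronbach's α, KR-20, and Kendall's W are typically computed from a score matrix.

```python
import numpy as np
from scipy import stats

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects, k_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def kr20(items):
    """KR-20 for dichotomous (0/1) items; a special case of alpha."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    p = items.mean(axis=0)          # proportion scoring 1 on each item
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - (p * (1 - p)).sum() / total_var)

def kendalls_w(ratings):
    """Kendall's W for an (m_raters, n_subjects) matrix (uncorrected for ties)."""
    ratings = np.asarray(ratings, dtype=float)
    m, n = ratings.shape
    # Rank each rater's scores across subjects (ties get average ranks)
    ranks = np.apply_along_axis(stats.rankdata, 1, ratings)
    rank_sums = ranks.sum(axis=0)
    s = ((rank_sums - rank_sums.mean()) ** 2).sum()
    return 12 * s / (m ** 2 * (n ** 3 - n))

# Simulated example: 20 assessments of a 23-item dichotomous checklist,
# and 10 raters scoring 8 candidates on a global scale.
rng = np.random.default_rng(0)
checklist = rng.integers(0, 2, size=(20, 23))
print(f"alpha = {cronbach_alpha(checklist):.3f}, KR-20 = {kr20(checklist):.3f}")
print(f"Kendall's W = {kendalls_w(rng.integers(1, 6, size=(10, 8))):.3f}")
```

Note that KR-20 equals Cronbach's α when every item is scored 0/1, which is why both appear as internal-consistency measures for the same checklist.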
Original language | English
---|---
Pages (from-to) | 189-192
Number of pages | 4
Journal | Journal of Surgical Education
Volume | 70
Issue number | 2
State | Published - March 2013
Keywords
- Medical Knowledge
- Patient Care
- Practice-Based Learning and Improvement