A. S. KHAN ET AL.
aspects. Incorporating SPs as assessors in an OSCE is, of course, a difficult task and may not be feasible in terms of time management; however, it is likely to yield more reliable assessment of communication skills and could also be a cost-saving exercise.
It has also been emphasized that candidates may be involved in the assessment process, and self-assessment is already established as a very effective learning tool, especially with regard to history taking, exploring presenting problems, and taking drug and family histories (Regehr & Eva, 2006). Importantly, however, there is always the problem of biased results. Interestingly, when we analyzed overall performance in our study based on global scoring, 100% of the candidates rated their own performance as satisfactory or higher, whereas the examiners judged that slightly more than 50% of the candidates performed satisfactorily, and the SPs judged that only one quarter of the candidates performed at or above the satisfactory level.
On self-assessment, the candidates rated their overall skills markedly higher than the examiners and the SPs rated them. This could be explained by the fact that physician-patient communication is a complex process, is often highly subjective, and may be influenced by task familiarity (Bianchi, Stobbe, & Eva, 2008; Taras, 2002). A few studies have shown that students tended to rate their skills much lower than their teachers expected (Siaja, 2006); contrary to this, another study (Jahan, Sadaf, Bhanji, Naeem, & Qureshi, 2011) showed comparable results with regard to communication skills. The results of our study do not match these findings. One obvious explanation for these markedly different results is that our small-scale study was conducted on experienced general practitioners and may not be comparable with other studies, which focused mainly on undergraduate students.
Further analysis of the results of this study showed a moderate, significant correlation between the assessments of examiners and candidates, whereas the correlations between examiners and SPs and between SPs and candidates were very low and not significant, which again demonstrates a difference of opinion between examiners and SPs regarding the candidates' level of performance. The self-assessment and examiner-assessment results of our study are similar to those of another study of undergraduates (Jahan et al., 2011). The results of the two studies cannot be truly compared, however, as our study was conducted on experienced general practitioners.
Conclusion
Despite its limitations of a relatively small sample size, a small number of stations, and limited training of SPs as assessors, this study has highlighted an important issue: the assessment of communication skills and empathy in an OSCE by examiners may not be reliable and may differ from the SPs' opinion. This highlights the need to develop a system for involving simulated patients in the assessment process. Further research with a much larger sample size and a greater number of stations is needed to evaluate whether SPs should be actively involved in the whole assessment process, in terms of the reliability of communication skills assessment, time management, and cost-effectiveness.
REFERENCES
Allen, R., Heard, J., & Savidge, M. (1998). Global ratings versus
checklist scoring in an OSCE. Academic Medicine, 73, 597-598.
doi:10.1097/00001888-199805000-00067
Bianchi, F., Stobbe, K., & Eva, K. (2008). Comparing academic per-
formance of medical students in distributed learning sites: The
McMaster experience. Medical Teacher, 30, 67-71.
doi:10.1080/01421590701754144
Bokken, L., Linssen, T., Scherpbier, A., Van der Vleuten, C., & Re-
thans, J. J. (2009). Feedback by simulated patients in undergraduate
medical education: A systematic review of the literature. Medical
Education, 43, 202-210. doi:10.1111/j.1365-2923.2008.03268.x
Bokken, L., van Dalen, J., & Rethans, J. J. (2010). The case of “Miss Jacobs”: Adolescent simulated patients and the quality of their role playing, feedback, and personal impact. Simulation in Healthcare: Journal of the Society for Simulation in Healthcare, 5, 315-319.
Brannick, M. T., Erol-Korkmaz, H. T., & Prewett, M. (2011). A systematic review of the reliability of objective structured clinical examination scores. Medical Education, 45, 1181-1189.
doi:10.1111/j.1365-2923.2011.04075.x
Chumley, H. S. (2008). What does an OSCE checklist measure? Family
Medicine, 40, 589-591.
Fischbeck, S., Mauch, M., Leschnik, E., Beutel, M. E., & Laubach, W.
(2011). Assessment of communication skills with an OSCE among
first year medical students. Psychotherapie, Psychosomatik, Mediz-
inische Psychologie, 61, 465-471. doi:10.1055/s-0031-1291277
Harden, R. M., & Gleeson, F. A. (1979). Assessment of clinical com-
petence using an objective structured clinical examination (OSCE).
Medical Education, 13, 41-54.
doi:10.1111/j.1365-2923.1979.tb00918.x
Hatala, R., Marr, S., Cuncic, C., & Bacchus, C. M. (2011). Modifica-
tion of an OSCE format to enhance patient continuity in a high-
stakes assessment of clinical performance. BMC Medical Education,
11, 23-28. doi:10.1186/1472-6920-11-23
Jahan, F., Sadaf, S., Bhanji, S., Naeem, N., & Qureshi, R. (2011).
Clinical skills assessment: Comparison of student and examiner as-
sessment in an objective structured clinical examination. Education
for Health (Abingdon, England), 24, 421.
Mazor, K. M., Ockene, J. K., Rogers, H. J., Carlin, M. M., & Quirk, M.
E. (2005). The relationship between checklist scores on a communi-
cation OSCE and analogue patients’ perceptions of communication.
Advances in Health Sciences Education, 10, 37-51.
doi:10.1007/s10459-004-1790-2
McNaughton, N., Tiberius, R., & Hodges, B. (1999). Effects of portraying psychologically and emotionally complex standardized patient roles. Teaching and Learning in Medicine, 11, 135-141.
doi:10.1207/S15328015TL110303
Moineau, G., Power, B., Pion, A. M., Wood, T. J., & Humphrey-Murto, S. (2011). Comparison of student examiner to faculty examiner scoring and feedback in an OSCE. Medical Education, 45, 183-191.
doi:10.1111/j.1365-2923.2010.03800.x
Pierre, R. B., Wierenga, A., Barton, M., Thame, K., Branday, J. M., & Christie, C. D. C. (2005). Student self-assessment in a paediatric objective structured clinical examination. West Indian Medical Journal, 54, 144-148.
Regehr, G., & Eva, K. (2006). Self-assessment, self-direction and the self-regulating professional. Clinical Orthopaedics and Related Research, 449, 34-48.
Regehr, G., Freeman, R., Robb, A., Missiha, N., & Heisey, R. (1999).
OSCE performance evaluations made by standardized patients:
Comparing checklist and global rating scores. Academic Medicine,
74, 135-137. doi:10.1097/00001888-199910000-00064
Robb, K. V., & Rothman, A. (1985). The assessment of history-taking and physical examination skills in general internal medicine residents using a checklist. Royal College of Physicians and Surgeons of Canada, 20, 45-48.
Rosen, K. R. (2008). The history of medical simulation. Journal of Critical Care, 23, 157-166. doi:10.1016/j.jcrc.2007.12.004
Schwartzman, E., Hsu, D. I., Law, A. V., & Chung, E. P. (2011a).
Assessment of patient communication skills during OSCE: Examin-
Copyright © 2012 SciRes. 935