Open Journal of Nursing
Vol. 4, No. 10 (2014), Article ID: 50267, 8 pages
DOI: 10.4236/ojn.2014.410075

Experience of Conducting Objective Structured Clinical Evaluation (OSCE) in Malawi

Tiwonge Ethel Mbeya Munkhondya, Gladys Msiska, Evelyn Chilemba, Maureen Daisy Majamanda

Kamuzu College of Nursing, University of Malawi, Lilongwe, Malawi

Email: tiwongembeya@kcn.unima.mw, gladysmsiska@kcn.unima.mw, evelynchilemba@kcn.unima.mw, mdmajamanda@kcn.unima.mw

Copyright © 2014 by authors and Scientific Research Publishing Inc.

This work is licensed under the Creative Commons Attribution International License (CC BY).

http://creativecommons.org/licenses/by/4.0/

Received 13 August 2014; revised 11 September 2014; accepted 26 September 2014

ABSTRACT

In Malawi, various nursing educational institutions have increased the enrollment of nursing students in order to respond to the acute nursing shortage prevalent in Malawian clinical settings. With this increase in intake, nurse educators face many questions as to whether the nurses being trained are competent and fit for practice. To ensure that these nurses have the appropriate competences, the Objective Structured Clinical Evaluation (OSCE) has been embraced as a key strategy for evaluating students’ competence. This paper describes the lessons learnt from conducting OSCEs with undergraduate student nurses at Kamuzu College of Nursing in Malawi. The paper considers the background and context of the college, the preparation of students, the formulation of OSCE tasks, the recruitment of examiners and simulated patients, and the evaluation of the OSCE. The paper concludes that the OSCE can be a worthwhile and valid strategy for teaching and assessing nursing students as long as it is properly designed. Nonetheless, profound commitment from all stakeholders involved is vital.

Keywords:

OSCE, Student Nurses, Undergraduates

1. Introduction

Assessment of students’ learning is a debatable issue [1] and has proved to be a challenge in many educational institutions. The challenge is compounded when it comes to the assessment of students’ learning in clinical practice [2] [3]. Mahara (1998) points out that clinical evaluation is intended to provide feedback to students and teachers on what learning has taken place and what is required to improve the teaching-learning process, thereafter allowing teachers to make a definitive judgment on whether the students’ practice meets professional or academic requirements [4]. The effectiveness of learning in the clinical setting can be evaluated by students’ achievement of clinical competences [5].

Competence has been defined in different ways. The ICN Framework of Competencies for the Nurse Specialist described competence as the application of a combination of knowledge, skill and judgment demonstrated by an individual in daily practice or job performance [6]. In agreement with this, the Australian National Competency Standards for Nurses in General Practice (2005) defined competence as the ability to perform tasks and duties to the standard expected in employment [7]. Furthermore, Cowan et al. (2005) suggested a holistic definition of competence that includes knowledge, skills, performance, attitudes and values [8]. The three definitions above agree that competence reflects the holistic nature of nursing roles. Student nurses, to be certified fit for practice, need to demonstrate that they have acquired these competences. Therefore, there is a need for effective means of assessing students’ competences.

Many clinical assessment strategies are based on direct observation. While (1991) asserts that the main challenge to clinical evaluation lies in the subjectivity of the observational process, noting that human observation has an inherent bias and is a subjective process [9]. Chapman (1999) also supports this view, arguing that it is difficult to overcome subjectivity because assessments are based on value judgements, which vary from person to person [10]. A major challenge in any assessment process is to ensure that objective measurement is used, and guaranteeing objectivity is particularly difficult in the assessment of clinical competence [11]. Furthermore, clinical evaluation should be based upon constant one-on-one observation of a student [9]. Clinical teachers or ward sisters are usually required to accommodate a varying number of students in their clinical supervision and teaching schedule, so assessments of individual students’ performance are usually based upon a sample of the students’ total experience in the placement [9]. The foregoing discussion reflects some of the challenges in the assessment of students’ clinical competence, and it is possible to overcome some of these challenges with the OSCE.

The Objective Structured Clinical Evaluation (OSCE) is defined as “an approach to the assessment of clinical competence in which the components of competence are assessed in a well-planned or structured way with attention being paid to objectivity” [12]. The OSCE is a valid and reliable method of assessment [13] [14]. Further to this, a review by Bartfay et al. (2004) regards the OSCE as a gold standard assessment strategy for health professionals [15], and OSCEs enhance the quality of health professional education [16].

Moreover, studies demonstrate that OSCE preparation may motivate students to participate more while in clinical practice [17], and that the OSCE motivates students to learn the clinical skills being examined [18]-[21]. Nulty et al. (2011) argue that OSCEs present one viable educational strategy to promote student engagement and the achievement of desired learning outcomes, notably including clinical competence [22]. The OSCE is increasingly being used as a method of assessment in nursing and allied health curricula [15] [23] [24] and is gaining popularity in undergraduate nursing programs throughout the western world [25] [26]. However, there is scant literature pertaining to the OSCE as an approach to evaluation in undergraduate nursing programs in other settings. The purpose of this paper is to discuss how Kamuzu College of Nursing (KCN), a constituent college of the University of Malawi, has been designing and conducting OSCEs. The discussion will be relevant to nurse educators who use the OSCE as a means of clinical skills assessment.

1.1. Why OSCE in Malawi?

Malawi faces a shortage of nurses, with a ratio of 38 nurses per 100,000 population [27]. In response to the shortage, most training institutions have increased nursing student intake within the limited available resources. This may mean that students fail to learn adequately because there are too many of them in a clinical area [28]. In addition, patient acuity has increased in in-patient settings, and the need for closer supervision of students has intensified. Given the current shortage of nurses in most facilities and the increasingly complex needs of patients, staff nurses do not have the time to provide an acceptable level of supervision [29]. These changes significantly limit the ability of institutions to provide high-quality clinical education for nursing students, thereby increasing the imperative to develop alternative and innovative learning opportunities [22]. Tanner (2006) recommends integrating simulation as a complement to hands-on clinical experiences, as it has the capacity to reduce clinical placement demands and improve the preparation of new graduates [29]. Similarly, Nulty et al. (2011) assert that simulated clinical situations such as OSCEs are intrinsically aligned and authentic, and should also promote student engagement and the achievement of desired learning outcomes, and argue that this justifies the use of the OSCE as both a learning and an assessment tool [22].

Over the past ten years KCN has adopted the OSCE in the assessment of students’ attainment of clinical competences in the undergraduate nursing programme. The conduct of the OSCE has varied from year to year, continuously informed by the experience of each preceding year. However, the OSCE is not used as the sole assessment strategy for students’ clinical competences; to enhance the reliability and validity of our assessment, other strategies are also used, including portfolios and case studies. However, Rushforth (2007: p. 488) argues that the OSCE offers particular strengths in terms of assessor objectivity and parity of the assessment process for all students, especially when compared with other assessment-of-practice processes [24]. Additionally, Watson et al. (2002) observe that these other assessments do not assess the student’s acquisition of skills [30]. We agree with Rushforth’s (2007) application of Miller’s model: the OSCE places students at the “show how” level, hence students’ competences are assessed in a more objective and standardized manner [24].

2. OSCE Process

At KCN the OSCE is administered to undergraduate student nurses after each clinical placement, usually at the end of each semester from the first year to the fourth year. The intention of the OSCE is to facilitate learning while assessing whether the students have acquired the knowledge, skills and appropriate attitudes. In each semester the students start with a theory block, then go for the clinical placement, and after the clinical placement they sit the OSCE. Usually the practice module is related to the content covered during the theory block, and the skills chosen for the OSCE are mapped to the learning outcomes and the students’ level of clinical exposure [31]. During the OSCE a number of skills are assessed within the examination and each skill is tested at a station. The length of each OSCE station is generally eight to ten minutes. Consistent with Pender & de Looy (2004) and Byrne & Smyth (2008), all candidates are assessed using exactly the same stations with the same marking sheet, and they rotate between stations until they have completed a circuit [32] [33]. Two examiners assess the student using the mark sheet and, after the bell rings to signify that the time is up, the two examiners agree on the average mark for the student and the grades are entered simultaneously. Rushforth (2007) pointed out that the evidence cautions against relying on the judgment of a single examiner [24]. By the end of the OSCE all the students will have gone through each station and been marked according to the mark sheet.
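As a purely illustrative sketch (the station names, timings and marks below are hypothetical and do not describe the actual KCN circuit), the rotation and mark averaging described above could be expressed as follows:

```python
# Illustrative sketch of an OSCE circuit rotation and mark averaging.
# Station names, timings and marks are hypothetical examples only.

STATIONS = ["Hand hygiene", "Vital signs", "Wound dressing", "Drug calculation"]
MINUTES_PER_STATION = 10  # stations commonly run eight to ten minutes


def rotation_schedule(candidates, stations):
    """Return {round: {candidate: station}}, with each candidate starting at a
    different station and moving one station forward after every bell."""
    schedule = {}
    for rnd in range(len(stations)):
        schedule[rnd] = {
            cand: stations[(start + rnd) % len(stations)]
            for start, cand in enumerate(candidates)
        }
    return schedule


def station_mark(examiner_a, examiner_b):
    """Two examiners score independently; the agreed mark is the average."""
    return (examiner_a + examiner_b) / 2


if __name__ == "__main__":
    for rnd, allocation in rotation_schedule(["Student 1", "Student 2"], STATIONS).items():
        print(f"Round {rnd + 1} ({MINUTES_PER_STATION} min):", allocation)
    print("Agreed mark:", station_mark(17, 19))  # e.g. 17 and 19 out of 20 -> 18.0
```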

To accommodate large numbers of students, the circuits are duplicated; for instance, we organize multiple stations at which students are required to perform the same skill. This process is costly, very stressful and requires extensive preparation. Similarly, Walters and Adams (2002) agree that the OSCE is labor intensive, especially on the day [34]. Additionally, Khattab and Rawlings (2001) point out that the process requires careful organization [35]. Despite these challenges, the educational benefits of the OSCE far outweigh the implications [35], since it greatly enhances the application of theoretical principles to practice and less time is required for marking the mark sheets [34]. Moreover, the results are fulfilling because one is able to see the skills of individual students, and we believe that at the end of the programme our students are competent. It is therefore important to start preparing for the assessment well in advance [31]. There is also a need for extensive commitment from all the people involved.

2.1. Student Preparation

Student preparation is vital before administering any OSCE. Barry et al. (2011) regard OSCE preparation as including lecturer-led theory and workshops, individual preparation, and practicing in the laboratory in groups [36]. At the outset of the academic year, students are given a detailed explanation that the OSCE is one of the strategies that will be used to assess their competence. Further to this, during the course of learning and clinical practice, students are invited in groups to the skills laboratory to practice skills, mostly those that are examined during the OSCE. The clinical instructors and lecturers demonstrate different nursing skills to the students following a checklist, and the students are given the opportunity to do a return demonstration. Khattab and Rawlings (2001) observe that demonstrating to students helps them to develop competence in clinical skills [35]. Similar to Furlong et al. (2005), at the end of each practical session the checklists are given to the students [37].

When administering the OSCE we appreciate that students consider it very stressful [36]-[38]. To ensure that students are well prepared, a day before the OSCE students are oriented to the whole OSCE setup; this is done to interact with the students and to respond to any queries they may have. During this time the lecturers, the Dean of Students and the OSCE coordinator meet with the students. Consistent with Walters and Adams (2002), our students have regarded this session as beneficial as it helps them to cope with the stress [34]. On the day of the OSCE students are checked in to a comfortable waiting area and are also briefed on the nature of the examination by the coordinator. According to Alinier (2003), in whatever way the OSCE is used, students should be clearly briefed and informed about the aims and objectives of the session [20]. The briefing before the OSCE allows students time to become oriented to the process [20].

The information given during both briefing sessions includes the instructions to the students, the time allocated for each station, the number of assessors, the role of the assessors, and the type of interaction to be expected. We agree with Pender & de Looy (2004) and Brosnan et al. (2006) that the highest stress is experienced prior to the assessment [32] [38]. As such, the coordinator continuously reassures the students before they enter the examination room. We strive to identify a lecturer with good communication skills to be the coordinator, and our students have reported reduced anxiety when interacting with the coordinator. This is congruent with the findings of Brosnan et al. (2006), who found that the corridor facilitator was “calming” and “reassuring” [38]. Nonetheless, there is a need to emphasize the role of the examiners to the students. Our students have reported that some lecturers are very serious and make students more stressed during the assessment. Barry et al. (2011) allude to this, noting that the level of stress experienced interferes with students’ performance [36].

2.2. Simulated Patients

Over the years we have shifted from using manikins alone to using both manikins and simulated patients during the OSCE. We noted that students were encountering some challenges because of the artificial nature of the OSCE [30], especially when manikins alone are used. For example, the use of manikins for procedures hinders nurse-patient interaction, and students may sometimes be confused as to whom to communicate with regarding the procedure. This is congruent with the findings of Barry et al. (2011) that some students felt that the use of simulators could not replicate clinical practice in relation to the assessment of communication and interpersonal skills [36]. Where students are to perform a task on a manikin, a simulated patient is asked to sit in for purposes of communication. Simulated patients are individuals who portray a specific clinical case; typically, they are not affected by the bio-psychosocial conditions they are depicting but are simulating clinical problems solely for the purpose of training and assessment [39]. Simulated patients are given thorough instructions so that they can carry out their role effectively and give the same information to all candidates. We have learnt that even with both manikins and simulated patients the OSCE environment remains rather artificial. Wass et al. (2001) maintain that the most rigorously controlled OSCE is still removed from the real world of clinical practice [40]. However, the use of real patients as subjects for the OSCE stations is very difficult and may not be appropriate.

One of the challenges we have had over the years is whether to let the simulated patients give feedback on an individual student’s attitude when performing the task. It has been argued that, to assess the attitudes of the students, it is important to hear the views of the simulated patients. Major (2005) maintains that asking simulated patients to give their views adds objectivity to OSCEs [21]. Similarly, Walters and Adams (2002) and Boursicot and Roberts (2005) encourage simulated patients to give feedback to the examiners [34] [31]. However, literature surrounding this argument is sparse.

2.3. Examiners

Equitable and consistent marking of OSCE stations is essential to ensure parity of assessment for students. Our OSCE is designed to be an objective assessment; however, we recognize that examiners may hold subjective opinions when scoring and rating students. To sustain objectivity we recruit lecturers from different departments in the college in addition to those in the department concerned, and the examiners are oriented to the examiner instructions and to scoring of the students using the mark sheets. Jones et al. (2010) argue that although a structured mark sheet enables consistency of marking, the role of the examiner in ensuring reliability is also crucial and careful preparation of all examiners is therefore essential [26]. We understand that the role of the examiner is to observe and record the student’s performance [20]. Rennie and Main (2006) point out that training of assessors is crucial to ensure reliability and consistency in the marking criteria [2]. Similarly, Alinier (2003) suggests that preparation of nurse educators before the OSCE is essential [20].

These briefing sessions clarify most of the issues the examiners may have. However, to conform to the assessment rules and regulations of our college, the examiners are not told the exact OSCE tasks. On the day of the examination the examiners arrive early enough to allow familiarization with their station mark sheet and initial conversations between examiners and the simulated patients or volunteers at their respective stations. The challenge of involving lecturers from different departments is that most of them feel uncomfortable scoring the students [33]. Nonetheless, continuous involvement in the OSCE has made most of the examiners comfortable participating.

Lecturers marking the same station in different circuits are required to liaise with each other to ensure consistency in their approach. This helps to ensure that they are not influenced by their own values and beliefs, thereby promoting inter-observer reliability [26]. A reserve examiner is identified for the examination day; usually this is the person in overall charge of the organization, who is familiar with each of the tasks and can step in at any station if required.

2.4. Vetting

The OSCE is carefully structured to include parts from all elements of the curriculum as well as a wide range of skills. While designing the OSCE we keep in mind that the process is aimed at directing students’ learning; as such, the stations are diversified to help students improve different skills as well as their confidence [20]. The module coordinator, together with the lecturers involved in teaching a particular clinical module, develops a blueprint for the OSCE. The blueprint is then used to come up with the OSCE questions/tasks. Blueprinting is a process by which the skills to be examined within the stations that make up an OSCE are mapped to the specific learning outcomes of a module or course [26]. Newble (2004) maintains that this is an extremely valuable strategy for enhancing and defending the validity of an examination [41].
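By way of illustration only (the module outcomes and station names below are hypothetical and not drawn from an actual KCN blueprint), a blueprint can be thought of as a mapping from each OSCE station to the learning outcomes it examines, which makes it easy to check that no outcome is left unexamined:

```python
# Hypothetical blueprint: each OSCE station is mapped to the
# module learning outcomes it is intended to examine.
blueprint = {
    "Station 1: Hypoglycaemia management": ["LO1", "LO3"],
    "Station 2: Medication calculation":   ["LO2"],
    "Station 3: Health education":         ["LO4"],
}

module_outcomes = {"LO1", "LO2", "LO3", "LO4", "LO5"}

covered = {lo for outcomes in blueprint.values() for lo in outcomes}
uncovered = module_outcomes - covered
if uncovered:
    # Flags outcomes that no station examines (LO5 in this example).
    print("Learning outcomes not examined by any station:", sorted(uncovered))
```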

The team is also responsible for the formulation of the mark sheets, the examiner instructions, the simulated patient instructions and a list of all the required equipment. A meeting of all lecturers in the department is then called to vet all the documents developed. The aim of the vetting is to ensure that the candidate instructions state exactly what task should be performed at a station, that the examiner instructions help the examiners at each station to understand their role and conduct the station properly, that the mark sheet includes all the important aspects of the skill being tested, and that all the equipment to be used is available. This conduct of vetting is congruent with the recommendation of Byrne and Smyth (2008), who recommend the formulation of a panel of nurse educators to validate the stations for both content and accuracy [33]. Additionally, Rushforth (2007: p. 488) emphasises that research “makes very clear that each new OSCE should be subject to rigorous scrutiny and piloting to ensure that the reliability and validity of that particular assessment is maximized” [24].

2.5. Examination Time

During vetting the time for each station is determined. Allocating time to the stations is not an easy task, bearing in mind that the time should be the same for all the stations. The station time has varied from year to year in response to feedback from the students and examiners. Students have always complained that the time allotted for each station is not enough; similar findings are reported in [2] [33] [36]. However, with large numbers of students, giving students more time per station may result in finishing the OSCE very late or in reducing the number of stations, which has implications for the elements of the curriculum that are assessed. Furthermore, examiners may tire and their scoring may be jeopardized. To overcome this challenge, several authors recommend “mock running” the stations beforehand to check whether the tasks are achievable within the specified time [33] [36]. This means that it is important to prepare for the OSCE well in advance in order to allow time for trying out the stations. Mock running the stations seems to be a feasible and reliable strategy for ensuring the validity of the OSCE, and there is room for us to adopt the mock run to improve our conduct of the OSCE.

2.6. Creating Marking Sheets

Students are scored using a predetermined mark sheet which is developed well in advance. The mark sheet is carefully designed to act as a score sheet as well as a checklist, allowing the examiner to check whether the candidate performs the task and to score the student simultaneously. Checklists are beneficial as they enable assessors with less experience of the skills to reliably assess students’ performance [42]. An examiner assigned to a station observes and scores the student as they perform the task. Each mark sheet is accompanied by specific examiner instructions (see Table 1 and Table 2). Table 1 gives specific instructions to the examiners for grading a student’s ability to manage a child with hypoglycemia.

Table 1. Examiner instructions.

Table 2. Performance mark sheet.

The challenge the college has had in the development of the mark sheet is to ensure that the mark sheet relates only to the skill being assessed. Most of the skills on which we assess the students contain steps that are not directly related to the skill, for example communication, hand washing, donning of gloves and documentation. Scoring the students on these elements has allowed students who would otherwise have failed to pass the OSCE even though their performance of the skill was not safe. Jones et al. (2010) observe that other elements, such as greeting the patient and hand decontamination, whilst acknowledged as good practice, may not be considered essential elements of the skill and may distort the student’s overall score if marks are awarded [26]. While Jones et al. (2010) assert that removing these arguably unnecessary marking criteria strengthens the content validity of the station by measuring only components of the actual skill [26], we maintain that the face validity of the skill is still vital.

There is no agreed way of ensuring that the mark sheet assesses only what it intends to assess. In their approach, Walters and Adams (2002) deducted ten marks from the allocated score of a station for any student whose practice was shown to be unsafe at that station [34]. However, one quick and effective way to identify a student’s demonstrated competence and safety for passing an OSCE session is to categorise some of the marking criteria as essential criteria, or starred or critical points, for which a positive score must be obtained in order to pass the station. These essential criteria help to maintain safety. Table 2 is an example of a mark sheet for a child with hypoglycemia: the candidate is expected to calculate the correct volume of 10% dextrose and administer the correct dose. The critical points carry more marks; for example, for calculating the volume the student is given either 0 or 4. Using this format, we have seen candidates pass because they performed according to the expected standard.
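A minimal sketch of this scoring logic is given below; the criteria, weights and pass mark are invented for illustration and are not the actual KCN mark sheet. A station is passed only if the overall score reaches the pass mark and every critical criterion has been achieved.

```python
# Illustrative mark-sheet scoring with critical ("starred") criteria.
# Criteria, weights and the pass mark are hypothetical examples only.

criteria = [
    # (description, maximum marks, critical?)
    ("Washes hands and greets the patient",        2, False),
    ("Calculates correct volume of 10% dextrose",  4, True),
    ("Administers the calculated dose correctly",  4, True),
    ("Documents the care given",                   2, False),
]

PASS_PERCENT = 50


def station_result(awarded):
    """awarded: list of marks, one per criterion, in the same order."""
    total = sum(awarded)
    out_of = sum(maximum for _, maximum, _ in criteria)
    # Any critical criterion scored zero means the station is failed outright.
    critical_missed = any(
        critical and mark == 0
        for (_, _, critical), mark in zip(criteria, awarded)
    )
    passed = not critical_missed and (100 * total / out_of) >= PASS_PERCENT
    return total, out_of, passed


# A student who scores well on peripheral steps but misses a critical
# point (e.g. the dose calculation) still fails the station.
print(station_result([2, 0, 4, 2]))   # -> (8, 12, False)
print(station_result([2, 4, 4, 0]))   # -> (10, 12, True)
```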

2.7. Evaluation and Feedback

Students are allowed to evaluate the OSCE at the end of the session. Once they have completed the circuit, they are given an evaluation form to complete. The form covers the organization of the OSCE, the relevance of the tasks, the time allocated for each task, the examiners’ attitude, and any suggestions the students may have for the improvement of the OSCE. The lecturers coordinating the OSCE quickly go through the evaluation forms and communicate any issues raised by the students to the examiners. Immediate issues, such as students’ perception of a poor reception from the examiners, are dealt with at once. Longer-term issues, such as timing and other suggestions, are usually taken on board and have always informed our conduct of the OSCE. We have observed a tremendous improvement in examiners’ attitudes over the years, as the comments from the students have shifted from examiners being unfriendly to being very friendly. At the end of the OSCE the students are given general/overall feedback. This session is aimed at providing feedback that helps students improve their practice and build their knowledge. Alinier et al. (2006) recommended that students should regularly receive feedback to make sure that they take away from the experience what was expected [42]. Similarly, Pender and de Looy (2004) reported that OSCEs help students to become aware of the key skills necessary for competent practitioners [32]. Our students are always keen to get the general feedback at the end of the OSCE, agreeing with Brosnan et al. (2006) and Alinier et al. (2006) that feedback to students is very important and highly valued [38] [42].

Individual feedback is given to students who fail any of the stations. To achieve this, examiners are encouraged to write comments on the mark sheets about each student’s practice. This helps the students to understand why they failed and to realize the areas in which they need to improve. Congruent with Brosnan et al. (2006), students who fail a station are required to attend remedial supervised practical skills sessions before repeating the OSCE [38]. Therefore, early feedback is vital, especially for students who have failed.

3. Conclusion

This paper has described the OSCE as it is currently conducted at KCN in Malawi as a tool to teach and measure clinical competence in undergraduate nursing students. We maintain that the OSCE is a meaningful and fair form of assessment in our setting and that it has had a positive effect on our curriculum. Our conduct of the OSCE is congruent with the findings of most of the studies on conducting OSCEs with nursing students, although we recognize that the planning and conduct of the OSCE may differ in different settings. The paper concludes that the OSCE can be a worthwhile and valid strategy for teaching and assessing nursing students as long as it is properly designed. Nonetheless, profound commitment from all stakeholders involved is vital.

Acknowledgements

The authors would like to thank all the lecturers and students from KCN for their valuable participation and feedback in the numerous OSCE sessions that have been conducted at KCN.

References

  1. Wellard, S., Bethune, E. and Heggen, K. (2007) Assessment of Learning in Contemporary Nurse Education: Do We Need Standardised Examination for Nurse Registration? Nurse Education Today, 27, 68-72. http://dx.doi.org/10.1016/j.nedt.2006.04.002
  2. Rennie, A. and Main, M. (2006) Student Midwives’ Views of the Objective Structured Clinical Examination. British Journal of Midwifery, 14, 602-607. http://dx.doi.org/10.12968/bjom.2006.14.10.21933
  3. Anderson, M. and Stickley, T. (2002) Finding Reality: The Use of Objective Structured Clinical Examination (OSCE) in the Assessment of Mental Health Nursing Students Interpersonal Skills. Nurse Education in Practice, 2, 160-168. http://dx.doi.org/10.1054/nepr.2002.0067
  4. Mahara, M.S. (1998) A Perspective on Clinical Evaluation. Journal of Advanced Nursing, 28, 1339-1346. http://dx.doi.org/10.1046/j.1365-2648.1998.00837.x
  5. Bradshaw, A. (1997) Defining “Competency” in Nursing (Part 1): A Policy Review. Journal of Clinical Nursing, 6, 347-354. http://dx.doi.org/10.1111/j.1365-2702.1997.tb00327.x
  6. ICN (2009) ICN Framework of Competencies for the Nurse Specialist. International Council of Nurses, Geneva.
  7. Australian Nursing & Midwifery Federation (2005) Australia National Competency Standards for Nurses in General Practice. http://anmf.org.au/documents/reports/compstandards_nursesingp.pdf retrieved 11th May 2014
  8. Cowan, D.T., Norman, I.J. and Coopamah, V.P. (2005) Competence in Nursing Practice: A Controversial Concept—A Focused Review of Literature. Nurse Education Today, 25, 355-362. http://dx.doi.org/10.1016/j.nedt.2005.03.002
  9. While, A.E. (1991) The Problem of Clinical Evaluation—A Review. Nurse Education Today, 11, 448-455. http://dx.doi.org/10.1016/0260-6917(91)90007-W
  10. Chapman, H. (1999) Some Important Limitations of Competency Based Education with Respect to Nurse Education: An Australian Perspective. Nurse Education Today, 19, 129-135. http://dx.doi.org/10.1054/nedt.1999.0620
  11. Dolan, G. (2003) Assessing Student Nurse Clinical Competence: Will We Ever Get It Right? Journal of Clinical Nursing, 12, 132-141. http://dx.doi.org/10.1046/j.1365-2702.2003.00665.x
  12. Harden, R.M. (1988) What Is an OSCE? Medical Teacher, 10, 19-22. http://dx.doi.org/10.3109/01421598809019321
  13. Roberts, C., Newble, D., Jolly, B., Reed, M. and Hampton, K. (2006) Assuring the Quality of High-Stakes Undergraduate Assessments of Clinical Competence. Medical Teacher, 28, 535-543. http://dx.doi.org/10.1080/01421590600711187
  14. Walsh, M., Bailey, P.H. and Koren, I. (2009) Objective Structured Clinical Evaluation of Clinical Competence: An Integrative Review. Journal of Advanced Nursing, 65, 1584-1595. http://dx.doi.org/10.1111/j.1365-2648.2009.05054.x
  15. Bartfay, W.J., Rombough, R., Howse, E. and LeBlanc, R. (2004) The OSCE Approach in Nursing Education: Objective Structured Clinical Examinations Can Be Effective Vehicles for Nursing Education and Practice by Promoting the Mastery of Clinical Skills and Decision-Making in Controlled and Safe Learning Environments. The Canadian Nurse, 100, 18-25.
  16. Mitchell, M.L., Henderson, A., Groves, M., Dalton, M. and Nulty, D.D. (2009) The Objective Structured Clinical Examination (OSCE): Optimising Its Value in the Undergraduate Nursing Curriculum. Nurse Education Today, 29, 398-404. http://dx.doi.org/10.1016/j.nedt.2008.10.007
  17. Marshall, G. and Harris, P. (2000) A Study of the Role of an Objective Structured Clinical Examination in Assessing Clinical Competence in Third Year Student Radiographers. Radiography, 6, 117-122. http://dx.doi.org/10.1053/radi.1999.0229
  18. Bujack, L., McMillan, M., Dwyer, J. and Hazelton, M. (1991) Assessing Comprehensive Nursing Performance: The OSCA, Part 2—Report of the Evaluation Project. Nurse Education Today, 11, 248-255. http://dx.doi.org/10.1016/0260-6917(91)90086-P
  19. Bramble, K. (1994) Nurse Practitioners Education: Enhancing Performance through the Use of the OSCE. Journal of Nursing Education, 33, 59-65.
  20. Alinier, G. (2003) Nursing Students’ and Lecturers’ Perspectives of Objective Structured Clinical Examination Incor- porating Simulation. Nurse Education Today, 23, 419-426. http://dx.doi.org/10.1016/S0260-6917(03)00044-3
  21. Major, D. (2005) OSCEs—Seven Years on the Bandwagon: The Progress of an Objective Structured Clinical Evaluation Programme. Nurse Education Today, 25, 442-454. http://dx.doi.org/10.1016/j.nedt.2005.03.010
  22. Nulty, D.D., Mitchell, M.L., Jeffrey, C.A., Henderson, A. and Groves, M. (2011) Best Practice Guidelines for Use of OSCEs: Maximising Value for Student Learning. Nurse Education Today, 31, 145-151. http://dx.doi.org/10.1016/j.nedt.2010.05.006
  23. Wessel, J., Williams, R., Finch, E. and Gemus, M. (2003) Reliability and Validity of an Objective Structured Clinical Examination for Physical Therapy Students. Journal of Allied Health, 32, 266-269.
  24. Rushforth, H.E. (2007) Objective Structured Clinical Examination (OSCE): Review of the Literature and Implications for Nursing Education. Nurse Education Today, 27, 481-490. http://dx.doi.org/10.1016/j.nedt.2006.08.009
  25. Joy, R. and Nickless, L. (2008) Revolutionising Assessment in a Clinical Skills Environment—A Global Approach: The Recorded Assessment. Nurse Education in Practice, 8, 352-358. http://dx.doi.org/10.1016/j.nepr.2007.11.001
  26. Jones, A., Pegram, A. and Fordham-Clarke, C. (2010) Developing and Examining an Objective Structured Clinical Examination. Nurse Education Today, 30, 137-141. http://dx.doi.org/10.1016/j.nedt.2009.06.014
  27. Malawi Ministry of Health Annual Report (2011-2012).
  28. Harrison, S. (2004) Overcrowded Placements Hinder Student Learning. Nursing Standard, 18, 7.
  29. Tanner, C.A. (2006) The Next Transformation: Clinical Education. Journal of Nursing Education, 45, 99-100.
  30. Watson, R., Stimpson, A., Topping, A. and Porock, D. (2002) Clinical Competence Assessment in Nursing: A Syste- matic Review of the Literature. Journal of Advanced Nursing, 39, 421-431. http://dx.doi.org/10.1046/j.1365-2648.2002.02307.x
  31. Boursicot, K. and Roberts, T. (2005) How to Set up an OSCE. The Clinical Teacher, 2, 16-20. http://dx.doi.org/10.1111/j.1743-498X.2005.00053.x
  32. Pender, F.T. and de Looy, A.E. (2004) The Testing of Clinical Skills in Dietetic Students Prior to Entering Clinical Placement. Journal of Human Nutrition and Dietetics, 17, 17-24. http://dx.doi.org/10.1046/j.1365-277X.2003.00474.x
  33. Byrne, E. and Smyth, S. (2008) Lecturers’ Experiences and Perspectives of Using an Objective Structured Clinical Examination. Nurse Education in Practice, 8, 283-289. http://dx.doi.org/10.1016/j.nepr.2007.10.001
  34. Walters, J. and Adams, J. (2002) A Child Health Nursing Objective Structured Clinical Examination (OSCE). Nurse Education in Practice, 2, 224-229. http://dx.doi.org/10.1016/S1471-5953(02)00024-0
  35. Khattab, A. and Rawlings, B. (2001) Assessing Nurse Practitioner Students Using a Modified Objective Structured Clinical Examination (OSCE). Nurse Education Today, 21, 541-550.
  36. Barry, M., Noonan, M., Bradshaw, C. and Murphy, S.T. (2011) An Exploration of Student Midwives’ Experiences of the Objective Structured Clinical Examination Assessment Process. Nurse Education Today, 26, 115-122.
  37. Furlong, E., Fox, P., Lavin, M. and Collins, R. (2005) Oncology Nursing Students’ Views of a Modified OSCE. European Journal of Oncology Nursing, 9, 351-359. http://dx.doi.org/10.1016/j.ejon.2005.03.001
  38. Brosnan, M., Evans, W., Brosnan, E. and Brown, G. (2006) Implementing Objective Structured Skills Evaluation (OSCE) in Nurse Registration Programmes in a Centre in Ireland: A Utilization Focused Evaluation. Nurse Education Today, 26, 115-122. http://dx.doi.org/10.1016/j.nedt.2005.08.003
  39. Zabar, S., Kachur, E., Kalet, A. and Hanley, K. (2012) Objective Structured Clinical Examinations: 10 Steps to Planning and Implementing OSCEs and Other Standardized Patient Exercises. Springer, Berlin.
  40. Wass, V., Van der Vleuten, C., Shatzer, J. and Jones, R. (2001) Assessment of Clinical Competence. The Lancet, 357, 945-949. http://dx.doi.org/10.1016/S0140-6736(00)04221-5
  41. Newble, D. (2004) Techniques for Measuring Clinical Competence: Objective Structured Clinical Examinations. Medical Education, 38, 199-203. http://dx.doi.org/10.1111/j.1365-2923.2004.01755.x
  42. Alinier, G., Hunt, B., Gordon, R. and Harwood, C. (2006) Effectiveness of Intermediate-Fidelity Simulation Training Technology in Undergraduate Education. Journal of Advanced Nursing, 54, 359-369. http://dx.doi.org/10.1111/j.1365-2648.2006.03810.x