Creative Education
Vol.5 No.4(2014), Article ID:43645,7 pages DOI:10.4236/ce.2014.54025

Measuring the Effectiveness of Faculty Facilitation Training in Problem-Based Learning in a Medical School

Teresa Paslawski1, Ramona Kearney2, Jonathan White3

1Speech Pathology and Audiology, Faculty of Rehabilitation Medicine, University of Alberta, Edmonton, Canada

2Department of Anesthesiology, Faculty of Medicine, University of Alberta, Edmonton, Canada

3Department of Surgery, Faculty of Medicine, University of Alberta, Edmonton, Canada

Email: teresa.paslawski@ualberta.ca

Copyright © 2014 by authors and Scientific Research Publishing Inc.

This work is licensed under the Creative Commons Attribution International License (CC BY).

http://creativecommons.org/licenses/by/4.0/

Received 17 December 2013; revised 17 January 2014; accepted 24 January 2014

ABSTRACT

This study examined the effectiveness of a faculty training program for problem-based learning (PBL) facilitation. A multi-level approach was used, following Kirkpatrick’s levels for assessing training effectiveness. Data were obtained from (1) tutor training workshop evaluations, (2) a survey of tutors’ attitudes and beliefs, (3) changes in tutors’ perceptions of their teaching styles through pre- and post-testing using the Teaching Styles Inventory (TSI), and (4) changes in student attitudes and self-perceptions of their learning styles through pre- and post-testing using the Self-Directed Learning Readiness Scale (SDLRS) and the revised Study Process Questionnaire (SPQ). The authors contend that measures were obtained for Kirkpatrick levels 1, 2, and 4 (Reaction, Learning, and Results, respectively) but that no measure of Kirkpatrick level 3 was completed. Overall, it was concluded that the training program was successful as measured at Kirkpatrick level 1 but only equivocally successful as assessed at the higher levels of Kirkpatrick’s model. In addition to drawing conclusions regarding the training program for facilitators in PBL, limitations and challenges associated with assessment at each level are highlighted.

Keywords: Facilitator Training; Kirkpatrick Levels; Medical Education; Problem-Based Learning; Program Assessment; Revised Study Process Questionnaire (SPQ); Self-Directed Learning Readiness Scale (SDLRS); Teaching Styles Inventory (TSI)

1. The Challenge

The processes of accreditation and curriculum renewal stimulate periodic change in the techniques that medical schools employ to teach their students. Each time a school switches to a new teaching method, faculty development must be undertaken to ensure that faculty members are able to deliver instruction using the chosen technique. Schools also face an ongoing need for faculty development to train newly recruited teachers and to ensure that existing faculty members are kept up to date. We describe a faculty development program to facilitate a switch to a new curriculum in problem-based learning. We describe the program using Kanter’s framework for educational innovations (Kanter, 2008) and report the outcomes of this program using the framework of behaviour change described by Kirkpatrick (Kirkpatrick, 1996).

This study was conducted at a large North American medical school. In 2006, the Liaison Committee on Medical Education/Committee on Accreditation of Canadian Medical Schools found that our medical school provided insufficient opportunities for active learning by students. In response to this finding, our school decided to adopt a new curriculum in the pre-clinical years based on problem-based learning (PBL). We have previously described the process by which this new curriculum (which we called “Discovery Learning”) was successfully implemented ten months after the decision to change was taken (White et al., 2013).

2. The Solution

A variety of stakeholders were engaged in the process of change including students, teachers (physicians and non-physicians), support staff, and faculty administration and leadership. A number of solutions to training the faculty in PBL were considered, including in-person and on-line delivery. Consideration was given to whether all faculty required training, and whether all faculty would agree to attend training or not. In the end, to ensure standardization of training across the tutor population, we decided to deliver in-person training using workshops which all tutors were required to attend.

2.1. The Program

The faculty development program was designed in accordance with recommendations by Dolmans and colleagues (Dolmans et al., 1994). The training was designed to (1) be convenient for the participants, (2) provide both background information and hands-on experience, (3) be compatible with the overall faculty development strategy, and (4) make realistic demands on available resources.

The program consisted of two 4-hour interactive workshops with additional ongoing support. In the first workshop, participants developed an understanding of learner-centred education, the importance of learning objectives, and the foundations of PBL. This workshop included a 30-minute session in which tutors role-played a group of students in order to demonstrate and discuss the dynamics of a PBL group. The second workshop focused on the development of specific skills necessary to facilitate a student PBL group, using a group of volunteer ‘standardized students’ in role-play, with direct observation via closed-circuit television and individual feedback. Identifying strategies to deal with common student behaviours in PBL, giving feedback to students, and student assessment were also part of this workshop. An ongoing tutor support program, consisting of an orientation session for each course, peer tutoring assistance, and weekly tutor briefing and de-briefing sessions, was run during each term to assist in further development of tutoring skills.

2.2. Evaluating the Outcomes of the Program

In determining how to evaluate this faculty development program, a review of the literature identified a paucity of research demonstrating the effectiveness of faculty development (Steinert, 2005; Farmer, 2004). Farmer (2004: p. 59) stated that “There is a continuing need for rigorous outcome-based research and programme evaluation to define the best components and strategies for faculty development.” Pertinent to this investigation, there are many publications that report studies designed to assess the success of PBL or to define desirable characteristics of PBL tutors, but few provide evidence regarding the efficacy of faculty development, including facilitator training. Kirkpatrick’s framework (Kirkpatrick, 1996) for program evaluation recommends measurement at four levels: (1) participant reactions, (2) changes in attitudes, knowledge or skills of the learners, (3) changes in demonstrated teaching behaviours, and (4) changes in student learning or in the system designed to support student learning. Many reports on program evaluation examine only participant satisfaction, representing only the first level suggested by Kirkpatrick (Kanter, 2008). We therefore elected to evaluate the impact of this faculty development program using multiple levels of Kirkpatrick’s framework to examine effects on tutors and students. We hoped that using multiple outcomes in this way would allow us to demonstrate the wider effects of the program.

The outcomes of the faculty development program were evaluated in four ways, as shown in Table 1.

Tutor satisfaction (Kirkpatrick level 1). A workshop satisfaction survey was completed at the end of each of the two workshops, using 9 items rated on a 5-point Likert scale (strongly disagree, disagree, neutral, agree, strongly agree).

Teaching styles inventory (Kirkpatrick level 2). The Teaching Styles Inventory (TSI) (Leung et al., 2003) is a 35-item self-rating questionnaire validated for use with faculty development programs to assess teachers’ instructional styles. The inventory was administered to tutors twice: initially between the first and second tutor training workshops, and again after they had completed their first experience of tutoring in a PBL course. Comparison of pre- and post-responses provided data regarding changes in the self-reported teaching styles of the PBL tutors over this time period as a result of the workshop and practice as a tutor. We hypothesized that after the implementation of an effective faculty development program, tutors would change from an assertive/suggestive style (associated with lecturing) to a more collaborative/facilitative style (associated with PBL).

Assessments of student attitudes and learning styles (Kirkpatrick level 4). Surveys of the attitudes to learning and learning styles of first-year medical undergraduate students were administered at the beginning and end of the first year of medical school, using two established instruments—the Revised Biggs Study Process Questionnaire (SPQ) (Biggs et al., 2001) and the Self-Directed Learning Readiness Scale (SDLRS) modified by Fisher et al. (2001).

The Biggs SPQ instrument (Biggs et al., 2001) is a 20-item validated questionnaire that assesses students’ approaches to learning and is widely used in educational research (Immekus & Imbrie, 2010). The SPQ reportedly measures the interaction of teaching context, on-task approaches to learning, and learning outcomes; SPQ responses are therefore a function of both individual characteristics and the teaching context. At the contextual level, SPQ scores can be used to evaluate teaching contexts, for example by comparing mean scores before and after the introduction of an intervention in the same class. The revised SPQ contains 10 items for each of its two factors, deep learning and surface learning, with each factor possessing its own motive and strategy subcomponents.
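As a concrete illustration of how the two factors and four subscales just described combine, the sketch below scores a hypothetical set of responses. The item-to-subscale key shown is a placeholder for illustration only, not the published R-SPQ-2F key, and a 1–5 rating per item is assumed.

```python
# Illustrative scoring sketch for a 20-item, two-factor questionnaire of the
# revised-SPQ type. The item numbers below are hypothetical placeholders, not
# the published R-SPQ-2F key; responses are assumed to be on a 1-5 scale.

SUBSCALES = {  # hypothetical item-to-subscale mapping, for illustration only
    "deep_motive":      [1, 5, 9, 13, 17],
    "deep_strategy":    [2, 6, 10, 14, 18],
    "surface_motive":   [3, 7, 11, 15, 19],
    "surface_strategy": [4, 8, 12, 16, 20],
}

def score_spq(responses):
    """responses: dict mapping item number (1-20) -> rating (1-5).
    Returns the four subscale sums plus the two 10-item factor totals."""
    scores = {name: sum(responses[i] for i in items)
              for name, items in SUBSCALES.items()}
    scores["deep"] = scores["deep_motive"] + scores["deep_strategy"]
    scores["surface"] = scores["surface_motive"] + scores["surface_strategy"]
    return scores

# Example: a student rating every deep item 4 and every surface item 2
responses = {i: 4 for i in SUBSCALES["deep_motive"] + SUBSCALES["deep_strategy"]}
responses.update({i: 2 for i in SUBSCALES["surface_motive"] + SUBSCALES["surface_strategy"]})
print(score_spq(responses)["deep"], score_spq(responses)["surface"])  # → 40 20
```

The same structure supports the pre/post comparison described above: scoring each student at both time points yields paired factor totals suitable for a paired-samples test.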

The SDLRS instrument (Fisher et al., 2001) measures the process of learning as well as the personality characteristics necessary for self-directed learning and has been validated for use in evaluating the self-directed learning readiness of students in health disciplines. The Fisher modification of the SDLRS (drawn from the work of Guglielmino, 1978, among others) was designed to aid in teaching self-directed learners with respect to program preparation, providing skills and assessing strengths in self-directed learning. Guglielmino (1978) notes that the instrument can also be used as an evaluative device in a program designed to develop self-direction in learning. The tool is a list of 40 items perceived to reflect the attributes, skills and motivational factors required of self-directed learners.

Ethics approval was granted from the Health Research Ethics Board at the University of Alberta for all components of this study.

3. Results

The faculty development program was successfully implemented, and the curriculum change was achieved as intended (for a detailed discussion of the faculty development program and outcomes see White et al., 2013). Two hundred and forty-four hours of faculty development were delivered.

Table 1. Study components and outcomes.

3.1. Program Outcomes

A total of 224 tutors were trained in the first year, and 75% of attendees completed evaluations. Of these, 168 tutors attended both workshops and 56 attended only the second session, a decision based on their previous training and experience with PBL. The rating of the overall quality of the introductory workshop was consistently between 4.5 and 5 out of 5 (n = 161, 96% response rate), while the second workshop was rated between 4.3 and 5 out of 5 (n = 217, 97% response rate). The retention rate for tutors from the first to the second workshop was 74%.

3.2. Teaching Styles Inventory

Seventy-one of the 168 tutors who completed both workshops also completed the Teaching Styles Inventory prior to completion of their PBL training and again immediately after their first experience as tutors in the new PBL curriculum (42% response rate). A Wilcoxon Signed Ranks Test was conducted to evaluate whether participants changed their teaching style across the four teaching behaviours (assertiveness, suggestiveness, collaboration, and facilitation). The results indicated a significant difference in ratings for assertiveness (decreased, z = −5.941, p < .01), suggestiveness (decreased, z = −2.470, p < .05), and facilitation (increased, z = −4.238, p < .01). No change was observed in the collaborative style. Leung and colleagues (2003) suggest that assertive and suggestive styles of teaching are more consistent with traditional lecture-based teaching, whereas “‘collaborative’ and ‘facilitative’ teaching styles are similar to the ideal teaching behaviours of a PBL tutorial” (Leung et al., 2003: p. 411). Our data indicate that there was a change in self-reported tutor teaching styles, with a decrease in assertive and suggestive styles and an increase in facilitative style, as would be hoped in our PBL curriculum; however, there was no change in collaborative style as perceived by the tutors participating in PBL.
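The paired pre/post comparison reported above can be sketched in pure Python. The ratings below are hypothetical stand-ins for the unpublished TSI data; the function returns the normal-approximation z statistic for the Wilcoxon signed-rank test (a positive z here indicates an increase from pre to post; some statistical packages report the opposite sign, and the tie correction to the variance is omitted for simplicity).

```python
import math

def wilcoxon_signed_rank_z(pre, post):
    """Normal-approximation z for the Wilcoxon signed-rank test on paired data.
    Zero differences are dropped; ties in |d| receive average ranks.
    Tie correction to the variance is omitted for brevity."""
    diffs = [b - a for a, b in zip(pre, post) if b != a]
    n = len(diffs)
    assert n > 0, "all pairs identical; test undefined"
    # Rank the absolute differences, averaging ranks across ties
    ordered = sorted(range(n), key=lambda i: abs(diffs[i]))
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        while j + 1 < n and abs(diffs[ordered[j + 1]]) == abs(diffs[ordered[i]]):
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based positions i..j
        for k in range(i, j + 1):
            ranks[ordered[k]] = avg
        i = j + 1
    # Sum of ranks for positive differences, compared with its null distribution
    w_plus = sum(r for d, r in zip(diffs, ranks) if d > 0)
    mean = n * (n + 1) / 4
    sd = math.sqrt(n * (n + 1) * (2 * n + 1) / 24)
    return (w_plus - mean) / sd

# Hypothetical pre/post self-ratings for one TSI style (1-5 scale), one pair per tutor
pre  = [2, 3, 2, 4, 3, 2, 3, 2, 3, 3, 2, 4]
post = [4, 4, 3, 5, 4, 3, 4, 3, 4, 4, 3, 5]
print(round(wilcoxon_signed_rank_z(pre, post), 3))  # → 3.059
```

In practice a library routine such as SciPy's `scipy.stats.wilcoxon` would be used, which also returns an exact or corrected p-value; the sketch above only shows where the reported z statistics come from.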

3.3. Study Process Questionnaire (SPQ)

The SPQ was completed by 167 of 211 students (79% response rate). A Wilcoxon Signed Ranks Test was conducted to evaluate changes in students’ perceptions of their own learning styles. The results indicated a significant difference for deep motivation (z = −4.846, p < .001) and deep strategy (z = −3.654, p < .001), with both decreasing from pre-test to post-test over the period under study.

3.4. Self-Directed Learning Readiness Scale (SDLRS)

The SDLRS questionnaire was completed by first year medical and dental students at the beginning of the academic year and again at the end of the academic year, with 167 of 211 students having full data for both questionnaires (79% response rate). No group difference was found in self-directed learning readiness from the beginning to the end of the first academic year for the students.

4. Discussion

This study demonstrates that a program designed to train faculty members to become tutors in problem-based learning can be designed and effectively delivered to a large number of tutors in a short time. Furthermore, by using Kirkpatrick’s framework to evaluate multiple outcomes (Kirkpatrick, 1996), we demonstrated that the program described here met a learning need, was well received, and appeared to positively affect self-described tutor teaching styles. We were unable to demonstrate a significant beneficial program effect on student outcomes with respect to motivation and strategy as measured by the revised SPQ, and there was no observed effect on the SDLRS; however, multiple interpretations of these data suggest that there is a need for further investigation.

This study evaluated the effectiveness of a program of faculty development for PBL tutors using multiple levels identified by Kirkpatrick (Kirkpatrick, 1996). A survey administered at the start of the first workshop to evaluate attitudes, beliefs, and knowledge about PBL indicated that the majority of tutors had no prior experience as PBL tutors and expressed a desire for training. Tutors were divided about whether PBL was a good method of teaching, but most appeared to believe that using a variety of teaching methods is beneficial. Because implementation of PBL is resource-intensive with respect to the number of tutors required, not all of our participants in tutor training were volunteers; some participants had been compelled to attend. Implementing a PBL program requires considerable investment on the part of the faculty (Hitchcock & Mylona, 2000), so it is reasonable to assume that attitudes about PBL would affect willingness to participate, satisfaction with training, and effectiveness as a tutor. Barrows (1988) observed that “facilitatory” tutoring can be a difficult role to comprehend and use, particularly for those more familiar with didactic approaches to teaching and learning.

At Kirkpatrick level 1, data from the workshop satisfaction surveys indicate that the training was well received by tutors. Given the diversity of attitudes and experiences with which participants came to the sessions, we can be confident that this approach to training meets with the approval of the faculty.

At Kirkpatrick level 2, the Teaching Styles Inventory conducted pre- and post-training demonstrated a change in self-expressed teaching style towards a facilitative style, which is in keeping with the role of PBL tutors. While the developers of the Teaching Styles Inventory felt it could be used to increase tutors’ self-awareness, to recruit suitable tutors, and to evaluate faculty development programs, one challenge with the instrument was a requirement for tutors to rate themselves while considering only a certain type of teaching. In this case, at pre-test, tutors were encouraged to think of the small-group teaching they currently did; some commented that they did very little such teaching, which may have made completion of the pre-test difficult. Independent observation of the tutors, comparing an objective assessment of their tutoring style with their self-perception, would have supported the validity of the self-assessment (Kirkpatrick level 3). It should be acknowledged that post-training scores may have been influenced by social pressure to respond in a certain way, given the context in which the inventory was administered. The addition of observation as a measure of the efficacy of the training program was discussed at the study design stage, but was ultimately deemed too labour- and resource-intensive and therefore impractical.

Our findings at Kirkpatrick level 4 showed equivocal results. There was no beneficial change noted in student attitudes to self-directedness. There are a number of possible interpretations of these findings. The lack of change in self-directed readiness may have been due to students already having a high level of self-directed learning readiness (score >150) (Fisher et al., 2001) upon entry to medical school; our findings are in keeping with this. It has also been reported that it may take two years for students to be comfortable with PBL, and this may have influenced the lack of change in the pre- and post-scores (DesMarchais, 1993). Examining the SDLRS tool itself, there is some discussion in the literature regarding its sensitivity and appropriateness for this context (Bonham, 1991). Although Guglielmino’s Self-Directed Learning Readiness Scale demonstrated self-directed learning readiness of third-year medical students (Shokar et al., 2002), there have been concerns with its use in medical education research, specifically when used to demonstrate the influence of PBL on self-directed learning readiness (Mann et al., 1994; Miflin et al., 1999; Litzinger et al., 2005; Hoban et al., 2005). It is also difficult to draw conclusions regarding the specific impact of PBL on the learning approach of an entire cohort of students; it is possible that a subset of students might demonstrate improvements in readiness for self-directed learning as a result of participation in PBL.

As participants in higher education, we may reasonably aspire to have students move toward deep learning, yet our data suggest that the opposite occurred. While PBL reportedly promotes deep learning (Newble & Clarke, 1986), most undergraduate students become increasingly surface-oriented and less deep in their orientation to learning (Biggs et al., 2001). Our students may have responded in the context of the entire first-year curriculum, a blended curriculum involving several approaches to teaching and learning, which may have obscured any effect attributable to PBL alone. It may also be that the general influence of teaching across the curriculum early in the program reinforces surface learning, specifically rote memorization and limited integration, or at least does not encourage deep learning. As indicated by Stes et al. (2013: p. 17), “instructional development for teachers in higher education does not automatically result in effects on students’ study approach”.

It should be noted that there is evidence of level 4 change, reported in a separate paper examining the attitudes and experiences of leaders responsible for the move to PBL (White et al., 2013). However, based on the data reported here, we must conclude that there is no strong evidence for a positive outcome at Kirkpatrick level 4 beyond the group of tutors to whom the program was delivered. It is interesting to note that Kirkpatrick has indicated that “Evaluation becomes more difficult, complicated, and expensive as it progresses from level 1 to level 4” (Kirkpatrick, 1996: p. 56), which accords with the findings of this study.

5. Limitations and Future Research

This study was limited by the lack of direct observation of tutor teaching behaviour (Kirkpatrick level 3) and by the relatively low response rate to the teaching styles tool. In future, it would be beneficial to track tutors with respect to amount of training and experience, to assess how this relates to student responses. The implementation of PBL into the curriculum was one of many changes rapidly implemented in response to a more general need to address weaknesses in the undergraduate program identified in an accreditation site visit. It is therefore difficult to know with certainty that we were measuring only the effect of the training and implementation of the PBL component. Ideally, with more time prior to implementation, measures of teaching and learning styles prior to changes to the curriculum would allow for a better understanding of the impact of PBL on undergraduate medical training.

6. Conclusion

This study evaluated the effectiveness of a program of faculty development for PBL tutors using multiple levels identified by Kirkpatrick (Kirkpatrick, 1996; as shown in Table 1). Moving through Kirkpatrick’s levels is meant to allow for successively more precise and meaningful measures of effectiveness. Overall we found that the program was well-received and was successful in meeting tutor learning needs and improving self-described teaching styles. We were unable to identify convincing evidence of positive change in student learning attitudes or behaviours.

From the outset of the implementation of PBL in the undergraduate medical program, we intended to measure the effect and the effectiveness of PBL by employing multiple measures from the various stakeholders involved. That PBL was brought into the curriculum rapidly and under duress was both beneficial, in that it had the support of administration and its purpose was not questioned, and detrimental, because there was very little time to plan appropriate measures, obtain ethical approval to conduct the study, and organize the resources to complete the study. It is evident from our analyses that assessment of programs at multiple Kirkpatrick levels is complex, and that more research is needed to illustrate effective program evaluation. However, it is also evident that we can move beyond Kirkpatrick level 1 to a “broader and deeper reflection and lead to a more critical analysis” (Kanter, 2008: p. 704).

Acknowledgements

The authors wish to acknowledge the support of the late Dr. David Cook, and the assistance of Ms. Joanna Czupryn with data collection and analysis. This project was supported by funds from the Teaching and Learning Enhancement Fund of the University of Alberta.

References

  1. Barrows, H. S. (1988). The Tutorial Process. Springfield, IL: Southern Illinois University School of Medicine.
  2. Biggs, J., Kember, D., & Leung, D. Y. P. (2001). The Revised Two-Factor Study Process Questionnaire: R-SPQ-2F. British Journal of Educational Psychology, 71, 133-149. http://dx.doi.org/10.1348/000709901158433
  3. Bonham, L. A. (1991). Guglielmino’s Self-Directed Learning Readiness Scale: What Does It Mean? Adult Education Quarterly, 41, 92-99. http://dx.doi.org/10.1177/0001848191041002003
  4. DesMarchais, J. E. (1993). A Student-Centred, Problem-Based Curriculum: 5 Years’ Experience. Canadian Medical Association Journal, 148, 1567-1572.
  5. Dinsmore, D. L., & Alexander, P. A. (2012). A Critical Discussion of Deep and Surface Processing: What It Means, How It Is Measured, the Role of Context, and Model Specification. Educational Psychology Review, 24, 499-567. http://dx.doi.org/10.1007/s10648-012-9198-7
  6. Dolmans, D., Wolfhagen, I., Schmidt, H. G., & Van der Vleuten, C. P. M. (1994). A Rating Scale for Tutor Evaluation in a Problem-Based Curriculum: Validity and Reliability. Medical Education, 28, 550-558. http://dx.doi.org/10.1111/j.1365-2923.1994.tb02735.x
  7. Farmer, E. A. (2004). Faculty Development for Problem-Based Learning. European Journal of Dental Education, 8, 59-66. http://dx.doi.org/10.1111/j.1600-0579.2003.00337.x
  8. Fisher, M., King, J., & Tague, G. (2001). Development of a Self-Directed Learning Readiness Scale for Nursing Education. Nurse Education Today, 21, 516-525. http://dx.doi.org/10.1054/nedt.2001.0589
  9. Guglielmino, L. M. (1978). Development of the Self-Directed Learning Readiness Scale. Dissertation, University of Georgia. Dissertation Abstracts International, 38, 6467.
  10. Hitchcock, M. A., & Mylona, Z.-H. (2000). Teaching Faculty to Conduct Problem-Based Learning. Teaching and Learning in Medicine: An International Journal, 12, 52-57. http://dx.doi.org/10.1207/S15328015TLM1201_8
  11. Hoban, J. D., Lawson, S. R., Mazmanian, P. E., Best, A. M., & Seibel, H. R. (2005). The Self-Directed Learning Readiness Scale: A Factor Analysis Study. Medical Education, 39, 370-379. http://dx.doi.org/10.1111/j.1365-2929.2005.02140.x
  12. Immekus, J. C., & Imbrie, P. K. (2010). A Test and Cross-Validation of the Revised Two-Factor Study Process Questionnaire Factor Structure among Western University Students. Educational and Psychological Measurement, 70, 495-510. http://dx.doi.org/10.1177/0013164409355685
  13. Kanter, S. (2008). Toward Better Descriptions of Innovations. Academic Medicine, 83, 703-704. http://dx.doi.org/10.1097/ACM.0b013e3181838a2c
  14. Kirkpatrick, D. (1996). Great Ideas Revisited: Revisiting Kirkpatrick’s Four-Level Model. Training & Development, 50, 54-57.
  15. Leung, K. K., Lue, B. H., & Lee, M. B. (2003). Development of a Teaching Style Inventory for Tutor Evaluation in Problem-Based Learning. Medical Education, 37, 410-416. http://dx.doi.org/10.1046/j.1365-2923.2003.01493.x
  16. Litzinger, T. A., Wise, J. C., & Lee, S. H. (2005). Self-Directed Learning Readiness among Engineering Students. Journal of Engineering Education, 94, 215-221. http://dx.doi.org/10.1002/j.2168-9830.2005.tb00842.x
  17. Mann, K. V., & Kaufman D. (1995). Skills and Attitudes in Self-Directed Learning: The Impact of a Problem-Based Curriculum. In A. I. Rothman, & R. Cohen (Eds.), Proceedings of the Sixth Ottawa Conference on Medical Education (pp. 607-609). Toronto: University of Toronto Bookstore Custom Publishing.
  18. Miflin, B. M., Campbell, C. B., & Price, D. A. (1999). A Lesson from the Introduction of a Problem-Based, Graduate Entry Course: The Effects of Different Views of Self-Direction. Medical Education, 33, 801-807. http://dx.doi.org/10.1046/j.1365-2923.1999.00399.x
  19. Newble, D., & Clarke, R. M. (1986). The Approaches to Learning of Students in a Traditional and in a Problem-Based Medical School. Medical Education, 20, 267-273.
  20. Shokar, G. S., Navkiran, K. S., Romero, C. M., & Bulik, R. J. (2002). Self-Directed Learning: Looking at Outcomes with Medical Students. Family Medicine, 34, 197-200.
  21. Steinert, Y. (2005). Learning Together to Teach Together: Interprofessional Education and Faculty Development. Journal of Interprofessional Care, 19, 60-75. http://dx.doi.org/10.1080/13561820500081778
  22. Stes, A., De Maeyer, S., Gijbels, D., & Van Petegem, P. (2013). Effects of Teachers’ Instructional Development on Students’ Study Approaches in Higher Education. Studies in Higher Education, 38, 2-19. http://dx.doi.org/10.1080/03075079.2011.562976
  23. White, J., Paslawski, T., & Kearney, R. (2013). “Discovery Learning”: An Account of Rapid Curriculum Change in Response to Accreditation. Medical Teacher, 35, e1319-e1326. http://dx.doi.org/10.3109/0142159X.2013.770133