Open Journal of Anesthesiology
Vol.05 No.07(2015), Article ID:58402,5 pages
10.4236/ojanes.2015.57032

Resident Learning Styles: Are We Maximizing Learning Opportunities for Today’s Resident Learner?

Kathleen E. Knapp, Nichole L. Townsend, Seth P. Hanley, Lopa Misra, Pamela A. Mergens

Department of Anesthesiology, Mayo Clinic, Phoenix, Arizona, USA

Email: knapp.kathleen@mayo.edu

Copyright © 2015 by authors and Scientific Research Publishing Inc.

This work is licensed under the Creative Commons Attribution International License (CC BY).

http://creativecommons.org/licenses/by/4.0/

Received 23 June 2015; accepted 26 July 2015; published 29 July 2015

ABSTRACT

Medical education is in constant evolution. It is important to continuously evaluate how residents are being taught in order to maximize didactic time and improve standardized test scores. The aim of this study is to assess how residents prefer to learn, which factors preclude residents from studying, the prevalence of certain teaching methods at our institutions, and how these affect standardized exam scores. To gather these data, residents across the three Mayo Clinic campuses were anonymously surveyed regarding their preferred study habits, factors that affect their ability to study, how they are most frequently taught within their program, and their most recent in-training exam (ITE) scores. Residents frequently encounter didactic lessons that are consistent with their preferred study methods. However, a number of preferred study methods may not be represented by standard didactic sessions. Many other factors affect a resident's ability to study, and these should be taken into consideration by departments when deciding how to teach their residents.

Keywords:

Resident Education, Standardized Examination, Didactic Sessions, Study Methods

1. Introduction

Since the establishment of the Accreditation Council for Graduate Medical Education (ACGME) in 1981, resident education has gone through a number of drastic changes. Resident education has improved substantially since that time, initially at the cost of mandated program requirements that seemed to prevent program directors from providing individualized resident mentoring [1]. Eventually, the Next Accreditation System (NAS) was established in 2013, leading to a liberalization of resident education [2]. Most recently, changes have been implemented to transition from a time-based to a competency-based education system in which milestones, or specific outcomes, must be met in order to advance. With these new mandates, residency programs have attempted to revamp their programs both clinically and didactically in an effort to progress residents through these metrics in a timely manner [3]. Our investigation is directed at assessing how residents prefer to study versus how they are most frequently taught, in order to better understand where potential changes can be made to the current curriculum and didactic schedule of our institution.

In addition to meeting certain milestone achievements, residents are also faced with the challenge of performing well on standardized tests, such as in-training and board exams. How, then, can programs fine-tune their teaching to improve the learning environment of their residents in order to meet both the ACGME-mandated milestones and standardized test requirements? Two important factors, both of which are addressed in this study, will potentially help programs better understand how to enhance their didactic sessions to maximize resident learning opportunities: the methods residents prefer to use when studying, and the methods residency programs use to teach them.

2. Materials and Methods

2.1. Survey Design

In May 2014, a workgroup consisting of residents and consultants in the Department of Anesthesiology at Mayo Clinic in Arizona convened to design a survey questionnaire that could be sent to Mayo Clinic residents across the three sites to determine resident study habits. Survey questions were initially created by the survey committee and were modified by the Dean of Education for Mayo Clinic. Ten multiple-choice and free-response questions were asked, covering the spectrum of resident study habits, resident learning styles, and current educational methods. Fifteen educational resources were included in the survey based on feedback from Mayo Arizona residents (PowerPoint, lecture without visual aids, problem-based learning, question and answer sessions, M & M/grand rounds, resident-led teaching sessions, journal club, computer-based lectures, textbook-based self-study, bedside clinical instruction, simulation center, national/regional meetings, audio lectures, small group study sessions, or "other"). Respondents were asked to rate the resources on a scale of 1 (almost never; yearly) to 5 (very frequently; multiple times per day), or "other", to determine how frequently they were used/encountered.
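For illustration, the frequency scale and resource list described above can be captured in a small data structure. This is a hypothetical sketch in Python rather than the actual Survey Monkey instrument; the item wording and ordering shown here are assumptions.

```python
# Hypothetical sketch of the survey's frequency scale and the fifteen
# educational resources described above (not the actual Survey Monkey items).
FREQUENCY_SCALE = {
    1: "almost never (yearly)",
    2: "rarely (2 - 3 times per year)",
    3: "on occasion (monthly)",
    4: "frequently (daily)",
    5: "very frequently (multiple times per day)",
}

EDUCATIONAL_RESOURCES = [
    "PowerPoint", "lecture without visual aids", "problem-based learning",
    "question and answer sessions", "M & M/grand rounds",
    "resident-led teaching sessions", "journal club", "computer-based lectures",
    "textbook-based self-study", "bedside clinical instruction",
    "simulation center", "national/regional meetings", "audio lectures",
    "small group study sessions", "other",
]
```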

The decision to ask respondents to indicate their examination score (percentile ranking), a potentially sensitive topic, was made to facilitate subgroup analysis of differences between the study habits of residents who did very well, those who passed, and those who failed. Anonymity was maintained throughout the administration of the survey. Participants were not asked to submit their total ITE score or individual section scores, and they had the option to opt out of the response altogether. Informed consent was implied with the completion of the survey, as participation was completely optional. All questions were reviewed by the medical education committee prior to distribution; however, approval from the Ethics Committee was not necessary, as all participant data remained anonymous.

In August 2014, an anonymous electronic survey using Survey Monkey (Palo Alto, CA) was distributed via email from Mayo Clinic's education director to all Mayo Clinic residents across the three sites: Phoenix, Rochester, and Jacksonville. Responses were collected over a four-week period and subsequently analyzed based on a combination of factors.

2.2. Data Analysis

Raw data, in the form of percentages and mean values, were exported directly from the Survey Monkey site to Microsoft Excel for further organization and more complete analysis. Resident study habits and preferences were compared based on defined specialties and ITE results.
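As a minimal illustration of this comparison, the sketch below computes mean frequency ratings per educational resource from a hypothetical export of the survey responses and ranks the gap between how often a method is used in didactics and how often residents prefer it for study. The file name, column layout, and use of Python/pandas are assumptions for illustration only; the study's actual analysis was performed in Microsoft Excel.

```python
# Minimal sketch of the mean-rating comparison, assuming a hypothetical CSV
# export with one row per respondent and one 1 - 5 rating column per resource
# for each question ("study_" = preferred study method, "taught_" = frequency
# encountered in program didactics). The study's analysis was done in Excel.
import pandas as pd

responses = pd.read_csv("survey_export.csv")  # hypothetical file name

resources = [c.removeprefix("study_") for c in responses.columns
             if c.startswith("study_")]

summary = pd.DataFrame({
    "preferred_mean": [responses[f"study_{r}"].mean() for r in resources],
    "taught_mean": [responses[f"taught_{r}"].mean() for r in resources],
}, index=resources)

# Positive gap = taught in didactics more often than residents prefer to use it.
summary["gap"] = summary["taught_mean"] - summary["preferred_mean"]
print(summary.sort_values("gap", ascending=False).round(2))
```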

3. Results Based on Responders

During the 4-week response period, 207 of 1622 Mayo Clinic residents and fellows responded to the survey, a 12.7% response rate. Of the respondents, 90.34% (187) were aged 25 - 34 and 9.66% (20) were aged 35 - 44; 55.5% (115) were male and 44.4% (92) were female. The majority of the responses came from the Mayo Rochester campus (74.51%, 152), with 15.69% (32) from Mayo Phoenix and 9.80% (20) from Mayo Jacksonville. The greatest number of responses came from residents/fellows in an internal medicine residency or related fellowship (60/32.1%), followed by surgical specialties (39/18.8%), anesthesiology (31/16.6%), radiology (14/7.5%), neurology (11/5.8%), pathology (10/5.3%), dermatology (8/4.3%), pediatrics (7/3.7%), and other specialties (24/12.8%), which include psychiatry, OBGYN, EM, FM, and PM & R. Duration of training at the time of the survey ranged from 0.5 - 6 years, with the majority of responses coming from residents in their second or third year of training. The respondents were asked to rank the frequency with which they utilized the listed study habits on a numerical scale: one (almost never: yearly), two (rarely: 2 - 3 times per year), three (on occasion: monthly), four (frequently: daily), and five (very frequently: multiple times a day).

The most commonly utilized study methods based on all responses were bedside clinical instruction, followed closely by textbook-based self-study. The next most popular methods were problem-based learning, PowerPoint presentations, resident-led teaching, and grand rounds, in descending order. The least commonly utilized study methods were national/regional meetings (1.3 on the 5-point scale), small group study sessions (1.36), and audio-based lectures (1.62). Residents also noted that they commonly utilized peer-reviewed journal articles and other evidence-based medicine sources to study. The respondents were then asked to rank the same study tactics using the same 5-point scale; however, the question was altered to reflect how frequently those methods were utilized, or encouraged, by their residency program for instruction in didactic settings. The most frequently encountered teaching methods were bedside clinical instruction, with a score of 3.4 on the 5-point scale, and PowerPoint/slide-based teaching, with a score of 3.36. The least commonly encountered methods of teaching were similar to the least commonly utilized methods of study and included teaching at national/regional meetings (1.38), small group study sessions (1.45), and audio-based lectures (1.56). These data are compared in Figure 1.

As can be seen from Figure 1, the rankings of teaching methods closely resembled those for learning preferences. This suggests that residency programs are already succeeding at providing teaching styles that closely match the learning needs of their residents. The largest discrepancy between learning and teaching methods exists with PowerPoint material, which is utilized as a teaching tool more often than residents prefer to use it for learning (Figure 1). With regard to in-training exam scores, 1.93% (n = 4) stated that they fell within the 0 - 25th percentile, 8.70% (n = 18) within the 26 - 50th percentile, 23.19% (n = 48) within the 51 - 75th percentile, 23.19% (n = 51) within the 76 - 90th percentile, and 20.29% (n = 42) at or above the 91st percentile. The remaining 21.26% (n = 44) either chose not to answer or had not yet taken an in-training exam and could not provide an applicable answer. This information is presented graphically in Figure 2.

Figure 1. Comparison of educational materials (y-axis) versus the frequency of respondent exposure to each material, rated on a 5-point scale (x-axis). The red bars represent the frequency with which each material is used in program-directed didactics; the blue bars represent the frequency with which it is used as a preferred study method.

Figure 2. In-training exam percentile ranks. The y-axis represents the total number of respondents within each group and the x-axis represents the percentile (%ile) from 0 - 100.

When looking at the percentage of respondents in each discipline who scored in the 76th percentile or higher on their ITEs, surgery had 59%, internal medicine 38%, anesthesiology 35%, radiology 50%, neurology 54%, pathology 60%, dermatology 37%, and pediatrics 71%. When looking at participants from all disciplines in Figure 2, well over half of those choosing to provide an ITE score reported percentiles of 76% or above, with 43/207 choosing the "prefer not to answer" category, the majority of whom cited that they had not yet taken an in-training exam and therefore could not answer accurately.

4. Discussion

Since the establishment of the ACGME in 1981, the graduate medical education realm has been faced with the stresses of variability in the quality of resident education and the emerging formalization of subspecialty education [1]. Certain challenges presented themselves over the following 30 years, during which resident education improved substantially at the cost of "prescriptive" program requirements that seemed to prevent program directors from innovating and mentoring residents while they were forced to "manage" their programs [1]. This led to the implementation of the Next Accreditation System (NAS) in 2013, in which resident learning milestones are assessed and more innovative forms of teaching are encouraged [2]. Such milestones include professionalism, interpersonal and communication skills, practice-based learning and improvement, and systems-based practice [4].

Regardless of milestone achievements, residents are still expected to perform well on standardized tests, including their in-training and board exams. How, then, can programs fine-tune their teaching to improve the learning environment of their residents? Two important factors, both of which are addressed in our survey, are the methods that residents prefer to utilize when studying and the methods utilized by residency programs to teach their residents. From the data collected from our three Mayo Clinic institutions, over half (51%) of our respondents scored at the 76th percentile or above on their in-training exams (Figure 2), with program teaching styles closely matching learning preferences (Figure 1).

Does this mean residents are studying as efficiently as possible? As Dr. Lewis notes in her essay entitled "Finding the Learning Sweet Spot", "Learning in residency looks different than it did in medical school. For those like me who learn from hands-on patient interactions, there is endless material, but little time to process it." This is evident in the survey responses we received. Many residents encounter bedside learning very frequently (on a daily basis); however, they note that little time remains in the day, after their clinical duties have been completed, to process that information. "For this reason, in residency we should be paying as much attention to how we learn as what we learn" [5]. Alternatively, Chang et al. show that surgical residents who complete more practice questions perform better on in-training exams [6]. Similarly, Eastin and Bernard surveyed emergency medicine residents, who reported that they prefer question-based preparation over text-based learning for ITE preparation [7]. Perhaps completing study questions was considered by some respondents as falling under the "textbook based self-study" option on our questionnaire. Future improvements to our study could be directed at determining how many study questions residents complete when preparing for an exam.

During residency training, learners have to work largely on their own to master a large body of new information from disparate sources. Perhaps 95 percent of what is learned during residency training is acquired in the clinical setting or at home. "Specialty conferences, core didactic sessions, grand rounds, morbidity and mortality conferences, and skills labs may account for as many as five hours per week, but topics taught during those hours will not necessarily be appropriate for every postgraduate year (PGY) training level nor relate to cases being seen on rotation" [8]. Residents frequently encounter didactic lessons that are presented using their preferred study methods. However, there also seem to be a number of preferred study methods that may not be represented by standard didactic sessions. Perhaps more emphasis should be placed on practice questions, with less time devoted to PowerPoint presentations. Many other factors affect a resident's ability to study, and these should be taken into consideration by the department when deciding how best to teach its residents.

Potential limitations of this study include the somewhat small sample size and limited response rate: 207 of 1622 total residents at the three Mayo Clinic sites (Phoenix, Rochester, and Jacksonville). Future studies could include residents at other programs across the United States to supply a larger and broader respondent group. Another potential limitation is that the statistical analysis was intentionally simplified; the authors elected for a comparison of mostly raw percentage data and mean values for ease of interpretation and discussion. More formal, larger studies could utilize more complex methods of analysis to further quantify statistical significance. Despite these limitations, this study has certain strengths. It brings to light the importance of the continued evaluation of individual program didactic sessions, especially now that a number of innovative learning opportunities are available.

5. Conclusion

Resident education has improved substantially over the years with a shift from a time-based to a competency-based education system. With these new mandates, residency programs have attempted to revamp their programs both clinically and didactically in an effort to progress residents through the appropriate metrics in a timely manner. This study has shown that residents frequently encounter program-mandated didactic lessons that are consistent with their preferred study methods, most commonly bedside clinical teaching and resident-led study sessions. However, there do seem to be a number of preferred study methods that may not be represented by standard didactic sessions, including but not limited to textbook-based self-study and computer-based learning (question banks and online lectures). Despite some discrepancies between preferred learning methods and mandated teaching sessions, the majority of residents who responded to the survey scored above the 50th percentile on the standardized in-training exams.

Cite this paper

Kathleen E. Knapp, Nichole L. Townsend, Seth P. Hanley, Lopa Misra, Pamela A. Mergens (2015) Resident Learning Styles: Are We Maximizing Learning Opportunities for Today’s Resident Learner? Open Journal of Anesthesiology, 05, 177-182. doi: 10.4236/ojanes.2015.57032

References

1. Nasca, T.J., Philibert, I., Brigham, T. and Flynn, T.C. (2012) The Next GME Accreditation System—Rationale and Benefits. The New England Journal of Medicine, 366, 1051-1056. http://dx.doi.org/10.1056/NEJMsr1200117

2. Green, M.L., Aagaard, E.M., Caverzagie, K.J., et al. (2009) Charting the Road to Competence: Developmental Milestones for Internal Medicine Residency Training. Journal of Graduate Medical Education, 1, 5-20. http://dx.doi.org/10.4300/01.01.0003

3. Ebert, T.J. and Fox, C.A. (2014) Competency-Based Education in Anesthesiology: History and Challenges. Anesthesiology, 120, 24-31.

4. Moskowitz, E.J. and Nash, D.B. (2007) Accreditation Council for Graduate Medical Education Competencies: Practice-Based Learning and Systems-Based Practice. American Journal of Medical Quality, 22, 351-382. http://dx.doi.org/10.1177/1062860607305381

5. Lewis, S.B. (2011) Finding the Learning Sweet Spot. NEJM Journal Watch. http://blogs.jwatch.org/general-medicine/index.php/2011/09/finding-the-learning-sweet-spot/

6. Chang, D., Kenel-Pierre, S., Bas, S., et al. (2014) Study Habits Centered on Completing Review Questions Result in Quantitatively Higher American Board of Surgery In-Training Exam Scores. Journal of Surgical Education, 71, 127-131. http://dx.doi.org/10.1016/j.jsurg.2014.07.011

7. Eastin, T.R. and Bernard, A.W. (2013) Emergency Medicine Residents’ Attitudes and Opinions of In-Training Exam Preparation. Advances in Medical Education and Practice, 4, 145-150. http://dx.doi.org/10.2147/AMEP.S49703

8. Schmitz, C.C., D’Cunha, J. and Antonoff, M.B. (2010) Developing Self-Regulated Learning Habits Can Help Residents Be Better Learners. American College of Surgeons.