Creative Education
Vol. 10, No. 07 (2019), Article ID: 93673, 15 pages
10.4236/ce.2019.107109

An Examination of Field Experiences as They Relate to InTASC Standards: A Retrospective Pilot Study for an Educator Preparation Provider

Elizabeth Block, Angelle Hebert, Leah Peterson, Alyson Theriot

Nicholls State University, Thibodaux, LA, USA

Copyright © 2019 by author(s) and Scientific Research Publishing Inc.

This work is licensed under the Creative Commons Attribution International License (CC BY 4.0).

http://creativecommons.org/licenses/by/4.0/

Received: May 20, 2019; Accepted: July 12, 2019; Published: July 15, 2019

ABSTRACT

This paper outlines the efforts made by a teacher preparation program to examine the way field experiences are implemented and structured. A retrospective approach is taken to examine the educator preparation provider's (EPP's) current practices and to structure field experiences with greater intentionality. A pilot study is designed for the teacher preparation program to align experiences to applicable InTASC standards and to better define requirements. Throughout the paper, a strong emphasis is placed on field and clinical practice as an integral part of the preparation of pre-service teachers.

Keywords:

Teacher Education, Field Experiences, Clinical Practice, Accreditation Standards, Educator Preparation

1. Introduction

Educator Preparation Providers (EPPs) cope with a variety of pressures related to higher education quality assurance: the tension of meeting rigorous accreditation standards, the difficulty of keeping up with federal and state demands, and the struggle to create new and improved approaches to assessment (Ewell, 2009). As Darling-Hammond (2014) states, “The question how to strengthen teacher education is increasingly at the forefront of U.S. education policy-making, as the demands on teachers to teach ever more challenging curriculum to ever more diverse learners continue to increase exponentially” (p. 547). A large part of these demands centers on field and clinical practice. For example, the Council for the Accreditation of Educator Preparation (CAEP) dedicates an entire standard to “Clinical Partnerships and Practice” (CAEP, 2015). The standard states that high-quality clinical practice is essential for aspiring teachers to acquire the appropriate knowledge, skills, and dispositions to have a positive impact on P-12 students (CAEP, 2015). Given these increasing demands and national calls for reform, an examination of current practices in field experience and clinical practice was warranted to promote continuous improvement (Capraro, Capraro, & Helfeldt, 2010). Darling-Hammond (2014) states, “Efforts to improve teacher education have recently focused in on the importance of well-supervised clinical practice as a critical element of effective preparation” (p. 547). Outside entities such as accreditors and policy-makers have placed a strong emphasis on field experiences as an integral part of EPPs (Capraro, Capraro, & Helfeldt, 2010; Darling-Hammond, 2014). Teacher preparation programs must respond to this call and rethink the way learning experiences are structured so that pre-service teachers have opportunities to better integrate theory and practice in real-life classroom settings. In this study, a retrospective approach was taken to examine the EPP’s current field experiences and to improve their standard alignment, content, and sequencing. Alignment was the focus of the first step in analyzing current practices.

2. Meeting Accreditation Standards

Educator Preparation Providers’ continuous improvement efforts are motivated by several factors, including improving student outcomes and seeking and maintaining accreditation. Accreditation, the process of external quality review, is carried out by a variety of councils and associations for specific purposes. CAEP, for example, reviews EPPs as part of both institutional and professional accreditation cycles. Efforts related to field experience and clinical practice are directly related to CAEP Standard 2 (Clinical Partnerships and Practice) and the Interstate Teacher Assessment and Support Consortium (InTASC) categories (Salazar, 2015). InTASC “offers a new vision for preparing, supporting, evaluating, and rewarding teachers along their careers” (Salazar, 2015, Slide 10, para. 1) and identifies four categories of core teaching standards: The Learner and Learning, Content, Instructional Practice, and Professional Responsibility. For the purpose of this study, improvement efforts with regard to field experiences focused on integrating InTASC categories, as recommended by CAEP, in order to promote continuous improvement of the EPP.

3. Current Practices

The EPP is a moderately sized, regional university located in the southern United States. Current student enrollment is 6488, with 5896 undergraduate students and 592 graduate students. Ninety-three percent of the student population comes from in-state, 5% from out-of-state, and 2% are international students. The EPP offers 107 programs of study, with teacher education ranking fourth in the graduation of candidates. The average undergraduate class size is 23, with a 20:1 student-to-faculty ratio. Sixty-eight percent of freshmen are first-generation college students, and 79% of students commute to campus for classes and extracurricular activities. The EPP has a strong presence in the community, as it provides over 80% of the teachers and nurses to the local schools and hospitals. Within the EPP, the Department of Teacher Education offers one associate degree, 10 undergraduate bachelor's degrees resulting in certification, certification-only post-baccalaureate programs, MAT degrees, and master’s degrees. Programs of study are accredited through CAEP and meet state policy for the number of pre-service field experience hours and culminating hours in student teaching/residency (Nicholls State University, 2019).

The EPP’s field experiences are divided into three categories: Level 1 (observation or case study), Level 2 (tutoring, small groups, or interviews), and Level 3 (whole-class instruction). Each program within the EPP requires a certain number of field experience hours per level, and hour requirements for each level are specified by each course within a program to satisfy state guidelines. Implementation of field experiences is scaffolded; therefore, upper-level courses include a greater number of Level 3 experiences. Charts were developed for initial programs within the EPP that specify the hours and levels of experiences for courses. Before the EPP examined current practices, faculty were well versed in the field experiences within their own courses but ill-informed about the implementation of field experiences across coursework and programs. This lack of awareness of field experiences across programs hindered their ability to scaffold candidates' experiences properly.

Pre-service teachers receive placements for their experiences from the EPP’s Field Experience Coordinator, and data for each experience are entered manually into the EPP’s assessment system, LiveText, by the candidates. Pre-service teachers enter self-reported data about their experiences using an online survey format within LiveText. These data include, but are not limited to, the level of the field experience; the location and date of the field experience; the subject area(s) and grade level(s) in which the experience took place; the ethnicity and gender of the supervising school personnel; and the duration of the experience (see Appendix A for the Field Experience Demographics Form). Field experience data are exported annually as a part of the EPP’s unit data collection processes. Data are also disaggregated by program and shared with faculty and program coordinators.
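For illustration only, one self-reported entry could be modeled as a simple record. The field names below are hypothetical: they mirror the demographics form in Appendix A, not LiveText's actual export schema.

```python
from dataclasses import dataclass, field

@dataclass
class FieldExperienceEntry:
    """One self-reported field experience form (hypothetical field names)."""
    level: str                      # "FE1", "FE2", or "FE3"
    course: str                     # course in which the experience was assigned
    location: str                   # school, site, or video
    date: str                       # date of the experience
    subject_areas: list = field(default_factory=list)
    grade_levels: list = field(default_factory=list)
    site_contact_gender: str = ""
    site_contact_ethnicity: str = ""
    duration_hours: float = 0.0     # decimal hours, per item 32 of the form
```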

Despite these efforts, the current processes for field experiences within the EPP left some areas of concern. While specific courses were assigned a minimum number of hours to be earned within programs, no other specifications were aligned to these field experiences, and the only information available was self-reported data from LiveText. This warranted an investigation into current practices. Additionally, the EPP was aware that all institutions must make a more conscious effort to establish measures that yield actionable data. The question arose: How can the EPP provide more actionable data to improve field experiences?

4. The Pilot

Within the EPP, faculty members are assigned to specific CAEP standards according to their areas of expertise. The CAEP Standard 2 committee is co-chaired by two faculty members: the Field Experience Coordinator and the Director of Student Teaching. Based on prompting from state evaluators, the question of how to provide more actionable data to improve field experiences was posed to the Standard 2 Committee. In Spring 2016, this committee met to discuss concerns related to field experience and clinical practice.

The EPP was prompted to examine the scope and sequence of pre-service field experiences following feedback from the EPP’s on-site state review process. Feedback from the outside review team indicated that there was a sufficient number of experiences within programs, but other areas of concern needed to be addressed. As Capraro, Capraro, and Helfeldt (2010) state, “Bridging the gap between theory and practice does not automatically occur simply as a result of participating in field experiences” (p. 132). In this regard, the Standard 2 Committee found that there was some disconnect between coursework and experiences. Additionally, the committee found there could be greater collaboration between faculty with a focus on the sequencing and alignment of experiences. This applied to the progression of experiences within courses, between courses, and throughout programs. The Standard 2 Committee agreed to pilot a study that examined the alignment of field experiences to the sound underpinnings of the InTASC categories and provided in-depth descriptions of the field experiences required in select teacher education courses.

5. Methodology

The CAEP Standard 2 Committee met in Spring 2016 to determine the procedure for the pilot study. The committee was composed of the Director of Student Teaching, who taught secondary ELA; the Field Experience Coordinator; the Assessment Coordinator; and three faculty members who represented early childhood education, elementary education, middle school, and secondary education. The committee determined that the pilot study would begin with faculty participants who would complete a “field experience matrix” (see Appendix B for the Field Experience Matrix). This matrix required each participating member to align courses to InTASC categories and identify certain elements of field experiences within their courses. To allow field experience data to be collected across programs, all six members of the committee were asked to participate in the study; together they represented a strong cross-section of the courses offered by the EPP. To thoroughly analyze the required experiences in each course, the committee developed a data collection matrix that summarized the important elements of field experiences: the Order in which the field experience was offered; the Level (1, 2, or 3); the Type (video, observation, small groups, tutoring, interview, case study, or whole-class instruction); the Quantity (required hours in the field); the Relationship to the InTASC Category (the category most closely aligned to the experience, as determined by the professor); and a Description (a summary of the tasks required of candidates in the field). Participating faculty members were given two weeks to complete the matrices for their courses.
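As a sketch of that structure, each matrix row could be represented as follows. The element names are taken from the list above; the code itself is illustrative and not part of the EPP's actual tooling.

```python
from dataclasses import dataclass
from enum import Enum

class InTASC(Enum):
    """The four InTASC categories used for alignment."""
    LEARNER_AND_LEARNING = "The Learner and Learning"
    CONTENT = "Content"
    INSTRUCTIONAL_PRACTICE = "Instructional Practice"
    PROFESSIONAL_RESPONSIBILITY = "Professional Responsibility"

@dataclass
class MatrixRow:
    """One row of the field experience matrix for a single course."""
    order: int          # order in which the experience is offered
    level: int          # 1, 2, or 3
    type: str           # video, observation, small groups, tutoring, etc.
    quantity: float     # required hours in the field
    intasc: InTASC      # single most closely aligned category (per instructor)
    description: str    # summary of tasks required of candidates
```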

At the completion of phase one of the pilot study (end of Spring 2016), the Standard 2 Committee had collected matrices from three of the six committee members for six different courses (Appendix C). The following courses/faculty members participated in the pilot study: 1) EDUC 312: Planning for Teaching in Multicultural Classrooms, which all undergraduate candidates must successfully complete with a grade of C or higher in order to progress to methodology courses; 2) EDUC 421: Current Practices and Strategies in Teaching, which all certification-only candidates must pass with a C or higher in order to progress to methodology coursework; 3) FCED 239: Preschool Practicum, which undergraduate candidates in early childhood education must successfully complete prior to student teaching; 4) EDCI 573: Curriculum and Methods for Early Childhood Special Education, which is completed by candidates in the Master’s Degree in Early Childhood Education; 5) EDCI 579: Practicum in Early Childhood Education, which is completed by candidates in the Master’s Degree in Early Childhood Education; and 6) EDCI 580: Interdisciplinary and Interagency Teaming in Early Childhood Education. These courses represented a cross-section of candidate classification, major, and coursework (undergraduate, graduate, and certification-only).

The committee met to review the matrices and determine next steps for phase two of the pilot: implementation in summer courses. Even though faculty members had aligned field experiences to InTASC standards by course on the matrices, data for the pilot courses still had to be collected and aggregated through candidate self-reporting in LiveText. Accordingly, corresponding changes were made to the field experience form in LiveText that pre-service teachers complete after conducting their experiences. To collect alignment data in phase two of the pilot, the Assessment Coordinator added a dropdown menu in which pre-service teachers chose one of the four InTASC categories, as designated by their instructor according to the nature of the field experience. As a part of the pilot, any members of the committee teaching courses that required field experiences were asked to complete a field experience matrix and have students document the appropriate InTASC category in their field experience forms.

Before collecting InTASC alignment data via LiveText, the committee analyzed and compared the matrices and made the following observation: faculty descriptions and the number of InTASC categories aligned to each experience varied by instructor. This presented two issues: 1) the dropdown menu in LiveText would allow only one InTASC category to be assigned to each experience, and 2) some descriptions might not include adequate information about each experience and its purpose. To remedy the first issue, the committee decided that each experience should be aligned to the single most applicable InTASC category. To address the second issue, the committee chair would meet with faculty members to obtain additional information whenever a more detailed description was needed.

As candidate reporting of field experience alignment to InTASC standards was critical to the analysis of the data, the Assessment Coordinator worked with the three faculty members involved in the pilot study on how to implement information from the matrices in their courses. The faculty members were asked to review with their candidates the rationale for pre-service field experiences, the specific alignment to InTASC standards, and why this alignment is critical in producing teachers who create success for each K-12 student in their future classrooms. Pilot faculty taught their six courses in Summer 2016 and integrated InTASC content, along with procedural tasks for reporting this content in LiveText, throughout their respective courses. Candidate data collection was completed in July 2016.

6. Results

At the completion of Summer 2016, the Assessment Coordinator pulled data from LiveText to examine the implementation of the pilot using the six matrices from the three participating faculty members. Table 1 and Table 2 provide a summary of these data as they relate to InTASC alignment. Raw data are provided in Appendix D.

Table 1. Summer 2016 pilot data by form.

Table 2. Summer 2016 pilot data by InTASC category.

Table 1 indicates that the lowest response rate, 49%, occurred for FE1 observation experiences (272 of 557 candidate submissions), and the highest response rate, 87%, occurred for FE1 school-based case studies (20 of 23 candidate submissions). In other words, across the six participating pilot courses, 49% to 87% of enrolled candidates completed field experience forms in Summer 2016. This substantial response rate, achieved with only three faculty members over six summer courses, suggests that the sample in the pilot study was a strong representation of the overall population enrolled in summer school. Additionally, the new InTASC category added to the LiveText form provided valuable data on what percent of forms addressed particular InTASC categories. Candidates enrolled in summer school completed field experience forms aligned to InTASC in 49% of student observations (FE1), 69% of student tutoring experiences (FE2), and 79% of whole-class instruction experiences (FE3). This response rate indicates that candidates submitted data forms across field experience types, giving greater generalizability to the InTASC categories reported in Table 2.
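The response rates above reduce to simple ratios of submitted forms to enrolled candidates. A minimal sketch using the two endpoints reported in Table 1 (the full per-form breakdown is in Appendix D):

```python
def response_rate(submitted: int, enrolled: int) -> float:
    """Share of enrolled candidates who submitted a field experience form."""
    return submitted / enrolled

# Endpoints reported in Table 1:
print(f"FE1 observations:        {response_rate(272, 557):.0%}")  # -> 49%
print(f"FE1 school-based cases:  {response_rate(20, 23):.0%}")    # -> 87%
```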

Data in Table 2 indicate the alignment of field experiences to InTASC categories across the six courses taught in Summer 2016. Candidates who completed observations of students (FE1) reported that these observations were most indicative of “Instructional Practice” (46%) and least indicative of “Content” (5%). Candidates who completed tutoring experiences in the classroom (FE2) indicated that these experiences were most reflective of the InTASC category “Instructional Practice” (41%) and least reflective of “Content” (13%). Candidates who completed whole-class instruction (FE3) indicated that these field experiences were most closely aligned to “Instructional Practice” (47%), with the lowest alignment to “The Learner and Learning” (6%), followed by “Content” (14%). “Professional Responsibility” was consistently represented across field experience levels, at 37% in FE1, 30% in FE2, and 33% in FE3. “Instructional Practice” was also relatively stable across field experience levels, at 46% (FE1), 41% (FE2), and 47% (FE3).
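The percentages in Table 2 amount to, within each field experience level, the share of submitted forms tagged with each InTASC category. A hedged sketch of that tally follows; the (level, category) data layout is assumed for illustration and is not the EPP's actual export format.

```python
from collections import Counter, defaultdict

def category_shares(forms):
    """Fraction of forms per InTASC category within each FE level.

    `forms` is an iterable of (level, category) pairs such as
    ("FE1", "Instructional Practice").
    """
    counts = defaultdict(Counter)
    for level, category in forms:
        counts[level][category] += 1
    shares = {}
    for level, tally in counts.items():
        total = sum(tally.values())
        shares[level] = {cat: n / total for cat, n in tally.items()}
    return shares

# e.g. category_shares([("FE3", "Instructional Practice"),
#                       ("FE3", "Content"), ...])
```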

7. Discussion

Even with a small sample of six teacher education courses participating in this pilot study, an adequate response rate was achieved, allowing the authors to draw tentative conclusions that point toward future research opportunities. In reviewing the categorization of field experiences to InTASC, the authors observed that “Instructional Practice” was consistently ranked as the highest competency noted in observations (FE1), tutoring (FE2), and whole-group teaching (FE3). “Content” was consistently ranked as the lowest competency in FE1 and FE2, while “The Learner and Learning” was the lowest-ranked category in FE3. This disparity in the representation of InTASC categories gives the authors pause, as all four categories should ideally be scaffolded and sequenced with differing emphases throughout the programs. As candidates progress through their teacher preparation programs, field experiences should be sequenced with equal priority given to “The Learner and Learning,” “Content,” “Instructional Practice,” and “Professional Responsibility” at different points throughout the coursework. If consistent focus is given to one category (“Instructional Practice”) in all three types of field experiences, candidates may not develop the foundational content knowledge and pedagogy needed to be successful in the classroom.

In sequencing field experiences, it is important for faculty to review all courses, the order in which they are completed, and how “Content” and “The Learner and Learning” can be more prominently represented in FE1 and FE2 experiences, with “Instructional Practice” and “Professional Responsibility” represented across FE2 and FE3 experiences. This intentional design of field experiences will provide candidates with growth and scaffolding across their programs. While these data were limited to the faculty participating in the study, the committee found enough evidence to expand the pilot to all faculty within the EPP. Additionally, with data from an entire academic year, the committee would be able to evaluate data that are more representative of the entire pre-service teacher population and of field experiences and clinical practice in general.

8. Limitations

There were several limitations within this pilot study. One limitation was the implementation of the study within the summer semester. The availability of field experience placements is limited because most PK-12 schools are out for the summer, which affects what courses are offered and how field experiences are assigned. Additionally, the summer months are restricted to a much smaller sample that may not be representative of the EPP’s entire pre-service teacher population. Secondly, the findings of this pilot, already limited, presented stronger internal validity than external validity: because the study was restricted to one EPP, there is limited evidence that this pilot and future recommendations could be successfully implemented by other education providers. The third limitation was the use of only three faculty members to begin the pilot study. Stronger conclusions cannot be drawn from the evidence until all faculty members within the EPP are documenting their current practices and implementing the changes in their courses. This, in conjunction with an entire academic year of data, will provide more valid findings and actionable figures. Additionally, it will give the EPP a holistic view of the alignment, content, and sequencing of all field experiences for each program as well as the unit.

9. Recommendations

Staying true to the nature of a pilot study, there are multiple recommendations for the EPP that are vital to future research efforts and the continuation of this initiative. The first recommendation is to move from the pilot study to full implementation within the EPP; robust conclusions cannot be drawn until all faculty are involved in the new processes and more data are collected. The second recommendation is to collect data on an annual basis. A full year of data, including the summer, fall, and spring semesters of an academic year, would provide a larger sample size and a more accurate representation of pre-service teachers and their experiences in the field. Furthermore, this will integrate well into the EPP’s already established assessment cycle, with field experience data collection in the summer, analysis in August, collaboration with stakeholders in October, and recommendations and proposed changes in November. Most significantly, this pilot study did not explore best practices for sequencing field experiences for pre-service teachers. The restructuring of field experiences is an area ripe for research by this team in future studies and should be based on thoughtful scaffolding of InTASC standards across experiences and programs. With a state-mandated shift to a one-year residency rather than one semester of student teaching, it is essential for the authors to move beyond the examination of field experiences as they relate to InTASC standards and determine how to restructure field experiences for each program based on the InTASC classification. Data from the pilot study will guide faculty members in a comprehensive examination of the scope and sequence of field experiences so that candidates enter their culminating semesters of residency with the knowledge, skills, and dispositions to be successful.

10. Summary

This pilot study served as the EPP’s foundation for examining field experiences and their relationships to InTASC categories, as reported through matrices and candidate self-reports. The CAEP Standard 2 Committee found the information provided by the participating faculty to be beneficial to the unit’s self-study. The committee also found that if matrices were provided by all faculty, the EPP could continue its self-evaluation in a more comprehensive manner. In addition, these efforts have the potential to increase collaboration among faculty and diminish the disconnect between instructors, coursework, and experiences. Despite past ambiguity surrounding the various field experiences being implemented within the EPP, a retrospective approach was taken to better define, align, and sequence these practices. The EPP plans to follow through with the committee’s recommendations: to expand the pilot to all faculty members, commence efforts to evaluate and restructure field experiences, and document and report on its findings.

Conflicts of Interest

The authors declare no conflicts of interest regarding the publication of this paper.

Cite this paper

Block, E., Hebert, A., Peterson, L., & Theriot, A. (2019). An Examination of Field Experiences as They Relate to InTASC Standards: A Retrospective Pilot Study for an Educator Preparation Provider. Creative Education, 10, 1492-1506. https://doi.org/10.4236/ce.2019.107109

References

1. Capraro, M. M., Capraro, R. M., & Helfeldt, J. (2010). Do Differing Types of Field Experiences Make a Difference in Teacher Candidates’ Perceived Level of Competence? Teacher Education Quarterly, 37, 131-154. http://www.eric.ed.gov/PDFS/EJ872653.pdf

2. Council for the Accreditation of Educator Preparation (CAEP) (2015). Standard 2: Clinical Partnerships and Practice. http://www.ncate.org/standards/standard-2

3. Darling-Hammond, L. (2014). Strengthening Clinical Preparation: The Holy Grail of Teacher Education. Peabody Journal of Education, 89, 547-561. https://doi.org/10.1080/0161956X.2014.939009

4. Ewell, P. T. (2009). Assessment, Accountability, and Improvement: Revisiting the Tension. Champaign, IL: National Institute for Learning Outcomes Assessment.

5. Nicholls State University (2019). Fast Facts. http://www.nicholls.edu/about/fast-facts/

6. Salazar, M. (2015). Quality Assessments Workshop. http://caepnet.org/~/media/Files/caep/conferences.../quality-assessment-denver-salazar.pdf

Appendix A—Field Experience Demographics Form

1) Enter a name for the field experience(s).

2) Today’s Date.

3) Your First Name.

4) Your Last Name.

5) Your Major.

6) If you are a Certification Only major, please indicate your area of concentration. If you are not an alternative certification candidate, please choose the first answer—Not Applicable.

7) If you are a Middle School (48ED) major, please indicate your areas of concentration. If you are not a Middle School major, please choose the first answer—Not Applicable.

8) Your Program Level.

9) Course in which field experience was assigned.

10) Instructor of course in which field experience was assigned.

11) Name of School, Site, or Video.

12) Name of Site Contact.

13) Gender of Site Contact.

14) Ethnicity of Site Contact.

15) Number of male students participating in the field experience activity(ies).

16) Number of female students participating in the field experience activity(ies).

17) Number of American Indian or Alaska Native students participating in the field experience activity(ies).

18) Number of Asian students participating in the field experience activity(ies).

19) Number of Black, non-Hispanic students participating in the field experience activity(ies).

20) Number of Hispanic students participating in the field experience activity(ies).

21) Number of White, non-Hispanic students participating in the field experience activity(ies).

22) Religion: Indicate if any students belong to these religions.

23) Grade Levels.

24) Subject Areas Observed.

25) Number of students receiving free/reduced lunch.

26) Number of students classified as general/regular education that participated in the field experience activity(ies).

27) Number of students classified as 504 that participated in the field experience activity(ies).

28) Number of students classified as 1508 (special education, non-gifted) that participated in the field experience activity(ies).

29) Number of students classified as 1508 (special education, gifted) that participated in the field experience activity(ies).

30) Number of students classified as non-categorical preschool that participated in the field experience activity(ies).

31) Number of students classified as limited English proficiency that participated in the field experience activity(ies).

32) Total time for the field experience(s). Enter the time as a decimal, rounded to the nearest quarter hour (e.g., if the FE is 2 hours, type 2.00; if it is 2 hours and 15 minutes, type 2.25; 2 hours and 30 minutes, type 2.50; 2 hours and 45 minutes, type 2.75).
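The conversion in item 32 is simply rounding to the nearest quarter hour; a minimal sketch of the expected arithmetic:

```python
def to_decimal_hours(hours: int, minutes: int) -> float:
    """Express a duration as decimal hours, rounded to the nearest quarter hour."""
    return hours + round(minutes / 15) * 0.25

print(to_decimal_hours(2, 0))   # 2.0
print(to_decimal_hours(2, 15))  # 2.25
print(to_decimal_hours(2, 30))  # 2.5
print(to_decimal_hours(2, 45))  # 2.75
```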

Appendix B—Field Experience Matrix

Instructor:

Course:

Order—The order in which the field experiences are completed. Most likely, this will be in numerical order going down as 1, 2, 3, etc.

Level—Options are FE1, FE2, FE3, PCB (published case study), or SBCS (school based case study).

Type—Options are observation, video, interview, small groups, tutoring, whole class instruction, lesson implementation, etc.

Quantity—How many of these field experiences are completed in the course.

Relationship to InTASC Category—Options are “The Learner and Learning,” “Content Knowledge,” “Instructional Practice,” or “Professional Responsibility.” Candidates must choose one of these four options, as designated by you, the instructor, in their field experience forms in LiveText. For more information on InTASC categories, see the InTASC Model Core Teaching Standards document.

Description—A concise narrative describing the field experience such as a description of the video being viewed, if the field experience is required to be in a certain major or grade level, the focus of the experience, etc. More specific examples of descriptions can be found in the pilot documents by request.

Appendix C—Field Experience Matrices Pilot—2016