CBE—Life Sciences Education Vol. 14, 1–12, Summer 2015

Article

Breaking the Cycle: Future Faculty Begin Teaching with Learner-Centered Strategies after Professional Development

Diane Ebert-May,* Terry L. Derting,† Timothy P. Henkel,‡ Jessica Middlemis Maher,§ Jennifer L. Momsen, Bryan Arnold,¶ and Heather A. Passmore†

*Department of Plant Biology, Michigan State University, East Lansing, MI 48824; †Department of Biological Sciences, Murray State University, Murray, KY 42071; ‡Department of Biology, Valdosta State University, Valdosta, GA 31698; §Delta Program, University of Wisconsin–Madison, Madison, WI 53706; Department of Biological Sciences, North Dakota State University, Fargo, ND 58108; ¶Department of Biology, Illinois College, Jacksonville, IL 62650

Submitted December 3, 2014; Revised March 11, 2015; Accepted March 23, 2015
Monitoring Editor: Deborah Allen

The availability of reliable evidence for teaching practices after professional development is limited across science, technology, engineering, and mathematics disciplines, making the identification of professional development "best practices" and effective models for change difficult. We aimed to determine the extent to which postdoctoral fellows (i.e., future biology faculty) believed in and implemented evidence-based pedagogies after completion of a 2-yr professional development program, Faculty Institutes for Reforming Science Teaching (FIRST IV). Postdocs (PDs) attended a 2-yr training program during which they completed self-report assessments of their beliefs about teaching and gains in pedagogical knowledge and experience, and they provided copies of class assessments and video recordings of their teaching. The PDs reported greater use of learner-centered compared with teacher-centered strategies. These data were consistent with the results of expert reviews of teaching videos.
The majority of PDs (86%) received video ratings that documented active engagement of students and implementation of learner-centered classrooms. Despite practice of higher-level cognition in class sessions, the items used by the PDs on their assessments of learning focused on lower-level cognitive skills. We attributed the high success of the FIRST IV program to our focus on inexperienced teachers, an iterative process of teaching practice and reflection, and development of and teaching a full course.

INTRODUCTION

Despite the need to transform teaching and learning in the sciences (e.g., American Association for the Advancement of Science, 2011; Anderson et al., 2011; President's Council of Advisors on Science and Technology, 2012; Association of American Universities, 2014) and the emerging body of research on how to do so (Amundsen and Wilson, 2012; Singer et al., 2012), adoption of these findings by faculty who teach undergraduate science courses has been slow, at best (Brownell and Tanner, 2012; Smith and Valentine, 2012). The transformation of undergraduate science, technology, engineering, and mathematics (STEM) classroom experiences requires a fundamental shift in how instructors approach teaching and learning, moving from an information-transfer, teacher-centered model to one that is concept-focused, learner-centered, and collaborative (Weimer, 2002).

CBE Life Sci Educ June 1, 2015 14:ar22
DOI: 10.1187/cbe.14-12-0222
Address correspondence to: Diane Ebert-May (ebertmay@msu.edu).
© 2015 D. Ebert-May et al. CBE—Life Sciences Education © 2015 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
The way instructors approach their teaching is inlu- “ASCB®”and “The American Society for Cell Biology ®” are regis- enced profoundly by their beliefs and conceptions about tered trademarks of The American Society for Cell Biology. teaching (Ho et  al., 2001; Lindblom-Ylanne et  al., 2006; 14:ar22, 1 Supplemental Material can be found at: http://www.lifescied.org/content/suppl/2015/05/27/14.2.ar22.DC1.html Downloaded from http://www.lifescied.org/ by guest on June 5, 2015 D. Ebert-May et al. Postareff et  al., 2007). Some authors claim that change in conceptualization of the learning process and thus change conceptions about teaching is a necessary prerequisite to their teaching practices. Researchers suggest that successful changing instruction (Ho et al., 2001), while others claim the strategies for effecting change are collegial and community opposite; that is, change in teaching practices occurs before based, focus on content knowledge, and utilize concrete change in beliefs (Guskey, 2000). In either case, conceptions and coherent active-learning opportunities (Emerson and about teaching take time to change (Postareff et al., 2007) and Mosteller, 2000; Garet et al., 2001). Furthermore, conceptual require instructors to relect on their own teaching practices change is more likely to occur when participants engage and evaluate the relative consistency between their beliefs over an extended period of time (more than one semester; and actions in the classroom (Wlodarsky, 2005). Instructors’ Shields et al., 1998; Weiss et al., 1998; Emerson and Mosteller, teaching conceptions are further inluenced by the discipline 2000; Henderson et al., 2011). Finally, mentoring of and re- and teaching context (Lindblom-Ylanne et  al., 2006). All of lection by participants is a component of professional de- these variables contribute to the complexity of teaching and velopment that is critical for conceptual change (Hubball learning in the higher education system. 
Institutions have et al., 2005; Brownell and Tanner, 2012). Collectively, these acknowledged this reality and in response have advocated change strategies are predicted to work, because they ad- for decades the need to provide faculty opportunities to dress individual beliefs and experiences as well as situa- learn about effective teaching through professional develop- tional factors that support or impede changes in teaching ment workshops (Connolly and Millar, 2006). (Henderson and Dancy, 2007). Even with the continued availability of and interest in The aim of our research was to determine the extent to teaching development opportunities, there is little evi- which postdoctoral fellows (i.e., future biology faculty) dence of resulting widespread impact on teaching prac- believed in and implemented evidence-based pedagogies tices and even less about the impact on student learning after completion of a 2-yr professional development pro- (Garet et  al., 2001; Gibbs and Coffey, 2004; Henderson gram, Faculty Institutes for Reforming Science Teaching et  al., 2011, 2012). For example, faculty participation in a IV (FIRST IV). If the program was effective at transform- 3-yr program of professional development that focused ing teaching, then we predicted that 1) postdocs (PDs) on transforming teaching did not result in implementa- would demonstrate belief in learner-centered approaches tion of learner-centered teaching by most participants. In to teaching, 2) implement learner-centered teaching prac- surveys, the faculty reported incorporating many learn- tices in the classroom, and 3) design assessments that er-centered activities in class, but observational data did are aligned with beliefs and practices of learner-centered not conirm those assertions (Ebert-May et al., 2011). Other teaching. 
notable examples of programs that target professional In developing FIRST IV, we selected implementation development of current and future STEM faculty include strategies based on scientiic teaching (i.e., teaching science the Center for the Integration of Research, Teaching, and using evidence-based practices that include active learning Learning (Austin et al., 2008; Pfund et al., 2012), the Sum- and diversity; Handelsman et  al., 2004), research, indings mer Institutes (SI) through the Center for Scientiic Teach- from the conceptual change literature, and results from pre- ing at Yale (Handelsman et al., 2004, 2006), On the Cutting vious professional development programs. The FIRST IV Edge Workshops and Resources for Early Career Geosci- program adapted theoretically based strategies in a profes- ence faculty (Manduca et al., 2010), and the Workshop for sional development model built on a mentored, team-based New Physics and Astronomy Faculty (Henderson, 2008). approach to learning, in which participants engaged in an Collectively, these programs impacted several thousands iterative process of curriculum development and teaching of faculty (Hilborn, 2012). Survey data from participants practicum, followed by relection, revision, and a second in these programs indicated that a large percentage of teaching experience. Our approach was consistent with re- respondents made speciic changes to their teaching views of faculty professional development in which posi- practices, primarily shifting toward active-learning tech- tive and/or lasting effect on teaching was associated with niques. 
Only one of these programs reported data from di- the use of repeated active and experiential interventions rect observation, however, which indicated that 45% of the over time and collaboration (e.g., Gibbs and Coffey, 2004; faculty who participated in workshops were transitioning Steinert et al., 2006) and in which participants were actively toward learner-centered teaching and only 25% actually engaged in all dimensions of learner-centered pedagogy implemented learner-centered instruction (Teasdale et al., (Henderson, 2008). 2011; Manduca et al., 2014). Overall, the availability of reli- FIRST IV also built on the lessons learned from the earlier able evidence for transformed teaching after professional iteration of the project (referred to here as FIRST II; Ebert-May development is limited across STEM disciplines, making et al., 2011) by 1) selecting participants who were less-expe- the identiication of professional development best prac- rienced instructors (i.e., PDs), 2) focusing on actual teaching tices and effective models for change dificult (Henderson practices rather than focusing only on teaching tools to use et al., 2011; Amundsen and Wilson, 2012). in the classroom, and 3) helping participants build an entire Effective professional development requires that in- course rather than develop a single unit of instruction. Fur- structors reconceive the learning and teaching experience thermore, in response to the need for rigorous evaluation of (Emerson and Mosteller, 2000; Henderson et  al., 2011), a the FIRST model, we incorporated methods that used direct process that can be productively viewed through the lens observation and analysis of the participants’ teaching and of conceptual change theory (Posner et  al., 1982; Pintrich self-reported data. In doing so, we demonstrated that FIRST et al., 1993; Feldman, 2000). 
Workshop strategies that align IV instructors implemented learner-centered teaching and with the theoretical framework of conceptual change are did so to a greater degree than several comparison groups predicted to successfully help teachers transform their of faculty. 14:ar22, 2 CBE—Life Sciences Education Downloaded from http://www.lifescied.org/ by guest on June 5, 2015 Development of Learner-Centered Teachers METHODS surveys that we designed to document the PDs’ knowledge and experience with active-learning pedagogy and teaching The FIRST IV Program strategies (Supplemental Material, section 2). The ATI was PDs were the subjects of our research. The PDs were recruit- developed to measure qualitative variation in two key di- ed nationally in 2009 and 2011, forming two separate, 2-yr mensions of teaching; speciically, conceptual change/stu- cohorts of 99 and 102 PDs, respectively. The PDs accepted dent focused (CCSF) and information transmission/teacher to FIRST IV were assigned to one of ive regional teams lo- focused (ITTF). Results from the ATI are scored on a CCSF cated around the United States, each of which was based at and an ITTF scale. Instructors who use a CCSF approach a biological research ield station and led by a team of two aim to change students’ thinking about the material studied, or three regional team leaders (RTLs) who were experts in with a focus on ways to challenge students’ current ideas biological science and pedagogy. Thirteen RTLs participated so that students construct their own knowledge. Instructors in a training workshop in the Spring of each of the irst 2 yr using an ITTF approach see their role as mainly to transmit of the project to prepare to implement the workshops (see information to students and to focus on development of Supplemental Material, section 1). skills that improve competency in information transfer. 
The The PDs engaged in a 2-yr program of professional devel- two scales are independent rather than ends of a continuum opment. At the beginning of each year of participation, they (Prosser and Trigwell, 1997). Use of the ATI is context specif- completed a summer workshop. The broad objectives of the ic; thus, each PD completed the ATI at the end of each course irst workshop (4 d) for each cohort were for participants to that he or she taught. Project-created surveys about teaching 1) gain knowledge about evidence-based methods that sup- knowledge and experience with education reform and ac- port learner-centered STEM teaching; 2) begin to develop a tive-learning approaches to teaching were completed by the learner-centered course, in which objectives, assessments, PDs at the beginning and end of their participation in FIRST and instruction are aligned; and 3) make useful and sustain- IV (see Supplemental Material, section 2). able connections with other PDs and RTLs for continuing project-related work after the workshop. The objectives for the second workshop (3 d) were for participants to 1) relect Participants’ Teaching Practice on their teaching experience(s) from the prior year; 2) gain Classroom teaching practices were assessed using an exter- further practice with learner-centered, evidence-based teach- nal review process. Each PD submitted videos for at least ing methods; 3) gain access to additional teaching and as- two complete class sessions for each full course that he or sessment tools and resources; 4) receive feedback about their she taught during his or her FIRST IV participation. The spe- teaching and job-seeking experiences; and 5) complete ac- ciic class sessions that were recorded were determined by tion plans for revision of their course and teaching in year 2. each PD. The PDs were asked to focus their videos on what The activities of the PDs in the academic year between they and their students did during the class. 
For example, workshops focused on three elements. First, each PD con- we asked them to capture interactions with and among the tinued to develop a learner-centered introductory biology students, to include any visual materials used (e.g., Power- course with a team of PDs established during their irst Point slides), and to accurately record audio of both their workshop. The second element was interaction of the PDs instruction and the students’ conversations with each other. with their assigned RTL mentors and PD team as a means of Three considerations were key to our selection of an in- receiving feedback about teaching, development of courses strument for evaluating teaching practices of the PDs. First, and teaching materials, and job applications. RTL mentors the instrument needed to focus on the nature of student and their PDs established a meeting schedule and other in- learning and in-class interactions rather than provide an teractions as needed. Third, the PDs completed an authentic ethogram of teaching behaviors. Classroom dynamics are in- teaching experience. Ideally, the experience was teaching herent to the teaching approach used (e.g., teacher centered one or more entire course(s); for many PDs, however, op- vs. learner centered) regardless of the topics studied during portunities were only available to teach one unit or a few a given class period. Second, we considered the availability lectures of a course. In cohorts 1 and 2, 53% (74 of 140) and of comparative data from other professional development 67% (103 of 154) of the teaching experiences that occurred projects as a means of increasing the rigor of our project were full courses, respectively. evaluation process (Hill et al., 2013). Third, the instrument had to it within the design of the FIRST IV project (e.g., time Assessing Teaching eficient given the large number of videos). Accordingly, we chose the Reformed Teaching Observation Protocol (RTOP). 
To determine the effectiveness of FIRST IV for training learn- The RTOP is a validated observational instrument designed er-centered teachers, we used a mixed-methods approach to measure the degree to which classroom instruction uses (Creswell and Clark, 2007) that incorporated: 1) PDs’ per- “reformed teaching” as deined by Sawada et  al. (2002). ceptions of their teaching, 2) rating of PD teaching based on The RTOP focuses on the nature of student learning and independent observations of teaching videos, 3) rating of student–student and student–faculty interactions and is teaching videos obtained from non-FIRST faculty, and 4) the aligned with the theoretical underpinnings of constructivist contents of assessments used by the PDs when teaching. literature about teaching and learning (Piburn et  al., 2000; MacIsaac and Falconer, 2002; Sawada et al., 2002; Marshall Participants’ Perceptions about Teaching et  al., 2011). It is a highly reliable instrument in terms of We characterized the PDs’ beliefs about their own teach- item reliability and interrater reliability across institutions ing using the Approaches to Teaching Inventory 22 (ATI; and instructors (Marshall et  al., 2011; Amrein-Beardsley Trigwell and Prosser, 2004; Trigwell et  al., 2005) and and Osborn Popp, 2012) with strong predictive validity for Vol. 14, Summer 2015 14:ar22, 3 Downloaded from http://www.lifescied.org/ by guest on June 5, 2015 D. Ebert-May et al. calibration video was withheld from the reviewers to ensure Table 1. Scoring categories of the RTOPa that the review was conducted similarly to any other video. Typical Interrater reliability of the reviewer group was measured RTOP RTOP by calculating the cumulative monthly ICC each month. 
If category score Type of teaching a reviewer had a video score that was an outlier for two cal- ibration video recordings, then the reviewer was asked to I 0–30 Straight lecture reexamine the recordings and their ratings in light of scores II 31–45 Lecture with some demonstration and minor student participation provided by the other reviewers of that video. The average III 46–60 Signiicant student engagement with ICC for the total review period was 0.71 (range: 0.46–0.85). some minds-on as well as hands-on There was no signiicant change in ICC score over the 17-mo involvement video-review period (linear regression, r2 = 0.017, p > 0.05). IV 61–75 Active student participation in the Scores on the two videos submitted by a PD were averaged critique as well as the carrying out of to obtain a inal total RTOP score and subscores for each PD. experiments There was no signiicant difference in total RTOP scores for V 76+ Active student involvement in open- the two videos submitted by the PDs (paired t-test, p > 0.05). ended inquiry resulting in alternative hypotheses, several explanations, and For comparison purposes, we also obtained teaching vid- critical relection eos from 20 biology faculty who were not associated with a speciic professional development program, each at a dif- a Adapted from Sawada (2003). ferent institution, during 2011–2013. These faculty were re- cruited by FIRST IV PDs who were now in faculty positions and are hereafter referred to as “comparison faculty” (CF). student achievement (Falconer et  al., 2001; Lawson et  al., When recruiting CF, the PDs sought faculty with teaching 2002; Bowling et al., 2008). The RTOP has been used to assess experience similar to their own; speciically, junior faculty the effectiveness of a variety of professional development who taught an introductory-level biology course. The teach- programs (Adamson et al., 2003; Addy and Blanchard, 2010; ing approach used by the CF was not a criterion for selection. 
Ebert-May et al., 2011). Also, comparative RTOP data were Background information was obtained from all CF at the be- available from two previous faculty professional develop- ginning of their semesters of participation. All but three of ment programs. the faculty had less than 6 yr of teaching experience. Most Total score on the RTOP indicates the degree of learn- of the CF (65%) reported no participation in faculty profes- er-centered instruction and student involvement in a class sional development programs in the prior 2 yr. Those who session (Sawada et al., 2002). The total score is obtained by did engage in professional development reported activities summing subscores for each of ive subcategories. The total such as attending education conferences and workshops, score is classiied into one of ive categories in which catego- participating in a faculty learning community, working with ries I and II represent teacher-centered classrooms and cate- a teaching mentor, and participating in a national training gories III–V represent classrooms that are learner-centered to program. Each of the CF, the majority (78%) of whom taught varying degrees (Sawada, 2003; Table 1). Details of the RTOP an introductory-level biology course, submitted video re- subcategories and score interpretations are explained fully in cordings for at least two class sessions. The videos were re- Budd et al. (2013). viewed as part of the pool of recordings described earlier. We trained and calibrated biology education experts in the use of RTOP. During the initial calibration, all potential re- viewers (n = 18) watched a set of eight to 14 videos, followed Participants’ Assessments of Learning by discussion of their RTOP scoring. 
On completion of the An important component of the FIRST IV training was initial calibration, reviewers who had an intraclass correla- learning to design assessments that aligned with the types tion coeficient (ICC; Gwet, 2010) of at least 0.7 and the time of learning students practiced during a course, for exam- to commit to the video-review process, were selected as the ple, higher-order cognitive thinking and constructivist inal pool of reviewers (n = 13). Four of the experts were as- learning through cooperative work and active engagement. sociated with a former FIRST project, two were project PDs, We evaluated the progress of PDs in their design of assess- and the remainder were external to the FIRST IV program. ments by determining the level of cognitive skills targeted Review of videos was not initiated until year 3 of the proj- in their high-stakes assessments (i.e., exams and quizzes) ect, so a pool that included videos from both PD cohorts when teaching an entire course. We used Bloom’s taxonomy was available. Each reviewer was assigned four to eight (1956) to classify the cognitive skills assessed by each quiz/ randomly selected videos to review each month. Each video exam question used by PDs when teaching an entire course. (n = 489) was reviewed by two experts who did not know Cognitive skill categories consist of six levels that repre- the PD, the cohort, or when the video was recorded. If the sent a continuum from simple to complex cognitive tasks: two reviewer scores for a video did not fall within the same 1) knowledge, 2) comprehension, 3) application, 4) analysis, RTOP category or lay at opposite ends of a single category 5) synthesis, and 6) evaluation. 
The irst two categories can (Table 1), then additional reviews were conducted by new be considered to describe lower-order cognitive skills and the reviewers until a majority of reviews yielded similar scores latter four categories to describe higher-order cognitive skills (< 8 points apart within one category). Outlier scores were (Anderson and Krathwohl, 2001). Assessment items were as- not included in the inal average RTOP score for that video. signed cognitive skill levels by two independent raters who One randomly selected video was assigned to all review- had achieved a Cohen’s kappa of 0.87 (n = 188 assessments). ers each month for calibration purposes. The identity of the We determined the percent of points on each quiz or exam 14:ar22, 4 CBE—Life Sciences Education Downloaded from http://www.lifescied.org/ by guest on June 5, 2015 Development of Learner-Centered Teachers that was assigned in each Bloom’s category (e.g., [25 points/ Table 2. Demographic and background characteristics of the two 80 points total] × 100) and averaged the values within each Bloom’s category for all assessments used in each course. If cohorts of FIRST IV postdoctoral scholars a PD (n = 57) taught the same course more than once, then Cohort 1 Cohort 2 the scores for all assessments for that course were averaged Demographic/background variable (n = 93) (n = 97) (seven of 57 PDs taught the same course two or more times). Gender ratio (F:M) 1.7:1 2.1:1 Home institution type (research: 2.7:1 1.7:1 Statistical Analyses non-research)a The surveys used to assess faculty approaches to their teach- Prior TA experience 84% 83% ing were based on a ive-point Likert scale. The data were Prior instructor experience 37% 56% treated as ordinal, and the statistical analyses were con- Prior professional development 25% 29% ducted using nonparametric tests (Roberson et al., 1995). 
To activity (discussion group, workshop, longer-term program) characterize the PDs’ approaches to teaching, we tested for differences between the subscales (CCSF and ITTF) using a Based on Carnegie classiications. Wilcoxon signed-rank tests. To determine whether there was signiicant change in the PDs’ approaches to teaching over the course of their training, we used mixed linear analyses, centered teaching, compared with traditional informa- with instructor as a random effect, and paired t-tests of pre- tion-transfer, teacher-centered instruction during the FIRST and postscores on the ATI, with prescores obtained during IV project. On average, the PDs reported signiicantly higher the initial workshop as the PDs developed the courses they ratings on the CCSF scale (mean = 3.87 ± 0.04) of the ATI, would teach the following year. Signiicant gains in the PDs’ compared with the ITTF scale (mean = 3.28 ± 0.04) when knowledge and irsthand experience with active-learning teaching a full course (n = 190 courses; Wilcoxon signed-rank pedagogy and teaching strategies were determined by sub- test, p < 0.0001). On the ATI, participants reported the extent tracting the prerating from the postrating and testing the re- to which each survey item was true in their speciic course sulting difference using Wilcoxon signed-rank tests. on a ive-point Likert scale ranging from “only rarely” to Using a chi-square test with Yates correction, we com- “almost always.” A large majority of participants (91%) who pared the frequency of scores in the ive RTOP categories taught a complete course had a mean score above 3 for the (Table 1) for PDs who taught entire courses with those who CCSF survey items, compared with 67% for the ITTF items. taught part of a course. Differences in mean RTOP scores for Also, 74% of the participants scored higher on the CCSF PDs who taught an entire course and those who taught part compared with the ITTF scale. 
of a course were analyzed using a t-test after testing the data We tested for change in ATI score during participation in for normality. We also tested for an effect of course level and FIRST IV. There was no signiicant main effect of time on enrollment using regression analysis. Course level was con- mean CCSF or ITTF score for participants who taught one verted to a dummy variable with two categories, lower level or more complete courses (n = 190) from when they irst en- (100–200 level) and upper level (≥300 level courses). tered the project to their last teaching experience (mixed lin- To analyze the distribution of points assigned to each ear analysis, p > 0.05). Because results of the ATI are speciic Bloom’s category on the tests and quizzes used by the PDs, to the course being taught, we also examined ATI scores for we compared the distribution for lower-level courses (100– PDs (n = 28) who completed the ATI three times for the same 200 level) with that of higher-level courses (300–400) using course; that is, before teaching the course, at the end of their the Kolmogorov-Smirnov two-sample test. The alignment irst time teaching the course, and at the end of their second between teaching practice, as measured by RTOP score, and experience teaching that same course. Before teaching the the cognitive skills assessed by PDs, as determined by mean course, the PDs gave nearly identical ratings for their sup- Bloom’s score, was analyzed using Spearman’s correlation port and use of CCSF and ITTF approaches. At the end of the coeficient. We used SAS version 9.3, release TS1M2 (SAS second teaching experience, support for the two approaches Institute, Cary, NC) for all statistical analyses. Data are pre- differed signiicantly (paired t-test, p = 0.002), with support sented as arithmetic means ± 1 SE. Statistical signiicance for ITTF approaches decreasing over time by 0.28 ± 0.11 was determined as p ≤ 0.05. 
All protocols used in the FIRST (paired t-test, p = 0.019; effect size = 0.49), and support for project were approved by the Michigan State University In- CCSF approaches increasing by 0.27 ± 0.15, although not sig- stitutional Review Board (IRB X08-550 exempt, category 2). niicantly (paired t-test, p = 0.081; effect size = 0.38). Participants’ Teaching Practice RESULTS The PDs also reported signiicant gains in their knowledge and irsthand experience with active-learning pedagogy Participants and Their Beliefs about Teaching and teaching strategies (Figure 1). Using data from PDs who Participants in the two FIRST IV cohorts had similar demo- completed both the pre- and postsurveys (n = 130), we found graphics and teaching backgrounds, except that more of the signiicant gains in irsthand experience in each of the ive cohort 2 PDs had experience as an adjunct/lecturer/instruc- areas of pedagogy (Wilcoxon one-sample signed-rank test, tor and more were from a non–research intensive institution p < 0.0001). Greatest gains in experience occurred in the ar- compared with cohort 1 PDs (Table 2). eas of course/curriculum development and assessment. The Results from self-reported data indicated that the PDs gains for knowledge in the ive areas of pedagogy paral- placed a greater emphasis on concept-focused, learner- leled the gains in irsthand experience. Signiicant gains in Vol. 14, Summer 2015 14:ar22, 5 Downloaded from http://www.lifescied.org/ by guest on June 5, 2015 D. Ebert-May et al. Figure 2. The frequency distribution of RTOP scores for PDs who taught an entire course and those who taught a partial course during their participation in FIRST IV. centered RTOP category more frequently compared with those teaching an entire course (df = 4, Χ2 = 44.6, p < 0.0001). 
experience with five active-learning teaching strategies also occurred (Wilcoxon one-sample signed-rank test, p < 0.01; Figure 1). The greatest gains were in experience with cooperative/collaborative learning and case studies.

Figure 1. FIRST IV participants reported gains in firsthand experience with different dimensions of active-learning pedagogy (top: TI = technology instruction, CCD = course/curriculum development, AS = assessment, BER = biology education reform, TL = theories of learning) and strategies (bottom: IBL = inquiry-based laboratories, PBL = problem-based learning, CCL = cooperative/collaborative learning, IBFP = inquiry-based field projects, TP = teaching portfolios, CS = case studies). All responses were based on a five-point Likert-type scale, with 5 as the highest rating and 1 the lowest. Error bars represent the SEs.

Expert reviews of the PDs' teaching videos provided independent evidence that the majority of PDs used transformed teaching practices. Almost all (86%) of the PDs who taught an entire course had total scores in RTOP categories III–V (mean = 54.1 ± 0.6; Figure 2), exhibiting significant student engagement and reformed teaching (MacIsaac and Falconer, 2002). In addition, the RTOP scores were aligned with the PDs' self-reported beliefs about their teaching in those same courses. The PDs' self-ratings on the ATI for the CCSF subscale were positively correlated with total RTOP score (Spearman r = 0.43, p < 0.0001; n = 99), while self-ratings for the ITTF subscale were negatively correlated (Spearman r = −0.39, p < 0.0001; n = 99).

Teaching only part of a course had an adverse effect on RTOP scores. The PDs who taught a partial course had scores in a teacher-centered RTOP category. The PDs who taught part of a course rather than an entire course also had significantly lower RTOP scores on average (mean = 43.7 ± 1.0 and 49.6 ± 0.7, respectively; t-test, p < 0.0001; effect size = 0.31; Figure 2). Course level and enrollment had no significant effect on RTOP score (F(2, 146) = 1.51, p > 0.05).

The RTOP scores of PDs differed from those of a sample of faculty who had not participated in FIRST IV (mixed analysis of variance, p < 0.0001). Less than 30% of the CF had scores in RTOP categories that indicated use of learner-centered teaching (Figure 3). The lower total RTOP scores of CF resulted from significantly lower scores for all subscales except propositional knowledge, compared with FIRST IV PDs (Table 3; mixed analyses of variance, p < 0.0001).

Figure 3. Mean total RTOP score (±1 SE) for participants of three different professional development programs and a group of CF who did not participate in any of the three programs. Video recordings for SI and FIRST II were made after completion of professional development by the participants. Those for FIRST IV were from PDs who taught an entire course during the final year of professional development. Histogram bars with the same letter show means that are not statistically different from each other (mixed analysis of variance, p < 0.0001).

Table 3. Mean (±1 SE) scores on the five subscales of the RTOP for teaching videos of participants in three faculty professional development programs and comparison faculty (CF)

Subscale                      CF (n = 20)     SI faculty^a (n = 37)   FIRST II faculty^a (n = 37)   FIRST IV PDs (n = 145)
Lesson design                 5.92 ± 0.59     5.40 ± 0.42             6.31 ± 0.56                   9.01 ± 0.20
Propositional knowledge       13.94 ± 0.36    13.11 ± 0.25            12.59 ± 0.46                  13.87 ± 0.12
Procedural knowledge          4.10 ± 0.48     4.20 ± 0.34             4.94 ± 0.50                   7.26 ± 0.18
Communicative interactions    6.62 ± 0.40     5.57 ± 0.42             6.93 ± 0.54                   9.49 ± 0.18
Student–teacher interaction   6.79 ± 0.52     6.09 ± 0.45             7.56 ± 0.69                   9.99 ± 0.18

^a Data were obtained as described in Ebert-May et al. (2011).
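The Spearman rank correlations reported above (e.g., between ATI CCSF self-ratings and total RTOP score) are Pearson correlations computed on average ranks. The sketch below is an illustration only: it uses the Python standard library and invented toy data, and the helper names (`ranks`, `spearman`) are hypothetical, not code from the FIRST IV study.

```python
from statistics import mean

def ranks(xs):
    """Return 1-based average ranks; tied values share the mean of their positions."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        # Extend j over the run of values tied with xs[order[i]].
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # mean of the 1-based positions i..j
        for k in order[i:j + 1]:
            r[k] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rho = Pearson correlation of the two rank vectors."""
    rx, ry = ranks(x), ranks(y)
    mx, my = mean(rx), mean(ry)
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    vx = sum((a - mx) ** 2 for a in rx)
    vy = sum((b - my) ** 2 for b in ry)
    return cov / (vx * vy) ** 0.5

# Invented paired course-level values (CCSF self-rating, total RTOP score):
ccsf = [3.2, 4.1, 3.8, 4.5, 3.0, 4.8]
rtop = [41, 55, 47, 60, 38, 63]
print(round(spearman(ccsf, rtop), 2))  # → 1.0 (identical rank orderings in the toy data)
```

In practice a library routine such as scipy.stats.spearmanr would be used (it also supplies the p-value); the hand-rolled version only makes the rank-then-correlate logic explicit.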
Participants' Assessments of Learning

In addition to analyzing teaching practices, we also determined the Bloom's level of assessments (n = 188) used by the PDs. The primary foci of the quiz and test questions used by the PDs who taught an entire course (n = 57) were knowledge and comprehension, which represent lower-order cognitive skills (Bloom, 1956). Twelve percent of the quiz and exam points, on average, were allocated to assessing higher-order thinking (Figure 4). There were no significant differences in the distribution of points assigned within any Bloom's level between upper- and lower-level courses (Kolmogorov-Smirnov two-sample test, p > 0.05). In addition, there was no significant relationship between the teaching practices of PDs based on RTOP score and the corresponding mean Bloom's score for the assessments used (Spearman correlation coefficient, r = 0.22, p > 0.05).

Figure 4. Mean percentage of assessment points per course (n = 57) categorized into each Bloom's category for PDs who taught an entire course.

DISCUSSION

We documented the teaching practices of postdoctoral fellows who participated in the FIRST IV professional development program. The results supported our predictions about participant beliefs and teaching practices but not design of assessments. We use our results, from both direct observation and surveys, to identify best practices for the design and implementation of effective professional development programs, models of which are missing from the literature (Henderson et al., 2011; Hill et al., 2013; Wilson, 2013).

Evidence of Effectiveness

We predicted that, if the FIRST IV program was effective, then the PDs would implement learner-centered teaching practices in the classroom and teach in ways that were different from peers who had not completed the FIRST IV program. Our data supported this prediction.

Comparison of actual teaching practices by the PDs with those of participants in other professional development programs is difficult, because approaches to program assessment vary and published results typically rely on self-reported data rather than independent evaluation by external experts (Hill et al., 2013). We used the RTOP to evaluate teaching by the PDs, in part because we could compare the results with RTOP scores from the CF and faculty from two prior faculty professional development programs, SI (Pfund et al., 2009) and FIRST II. We originally reported data from SI and FIRST II faculty in Ebert-May et al. (2011). Baseline data about the CF and participants in the professional development programs and the courses they taught are presented in Table 4. The emphasis of FIRST II and SI on development of experienced faculty is reflected in their participants' significantly greater number of years of teaching experience compared with those in the FIRST IV and CF groups. The other notable difference among groups was the larger class size, on average, for SI faculty. The frequency of males and females and of courses taught at the introductory level did not differ significantly among the groups (χ², p > 0.05). There were no differences in the self-reported knowledge of active-learning pedagogies or firsthand experience with active learning (mixed analysis of variance, p > 0.05).

Table 4. Characteristics of the participants in three professional development programs (SI = Summer Institutes, FIRST = Faculty Institutes for Reforming Science Teaching) and a comparison faculty (CF) group^a

                       Participants    Teaching experience^b   Active-learning knowledge   Active-learning experience   Course level     Class size
Group       n          Female (%)      Mean (yr)    SE         Mean     SE                 Mean     SE                  % Introductory   % Large (>75 students)
SI          39         47              14.5†        1.6        39.7     1.7                35.6     1.5                 91               80†
FIRST II    38         56              11.6†        1.2        38.5     1.5                35.5     1.2                 79               37*
FIRST IV    71         66              1.8*         0.14       42.1     1.2                40.2     1.2                 67               17*
CF          20         47              3.9*         1.0        34.6     2.6                34.9     2.3                 64               32*

^a Knowledge of and experience with active learning were determined from survey data. Introductory courses were 100- and 200-level courses.
^b Values in a column that share the same symbol (*, †) and values in columns without symbols are not significantly different from one another (generalized linear model, p > 0.05).

Smith et al. (2014) suggested that, when participants in a professional development program are asked to provide self-reported information about their teaching, they may feel pressured to provide positive responses as feedback to the program leaders. Such a response could contribute to the lack of agreement between self-reported data and data from external reviewers about the teaching practices of faculty (e.g., Ebert-May et al., 2011). If pressure to meet program expectations was a significant factor, then we expected that the self-reported data from faculty in the professional development programs would be high and data from the CF would be significantly lower. Instead, we found that all faculty, whether in a professional development program or not, perceived themselves as having equally high levels of experience with active-learning teaching strategies (Table 4).

The procedure used for obtaining and reviewing recordings of class sessions was the same for all groups. Specifically, participants chose at least two class sessions to record when teaching a complete course. The SI and FIRST II faculty provided recordings of courses taught after completion of their professional development program. The recordings were deidentified and reviewed by at least two experts who were trained and calibrated in the use of the RTOP and who did not know the instructor in the recordings. Any biases by participants in providing recordings, such as selecting class sessions that were closely aligned with pedagogy learned in a professional development program, would likely be similar among the groups. As for the videos from the FIRST IV PDs, there was no statistically significant difference in RTOP scores for the first compared with the second video submitted by participants in the SI or FIRST II programs or the CF (paired t-test, p > 0.05). Consistency in RTOP scores from class to class, and independence of RTOP score from the topic taught in a class session, were also reported by Budd et al. (2013) for most instructors. Our comparison here focuses solely on teaching practices after completion of a professional development program, because baseline data documenting the teaching practices of participants before professional development were not available for any of these programs.

The effectiveness of the FIRST IV program was striking when compared with other professional development programs and faculty with no FIRST IV training. The average RTOP score was significantly greater for FIRST IV participants compared with those of the FIRST II, SI, and CF groups (Figure 3; mixed analysis of variance, p < 0.0001). Three-fourths (74%; Figure 5) of the FIRST IV PDs who taught an entire course implemented learner-centered teaching (i.e., RTOP categories III–V) during their second year of professional development, based on RTOP scores assigned by expert raters. In contrast, less than one-third of the video-recorded faculty participants in the FIRST II or SI programs had a total RTOP score within a learner-centered RTOP category (i.e., categories III–V). The same was true of the CF, of whom <30% had RTOP scores that indicated use of learner-centered teaching (Figure 5). The scores of FIRST IV PDs on four of the five subscales of the RTOP were 37–77% greater than those of the FIRST II, SI, and CF groups (Table 3). Higher scores by the FIRST IV PDs on the four subscales resulted from their greater engagement of students in the learning process (e.g., making and testing predictions, creating and using models, and communicating their ideas to others; Budd et al., 2013).

Figure 5. Frequency of RTOP scores among the five RTOP categories (Table 1) for participants of three different professional development programs and a group of CF who did not participate in any of the three programs.

Situational differences among the groups could have marked influences on RTOP scores. Therefore, we tested for effects of the variables in Table 4 on total RTOP scores, excluding "knowledge of active-learning pedagogy" because of its high correlation with "experience with active learning." The most influential variable on RTOP score by far was "group" (i.e., professional development program or lack thereof; general linear model, partial η² = 0.14), with the SI and CF groups having significantly lower RTOP scores on average compared with FIRST II and IV faculty (general linear model, p = 0.002). The only other variable of statistical significance was class size (general linear model, p = 0.033; partial η² = 0.05). Faculty teaching larger courses had lower RTOP scores in general compared with faculty teaching smaller courses. Thus, the relatively low RTOP scores of SI faculty could be attributable, in part, to the preponderance of large courses they taught (Table 4). When we controlled for variation in class size in the analysis of differences among the groups, however, the results were the same as without the covariate (Figure 3). Mean RTOP score for FIRST IV participants was still significantly greater than for the other three groups (analysis of covariance, p < 0.0001), and scores among the non–FIRST IV groups did not differ significantly from one another. It is worth noting that large class size does not prevent implementation of learner-centered courses (Kober, 2015). As pointed out by Budd et al. (2013), some faculty in their study who taught large courses had high RTOP scores. The same was true in our comparisons, in which seven faculty with courses of 100 to more than 200 students had total RTOP scores that were in a learner-centered category (score > 50). Gender of the professor, years and perception of teaching experience, and course level had no main effect on RTOP score.

The professional context of the participants' teaching in the four comparison groups might also have influenced their teaching practices. The PDs were employed in full-time research positions and also had the support of their principal investigators for teaching the course. The PDs, especially the 60% who taught an entire course, were in effect balancing research and teaching within their profession. However, since they were not in faculty positions for which teaching was a formal responsibility and part of annual evaluations, their interest in and willingness to implement transformed teaching were perhaps less constrained by traditional departmental expectations. Faculty in the other groups, especially those who were untenured (SI = 45%, FIRST II = 66%, CF = 90% untenured), might have felt more limited in their teaching options. Data from the background survey completed by the CF provided some insight into possible impacts of tenure. Specifically, the CF were asked to rate "the extent to which tenure-related issues pose a challenge as you implement an active-learning course." On average, the responses were between "not challenging" and "somewhat challenging" (mean = 2.5 ± 0.3, where 2 = not challenging and 3 = somewhat challenging). These results suggested that tenure, or the lack thereof, did not have a major influence on RTOP scores.

We also predicted that, if FIRST IV was effective, then the PDs would demonstrate belief in learner-centered approaches to teaching. The PDs believed that they increased their knowledge of and experience with learner-centered pedagogy and teaching strategies (Figure 1), based on self-reported data. These gains were typical of self-reported data from faculty following professional development (e.g., Light et al., 2009; Pfund et al., 2009; Ebert-May et al., 2011). Further evidence for the PDs' belief in learner-centered teaching was obtained from the ATI. When all courses were studied, the PDs reported greater use of CCSF compared with ITTF approaches to teaching in their classes. Similar outcomes from the ATI were reported for junior (Light et al., 2009) and international (Gibbs and Coffey, 2004) faculty after professional development, but not for their untrained control groups.

We expected that the PDs' responses on the ATI would change over time, with an increased emphasis on CCSF approaches. When we examined only courses that were taught at least twice by the same PD, the PDs reported equal support for the CCSF and ITTF scales the first time they taught. The mean score for these PDs on the CCSF scale (3.60 ± 0.14, n = 28) was similar to that reported for faculty teaching in other hard sciences, while the score for the ITTF scale (3.50 ± 0.12, n = 28) was somewhat higher than for other faculty in the hard sciences (Lindblom-Ylanne et al., 2006). Our results indicated an initial level of dissonance in the PDs' beliefs about how they could help students learn. A high score on both scales may result from a belief that conceptual change in students' thinking can be accomplished through information transfer only, an approach that is not supported by science education research (Prosser et al., 2003). The subsequent decrease in PD scores on the ITTF scale indicated that, with experience and professional development, the PDs realized that information accumulation did not lead to conceptual change in students' understanding. Prosser et al. (2003) suggested that such shifts in teaching approach require understanding of the students' experience of learning (i.e., a learner-centered approach) rather than the instructor's experience of teaching.

Our final prediction was that, if FIRST IV was effective, the PDs would design assessments aligned with the beliefs and practices of learner-centered teaching. Thus, the types of learning practiced by students during the course would be reflected on exams and quizzes, including an emphasis on assessing students' higher-order cognitive skills. During the FIRST IV workshops, the PDs learned to design assessments that revealed student proficiency with the types of thinking and skills students used in the classroom, using Bloom's taxonomy (Bloom, 1956) as a framework for levels of cognitive thinking. For example, the PDs practiced designing exam questions that incorporated conceptual model development and evaluation, argumentation, and data analysis (Crowe et al., 2008). Yet the majority of the assessments used by the PDs during their teaching experiences focused on questions and tasks that required lower-order cognition, despite engaging students in a variety of in-class activities that required higher-level thinking. The PDs' use of lower-level assessment questions agreed with findings by Momsen et al. (2010). These questions emphasize primarily factual and concrete concepts, excluding the integration of concepts and skills that aligns with a learner-centered classroom experience. One explanation for the use of lower-level questions could be the practical difficulties associated with grading extended-response questions that focus on higher-level cognitive skills (e.g., synthesis, evaluation) in large courses. Courses taught by the PDs ranged in size from 10 to 205 students (mean = 48), yet there was only a marginally significant negative correlation between class size and mean Bloom's score on assessments (Pearson's correlation, r = −0.26, p = 0.05). Momsen et al. (2010) found no relationship between course size and the cognitive level of assessments. The alignment between the training the PDs received and the assessments they used could potentially be improved by adding more peer and expert mentoring around assessment development and by analyzing assessment items during workshops.

Contributions to Future Professional Development Programs

What components of the FIRST IV program led to a large majority of participants implementing learner-centered teaching practices? This question is difficult to answer without controlled studies in which specific program components are investigated. We offer here our best insights based on knowledge of previous professional development programs and their outcomes and on feedback from the FIRST IV participants.

An obvious and unique component of FIRST IV was its focus on postdoctoral scholars rather than faculty with experience in teaching. Based on outcomes from the FIRST II project, less-experienced instructors were expected to more readily learn and adopt nontraditional teaching practices (Gibbs and Coffey, 2004; Ebert-May et al., 2011). In fact, that expectation was consistent with our results (Figures 3 and 5).
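The assessment analysis under "Participants' Assessments of Learning" reduces each course's quizzes and exams to the percentage of points allocated per Bloom's level and a points-weighted mean Bloom's score. A minimal sketch of that bookkeeping follows; the item data and the `bloom_summary` helper are invented for illustration and are not the study's instrument.

```python
def bloom_summary(items):
    """items: (Bloom's level 1-6, points) pairs for one course's quizzes/exams.

    Returns (percentage of points per level, percentage at higher-order
    levels 3-6, points-weighted mean Bloom's score).
    """
    total = sum(pts for _, pts in items)
    pct = {}
    for level, pts in items:
        pct[level] = pct.get(level, 0.0) + 100.0 * pts / total
    higher = sum(p for level, p in pct.items() if level >= 3)  # Bloom's 3-6
    mean_bloom = sum(level * pts for level, pts in items) / total
    return pct, higher, mean_bloom

# Hypothetical 50-point exam weighted toward knowledge/comprehension (levels 1-2):
pct, higher, mean_bloom = bloom_summary([(1, 10), (2, 20), (2, 10), (4, 5), (5, 5)])
print(f"{higher:.0f}% of points at Bloom's 3-6; mean Bloom's score {mean_bloom:.1f}")
# → 20% of points at Bloom's 3-6; mean Bloom's score 2.3
```

Per-course values of this kind are what the distributional (Kolmogorov-Smirnov) and correlation analyses reported earlier would operate on.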
The PDs chose to participate in FIRST IV, suggesting that they considered teaching a key component of their professional identities and balance (Rybarczyk et al., 2011; Brownell and Tanner, 2012).

The second key component was reflection by the PDs on their understanding of transformed teaching. Reflection was enabled during the iterative process of learning new pedagogical strategies during workshops, implementing the new strategies in courses, and reflecting on teaching with mentors and peers that occurred over 2 yr. Given the effectiveness of visual review of teaching practices (e.g., Baecher et al., 2013; Osborne et al., 2013), the reflection activities specifically included mentored review of video exemplars, videos of other PDs, and the participants' own teaching. The PDs' interactions with their mentors included examination of formative feedback from teaching videos, discussions about course design, students' feedback, and formal self-reflection before the second annual workshop. Critical reflection and dialogue with mentors may have enabled the PDs to develop congruence between their beliefs about teaching and subsequent classroom practices (Guskey, 2000; McAlpine and Weston, 2000; Sandretto et al., 2002; Wlodarsky, 2005; Hatzipanagos and Lygo-Baker, 2006).

A third difference from most professional development events was our focus on the PDs developing an entire course rather than a smaller piece of instruction or individual teaching tools (e.g., case study, clickers). The basis for PDs' development of a course was to gain experience in creating course goals and objectives, designing assessments, and selecting course activities. By doing this, the PDs worked within a learner-centered course framework that enabled them to further develop, modify, and add lessons over time. In FIRST II and SI, the faculty developed pieces of instruction rather than a framework for an entire course, and their RTOP scores were significantly lower than those of the FIRST IV PDs (Figure 3). We assert that this holistic approach to course design was more effective than developing a single lesson of learner-centered instruction to insert into the framework of a traditional course.

As schools, colleges, universities, and funding agencies continue to address the need for transformation of the STEM classroom experience through current and future faculty professional development, the consideration of professional development program design features is critical to improving the returns on time and funds invested (Hill et al., 2013; Wilson, 2013). Results from FIRST IV are consistent with prior research on professional development (e.g., Osborne et al., 2013), suggesting that duration (e.g., Desimone et al., 2002) of professional development and practice is key to successful future programs. This means that participants in programs with education objectives must teach a full course, or at minimum the majority of a course, and teach more than once. Furthermore, the participants' teaching experiences must be paired with expert feedback, ideally from a mentor who is constructivist minded. Both the mentor and mentee need to make judicious use of teaching observations and review of course materials combined with an iterative process of reflection by the mentee. For all practical purposes, professional development is all about implementation and feedback, and we need to construct professional "learning" opportunities in workshops that genuinely model what we do in the STEM classroom itself.

ACKNOWLEDGMENTS

The research was funded by the National Science Foundation under Division of Undergraduate Education Award 08172224 to D.E.-M. and T.L.D. We are appreciative of the postdoctoral scholars who participated in the FIRST IV program. We are deeply indebted to the regional team leaders (Stephanie Aamodt, Janet Batzli, Marguarite Brickman, Elizabeth Derryberry, Clarissa Dirks, Christopher Finelli, Janet Hodder, Jenny Knight, Debora Linton, Tammy Long, Marcy Osgood, Emily Rauschert, Courtney Richmond, Alison Roark, Christopher Tubbs, Kathy Williams, and Michelle Withers) for implementing the training workshops and mentoring the postdocs and to the experts who provided many hours of work reviewing video recordings. We extend our thanks to Sarah Jardeleza, Rachel Nye, Matt Berry, Dan Totzkay, Alec Aiello, and Gregory Moyerbrailean for their assistance in implementing the project and to Chris Mecklin for his statistical advice.

REFERENCES

Adamson SL, Banks D, Burtch M, Cox F III, Judson E, Turley JB, Benford R, Lawson AE (2003). Reformed undergraduate instruction and its subsequent impact on secondary school teaching practice and student achievement. J Res Sci Teach 40, 939–957.

Addy TM, Blanchard MR (2010). The problem with reform from the bottom up: instructional practices and teacher beliefs of graduate teaching assistants following a reform-minded university teacher certificate programme. Int J Sci Educ 32, 1045–1071.

American Association for the Advancement of Science (2011). Vision and Change in Undergraduate Biology Education: A Call to Action, Final Report, Washington, DC.

Amrein-Beardsley A, Osborn Popp SE (2012). Peer observations among faculty in a college of education: investigating the summative and formative uses of the Reformed Teaching Observation Protocol (RTOP). Educ Assess Eval Account 24, 5–24.

Amundsen C, Wilson M (2012). Are we asking the right questions? A conceptual review of the educational development literature in higher education. Rev Educ Res 82, 90–126.

Anderson LW, Krathwohl DR (2001). A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom's Taxonomy of Educational Objectives, complete ed., New York: Longman.

Anderson WA, Banerjee U, Drennan CL, Elgin SCR, Epstein IR, Handelsman J, Hatfull GF, Losick R, O'Dowd DK, Olivera BM, et al. (2011). Changing the culture of science education at research universities. Science 331, 152–153.

Association of American Universities (2014). Undergraduate STEM Education Initiative. https://stemedhub.org/groups/aau (accessed 1 October 2014).

Austin AE, Connolly M, Colbeck CL (2008). Strategies for preparing integrated faculty: the Center for the Integration of Research, Teaching, and Learning. In: Educating Integrated Professionals: Theory and Practice on Preparation for the Professoriate, New Directions for Teaching and Learning, ed. CL Colbeck, KA O'Meara, and AE Austin, San Francisco: Jossey-Bass, 69–81.

Baecher L, Kung S-C, Jewkes AM, Ro C (2013). The role of video for self-evaluation in early field experiences. Teach Teacher Educ 36, 189–197.

Bloom BS (1956). Taxonomy of Educational Objectives: The Classification of Educational Goals, New York: McKay.

Bowling BV, Huether CA, Wang L, Myers MF, Markle GC, Dean GE, Acra EE, Wray FP, Jacob GA (2008). Genetic literacy of undergraduate non-science majors and the impact of introductory biology and genetics courses. BioScience 58, 654–660.

Brownell SE, Tanner KD (2012). Barriers to faculty pedagogical change: lack of training, time, incentives, and … tensions with professional identity? CBE Life Sci Educ 11, 339–346.

Budd DA, Kraft KJ, McConnell DA, Vislova T (2013). Characterizing teaching in introductory geology courses: measuring classroom practices. J Geosci Educ 61, 461–475.

Connolly M, Millar S (2006). Using workshops to improve instruction in STEM courses. Metropolitan Universities 17, 453–465.

Creswell JW, Clark VLP (2007). Designing and Conducting Mixed Methods Research, Thousand Oaks, CA: Sage.

Crowe A, Dirks C, Wenderoth MP (2008). Biology in Bloom: implementing Bloom's taxonomy to enhance student learning in biology. CBE Life Sci Educ 7, 368–381.

Desimone L, Porter A, Garet M, Yoon K, Birman B (2002). Effects of professional development on teachers' instruction: results from a three-year longitudinal study. Educ Eval Policy Anal 24, 81–112.

Ebert-May D, Derting TL, Hodder J, Momsen JL, Long TM, Jardeleza SE (2011). What we say is not what we do: effective evaluation of faculty professional development programs. BioScience 61, 550–558.

Emerson JD, Mosteller F (2000). Development programs for college faculty: preparing for the twenty-first century. In: Educational Media and Technology Yearbook, vol. 25, ed. RM Branch and MA Fitzgerald, Englewood, CO: Libraries Unlimited, 26–42.

Falconer K, Wyckoff S, Joshua M, Sawada D (2001). Effect of reformed courses in physics and physical science on student conceptual understanding. Paper presented at the Annual Conference of the American Educational Research Association, held 13 April 2001, in Seattle, WA.

Feldman A (2000). Decision making in the practical domain: a model of practical conceptual change. Sci Educ 84, 606–623.

Garet MS, Porter AC, Desimone L, Birman BF, Yoon KS (2001). What makes professional development effective? Results from a national sample of teachers. Am Educ Res J 38, 915–945.

Gibbs G, Coffey M (2004). The impact of training of university teachers on their teaching skills, their approach to teaching and the approach to learning of their students. Active Learn Higher Educ 5, 87–100.

Guskey TR (2000). Evaluating Professional Development, Thousand Oaks, CA: Corwin.

Gwet KL (2010). How to Compute Intraclass Correlation with MS EXCEL: A Practical Guide to Inter-Rater Reliability Assessment for Quantitative Data, Gaithersburg, MD: Advanced Analytics.

Handelsman J, Ebert-May D, Beichner R, Bruns P, Chang A, DeHaan R, Gentile J, Lauffer S, Stewart J, Tilghman SM, et al. (2004). Scientific teaching. Science 304, 521–522.

Handelsman J, Miller S, Pfund C (2006). Scientific Teaching, New York: Freeman.

Hatzipanagos S, Lygo-Baker S (2006). Teaching observations: promoting development through critical reflection. J Further Higher Educ 30, 421–431.

Henderson C (2008). Promoting instructional change in new faculty: an evaluation of the Physics and Astronomy New Faculty Workshop. Am J Phys 76, 179.

Henderson C, Beach A, Finkelstein N (2011). Facilitating change in undergraduate STEM instructional practices: an analytic review of the literature. J Res Sci Teach 48, 952–984.

Henderson C, Dancy M (2007). Barriers to the use of research-based instructional strategies: the influence of both individual and situational characteristics. Phys Rev ST Phys Educ Res 3, 020102.

Henderson C, Dancy M, Niewiadomska-Bugaj M (2012). Use of research-based instructional strategies in introductory physics: where do faculty leave the innovation-decision process? Phys Rev ST Phys Educ Res 8, 020104.

Hilborn RC (2012). The Role of Scientific Societies in STEM Faculty Workshop, Meeting Overview, Washington, DC: Council of Scientific Society Presidents/American Chemical Society.

Hill HC, Beisiegel M, Jacob R (2013). Professional development research: consensus, crossroads, and challenges. Educ Res 42, 476–487.

Ho A, Watkins D, Kelly M (2001). The conceptual change approach to improving teaching and learning: an evaluation of a Hong Kong staff development programme. Higher Educ 42, 143–169.

Hubball H, Collins J, Pratt D (2005). Enhancing reflective teaching practices: implications for faculty development programs. Can J High Educ 35, 357–381.

Kober L (2015). Reaching Students: What Research Says about Effective Instruction in Undergraduate Science and Engineering, Washington, DC: National Academies Press.

Lawson AE, Benford R, Bloom I, Carlson MP, Falconer KF, Hestenes DO, Judson E, Piburn MD, Sawada D, Turley J, et al. (2002). Evaluating college science and mathematics instruction: a reform effort that improves teaching skills. J Coll Sci Teach 31, 388–393.

Light G, Calkins S, Luna M, Drane D (2009). Assessing the impact of a year-long faculty development program on faculty approaches to teaching. Int J Teach Learn Higher Educ 20, 168–181.

Lindblom-Ylanne S, Trigwell K, Nevgi A, Ashwin P (2006). How approaches to teaching are affected by discipline and teaching context. Stud Higher Educ 31, 285–298.

MacIsaac D, Falconer K (2002). Reforming physics instruction via RTOP. Phys Teach 40, 16–22.

Manduca CA, Iverson E, McConnell DA, Bruckner M, Greenseid L, Macdonald RH, Tewksbury B, Mogk DW (2014). On the cutting edge: combining workshops and on-line resources to improve geoscience teaching. Paper presented at the Geological Society of America Annual Meeting, held 19–22 October 2014, in Vancouver, BC.

Manduca CA, Mogk DW, Tewksbury B, Macdonald RH, Fox SP, Iverson ER, Kirk K, McDaris J, Ormand C, Bruckner M (2010). SPORE: Science Prize for Online Resources in Education: On the Cutting Edge: teaching help for geoscience faculty. Science 327, 1095–1096.

Marshall JC, Smart J, Lotter C, Sirbu C (2011). Comparative analysis of two inquiry observational protocols: striving to better understand the quality of teacher-facilitated inquiry-based instruction. School Sci Math 111, 306–315.

McAlpine L, Weston CB (2000). Reflection: issues related to improving professors' teaching and students' learning. Instr Sci 28, 363–385.

Momsen JL, Long TM, Wyse SA, Ebert-May D (2010). Just the facts? Introductory undergraduate biology courses focus on low-level cognitive skills. CBE Life Sci Educ 9, 435–440.

Osborne J, Simon S, Christodoulou A, Howell-Richardson C, Richardson K (2013). Learning to argue: a study of four schools and their attempt to develop the use of argumentation as a common instructional practice and its impact on students. J Res Sci Teach 50, 315–347.

Pfund C, Manske B, Austin AE, Connolly M, Moore K, Mathieu R (2012). Advancing STEM undergraduate learning: preparing the nation's future faculty. Change 44, 64–72.

Pfund C, Miller S, Brenner K, Bruns P, Chang A, Ebert-May D, Fagen AP, Gentile J, Gossens S, Khan IM, et al. (2009). Summer Institute to improve university science teaching. Science 324, 470–471.

Piburn MD, Sawada D, Falconer K, Turley J, Benford R, Bloom I (2000). Reformed Teaching Observation Protocol (RTOP). http://physicsed.buffalostate.edu/AZTEC/RTOP/RTOP_full (accessed 11 March 2015).

Pintrich PR, Marx RW, Boyle RA (1993). Beyond cold conceptual change: the role of motivational beliefs and classroom contextual factors in the process of conceptual change. Rev Educ Res 63, 167–199.

Posner GJ, Strike KA, Hewson PW, Gertzog WA (1982). Accommodation of a scientific conception: toward a theory of conceptual change. Sci Educ 66, 211–227.

Postareff L, Lindblom-Ylanne S, Nevgi A (2007). The effect of pedagogical training on teaching in higher education. Teach Teacher Educ 23, 557–571.

President's Council of Advisors on Science and Technology (2012). Engage to Excel: Producing One Million Additional College Graduates with Degrees in Science, Technology, Engineering, and Mathematics. www.whitehouse.gov/sites/default/files/microsites/ostp/pcast-engage-to-excel-final_2-25-12.pdf (accessed 15 October 2014).

Prosser M, Ramsden P, Trigwell K, Martin E (2003). Dissonance in experience of teaching and its relation to the quality of student learning. Stud Higher Educ 28, 37–48.

Prosser M, Trigwell K (1997). Relations between perceptions of the teaching environment and approaches to teaching. Br J Educ Psychol 67, 25–35.

Roberson PK, Shema SJ, Mundfrom DJ, Holmes TM (1995). Analysis of paired Likert data: how to evaluate change and preference questions. Family Med 27, 671–675.

Rybarczyk B, Lerea L, Lund PK, Whittington D, Dykstra L (2011). Postdoctoral training aligned with the academic professoriate. BioScience 61, 699–705.

Sandretto S, Kane R, Heath C (2002). Making the tacit explicit: a teaching intervention programme for early career academics. Int J Acad Dev 7, 135.

Sawada D (2003). Reformed Teacher Education in Science and Mathematics: An Evaluation of the Arizona Collaborative for Excellence

Singer SR, Nielsen NR, Schweingruber HA (2012). Discipline-Based Education Research: Understanding and Improving Learning in Undergraduate Science and Engineering, Washington, DC: National Academies Press, 264.

Smith DJ, Valentine T (2012). The use and perceived effectiveness of instructional practices in two-year technical colleges. J Excell Coll Teach 23, 133–161.

Smith MK, Vinson EL, Smith JA, Lewin JD, Stetzer MR (2014). A campus-wide study of STEM courses: new perspectives on teaching practices and perceptions. CBE Life Sci Educ 13, 624–635.

Steinert Y, Mann K, Centeno A, Dolmans D, Spencer J, Gelula M, Prideaux D (2006). A systematic review of faculty development initiatives designed to improve teaching effectiveness in medical education: BEME Guide No. 8. Med Teach 28, 497–526.

Teasdale R, Budd D, Cervato C, Iverson E, Kraft KJVDH, Manduca C, McConnell DA, McDaris JR, Murray DP, Slattery W (2011). Enhancing student-centered teaching practices: approaches developed on the new Cutting Edge Geosciences RTOP website. Paper presented at the Geological Society of America Annual Meeting, held 9–12 October 2011, in Minneapolis, MN.

Trigwell K, Prosser M (2004). Development and use of the approaches to teaching inventory. Educ Psychol Rev 16, 409–424.

Trigwell K, Prosser M, Ginns P (2005). Phenomenographic pedagogy and a revised approaches to teaching inventory. High Educ Res Dev 24, 349–360.

Weimer M (2002). Learner-Centered Teaching: Five Key Changes to Practice, San Francisco, CA: Jossey-Bass.
in the Preparation of Teachers, Arizona State University Document Production Services, Tempe. Weiss I, Montgomery D, Ridgway C, Bond S (1998). Local Systemic Change through Teacher Enhancement: Year Three Cross-Site Re- Sawada D, Piburn MD, Judson E, Turley J, Falconer K, Benford port, Chapel Hill, NC: Horizon Research. R, Bloom I (2002). Measuring reform practices in science and mathematics classrooms: the Reformed Teaching Observation Pro- Wilson SM (2013). Professional development for science teachers. tocol. School Sci Math 102, 245–253. Science 340, 310–313. Shields PM, Marsh JA, Adelman NE (1998). Evaluation of NSF’s Wlodarsky R (2005). The professoriate: transforming teaching Statewide Systemic Initiatives (SSI) Program: The SSIs’ Impacts on practices through critical relection and dialogue. Teach Learn 19, Classroom Practice, Menlo Park, CA: SRI International. 156–172. 14:ar22, 12 CBE—Life Sciences Education Downloaded from http://www.lifescied.org/ by guest on June 5, 2015 Supplemental Material CBE—Life Sciences Education Ebert-May et al. 2 Development Of Learner-Centered Teachers Supplemental Materials I. Details of Recruitment and Training of Regional Team Leaders (RTL) A. RTL Recruiting procedure: Advertisement of the FIRST IV opportunity was made through listservs of professional biology societies (e.g., Ecological Society of America, American Society for Cell Biology) that provided broad dissemination to teaching institutions. A total of 128 and 141 PDs applied for cohorts 1 and 2, respectively. B. 
Training for RTLs: The objectives of the RTL workshops were to 1) design the summer workshops for the PDs, 2) develop an RTL team plan for implementing the summer workshops, 3) further develop their skills in identifying and evaluating learner-centered teaching, 4) discuss ongoing research questions and preliminary findings of the FIRST IV project, 5) provide assessment data about their own professional development activities, and 6) discuss their mentoring responsibilities during the months between the summer workshops.

II. FIRST IV-Designed Surveys

A. Background Survey (see pages 3 to 9 of this document): The purpose of the Background Survey was to determine the prior experiences of the project participants with teaching and professional development, and their perceptions of potential challenges to implementation of learner-centered teaching.

FIRST IV Background Survey

Q16 Welcome to the FIRST IV Program! We would like to get some background information on your previous experiences teaching biology. Please complete the following survey. Thanks! -The FIRST IV Team

Q1 Do you have any teaching experience (broadly defined: as an undergraduate or graduate TA, a lecturer, a teacher in the K-12 setting, etc.)? Yes (1) / No (2)

Q2 In what capacities have you taught and for how long? (For each capacity, enter the number of years.)
TA as an Undergraduate (1)
TA as a Grad Student (2)
Adjunct/Lecturer/Instructor (3)
Elementary school (4)
Secondary school (5)
Other (please explain): (6)

Q3 Estimate the percentage of your present appointment (0-100%) you currently dedicate to each of the following activities. If none of your appointment is dedicated to a particular activity, record 0.
______ Teaching activities and responsibilities (e.g., course preparation and administration, assisting students with coursework, projects, etc.) (1)
______ Service (e.g., committees, student advising, outreach) (2)
______ Administration (e.g., departmental chair, field station manager, program administrator, etc.) (3)
______ Research (4)

Q4 With respect to your current department's commitment toward undergraduate education, please indicate to what degree you agree or disagree with the following statements. Response scale: no basis or n/a (1), strongly disagree (2), disagree (3), agree (4), strongly agree (5).
My department is committed to reforming curricula and courses to enhance active learning and inquiry-based teaching. (1)
I frequently discuss issues pertaining to the improvement of teaching and learning with colleagues in my department. (2)
Other faculty in my department feel the same as I do about the need to improve undergraduate biology teaching and learning. (3)
Faculty in my department collaborate to achieve effective biology teaching (e.g., team teach, and design, test, discuss curricula, etc.). (4)
Faculty in my department are interested in or are already conducting scholarly work about teaching and learning. (5)
Faculty in my department are recognized, evaluated and rewarded for effective teaching. (6)

Q10 Have you participated in any education-related professional development program within the past 2 years? Yes (1) / No (2)

Q5 Please identify any education-related professional development programs you are participating in currently or have participated in within the last 2 years.
Provide the name and a brief description for each program type:
Q6 Professional development projects (externally funded):
Q7 On-campus workshops:
Q8 Reading groups/informal discussion groups:
Q9 Other (please specify):

Q11 Please rate your theoretical knowledge and first-hand experience with each of the following using a scale of 1-5 (1 = lowest rating; 5 = highest rating). Each item is rated twice, once for theoretical knowledge and once for first-hand experience:
Biology education reform (1)
Course/curriculum planning (2)
Theories of learning (e.g., constructivism) (3)
Use of technology in instruction (4)
Assessment (5)
Cooperative learning (6)
Case studies (7)
Problem-based learning (8)
Inquiry-based lectures (9)
Inquiry-based laboratories (10)
Inquiry-based field projects (11)
Teaching portfolios (12)

Q12 Which phrase best describes your confidence about your current level of preparation as a teacher? Extremely confident (1) / Somewhat confident (2) / Somewhat unconfident (3) / Extremely unconfident (4)

Q13 Please explain.

Q18 Do you have any experience developing, leading, or managing biology education projects? Yes (1) / No (2)

Q14 We are interested in knowing more about your experience and success related to developing, leading, and managing biology education projects. Using a scale of 1-5 (1 = lowest rating; 5 = highest rating), please rate your level of first-hand experience and degree of success with each of the following:
Planning, writing and submitting grant proposals. (1)
Maintaining communication with colleagues about a project's progress. (2)
Taking leadership of a team. (3)
Working with people from different disciplines than your own. (4)
Handling matters regarding: responsible conduct of research, human subjects in research, intellectual property. (5)
Writing or speaking to the general public about your project's mission or outcomes. (6)

Q15 Please rate the extent to which each of the following issues poses a challenge as you implement an active-learning course. Response scale: n/a (1), not challenging (2), somewhat challenging (3), very challenging (4).
Time to plan (1)
Time to develop or adapt materials (2)
Time to grade or give adequate feedback (3)
Time to train colleagues or TAs (4)
Cooperation of faculty in my department (5)
Cooperation of faculty in other departments (6)
Cooperation of TAs or instructional staff (7)
Support of campus administration (8)
Recognition or reward for teaching (9)
Tenure-related issues (10)
Issues for non-tenure-track positions (11)
Student attitudes toward alternative teaching methods (12)
Student motivation for learning (13)
Student feedback through course evaluations (14)
Classroom infrastructure (15)
Technical issues with audience response systems ("clickers") (16)
Technical issues with other instructional technology (17)
Level of support staff or technical staff (18)
Balancing teaching with other responsibilities (19)
Financial support for teaching (20)

B.
End of Year Two Survey (see pages 9 to 18 of this document): The End of Year Two Survey was taken by the participants upon completion of the FIRST IV program. The purpose of the survey was to obtain updated information on the participants' teaching-related activities and the perceived effectiveness of the FIRST IV project. Several items on the survey are the same as in the Background Survey, thereby allowing pre-post comparisons of the participants' perceptions and activities.

FIRST IV End of Year Two Survey

Q1 FIRST IV - End of Year Two Survey. Now that you have completed two years of FIRST IV, we ask you to take a few minutes to complete the following survey. This survey assesses participant knowledge and activities at the end of year two and provides us data that we will use to evaluate the effectiveness of the project. -The FIRST IV Team

Q2 Participant Information: Here is your current contact information on file with FIRST IV. Please verify that it is correct. If it is not correct, please visit this link to update your information.
Q56 Name:
Q3 Current Position Title:
Q5 Name of Institution, Agency, Company (Private Sector):
Q6 Department:
Q58 Current Email Address:

Q7 Since beginning the FIRST IV program, please list your teaching experiences. For each course taught, list the full name of the course; whether you taught the whole course or part of the course [if part, please name the unit, module, or classes you taught (e.g., ecology unit, two classes on cell signaling)]; the year and semester you taught the course; and the course level (e.g., introductory, upper level, etc.).

Q8 Teaching Experiences

Q8 Since you began participating in FIRST IV, have you taken on any additional responsibilities (e.g., curriculum development) related to your interest in and experience with improving teaching and student learning? Yes (1) / No (2)

Q9 If yes, please explain.
Q10 Briefly, please update your research interests/activities since completion of FIRST IV.

Q11 Estimate the percentage of your present appointment that you currently dedicate to each of the following activities. (Please note that the 5 main categories must add to 100%.)
______ Teaching activities and responsibilities (e.g., course preparation and administration, assisting students with coursework, projects, etc.) (1)
______ Service (e.g., committees, student advising, outreach) (2)
______ Administration (e.g., field station manager, program administrator, etc.) (3)
______ Research (4)
______ Other (5)

Q12 Within the research category referenced above, what proportion of your time is spent: (Please note that these must add to 100%.)
______ mentoring postdoctoral associates on their research (1)
______ mentoring graduate students on their research (2)
______ mentoring undergraduate students on their research (3)
______ working independently or collaboratively with colleagues (4)

Q13 If you could allocate the time spent in your present appointment in any way you desired, indicate the percentage you would dedicate to each of the following. (Please note that the 5 main categories must add to 100%.)
______ Teaching activities and responsibilities (e.g., course preparation and administration, assisting students with coursework, projects, etc.) (1)
______ Service (e.g., committees, student advising, outreach) (2)
______ Administration (e.g., field station manager, program administrator, etc.) (3)
______ Research (4)
______ Other (5)

Q14 Within the research category referenced above, if you could allocate the time spent in any way you desired, what proportion of time would be spent: (Please note that these must add to 100%.)
______ mentoring postdoctoral associates on their research (1)
______ mentoring graduate students on their research (2)
______ mentoring undergraduate students on their research (3)
______ working independently or collaboratively with colleagues (4)

Q15 If there is a difference between your actual and desired appointment time allocations, what factors do you believe contribute to that discrepancy?

Q16 To date, has your participation in FIRST IV met your initial goals? Yes (1) / No (2)

Q17 Stated initial goals:

Q18 What did you gain from participation in FIRST IV? Please be as specific as possible.

Q19 What would have improved your experience in FIRST IV? Please be as specific as possible.

Q20 With respect to your current department's commitment toward undergraduate education, please indicate to what degree you agree or disagree with the following statements. Response scale: strongly disagree (1), moderately disagree (2), neutral (3), moderately agree (4), strongly agree (5).
My department is committed to reforming curricula and courses to enhance active, learner-centered classrooms. (1)
I frequently discuss issues pertaining to the improvement of teaching and learning with colleagues in my department. (2)
Other faculty in my department feel the same as I do about the need to improve college science teaching and learning. (3)
Faculty in my department collaborate to achieve effective science teaching (e.g., share pedagogies, ideas, materials, revise curricula, co-teach, etc.). (4)
Faculty in my department are interested in or are already conducting scholarly work about teaching and learning. (5)
Faculty in my department are recognized, evaluated and rewarded for effective teaching. (6)

Q21 Other comments about your department's commitment:

Q22 During your time as a FIRST IV participant, has anyone on campus helped you become a better teacher? Yes (1) / No (2)

Q23 If yes, who? (name, position, role)

Q24 Have you participated in any teaching professional development programs in addition to FIRST IV within the last 2 years? Yes (1) / No (2)

Q54 If yes, please provide the name and a brief description for each program type below.
Q25 On-campus workshops:
Q26 Workshops offered by professional/national societies:
Q27 Reading groups/journal clubs:
Q28 Other (please specify):

Q55 We are interested in knowing more about your experience and success related to developing, leading, and managing science education projects. Have you participated in any education projects in addition to your involvement with FIRST IV? Yes (1) / No (2)

Q30 If you answered "yes" to the question above, using a scale of 1-5 (1 = lowest rating; 5 = highest rating), please rate your level of first-hand experience and degree of success with each of the following. If you have not participated in any education projects, please skip to the next question.
Each item is rated twice, once for first-hand experience and once for degree of success:
Planning, writing and submitting grant proposals, including ideas, projects, activities for science education in the "broader impact" section of a proposal. (1)
Maintaining communication with colleagues about a project's progress. (2)
Taking leadership of an education project team. (3)
Working with people from different disciplines than your own. (4)
Handling matters regarding: responsible conduct of research, human subjects in research, intellectual property, etc. (5)
Writing or speaking to the general public about your project's mission or outcomes. (6)

Q31 Please update the following information on presentations about teaching: Since you began participating in FIRST IV, have you presented any seminars, etc., related to teaching to your department, institution, or at a professional meeting? Yes (1) / No (2)

Q32 If yes, please give the titles and audience for each presentation.

Q33 Please update the following information on science education grant proposals: Have you submitted any grant proposals for science education projects since you began participating in the FIRST IV project? Yes (1) / No (2)

Q34 If yes, please give the title, funding agency, program within the agency, funding amount, and funding status (pending, not funded, funded) for each grant.

Q35 Please update the following information on departmental change: Since you began participating in FIRST IV, has your department made or initiated any curricular changes specifically to implement active, learner-centered courses? Yes (1) / No (2)

Q36 If yes, please describe the changes.
Q37 Please rate your theoretical knowledge and first-hand experience with each of the following using a scale of 1-5 (1 = lowest rating; 5 = highest rating). Each item is rated twice, once for theoretical knowledge and once for first-hand experience:
Science education reform (1)
Course/curriculum development, backward design (2)
Theories of learning (e.g., constructivism) (3)
Use of technology in instruction (4)
Interdisciplinary approaches to inquiry and problem-solving (5)
Assessment (6)
Cooperative/collaborative learning (7)
Case studies (8)
Independent projects (9)
Problem-based learning (10)
Inquiry-based laboratories (11)
Inquiry-based field projects (12)
Teaching portfolios (13)
Other (please specify below): (14)

Q38 Other:

Q39 Which phrase best describes your confidence about your current level of preparation as a teacher? Extremely confident (4) / Somewhat confident (3) / Somewhat unconfident (2) / Extremely unconfident (1)

Q40 Please explain your answer:

Q41 Please rate the extent to which each of the following issues poses a challenge to you in using active, student-centered learning methods in your classroom.
Please use the following scale: n/a (0), not challenging (1), somewhat challenging (2), very challenging (3).
Time to plan (1)
Time to develop or adapt materials (2)
Time to grade or give adequate feedback (3)
Time to train colleagues or TAs (4)
Cooperation of faculty in my department (5)
Cooperation of faculty in other departments (6)
Cooperation of TAs or instructional staff (7)
Support of campus administration (8)
Recognition or reward for teaching (9)
Tenure-related issues (10)
Issues for non-tenure-track positions (11)
Student attitudes toward alternative teaching methods (12)
Student feedback through course evaluations (13)
Classroom infrastructure (14)
Technical issues with "clickers" (personal response systems) (15)
Technical issues with other instructional technology (16)
Level of support staff or technical staff (17)
Balancing teaching with other responsibilities (18)
Financial support or funding (19)
Other (please specify below): (20)

Q42 Other:

Q43 Please describe any goals or activities you have that extend beyond your own courses to improve the quality of education for students at your institution.

Q44 When a course you teach is successful, how are your students different as a result (e.g., in terms of what they know, what they are able to do, attitudes, confidence)?
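Because the Background Survey and the End of Year Two Survey share items, responses can be compared pre-post for each participant (see II.B above; Roberson et al., 1995, in the reference list, treat paired Likert data in depth). As an illustrative sketch only — the response values below are invented, not FIRST IV data — a simple exact sign test on one shared 1-5 Likert item could be run as follows:

```python
# Illustrative pre-post comparison of one shared Likert item (1-5 scale)
# using an exact two-sided sign test. The response values are invented
# for illustration; they are not FIRST IV data.
from math import comb

# Paired responses from the same (hypothetical) participants:
# Background Survey (pre) and End of Year Two Survey (post).
pre  = [2, 3, 2, 4, 3, 2, 3, 1, 2, 3]
post = [4, 4, 3, 5, 4, 3, 4, 3, 3, 4]

diffs = [b - a for a, b in zip(pre, post)]
pos = sum(d > 0 for d in diffs)   # participants who rated higher post
neg = sum(d < 0 for d in diffs)   # participants who rated lower post
n = pos + neg                     # ties (no change) are dropped in a sign test

# Two-sided exact sign test: probability of a split at least this
# lopsided under the null hypothesis of no systematic pre-post change.
k = min(pos, neg)
p = min(1.0, 2 * sum(comb(n, i) for i in range(k + 1)) / 2 ** n)

print(f"positive changes: {pos}, negative changes: {neg}, p = {p:.4f}")
```

The sign test uses only the direction of each change; a Wilcoxon signed-rank test, which also weights the magnitude of changes, would be a common next step for data of this kind.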