
An Evaluation Report: i3 Development Grant Dev07 - Sammamish High School 

“Re-imagining Career and College Readiness: 

STEM, Rigor, and Equity in a Comprehensive High School” 


Randy Knuth, Ph.D. 
Knuth Research, Inc. 
randy@knuthresearch.com 

Paul S. Sutton, Ph.D. 
Pacific Lutheran University 
suttonps@plu.edu 

Sheldon Levias, Ph.D. 
University of Washington, Seattle 

Annie Camey Kuo, Ph.D. 
Stanford University 

Matthew Callison 
Indiana University 


March 1, 2016 





Acknowledgements 

The findings reported herein were supported by funding from an Investing in Innovation (i3) grant, 
PR/Award # U396C100150, awarded by the Department of Education. 

The authors would like to thank the teachers and school leaders at the awarded institution who gave 
generously of their time and insights to help us complete this evaluation. The authors also acknowledge 
the assistance of Andrew W. Shouse and Elizabeth Wright, who contributed to this project and, in 
some cases, provided early support for this evaluation. 




Table of Contents 

Executive Summary 

Chapter 1: The School Context 
    District Impact on Teacher-Centered Curriculum Design 
    Why PBL? 

Chapter 2: Fidelity of Implementation 
    Scoring 
    Analysis 

Chapter 3: Qualitative Methods 

Chapter 4: Increasing Teacher Pedagogical Expertise 
    Summer Institute of Learning and Teaching (SILT) 
    Methodology 
    Findings 
    PBL Curriculum Design Teams 
    Methodology 
    Findings 

Chapter 5: School Leadership Structure 
    Purpose of Leadership Team 
    Methodology 
    Leadership Team Membership 
    Peripheral Members of Leadership Team 
    Leadership Team Roles 
    Leadership Team Meetings 
    Advisory Board Meetings 
    Findings 
    Discussion 

Chapter 6: Exploratory Studies 1 and 2 
    Assessing the Impact of Problem Based Learning (PBL) on Students’ Career and College Readiness 
    Stated Goals and Student Populations 
    PBL Redesigned Courses, Student Cohorts, and Student Populations 
    How We Describe Groups of Students 
    Methodology: Exploratory Study #1 
    Findings: Exploratory Study #1: Comparison of Student Performance in AP Coursework 
    Impact on Special Populations: English Language Learners 
    PBL Adoption and Comparative AP Scores by Department 
    Adoption of PBL Within Departments 
    Comparison of Mean AP Scores by All Students Disaggregated by Department 
    College and Career Readiness Outcomes for Students Not Participating in AP Coursework 
    Findings: Exploratory Study #2: Impact of Starting Strong/Sammamish Leads on Students’ Career and College Readiness 
    Methodology: Exploratory Study #2 

Chapter 7: Discussion and Conclusions 

References 

Appendix A: Implementing PBL Classroom Observation Protocol 
Appendix B: Key Element Classroom Observation Protocol 
Appendix C: Levels of Use (LOU) Teacher Interview Protocol 



List of Tables and Charts 


Tables: 

Table 1: Sammamish High School’s Free and Reduced Lunch Population 
Table 2: Teacher Population by Year 
Table 3: Levels of Use (LOU) 

Table 4: Establishing the Threshold for PBL Implementation and PBL Readiness 

Table 5: Sammamish High School Professional Learning Infrastructure 

Table 6: Measuring Relevance at SILT 

Table 7: Teacher Participation in SILT 

Table 8: Fidelity Rating: SILT 

Table 9: SILT Data Collection Summary 

Table 10: Overall SILT Effectiveness of Relevance 

Table 11: Teacher Design Team Data Summary 

Table 12: Instances of Sharing in AP Human Geography Design Team Meetings 
Table 13: Instances of Sharing in the Junior English Design Team’s Satire Unit 
Table 14: Instances of Sharing in Geometry Design Team Meetings 

Table 15: Overview of Teacher Leaders, Department Affiliation, Major Responsibilities, and FTE 

Table 16: Advisory Board Meetings 

Table 17: Courses Targeted for PBL Redesign 

Table 18: Description of Cohort and Graduation Year 

Table 19: Comparison of Mean Number of PBL and All Courses Taken by Cohort 
Table 20: Description of PBL Dosage by Cohort 

Table 21: Sammamish High School Changing Demographics Over Time 

Table 22: Number of Students in Each Cohort Matched by Number of SHS Years and AP Test 
Table 23: AP Tests taken by Course 

Table 24: Statistically Significant Gains in AP Scores by Course 

Table 25: Statistically Significant Gains by Students Who Speak a First Language Other Than 
English at Home (EngNotFirst) in Mean AP Scores by Course 

Table 26: Statistically Significant Gains by Students Who Receive Free and Reduced Lunch (FRL) 
Services in Mean AP Scores by Course 

Table 27: Statistically Significant Gains by Students with Disabilities (SWD) in Mean AP Scores by 
Course 

Table 28: Comparison of Mean AP Scores Between All Students in Exploratory Study #1: Mean AP 
Scores 

Table 28a: Course Pre/Post Assignment in initial ITS Pass Rate Exploratory Study 

Table 29: Sample of ELL Students 

Table 30: CBAM Survey Respondents by Year 

Table 31: Dependent Variable: Advanced Placement Test Score (Social Studies) 

Table 32: Dependent Variable: Advanced Placement Test Score (Science) 

Table 33: Dependent Variable: Advanced Placement Test Score (Math) 

Table 34: Dependent Variable: Advanced Placement Test Score (English) 

Table 35: Statistically Significant Gains in Mean AP Scores by Department 

Table 36: Statistically Significant Gains in Mean AP Scores by FRL Students by Department 

Table 37: Statistically Significant Gains in Mean AP Scores by EngNotFirst Students by Department 




Table 38: Statistically Significant Gains in Mean AP Scores by Students with Disabilities (SWD) by 
Department 

Table 39: Data Collection for Exploratory Study #2 

Table 40: Dimensions, Aspects, and Components of the Campus Ready Instrument 

Charts: 

Chart 1: Percent of SWD, FRL, and EngNotFirst by Matched, Non-Matched, and Non-AP test 
Takers 

Chart 2: Number of SWD and EngNotFirst Students Taking AP Tests 
Chart 3: Percentage of Each Cohort as Members of PBL Exposure Groups 
Chart 4: School Level AP Test Performance (Estimated Marginal Means) 

Chart 5: Percent of AP test Takers Passing with a Score of 3 or Higher - Whole School 
Chart 6: Course Level AP Test Results (Estimated Marginal Means) 

Chart 7: Percent of AP Test Takers Passing with a Score of 3 or Higher - Whole School (By Course) 
Chart 8: School Level AP Test Performance - EngNotFirst (Estimated Marginal Means) 

Chart 9: Course Level AP Test Results - EngNotFirst (Estimated Marginal Means) 

Chart 10: Percent of AP Test takers Passing with a Score of 3 or Higher - EngNotFirst 
Chart 11: School Level AP Test Performance — FRL (Estimated Marginal Means) 

Chart 12: Course Level AP Test Results - FRL (Estimated Marginal Means) 

Chart 13: Percentage of AP Test Takers Passing with a Score of 3 or Higher - FRL 
Chart 14: School Level AP Test Performance - SWD (Estimated Marginal Means) 

Chart 15: Course Level AP Test Results - SWD (Estimated Marginal Means) 

Chart 16: Percent of AP test Takers Passing with a Score of 3 or Higher - SWD 
Chart 17: AP Test Mean by Targeted PBL Course Exposure 

Chart 18: Percent of Students Passing AP Tests by dose of PBL Exposure, Cohorts 2005-2011 
Chart 19: School Level AP Performance - Took AP Human Geography Course (Estimated 
Marginal Means) 

Chart 20: School Level AP Test Performance - Took AP Human Geography Test (Estimated 
Marginal Means) 

Chart 21. Course Level AP test Results - Took AP Human Geography Course as Freshman 
(Estimated Marginal Means) 

Chart 22: Course Level AP Test Results - Took AP Human Geography Test as Freshman 
(Estimated Marginal Means) 

Chart 23: Hypothesized CBAM Implementation Wave 

Chart 24: CBAM: SHS 2011-2015 

Chart 25: CBAM: Social Studies 2011-2015 

Chart 26: CBAM: Science 2011-2015 

Chart 27: CBAM: Math 2011-2015 

Chart 28: CBAM: English 2011-2015 

Chart 29: CBAM: Years of Experience: 2015 

Chart 30: CBAM: 

Chart 31: Percent of AP Test Takers Passing with a Score of 3 or Higher - Whole School 
(Department) 

Chart 32: Department Level AP Test Performance - FRL (Estimated Marginal Means) 

Chart 33: Percent of AP Test Takers Passing with a Score of 3 or Higher - FRL 

Chart 34: Department Level AP Test Performance - EngNotFirst (Estimated Marginal Means) 

Chart 35: Percent of AP Test Takers Passing with a Score of 3 or Higher - EngNotFirst 




Chart 36: Department Level AP Test Performance - SWD (Estimated Marginal Means) 
Chart 37: Percent of AP Test Takers Passing with a Score of 3 or Higher - SWD 
Chart 38: Mean Campus Ready Scores for Non-AP Students: PBL Exposure 
Chart 39: Campus Ready Post Test Mean 




Executive Summary 

In 2009, Sammamish High School, a public comprehensive high school in the Bellevue 
School District, was struggling. Enrollment was declining. Since 2002, Sammamish High School has 
served an increasingly linguistically, socio-economically, racially, and ethnically diverse student body. 
Student achievement data revealed that gaps between groups of students remained a chronic 
problem despite efforts by various school leaders to narrow them significantly. While gaps in 
reading and writing narrowed, gaps persisted in math and, to some extent, in graduation rates. 
White students continued to outperform their African American and Hispanic peers. Middle-class 
and affluent students continued to outperform their more impoverished peers. Students qualifying 
for Special Education and English Language Learner services and accommodations struggled to 
keep up with their mainstream, native-English-speaking peers. 

At the instigation of a committee of teachers looking for a dramatic way to reset the school’s 
academic culture, the school investigated various options for improving student outcomes. Problem- 
based learning (PBL) emerged as a promising approach. In 2010, Sammamish High School received 
a “Development”-level Investing in Innovation (i3) grant from the Department of Education. The 
school focused on science, technology, engineering, and mathematics (STEM) disciplines and 
identified PBL as their primary tool of school-wide improvement. In their grant proposal 
Sammamish articulated several goals and student learning and achievement outcomes they hoped to 
accomplish by 2015. 

Their goals included: 

• Implement PBL curriculum throughout the school to establish a scalable, sustainable, 
21st century skills-based program in Advanced Placement (AP) and non-AP coursework, 

• Use PBL as a framework to support student growth in key cognitive strategies and academic 
behavior, 

• Implement a series of specific supports for struggling students, focused on increased 
mathematics literacy, 

• Provide customized and situated professional development (PD) that will help teachers 
implement new PBL curricula and evaluate their effectiveness in doing so. 

Their student learning and achievement outcomes included: 

• 20% increase in AP pass rates, especially in STEM content areas (Biology, Chemistry, 
Statistics, Calculus AB/BC, Physics, Environmental Science), 

• 20% increase in students with disabilities (SWD) and limited English proficient students 
(LEPs) enrolling in AP STEM classes, 

• 75% of all students, 50% of SWDs, and 60% of LEPs successfully completing pre-calculus 
with a B or better, 

• 100% of all students reaching standard on the state math test, 

• 10% annual improvement on the state science test for all students, 




• 15% annual improvement for SWDs and LEPs, 

• 90% on-time graduation rate for SWDs and 75% on-time graduation rate for LEPs. 

The purpose of this evaluation is to assess the extent to which Sammamish High School 
teachers and school leaders accomplished their goals. Because the school used PBL as the primary 
tool of school improvement, we focus specifically on the ways the school implemented PBL in 
coursework across the core content areas of Math, English, Science, and Social Studies, and the 
extent to which PBL may have contributed to school-wide differences in student learning and 
achievement. Throughout our evaluation process, we leveraged quantitative data stretching back 
over 10 years and qualitative data collected over the 5-year duration of the grant. 

Problem-Based Learning (PBL) as the Driver for School Improvement 

In their review of the research on problem- and project-based learning, Barron et al. (1998) 
describe how problem-based learning supports both skill and knowledge acquisition while deepening 
students’ metacognitive skills. They state 

These principles mutually support one another toward two ends. One 
end is the acquisition of content and skills. The other end is to help 
students become aware of their learning activities so they may take on 
more responsibility and ownership of their learning. This awareness 
includes many aspects of what has been characterized under the 
umbrella term metacognition —knowing the goal of their learning, self- 
assessing how well they are doing with respect to that goal, 
understanding that revision is a natural component of achieving a 
learning goal, and recognizing the value of scaffolds, resources, and 
social structures that encourage and support revision (p. 273). 

This and other educational research informed how Sammamish High School teachers and 
school leaders thought about the balance among hard academic skills like reading, writing, and 
math, soft 21st century skills like collaboration, critical thinking, and problem solving, and the 
metacognitive skills listed above. That balance served as the foundation for how they articulated and 
acculturated those principles to achieve school-wide transformation through PBL. 

Working over 4 years and in collaboration with researchers from the University of 
Washington, teachers and school leaders developed the Key Elements of Problem Based Learning 
to provide teachers with a highly articulated framework for understanding what PBL is and what it 
looks like in practice. While the school implemented multiple tools and policies to implement PBL 
throughout the school, the Key Elements served as the over-arching framework encompassing them 
all. Teachers working to redesign curriculum for various courses used the Key Elements framework 
to closely guide their curricular choices. Teachers and teacher leaders worked to develop and 
implement Sammamish Leads, a summer PBL enrichment program that matches groups of students 
with industry experts to solve authentic problems, and used the Key Elements to guide their design 
process. Teacher and school leaders working to design professional learning experiences to support 
the work teachers were doing in PBL curriculum redesign teams used the Key Elements to design 
teachers’ professional learning in the summer and throughout the year. 




Amidst competing definitions and descriptions of the difference between project- and 
problem-based learning and of what problem-based learning is and should look like, the Key 
Elements provided teachers and school leaders with a specific, common language for understanding 
what PBL is and how they could implement it in their classrooms. In sum, the Key Elements 
became the language Sammamish High School teachers and school leaders used to communicate 
their standards and expectations for what highly rigorous, immersive, and engaging student learning 
should look like. 

The Key Elements of Problem Based Learning include: 

• Authentic Problems 

• Authentic Assessment 

• Expertise 

• Collaboration 

• Academic Discourse 

• Student Voice and Leadership 

• Culturally Responsive Instruction 

Taken together, the Key Elements provide teachers with a research-informed, practice-rich 
way to understand and implement PBL in their courses and classrooms. Readers of this report 
should be aware that for all intents and purposes, when we use the acronym “PBL” to describe the 
primary intervention Sammamish used to transform their school, we mean the Key Elements. 

Measuring the Impact of PBL 

The Department of Education awards three kinds of i3 grants: development, validation, and 
scale up. Sammamish High School received a “development” i3 grant. The purpose of the 
development grant was for schools to develop promising practices and policies other schools and 
educational organizations could learn from. The purpose of our evaluation is not to validate the 
school’s choice of PBL as their primary intervention or to evaluate their success in seeding similar 
PBL interventions in other schools or districts. The purpose of our evaluation is two-fold. First, we 
identify and describe what the intervention was. We focus on PBL as the school’s central 
intervention and also identify and describe several policies Sammamish High School leaders 
implemented to support teachers’ implementation of PBL. Second, we measure how successful the 
intervention was in improving specific outcomes for students. In this Evaluation Report, we 
primarily focus on student performance outcomes demonstrated through quantitative findings. 

To that end, we spend a majority of our evaluation describing the intervention, in this case 
PBL, and how it was implemented by Sammamish High School to improve student outcomes. We 
do this to evaluate the extent to which the school implemented their intervention with fidelity 
according to their evolving PBL framework and their initially stated goals. These include: 

• The professional learning infrastructure the school designed to support teachers’ design and 
implementation of PBL curriculum, 

• The process of PBL curriculum design as observed in teacher design teams, 

• Supports the school created to improve students’ career and college readiness, 




• The development, adaptation, and adoption of the Key Elements of Problem Based Learning, 
which provided the framework for PBL design, implementation, and evaluation, 

• The leadership structure the school used to support teachers’ design and implementation of 
PBL curriculum. 

To evaluate the effectiveness of the intervention, we share findings from a study in which we 
focus on student AP mean scores to compare two matched groups of students: those who took AP 
courses and the associated AP tests prior to the PBL intervention and those who took AP 
courses and the associated AP tests as teachers were implementing PBL across content areas. 
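
As a simplified illustration of this design (not the evaluation team’s actual analysis), the sketch below compares mean AP scores for matched comparison and treatment groups with a two-sample t-test in Python. The file name, column names, and use of pandas/scipy are hypothetical; the studies reported in Chapter 6 rely on matched cohorts and estimated marginal means rather than this bare comparison. 

# Minimal sketch: compare mean AP scores for matched comparison vs. treatment students.
# Assumes a hypothetical CSV with one row per AP test taken, a "group" column
# ("comparison" or "treatment"), and a numeric "ap_score" column (1-5).
import pandas as pd
from scipy import stats

scores = pd.read_csv("ap_scores_matched.csv")  # hypothetical file
comparison = scores.loc[scores["group"] == "comparison", "ap_score"]
treatment = scores.loc[scores["group"] == "treatment", "ap_score"]

# Welch's t-test: does the treatment group's mean AP score differ from the comparison group's?
t_stat, p_value = stats.ttest_ind(treatment, comparison, equal_var=False)
print(f"Comparison mean: {comparison.mean():.2f}; Treatment mean: {treatment.mean():.2f}")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")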

Findings 

Analysis of teacher surveys, teacher interviews, teacher focus groups, student focus groups, 
school leader interviews, classroom observations, and design team observations demonstrates that 
Sammamish High School developed and sustained effective structures for supporting teachers as 
they designed and implemented PBL across content areas. Specific findings suggest that 

• Teachers found the professional learning experience of working in design teams and 
attending the summer Sammamish Institute of Teaching and Learning (SILT) relevant to 
their classroom practice and useful for helping them implement PBL principles and practices. 
Additionally, teachers highly valued their experiences collaborating with colleagues to 
redesign curriculum in PBL curriculum design teams. 

• Many design teams successfully redesigned large portions of existing course curricula into 
PBL curricula during their design year. 

• Throughout the duration of the i3 grant, students took the EPIC Campus Ready Assessment 
instrument at the end of each academic school year. Over time, students scored higher on 
the Key Cognitive Strategies portion of the survey, suggesting the school was successful in 
supporting an increase in students’ career and college readiness. 

• Teachers, teacher leaders, school leaders, and University of Washington educational 
researchers collaborated over four years to develop and refine the Key Elements of Problem 
Based Learning framework. Teachers used the Key Elements framework to redesign existing 
curriculum into PBL curriculum. Teacher leaders used the Key Elements framework to 
redesign all school-wide professional learning experiences. Over time, teachers and school 
leaders used the Key Elements framework to define and describe highly rigorous and 
engaging PBL curriculum and coursework. 

• Teacher leaders were recruited from within the Sammamish High School teaching ranks to 
design all professional learning experiences and support teachers’ work in PBL curriculum 
design teams and later in their efforts to implement PBL coursework. Teacher leaders and 
the principal constituted the school’s Leadership Team. The Leadership Team was 
responsible for supporting and implementing nearly every aspect of the PBL i3 grant work. 

Comparison of student Advanced Placement (AP) mean scores between matched groups of 
students who took AP courses and the associated AP test prior to the PBL intervention 
(comparison group) and students who took AP courses and the associated AP test during the PBL 
implementation (treatment group) shows gains in the treatment group’s AP mean scores. We 
disaggregate these data according to course and academic departments. Specific findings show 

• Students in the treatment group outperformed their matched peers in the comparison group 
on multiple AP tests. In some cases, student gains were statistically significant even when 
disaggregated according to students who receive free and reduced lunch (FRL), students 
with disabilities (SWD), and students who speak a first language other than English at home. 

• Overall, students in the treatment group passed a higher percentage of their AP tests despite 
a dramatic increase in student enrollment in AP courses. 

• Five years of CBAM survey data suggest that not every academic department at Sammamish 
High School interacted with PBL in the same ways or adopted PBL to the same degree. The 
Social Studies and Science departments fully adopted PBL as a guiding pedagogical 
framework. While they did not universally adopt PBL as a guiding pedagogical framework, 
teachers in the Math department continue to use the Key Elements to inform further 
curricular revisions. During the early years of the project, the English department by and large 
determined that PBL was not a good fit as the instructional foundation for their 
courses. Thus, while some English courses were redesigned with PBL as their foundation, 
the overall stance of the English department toward PBL is one of ambivalence. 

• When AP mean scores are aggregated by academic department, students in the treatment 
group outperformed their comparison-group peers in AP coursework in the Math, Science, 
English, and Social Studies departments. In the Math, Science, and Social Studies 
departments, those gains were statistically significant. 

• A strong, positive correlation exists between the number of PBL courses students take and 
their performance on AP tests. 

Emerging Tensions 

While these findings suggest gains in student learning and teacher growth during the 
implementation of various PBL-focused policies between 2010 and 2015, we observed several 
tensions within the school community that emerged at the same time. Interviews with school 
leaders suggest many of these tensions were predictable yet unanticipated. That is, school leaders 
were cognizant that tensions inevitably surface whenever schools attempt to make wholesale 
changes to the way teachers teach and students learn. However, each school is different, and the 
way those tensions manifest within the school often varies depending on the specific personalities 
of the students, teachers, and school leaders involved. 

Tension 1: Balancing the Investment in Design and Implementation 

The school invested heavily in teachers’ PBL curriculum redesign process. However, few 
funds were invested in supporting teachers as they began implementing the PBL curriculum they 
planned in their design year. In a majority of teams, teachers found the implementation process 
just as time-intensive as the design year, as they worked to revise lessons and units in real time and 
to find time to meet regularly to share and solve problems of practice with the new curriculum. 
While some teams found ways to meet regularly and make the needed revisions, other teams and 
teachers struggled to meet the growing demands of teaching PBL curriculum. 

Tension 2: The Influence of Departments and Department Leadership on Teacher Buy-In 

Our data suggest that departments and department leadership significantly afforded or 
constrained individual teachers’ adoption of PBL as a pedagogical model. In some cases, department 
leaders supported teachers’ ongoing PBL curriculum design and implementation work by, for 
example, providing inter-departmental professional development during department “retreat” days 
and explicitly supporting ongoing course redesign and implementation. In other cases, department 
leaders hindered teachers’ ongoing PBL curriculum design and implementation by publicly 
expressing discontent or suspicion toward PBL as a pedagogical model. In both cases, the power and 
influence department leaders brought to bear were crucial to the extent to which teachers felt 
supported in continuing to implement PBL beyond the design year. In departments in which teachers 
had varied years of experience and expertise, which was most of them, the way the department 
supported or hindered the adoption of PBL affected the extent to which novice teachers felt 
comfortable buying in to the PBL model. 

Tension 3: Resentment Toward Teacher Leaders in Some Corners 

Even though many teachers speak glowingly of the work the teacher leaders invested in 
supporting teachers’ efforts to design and implement PBL curriculum, over time some teachers 
began to question various dimensions of the teacher leader role within the school. Some teachers 
voiced frustration that the teachers who were not teacher leaders had accumulated more expertise 
teaching PBL than the teacher leaders whose job it was to support them. Some teachers expressed 
resentment that the teacher leaders and leadership team accumulated outsized decision making 
power within the school and that there were few, if any, efforts made to broaden Leadership Team 
membership. They claimed that power was increasingly concentrated within the leadership team 
rendering other committees in the school, such as the Instructional Leadership Team (ELT), 
inconsequential and irrelevant. Still other teachers questioned how school leaders were holding the 
teacher leaders accountable and why the standards by which they were held accountable were not 
made more public and transparent to the staff. 

The data suggest teacher leaders struggled with their new role as both teachers and teacher 
leaders and with the ways in which the teacher leader role strained their relationships with some 
colleagues. Both the pace of their work and the urgency with which tasks needed to get done on a 
day-to-day basis made reflection by individual teacher leaders and by the collective Leadership Team 
complicated and problematic. Although teacher leaders and the principal were aware of some of the 
teachers’ concerns described above, they may not have been aware of the extent and depth of the 
growing resentment among some teachers on staff. For some teachers, tensions with the 
Leadership Team have become conflated with other concerns, possibly endangering the school’s 
ability to sustain positive momentum toward PBL pedagogy and practice. 

Tensions 1, 2, and 3 are most clearly evident in the qualitative data collected through teacher 
and school leader interviews and focus groups. 




Chapter 1: The School Context 

Sammamish High School serves students from various racial, ethnic, linguistic, cultural, and 
socio-economic backgrounds 1 . Specifically, it serves a significant English Language Learner (ELL) 
and Special Education (SPED) population. Approximately 1,000 students attend the school: 45% of 
students qualify for free or reduced-price lunch, 10% qualify for ELL support services, 12% qualify 
for special education services, and 47% will be the first generation in their family to graduate from 
college. Racially, 6% of students identify as African American, 20% as Asian, 20% as Hispanic, 46% 
as White, and 8% as multi-ethnic. 

When compared with most other high schools within the Bellevue School District, by just 
about any measure, Sammamish High School is more diverse. Only Interlake High School, which 
serves some of the same neighborhoods and students as Sammamish High School, has a similar 
demographic. For example, whereas approximately 45% of students at Sammamish High School 
receive free and reduced lunch, approximately 10% of students at Bellevue High School, 
approximately 10% of students at Newport High School, approximately 3% of students at 
International High School, and approximately 33% of students at Interlake High School receive free 
and reduced lunch 2 . 

Table 1 below demonstrates how Sammamish High School’s free and reduced lunch 
population has dramatically increased since 2002. 

Table 1. 

Sammamish High School’s Free and Reduced Lunch Population 3 

Year    Percentage 
2002    20% 
2003    24% 
2004    28% 
2005    28% 
2006    30% 
2007    27% 
2008    30% 
2009    32% 
2010    39% 
2011    41% 
2012    45% 
2013    46% 
2014    41% 

1 Data retrieved from the Washington State Office of the Superintendent of Public Instruction (OSPI) to 
reflect 2013 demographic data. 
2 Data retrieved from the Washington State Office of the Superintendent of Public Instruction on June 9th, 
2015. 
3 Data represented in Table 1 provided by the Bellevue School District. 


As research has repeatedly illustrated, a student’s socioeconomic status, here represented by the 
percentage of students who receive free and reduced lunch, can be predictive of a student’s 
performance on various high-stakes tests (Darling-Hammond, 2010; Berliner, 2013). Unlike other 
schools in the district, Sammamish High School leaders and teachers felt increased urgency to better 
support the growing number of impoverished students who attended the school and who 
experienced significant challenges in the classroom. 

District Impact on Teacher-Centered Curriculum Design 

Before the school began working on the i3 PBL project, many of the school improvement 
policies underway at Sammamish High School originated at the district level. In 1996, the school 
district hired a new superintendent who, over time, sought to improve district schools through the 
implementation of a common curriculum from kindergarten through 12th grade for every content 
area, and worked to open access to AP classes for all students. Both the common curriculum and 
AP open access policies relied heavily on curriculum as the driver of school improvement. Both 
policies were managed largely at the district level. 

Knowledge of both policies is important for understanding how and why Sammamish High 
School leaders and teachers approached the implementation of PBL. First, the policy to open access to 
AP coursework to all students represented a fundamental shift in thinking for teachers for a couple 
of reasons. Many teachers were themselves students in AP classes that catered to a small population 
of students who had the support and resources to be successful in those classes. In many cases, rigor 
was defined by massive amounts of reading and memorization to prepare students to be successful 
on the multiple-choice and essay sections of the AP exam. By opening access to those classes to any 
students who wanted to take them, regardless of reading ability, motivation, or learning ability or 
disability, teachers worried that their pass rates would decline sharply. In some cases, teachers were 
right. Pass rates did decline at first. Teachers’ frustrations were exacerbated when they received few 
additional resources to support the needs of the diversity of students who now took AP classes. 

Over time, however, AP pass rates stabilized and even increased as more students took more AP 
classes. 

Second, the common curriculum was intended to support the open access AP class policy by 
scaffolding student learning in rigorous coursework from kindergarten through high school so 
students would be prepared to take and be successful in AP classes. The district’s intention was to 
have teachers develop and design the common curriculum in grade level teams. Once developed, 
that curriculum would be posted online where teachers could access it and use it in their classrooms. 
From time to time teachers across the district would meet and discuss the effectiveness of the 
curriculum and design common assessments by which they could measure, across schools, the 
effectiveness of the new curriculum. The district intended this process to be democratic and 
collaborative and to increase the quality of student learning across the district by providing relevant 
and ongoing professional learning opportunities for teachers. 






However, the common curriculum was not without problems. In some content areas, such 
as ELA and Math, once the curriculum was written, teachers felt as though they had little control to 
adapt and change it. In ELA and Math, the district curriculum was reinforced by mandated district 
assessments. In other content areas, such as Social Studies and to a lesser degree Science, the 
common curriculum existed but was not as highly articulated as it was in English and Math. Over 
time teachers in different content areas came to have vastly different experiences both with the 
“commonness” of the curriculum and the extent to which they controlled what they taught to their 
students. 

The choices the school leaders made at Sammamish High School regarding curriculum 
redesign followed in the district’s footsteps but took a different approach to implementation. The 
school asked teachers who had experience teaching a course to be part of the redesign effort. The 
school provided them with the Key Elements as a guiding document, but then got out of their way 
and gave them the time and space to redesign the curriculum with colleagues they knew and trusted. 
Either during the redesign year or in the year to follow, those same teachers piloted the curriculum 
they had redesigned and continued to work together to refine the plans they originally made. This 
often meant that teachers were teaching the same things, on the same days, in similar ways. 

The historical legacy of the common curriculum, and the varied ways teachers experienced it 
across the district and at Sammamish High School, did not evaporate simply because teachers’ 
curriculum design work shifted. The school never intended PBL coursework to supplant the work 
teachers were already doing to align their curriculum with various national standards (Advanced 
Placement, Common Core, Next Generation Science Standards), state standards (Essential Learning 
Requirements), and district expectations. In those content areas where the curriculum infrastructure was especially 
robust, such as in Math and English, teachers were expected to design PBL curriculum in ways that 
aligned with Common Core State Standards and/or AP frameworks and existing district curriculum 
and assessments. For some teachers, that work was complex, complicated, and at times problematic. 

Teachers and Teacher Attrition 

The number of teachers teaching at Sammamish High School since 2010 has remained fairly 
consistent as illustrated by Table 2 below 4 . 

Table 2. 

Teacher Population by Year 

Year         Total Number of Staff 
2010-2011    83 
2011-2012    70 
2012-2013    81 
2013-2014    76 
2014-2015    75 

4 Data in Table 2 provided by data analysts at SEDL. 




A vast majority of SHS teachers are “highly qualified”: 79% of teachers hold a master’s degree or 
higher, and 97% of teachers meet the ESEA definition of a “highly qualified” teacher. Although we 
are not privy to specific data, a large number of Sammamish High School teachers have earned their 
National Board Teaching Certificate (NBTC). 

Why PBL? 

PBL is not a new pedagogical idea (Dewey, 1938). Recent research has demonstrated that 
inquiry-, project-, and problem-based coursework has several benefits for student learning, including 
increasing student engagement and motivation (Blumenfeld et al., 1991; Barron and Darling- 
Hammond, 2008; Boaler and Staples, 2008; Belland, Glazewski and Ertmer, 2009; Conley, 2010). 
Although there is no guarantee that problem-based learning will completely close achievement gaps, 
various studies have shown that because the work is purposeful, interactive, and complex, students 
are more fully engaged in the learning than in more traditional learning environments (lecture-based, 
teacher-directed instruction) (Blumenfeld et al. 1991; National Research Council, 2000; Ravitz, 2009). 
This is especially true for students who historically have fallen within achievement gaps such as 
special education students, English language learners, and Hispanic and African American males. 
Recent research (Parker et al. 2011; Parker et al. 2013; Halvorsen et al. 2014) suggests that problem 
based learning deepens high school students’ conceptual knowledge in Advanced Placement Social 
Studies and Science classes and can shrink achievement gaps between low and high SES elementary 
students in civics, economics, and citizenship coursework. 

The basic tenets of problem-based learning include: 

• Student-centered and generated learning, 

• Struggle and collaboration around authentic and ill-defined problems, 

• Extensive use of formative assessment and instructional coaching to scaffold student 

learning and skill proficiency, 

• Innovation achieved through creative problem-solving, 

• Making “the work” public through collaboration in teams to reach learning benchmarks, and 

• Balanced theoretical content and real-world knowledge. 

Barron and Darling-Hammond (1998) argue that problem- and project-based learning offers 
challenges to teachers by demanding that they simultaneously develop curriculum, change and 
improve instruction, and develop richer assessment practices. In the schools where it works, both 
teachers and students are engaged in problem-based learning and creative problem-solving to tackle 
difficult and persistent problems. Just as the learning looks fundamentally different for students, the 
work of teachers should look different. 

Teachers have much to gain by enacting a problem-based model in their own work and 
learning endeavors. Just as student work is most effective when it is structured as inquiry around an 
ill-defined and authentic problem, teachers learn more when they collaborate with other teachers, 
especially when that learning centers on ill-defined problems evidenced by student work and 
artifacts from their classes (Ball and Cohen, 1999; Cochran-Smith and Lytle, 1999; Hargreaves and 




Fullan, 2010). For teachers, some of the most powerful problem-based learning occurs in 
collaborative settings. 

Strong teacher collaborative groups share certain characteristics. Teachers establish a shared 
commitment to the goals of the work and common norms to govern how they interact (Grossman, 
Wineburg, and Woolworth, 2001). They focus their efforts on solving relevant and authentic 
problems of practice (Cochran-Smith and Lytle, 1999; McLaughlin and Talbert, 2006; Horn and 
Little, 2009). In addition, strong collaborative groups develop routines to efficiently complete tasks 
(Hammerness et al., 2005) and create a group culture that increases instructional capacity of teachers 
(Cochran-Smith and Lytle, 1999; Hargreaves and Fullan, 2010). Barron and Darling-Hammond 
(1998) offer guidance on how teachers can focus their collaboration to find creative ways to support 
struggling students in a PBL context. 

It should be noted, however, that the specific brand of PBL Sammamish teachers and school 
leaders established was anchored in the research literature but was specific to the students 
Sammamish serves. The Key Elements framework reflects a locally designed and developed PBL 
framework, informed by research and teachers’ classroom practice. 



Chapter 2: Fidelity of Implementation 

This project focused a significant portion of its resources on redesigning the high school 
curriculum and an extant summer program using a Problem-Based Learning (PBL) model. To 
explore potential impacts of this effort, the evaluation team engaged in three exploratory studies: 

1. An Interrupted Time Series (ITS) research design which used historical and recent data 
(school years 2002-03 through 2014-15) to investigate whether changes in AP Test pass rates 
coincided with the implementation of the intervention (i.e., Problem-Based Learning), 

2. A Pre/Post research design using matched treatment and comparison groups from school 
years 2005-06 through 2011-12 to explore relationships between exposure to PBL and 
student performance on AP Test scores, and 

3. A Pre/Post Quasi-experimental research design (QED) using matched treatment and 
comparison groups to investigate the impacts of a revised summer program on college 
readiness as measured by the Campus Ready instrument. 

In our evaluation report, we combine studies 1 and 2 into one exploratory study and present 
findings from study 3 separately. 
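
To make the first design concrete, the following is a minimal sketch of an interrupted time series set up as a segmented regression of yearly AP Test pass rates on time, a post-intervention indicator, and time since the intervention. The pass-rate values, the 2010-11 changepoint, and the statsmodels call are assumptions for illustration only; they do not reproduce the evaluation team’s model or data. 

# Minimal interrupted time series sketch: segmented regression of yearly AP pass rates.
# The pass-rate values below are invented for illustration; they are not SHS data.
import pandas as pd
import statsmodels.formula.api as smf

years = list(range(2003, 2016))  # school years 2002-03 through 2014-15
pass_rate = [55, 57, 56, 58, 60, 59, 61, 62, 65, 67, 70, 72, 74]  # hypothetical percentages

df = pd.DataFrame({"year": years, "pass_rate": pass_rate})
df["time"] = df["year"] - df["year"].min()            # years since the start of the series
df["post"] = (df["year"] >= 2011).astype(int)         # 1 once PBL implementation begins (assumed)
df["time_since"] = (df["year"] - 2011).clip(lower=0)  # years elapsed since the intervention

# pass_rate = b0 + b1*time + b2*post (level change) + b3*time_since (slope change)
model = smf.ols("pass_rate ~ time + post + time_since", data=df).fit()
print(model.summary())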

Before discussing the methods and findings of these studies it is important to have a clear 
picture of the specific strategies that the project implemented so that a connection can be made 
between the project implementation and the outcome findings. A major role of the evaluation was 
to help project leadership define and refine this picture in terms of the project’s critical components. 
Using PBL as their over-arching pedagogical vision, the school and evaluation team developed the 
Logic Model to provide the road map for what needed to happen and when and provides a larger 
context and rationale to understand the evaluation team’s Exploratory Studies. Each component is 
necessary for the project to have fidelity to the model. Ideally, these components describe the 
project in enough detail and with sufficient criteria so that the “same” project could be implemented 
by other schools and districts and thus, could lead to similar outcome results. This evaluation task 
resulted in a revised logic model specific to the research foci listed above as well as a ‘fidelity of 
implementation’ rubric. This rubric describes each of the critical project components and a system 
for measuring the extent to which implementation was carried out in terms of desired quality and 
intensity. 

Beginning with the original logic model and then analyzing qualitative and quantitative data 
collected by the evaluation team, seven components of the overall study emerged as critical for the 
fidelity of implementation model. These fell into two categories, Capacity-Building and PBL 
Implementation. Multiple indicators with criteria are used to evaluate the level of implementation 
fidelity for each component. The overall project fidelity of implementation components include: 

Capacity-Building 

Component 1: Increasing Teacher Pedagogical Expertise: Sammamish Institute for Learning and 

Teaching (SILT) 




Component 2: Designing a Rigorous Curriculum: PBL Courses (redesign or new development) 
through PBL Design Teams 

Component 3: Monitoring and Supporting Career Readiness: EPIC Campus Ready Survey, 
Application Support 

Component 4: Development/Adaptation/Adoption of Research Based Framework for PBL 
Design, Implementation, and Evaluation 

Component 5: Developing Distributed Expertise to Support a Rigorous Curriculum: Leadership 
Team 

PBL Implementation 

Component 6: Increasing Rigor and Focusing on 21st Century Skills in Curriculum: PBL Course 
Implementation 

Component 7: Focusing on 1st Generation College Bound Students and Developing a PBL 
Laboratory: Starting Strong Summer PBL Implementation 


The Exploratory Studies Logic Model below provides a graphical representation of each of 
these components and the logic chain between them and the outcomes measured in the exploratory 
studies. In general, the exploratory research questions were: 

1. Does student participation in the redesigned PBL curriculum lead to increased AP Test pass rates, especially 
by students with disabilities (SWD) and limited English proficient (LEP) students? 

2. Is student participation in the redesigned PBL curriculum associated with higher AP Test scores and with an 
independent measure of college readiness (i.e., Campus Ready)? 

Each component described below had at least one measurable indicator. In the following 
tables, each of the seven components and its corresponding indicators are designated by a numeral 
followed by a letter. The numeral refers to the component while the lower-case letter refers to its 
corresponding indicator. For example, Component 1a refers to the component “Increasing Teacher 
Pedagogical Expertise” and its first indicator, “Quality of design.” 




Exploratory Studies Logic Model 
Impact of PBL Curriculum Implementation on College Readiness 

INPUTS 

* Internal Expertise: Leadership; Faculty (Content Knowledge, Pedagogy, PD Facilitation, Technology, ELL/SWD); Support; Assessment Database 
* Leadership Structure: Principal/PD; Project Leaders [2]; Project Manager; Implementation Team; Evaluation 
* External Expertise: U Washington; Content; Corporate Mentors; Parents; Community; Advisory Board 
* Match Partners: GLEF; WA STEM; BSF; EPIC; College Board; Google; Microsoft; STEPS 
* Culture: Respect/Credibility; Intellectual Risk-Taking; Data-Driven Decision-Making 
* Tech. Infrastructure 
* Learning Resources 

ACTIVITIES 

Capacity-Building Activities 
* Component 1: Increasing Teacher Pedagogical Expertise (SILT) 
* Component 2: Designing a Rigorous Curriculum: PBL Courses through PBL Design Teams 
* Component 3: Monitoring and Supporting Career Readiness: EPIC Campus Ready Survey and FAFSA Application Support 
* Component 4: Development/Adaptation/Adoption of Research-Based Framework for PBL Design, Implementation, and Evaluation 
* Component 5: Developing Distributed Expertise to Support a Rigorous Curriculum: Leadership Team 

PBL Implementation Activities 
* Component 6: Increasing Rigor and Focusing on 21st Century Skills in Curriculum: PBL Course Implementation 
* Component 7: Focusing on 1st Generation College Bound Students and Developing a PBL Laboratory: Starting Strong Summer PBL Implementation 

SHORT TERM OUTCOMES 

Teachers: Increased Knowledge of the Key Elements of PBL; Increased Use of the Key Elements in Practice 
Students: Increased Engagement in Learning; Increased Use of Student Voice; Increased Effective Student Collaboration; Increased Skills in Public Performance (Authentic Assessment) 

OUTCOMES 

Increased College Readiness: AP Exam Pass Rates; AP Exam Scores; Campus Ready Survey 
Increased Subgroup Impacts: SWD/LEPS AP Enrollment; SWD/LEPS AP Exam Pass Rates; SWD/LEPS AP Exam Scores 


Component 1: Increasing Teacher Pedagogical Expertise: Sammamish Institute for Learning and Teaching (SILT) 

1A. Quality of design: Percent of SILT institutes over 5 years that are designed in accordance with the SHS professional learning framework (Framing, Choice, Application, Reflection) 
Data Collection: Documentation 
Fidelity Scale: 1: 0% to 20% meet PL criteria; 2: 21% to 40%; 3: 41% to 60%; 4: 61% to 80%; 5: 81% to 100% 
Criterion: Low = 1, 2; Med = 3; High = 4, 5 

1B. Quality of delivery: Internal faculty expertise utilized; percent of faculty that design and/or lead sessions 
Data Collection: SILT design documents 
Fidelity Scale: 1: 0% to 10% of faculty design and/or present; 2: 11% to 20%; 3: 21% to 30%; 4: 31% to 40%; 5: 41% or more 
Criterion: Low = 1, 2; Med = 3; High = 4, 5 

1C. Participant engagement 1: Faculty attendance, the percent of faculty that attend each year: (sum of attendance each day) / (number of days * total faculty) 
Data Collection: SILT daily rosters 
Fidelity Scale: 1: 0% to 10% of faculty attend; 2: 11% to 25%; 3: 26% to 50%; 4: 51% to 75%; 5: 76% to 100% 
Criterion: Low = 1, 2; Med = 3; High = 4, 5 

1D. Participant engagement 2: Participant ratings; average overall SILT rating across all years 
Data Collection: SILT participant survey 
Fidelity Scale: 1: 0% to 20% rate SILT as relevant; 2: 21% to 40%; 3: 41% to 60%; 4: 61% to 80%; 5: 81% to 100% 
Criterion: Low = 1, 2; Med = 3; High = 4, 5 
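
To make the scoring rule concrete, the short sketch below applies the Component 1C attendance formula, (sum of attendance each day) / (number of days * total faculty), and maps the resulting percentage to the 1-5 fidelity score and the Low/Med/High criterion defined above. The attendance figures are invented for illustration; only the cut points come from the table. 

# Sketch: score the Component 1C attendance indicator against its fidelity scale.
daily_attendance = [62, 58, 65]   # hypothetical faculty attendance for a three-day SILT
total_faculty = 75                # hypothetical faculty count

attendance_pct = 100 * sum(daily_attendance) / (len(daily_attendance) * total_faculty)

def fidelity_score_1c(pct):
    # Cut points taken from the indicator 1C fidelity scale above.
    if pct <= 10:
        return 1
    elif pct <= 25:
        return 2
    elif pct <= 50:
        return 3
    elif pct <= 75:
        return 4
    return 5

score = fidelity_score_1c(attendance_pct)
criterion = "Low" if score <= 2 else ("Med" if score == 3 else "High")
print(f"Attendance: {attendance_pct:.1f}% -> fidelity score {score} ({criterion})")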




Component 2: Designing a Rigorous Curriculum: PBL Courses through PBL Design Teams 

2A. Exposure: Total number of design team meetings offered (i.e., release time provided) per course team 
Data Collection: PBL meeting logs 
Fidelity Scale: 1: 10 to 35 times per team; 2: 36 to 70; 3: 71 to 105; 4: 106 to 140; 5: 141 or more 
Criterion: Low = 1, 2; Med = 3; High = 4, 5 

2B. Participation: Percent of teachers that participate on 1 or more design teams 
Data Collection: Database 
Fidelity Scale: 1: 0% to 20% participate on a design team; 2: 21% to 40%; 3: 41% to 60%; 4: 61% to 80%; 5: 81% to 100% 
Criterion: Low = 1, 2; Med = 3; High = 4, 5 

2C. Participant engagement: Participant ratings; percent of team members rating the PBL design process as valuable to practice 
Data Collection: PBL team member interviews 
Fidelity Scale: 1: 0% to 20% rate the design process as valuable; 2: 21% to 40%; 3: 41% to 60%; 4: 61% to 80%; 5: 81% to 100% 
Criterion: Low = 1, 2; Med = 3; High = 4, 5 




Component 3: Monitoring and Supporting Career Readiness: EPIC Campus Ready Survey, Application Support 

3A. College Readiness Monitoring: Campus Ready Assessment administered to all students 
Data Collection: Survey data 
Fidelity Scale: 1: 0% to 50% response rate; 2: 51% to 70%; 3: 71% to 80%; 4: 81% to 90%; 5: 91% to 100% 
Criterion: Low = 1, 2; Med = 3; High = 4, 5 

3B. FAFSA Completion: Percent of seniors that fill out the FAFSA 
Data Collection: Documentation 
Fidelity Scale: 1: 0% to 50% response rate; 2: 51% to 70%; 3: 71% to 80%; 4: 81% to 90%; 5: 91% to 100% 
Criterion: Low = 1, 2; Med = 3; High = 4, 5 

3C. Early AP: Percent of students that take AP Human Geography during their freshman year 
Data Collection: Documentation 
Fidelity Scale: 1: 0% to 50% of freshmen take AP Hum Geo; 2: 51% to 70%; 3: 71% to 80%; 4: 81% to 90%; 5: 91% to 100% 
Criterion: Low = 1, 2; Med = 3; High = 4, 5 


Component 4: Development/Adaptation/Adoption of Research-Based Framework for PBL Design, Implementation, and Evaluation 

4A. Research-Based Framework: Adoption of a research-based model to guide project activities 
Data Collection: Documentation 
Fidelity Scale: 0: No framework is developed or adopted; 5: Research-based PBL framework is developed and/or adopted 
Criterion: Low = 1; High = 5 




Component 5: Developing Distributed Expertise to Support a Rigorous Curriculum: Leadership Team 

5A. Leadership Team Membership: Percent of core departments represented on the Leadership Team (8 core departments), five-year analysis 
Data Collection: Documentation 
Fidelity Scale: 1: 0% to 19% of departments represented; 2: 20% to 39%; 3: 40% to 59%; 4: 60% to 79%; 5: 80% to 100% 
Criterion: Low = 1, 2; Med = 3; High = 4, 5 

5B. Leadership Team Roles: Team member roles specifically defined in terms of duties, responsibilities, and authority 
Data Collection: Documentation 
Fidelity Scale: 1: Roles not explicitly defined; 5: Roles explicitly defined 
Criterion: Low = 1; High = 5 

5C. Meeting Frequency: Frequency of Leadership Team meetings, five-year analysis 
Data Collection: Documentation 
Fidelity Scale: 1: None or yearly; 2: Quarterly; 3: Monthly; 4: Weekly; 5: Several times per week 
Criterion: Low = 1, 2; Med = 3; High = 4, 5 

5D. Design Team Support: The average number of yearly visits to each design team to provide support 
Data Collection: Interviews, documentation 
Fidelity Scale: 1: 0 to 1 visit; 2: 2 visits; 3: 3 to 4 visits; 4: 5 to 6 visits; 5: 7 or more visits 
Criterion: Low = 1, 2; Med = 3; High = 4, 5 

5E. Advisory Board: Frequency of yearly Advisory Board meetings 
Data Collection: Documentation 
Fidelity Scale: 1: 0 to 1 meetings; 2: 2 meetings; 3: 3 meetings; 4: 4 meetings; 5: More than 4 meetings (as needed) 
Criterion: Low = 1, 2; Med = 3; High = 4, 5 




Component 6: Increasing Rigor and Focusing on 21 st Century Skills in Curriculum: PBL Course Implementation 


Indicators and Operational Definitions 

Data Collection 

Fidelity Scale 

Criterion 

6A. Exposure to PBL: Percent of courses 
redesigned as PBL out of all courses 

SHS Course 

Catalog 

PBL Classroom 
Observations 

1: 0% to 20% of the slated courses 
implemented 

2: 21% to 40% of the slated courses 
implemented 

3: 41% to 60% of the slated courses 
implemented 

4: 61% to 80% of the slated courses 
implemented 

5: 81% to 100% of slated courses 
implemented 

Low=l,2 

Med=3 

High=4,5 

6B. Quality of PBL Delivery 1: Percent of
courses that meet PBL Criteria (Sampled)
(measured during project year 4)

PBL Observation 
Protocol 

Teacher 

Interviews 

1: 0% to 20% of courses meet criteria 

2: 21% to 40% of courses meet criteria 

3: 41% to 60% of courses meet criteria 

4: 61% to 80% of courses meet criteria 

5: 81% to 100% of courses meet 
criteria 

Low=1,2

Med=3 

High=4,5 

6C. Quality of PBL Delivery 2: Percent of 
courses that meet PBL Criteria as assessed 
by Leadership Team [measured during 
project year 5] 

Leadership Team 
Assessment 

1:0 to 20% of courses meet criteria 

2: 21 to 40% of courses meet criteria 

3: 41 to 60% of courses meet criteria 

4: 61 to 80% of courses meet criteria 

5: 81 to 100% of courses meet criteria 

Low=1,2

Med=3 

High=4,5 

6D. Quality of PBL Delivery 3: Percent of
courses that meet PBL Criteria as assessed
by seniors (measured during project years 4
and 5)

Senior Survey 
(years 4 and 5) 

1: 0% to 20% of courses meet criteria 

2: 21% to 40% of courses meet criteria 

3: 41% to 60% of courses meet criteria 

4: 61% to 80% of courses meet criteria 

5: 81% to 100% of courses meet 
criteria 

Low=1,2

Med=3 

High=4,5 

6E. Teacher Adoption of PBL
Innovation: Percent of design team
teachers whose adoption curves match the
desired CBAM trajectory - five year analysis

Concerns Based 
Adoption Survey 

1: 0% to 20% of teachers match the desired adoption curve

2: 21% to 40% of teachers match the desired adoption curve

3: 41% to 60% of teachers match the desired adoption curve

4: 61% to 80% of teachers match the desired adoption curve

5: 81% to 100% of teachers match the desired adoption curve

Low=1,2

Med=3 

High=4,5 




Component 7: Focusing on 1st Generation College Bound Students and Developing a PBL Laboratory: Starting Strong
Sub-component 7A: Design of PBL Challenges and Workshops


Indicators and Operational 
Definitions 

Data Collection 

Fidelity Scale 

Criterion 

7A.1. Co-designed with External 
Partners: The design process 
transcends the boundaries of the 
school by engaging industry partners 
in the design of the PBL challenges. 
Definition: 

Each challenge involves at least 1 
active external partner in the design 
process. 

Interviews with director and 
analysis of design documents 

Frequent: the evaluator is present in
some design meetings

1 : 0% to 20% of challenges 
designed with external 
partner 

2: 21% to 40% of challenges 
designed with external 
partner 

3: 41% to 60% of challenges 
designed with external 
partner 

4: 61% to 80% of challenges 
designed with external 
partner 

5: 81% to 100% of 
challenges designed with 
external partner 

Low=1,2

Med=3 

High=4,5 

7A.2 Authentic Challenges and Workshop Content: The content of Starting Strong is authentic to real contexts and related to college and career readiness

Student Survey

Pre, mid, post

1: 0% to 20% of participants rate their challenge as authentic

2: 21% to 40% of participants rate their challenge as authentic

3: 41% to 60% of participants rate their challenge as authentic

4: 61% to 80% of participants rate their challenge as authentic

5: 81% to 100% of participants rate their challenge as authentic

Low=1,2

Med=3

High=4,5




Sub-component 7B: Implementation of PBL Challenges and Workshops


Indicators and Operational 
Definitions 

Data Collection 

Fidelity Scale 

Criterion 

7B.1. Co-Facilitated with External 
Partners: The implementation of the 
PBL challenges involves the external 
partner as a co-facilitator. 

Observation during Starting 
Strong 

Each PBL challenge is 
observed for at least 30 
minutes 3 times each during 
Starting Strong. 

Observers score each
observation as a 0 (not
co-facilitated) or 1 (co-facilitated
with external partner)

1: 0% to 20% of challenges co-facilitated

2: 21% to 40% of challenges co-facilitated

3: 41% to 60% of challenges co-facilitated

4: 61% to 80% of challenges co-facilitated

5: 81% to 100% of challenges co-facilitated

Low=1,2

Med=3 

High=4,5 

7B.2. Attendance on Critical Days:

Students must attend and participate
for a minimum amount of time and
days during the 'critical days.' Critical
days are defined as days 2 through 6
since that is where the bulk of the
PBL intensive work occurs.

Attendance logs 

Collected daily 

1: 0% to 20% of students attend 
days 2-6 

2: 21% to 40% of students attend 
days 2-6 

3: 41% to 60% of students attend 
days 2-6 

4: 61% to 80% of students attend 
days 2-6 

5: 81% to 100% of students attend 
days 2-6 

Low=1,2

Med=3 

High=4,5 

7B.3. Active Student Participation 
& Collaboration: Percent of students 
that participate actively with their 
peers in collaborative groups. 

Student Surveys 

Pre, mid, post Starting Strong 

1: 0% to 20% collaborate 
effectively 

2: 21% to 40% collaborate 
effectively 

3: 41% to 60% collaborate 
effectively 

4: 61% to 80% collaborate 
effectively 

5: 81% to 100% collaborate 
effectively 

Low=1,2

Med=3 

High=4,5 

7B.4. Authentic Assessment: 

Percent of students that demonstrate 
their knowledge, skills and 
understanding through public 
performance 

Attendance logs 

1: 0% to 20% of students present publicly

2: 21% to 40% of students present publicly

3: 41% to 60% of students present publicly

4: 61% to 80% of students present publicly

5: 81% to 100% of students present publicly

Low=1,2

Med=3 

High=4,5 




Scoring 

Because this was a development project, the criteria and thresholds for fidelity of implementation did not exist previously; they were instead determined by the actual implementation data collected via interviews, surveys, observations, and document review. From these data the following thresholds were established for each component. Crossing these thresholds equates to fidelity of implementation (a brief illustrative sketch of this decision rule follows the table).


Component 

Threshold 

1 

Must have a score of high on at least three of the indicators AND no score of low on any 
indicator to meet fidelity at the School Level. 

2 

Must have a score of high on at least two of the indicators AND no score of low on any 
indicator to meet fidelity at the School Level. 

3 

Must have a score of high on at least two of the indicators AND no score of low on any 
indicator to meet fidelity at the School Level. 

4 

Must have a score of high to meet fidelity at the School Level. 

5 

Must have a score of high on at least three of the indicators AND no more than one score 
of low on any indicator to meet fidelity at the School Level. 

6 

Must have a score of high on at least three of the indicators AND no score of low on any 
indicator to meet fidelity at the School Level. 

7A&7B 

Must have a score of high on at least three of the indicators AND no score of low on any 
indicator to meet fidelity at the School Level. 
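To make the decision rule concrete, the following minimal sketch (in Python) checks one component's Low/Med/High indicator ratings against a threshold of the kind listed above. The function name and the example ratings are ours and purely illustrative, not actual study data.

    # Illustrative only: check whether one component's indicator ratings meet
    # a fidelity threshold of the form "at least N highs and at most M lows".
    def meets_fidelity(ratings, min_high, max_low=0):
        """ratings is a list of 'low'/'med'/'high' strings for one component."""
        highs = sum(r == "high" for r in ratings)
        lows = sum(r == "low" for r in ratings)
        return highs >= min_high and lows <= max_low

    # Hypothetical examples: Component 1 requires at least 3 highs and no lows;
    # Component 5 allows at most one low.
    print(meets_fidelity(["high", "high", "high", "med"], min_high=3))                    # True
    print(meets_fidelity(["high", "high", "high", "low", "med"], min_high=3, max_low=1))  # True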


Analysis 

On nearly every component evaluated, Sammamish High School implemented policies and tools to 
support PBL design and implementation to a high degree of fidelity. 



Component 

Project Years 

Threshold Criteria 

Fidelity Score 

Implemented with 
Fidelity? 

Notes 

1 

Increasing Teacher Pedagogical 
Expertise: Sammamish Institute for 
Learning and Teaching (SILT) 

1 - 5 

High fidelity at 
the school level 
= at least 3 of 4 
indicators have 
a score of high 
and no indicator 
has a score of 
low 

Indicators: 4 
Highs: 4 

Meds: 0 

Lows: 0 

Yes 

SILT and other professional
development provided by the
project were a strength.

2 

Designing a Rigorous Curriculum: 

PBL Courses through PBL Design 
Teams 

1-5 

High fidelity at 
the school level 
= at least 2 of 3 
indicators have 
a score of high 
and no indicator 
has a score of 
low 

Indicators: 3 
Highs: 3 

Meds: 0 

Lows: 0 

Yes 

Teachers reported that time to 
design with colleagues was 
essential. 

3 

Monitoring and Supporting Career 
Readiness: EPIC Campus Ready 
Survey, Application Support 

1 - 5 

High fidelity at 
the school level 
= at least 2 of 3 
indicators have 
a score of high 
and no indicator 
has a score of 
low 

Indicators: 3 
Highs: 2 

Meds: 1 

Lows: 0 

Yes 

Campus Ready administered 
consistently. Support for 
college application was not a 
visible part of project. 

4 

Development/Adaptation/Adoption 
of Research-Based Framework for 
PBL Design, Implementation and 
Evaluation 

1-5 

High fidelity at 
the school level 
= indicator 
must have a 
score of high 

Indicators: 1 
Highs: 1 

Meds: 0 

Lows: 0 

Yes 

Data suggests that the redesign 
of courses based on the 
conceptual framework was 
more important than 
implementing a specific 
instructional approach (PBL) 
that is consistent with the 
conceptual framework. 

5 

Developing Distributed Expertise to 
Support a Rigorous Curriculum: 
Leadership Team 

1 - 5 

High fidelity at 
the school level 
= at least 3 of 5 
indicators have 
a score of high 
and no indicator 
has a score of 
low 

Indicators: 5 
Highs: 3 

Meds: 2 

Lows: 0 

Yes 

The leadership team was
essential. Data suggest that
greater clarity around the roles and
membership of the leadership
team, and more intense support
during course implementation,
would lead to better
implementation.



6 

Increasing Rigor and Focusing on 21st Century Skills in Curriculum: PBL Course Implementation

1-5

High fidelity at
the school level 
= at least 3 of 5 
indicators have 
a score of high 
and no indicator 
has a score of 
low 

Indicators: 6 
Highs: 3 

Meds: 2 

Lows: 0 

Yes 

Not all redesigned courses 
were implemented. 

7 

Focusing on 1st Generation College Bound Students and Developing a PBL Laboratory: Starting Strong

1

High fidelity at
the school level 
= at least 3 of 6 
indicators have 
a score of high 
and no indicator 
has a score of 
low 

Indicators: 6 
Highs: N/A 
Meds: N/A 
Lows: N/A 

N/A 

The revised Starting Strong 
summer program had not been 
‘PBL-revised’ by the end of 
project year one. 

7 

Focusing on 1st Generation College Bound Students and Developing a PBL Laboratory: Starting Strong

2

High fidelity at
the school level 
= at least 3 of 6 
indicators have 
a score of high 
and no indicator 
has a score of 
low 

Indicators: 6 
Highs: 3 

Meds: 0 

Lows: 0 

Yes 

The redesigned summer 
program was initiated but full 
compatibility with PBL had not 
yet occurred. 

7 

Focusing on 1st Generation College Bound Students and Developing a PBL Laboratory: Starting Strong

3-5

High fidelity at
the school level 
= at least 3 of 6 
indicators have 
a score of high 
and no indicator 
has a score of 
low 

Indicators: 6 
Highs: 0 

Meds: 0 

Lows: 0 

Yes 

The redesigned summer 
program became an exemplary 
instantiation of Problem Based 
Learning (using the Key 
Elements conceptual 
framework as criteria.) 



Chapter 3: Qualitative Methods 

In our evaluation we use a mixed methods approach to triangulate the data and evaluate the impact the innovation (PBL) had on student learning outcomes. We use quantitative methods to compare student achievement outcomes for students from the 2004, 2005, and 2006 cohorts with students from the 2011 cohort. A student cohort is named for the year in which its students started as freshmen at Sammamish High School. For students to be included in a cohort sample, they must have completed all four years of high school at Sammamish High School, among other criteria we discuss at length below. We also use the Concerns-Based Adoption Model (CBAM) survey to measure teachers' concerns as they worked to adopt and implement the innovation (PBL) in their classes. Our qualitative measures include teacher interviews, focus groups, and classroom observations. Below we describe in detail how we collected, coded, analyzed, and validated the data to establish our findings.

Qualitative Measures 

We use teacher interviews to evaluate how teachers were defining PBL over time and how they 
perceived its efficacy in their classroom practice. We also observed each redesigned course three times in 
the span of four months in 2013-2014 to determine the extent to which teachers were teaching PBL 
pedagogy in their day-to-day classroom practice. We describe our process for establishing protocols, 
collecting data, coding and analyzing the data, and validating our findings below. 

Teacher interviews 

Teachers, teacher leaders, and administrators were interviewed multiple times between 2010 and 2015. Teachers and teacher leaders were interviewed at various points from 2010-2014 to gain their perspective on how the PBL curriculum design and implementation process was unfolding. Teacher leaders and the principal were also interviewed multiple times from December 2014 through July and August 2015 to gain their perspective on their duties and responsibilities as school leaders throughout the life of the grant. In some cases there was overlap between the teachers who had been interviewed at various points since 2010 and the teachers who participated on the leadership team and who were then interviewed 4 times between December 2014 and July/August 2015.

Teachers, teacher leaders, and the project manager (school principal) were interviewed between 
November 2013-January 2014. The research team interviewed teachers who had been funded members of 
a curriculum redesign team at any point between 2010-2013. We did not interview teachers who had 
participated on a design team but who had left the school. This sample included 36 teachers in the Math, 
Science, Social Studies, English, Physical Education, and Career and Technical Education departments. 
Interviews lasted between 30 and 90 minutes, depending on the teacher and their responses to the questions we asked.

A semi-structured interview protocol, adapted from the Levels of Use interview protocol (Hall, Dirksen, and George, 2006), was used for all interviews. The research team designed and field-tested the protocol with 2 teachers prior to its use for formal data collection purposes. Three staff researchers conducted the interviews. Each interviewer was given the flexibility to adjust the interview as needed while making sure that each teacher was asked the same questions. We used a semi-structured protocol to provide the research team with specific questions designed to garner specific information from



teachers but that would be flexible enough to allow teachers the space to respond to issues or topics that we may not have asked about.

After the research team conducted all necessary interviews, each researcher transcribed the interviews they conducted. Interviews were transcribed using Microsoft Word and uploaded to a Dropbox folder. Two copies of each interview were saved: one anonymized (Excel file) and one with identifiable information attached (Word file). Another member of the research team, who did not conduct the interviews or have any knowledge of the school or project, transferred each interview into an Excel file and anonymized each interview by teacher but nested each file into department groups.

Analysis of each interview was conducted in two stages. Anonymized interviews were redistributed 
to researchers who then conducted initial open coding of one interview. The research team then met to 
review each open coded interview to surface patterns and themes that emerged and to conduct a more 
structured coding process of each interview. More than 40 codes emerged from this process. Codes were 
developed around the research questions used to drive the interview process. Questions include: 

• What does PBL look like? 

• How are teachers differentiating, if at all, between the Key Elements and PBL? 

• Who is teaching PBL? To what extent? 

• What do teachers think about it? 

Based on the analysis performed by the research team, teachers were given numbers describing their level 
of understanding and use of PBL and the Key Elements according to the Level of Use (LOU) interview 
protocol. For example, a teacher who described their understanding and implementation of PBL in 
sophisticated and complex ways was given a 6 or 7 depending on the specifics of their interview. Teachers 
who described their understanding of PBL in simplistic ways and whose interview revealed little if any 
implementation of PBL pedagogy in their classroom received a 0 or 1 rating. Table 3, taken directly from 
Hall, Dirksen, and George (2006), illustrates the categories used to classify each teacher. 

Table 3 


Levels of Use (LOU) 

Scale Point 

Description 

Level 0: Nonuse 

State in which the user has little or no knowledge of the 
innovation, has no involvement with the innovation, and is 
doing nothing toward becoming involved. 

Level 1: Orientation 

State in which the user has acquired or is acquiring 
information about the innovation and/or has explored or is 
exploring its value orientation and its demands upon the user 
and the user system. 

Level 2: Preparation 

State in which the user is preparing for first use of the 
innovation. 

Level 3: Mechanical Use 

State in which the user focuses most effort on the short-term, 
day-to-day use of the innovation with little time for reflection. 
Changes in use are made more to meet user needs than client 
needs. The user is primarily engaged in a stepwise attempt to 
master the tasks required to use the innovation, often resulting 





in disjointed and superficial use. 

Level 4: Routine 

Use of the innovation is stabilized. Few if any changes are 
being made in ongoing use. Little preparation or thought is 
being given to improving innovation use or its consequences. 

Level 5: Refinement 

State in which the user varies the use of the innovation to 
increase the impact on clients within immediate sphere of 
influence. Variations are based on knowledge of short and 
long-term consequences for clients. 

Level 6: Integration 

State in which the user is combining own efforts to use the 
innovation with the related activities of colleagues to achieve a 
collective impact on clients within their common sphere of 
influence. 

Level 7: Renewal 

State in which the user reevaluates the quality of use of the 
innovation, seeks major modifications or alternatives to the 
present innovation to achieve increased impacts on clients, 
examines new developments in the field, and explores new 
goals for self and the system. 


After the research team rated each teacher based on their interview responses, we grouped teachers by department and gave each department an average score. Scores for departments ranged from 2 to 6.5. This score was in no way a comprehensive or final statement of where we thought each department stood at that point in time. If anything, the interviews revealed variability between teachers within departments, many times with scores for teachers within a department ranging across nearly every LOU level. However, the score developed for each department informed how we interpreted other data we had collected and were collecting at the time.
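As a minimal sketch of this step, assuming entirely hypothetical teachers and ratings, the LOU scores (0-7) assigned from the interviews could be averaged by department as follows.

    # Illustrative only: average Levels of Use (LOU) ratings (0-7) by department.
    from collections import defaultdict

    # Hypothetical (teacher, department, LOU rating) tuples -- not actual study data.
    lou_ratings = [("t01", "Science", 6), ("t02", "Science", 3),
                   ("t03", "Math", 2), ("t04", "Math", 5)]

    by_dept = defaultdict(list)
    for _, dept, score in lou_ratings:
        by_dept[dept].append(score)

    dept_means = {dept: sum(scores) / len(scores) for dept, scores in by_dept.items()}
    print(dept_means)  # e.g. {'Science': 4.5, 'Math': 3.5}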

Classroom Observations 

We conducted classroom observations concurrently with teacher interviews between December 
2013-March 2014. We observed each teacher who was funded to participate on a design team, teaching a 
course they helped redesign. We conducted 3 observations of each teacher and negotiated with them to 
identify classes when they thought they would be teaching PBL curriculum. This process allowed us to be efficient with our time and to triangulate our observations of what PBL pedagogy looks like in the classroom with teachers' ideas about what PBL pedagogy looked like in practice.

We designed the protocol to align with the school’s Key Elements of Problem-Based Learning. 
This document articulates the specific kind of PBL teachers designed into their courses and guided all of 
the course redesign work that occurred between 2010-2015. The protocol consisted of two parts. First, 
because PBL is largely a student-centered and focused pedagogy, we wanted to observe the amount of 
class time focused on direct instruction (teacher-centered) and the amount of time focused on student- 
centered independent and collaborative learning. In 3-minute increments, during classroom observations, 
researchers marked whether the instruction was teacher-centered or student-centered. Second, researchers 
observed how many of the Key Elements were present during the lesson and to what extent. 
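A minimal sketch of how the 3-minute interval codes could be tallied appears below; the coding categories follow the protocol described above, but the interval values are hypothetical.

    # Illustrative only: convert 3-minute interval codes into rough percentages of
    # teacher-centered (T) vs. student-centered (S) class time.
    intervals = ["S", "S", "T", "S", "S", "S", "T", "S"]  # hypothetical observation codes

    student_pct = 100 * intervals.count("S") / len(intervals)
    teacher_pct = 100 * intervals.count("T") / len(intervals)
    print(f"student-centered: {student_pct:.0f}%, teacher-centered: {teacher_pct:.0f}%")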



Before we conducted classroom observations, we field-tested the protocol with all the researchers 
who were responsible for conducting observations. During field-testing, we established a high level of 
inter-rater reliability, especially regarding the Key Elements section of the protocol. Each observer was 
asked to read and understand the Key Elements ahead of time. All three researchers spent an entire day 
conducting observations and debriefing observations after each observed class to establish a common 
vision of what each Key Element looks like in the classroom. We also established ways of observing the presence of PBL components such as an authentic problem and authentic assessment even if we had not observed them specifically in an observation. For example, students may be working toward solving an authentic problem, but the task they were working on during one of our observations may not have been directly connected to the problem-solving task. As a team, we decided that even though solving the problem was not a specific task students were asked to work toward, because they were working within the context of a unit dedicated to an authentic problem, our observations would count authentic problems as an observed component of that specific class.

After we completed our classroom observations, we considered two questions before we started coding and analyzing the data from the observations. First, what was the threshold we were going to use to define whether or not PBL was being implemented in any particular classroom? The Key Elements document articulates 7 different components of PBL pedagogy. However, not every element contributes equally to establishing a high level of PBL in a lesson. Second, how were we going to describe PBL implementation in a way that takes into account all the different ways teachers were attempting to implement PBL throughout the school?

We spent an entire day as a research team discussing what we had observed and describing what we thought "counted" as PBL teaching and learning. We settled on the following hierarchy of elements to distinguish between teaching and learning that demonstrated PBL and teaching and learning that demonstrated PBL readiness but not full implementation.

• PBL Implementation is characterized by the presence of the following PBL elements:

o Authentic Problems
o Authentic Assessment
o Culturally Relevant Instruction

• PBL Readiness is characterized by the presence of the following PBL elements but lacks Authentic Problems and Authentic Assessment:

o Culturally Responsive Instruction
o Student Voice
o Academic Discourse
o Collaboration
o Expertise

Our observations revealed that a teacher may be leveraging high levels of student collaboration and 
culturally responsive instruction in his/her instruction, but if those elements are detached from an 
authentic problem students are working on or an authentic assessment students will complete at the end of 
a unit, then that teacher's instruction demonstrates PBL readiness but not PBL implementation. Likewise,
if a teacher anchors student learning in an authentic problem and will be assessing students’ learning in an 
authentic way, then they are also likely asking students to work collaboratively and they have made efforts 
to make learning culturally relevant to students, for example. Based on our observations, for student 




learning to count as problem-based learning, students must be engaged in authentic problems and must be 
assessed authentically, in ways that reflect how people are assessed when working within specific 
professions. 

Classroom observations served as a key measure of PBL intensity. Once all classroom observation data were collected, they were uploaded into a master Excel file. Data from each classroom observation were converted into a rough percentage of class time spent on teacher-centered instruction and on student-centered activity. As PBL is a heavily student-centered pedagogical model, this percentage provided us with a starting point for describing the extent to which any particular lesson had strong PBL elements. For a specific lesson to be considered a potential PBL lesson, student-centered activity needed to meet or surpass 75% of the observed class time. Table 4 below describes the threshold we used to differentiate between lessons that demonstrated PBL implementation and lessons that demonstrated PBL readiness.


Table 4. 


Establishing the Threshold for PBL Implementation and PBL Readiness 


PBL Readiness 

PBL Implementation 

Teacher-centered instruction 

>25% 

<25% 

Student-centered activities 

<75% 

>75% 

Key Elements observed 

Culturally responsive instruction 
Expertise 

Collaboration 

Academic discourse 

Student voice 

Authentic problems 
Authentic assessment 
Culturally relevant instruction 
^Expertise 
^Collaboration 
* Academic discourse 
^Student voice 

(*) Designates key elements that may or may not have been present in PBL Implementation lessons. 

Classroom observations reveal that in many cases, lessons that demonstrated students explicitly working toward solving an authentic problem and an authentic assessment also demonstrated strong use of other key elements.


Classroom observation data were analyzed using the above rubric to differentiate between lessons that exhibited PBL Readiness and lessons that exhibited PBL Implementation. In addition, we layered in data from teacher interviews to identify courses that met the threshold of PBL Implementation. We grouped together teachers we observed who taught the same courses and combined the teacher interview and classroom observation data. Courses were deemed PBL Exemplars if 1) the lessons teachers taught met the criteria for PBL Implementation and 2) the teachers scored highly (5-7) on our teacher interview ranking scale. At the time, those courses included AP Human Geography, AP United States Government, BioChem, Marketing, and AP Chemistry.
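The sketch below shows one way the rubric in Table 4, combined with the interview (LOU) ratings, could be applied to classify observed lessons and flag exemplar courses. The function names, the thresholds as expressed in code, and the example inputs are ours and are illustrative only.

    # Illustrative only: apply the Table 4 rubric plus the LOU interview rating.
    IMPLEMENTATION_ELEMENTS = {"authentic problems", "authentic assessment",
                               "culturally relevant instruction"}

    def classify_lesson(student_centered_pct, observed_elements):
        """Return 'PBL Implementation' or 'PBL Readiness' for one observed lesson."""
        if student_centered_pct >= 75 and IMPLEMENTATION_ELEMENTS <= set(observed_elements):
            return "PBL Implementation"
        return "PBL Readiness"

    def is_exemplar(lesson_classifications, lou_score):
        """One reading of the exemplar criteria: all observed lessons meet the
        Implementation threshold and the teacher scored 5-7 on the LOU scale."""
        return all(c == "PBL Implementation" for c in lesson_classifications) and lou_score >= 5

    # Hypothetical example:
    c = classify_lesson(80, ["authentic problems", "authentic assessment",
                             "culturally relevant instruction", "collaboration"])
    print(c, is_exemplar([c, c, c], lou_score=6))  # PBL Implementation True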



Chapter 4: Increasing Teacher Pedagogical Expertise 

In order to implement PBL across content-areas, school leaders deployed innovative approaches to 
building capacity in their teaching staff through a robust infrastructure of professional development. 
Believing that the best curriculum is that which is designed by existing staff to serve the specific needs of their students, the school made large investments in professional learning designed to build capacity in teachers
so that they could design engaging PBL coursework. At the time of the grant, a vast majority of the 
teachers had little knowledge, expertise, or experience with PBL. School leaders supported teachers' 
professional growth by 1) providing them the paid time and space to design PBL curriculum in design 
teams; 2) developing SILT, the teacher-led and designed summer professional learning experience; and 3) 
providing teachers further support to implement PBL in teacher-led and designed monthly staff meetings. 
Most importantly, each component of the professional learning infrastructure relied heavily on teachers' 
evolving PBL expertise to provide relevant and authentic professional development. 

Sammamish Institute of Learning and Teaching (SILT). The collaborative culture developing within design teams bled into how teacher leaders approached the design of SILT. Starting the spring before, teacher leaders identified teachers whose expertise matched well with the theme of the next summer's SILT. For example, if part of the next year's SILT would be dedicated to authentic
assessment, teacher leaders recruited teachers within the school who were experimenting with standards- 
based grading or assessing collaborative groups and asked them to design sessions for the upcoming SILT. 
Teacher leaders would set the agenda but teachers would design and lead the sessions. During one year of 
SILT, teacher leaders designed SILT like a conference, providing teachers with choices of sessions they 
could attend throughout the day. These two principles, teacher-designed and led sessions and teacher 
choice, established a culture of teacher professionalism and deepened teachers' pedagogical expertise. 
Teachers began to view their colleagues as their first and most important resource to improve their 
instructional practice. 

Monthly Staff Meetings. Although not new to the project, school leaders repurposed monthly staff 
meetings to support teachers’ ongoing PBL design and implementation process within the school. Instead 
of the principal standing in front of the staff issuing dictates, teachers led their colleagues in thoughtfully 
designed professional learning focused on some aspect of PBL design or implementation, usually aligned 
with the focus of that year's SILT workshop. Staff meetings afforded teachers the time and space to 
collaborate across content-areas on universal problems of practice, such as how to support ELLs as they 
struggled to work in collaborative groups within PBL focused units. Like the SILT workshops, staff 
meetings provided teachers with "just in time" professional learning opportunities where they could learn 
from their colleagues' successes and struggles.

PBL Curriculum Design Teams. To redesign traditional curriculum into PBL curriculum, school 
leaders established PBL curriculum design teams. Design teams, as they were referred to, consisted of a 
group of 3-5 teachers who had experience teaching the course they were redesigning. Membership was 
diverse and voluntary and was not constrained by seniority, years of teaching experience, or perceived 
expertise. Once on a design team, teachers were provided with a daily, common, planning period to meet 
to redesign an established course into a PBL course. 



School leaders designed and established a professional development infrastructure that supports 
professional growth and learning around problem based learning (PBL) pedagogy and principles. Each 
teacher may be on a different professional growth trajectory focused on various instructional practices, 
beliefs, and values, but the school expects that teachers immerse themselves in a process of collaborative 
instructional improvement and that they keep moving forward in that process throughout the year. The 
school developed three different, formal spaces where teachers could engage in professional development 
and growth: the Sammamish Institute for Teaching and Learning (SILT), monthly staff meetings, and PBL 
Design Teams. (Curricular design teams are described below as Component 2. These activities served 
both as a source of professional learning and as the process through which courses were redesigned.) We 
summarize each space briefly below and then describe each more in detail later in the chapter. 

Table 5 summarizes the way each professional learning context contributes to teachers’ 
professional growth. 


Table 5. 


Sammamish High School Professional Learning Infrastructure 


Purpose

SILT: Teachers build a common language about "the work"/PBL principles and pedagogy using the Key Elements as a focus. To disrupt "traditional" ways of thinking about students and student learning.

Staff meeting: Refine, revisit, further learning/knowledge experienced in SILT. Work with different groups throughout the year—design teams, department groups, interdisciplinary groups.

Design teams: Integration of PBL theory and collective practice in collaboration with other teachers. Examination of one's practice in a collaborative setting in the process of innovation and experimentation. Sustained collaboration to engage in cyclical process of reflection and inquiry.

Process

SILT: Choice. Interdisciplinary team plans it and represents a variety of perspectives in the building. Leverages expertise of partners, students, research literature. Teacher led. Knowledge processed collaboratively.

Staff meeting: Choice when appropriate. Highly structured and organized around focal Key Element. Teacher led with multiple teachers leading over the course of the year. Planned by teacher leadership team. Leverages collaborative and cooperative learning strategies. Focus on problems and practices that are authentic to teacher's individual practice.

Design teams: Shared task for a specific course they are teaching. Given resources/tools. Freedom to structure their own learning and collaborative practice. Collaborative with other teachers who draw from various expertise and experiences in the classroom. Results from teachers working through different opinions, values, beliefs about pedagogy and practice. Application of principles reified in the Key Elements to teachers' collective and individual practice.


Summer Institute of Learning and Teaching (SILT) 

Among other things, the funding received from the i3 grant facilitated an expansion and overhaul 
of how the school administered professional development and learning. Prior to receiving the grant, 
teachers participated in monthly staff meetings and two days of required professional development during 
the summer. In order to support teachers as they increased their knowledge and facility with PBL 
pedagogy, the school developed or adapted three components of professional learning: the summer 
institute of learning and teaching (SILT), PBL Design Teams, and monthly staff meetings. With few 
exceptions, teachers designed, developed, and led each component of their own professional learning. 

The school facilitated the SILT professional development days starting in August 2011 and 
continued through the 2014 school year. Designed by teachers and teacher leaders, SILT was a formal 
professional development experience. The school offered SILT during the last week of August, prior to 
the beginning of the school year. Although attendance was voluntary, the school paid teachers for their 
time if they attended. The topic for SILT varied but was largely aligned with further learning on specific 
Key Elements. 

The school adhered closely to three design principles as they worked to plan and implement SILT
from year to year. First, teachers designed and led the sessions having to do with the essential learning 
each SILT was designed around. For example, in 2011, teachers designed and led all the sessions having to 
do with student collaboration and literacy. Second, in most cases, teachers could choose which sessions to 
attend for at least part of each SILT day. Third, at the end of most SILT days, teacher session leaders, 
teacher leaders, administrators, and partners assembled to examine teacher survey data from that day’s 
SILT sessions. These discussions informed the group’s decision making from day to day and year to year. 

Each day generally proceeded as follows. Teachers would arrive and teacher leaders would debrief 
the staff on the agenda for the day. Teachers would choose sessions to attend until lunch. After lunch, 
teachers would meet with either their design teams or an inter-disciplinary professional learning 
community. Before each day was brought to a close, teachers took a survey rating the effectiveness and 
relevance of each component of the day’s schedule. After teachers took the survey, a group of teachers, 
teacher leaders, administrators, and representatives from partner organizations would assemble and unpack 
the survey data to discuss teachers’ perceptions of that day’s learning. 

Methodology 

The data from SILT comes from online surveys administered through the Survey Monkey 
platform. At the end of every SILT professional development day teachers took an online survey asking 
them to rate the effectiveness and relevance of the day overall and to rate the specific sessions they 
attended throughout the day. Rating questions were designed using a Likert scale. Table 6 below is an 
example of the Likert Scale we used to receive feedback from teachers. 




Table 6. 


Measuring Relevance at SILT 

Session 

How relevant were the following components of the morning session of today’s SILT? 

Defining challenge 

Not relevant 

Somewhat relevant 

Extremely relevant 

Did not attend 

Developing an action 
matrix on assessment 

Not relevant 

Somewhat relevant 

Extremely relevant 

Did not attend 


Questions asking teachers the extent to which they found SILT sessions relevant to their practice 
remained largely unchanged from year to year to ensure internal reliability for each question and the ability 
to compare teacher responses from year to year. In addition, teachers answered open-ended questions that 
asked them to extrapolate on their ratings of each session and the day overall. Each survey also contained 
questions that asked teachers to describe what they had learned from each session and the day overall. 

These surveys served two important functions. First, by looking at the number of responses to 
each survey, we can approximate the number of teachers who attended each SILT day. Although this is 
not an exact number, because for whatever reasons specific teachers may not have completed a survey, 
overall we are confident a vast majority of teachers who attended each SILT day also completed a survey 
for each day. Second, the surveys provide us a way to measure the extent to which teachers perceived 
SILT as a valuable learning tool that helped them improve their practice. 
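As a minimal sketch, the approximate participation rate for a SILT day can be computed directly from the number of survey respondents and the number of teachers on staff; the example below uses the SILT 2011, Day 1 counts reported in Table 9.

    # Illustrative only: approximate SILT participation from survey response counts.
    def participation_rate(respondents, possible_respondents):
        return 100 * respondents / possible_respondents

    print(f"{participation_rate(57, 70):.0f}% of staff responded")  # 81% (SILT 2011, Day 1)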

Research questions 

In this section, we address the following questions regarding the professional development the 
school developed to support teachers’ learning. 

• To what extent did teachers participate in SILT? 

• To what extent did teachers find their professional development in SILT to be authentic and 
relevant to their practice? 

• To what extent did teachers learn from their professional development experiences in SILT? 

Findings summary 

• SILT was implemented to a high degree of fidelity as defined by the logic model produced by the
school and the evaluation team. Especially strong factors that contributed to a high degree of
fidelity include:

o Using teachers to design and lead sessions, 

o Designing each SILT day allowing for teachers to choose which sessions they attended at 
various times throughout the day, and 

o High participation rate amongst teachers from day-to-day and year-to-year. 

• A majority of teachers on staff at the beginning of each year participated in SILT. 

o Teacher exposure to PBL pedagogy and principles was high. 

• Teachers perceived the SILT professional development experience positively with the majority of 
respondents stating that the learning was both relevant and useful to their practice. 

• Specifically, teachers pointed to several features of the SILT experience as especially powerful in 
terms of their own learning including: 




o Learning focused on topics and themes teachers perceived as relevant to their practice, 
o Teacher-developed and led sessions, 
o The ability to choose particular sessions of interest, and 

o Adjusted agendas and learning objectives based on survey feedback from teachers. 


Findings 

In the following section, we present our findings in two ways. First, we present data to evidence 
the extent to which SILT rates highly according to our fidelity measures. These measures include an 
analysis of 1) teacher exposure to SILT, 2) SILT quality, 3) teacher engagement in SILT according to 
teacher attendance, and 4) teacher engagement in SILT according to how teachers rated the effectiveness 
of each SILT experience. Second, we use our data to more fully answer the research questions stated above having to do with the relevance, authenticity, and practicality of SILT sessions.

Fidelity measures 

Our data show SILT was implemented to a high degree of fidelity in the first three years of the grant. As the school refined its approach to designing and implementing SILT from year to year, some measurements, such as exposure, became harder to take. For example, in the third year of SILT, teachers were paid to meet for 8 hours over the summer to work on problems of practice of their choice. Survey data show many teachers participated in this component of SILT, but because their participation was self-reported, it is difficult to say with any certainty just how many actually did. However, those teachers who did participate rated that specific component of SILT highly.


SILT exposure 

As illustrated in Table 7 below, our data show that participating teachers who attended every day 
of SILT during a given year had high exposure to learning focused on PBL pedagogy and principles. 


Table 7. 


Teacher Participation in SILT 

Year 

Exposure as measured in hours 

Teacher participation as measured by the 
average number of teachers who completed 
daily surveys for each SILT year 

2011 

48 hours (6 days, 8 hours a day) 

52 

2012 

40 hours (5 days, 8 hours a day) 

49 

2013 

24 hours (3 days, 8 hours a day) 

44 

2014 

24 hours (3 days, 8 hours a day) 

32 


Based on the data presented in the table above, Table 8 below shows the overall fidelity rating SILT receives based on a measurement of exposure.


Table 8. 


Fidelity Rating: SILT 

Component: SILT exposure

Operational definition: Exposure: Total Duration — the total number of hours of training offered

Data collection: SILT design documents; SILT observations

Fidelity scale: 1: 0 to 10 hours per SILT; 2: 11 to 20 hours per SILT; 3: 21 to 30 hours per SILT; 4: 31 to 40 hours per SILT; 5: 41 to 50 hours per SILT

Criterion for adequate/high fidelity of implementation: Low = 1, 2; Med = 3; High = 4, 5
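A minimal sketch of the exposure scale above, applied to the hour totals reported in Table 7, might look like the following; the function name is ours.

    # Illustrative only: map total SILT hours to the 1-5 exposure fidelity scale.
    def exposure_rating(hours):
        if hours <= 10: return 1
        if hours <= 20: return 2
        if hours <= 30: return 3
        if hours <= 40: return 4
        return 5  # 41 to 50 hours per SILT

    for year, hours in [(2011, 48), (2012, 40), (2013, 24), (2014, 24)]:
        rating = exposure_rating(hours)
        band = "High" if rating >= 4 else ("Med" if rating == 3 else "Low")
        print(year, hours, rating, band)  # e.g. 2011 48 5 High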


Question 1: To what extent did teachers participate in SILT? 

Teachers were surveyed at the end of each SILT experience. Some years, such as in 2011 and 2012, the school surveyed teachers at the end of each SILT day to gauge how relevant the specific learning for that day was for their daily practice and to inform tweaks and revisions the leadership team would make for the next day's SILT focus. In other years, such as in 2013 and 2014, the school surveyed teachers periodically during SILT and then at the end of the professional development experience as a whole. Although not a perfect measure of teacher engagement in SILT, we use surveys as a measure of teacher participation. Table 9 shows the extent of teachers' participation in SILT from 2011 to 2014.


Table 9. 


SILT Data Collection Summary 

SILT Session 

Number of respondents 

Number of possible 
respondents 

SILT 2011, Day 1 

57 

70 

SILT 2011, Day 2 

56 

70 

SILT 2011, Day 3 

51 

70 

SILT 2011, Day 4 

49 

70 

SILT 2011, Day 5 

51 

70 

SILT 2011, Day 6 

50 

70 

SILT 11-16-11 Feedback 

66 

70 

SILT 2012, Day 1 

54 

81 

SILT 2012, Day 2 

50 

81 

SILT 2012, Day 3 

45 

81 

SILT 2012, Day 4 

44 

81 

SILT 2012, Day 5 

53 

81 

SILT Year 3 Survey 

50 

76 

SILT 2013, Day 1 

41 

76 

SILT 2013, Day 1, survey 2 

40 

76 

SILT Follow Up Faculty Survey 

34 

76 

SILT Spring 2014 

36 

75 

SILT Summer 2014 

27 

75 


Question 2: To what extent did teachers find their professional development in SILT to be 
effective and relevant to their practice? 

Teachers perceived SILT to be relevant to their classroom practice and an effective use of their time overall. When asked whether they disagreed, strongly disagreed, agreed, or strongly agreed, teachers overwhelmingly said they either agreed or strongly agreed with the statement "I came away with something specific today that I will be able to use in my teaching this year." Table 10 below shows how teachers rated each day of SILT for which we have survey data.




Table 10. 


Overall SILT Effectiveness of Relevance 

Day 

Year 

Question 

Rating (Agree 
and Strongly 
Agree) 

Number of 
respondents 

1 

2011 

“I came away with something 
specific today that I will be able to 
use in my teaching this year.” 

79% 

57 

2 

2011 

“I came away with something 
specific today that I will be able to 
use in my teaching this year.” 

96% 

56 

3 

2011 

“I came away with something 
specific today that I will be able to 
use in my teaching this year.” 

100% 

51 

4 

2011 

“I came away with something 
specific today that I will be able to 
use in my teaching this year.” 

98% 

49 

5 

2011 

“I came away with something 
specific today that I will be able to 
use in my teaching this year.” 

100% 

51 

1 

2012 

“I came away with something specific today that I will be able to use in my teaching this year.”

88% 

50 

2 

2012 

“I came away with something 
specific today that I will be able to 
use in my teaching this year.” 

96% 

66 

3 

2012 

“I came away with something 
specific today that I will be able to 
use in my teaching this year.” 

93% 

54 

4 

2012 

“I came away with something 
specific today that I will be able to 
use in my teaching this year.” 

97% 

50 

5 

2012 

“I came away with something specific today that I will be able to use in my teaching this year.”

90%

45



1 

2013 

N.A 

N.A 

44 

3 

2013 

*Teachers rated the effectiveness of 
each component of SILT. 

June day = 68% 
Summer work = 
81% 

3 days in August 
= 96% 

53 

2014 

May 
23rd 
(Day 1) 

“The learning was relevant for me.” 

91% 

50 


Data suggest teachers overwhelmingly perceived SILT to be an effective use of their time and relevant to 
their daily practice. 
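A minimal sketch of how the agreement percentages in Table 10 could be computed from raw Likert responses is shown below; the response counts are hypothetical.

    # Illustrative only: percent of respondents choosing Agree or Strongly Agree.
    def percent_agree(responses):
        agree = sum(r in ("Agree", "Strongly Agree") for r in responses)
        return 100 * agree / len(responses)

    # Hypothetical responses to "I came away with something specific today ...":
    responses = ["Agree"] * 30 + ["Strongly Agree"] * 15 + ["Disagree"] * 6
    print(f"{percent_agree(responses):.0f}%")  # 88%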

Question 3: To what extent did teachers learn from their professional development experiences in 
SILT? 

Teachers’ open-ended survey responses demonstrate teachers learned from their professional 
development experience in SILT. Components like built-in time to collaborate with various colleagues, the 
time to reflect on the day’s learning, and the use of various kinds of expertise (students, teachers, and 
community members) surfaced repeatedly when teachers discussed what made the SILT experience 
effective, relevant, and meaningful. Teachers responded to the question, “What were the most important 
things you learned today” in the following ways: 

• “Coming to a better shared understanding of the definition of PBL; why we should do it, how we 
should do it, and what we can do.” 

• “Thinking about the many ways (specifically authentic and relevant ways) to assess student learning, 
especially related to skills. Thinking about different ways of grading student learning, and what is 
really important, what we value.” 

• “Time to talk with both people in my discipline and from outside my discipline about how to use 
groups effectively in my classroom.” 

• “I had very fruitful conversations with colleagues - conversations that helped us process some 
ways we might collaborate in the future.” 

• “I feel like teachers’ knowledge was utilized in the process, so it didn’t feel top-down. I learned a 
lot from talking to other colleagues, learning about their instructional strategies, and further 
discussing PBL implementation with them was also beneficial. Also, I enjoyed attending teachers’ 
presentations, so I could learn from my colleagues. It felt highly collaborative, which was an 
excellent model for what we are asking our students to do.” 

• “The cultural awareness panel discussion is perhaps the most effective teacher training day (outside 
of time to work in my room) I have ever spent. It was honest, practical, and powerful.” 

• “I really enjoyed K’s presentation - it reminded me to never underestimate students or what they 
are capable of achieving.” 



Teacher responses evidence both general and specific things they took from their experience in SILT and applied to their teaching practice. One teacher stated that they “have new ideas about the needs of my minority
students. I also have new ideas of how to connect to them.” Another stated that they “Learned of the 
many possibilities for incorporating PBL into all of my classes.” While these and other responses like them 
represent somewhat ambiguous or vague ideas for how teachers’ SILT learning could translate into 
different teaching strategies, others were more specific. One teacher stated they, 

Began to think about how expertise does not necessarily mean content 
knowledge, but can include identifying gaps in understanding and activating 
the resources to gain that knowledge. This more inclusive definition of an 
expert needs to be communicated to kids to really push the metacognitive 
processes of gaining expertise. 

Another teacher talked specifically about how they wanted to leverage students’ backgrounds as content 
for their classes saying, “Students can be used as cultural and social experts and could be a valuable part of 
the curriculum. Connecting their expertise with the content could make student engagement go up.” Both 
the more general and specific ways teachers talked about their learning from SILT evidence the experience 
was beneficial to their learning as teachers. 

PBL Curriculum Design Teams 

PBL Design Teams consisted of teachers who volunteered or agreed to participate in a year-long 
curriculum redesign process with other teachers within their department who had experience teaching the 
course to be redesigned. Design teams met during an extra, common, daily planning period. Meaning, if the Biology design team met during 4th period, teachers on that team would meet every fourth period
throughout the year. Design teams consisted of between 2 and 8 teachers, although most design teams 
consisted of 3-4 teachers. 

Design teams largely worked autonomously. Teachers used the Key Elements to guide their PBL 
design work but also worked within the constraints of the established district curriculum, the Common 
Core State Standards (CCSS), the Next Generation Science Standards (NGSS), and various AP course 
frameworks and expectations. Depending on the course and discipline, some design teams experienced 
more rigid constraints than others. For example, the Advanced Placement Human Geography team 
enjoyed the benefits of designing a brand new course that had not been taught at the school before. Their 
primary constraints included negotiating the pedagogical principles set forth in the Key Elements with the 
established AP framework and expectations for the course, all the while planning curriculum that every 
incoming freshman student would experience. Alternatively, the Geometry team did not have to contend with an AP framework, but they did have to contend with the district common math curriculum, the Math CCSS, and with preparing students for a state-mandated end of course (EOC) exam. Each team's work
was highly complex and complicated but in ways that were specific to the course they were designing and 
the content area in which they taught. 

Out of this process came profound teacher learning. While several teams struggled to navigate the 
web of competing standards, assessments, and expectations, virtually every design team teacher described 
the process as one of the most fulfilling professional development experiences of their career. Like the PBL
curriculum they worked to design for their students, teachers engaged in a daily process of creative and 
collaborative problem solving that deepened their content and pedagogical content knowledge and 



expertise. For many teachers, this process turned them into dedicated collaborators who learned to seek
out the expertise of their colleagues to further refine and adjust lessons and units and to solve problems of 
practice. 

School leaders assembled teacher PBL Design Teams to redesign established curriculum into 
problem based curriculum and to further build capacity and expertise within the teaching staff. Design 
teams were diverse according to teacher gender, years of experience, and expertise. Initially, the school 
chose teachers to serve on design teams if they had previous experience teaching the course to be 
redesigned. In years two and beyond, the school provided teachers with an opportunity to assemble their 
own teams and propose specific courses they wanted to redesign. What eventually transpired was a policy 
that balanced both the school’s need to redesign certain courses and teachers’ desire to prioritize 
curriculum redesign in some courses over others. 

The design teams the school established more resemble teacher learning communities, popularized 
by Wenger (1998), than professional learning communities, popularized by DuFour (2004, 2005). Unlike 
professional learning communities (PLCs), design teams are not guided or governed by facilitators, district
or school leaders, or consultants, nor are teachers expected to follow specific protocols or processes to 
complete their work. Instead, in design teams, teachers established the routines and norms that govern 
their collaborative work and the eventual product(s) that emerged from that process. Design teams 
represent a key fidelity component to the school’s project implementation. 

Design teams were assembled in most departments in the school, including Fine Arts, Physical 
Education, Social Studies, English, Math, Science, Foreign Language, the Performing Arts, and Career and 
Technical Education. Courses specific to Special Education and English Language Learners (ELL) were 
not redesigned. 

For the most part, design teams operated autonomously throughout the school. It was highly 
uncommon for teacher leaders or school and district administrators to participate in design teams in which 
they were not already full members. Meaning, it was rare for district administrators, building administrators, 
or teacher leaders to drop in on design teams to observe the work they were doing or to offer support for 
how they structured their curriculum redesign work. Because teacher leaders from the Social Studies and
Science departments commonly served on design teams as full members, it was uncommon for teacher 
leaders to check in on those design teams. An example of this was the Junior Level English Design Team 
who worked during the 2012-2013 school year. Field notes from observations of this design team evidence 
that out of 25 observations conducted throughout the year, a teacher leader or administrator visited the 
team twice to check in and provide them with support. This low frequency of visits from teacher leaders or school leaders was common among all the design teams studied.

Design teams served two important purposes in regards to the work of the grant. First, design 
teams were responsible for designing PBL curriculum. The school did not articulate a specific amount of 
curriculum PBL Design Teams were expected to redesign, only that they had to redesign some amount of 
curricula they could implement the year after the design year. In the first year of curriculum redesign, the 
school intended for design teams to redesign full courses during their release year. Although this remained a tacit 
goal for all future design teams, after the first year of design the school shifted focus to supporting the 
quality of PBL emerging from design teams throughout the school. Second, the school hoped the design 
team experience would further build capacity and expertise within the teaching staff. Meaning, the school 
hoped design teams would evolve into rich contexts for teacher learning as teachers planned curriculum 
and shared problems of practice that emerged from the process of piloting redesigned curriculum. As 
teachers continued to engage in the curriculum design process, they would improve their practice and 
deepen their understanding of PBL pedagogy. 

The data show that some teams chose to establish a design/pilot structure to their curriculum 
redesign process and some did not. Among the teams studied, the AP Human Geography and the Junior 
English teams exhibited the most pronounced design/pilot structure in their redesign process. The data also 
show that out of the 6 design teams we studied, the AP Human Geography and Junior English teams 
redesigned the most curricula by the end of their design year. 

In late fall of 2011, the school planned and facilitated a peer curriculum review process whereby 
teams would share some of the work they had completed up until that point. The purpose of this peer 
review process was to inject the curriculum redesign process with peer accountability. The school 
requested design teams create a presentation illustrating a partial or full unit of instruction they had 
planned and provide their colleagues with a document in which they would provide rationale and 
background for why they made specific curricular decisions. After design teams had time to read and 
digest the materials they received from their peer reviewers, they came together in a two-hour meeting to 
provide feedback to each other. Teacher leaders and school leaders facilitated these discussions. 

The school implemented one round of peer review in Winter of 2012. A Sammamish teacher 
leader and a university researcher co-designed the peer review process. The purpose was to provide teams 
the time and space to reflect on their design work and gain feedback on the curriculum they had designed 
up until that point. Teams were paired with teams in other departments. Paired teams were to prepare a 
PowerPoint presentation to provide important background to the redesigned unit, a redesigned unit with 
all the relevant course materials, and a rationale for their decision making in regards to the unit. Once 
teams were assembled, it was clear that teams were at various stages of planning, resulting in some teams 
sharing very little and others sharing too much. In addition, some departments wanted more specific 
feedback on the specific content within each unit and felt as though the inter-disciplinary design of the 
review process provided them with too little actionable feedback in regards to content. After receiving 
feedback from participating teams, the peer review process was abandoned. 

An additional notable feature of some design teams’ processes was the occasional presence of 
external experts and/or representatives from partner organizations in the meetings. In some cases, 
university researchers partnered with design teams to conduct observations on their design process 
throughout the year or university professors would make weekly visits with design teams to help them 
think through specific problems they had encountered in the planning process. In some cases, design 
teams partnered with educational organizations to pilot externally designed PBL curriculum. In still other 
cases, design teams would schedule visits with representatives from industry to either receive feedback on 
the authenticity of specific units or in an attempt to engage them in more focused planning of unit 
assessments. However, not every design team engaged partners or representatives from industry, and the 
teams that did engaged those persons with varying frequency and intensity throughout the year. 


Research Questions 

In our analysis of how design teams functioned within the school and the extent to which they were 
effective in redesigning curriculum, we have asked the following questions of the data. 

1. What did design teams accomplish? Did their design efforts result in curriculum they could 
implement? 

2. How much curriculum did each team redesign? To what extent was the curriculum they redesigned 
PBL curriculum? 

3. How did design teams structure their collaborative work? What similarities, if any, existed between 
teams? 

4. How did participation in a design team affect the capacity of teachers to continue the work? 

Findings Summary 

1. The amount of curriculum each design team designed varied by team. 

2. The extent to which the curriculum PBL Design Teams redesigned was/is problem based learning varies 
from team to team. 

3. Teams that established a design/pilot structure to their curriculum redesign process designed more 
curriculum than those teams that did not establish a design/pilot structure. 

4. Departmental culture influenced how design teams structured their collaborative work. 

5. Teachers reported learning more about teaching from their experiences working on a design team. 

Methodology 

We collected data on design teams during the 2011-2012, 2012-2013, and 2013-2014 school years. 
In the following table we describe the design teams we studied, when we studied each team, and the data 
we collected on each team. Our findings on teacher design teams are partially gleaned from a concurrent 
study conducted at Sammamish High School focused on the implementation of PBL throughout the 
school. The researcher who conducted this study is also a research associate for Knuth Research. This 
study followed a total of 6 design teams over the course of 3 years. Table 11 below describes what data 
was collected on which design teams and when that data was collected. 


Table 11. 

Teacher Design Team Data Summary 

AP Human Geography (2011-2012) 
  Data collected: video recorded design team meetings; semi-structured teacher interviews; document review 
  Frequency of data collection: weekly design team meeting observations (September-March); interviews conducted in Fall and Winter; documents collected throughout the year 
  Curriculum successfully redesigned by end of design year: a full year’s worth of curriculum 

Freshman English (2011-2012) 
  Data collected: video recorded design team meetings; semi-structured teacher interviews; document review 
  Frequency of data collection: weekly design team meeting observations (September-March); interviews conducted in Fall and Winter; documents collected throughout the year 
  Curriculum successfully redesigned by end of design year: 3 complete units 

Sophomore English (2011-2012) 
  Data collected: video recorded design team meetings; semi-structured teacher interviews; document review 
  Frequency of data collection: weekly design team meeting observations (September-March); interviews conducted in Fall and Winter; documents collected throughout the year 
  Curriculum successfully redesigned by end of design year: no completed units 

Geometry (2012-2013) 
  Data collected: video recorded design team meetings; semi-structured and stimulated recall teacher interviews; document review 
  Frequency of data collection: daily design team meeting observations (1 week in September; teachers planning a PBL unit in winter/late fall; 1 week in June); interviews conducted during each group of design team meetings (Fall, Winter, Spring); documents collected throughout the year 
  Curriculum successfully redesigned by end of design year: mini-challenge cycles within larger units; 1 complete unit 

Junior English (2012-2013) 
  Data collected: video recorded design team meetings; semi-structured and stimulated recall teacher interviews; document review 
  Frequency of data collection: daily design team meeting observations (1 week in September; teachers planning a PBL unit in winter/late fall; 1 week in June); interviews conducted during each group of design team meetings (Fall, Winter, Spring); documents collected throughout the year 
  Curriculum successfully redesigned by end of design year: 4 complete units 

Senior English* (2013-2014) 
  Data collected: video recorded design team meetings 
  Frequency of data collection: daily design team meeting observations in Fall; interviews conducted in Fall 
  Curriculum successfully redesigned by end of design year: N.A. 

American Studies* (combo of AP U.S. History and AP Language) (2013-2014) 
  Data collected: video recorded design team meetings 
  Frequency of data collection: daily design team meeting observations (1 week in Fall; teachers planning a PBL unit in Winter) 
  Curriculum successfully redesigned by end of design year: N.A. 

* Data from the AP American Studies and Senior English teams is incomplete and thus not used in this report. For various 
reasons, these teams chose to use their design year solely to redesign and not implement curriculum. These teams were studied 
by researchers from the University of Washington and were dropped from their research because they chose not to pilot newly 
designed curriculum during their design year. 


In sum, our teacher design team data corpus includes 127 video recorded design team meeting 
observations, 34 interviews, and extensive teaching artifacts collected during the data collection 
process. Interviews were transcribed by the researcher and member checked with each teacher to validate 
the transcription and the content of the interview. Data collection was not completed for either the Senior 
English or American Studies teams in the 2013-2014 school year because these teams did not implement 
and/or pilot curriculum concurrently with curriculum redesign. 

Findings 

In the following section, we present our findings in two ways. First, we present data to evidence 
the extent to which PBL Design Sessions rate highly according to our fidelity measures. These measures 
include an analysis of 1) teacher exposure to design teams and design sessions, 2) design session quality as 
defined by the extent of their support from external design mentors, and 3) design session quality as 
defined by the extent of their support from external content mentors. Second, we use our data to more 
fully answer the research questions regarding PBL Design Sessions stated above. 

Fidelity measures 

Our data show PBL Design Sessions were implemented with a moderate degree of fidelity. 

PBL Exposure 

Each PBL Design Team had the opportunity to meet approximately 140 times throughout the 
school year. Each design team was provided a common, daily release to redesign curriculum. Each design 
team had the opportunity to meet at least 4 times during each full week of school. Three of the days they 
would have met during the week were during 50-minute periods. One of the days they would have met 
would have been during a 90-minute block period. Because we did not conduct daily observations of each 
design team throughout each year, it is entirely possible that the extent to which each design team used 
their design period to meet as a team varied according to the day’s schedule, whether or not one of the 
members was out sick, or whether or not the team decided they did not have to meet on any particular day. 
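
As a rough consistency check on these exposure figures (assuming an instructional year of roughly 36 full 
weeks, a figure supplied here for illustration rather than one stated in the grant documentation), the 
numbers are internally consistent: 

\[ 36~\text{weeks} \times 4~\text{meetings per week} = 144 \approx 140~\text{meetings per design year} \]
\[ (3 \times 50~\text{min}) + (1 \times 90~\text{min}) = 240~\text{minutes of common design time per full week} \]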

Question 1: What did they accomplish? 

The amount of PBL curriculum each design team designed varied by team. The amount of 
curriculum each team planned and the specific constraints and affordances each team faced when 
implementing the curriculum led to uneven levels of curriculum implementation the following year. For 
example, the AP Human Geography team designed a vast majority of the PBL units they were going to teach in 
their implementation year during their design year. Conversely, the Algebra 2 team redesigned 
approximately half of the units they were going to implement the following year but the district 
dramatically shifted the focus of the curriculum in the summer between design and implementation years, 
leaving them with little to implement. 

In addition, some teachers described feeling constrained in their design process by the standardized, 
high stakes tests their students would take at the end of the year. For courses such as Geometry, 
BioChem, and AP Chemistry, teachers struggled to plan deeply engaging PBL units while teaching 
students enough content to ensure they were prepared to pass the associated end of course (EOC) exam, 
the Advanced Placement (AP) exam, or various district assessments. A teacher on the Geometry design 
team described how 

It's, it's limiting in creativity in my view. Like, be creative with PBL, take 
these kids where they haven't been before, make them think in ways they 
haven't before, but you have to do this [district curriculum/assessments]. 

And you have to do it in this time and you have to give this test by this date 
and, you know, it's just like, we have to give district assessments. You have to 
make sure you cover these, this, this topic and you have to make sure you 
have to hit these things but, do whatever you want. Do whatever you can, 
you know, have fun with it! Be creative! Go places! And you know, it's just 
putting a cap on how creative you can be. 

This tension between teaching the content demanded by various layers of testing and external assessments 
and the creativity demanded of teachers to design high level PBL units was difficult for teachers of these 
kinds of courses to balance. 

Question 2: How much curriculum did each team redesign? To what extent was the redesigned 
curriculum PBL? 

The quality of PBL curriculum each design team planned varies widely from team to team. Over the 
course of 4 years, approximately 35 design teams have redesigned 35 courses. Each team was provided one 
year to redesign existing curriculum. After the design year, teams continue to implement multiple versions of 
lessons and units as they get a better sense of how curriculum can be improved according to PBL pedagogy. 
Each team approached the design process differently and each team redesigned different amounts of the 
existing curriculum. 

Establishing the quantity and quality of PBL curriculum each team designed is problematic for 
several reasons. First, the standard of PBL the school has been aiming at has shifted since the beginning of 
the project. In part, this is because the Key Elements document itself has dramatically shifted over time. 
The first version of the Key Elements document Year 1 and 2 design teams used was a mere two pages and 
articulated each key element broadly. Sammamish High School teachers and a university researcher rewrote 
the Key Elements document starting in Year 2 of the project. They were responding to feedback from 
teachers who said the key elements were too general to be of any use as they designed curriculum. As 
teachers gained more expertise through the design and implementation process, the Key Elements reflected 
the school community’s burgeoning expertise in PBL pedagogy. 

Not only did each teacher on each team approach the redesign process with varied levels of 
expertise of PBL pedagogy, each team dealt with slightly different constraints as they worked to redesign 
curriculum. These constraints included: 1) the extent to which the content of each course lent itself to 
PBL pedagogy, 2) the extent to which each course was accountable to a specific end of course high stakes 
test, and 3) the extent to which each course was beholden to existing district curriculum. For each course 
and design team, these constraints manifested themselves differently, forcing each design team to adjust the 
extent to which they felt lessons and units could be redesigned into PBL curriculum. 

Question 3: Did all design teams operate in the same manner? If not, why? 

Design teams did not operate in the same manner. Design teams approached the curriculum 
redesign task differently for several reasons. First, in many cases design teams adopted the norms and 
culture that existed within their departments. In some cases, design teams evidenced strong and equitable patterns 
of collaboration while other teams evidenced hierarchical and at times confrontational patterns of 
collaboration. Second, some design teams leveraged cycles of design and implementation/pilot during their 
design team year while other teams chose to use the design year to design only, and not implement and/or 
pilot newly designed curriculum. Third, external constraints having to do with standardized testing such as 
end of course exams and AP exams factored into how much curriculum teams could redesign and what kind 
of PBL teams designed. Fourth, team chemistry factored into how teams structured their collaborative 
curriculum redesign. 

Design team collaborative routines. The data suggest variability in how teams structured their 
collaborative practice. For example, the Junior and Freshman English teams generally approached 
curriculum redesign with enthusiasm and excitement. However, both teams struggled from time to time 
with recurring debates having to do with the role of the novel or literary analysis essays in PBL English 
classes. Within these teams, how the department as a whole generally thought about these issues served as 
a touchstone or starting point for discussions. 

The data from two design teams show a positive relationship between routinely sharing teaching 
artifacts from cycles of implementation and producing large amounts of PBL curriculum. The AP Human 
Geography team’s design team data evidenced a strong routine of sharing 
teaching artifacts that reflected recently implemented or soon to be implemented curriculum. At the end of 
their design year, the AP Human Geography team redesigned nearly a whole year of PBL curriculum. 

Table 12 illustrates all the design team meetings we observed and the extent to which teachers shared 
teaching artifacts in those meetings. We observed the AP Human Geography team 11 times from mid- 
September to mid-March during the 2011-2012 school year. In the 11 observations we conducted, the AP 
Human Geography team shared a teaching artifact 12 times. 



Table 12. 

Instances of Sharing in AP Human Geography Design Team Meetings 

Eleven design team meetings were observed: 9/15, 9/22, 10/6, 10/27, 11/3, 11/17, 1/5, 1/12, 1/26, 2/9, and 3/8. 
Instances of sharing by topic across these meetings: 

  PowerPoint lesson plan for the day: 4 instances 
  Student survey: 1 instance 
  Developing a student classroom task or activity: 3 instances 
  Developing a student assessment and/or rubric for the assessment: 3 instances 
  Teachers discuss various subject matter content: 1 instance 

Each instance counted in Table 12 is one in which teachers shared a problem of practice and that sharing led to a 
focused discussion amongst teachers. 
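
To illustrate how tallies like those reported in Tables 12 through 14 can be produced from coded observation 
records, the sketch below shows one minimal way of counting instances of sharing per artifact type. This is an 
illustrative sketch only, not the evaluators' actual analysis tooling; the meeting dates and artifact labels in it 
are hypothetical placeholders. 

from collections import Counter

# Each record pairs an observed meeting with an artifact whose sharing led to a
# focused discussion (the unit counted as one instance in Tables 12-14).
# Hypothetical placeholder data, for illustration only.
coded_observations = [
    ("9/15", "PowerPoint lesson plan for the day"),
    ("9/15", "Developing a student assessment and/or rubric"),
    ("9/22", "Student survey"),
    ("10/6", "PowerPoint lesson plan for the day"),
]

# Tally instances per artifact type and the overall total.
instances_by_artifact = Counter(artifact for _, artifact in coded_observations)
total_instances = sum(instances_by_artifact.values())

for artifact, count in instances_by_artifact.most_common():
    print(f"{artifact}: {count}")
print(f"Total instances of sharing: {total_instances}")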

Throughout their design year, this team simultaneously planned and implemented newly designed 
PBL curriculum. In the design team meetings we observed, teachers routinely used shared artifacts to 
focus and ground their discussions around what was working and not working and why. By the end of the 
year, the AP Human Geography team had redesigned nearly a full year of curriculum, most of which had 
been piloted and revised. The team started their first year of implementation with curriculum that had 
undergone one cycle of design, implementation, and revision. 

The Junior level English team evidenced similar levels of sharing throughout the year in a similar 
number of design team meetings. We observed the Junior English team 25 times throughout the 2012- 
2013 school year. Table 13 below illustrates a subset of those observations when we observed the team 
redesigning the Satire Unit in November and December of 2012. Data from the Junior English team 
evidences that they routinely shared teaching artifacts in their design team collaboration. In this sample of 
10 design team meetings, this team shared 9 different kinds of teaching artifacts, 18 times. 



Table 13. 

Instances of Sharing in the Junior English Design Team’s Satire Unit 

Ten design team meetings (numbered 1 through 10) were observed during the redesign of the Satire Unit. 
Instances of sharing by topic across these meetings: 

  Student surveys (design, data analysis, feedback): 6 instances 
  Scaffolding note-taking strategies: 1 instance 
  Class manifesto: 1 instance 
  Teacher developed model essay: 1 instance 
  Student peer evaluation/self evaluation handout: 1 instance 
  English content handout (various): 2 instances 
  Unit calendar: 1 instance 
  Various student work/artifacts: 2 instances 
  Lesson plans: 3 instances 

Each instance counted in Table 13 is one in which teachers shared a problem of practice and that sharing led to a 
focused discussion amongst teachers. 


As with the AP Human Geography team, the Junior English team routinely shared teaching 
artifacts in their design team meetings. The artifacts they shared typically represented newly designed 
teaching materials from PBL units. As with the AP Human Geography team, by the end of their design 
year the Junior English team had redesigned nearly a full year of curriculum, most of which had been piloted 
and revised. The team started their first year of implementation with curriculum that had undergone one 
cycle of design, implementation, and revision. 

In contrast, Table 14 below evidences markedly less sharing in the Geometry design team. We 
tracked the extent to which teachers shared teaching artifacts, in a group of 9 design team meetings, as 
they planned a PBL unit in January 2013. 



Table 14. 

Instances of Sharing in Geometry Design Team Meetings 

Nine design team meetings were observed: 1/14, 1/15, 1/16, 1/18, 1/23, 1/25, 1/29, 1/30, and 2/8. 
Instances of sharing by topic across these meetings: 

  Looking at student grades for each Math 2 course: 1 instance 
  Rubric for student project: 1 instance 

Each instance counted in Table 14 is one in which teachers shared a problem of practice and that sharing led to a 
focused discussion amongst teachers. 


The Geometry design team shared far fewer teaching artifacts than both the AP Human 
Geography and Junior English design teams. They also had produced less curriculum by the end of their 
design year. By the end of the school year, the Geometry design team redesigned “about a third” of a 
year’s curriculum into PBL curriculum. 

Our data suggest, as does the research literature (Cochran-Smith and Lytle, 1999; McLaughlin and 
Talbert, 2006; Horn and Little, 2009), that sharing artifacts and problems of practice can impact the extent 
to which teachers learn from collaboration. When this specific routine was present in a design team’s 
collaboration, other productive interpersonal features also seemed to be present. For example, sharing 
between all teachers within a design team, especially in those teams where novice teachers were members, 
implies that teachers also established clear norms to guide their collaboration and that they developed a 
shared commitment to the work of PBL redesign. Our data suggest that both features are indicative of 
productive teams. For example, both the Geometry team and the Sophomore English team eschewed 
norm setting as a legitimate component of collaborative work. In both cases, some teachers were hesitant 
to explicitly embrace PBL as a legitimate curriculum redesign goal. Conversely, in both the AP Human 
Geography and Junior English teams, part of the norm setting process was to unearth philosophical 
differences between teachers regarding PBL pedagogy. In some teams, when teachers set clear norms and 
sustained a commitment to PBL throughout their redesign process, they established predictable routines 
within the team to allow for disagreement and dissent. In the teams where this did not happen, 
philosophical tensions and disagreements between teachers made the process of PBL redesign more 
complex, complicated, and potentially contentious as the year progressed. 

Question 4: How did participation in a design team affect the capacity of teachers to continue the work? 

When interviewed, teachers largely found the design team experience valuable regardless of their 
overall design team experience or the department in which their design team was situated. An experienced 
English teacher, who was resistant to PBL, commented that he was “trying lots of new things [throughout] 
the year.” Design team learning was especially powerful for the novice teachers who worked on a team. A 
first year Math teacher spoke about how 

As a first year teacher, it was a lot harder for me to see the big picture. There 
are little things I think I am picking up from them [the design team] on how I 
approach the little things and the big things... My belief system has evolved. 
I’ve learned from their mistakes too. 

An experienced Social Studies teacher commented that “compared to everything I have ever done in terms 
of, you know, professional development, working with other teachers in a design team has been the best 
professional development experience I have ever had.” When asked to describe further what it was about 
the design team experience that was so powerful for their learning, many teachers commented that the 
daily, scheduled time to collaborate with colleagues was pivotal for their learning. 

Beyond the learning teachers experienced through the PBL Design Team experience, many 
teachers spoke about the respect they felt as a result of membership on a design team. An experienced 
English teacher described working on a design team as “the closest [I’ve] gotten to [feeling] like a 
professional and to feel[ing] like the school trusts and puts power in the teacher.” This same English 
teacher talked about the expectation of creative thinking and problem solving that came with working on a 
design team and how “there hasn't been an opportunity for me to be this creative in a really long time.” To 
many teachers, having predictable, extended time to work with other teachers about curriculum and 
instruction was refreshing. An experienced teacher in the English department stated that 
Never before have I felt like I was encouraged to actually take a meaningful 
amount of time to talk about what I do in the classroom... The notion that we 
are actually, somebody is giving us the time and, and, taking note that that is an 
actual, in fact a vital part of instruction, of good instruction at the very least, 
um, is the ability to have time to do your planning and talk about 
practice... That feels good. 

The data is rife with testimonials from teachers that evidence a newfound respect they felt as design team 
teachers and a belief that they had learned more about teaching and learning from their collaboration with 
design team colleagues. 

Additionally, there is some evidence that some teachers have transferred the knowledge gained 
from redesigning curriculum into planning and informally redesigning other courses they teach. A teacher 
in the English department who participated on two design teams used strategies learned from those 
experiences to informally and incrementally redesign an Advanced Placement Language class he taught. 
One of the literacy specialists at the school used the knowledge she gained from planning the Sammamish 
Leads project to redesign her AVID classes. Examples like these suggest that teachers use their knowledge 
of the curriculum redesign process and of PBL to inform how they plan and teach other courses that have 
not been formally redesigned according to PBL pedagogy. 

Conversely, teams that did not establish collaborative routines, such as examining problems of 
practice, also seemed to struggle to build the kind of capacity within the team that would help them sustain 
the iterative curriculum redesign process as teachers continued to implement the curriculum. In the teams 
where this dynamic was present, there also existed a strong ethos of teaching privatism (Little, 1990; 
Hargreaves and Fullan, 2012). 

In meetings conducted with departments in the Spring of 2014, some teachers described a 
relationship between curriculum redesign release time and the status they perceived the school conferred 
upon design team teachers as a result of their design team experience. Meaning, some teachers equated 
design team release with a kind of currency or commodity that only some teachers were offered. Teachers from 
the English department expressed resentment towards the amount of release time teacher leaders were 
provided for managing the grant work throughout the school. These teachers questioned the integrity of 
some teacher leaders whom they saw as having less PBL teaching experience than they had. Teachers from 
the CTE department expressed resentment over what they perceived as the paltry amount of release time 
provided to their department for curriculum redesign. This perception among some teachers was unexpected 
and surprising. 

The influence of departmental culture. In most cases, design teams structured their 
collaboration in ways similar to how collaboration functions in their departments. A teacher on the Geometry 
team described how the team “didn't really do that norming process thing at all... we put that off. It was 
just kind of natural, I think teacher A and I, having worked together the year before, it was sort of our 
routine already. So it just trickles down from the math culture that we have here.” A teacher on the AP 
Human Geography team described how the culture of the group was an extension of the Social Studies 
department. She described how the department “collaborates a lot” and how “at lunch we [they] talk about 
how we [they] can work together better as a department.” A teacher on the Sophomore English team 
described how the department tends to adjust whatever school-wide improvements the school implements 
to what they are already doing in their classes. When asked about how he and the department think about 
PBL redesign, he stated, “We're [the department] gonna put a project on the end of it [established 
curriculum]. People go, ‘Oh, that's PBL right? Project-based learning?’ Yeah. That's what it is. Um, and in 
the process we'll try and sneak in some of the ‘actual’ education, some of the ‘actual’ problem solving that 
we've always done.” The opinions of these specific teachers were not necessarily shared by every 
teacher within each department. However, our data suggest they represent the overall opinions, beliefs, 
and dispositions present within each department. 

In each case, design teams transplanted the departmental culture into their design team structure 
but with different results. In the case of the AP Human Geography team, they transplanted a strong 
collaborative culture in which teachers would openly share their personal instructional practice and debate 
the best ways to address issues they encountered when implementing the curriculum. Interviews and 
informal conversations with various Social Studies teachers evidenced a group strongly supportive and 
enthusiastic about the potential of PBL to transform how they taught Social Studies content to students. 

In the case of the Sophomore English team and to a lesser extent the Geometry team, both teams 
transplanted a strong entrepreneurial bias and traditional culture in which teachers were expected to teach 
their classes in ways that best fit their teaching styles. Meaning, teachers in both teams took a largely 
content-centric perspective of teaching Math and English coursework that was biased toward how they 
were taught those classes as students. Teachers who held this perspective tended to resist collaborative 
problem solving and planning and preferred to teach in more isolated, private classrooms (Lortie, 1974). 
Reflecting the dominant attitude within the English department and to a lesser extent within the Math 
department, many of the teachers on these teams remained suspicious of how PBL redesign might change 
or shift the traditional content foci within each discipline. 



Chapter 5: School Leadership Structure 

This section examines the school’s leadership structure, including the teacher leaders, and how 
these leaders supported teachers in their transition to problem-based learning. This project pushed the 
traditional boundaries of what schools expect from principals, teacher leaders, and teachers. Much of the 
time teachers and school leaders were working with a vision but without a clear blueprint to get them from 
point A to point B. From time to time this tension caused inevitable conflict and second-guessing by 
everyone involved in the project. Beyond providing a description of how the principal and school leaders 
redesigned the school’s leadership structure, we use qualitative data to provide a candid view of what 
worked and did not work from the perspective of teachers, teacher leaders, and school leaders. 

Teachers face various obstacles when designing and implementing PBL curriculum. The most 
complicating factors include: 

• Lack of experience designing and implementing PBL curriculum (Albion & Gibson, 2000; Ward & 
Lee, 2002), 

• The time it takes to implement high quality PBL curriculum and, conversely, the time PBL 
curriculum takes away from teachers’ ability to directly teach to various high stakes tests (Ward & 
Lee, 2002; Brinkerhoff & Glazewski, 2004; Park, Ertmer, & Cramer, 2004; Simons, Klein, & Brush, 
2004), 

• PBL demands a different approach to assessment that can move teachers away from more traditional 
forms of assessment like multiple choice exams and essays (Benedict, Thomas, Kimerling, & Leko, 
2013), 

• Lack of strong models of high quality PBL curriculum in multiple content areas (Ertmer, Lehman, 
Park, Cramer, & Grove, 2003), and 

• Teaching and learning in a PBL context also demands that teachers and students embrace different 
roles (Gallagher, 1997; Brush & Saye, 2000; Land, 2000; Ertmer et al., 2003; Ertmer & Simons, 
2006; Grant & Hill, 2006). 

In addition, the roles teacher leaders take up can be challenging as they support teachers who work to 
design and implement PBL curriculum. The extent to which Sammamish High School teacher leaders addressed 
these and other pressures in their work to support teachers is the focus of this section. 

Purpose of Leadership Team 

The purpose of the leadership team was to support the design and implementation of PBL across 
the school in an effort to increase college and career readiness. Rather than bring in consultants or an 
externally created PBL curriculum, the principal believed that developing teacher expertise and teacher 
capacity was a “big part” of the grant. 

Methodology 

Data collection and analysis for this section of the evaluation was conducted between 2014 and 2015. 
By 2014, all of the members of the evaluation team had started to form opinions on what was going on in 
the school. The evaluation team considered it both wise and necessary to ask someone not familiar with the 
Sammamish High School i3 Grant work to conduct research on the leadership structure established at the 
school. Two to three times a year, the researcher would travel to the Pacific Northwest to collect 
qualitative data. In between visits the researcher would transcribe and analyze audio recorded interviews 
and focus groups. While the larger research team had some knowledge of the protocols the researcher was 
using and the data he was collecting, much of the analysis was performed between visits. 

Data on the School Leadership Structure was collected during the 2014 and 2015 school years. In 
order to learn about the principal’s role in the grant and implementation process, he was interviewed five 
times over the course of the 2014-2015 school year. Each interview lasted between one and three hours. 
All interviews were audio recorded and transcribed for analysis, and all but one took place in a face-to-face 
setting. Findings were developed iteratively, over multiple rounds of analysis of the data, through a 
grounded theory approach (Erickson, 1986; Strauss & Corbin, 1998). As more themes surfaced from the 
process of data analysis, we adjusted our questions and interpretations. 

Each teacher leader was interviewed at least twice at various points throughout the 2014 and 2015 
school years, with the exception of one member of the leadership team who left the school at the end of 
the 2014 school year. Teachers were strategically identified to gather the full range of opinions and 
perspectives regarding the PBL work of the grant. Teachers were identified over several meetings with the 
research team for this evaluation. As with interviews conducted with the principal, all teacher and teacher 
leader interviews were audio recorded. All department focus groups were also audio recorded. All 
interview data was triangulated with school and grant documentation, department focus groups, principal 
interviews, and other interviews with teachers and teacher leaders. 

Leadership Team Membership 

The core school leadership team for the grant, originally called the Implementation Team for part 
of the first year, consisted of the following people: 

1. Principal (also served as the Project Director with no additional pay) 

2. Grant manager (external hire, non-teacher) 

3. i3 project leader 

4. i3 project leader 

5. i3 project leader 

6. Teacher leader/Instructional Curriculum Technology Leader (ICTL) 

7. Teacher leader/Literacy Coach 

8. Teacher leader/ELL Coach 

The full Implementation Team was formed in February/March 2011 and the Implementation 
Team and the Instructional Leadership Team (ILT) were merged into one leadership team in May 2011. 
According to the principal, most leadership positions were open, meaning anyone in the school could have 
applied. Ultimately, two i3 project leaders were teachers who had been significantly involved with the grant 
application process. One project leader had previously been involved in a Project-Based Learning 
curriculum redesign project, thus had first-hand experience transitioning from a more traditional 
curriculum to a more student-centered curriculum like PBL. 

The core leadership team grew from three members during the initial year to approximately eight 
members by the end of the grant. Although the grant manager position was not funded the last year of the 
grant, the same individuals served as the core leadership team during most of the five years. For an 
overview of all teacher leaders and their responsibilities throughout the grant period, please see Table 15. 


Peripheral Members of Leadership Team 

At various times additional members were funded to complete specific leadership tasks. These 
members were not considered official members of the leadership team and were not required to 
attend leadership team meetings, but were funded to work on, as one teacher put it, “leadership activities”. 

• Teacher Leader [English Teacher] 

• Teacher Leader [English Teacher] 

• Teacher Leader [Social Studies] 

These peripheral members were mostly active during the last two years of the grant and assisted with 
developing materials and information for external audiences (e.g., articles about SHS and PBL for 
websites). 

Leadership Team Roles 

The principal and the teacher leaders all had distinct roles and responsibilities. Some of these roles 
and responsibilities remained constant throughout the grant and some of these shifted to address needs of 
teachers, and to fulfill the obligations and expectations of the grant. This section provides an overview of 
the roles and responsibilities of the principal and the teacher leaders. 

Overview of the Role of the Principal 

In general terms, the principal described his role as “chief planner” and described key aspects of 
his role as Project Director and school leader as: 

• Person thinking the most about systematic change over time 

• Facilitating and managing the change process 

• Building teacher leadership capacity 

• Ensuring implementation has fidelity with the grant 

• Celebrating successes along the way 

• Problem-solving 

Specifically, the principal’s role as Project Director and school leader can be outlined in three phases: 
building the foundation, facilitating the process, and extending the reach, which roughly coincide with 
year one, years two through four, and year five of the grant. 

Building the Foundation (Year 1) 

During year one, the principal’s main role was to build a foundation for the successful 
implementation of the grant over a five-year period. This primarily involved building relationships, 
establishing partnerships, and setting expectations for key stakeholders in the district office, school, and 
community. The principal understood and communicated to staff that innovation takes risk and the reality 
of an “implementation dip” as change begins. One key aspect of setting expectations for the project was 
communicating to stakeholders that implementing PBL and other aspects of the grant were going to be 
“substantial change, and that it was likely that when we started it, things, at times, [might] not go well.” 
Other primary responsibilities of the principal during year one included: 


• Writing job descriptions for grant leadership positions (grant manager and teacher leaders), in 
collaboration with a university researcher 

• Working with bargaining units and district human resource department to get leadership positions 
approved 

• Hiring a grant manager and teacher leaders 

• Organizing and facilitating the grant leadership team to begin implementing the transition to PBL 
curriculum 

• Identifying individuals for the grant Advisory Board 

• Developing initial version of Key Elements in partnership with teacher leaders, university 
researchers, and Knuth Research 

• Meeting with district administrators and curriculum personnel to develop common expectations 
for PBL curriculum development and professional learning 

Facilitating the Process (Years 2-4) 

The role of the principal in year one was focused on setting the stage for successful 
implementation of the grant and taking initial steps to begin the curricular and cultural shift at the school. 
Once work was underway, the second phase required a long-term focus on supporting teachers as they 
shifted their practice and developed professional expertise in PBL. During this time, shifts in 
implementation strategies took place (e.g., moving from a prescribed list of courses to be redesigned to an 
application process by interested teams of teachers) and teacher supports like the Key Elements were 
revised and refined. In general, the principal’s role shifted to one focused on facilitating the change 
process, maintaining focus on implementing the grant with fidelity by responding to feedback and making 
strategic and practical changes as necessary. The main responsibilities of the principal during this period 
included: 

• Maintaining long-term focus on implementing grant goals in the face of evolving district priorities, 
shifting academic standards, and changing school and district personnel 

• Continuing to set expectations and tone for the change process 

• Facilitating alignment of goals and expectations between PBL design teams, district supervisors, 
and district curriculum developers 

• Working with teacher leadership team to develop and implement professional learning and 
supports so middle and late adopters moved forward in the change process 

• Working with teacher leaders and a university researcher to refine Key Elements so they better 
support curricular and cultural shifts occurring at Sammamish 

• Strategically hiring new teachers who were aligned with school goals and had demonstrated 
professional collaboration in previous work 

• Building internal leadership capacity and problem-solving ability of teachers 

• Building relationships with direct supervisor(s) in an effort to support the ongoing work at the 
school 

• Ensuring work with external partners was aligned with school’s goals, and when necessary, 
changing or ending relationships 

• Gathering feedback from key stakeholder groups and making strategic adjustments as necessary 

• Maintaining communication between key stakeholders 

• Celebrating successes 

Extending the Reach (Year 5) 

After the initial year, the primary role of the principal was to maintain long-term focus on 
supporting teacher development and change as teachers shifted their instructional practice and developed 
professional expertise in PBL. In the final year of the grant, the principal continued to work with the 
leadership team to support teacher growth and engagement with the project. He also worked to ensure 
that goals and obligations outlined in the grant were met. At the same time, there was an outward shift in 
focus. Curricular and cultural changes at Sammamish were communicated beyond the school through a 
variety of channels and the principal looked to build external partnerships that would help Sammamish 
continue the work after grant funds were gone. The main responsibilities of the principal during this 
period included: 

• Cultivating new partnerships with community stakeholders and organizations for the purpose of 
extending the impact of the grant at Sammamish, within the district, and across the larger 
education community 

• Deepening and redefining established partnerships to ensure Sammamish is continuing to receive 
value 

• Working with leadership team and teachers to revise the teacher leadership structure so more 
stakeholders feel involved and heard 

• Mediating new district initiatives that could take focus away from the change process and meeting 
goals outlined in the grant 

• Facilitating further refinement of the Key Elements and their integration into the new teacher 
evaluation system (based on the Danielson framework) 

• Collaborating on a book, and presentations for local and national education conferences 

• Facilitating outreach tours of Sammamish with groups of local, national, and international visitors 

• Maintaining focus on long-term change by planning for “after the grant” 

• Continuing to set expectations and tone for the change process so teachers would continue to 
develop and collaborate 

• Strategically hiring new teachers who were aligned with school goals and had demonstrated 
professional collaboration in previous work 

• Ensuring work with external partners was aligned with school’s goals, and when necessary, 
changing or ending relationships 

• Maintaining communication between key stakeholders 

• Communicating with teachers that PBL was going to be a continued focus of the school 

Role of Teacher Leaders 

While individual teachers on the leadership team had specific roles and responsibilities (see Table 
15), as a group they were responsible for facilitating implementation of the grant. This teacher leadership 
team, along with the principal, was responsible for ensuring all aspects of the grant were being addressed 
and that the school was following through with the plans outlined in the grant application. The role of the 
leadership team as a group can be outlined in three phases similar to the role of the principal. A phrase 
repeated by the leadership team was “building the plane while flying,” meaning the team was 
simultaneously engaged in thinking long term about supporting professional learning and deepening the 
implementation of PBL throughout the school and “putting out fires” like planning the next school visit 
or running off to a meeting with other school leaders within the district. While each teacher leader fulfilled 
specific roles within the team, such as PBL Implementation Lead or Starting Strong/SHS Leads 
Coordinator, their roles were also somewhat fluid and ever-changing depending on the very immediate 
and urgent needs of the project at that time. 

Building the Foundation (Year 1) 

A large part of this first year was spent identifying and hiring teachers for the teacher leadership 
team. The Implementation Team, not fully formed until February/March, initially consisted only of the 
Grant Manager and two Project Leaders. Near the end of the year (May 2011), the Implementation Team 
and the Instructional Leadership Team (ILT) were merged into one leadership team. During year one, 
primary responsibilities for the leadership team included: 

• Identifying individuals for the grant Advisory Board 

• Developing initial version of Key Elements in partnership with principal, university researchers, 
and Knuth Research 

• Organizing the mentoring program 

• Revising Starting Strong 

• Devising a communication plan (e.g., using OneNote, capturing gained knowledge, feedback 
mechanisms, collaborative process, school website) 

• Planning staff meetings and board presentations 

• Planning PBL PD on release days in May 

• Planning first Sammamish Institute of Learning and Teaching (SILT) 

• Reviewing budgets and staffing for PBL redesign teams 

Facilitating the Process (Years 2-4) 

During this stage of the project, teacher leaders spent much of their time supporting teachers as 
they worked to design PBL curriculum in design teams and implement PBL curriculum in their classroom. 
This support mostly included designing professional learning experiences based on feedback they were 
receiving from teachers. At the end of year three and throughout year four, teacher leaders were 
increasingly involved in managing and facilitating school visits as other schools, districts, and educational 
organizations became interested in what was happening at the school. The main responsibilities for the 
leadership team during this period included: 

• Maintaining long-term focus on implementing grant goals in the face of evolving district priorities, 
shifting academic standards, and changing school and district personnel 

• Shifting course redesign from being prescriptive to application-based where teacher teams apply to 
redesign a course (year X?) 

• Developing and implementing professional learning and supports so middle and late adopters 
moved forward in the change process 


• Expanding, revising, and refining Key Elements to better support curricular and cultural shift 

Extending the Reach (Year 5) 

At this point in the project, teacher leaders spent much of their time establishing an organizational 
structure for interacting with external partners and organizations that were interested in the work going on 
at the school. At the end of year four and throughout year five, several teacher leaders were intimately 
involved in revising and publishing the Key Elements document to an audience outside the school. The 
main responsibilities for the leadership team during this period included: 

• Cultivating new partnerships with community stakeholders and organizations for the purpose of 
extending the impact of the grant at Sammamish, within the district, and across the larger 
education community 

• Working with the principal and other teachers to revise the teacher leadership structure so more 
stakeholders feel involved and heard 

• Further refinement of the Key Elements and their integration into the new teacher evaluation 
system (based on the Danielson framework) 

• Collaborating on a book, and presentations for local and national education conferences 

• Facilitating outreach tours of Sammamish with groups of local, national, and international visitors 

• Maintaining focus on long-term change by planning for “after the grant” 

• Continuing to set expectations and tone for the change process so teachers would continue to 
develop and collaborate 


Table 15. 

Overview of Teacher Leaders, Department Affiliation, Major Responsibilities, and FTE 

Grant Manager (N/A, not a teacher) 
  Responsibilities: preparing annual reports; developing, managing, and maintaining the budget; managing contracts; ensuring compliance; providing logistical support for events; communications 
  Grant-funded FTE: 1.0 (Y1), 1.0 (Y2), 1.0 (Y3), 0.5 (Y4); not funded in Y5 

i3 Project Leader (Social Studies) 
  Responsibilities: worked on i3 grant; PBL professional development; PBL curriculum design team support; revising Key Elements; helping support UW TCs during Sammamish Leads; sharing the Sammamish story with external audiences 
  Grant-funded FTE: 0.4 (Y1), 0.4 (Y2), 0.7 (Y3), 0.6 (Y4), 0.4 (Y5); maternity leave Sept-Dec 2013 

i3 Project Leader (Science) 
  Responsibilities: mentorship; community outreach; engaging outside experts; supporting PBL professional development; school website development; took the lead with creation of the Advisory Board, planning and running all four Advisory Board meetings per year, and maintaining ongoing communication with the Board 
  Grant-funded FTE: 0.4 (Y1), 0.6 (Y2), 0.6 (Y3), 0.4 (Y4), 0.4 (Y5) 

Teacher Leader (Literacy Coach) 
  Responsibilities: Sammamish Leads 
  Grant-funded FTE: 0.2 (Y2), 0.2 (Y3), 0.2 (Y4), 0.4 (Y5) 

Leadership Team Meetings 

The school leadership team met weekly throughout the duration of the grant, and this opportunity 
to meet weekly was funded through grant monies. The researcher was provided access to the leadership 
team meeting agendas recorded in Microsoft OneNote. OneNote was used to record meeting agendas and 
at times meeting notes and other relevant files and documents. This OneNote notebook was available for 
anyone working at the school to view. In the school leadership team meetings OneNote file, 151 
leadership team meeting agendas were recorded, starting on April 12, 2011 and ending on June 17, 2015. 
These meetings generally occurred once a week starting at the beginning of each school year (except year 
1) and concluding at the end of the school year. Meetings also did not take place during times that school 
was not in session (e.g., holidays, winter break, spring break). 

The purpose of these weekly leadership team meetings was to discuss issues related to grant 
implementation, instruction, and leadership. Meeting topics were related to discussing and planning: 

• Grant and grant implementation 

• District and state assessments and student achievement data 

• Roles, responsibilities, and goals of leadership team members 

• Starting Strong/Sammamish Leads 

• Communication about grant 

• Staff meetings 

• PD for staff (e.g., SILT, other PD days) 

• Advisory Board meetings 

• Presentations (e.g., School Board, external groups) 

• Visits by outside groups (e.g., other schools and districts, Microsoft executives) 

• Sources and methods of obtaining additional funding 

Advisory Board Meetings 

The leadership team was also responsible for forming and meeting with a grant Advisory Board. 
According to the i3 grant application, the Advisory Board’s purpose was to provide: 

• Program guidance for STEM college readiness and career preparedness 

• Assistance with securing resources 

• Assistance with scaling by leveraging professional connections 

• Assistance with seeking funding to support project implementation in new settings 

For most of the first year, the leadership team was identifying individuals who could serve on the Advisory 
Board. Membership on the Advisory Board shifted several times throughout the duration of the grant. In 
general, Advisory Board members included representatives from the partner university, from local business 
and industry, and from the administrative ranks within the school district. Typically, meetings lasted 2-3 hours and gave school leaders and the Advisory Board a chance to discuss the progress the school was making and various problems that surfaced. During year two the Advisory Board met approximately three times; during years three through five the Board met four times each year. Generally, meetings occurred twice in the fall and twice in the spring. Meeting dates recorded in leadership team 
meeting notes can be viewed in Table 16 below. 


Table 16. 

Advisory Board Meetings 

          Year 2 (11-12)    Year 3 (12-13)     Year 4 (13-14)     Year 5 (14-15) 
Fall      10/2011           9/2012, 11/2012    10/2013, 12/2013   10/2014, 12/2014 
Spring    1/2012, 5/2012    2/2013, 5/2013     2/2014, 5/2014     3/2015, 5/2015 


Findings 

While teachers cherished the time to collaborate with colleagues to redesign curriculum and refine 
their practice, they experienced varied levels of support from teacher leaders in that process. 

• Teacher leaders successfully used the Key Elements to design thoughtful professional learning for 
and with teachers. Many teachers viewed the Key Elements as a crucial form of support that 
guided their PBL curriculum redesign process. 

• While teacher leaders brought expertise and experience specific to PBL to their work with teachers, 
they struggled at times to support their colleagues’ content-specific needs, both as a group and 
individually, as they worked to design or implement PBL curriculum. 

• Support for incorporating outside experts into the curriculum redesign process was problematic. 
Teachers expressed a need for more help connecting with experts but were provided little specific 
structure or support in that area. 

• Working without clearly delineated roles and responsibilities both empowered and compromised 
teacher leaders. 

The Key Elements as a Crucial Support for Teachers’ PBL Work 

One of the central roles of teacher leaders throughout the grant was to further define and support 
the implementation of the Key Elements of Problem Based Learning as a description of PBL pedagogy 
and practice. The Key Elements anchored their work with teachers who were designing and implementing 
PBL curriculum. Teacher leaders used the Key Elements to ground all professional learning experiences. 

As such, the Key Elements emerged as a central tool teacher leaders used to support teachers’ work in and 
out of the classroom. 

While a majority of teachers described the Key Elements as a valuable tool for their ongoing 
professional growth, some dismissed them as either too simplistic or too general in scope. A World 
Language teacher described how they provide teachers with a “common language.” A Social Studies 
teacher expressed a similar opinion stating, “It gives a framework and it gives a way to talk to each other.” 
A Performing Arts teacher described how the Key Elements “made me reflect on my teaching. I kind of 
turned it [teaching] back on myself with each of the Key Elements.” Those teachers that were dismissive 
of the Key Elements described them either as more superficial than insightful, as “things [that] are like 
basic good teaching” or as lacking some of the “content specific ways of thinking about things.” These 
teachers’ dissenting views represented a minority of teachers we interviewed. 

Teachers voiced appreciation for how teachers and teacher leaders used the Key Elements to 
design relevant and thoughtful professional learning experiences. A teacher leader described them as a 


“teaching improvement system” that guided teacher leaders’ “thinking about teachers and developing them 
[the Key Elements] to create a learning environment that’s conducive to all students. I mean what a great 
place to start.” A new teacher to Sammamish recalled their first experience attending the Summer Institute of Learning and Teaching (SILT) and how different it was from other professional development 
they had experienced. They said, “I left just like jumping up and down because I had never sat in a PD 
where people were asking questions about things. About students and about learning, and sharing ideas 
and being excited about anything. I mean it was just totally different. I was so excited to get started.” 

Supporting Content-Specific Curriculum Design 

Teacher leaders struggled to provide both more general, big picture support for PBL design and 
content-specific support teachers needed to make PBL work within the constraints of their discipline. 
Teachers were provided the Key Elements document to guide their thinking about PBL pedagogy but 
were provided few content-specific examples to draw from. This was by design. Teacher leaders and the 
principal wanted teachers to engage the curriculum design process creatively and expansively and to design 
curriculum that worked best for the specific students they served. While some teachers saw this as an 
exciting challenge, others found it frustrating and counter-productive. 

Teacher leaders worked to support teachers’ PBL curriculum design regardless of content-area 
constraints. This proved problematic for the teacher leaders and teachers. As one Math teacher stated, “I 
think the thing that kind of frustrating me the most is people who are running this whole thing trying to tell us how to do it when they’ve never taught a single math lesson in their life.” Although shared by other teachers, this concern did not appear to be universal throughout the staff. Many teachers saw the lack of content-specific support as an authentic problem in and of itself that demanded they creatively and 
collaboratively problem solve in design teams. As a Science teacher stated, “Having a lack of PBL 
curriculum and examples I think in some ways is a challenge, I mean just like with teaching, having a 
model to look at, is helpful, um but I also think if you are looking for a PBL curriculum you are looking 
for a technical solution to an adaptive problem.” A teacher leader corroborated this perspective saying, “If 
we are just keep giving them all these sample units, we’re not actually getting the change that we want. So 
from that lens like I don’t think that [not having PBL curriculum] was a problem.” While teacher leaders 
worked to support all teachers as much as possible, they simply did not have the content knowledge to 
provide the support some teachers needed. This tension remained both a practical and philosophical 
conflict throughout the duration of the grant. 

Using Outside Expertise for PBL Curriculum Design 

Contacting, partnering with, and leveraging outside experts proved problematic throughout the duration of the grant. While both teachers and teacher leaders acknowledged this as a problem early on in the PBL work, a good solution remained elusive. Teachers were concerned about the lack of experts available for them to contact and about how best to use their expertise when they brought experts into the curriculum design 
process. While a few individual teachers fostered productive relationships with outside experts, the process 
proved too overwhelming and ambiguous for many teachers. Teacher leaders also expressed frustration 
with their lack of expertise and experience in this area and thus with their limited ability to support 
teachers. 



Many teachers were blunt in expressing their disappointment with the lack of support regarding 
experts in the classroom. A Math teacher stated, “Outside expertise has been impossible. We’ve just been 
making up what we could on our own.” An English teacher shared this sentiment, saying, 

I’ve been really, really, really disappointed by the focus that was given to us 
on how we should be involving outside experts. We were on our own, 
essentially, to locate those people, provide them, schedule them, bring them 
in. And then at the tail end of that, we were also expected to share those 
people’s names with a person for collection and entry into a database. Um, 
and I think that a lot of us felt like we had traded on personal relationships 
with people. 

Other teachers offered more nuanced assessments of what went wrong with the process, saying, 

They [leadership team] had a teacher leader looking for experts for other 
disciplines, but she didn’t have any contacts in our world, and it was, and 
then in certain languages, certain subjects it’s harder to find the experts. So, 
so we didn’t have the background research or the background experts to 
come in. And we didn’t have the expertise to help with our PBL. So, you 
know that was kind of where we were lacking. 

Although this was clearly a point of weakness in the support provided to teachers, this task constituted new work for everyone involved. While some teachers likely had experience working with outside experts in their own classrooms or extra-curricular activities, very few models exist for how to locate and thoughtfully integrate experts into the curriculum planning and implementation process. Although teachers have great facility with the content they teach, they may not have much contact with the professionals and outside experts who use that content in their fields. For example, a Biology teacher may or may not have biologist friends or acquaintances to tap as external experts. The assumption that teachers would have vast networks of professionals working in the content areas they teach further complicated efforts to connect teachers with outside experts. 

Teacher Leaders as Empowered and Compromised Agents of Change 

The Sammamish High School principal greatly empowered the teacher leaders to design and lead 
all professional learning experiences and support the PBL design and implementation process. He afforded the teacher leaders latitude to define their roles and responsibilities and to support teachers’ work in ways that best matched their skills, talents, and strengths. Throughout the five years of the grant-funded PBL work, the membership of the leadership team remained largely consistent. As teacher leaders continued to deepen and broaden their leadership and pedagogical expertise, they reinvested that expertise back into their work with teachers. Over time, the leadership team, consisting of the teacher leaders and the principal, developed a common vision for how they should continue to support teachers, as well as efficient and effective ways of working together. Much of the success of the PBL project can be attributed to the care 
and support the teacher leaders provided to teachers as they worked to design and refine their PBL 
courses. 

Over time, however, there emerged resentment amongst some teachers that decision-making and 
power consolidated within the teacher leader ranks. One teacher stated, “I think there has been a growing 
feeling about just a core few group of people is making decisions. Um, and I think there was some 
resentment building about that as though it’s like sort of this inner circle.” Teacher leaders were funded 


through the grant to be released from their typical teaching responsibilities. While this made sense from a 
school leadership perspective, some teachers increasingly found this policy to be problematic. Some 
teachers pointed out that they had more experience and expertise planning and teaching PBL, at least that 
which is articulated in the Key Elements, than the teacher leaders who were responsible for supporting 
and guiding their work. Lastly, some teachers questioned how the teacher leaders were held accountable 
and what their specific roles and responsibilities were. One teacher argued, “I don’t see the leadership roles 
as effective at all. I think they are extremely ill-defined. I think that there, well, there isn’t a job description, 
thusly there is no set of accountability measures to be taken that, one, these people are doing their job, or 
[two] that their job is necessary, or [three] that it is effective.” 

To be fair, these sentiments, although deepening amongst some teachers, were not universal. Some 
teachers expressed gratitude towards the teacher leaders and the work they had done to help move the school 
towards PBL. While noting that greater transparency about specific teacher leader roles and responsibilities 
and how they were held accountable would have been welcomed, a teacher stated, “All that being said, I 
think they have done a hell of a job, I think that by and large the school respects them tremendously and is 
incredibly grateful to them, I know that I am. I have respect for every single one of those people and I, 
and I know how hard they have worked.” Many teachers we spoke to described both a candid assessment 
of the drawbacks of a teacher leadership role and a deep respect for the work teacher leaders had 
accomplished. 

For their part, many of the teacher leaders were not unaware of the growing discontent towards them. 
One of the first teacher leaders to be promoted spoke in thoughtful ways about the role and how difficult 
it is to balance leadership responsibilities with existing relationships with colleagues. She said, 

Yeah I think, I mean I would add one thing. I think the, um, the innovation 
of doing, of changing instruction significantly within a traditional 
comprehensive high school, um, knowing how to do that means knowing 
how to make change amidst turnover, and amidst personalities of 65, 70 
different staff. And amidst the, you know, power differentials within 
departments. And amidst people being people and leaders having their own 
specific styles that work with some people and don’t work as well with some 
people, and knowing that all those constraints. I think our challenge, and the 
win of this grant, if it comes to fruition in that way, is knowing how do you 
make significant change? And you are gonna have people come and go, and 
you are gonna have people who work better or worse with other people. And 
you are gonna have, even whoever you put in leadership positions is gonna 
have, even if they are teacher leaders, they are gonna have specific styles. And 
they are gonna have limitations and they are gonna have personal lives. And 
they are gonna have, you know, um. And so it’s about making change within 
that structure which, um, has not happened perfectly here. But I think has 
happened in some powerful ways, but, and I would say too that in some 
instances people who sound negative in an interview around PBL have 
actually shifted the practice in the classroom in ways that don’t come across 
in an interview. 

Speaking in the final year of the grant, this teacher leader reflected on the importance of the interpersonal 
dynamic of school leadership and how making change is a deeply personal and intimate endeavor. The 


process of changing how people teach, how they see themselves as teachers and teacher learners, and how 
they interact with students is a long, difficult, yet not impossible process. 

Discussion 

While most of our analysis has been about the role of the teacher leader, we want to highlight 
several key components of the successes and struggles the leadership team faced throughout the duration of the 
grant. First, the distributed leadership approach taken by the principal was an integral and central 
underlying philosophy behind his decision-making. Eschewing a top-down approach to guiding the 
process of PBL design and implementation, he established the teacher leader core and engaged them as 
collaborators and equal decision makers. He empowered them to redesign the way professional learning 
experiences were developed and made those teachers the school’s primary source of support and expertise. 
Second, although we describe the formal role of teacher leader within the school, many informal teacher 
leaders have emerged since the beginning of the PBL project. Teachers have presented curriculum at national 
conferences, have served as guest speakers and teachers in university teacher education programs, and 
have taken up leadership positions within the district. Sammamish teachers have taken advantage of the 
flexibility they have been afforded to express leadership in myriad ways within and without the school. 
Third, many of the problems that have surfaced as a result of teacher leaders’ work have more to do with 
the social, political, and emotional inner workings of public schools than any specific policy failure. Teacher 
leadership policies are problematic not just logistically but socially and interpersonally. Teaching is a 
profession built upon a strong egalitarian ethos. Promoting teachers within a school to a position of power 
and influence over other teachers creates interpersonal dilemmas both for the teachers who do not 
become teacher leaders and those that do. Lastly, not only were school leaders and teacher leaders flying 
blind, or as they would put it, “flying the plane while building it,” but much of the work they were doing 
was unprecedented in public schools. Few public schools have attempted to transform themselves from a 
traditional school to a PBL school by drawing from existing teachers’ expertise. 



Chapter 6: Exploratory Studies 1 and 2 

Assessing the Impact of Problem Based Learning (PBL) on Students’ Career and College 

Readiness 

In this section we share findings from two Exploratory Studies intended to identify possible 
components that impacted student growth and learning. In the first study we explore whether any differences exist in student performance on AP tests. We compare a group of students who took AP tests prior to 
the school’s PBL adoption with a group of students who took AP tests while the school was implementing 
PBL across content areas. In the second study we explore the impact of the school’s summer Sammamish 
Leads (formerly Starting Strong) program meant to introduce incoming students to PBL and to enrich 
current students’ PBL skills by engaging them in relevant and authentic challenge cycles with industry 
experts. 

We organize our findings using the following structure. First, we provide background to both 
studies by revisiting the school’s stated goals in their i3 grant proposal and the demographic background 
about the students who have attended SHS over the past ten years. Second, we provide methodological 
background to how we collected, organized, and analyzed the quantitative data. Third, we share findings 
from Exploratory Study # 1 in which we analyze longitudinal data comparing student performance on AP 
tests prior to and during the school’s adoption of PBL and differences in AP pass rates over time. 
Specifically, we share findings from data collected at the school, course, and department level. This data 
includes a comparison of all students selected for inclusion in matched groups and subsets of those 
students who receive free and reduced lunch (FRL) services, students who receive Special Education 
accommodations or students with disabilities (SWD), and students who speak a first language other than 
English at home (EngNotFirst). Fourth, we share findings from Exploratory Study #2 in which we assess 
the extent to which student participation in Starting Strong/Sammamish Leads impacted participating 
students’ future performance on the Campus Ready instrument. Finally, we discuss what these findings 
mean in the context of the school as we have observed it in the duration of our study. 

As an i3 development project, the major evaluation goal was to provide data that project leaders 
could utilize to define, refine, and improve the project. This resulted in an updated logic model and fidelity 
of implementation criteria in which we articulate the essential components of the project. This, then, 
enabled the evaluation team to design and answer three research questions. Each of these questions aimed to identify potential impacts that could be investigated using rigorous research techniques in a future validation project. The findings from these investigations illuminate relationships between project 
components and outcomes related to student college and career readiness. 

The research questions we address in Exploratory Study #1 are as follows: 

Question 1: AP Test Performance. Is there a relationship between student participation in courses that 
were targeted for PBL redesign and college and career readiness as defined by student performance on AP 
Tests (mean AP scores)? 

Question 2: AP Pass Rates. Is there a relationship between student participation in courses that were 
targeted for PBL redesign and college and career readiness as defined by student performance on AP Tests 
(pass rates)? 


The research question we address in Exploratory Study #2 is as follows: 

Question 3: Starting Strong. Is there a relationship between student participation in Starting 
Strong/Sammamish Leads and college readiness as measured by student performance on the Campus 
Ready instrument? 

Stated Goals and Student Populations 

The stated outcome goals for the Sammamish High School i3 Development grant are as follows: 

• 20% increase in AP pass rates, especially in STEM content areas (Biology, Chemistry, Statistics, 
Calculus AB/BC, Physics, Environmental Science), 

• 20% increase in students with disabilities (SWD) and limited English proficient students (LEPS) 
enrolling in AP STEM classes, 

• 75% of all students, 50% of SWDs, and 60% of LEPS successfully completing pre-calculus with a 
B or better, 

• 100% of all students reaching standard on the state math test, 

• 10% annual improvement on the state science test for all students, 

• 15% annual improvement for SWDs and LEPs, 

• 90% on-time graduation rate for SWDs and 75% on-time graduation for LEPS. 

Although the school’s grant proposal explicitly references students’ scores on state tests, we have 
chosen to compare students’ performance on AP exams for several reasons. Educational research 
demonstrates a strong correlation between a student’s grade point average (GPA) (Geiser, Santelices, 2007; 
Sawyer, 2013) and their ability to succeed in college, noting that GPA remains one of the best predictors 
of a student’s success in college. Scholars have also found similar strong correlations between a student’s 
engagement and achieved success in AP coursework and their ability to succeed in college (Dougherty, 
Mellor, Jian, 2005; Reid, Moore, 2008). 

Conversely, educational researchers have yet to identify a strong correlation between a student’s 
score on various state high-stakes tests and their college readiness. This conundrum is especially pronounced in Washington State where, since 2002, the state high-stakes test has changed three times, from the Washington Assessment of Student Learning (WASL) to the High School Proficiency Exam (HSPE) to the current Smarter Balanced Assessment (SBA), making longitudinal comparisons between groups of students using those measures nearly impossible. 

These desired outcomes listed by the school were closely linked to the school’s desire to improve 
students’ college and career readiness and further open access to rigorous science, technology, engineering, 
and mathematics (STEM) coursework to more students through the use of problem based learning in all 
core content coursework. To address the research questions stated above, we assembled a large database 
of student variables. 

PBL Redesigned Courses, Student Cohorts, and Student Populations 

One of the major goals of the i3 project was to redesign a significant portion of the courses 
offered into PBL courses. The following is a list of the courses that were targeted for redesign (i.e., had a 
teacher design team work on the redesign) and their corresponding year of implementation. Note that 




these courses are labeled as ‘targets.’ Whether or not PBL was implemented is not reflected in this list. Of 
these courses, 13 had an Advanced Placement designation. 

Table 17 shows the courses that were targeted for PBL redesign and the year in which they were 
redesigned. 


Table 17. 

Courses Targeted for PBL Redesign 

Targeted Courses          School Year of Implementation 
AP BIOLOGY                12 
ALGEBRA 2                 12 
AMER LIT/COMP             13 
AP AM STUDIES             14 
AP AM/COMP GOV            14 
AP AMER GOVT              10 
AP CHEMISTRY              13 
AP CHINESE LANG           12 
AP ENV SCI                10 
AP HUMAN GEOG             11 
AP PHYSICS 1              14 
AP SPANISH LANG           13 
AP US HIST/LNG/CMP        14 
AP US HIST                13 
AP WORLD HIST             12 
BIO/CHEM 1                12 
BIO/CHEM 2                12 
BIO/CHEM I & II           11 
CHINESE 3                 12 
CHORALE CHOIR             12 
CONCERT BAND              12 
CONCERT CHOIR             12 
CORE PHYS ED 1            12 
CORE PHYS ED 2            12 
DIG VID/AUDIO 1           11 
DIG VID/AUDIO 2           11 
DIG VIDEO/AUDIO           11 
DRAMA                     12 
ELL AMER LIT              12 
FRENCH 3                  13 
GEOMETRY                  13 
HEALTH                    12 
HLTH SCI CAREER           12 
HON FROSH COMP            12 
HON SOPH COMP             12 
INTRO COMP PROG           11 
ORCHESTRA                 12 
PHYSICS                   13 
PRE-CALCULUS              14 
SPANISH 4                 13 
US HISTORY                13 
WORLD HISTORY             12 


Rather than graduation year, in this report we refer to each group of students as a Cohort defined 
by the school year in which students were or would have been freshmen. Table 18 below provides a key 
for Cohort and Graduation years, as well as the total number of students in each cohort. 


Table 18. Cohort Names, Graduation Year, and Number of Students 

Cohort     Graduation Year    Number of Students 
C 2002     2006               361 
C 2003     2007               419 
C 2004     2008               391 
C 2005     2009               417 
C 2006     2010               364 
C 2007     2011               347 
C 2008     2012               360 
C 2009     2013               347 
C 2010     2014               361 
C 2011     2015               371 
C 2012     2016               324 
C 2013     2017               282 
C 2014     2018               207 


Table 19 below presents each cohort and its related dosage with respect to the years the project 
was being developed and implemented. Because of the way redesigned PBL courses were phased in over 
time it is important to clearly understand which students received which courses and how many courses 
each student took. 


Table 19. 

Comparison of Mean Number of PBL and All Courses Taken by Cohort 

Cohort       Mean Number of PBL Targeted Courses    Mean Number of All Courses Taken    Percent 
Cohort 2     0                                      22                                  0% 
Cohort 3     0                                      21                                  0% 
Cohort 4     0                                      24                                  0% 
Cohort 5     0                                      26                                  0% 
Cohort 6     0                                      28                                  0% 
Cohort 7     1                                      30                                  3% 
Cohort 8     2                                      31                                  5% 
Cohort 9     2                                      30                                  8% 
Cohort 10    5                                      30                                  16% 
Cohort 11    20                                     30                                  66% 
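
The per-cohort figures in Table 19 amount to a simple ratio of targeted PBL courses to all courses taken. The sketch below shows one way such a summary could be computed; the file name and column names (enrollments.csv, student_id, cohort, pbl_targeted) are illustrative assumptions, not the evaluation team's actual data structures.

```python
import pandas as pd

# Illustrative sketch: mean number of targeted PBL courses and of all courses taken
# per cohort, and the percent of coursework that was PBL-targeted (as in Table 19).
# Assumed columns: student_id, cohort, pbl_targeted (0/1 flag).
enrollments = pd.read_csv("enrollments.csv")

per_student = enrollments.groupby(["cohort", "student_id"]).agg(
    pbl_courses=("pbl_targeted", "sum"),
    all_courses=("pbl_targeted", "size"),
)
per_cohort = per_student.groupby("cohort").mean()
per_cohort["percent_pbl"] = 100 * per_cohort["pbl_courses"] / per_cohort["all_courses"]
print(per_cohort.round(1))
```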




Some students in Cohort 2007, who began their time at SHS 3 years before the project was 
initiated, had access to the first PBL courses designed as a sort of pilot to this project. Conversely, many 
students that began as Freshmen during the project did not have opportunities to take the upper level 
redesigned courses that came online after they had graduated. Two cohorts in particular had four full years 
in the project but their doses were considerably different. Cohort 2010 was essentially one year ahead of 
much of course implementation and as such is more similar to a partial dose cohort. Cohort 2011 is the 
first cohort that received a ‘near’ full dose of the redesigned curriculum although the courses were in the 
first year of implementation, in many cases. 

It is helpful to visualize the points in time relative to project years that each cohort was receiving 
the SHS curricular dose. Table 20 below presents each cohort crossed with each school year. There are 3 
basic levels of dosage of the i3-sponsored curriculum that students receive: No dose, partial dose, and full 
dose. Because of the staggered approach to course implementation the partial doses contain a range of 
intensities. Cohort 2011-12 was the first cohort to receive a near full dose and for which four years of data are available. Cohort 2012-13 was the first cohort to receive full-dose access to all redesigned courses, but only data through their junior year (2015) are available. 


Table 20. 

Cohort Grade Level (FR, SO, JR, SR) by School Year and Dosage Level 

[Table not fully reproduced in the text version. The table crosses each cohort (Cohort 2002-03 through Cohort 2014-15) with each school year (SY 2002-03 through SY 2014-15); each cell marks the cohort's grade level (FR, SO, JR, SR) in that year, and cell shading indicates the dosage level received, ranging from No Dose through Partial Dose to Full Dose, with partial and full doses further distinguished by whether courses predated the redesign and by the grade spans covered (Fr, Fr-So, Fr-Jr).] 

How We Describe Groups of Students 

The evaluation team grouped subsets of students according to whether or not they receive free and 
reduced lunch (FRL) services, whether or not they have disabilities (SWD), and whether or not they speak 
a first language other than English at home. We provide rationale below for why we grouped students 
according to those categories in our analysis. 

Free and reduced lunch (FRL). One of the ways we disaggregate student performance data is by 
whether or not students receive free and reduced lunch services. Although typically under-representative 
of the number of students living in poverty, a school’s FRL statistic is commonly used by the Department 
of Education and other state government entities as the poverty index for a school. For example, a school 
does not qualify for Title I funds until 40% of their students receive free and reduced lunch support. Many 
educational researchers point to poverty as the single most entrenched problem facing public schools in 
the U.S (Darling-Hammond, 2010; Berliner, 2013). Their research indicates that a student’s socio¬ 
economic status can be a reliable predictor of a student’s performance on standardized test scores and 
overall performance in school, in many cases regardless of the race of the student (Sirin, 2005). 

We do not disaggregate the performance data by race for two reasons. First, the way the federal, state, and 
district entities collect data on students’ racial demographics is lacking. Students from the Middle East, for 
example, would have to identify themselves as African American, Hispanic, Native American, Multi-racial, 
White, or Other. These narrow categorizations would be confusing for parents and families and may lead 
to imprecise data based solely on racial classifications. Second, although such factors as parental 


educational background, race, ESL, and gender play a role, a student’s SES most significantly impacts the extent to which students can overcome those variables to access rigorous coursework that makes them more college ready (Cabrera, La Nassa, 2000; Berliner, 2013). In cases where students have low SES, the variables of race, ESL, and, at times, gender can further complicate their ability to access rigorous coursework and succeed in school. In cases where students have middle to high SES, those variables can be easier for students to overcome (Cabrera, La Nassa, 2000). 

Students who speak a language other than English at home (EngNotFirst). Another way we disaggregate the 
data is to track students’ AP scores who speak a language other than English at home. Although this is not 
a perfect measure of a student’s proficiency with English, it does provide some indication of whether or 
not the English language remains a barrier to their learning in school. We do not use the standard 
classification of English Language Learner (ELL) or Limited English Proficiency (LEP) students to 
categorize students for the following reasons. First, we have found that categorizations of courses as ELL 
or LEP are not consistent from year to year. Second, the state exam used to classify students as ELL 
students has shifted several times since 2002, making an apples-to-apples comparison of ELL students’ AP 
scores between years problematic. Instead, we use the more stable and reliable self-reported statistic of students who report speaking a language other than English at home. A good example of this can be found in Table 21 below, which illustrates the striking difference between students who received ELL services and 
those who self-identified as speaking a language other than English at home. These data suggest that the 
range of language proficiency at Sammamish may be far wider than represented solely by the number of 
students who receive ELL services. 

Students with Disabilities (SWD). We use this phrase to describe students who receive Special Education accommodations as documented in a 504 Plan or Individualized Education Plan (IEP). These students may or may not be in a pull-out Special Education class. 

Table 21 illustrates the demographic changes Sammamish High School experienced from 2002 to 
2011. Of note is the increase in students with disabilities (SWD) and increase, since 2006, in students who 
speak a language other than English at home and students who receive free and reduced lunch (FRL) 
services. 


Table 21. 

Cohort       SWD    ELL    FRL    English Not First Language at Home 
Cohort 2     10%    5%     30%    30% 
Cohort 3     14%    4%     26%    26% 
Cohort 4     11%    5%     28%    28% 
Cohort 5     12%    3%     26%    26% 
Cohort 6     18%    3%     25%    25% 
Cohort 7     16%    6%     34%    34% 
Cohort 8     21%    5%     35%    35% 
Cohort 9     23%    9%     38%    38% 
Cohort 10    25%    8%     36%    36% 
Cohort 11    23%    9%     34%    34% 


Methodology: Exploratory Study #1 

In this study, we compared AP Test Scores of two groups: those students that had participated in 
at least one targeted PBL redesigned course and those that had not. The former group (Treatment) 
received varying dosages of the redesigned curriculum depending on when they were enrolled at SHS. The 
comparison group received no dosage. Students were drawn from school data as early as 2001 and as late 
as the 2014-15 school year. 

Matching 

A matched comparison group, pre-post design was used to answer questions about the impact of 
project implementation on AP Test performance. Two groups, treatment and comparison, were 
established based on student exposure to courses that were targeted for redesign by the i3 project. 
Treatment students had taken at least one targeted PBL course while comparison students had not taken a 
targeted PBL course. For the analysis a variable called “PBLPool” was created and coded with a 0 for the 
comparison group and 1 for the treatment group. 
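
As an illustration of this coding step, the sketch below derives a PBLPool indicator from a course-enrollment table; the file and column names are assumptions for illustration, not the project's actual variable names.

```python
import pandas as pd

# Illustrative sketch: flag each student as treatment (PBLPool = 1) if they took at
# least one course targeted for PBL redesign, otherwise comparison (PBLPool = 0).
# Assumed columns: student_id, course, pbl_targeted (0/1 flag for targeted courses).
enrollments = pd.read_csv("enrollments.csv")

pbl_counts = enrollments.groupby("student_id")["pbl_targeted"].sum()
students = pbl_counts.rename("n_pbl_courses").reset_index()
students["PBLPool"] = (students["n_pbl_courses"] > 0).astype(int)
```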

A decision was made to include students from cohorts 5 through 12 in the pool because (1) the 
number of years prior to the innovation was about the same as the project period, (2) full Free/Reduced 
Lunch data (a key covariate in our data analysis) was not available for earlier cohorts, and (3) these cohorts 
were temporally adjacent to each other which helped control for environmental and cultural differences 
experienced by these two groups. Next, all AP Test taker data from this time period was compiled into a 
single database in which the unique record identifier consisted of 2 variables: student ID and AP Test. 

Only the first attempt at each AP Test was utilized in the database to reduce re-testing effects. (A small 
percentage of students, less than 5%, typically retake AP exams.) 
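
A minimal sketch of that de-duplication step, under the assumption that each record carries a student ID, AP test name, test year, and score (hypothetical field names):

```python
import pandas as pd

# Keep only each student's first attempt at each AP test to reduce re-testing effects.
# Assumed columns: student_id, ap_test, test_year, score.
ap_records = pd.read_csv("ap_scores.csv")

first_attempts = (
    ap_records.sort_values("test_year")
    .drop_duplicates(subset=["student_id", "ap_test"], keep="first")
)
```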

Students were then matched on AP Tests taken and the number of years they spent at SHS. 
Demographic covariates (SWD, EngNotFirst, FRL, and gender) along with overall high school GPA were used as covariates in the statistical analysis. We then selected only those students that had taken AP tests in English, Math, Science, and Social Studies. While we don’t discount the importance of AP tests in the Arts and World Languages, data from these areas were spotty and inconsistent over time, making the longitudinal analysis difficult. 
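
The report does not specify the exact statistical model, but one common way to carry out a covariate-adjusted comparison of this kind is an ordinary least squares model with the treatment indicator and the covariates named above. The sketch below is illustrative only, and all file and column names are assumptions.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Illustrative sketch: AP score regressed on the treatment indicator plus the
# demographic covariates and overall GPA named in the text. Column names assumed.
df = pd.read_csv("matched_ap_records.csv")

model = smf.ols(
    "ap_score ~ PBLPool + gpa + C(swd) + C(eng_not_first) + C(frl) + C(gender)",
    data=df,
).fit()
print(model.summary())
```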

After matching was completed, we looked for systematic differences between matched and unmatched AP Test takers, as well as between AP Test takers and non-AP Test takers. The analysis revealed the following: the proportion of students with the category labels SWD, ELL, FRL, and EngNotFirst is much greater in the non-AP Test taker group than in the AP Test taker group. 

Chart 1 below illustrates the differences in percentages between students who took no AP courses, 
students who were not matched but took an AP course, and students who were matched and took an AP 
course. We compared the differences between these three groups according to the demographic subsets of 
students: ELL, SWD, EngNotFirst, and FRL. Chart 1 shows that students from those subsets 
disproportionally make up the percentage of non-AP students. However, since 2005, enrollment of SWDs 
and EngNotFirst students in AP courses has increased. Chart 2 below illustrates the raw number of SWD 
and EngNotFirst students who took an AP Test for each of the cohorts. 

To explore whether project implementation had an impact on the non-AP group, we discuss the results of a follow-up analysis below. 


Chart 1. Percent of ELL, SWD, EngNotFirst, and FRL students among non-AP Test takers, unmatched AP Test takers, and matched AP Test takers. [Chart not reproduced.] 


Chart 2. Number of SWD and EngNotFirst students taking AP tests, Cohorts 2005 through 2011. [Chart not reproduced.] 


Table 22 presents the number of students in each cohort that were matched by the number of SHS 
years and AP Test. The total number of AP Tests taken by the Treatment Group was 3,505. 


Table 22. 

Number of Students in Each Cohort Matched by Number of SHS Years and AP Test 

              English          Math             Science          Social Studies 
              Comp.   Treat.   Comp.   Treat.   Comp.   Treat.   Comp.   Treat. 
Cohort 2005   224     0        162     0        220     0        302     0 
Cohort 2006   218     0        164     0        194     0        340     0 
Cohort 2007   92      111      76      88       87      100      99      147 
Cohort 2008   50      177      38      132      52      162      68      259 
Cohort 2009   29      164      23      128      48      224      51      207 
Cohort 2010   14      113      13      104      18      195      26      187 
Cohort 2011   1       153      0       100      1       200      1       249 
Cohort 2012   1       76       0       16       0       73       0       140 
Totals        629     794      476     568      620     954      887     1189 



In Table 23 below, we illustrate the number of AP Tests used in the analysis by each course. 
Several test versions were available for calculus and physics. For the analysis we pooled the numbers of students and their scores into single combined test groups, taking a student’s best score if they had more than one test in the pool. Overall, there were 6,117 AP Tests taken by 1,462 students (treatment n = 810; comparison n = 652), for an average of 4.2 AP Tests per AP Test taker. 

Table 23. 

AP Tests Taken by Course 

                           Comparison   Treatment   Total 
Biology                    218          355         573 
Calculus (Combined)        268          360         628 
Chemistry                  118          198         316 
English Language           388          497         885 
English Literature         241          297         538 
Environmental Science      225          342         567 
U.S. Government            196          281         477 
Physics (Combined)         59           59          118 
Psychology                 98           127         225 
Statistics                 208          208         416 
U.S. History               263          293         556 
World History              330          488         818 
Total                      2612         3505        6117 


Assignment into either the treatment or comparison groups was determined by sorting students 
into two piles: those that had taken no courses that had been redesigned as part of project activities, and those that had taken at least one targeted PBL course. Students from Cohorts 2007 through 2012 were eligible for treatment group membership, while students from Cohorts 2005 and later were eligible for the 
comparison group. 

The comparison group consisted of students from all cohorts but primarily from cohorts 2005 to 
2010. Before matching, comparison students were sorted in descending order by cohort year while 
treatment students were sorted in ascending order. This allowed the matching algorithm to select matches 
(in case of ties) that were closest in time to project implementation. This helped to control for contextual 
variables such as teacher, curriculum, and content standards and also allowed for students to be matched 
to students in the same cohort. Even though there were more comparison students in the pool, students in the treatment group had taken more AP courses than comparison students. Thus, we allowed comparison students to be matched to more than one student/AP test in the treatment group. We then assigned a value representing the number of matches each comparison student was in and used that to weight the case in statistical calculations. 
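
As an illustration of that weighting step (with hypothetical file and column names), the count of matches per comparison record can be carried forward as a frequency weight in later calculations:

```python
import pandas as pd

# Illustrative sketch: matches.csv pairs each treatment record with its matched
# comparison record (assumed columns: treatment_id, comparison_id, ap_test).
matches = pd.read_csv("matches.csv")

# A comparison record's weight is the number of treatment records it was matched to.
weights = (
    matches.groupby(["comparison_id", "ap_test"])
    .size()
    .rename("match_weight")
    .reset_index()
)

# Example use: a weighted mean AP score for the comparison group.
comparison = pd.read_csv("comparison_ap_records.csv")  # assumed: student_id, ap_test, ap_score
comparison = comparison.merge(
    weights, left_on=["student_id", "ap_test"], right_on=["comparison_id", "ap_test"]
)
weighted_mean = (
    (comparison["ap_score"] * comparison["match_weight"]).sum()
    / comparison["match_weight"].sum()
)
print(weighted_mean)
```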


Chart 3. Percent of each cohort as members of PBL exposure groups. [Chart not reproduced; series: Comparison (Weighted), Comparison (Unweighted), Treatment.] 


Findings: Exploratory Study #1: Comparison of Student Performance in AP Coursework 

In this section we share findings from Exploratory Study # 1 in which we compared student 
performance on Advanced Placement (AP) tests using an interrupted time series analysis. We compared 
the mean AP test scores of students who took AP courses prior to the school’s adoption of PBL with 
students who took AP courses during the school’s ongoing implementation of PBL. 

Overall, students in the treatment group outperformed their matched peers in the comparison 
group on multiple Advanced Placement (AP) tests. In some cases, student gains were statistically 
significant, even when data was disaggregated according to students who receive free and reduced lunch 
(FRL) and Special Education services (SWD) and students who speak a first language other than English 
at home (EngNotFirst). 

Additionally, students in the treatment group overall have a higher percentage of AP tests passed 
than the comparison group despite more students taking AP courses and the associated AP tests over time. 
We also found a strong correlation between the number of PBL courses a student took and their mean AP 
score. As students took more PBL courses, their AP mean score increased. Within the treatment group, we 
also found a positive correlation between an increased number of PBL courses a student took and the 
percentage of AP tests they passed throughout their four years at Sammamish High School. 

When aggregated to department-level performance on AP exams, students in the treatment group 
experienced gains in AP scores in the Math, English, Science, and Social Studies departments. In the Math, 
Science, and Social Studies departments, student gains in mean AP scores were statistically significant. The 
departments that experienced the highest gains in student AP scores were the departments that more fully 
adopted PBL. Upon closer analysis, the data suggest that departments interacted with the PBL initiative 
and the Key Elements differently. While not every department adopted PBL in the same way, those that 
used the Key Elements to guide ongoing curriculum design and redesign experienced the largest student 
gains in AP scores throughout the department. 

In each of the following figures, we use a double asterisk (**) to denote statistically significant 
differences in mean AP scores. Also, we insert “error bars” to show the range beyond which gains were 
statistically significant. Error bars illustrate the extent of the variability that falls within the standard error; gains inside that range are not statistically significant. The range illustrated by error bars can change dramatically depending on whether we are examining scores by department or by course, and it can also differ from course to course. 

Rough Comparison of AP Score Means by Group 

Chart 4 below illustrates the difference in AP score mean between the comparison and treatment 
cohorts. When aggregating all AP scores received by every student in the comparison and treatment group, 
students in the treatment group experienced a statistically significant increase (2.29 to 2.49) in mean scores 
on all the AP tests they took. 




Chart 4. [Whole-school mean AP score, comparison vs. treatment groups; chart not reproduced.] 



In addition to an overall increase in AP score means, students also passed AP tests at a statistically significantly higher rate across the school. Chart 5 illustrates the difference in AP test pass rates by students 
in the comparison and treatment groups. 
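
Pass rates throughout this report treat a score of 3 or higher as passing. A minimal sketch of how such a pass-rate comparison could be computed (column names are assumptions, not the evaluation team's actual variables):

```python
import pandas as pd

# Illustrative sketch: percent of AP tests passed (score of 3 or higher) by group.
# Assumed columns: PBLPool (0 = comparison, 1 = treatment), ap_score.
df = pd.read_csv("matched_ap_records.csv")

df["passed"] = (df["ap_score"] >= 3).astype(int)
pass_rates = df.groupby("PBLPool")["passed"].mean() * 100
print(pass_rates)
```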


Chart 5. Percent of AP Test takers passing with a score of 3 or higher, whole school. [Chart not reproduced.] 



Comparison of Mean AP Scores by AP Course 

Overall, students in the treatment group experienced gains on AP scores across multiple AP 
courses including: AP Biology, AP Calculus (combined AB and BC), AP Chemistry, AP English 
Language, AP English Literature, AP Environmental Science, AP United States Government, AP Physics 
(combined all levels of Physics), AP Psychology, AP Statistics, and AP World History. Student gains on 
AP scores were statistically significant in AP Biology, AP Calculus (combined AB and BC), AP 
Chemistry, AP United States Government, AP Psychology, AP United States History, and AP World 
History. 

Table 24 below illustrates the mean AP score differences by course, between all students in the 
comparison and treatment groups. 

Table 24. 

Statistically Significant Gains in Mean AP Scores by Course 

Course                             Mean AP Score Difference 
AP Biology                         1.62 to 1.97 
AP Calculus (combined)             2.64 to 2.95 
AP Chemistry                       1.80 to 2.02 
AP United States Government        2.38 to 2.66 
AP Psychology                      2.53 to 2.90 
AP United States History           2.10 to 2.63 
AP World History                   2.49 to 2.63 



Chart 6 below illustrates the difference between the comparison and treatment groups in mean AP 
score by course. Amongst the courses listed below, only AP Biology, AP Chemistry, AP Physics, AP 
United States History, and AP World History were redesigned by funds from the i3 grant. However, all 
courses listed below were taught by teachers who had design team experience at some point during the i3 
project. 


Chart 6. [Mean AP score by AP course, comparison vs. treatment groups; chart not reproduced.] 



In addition to an increase in AP score means, students also passed AP tests at a higher rate across 
courses. Chart 7 illustrates the difference in AP test pass rates by course by students in the comparison and 
treatment groups. In AP Biology (19% to 33%), AP Calculus (combined) (50% to 58%), AP United States 
Government (38% to 51%), AP Psychology (48% to 63%), and AP United States History (34% to 47%), 
students in the treatment group passed at a statistically significant higher rate than students in the 
comparison group. 




Chart 7. [AP test pass rates by course, comparison vs. treatment groups; chart not reproduced.] 



Comparison of Mean AP Scores by Students Who Speak a First Language Other Than English at Home (EngNotFirst) 
Chart 8 below illustrates the difference in mean AP score between students who speak a first 
language other than English at home in the comparison and treatment cohorts. When aggregating all AP 
scores received by those students in the comparison and treatment group, students in the treatment group 
experienced a statistically significant increase (2.38 to 2.55) in mean scores on all the AP tests they took. 


Chart 8. School-level AP test performance, EngNotFirst students (estimated marginal means). [Chart not reproduced.] 



Students in the treatment group who speak a first language other than English at home 
experienced gains in AP scores in the following AP courses: AP Biology, AP Chemistry, AP English 
Language, AP English Literature, AP Environmental Science, AP United States Government, AP Physics 
(combined all levels of Physics), AP Psychology, AP Statistics, and AP World History. Those students 
experienced statistically significant gains in AP Biology, AP United States Government, AP Psychology, 
and AP United States History. 

Table 25 below illustrates the mean AP score differences by course, between students who speak a 
first language other than English at home in the comparison and treatment groups. 

Table 25. 

Statistically Significant Gains by Students Who Speak a First Language Other Than English at Home (EngNotFirst) in Mean AP Scores by Course 

Course                             Mean AP Score Difference 
AP Biology                         1.71 to 1.96 
AP United States Government        2.47 to 2.79 
AP Psychology                      2.68 to 3.02 
AP United States History           2.12 to 2.72 


Chart 9 below illustrates the difference in AP Score means between students who reported to 
speak a first language other than English at home in the comparison and treatment groups. 


Chart 9. [Course-level mean AP scores for EngNotFirst students, comparison vs. treatment groups; chart not reproduced.] 


In addition to an increase in AP score means, students who speak a first language other than 
English at home also passed AP tests at a higher rate across courses. Chart 10 illustrates the difference in 
AP test pass rates by course by students in the comparison and treatment groups. In AP Biology (23% to 
33%), AP United States Government (41% to 54%), AP Psychology (55% to 70%), and AP United States History (35% to 49%), the percentage of EngNotFirst students in the treatment group who passed the AP test was statistically significantly higher than the percentage of EngNotFirst students who passed the AP test in the comparison group. Of note are AP Calculus (combined), AP Statistics, and AP World History, in which a lower percentage of EngNotFirst students in the treatment group passed the AP test than their comparison group peers. 


Chart 10. [AP test pass rates by course for EngNotFirst students, comparison vs. treatment groups; chart not reproduced.] 



Comparison of Mean AP Scores by Students Who Receive Free and Reduced Lunch (FRL) Services 

Chart 11 below illustrates the difference in mean AP scores between students who receive free and 
reduced lunch (FRL) services in the comparison and treatment cohorts. When aggregating all AP scores 
received by those students in the comparison and treatment group, students in the treatment group 
experienced a statistically significant increase (1.82 to 2.06) in mean scores on all the AP tests they took. 




Chart 11. [Whole-school mean AP score for FRL students, comparison vs. treatment groups; chart not reproduced.] 



Students in the treatment group who qualify for free and reduced lunch (FRL) services experienced 
gains in AP scores in the following AP courses: AP Biology, AP Chemistry, AP Calculus (combined), AP 
English Language, AP English Literature, AP Environmental Science, AP United States Government, AP 
Psychology, AP Statistics, and AP World History. Those students experienced statistically significant gains 
in AP Biology, AP Environmental Science, and AP World History. 

Table 26 below illustrates the mean AP score differences by course, between students who 
received free and reduced lunch (FRL) services in the comparison and treatment groups. 

Table 26. 

Statistically Significant Gains by Students Who Receive Free and Reduced Lunch (FRL) Services in Mean AP Scores by Course 

Course                        Mean AP Score Difference 
AP Biology                    1.18 to 1.57 
AP Environmental Science      1.36 to 1.76 
AP World History              2.14 to 2.37 


Chart 12 below illustrates the difference in AP Score means between students who received free 
and reduced lunch services in the comparison and treatment groups. 




Chart 12. Course-level AP test results, FRL students (estimated marginal means). [Chart not reproduced; series: Comparison, Treatment.] 


In addition to an increase in AP score means, students who received free and reduced lunch (FRL) 
services also passed AP tests at a higher rate across courses. Chart 13 illustrates the difference in AP test 
pass rates by course by students in the comparison and treatment groups. In AP Biology (7% to 15%), AP 
English Language (22% to 32%), and AP Environmental Science (8% to 21%), the percentage of FRL 
students in the treatment group who passed the AP test was statistically significantly higher than the 
percentage of FRL students who passed the AP test in the comparison group. 




Chart 13. [AP test pass rates by course for FRL students, comparison vs. treatment groups; chart not reproduced.] 



Comparison of Mean AP Scores by Students With Disabilities (SWD) 

Chart 14 below illustrates the difference in AP score means between students with disabilities 
(SWD) in the comparison and treatment cohorts. When aggregating all AP scores received by SWDs in the 
comparison and treatment group, students in the treatment group (2.02) outperformed students in the 
comparison group (1.85). However, the gains experienced by these students in the treatment were not 
statistically significant. 


Chart 14. [Whole-school mean AP score for students with disabilities (SWD), comparison vs. treatment groups; chart not reproduced.] 



Students with disabilities (SWD) in the treatment group experienced gains in AP score means in 
the following AP courses: AP Biology, AP Calculus (combined AB and BC), AP English Literature, AP Psychology, AP Statistics, AP United States History, and AP World History. Those students experienced statistically significant gains on AP scores in AP Calculus (combined AB and BC), AP Statistics, AP Physics (combined all levels of Physics), and AP United States Government. 

Table 27 below illustrates the mean AP score differences by course between students with 
disabilities (SWD) in the comparison and treatment groups. Of note is the performance of students with 
disabilities (SWD) in AP Environmental Science. In that course, students in the comparison group 
outperformed their peers in the treatment group by a statistically significant margin (1.68 to 1.65).

Table 27. 


Statistically Significant Gains by Students with Disabilities (SWD) in Mean AP Scores by Course

Course               Mean AP Score Difference
AP Biology           1.50 to 1.67
AP World History     2.04 to 2.39


Chart 15 below illustrates the difference in AP Score means between students with disabilities 
(SWD) in the comparison and treatment groups. 




Chart 15. 



In addition to an increase in AP score means, students receiving special education (SWD) services 
also passed AP tests at a higher rate across courses. Chart 16 illustrates the difference in AP test pass rates 
by course for students in the comparison and treatment groups. The data in this chart represent an 
anomaly in the AP pass rate data overall. Not only do the error bars suggest substantial variability within 
the data, but in many cases the data suggest that the percentage of students with disabilities who passed AP 
tests decreased relative to their comparison group peers. While several courses, such as AP Chemistry, 
AP Calculus (combined), and AP United States History, appear to show sizable differences between groups, 
the wide range of variability suggests that those differences may not be statistically significant.




Chart 16. 


Percent of AP Test Takers Passing with a Score of 3 or Higher - SWD



AP Mean Score and Pass Rate by Intensity of PBL Dosage 

Within the data that show gains by students in the treatment (PBL) group, we found that the 
amount of PBL students were exposed to correlates with an increase in AP score means and an increase in 
the number of AP tests students pass. 

Chart 17 illustrates the positive correlation between the amount of PBL students in the treatment 
group experienced and an increase in their mean AP scores. This positive correlation is especially striking 
because it suggests that the difference between not passing (overall mean AP score of 2.22) and passing (overall 
mean AP score of 3.02) is associated with more intense exposure to PBL coursework.
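For readers who want to see the shape of this dosage analysis, the following Python sketch groups AP test records into the same exposure bands used in Charts 17 and 18 and computes the mean score and pass rate for each band. The data frame, column names, and values are illustrative assumptions, not the study's data file.

    # Minimal sketch of the dosage analysis: mean AP score and pass rate by
    # number of targeted PBL courses taken. All values below are hypothetical.
    import pandas as pd

    records = pd.DataFrame({
        "pbl_courses": [0, 1, 3, 5, 7, 9, 2, 8],  # hypothetical PBL course counts per test taker
        "ap_score":    [2, 2, 3, 3, 4, 3, 2, 4],  # hypothetical AP scores (1-5)
    })

    # Same dosage bands as Charts 17 and 18.
    bands = pd.cut(
        records["pbl_courses"],
        bins=[-1, 1, 6, float("inf")],
        labels=["0 to 1 PBL Courses", "2 to 6 PBL Courses", "7 or more PBL Courses"],
    )

    summary = records.groupby(bands, observed=True).agg(
        mean_ap_score=("ap_score", "mean"),
        pass_rate=("ap_score", lambda s: (s >= 3).mean()),  # pass = score of 3 or higher
    )
    print(summary)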




Chart 17. 


AP Test Mean by Targeted PBL Course Exposure (0 to 1 PBL Courses; 2 to 6 PBL Courses; 7 or more PBL Courses)


Chart 18 illustrates the positive correlation between the amount of PBL students in the treatment 
group experienced and an increase in the percentage of AP tests they pass. This trend shows an increase 
from 36% of tests passed by students receiving a dosage of 0-1 PBL courses to 67% of tests passed by 
students receiving a high dosage of 7 or more PBL courses. 


Chart 18. 



Taken together, these data suggest a positive correlation between PBL and increased student success 
on AP tests regardless of content area and regardless of student subgroup. Although not universal, 
student gains in AP scores are widespread and on an upward trajectory across courses within content 
areas.

Assessing the Impacts of AP Human Geography as a Common AP Experience for Incoming Freshmen 

Designed and implemented as a PBL course in 2011, AP Human Geography was one of the first 
courses to be implemented and was required for all freshmen attending Sammamish High School. 
Research suggests that students benefit from the experience of taking AP coursework early in their high 
school careers and that such an experience can have positive impacts on their performance in future 
coursework (Rodriguez, McKillip, & Niu, 2013). School leaders wagered that requiring AP Human 
Geography for all incoming freshmen would provide them with a strong foundation in the skills necessary 
to be successful in future PBL and AP coursework. Because students may have taken the AP Human 
Geography course but not the test, we examined how both groups of students (course takers only, and course 
and test takers) fared in future AP coursework. It is important to note that these data reflect teachers' 
first effort to implement this course. AP Human Geography teachers have told us in interviews that the 
course has been greatly improved since 2011.

Chart 19 illustrates the difference in mean AP scores between AP Human Geography course takers in 
the treatment group who did not take the test and students in the comparison group. AP Human Geography 
course takers who did not take the test (mean score of 2.655) outperformed students in the comparison 
group (mean score of 2.26). This difference was statistically significant.


Chart 19. 


School Level AP Test Performance - Took AP Human Geography Course (Estimated Marginal Means)



For students who took AP Human Geography and who took the test, the differences were more 
pronounced, with mean scores of 2.29 (comparison group) and 2.74 (AP Human Geography test takers). 
Chart 20 illustrates those differences.




Chart 20. 


School Level AP Test Performance - Took AP Human Geography Test (Estimated Marginal Means)



Chart 21 illustrates the differences in mean AP scores between students in the treatment group 
who took AP Human Geography, but did not take the test, and students in the comparison group. 
Students in the treatment group, who took AP Human Geography but not the test, outperformed students 
in the comparison group across the board. In AP Biology, AP Calculus (combined), AP English Literature, 
AP Psychology, AP Statistics, and AP United States History, student gains were statistically significant. 


Chart 21. 


Course Level AP Test Results - Took AP Human Geography Course as Freshman (Estimated Marginal Means)


Again, for students who took the AP Human Geography test, the differences were more pronounced 
across courses. In AP Biology, AP Calculus (combined), AP English Language, AP English Literature, AP 
Psychology, AP Statistics, and AP United States History, the differences in AP score means were statistically 
significant. Chart 22 illustrates the differences in mean AP scores between the comparison group and the 
treatment group of students who took the AP Human Geography test.




Chart 22. 


Course Level AP Test Results - Took AP Human Geography Test as Freshman (Estimated Marginal Means)


Table 28 below compares the mean AP scores of all students included in Exploratory Study #1. 
This includes mean AP scores from students in the comparison group, all students in 
the treatment group, students within the treatment group who took AP Human Geography but did not 
take the test, and students who took the AP Human Geography course and the associated test. 

Table 28. 


Comparison of Mean AP Scores Between All Students in Exploratory Study #1: Mean AP Scores

AP Course                        Comparison        All Students in        AP Human Geography            AP Human Geography
                                 Group Students    the Treatment Group    course takers, not test takers    test takers
AP Biology                       1.624             1.973                  2.562                         2.597
*AP Calculus (combined)          2.643             2.955                  3.283                         3.321
AP Chemistry                     1.809             2.029                  2.114                         2.165
*AP English Language             2.515             2.545                  2.663                         2.765
*AP English Literature           2.44              2.523                  2.766                         2.848
**AP Environmental Science       2.07              2.155                  2.194                         2.288
**AP United States Government    2.387             2.666                  2.505                         2.607
*AP Physics (combined)           3.297             3.466                  NA                            NA
*AP Psychology                   2.531             2.902                  2.935                         3.163
*AP Statistics                   2.079             2.224                  2.966                         2.889
AP United States History         2.102             2.615                  2.734                         2.882
AP World History                 2.497             2.632                  2.687                         2.74


(*) Designates courses not redesigned as part of the i3 grant.

(**) Designates courses redesigned into project based learning prior to the i3 grant.

Mean AP scores of 3.0 or higher represent a passing mean score.

Two things are noteworthy in Table 28. First, treatment group students' mean AP scores improve 
when they take AP Human Geography and the associated test. Second, the increases in mean AP scores are 
not limited to the courses that were redesigned into PBL courses. Students also experienced increases 
in their mean AP scores in the courses not redesigned according to PBL pedagogy as part of the i3 grant.


Exploratory Study #1a

The motivation to explore AP Test performance and pass rates stemmed from the completion of an 
exploratory Interrupted Time Series study. This study compared pass rates, defined as the number 
of AP Test passers in each department divided by the number of students the department served in a given year. The pre 
data points were the number of passers in courses not yet redesigned, while the post data points were the 
number of passers who had taken redesigned PBL courses. Because courses came online in a 
staggered fashion, it was necessary to code the data in a way that accounts for varying 
implementation times. A regression analysis was conducted using the following model:


\text{Passers per Dept}_{st} = \beta_0 + \sum_{s=1}^{N_s - 1} \gamma_s \,(\text{Indicator for Subject } s) + \beta_1 \,(\text{Indicator that Course is using PBL in year } t) + \varepsilon_{st}


Table 28a below provides an overview of which courses were analyzed and the pre or post designation 
for each. By analyzing the data at the department and whole school level we found that taking PBL 
redesigned courses was associated with increased pass rates for Social Studies and Science. These 
findings were confirmed in the follow-up exploratory study #1 described above. 
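To make the coding scheme concrete, the Python sketch below fits the model above with ordinary least squares, using subject dummies and a PBL-in-year-t indicator. The panel, column names, and values are illustrative assumptions rather than the evaluators' data or analysis code.

    # Rough sketch of the interrupted time series regression described above.
    # One row per subject-year; pass rates, years, and subjects are hypothetical.
    import pandas as pd
    import statsmodels.formula.api as smf

    panel = pd.DataFrame({
        "subject":   ["US Government", "US Government", "Biology", "Biology",
                      "World History", "World History"],
        "year":      [2009, 2013, 2009, 2013, 2009, 2013],
        "pass_rate": [0.09, 0.18, 0.12, 0.20, 0.15, 0.22],  # hypothetical passers per students served
        "uses_pbl":  [0, 1, 0, 1, 0, 1],  # 1 once the course runs as a redesigned PBL course in year t
    })

    # C(subject) expands into the N_s - 1 subject indicators from the model above.
    model = smf.ols("pass_rate ~ C(subject) + uses_pbl", data=panel).fit()
    print(model.params)  # the uses_pbl coefficient estimates the PBL association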

(For more information about this preliminary exploratory study please contact the authors.) 


Table 28a. 


Department (E6: All Departments) and Course, with one code per school year from 2002 through 2014 (0 = pre, X = post):

E1. Social Studies
  US Government:           0 0 0 0 0 0 0 0 X X X X X
  Comparative Government:  0 0 0 0 0 0 N N N N N N X
  Human Geography:         N N N N N N N N N X X X X
  US History:              0 0 0 0 0 0 0 0 0 0 0 X X
  World History:           0 0 0 0 0 0 0 0 0 0 X X

E2. Science
  Biology:                 0 0 0 0 0 0 0 0 0 0 X X
  Chemistry:               0 0 0 0 0 0 0 N 0 0 X X
  Environmental Science:   N 0 0 0 0 0 0 0 X X X X
  Physics:                 0 0 0 0 0 0 0 0 0 0 0 X X

E3. English
  English Language:        0 0 0 0 0 0 0 0 0 0 X X
  English Literature:      0 0 0 0 0 0 0 0 0 0 X X

E4. Math
  Calculus:                0 0 0 0 0 0 0 0 0 0 X X

E5. World Language
  Spanish Language:        0 0 0 0 0 0 0 0 0 0 0 X X
  Chinese Language:        N N N N 0 0 0 0 0 0 X X


Impact on Special Populations: English Language Learners 

English learners (ELs), students whose first language is not English, are not only the fastest 
growing school-age population but also a tremendously diverse group representing an abundance of 
cultures, languages, nationalities, and socioeconomic backgrounds. The majority of students who comprise 
the EL population are US citizens. Nationally, more than 75% of ELs in grades K-5 are second- or third-generation 
Americans, and 57% of middle and high school ELs were born in the United States 
(Grantmakers for Education, 2013). The belief that English learners are students who have recently moved 
to the US is inaccurate; increasingly, students growing up in the US with parents from diverse backgrounds 
need ELL accommodations.

The Washington State Legislature defines an English learner (Chapter 28A.180 RCW) as “any enrollee of 
the school district whose primary language is (one) other than English and whose English language skills 
are sufficiently deficient or absent to impair learning.” The definition indicates a lack of English 
proficiency and that English is not the first or primary language of the student. What the definition fails to 
indicate is the range of academic experiences these students bring into the classroom. Some ELs at 
Sammamish High School arrive having successfully completed rigorous academic courses in 
their native countries, while others, because of their journeys, have had interrupted educational experiences.



The absence of this information creates a limited view of the students. 

At SHS, ELs make up 15% of the student population. Thirty-eight percent of SHS students speak one of the 45 
different reported languages other than English at home. While Spanish is the most common of these languages in 
the district, others include Chinese, Korean, Russian, Japanese, Vietnamese, and Telugu, to name a few. 
And within these language groups are varied academic, linguistic, and life experiences. The term English 
learner does not convey the multitude of layers within the label.

Focus Group Demographics

Fifty students made up the 13 focus groups, which were conducted during the English Language 
Development (ELD) classes at the end of the 2014-2015 school year. The groups' configurations were 
based on advice from the ELD instructors regarding peer dynamics, since the focus group 
conversations took place during their classes. Each group's demographics are detailed in Table 29 below. 
The aggregate demographics were: 

• 23 female and 27 male students; 

• 8 ninth-grade students, 9 tenth-grade students, and 33 eleventh- and twelfth-grade 
students; 

• 13 beginning level students, 17 intermediate level students, 20 advanced level students; and 

• 18 Spanish, 10 Mandarin, 9 Vietnamese, 4 Korean, 2 Hindi, 2 Cantonese, 1 Persian, 1 
Russian, 1 Punjabi, 1 Japanese, 1 Tagalog, and 1 Taiwanese speakers. 

• The range of stay in the US was from 2 to 60 months. 


Table 29. 


Focus group demographics (number of ELs; first languages; grade level; gender; WELPA language level, Beginning to Advanced; length of time in the U.S.):

Group A: 4 ELs; 1 Russian, 1 Spanish, 2 Vietnamese; 9th grade; 4 female; 2 Beginning, 1 Intermediate, 1 Advanced; 7-30 months.
Group B: 3 ELs; 1 Korean, 2 Spanish; 9th grade; 3 male; 2 Beginning, 1 Intermediate; 2-48 months.
Group C: 5 ELs; 2 Korean, 1 Mandarin and Cantonese, 2 Vietnamese; 11th-12th grade; 3 female, 2 male; 3 Intermediate, 2 Advanced; 24-48 months.
Group D: 5 ELs; 2 Spanish, 3 Vietnamese; 11th-12th grade; 5 male; 2 Beginning, 1 Intermediate, 2 Advanced; 12-24 months.
Group E: 2 ELs; 2 Vietnamese; 11th-12th grade; 2 male; 2 Advanced; 24 months.
Group F: 4 ELs; 1 Cantonese, 3 Mandarin; 10th grade; 3 female, 1 male; 2 Intermediate, 2 Advanced; 5-24 months.
Group G: 4 ELs; 3 Spanish, 1 Tagalog; 10th grade; 2 female, 2 male; 2 Beginning, 1 Intermediate, 1 Advanced; 6-24 months.
Group H: 5 ELs; 1 Japanese, 1 Korean, 1 Mandarin, 1 Mandarin and Taiwanese, 1 Punjabi; 11th-12th grade; 5 female; 1 Intermediate, 4 Advanced; 18-36 months.
Group I: 3 ELs; 1 Cantonese, 1 Mandarin, 1 Punjabi and Hindi; 11th-12th grade; 3 male; 2 Intermediate, 1 Advanced; 18-36 months.
Group J: 2 ELs; 1 Persian, 1 Spanish; 9th-10th grade; 2 female; 1 Beginning, 1 Intermediate; 5-9 months.
Group K: 2 ELs; 2 Mandarin; 11th grade; 2 male; 2 Advanced; 42 months.
Group L: 5 ELs; 1 Mandarin, 4 Spanish; 11th-12th grade; 2 female, 3 male; 2 Beginning, 3 Intermediate; 5-36 months.
Group M: 6 ELs; 6 Spanish; 11th-12th grade; 1 female, 5 male; 2 Beginning, 1 Intermediate, 3 Advanced; 3-24 months.




How do ELs Describe Their Experience with PBL in Mainstream Math, Science, and Social Studies Classrooms?

Twenty-eight of 50 ELs (56%) said they did not connect with PBL as a pedagogical model. 
Reasons cited included a dislike of the group work associated with PBL and a sense that PBL is not connected to the 
real world. Another explanation is the difference between the learning styles PBL assumes and those of the ELs' prior academic experiences. 
FGB4, a student from Korea who had been at SHS for two months at the time of data collection 
explained, “(I) do not feel connected to the projects. (It’s a) waste of time. All subjects in home school 
were text books and tests. Trying to learn on your own is better.” For some ELs, PBL is drastically 
dissimilar to the classrooms they are familiar with. Even for ELs who have had experience with projects, 
PBL can be challenging because of the complex and massive language demands in some content areas. 
FGB4, a student from Mexico who had been in the US for three months at the time of data collection said, 
“I worked on a lot of projects in home country, but I no understand here.” Not having the appropriate 
language scaffolds prevents ELs from understanding the content presented. And without connections to 
prior knowledge, the potential resources ELs could implement may go untapped. 

Of the three content areas, ELs connected the most to science and when there was choice in the 
PBL topic. FGG4 explained, “(I like) project where you can pick any topic you want.” Several students 
brought up student choice projects such as the Independent Project in science and the History Day project 
in social studies. The following sections discuss each specific content area. 

Math. ELs expressed that there was limited PBL in math, with 15 of 50 (30%) saying that there was 
no PBL in math. The students who talked about PBL in math expressed a dislike for the projects because 
they did not see the connection between the project and the content. One student, FGF3, explained, “I 
don’t understand why we did the robot project in (pre-calculus) class.” The robots were used as part of the 
PBL unit the pre-calculus classes were utilizing. Students were given a choice of any topic in the textbook 
they were interested in and built a unit around it with lesson plans, assessments, and the use of a robot to 
teach the concept. Without contextualizing the PBL, students may miss important connections between 
the content and the PBL task.

Science. Comparing the number of ELs who said they liked PBL in content areas, science came out 
on top with 18 of 50 (36%) ELs expressing their enjoyment of PBL in science. FGA4 exclaimed, “Science 
has the best project because of experiments.” Seven of the 18 said it was because they were able to choose 
their topic. 6 of 50 communicated concern with science PBL because of the difficulty and layers involved 
in completing the culminating project/aspect of the unit. FGD3.2 explained, “Science is the most difficult 
class. I never saw (the content) before, so the teacher said you need to talk about it with your partner, but I 
don’t even know what to talk about.” Understanding the content is a prerequisite for participating in 
collaborative and PBL activities. And because instruction and materials are given in a language ELs are still 
acquiring, additional resources such as summaries of research and scientific articles and native language 
tools would be useful in equipping ELs with content knowledge. Even though science involves dense 
academic vocabulary and concepts, 14 ELs described a favorite project or unit in science. FGK3.2 explained, 
“My fav project was the GMO project because I just watched news from China about the topic and my 
relatives and I were so exciting to discussing it over the phone.” PBL that has authentic connections to the 
ELs’ lives is engaging and motivates students to become more involved.




Social Studies. With social studies, 11 of 50 (22%) expressed liking PBL, and the same number 
expressed not liking PBL in social studies. Sixteen ELs (32%) said social studies had the most PBL 
activities. The PBL in social studies was consistent with each unit culminating in a product and 
presentation of some form. FGH3.3 explained, “After we finish a project, we start another project.” And 
FGG3.3 explained, “We have to do so many projects in history!” The mixed reactions to PBL in social 
studies stemmed from interest in the topic, relevance, and the projects. FGF3.3 explained, “Yes, there are 
a lot of projects. My favorite is to make the film where we picked a topic and make a video about it. Mine 
was about Queen Victoria.” Having choice in PBL was a motivating factor for many ELs. FGH3.3 
explained, “I don’t like government project. It’s all about political things. I don’t connect with the content. 
They are really hard topics that you don’t even know about.” Many ELs described favorite projects in 
social studies, including the children’s book and History Day projects in World History. FGJ3.3 explained, “Favorite 
project was the children’s book because this is very good. It’s for helping kids.” The ELs who revealed 
challenges with social studies PBL described difficulty with understanding and connecting to content, 
with presentations, and with group work. FGI3.3 explained, “There are a lot of projects in history. Projects are too 
hard. I don’t know what to research about.” And FGL3.3 explained, “I like the class (social studies). But I 
don’t like when we do projects because when we work in groups sometimes it is only one or two people 
doing the work.” Similar to reactions with science, ELs are also frustrated with group work in social 
studies. 

What Specific Expectations Do They Find Challenging Within the Environment? 

Issues brought up can be grouped into two areas: language and PBL. Language expectations refer 
to the plethora of language demands in the mainstream classrooms, and the PBL expectations pertain to 
the activities and curriculum related to problem-based learning utilized in the classroom. 

Language Expectations. Because instruction, interactions, and assignments are all conducted in a 
language ELs are still mastering, productive engagement with content and activities is difficult and requires 
extra time and resources for ELs. FGJ2 and FGL2 both expressed engaging with content “when (they) 
understand what is going on in class.” Comprehension of what is happening in the classroom is 
foundational to engagement with the material and activities in the classroom. For some, understanding the 
instructor is a challenge. FGB3.3 explained, “It’s hard to understand the teacher and the material, I prefer 
to read textbook and taking a test.” Other ELs who enjoy learning in a particular content area find school 
challenging because of the language barrier. FGJ3 explained that “I don’t like projects in there, the words 
are really hard and I can’t understand the words, I love(d) geography in my country.” In this case, the 
student enjoyed social studies in her home country but is unable to feel the same way here because of 
language. 

Many students expressed anxiety with presentations in both social studies and science classrooms. 
FGI4 explained, “I learned a lot from every project, but it’s hard to do the presentation part.” Speaking in 
front of an audience of peers can be a nerve-wracking experience for many. For students who are still 
learning English, the experience can be paralyzing. However, there are moments of relief as FGH3 
conveyed, “I liked the broadcast project because when we presented, we could hide behind a poster board 
because it was a radio broadcast.” Because language is the key to comprehension and participation in the 
classroom, equipping ELs with academic language and providing opportunities for comprehensible input 


and output will increase their engagement and progress. 

PBL Expectations. PBL at SHS consists of units in each course that are designed and implemented 
by teachers in the content area’s department. Each of the content areas, math, science, and social studies, 
has several design teams that create PBL units aligned to state standards and district content requirements. 
Units are designed with the seven key elements (listed in the Methods section) in mind to provide an 
authentic and relevant real world problem for students to tackle. Many of the units require group work to 
create a product and/or culminating solution to be presented to various audiences. For many ELs, 
problem-based learning is a new type of learning environment with activities, content and academic skills 
which are unfamiliar. 

Many ELs did not see the value of projects because they are used to the banking model of 
education, in which the teacher is positioned as the knowledgeable authority who presents knowledge to 
students and then assesses students on their understanding of the material. FGA3.3 explained, “I 
don’t like working on PBL because I prefer to be writing things, and the teacher to be explaining.” 
Providing an explanation of PBL and its benefits would provide ELs with a better understanding of the 
structure and potential rewards of engaging with a more complex type of instruction and learning 
environment. 

One major component of PBL is collaborative activities where students are asked to work together 
to complete a project or to provide a solution to a problem. ELs feel a great deal of pressure about grades 
and anxiety about bringing the group’s grade down when they feel overwhelmed by the project. 

FGK5 explained, “Some students they help you. They want you to understand. Some other type of 
students, they don’t want you to waste their time and their grade, so in that time, they are just so hard.” 
Fostering effective collaborative techniques and behaviors is pivotal in ensuring that students, ELs and 
non-ELs alike, work efficiently and productively and feel supported by the team.

Group member selection was also an issue of concern for the ELs. FGF5 explained, “Teachers 
should set up groups because it is hard to find other team members.” FGI5 provided an example: “More 
teammates. Like everyone chose their group first. Nobody knows me there, so I’m the only person left and 
the other two students in my group never shows up to class.” 

Teacher Support 

When asked about the types of supports the ELs wanted in the classroom, 39 of 50 (78%) students 
talked about the need for more assistance from teachers. The teacher support ELs asked for ranged from 
explanations of content to participation in collaborative activities. Seven ELs asked for teachers to provide 
slower and clearer explanations. FG5F explained, “Slow down a bit when explaining important concepts.” 
Before engaging with PBL, an understanding of the foundational content is pivotal. Another aspect of 
support relates to participation. FG5A explained, “Teacher provide more support. She should go around 
and say you have to participate.” For students who are unfamiliar with the norms and rules of PBL 
activities, clear instructions for and encouragement of participation from the instructor would ease ELs’ 
anxiety with unfamiliarity.

ELs found a number of teachers to be accommodating and extra helpful to them. FG5C explained, 
“Teachers are nice and they help me a lot this year, especially Duke Slater (pseudonym for ELD teacher). 
We need to have a lot of writing, and I always bother him to help me, and he always so patient and explain 
to me a lot and give me so many helps.” Many of the ELs turned to their ELD instructor because they felt 
comfortable asking them for help, as the instmctor fully understood the challenges they faced in the 


classroom and believed in their potential. Another student, FGL5, explained, “Teachers can be nice to everyone, 
not all the time, sometimes you need do the job for students to understand, need to make sure the 
students understand what is going on, check homework all the time.” Unfortunately, the ELs did not find 
all teachers to be friendly and helpful. FG5D explained, “Some teachers are not helpful. They know about 
you (EL status), but if you ask them questions, they don’t care.“ Another student in the same group 
expanded, “Teachers, some are so mean. They don’t want to help you. You ask them questions and they 
don’t answer.” FG5E saw promise in teachers with the positive core belief, “It’s the most important that 
they WANT to change to help ELs.” The ELs were able to sense that some of their teachers did not want 
to support their learning if it meant having to provide additional supports beyond their required teaching 
load. 

Other requests by ELs included the use of their native languages in the classroom to understand 
the material and connect with the content. Another one relates to group work. A majority of PBL activities 
is completed in groups and because ELs are new and often the minority in the classroom, it can be 
challenging to find group mates. FGH5 explained, “Grouping us with friends makes us want to do the 
project more.” Being able to work with peers that they already have a relationship with may ease tensions 
around group dynamics. Teachers could also implement activities to build a classroom community, in 
which collaborative peer relationships could grow. Teacher support and the implementation of purposeful 
scaffolds and explicit connections are desperately needed for ELs to productively engage with PBL in 
mainstream content area classrooms. 

Summary of Findings 

The data show that the language demands associated with PBL represent a significant obstacle to 
English Learners. English Learners are forced to learn two languages in a PBL classroom: the discourse 
associated with the content and the social language needed to negotiate and navigate roles and 
expectations with peers. We recommend teachers provide English Learners with support materials and 
linguistic tools in order to comprehend foundational content before engaging with PBL tasks. Teachers 
should allow students to use their native language in the classroom to connect to prior knowledge and 
access resources. Both findings support a more robust approach to helping English Learners become more 
linguistically proficient, both academically and socially, in the classroom. 

In addition, teachers should take explicit steps to help English Learners socialize into a PBL 
classroom. PBL components need to be explicitly explained for ELs who may not be familiar with the type 
of learning environment. Teachers need to build cultural sensitivity activities into students’ day-to-day 
learning. Teachers need to build community within the classroom to create a safe environment to take 
risks. Many students expressed sadness and frustration that their native English-speaking peers treated 
them negatively in small group settings when they would struggle to understand social cues or to keep up 
with the fast-paced use of social English in those groups. Teachers should take explicit steps to position 
ELs as knowledgeable and powerful contributors to the problem-solving task. Lastly, teachers should 
group students intentionally and purposefully depending on the goals of the task. At times this may mean 
grouping ELs together. At other times this may mean placing students in heterogeneous groups. Either 
way, teachers should implement norms for group work to ensure more egalitarian participation patterns and 
a more egalitarian division of labor in big projects. 

Like many other students, ELs told us they are more motivated to learn and more engaged in 
learning activities when they feel that learning is both relevant to them personally and authentic to work 


performed by professionals and industry experts in a given field. When appropriate, teachers should 
provide more explanation to make clear the connections between the work students do in school and the work 
professionals do in the workplace. Teachers should also keep in mind that relevance extends beyond 
what students are interested in and encompasses the skills students have that can transfer to any specific 
task. For example, students who have experience building computers with peers or adult mentors have 
knowledge of design principles and processes teachers can leverage when considering the tasks they ask 
students to complete. Teachers should design lessons and units with flexibility in mind, designing multiple 
ways students can succeed on tasks, assignments, and assessments. Lastly, student choice is critical. 
Whether students are designing a new drug to fight cancer, negotiating a treaty to govern resource 
exploration in the Arctic, or writing an essay, teachers should design various ways students can participate 
in those tasks to evidence their mastery of the focal skills and content. 


PBL Adoption and Comparative AP Scores by Department 

Although core content area design teams and departments received generally the same amount of 
resources and support from school leaders, departments experienced policies associated with the PBL 
curriculum project differently. 

Social Studies. The Social Studies department approached the school’s PBL initiative with 
enthusiasm. A member of the department was hired as a teacher leader tasked with developing the 
professional learning experiences that supported teachers’ PBL design and implementation process. Along 
with another teacher in the department, this teacher was also part of the Knowledge in Action (KIA) 
research study, led by researchers at the University of Washington and funded by the George Lucas 
Educational Foundation (GLEF). This study provided teachers with support to redesign the AP United 
States Government course according to project-based learning pedagogy. These teachers’ success with the 
AP Us Government course gradually influenced the work other Social Studies teachers were doing in their 
courses. In part because of the work these teachers experienced, initially there was enthusiasm and 
excitement for PBL amongst Social Studies teachers and emerging project based learning expertise within 
the department when the SHS PBL curriculum work started. 

Since the beginning of the grant, the Social Studies department lost only one teacher. From time to 
time other teachers went on maternity leave, but the core of the department has remained stable 
throughout the life of the PBL project. Since the beginning of the project, a vast majority of the required 
courses offered within the department have been redesigned into PBL courses; these include AP Human 
Geography (9th grade course), World History and AP World History (10th grade courses), AP United States 
History and AP American Studies (11th grade courses), and AP United States Government and AP Comparative 
Government (12th grade courses). Teachers of those courses continue to refine, revise, and improve the 
PBL pedagogical model they use to teach in those courses. 

English. The English department approached the school’s PBL initiative with a mix of suspicion 
and enthusiasm. In the 2011-2012 school year, two design teams worked to redesign the Freshman English 
and Sophomore English courses. The Freshman English team successfully redesigned most of the existing 
units in the Freshmen English curriculum. The Sophomore English team designed two units for the new 
Sophomore English course. In the 2012-2013 school year, another team successfully redesigned much of 
the existing Junior English course into PBL curriculum. In the 2013-2014 school year, a Senior English 
design team worked to redesign the non-AP Senior-level course, and another team worked with two teachers 
from the Social Studies department to create a new, inter-disciplinary AP American Studies course. 

At the time of writing, an English teacher and Social Studies teacher continue to teach the AP 
American Studies course, but little PBL curriculum designed by previous design teams continues to be 
taught within the department. The AP Language and AP Literature courses have not been targeted for 
redesign by the school. From 2010-2015, the English department lost almost half of the teachers within 
the department who were there when the PBL curriculum project started. Of the teachers that remain 
from the beginning of the grant, the data demonstrate they have largely become indifferent to PBL 
pedagogy and have largely abandoned PBL as a guiding pedagogical model in their practice. 



Math. The Math department approached the school’s PBL initiative with a mix of suspicion and 
enthusiasm. In 2010, several teachers led sessions at the Sammamish Institute of Learning and Teaching 
(SILT). The Algebra II, Geometry, and Pre-Calculus courses were redesigned with varied levels of success. 
The Algebra II team redesigned a majority of the units for the next year’s Algebra II curriculum. However, 
the year after this course was redesigned, the district shifted to a new Math curriculum, leaving the fate of 
this course in limbo. Today, little of the originally redesigned Algebra II curriculum remains. The 
Geometry team redesigned half to two-thirds of the curriculum. Both the Algebra II and Geometry teams 
were caught between the district curriculum and associated assessments, the Key Elements of Problem 
Based Learning, the Math Common Core State Standards, and state end of course exams that served as 
requirements for graduation. Theirs was truly the tension between depth and breadth. As a result, both 
teams worked to design shorter challenge cycles to frame mathematical concepts. Whereas Social Studies 
courses immersed students in units that spanned several weeks, both Math teams designed challenge cycles 
that spanned single or several days. 

Another tension that emerged within the department was a resistance to disturb a highly defined 
and articulated vertical alignment between courses to build students’ math skills to prepare them to be 
successful in the AP Calculus AB/BC course many students took their senior year. There was and is a 
strong belief within the department that this alignment supported students’ success in upper level AP 
coursework. Their progression from Algebra II to AP Calculus AB/BC was the result of myriad hours of 
thoughtful collaboration within the department to identify specific math skills and content students would 
need for AP Calculus AB/BC. Instead of abandoning PBL completely, the Math department has worked 
to strategically integrate the Key Elements into their practice. While there are no “full dose” PBL courses 
offered in the Math department, much of the teaching and learning students experience in the Math 
department has been informed and influenced by the Key Elements. 

In the first years of the grant, the department also lost two veteran teachers to retirement.

Science. The Science department approached the PBL initiative with enthusiasm. One of the Science 
department teachers was an original member of the teacher leadership team and remains highly involved in 
components of the ongoing PBL implementation work. Two other Science teachers have also held long-standing 
positions of leadership within the leadership team. One of those teachers was also heavily 
involved in the i3 grant writing process. 

The Science department recruited 8 teachers to redesign the BioChem course starting in 2010 and 
continuing through 2011. This course has undergone at least one major revision since it was originally 
designed and implemented in 2011. In 2012, a design team was assembled to redesign the AP Chemistry 
course and in 2013 a design team was assembled to redesign the AP Biology course. Both these courses 
have undergone changes since they were first implemented but they remain focused around PBL pedagogy. 
A Science department design team also redesigned the core Physics class in 2013. This department’s 
current positive collaborative culture was forged through many tense and candid conversations in the early 
years of the grant. Since 2011, the department has lost two core members of the department. One of them 
retired and one transferred to another school in the district. Several Science department teachers have 
emerged as leaders and experts within the school around the use of technology in the classroom and the 
integration of external experts into unit design. 



World Language. The World Language department approached the PBL curriculum with cautious 
optimism. However, the department has encountered multiple barriers to designing and implementing 
PBL coursework throughout the life of the grant. First, it is not uncommon for some world language 
teachers to teach several (3 or more) language classes at a time, sometimes within the same period. In 
some cases, teachers within the World Language department are the only teacher at the school who 
teaches their language. For example, there is and has been only one French teacher at the school for years. 
It is not uncommon for this teacher to have anywhere from 3 to 5 different classes to prepare for 
throughout the year, including some AP level courses. Teaching this kind of schedule can be difficult to 
sustain, especially when teachers are redesigning some but not all courses according to PBL pedagogy. 

This teaching situation generally holds true for the Chinese teacher and the Spanish teachers. Not only do the 
World Language teachers act as micro departments in and of themselves, but the number of courses 
offered during any given year can far exceed those offered in some of the other core content area 
departments. Second, the way courses were redesigned also looked different. For example, the first design 
team established in the World Language department consisted of the teachers teaching three different 
languages. While PBL principles can be universal from classroom to classroom, it was nearly impossible 
for teachers to implement commonly designed curriculum, making it difficult to replicate the professional 
learning process that teachers on other design teams experienced.

For all of these reasons, World Language teachers have found it exceedingly difficult to design and 
implement PBL coursework with fidelity. However, multiple discussions with teachers within the 
department suggest that at least some of the World Language teachers use the Key Elements to inform 
their teaching practice. Given the class, it is not uncommon for students to engage in projects such as 
simulations using Chinese, French, or Spanish. While World Language teachers use PBL in places, PBL 
has not yet saturated the way World Languages are taught within the department. 

These differences in how specific departments experienced the PBL curriculum design and 
implementation process may help explain why students experienced different AP score outcomes within 
each department. 

Adoption of PBL Within Departments 

To measure the extent to which teachers were adopting PBL, we surveyed teachers using the 
Concerns-Based Adoption Model (CBAM). We surveyed teachers at the end of the school year, starting in 
2011 and continuing until 2015. The CBAM survey was developed in the 1970s at the University of 
Texas’s Research and Development Center for Teacher Education. Organizations and institutions have 
repeatedly used the survey to measure the extent to which people adopt an organizational intervention or 
policy. The survey was developed to “understand what happens to teachers and university faculty when 
presented with a change” (Hall, Dirksen, and George, 2006, p. 1). The purpose of the survey is to provide 
a “framework designed to help change facilitators identify the special needs of individuals involved in the 
change process and address those needs appropriately based on the information gathered through the 
model’s diagnostic dimensions” (Hall, Dirksen, and George, 2006, p. 1). 

We used the survey to measure the concerns Sammamish High School teachers and departments 
felt as they implemented PBL throughout the life of the i3 grant. When paired with modified Levels of 
Use (LOU) semi-structured interviews with individual teachers directly impacted by PBL course design 
and implementation, we can understand how teachers were making sense of PBL and the variety of 
perspectives on PBL that emerged within departments. For this specific evaluation, the CBAM survey 


does not provide information on the relative value of PBL or whether or not it was the right intervention 
for this school at this time. We used it primarily to measure the extent to which departments were 
adopting and implementing PBL pedagogy in their courses. 

Methodology 

We administered the survey to teachers during a staff meeting in May or June each year from 2011 
to 2015. Before teachers took the survey, we reminded them of the purpose of the survey, what we hoped 
to learn from their input, and that their responses would be completely anonymous. Table 30 below shows 
how many teachers took the survey each year. 

Table 30. 


CBAM Survey Respondents by Year

Year    Survey Responses    Total Number of Staff    Response Rate
2011    51                  83                       61%
2012    68                  70                       97%
2013    52                  81                       64%
2014    55                  76                       72%
2015    55                  75                       73%
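The response rates in Table 30 follow directly from the counts in the table; a quick Python sketch reproducing them:

    # Recompute the response rates in Table 30 from the raw counts.
    survey_counts = {2011: (51, 83), 2012: (68, 70), 2013: (52, 81),
                     2014: (55, 76), 2015: (55, 75)}
    for year, (responses, staff) in survey_counts.items():
        print(year, f"{responses / staff:.0%}")  # e.g., 2011 -> 61%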


The number of teachers who responded to the survey remained relatively consistent throughout 
the years of implementation. After teachers took the survey, responses were collected and analyzed by 
researchers at the Southwest Educational Development Laboratory (SEDL). During the summer we 
debriefed the results of the survey with the SEDL researchers to identify patterns and trends in the data. 
Starting in 2014, we facilitated meetings with each department at Sammamish High School to share the 
findings of their specific survey responses and probe them for insights on the data. 

Chart 23 illustrates the kind of long-term adoption trends expected from an organization that is 
adopting an intervention. The red line, illustrating a high level of concern amongst respondents in the 
“unconcerned,” “information,” and “personal” categories is typical for people who are generally 
concerned about what the intervention is and what it will mean for them. The red line is consistent with 
responses from people who are experiencing a new intervention or policy. The blue line, illustrating a high 
level of concern amongst respondents in the “personal,” “management,” and “consequence” categories is 
typical for people who are generally concerned about how the intervention is impacting them, how it is 
being managed on the organizational level, and what consequences will be for people (in this case 
students) directly impacted by the intervention. The blue line is consistent with responses from people 
who are in mid adoption and are working to implement the intervention. The green line, illustrating a high 
level of concern amongst respondents in the “consequence,” “collaboration,” and “refocusing” categories 
is typical for people who have adopted the intervention and are most concerned with improving it. The 
green line is consistent with responses from people who have successfully adopted the intervention. 


Chart 23. 



School-level Findings 

School-level survey results show Sammamish High School is trending in a positive direction. 
Starting in 2011, the data show a school most concerned with PBL and hungry for more information 
about what it is and how to implement it. In 2012, the school becomes less concerned with knowing what 
PBL is and how to implement it and becomes more concerned with how to manage the implementation 
(ranging from 43-60%) and time to collaborate with colleagues to improve their implementation of PBL 
(ranging from 36-48%). These data, illustrated in Chart 24, suggest a majority of teachers at Sammamish 
High School have adopted PBL, are wondering how they can manage the demands of the PBL 
instructional model moving forward, and are craving more time to collaborate with colleagues to improve 
how they enact PBL in their classroom. 




Chart 24. 



Department-level Findings 

Department-level findings portray a more complicated and nuanced picture of adoption trends 
from 2011 to 2015, indicating 1) the Social Studies and Science departments had fully adopted PBL, 2) the 
Math department remained hesitant to fully embrace PBL in their teaching, and 3) the English department 
had abandoned it altogether. Below we share survey response trends from each department from 2011 to 
2015 and discuss how their trends demonstrate the extent to which they are adopting PBL pedagogy. 

The Social Studies Department. Chart 25 illustrates CBAM response data from the Social Studies 
department from 2011 to 2015. 




Chart 25. 



These data suggest teachers in the Social Studies department have successfully adopted PBL. The 
five year trend suggests they 1) have a good understanding of what PBL is (low in the “unconcerned” and 
“information” categories), 2) are most concerned with managing how they are implementing PBL moving 
forward (high in the “management” category), and 3) are looking for ways to collaborate with each other to improve 
how they are implementing PBL (high in the “collaboration” category). Qualitative responses from Social 
Studies teachers support these data. A teacher described working “to implement PBL in ways that are 
more authentic and more engaging because I see it as critical for student learning.” Although there is some 
variability of perspectives within the department, these data suggest this teacher’s perspective is consistent 
with that of the Social Studies department overall. 

These data seem to align with our findings in Exploratory Study #1. Students in the treatment 
group experienced gains in Social Studies department AP coursework exams. Mean differences are 
statistically significant when accounting for matched students using GPA, SWD, EngNotFirst, FRL, and 
Gender as covariates. For this analysis 887 comparison students were matched to 1189 treatment students. 
Students in the comparison group were allowed to be matched to more than one treatment student, so the 
weighted comparison and unweighted treatment groups each contained 1,189 students. Across all Social Studies 
department tests (U.S. Government, Psychology, U.S. History, and World History), students in the treatment 
group (having taken at least one redesigned PBL course in any department) scored higher than students in the 
comparison group. As expected, departmental GPA (DepGPA) accounted for the 



vast majority of the variance in AP Test scores, but PBL Exposure (BigMatch) accounted for more 
variance than SWD, ELL, FRL, or Gender did.


Table 31. 


Dependent Variable: Advanced Placement Test Score (Social Studies)

Source             Type III Sum of Squares   df     Mean Square   F         Sig.
Corrected Model    1035.904 (b)              6      172.651       157.351   .000
Intercept          1.958                     1      1.958         1.784     .182
DepGPA             742.672                   1      742.672       676.857   .000
SWD                3.961                     1      3.961         3.610     .058
ELL                21.304                    1      21.304        19.416    .000
FRL                10.648                    1      10.648        9.705     .002
Gender             36.845                    1      36.845        33.580    .000
BigMatch           63.668                    1      63.668        58.025    .000
Error              2601.545                  2371   1.097
Total              18751.000                 2378
Corrected Total    3637.449                  2377

a. Department = Social Studies

b. R Squared = .285 (Adjusted R Squared = .283)
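Table 31 (and Table 32 below) report covariate-adjusted comparisons of this kind. As one illustration of how such an analysis might be reproduced, the Python sketch below fits an ordinary least squares model with the same covariates on synthetic data and requests Type III sums of squares; the column names, generated data, and contrast coding are assumptions, and this is not the evaluators' actual analysis code.

    # Hedged sketch of a covariate-adjusted AP score comparison in the spirit of
    # Tables 31 and 32. The data below are synthetic; BigMatch = 1 marks treatment.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf
    from statsmodels.stats.anova import anova_lm

    rng = np.random.default_rng(0)
    n = 400
    df = pd.DataFrame({
        "DepGPA":   rng.uniform(1.5, 4.0, n),   # departmental GPA covariate
        "SWD":      rng.integers(0, 2, n),      # students-with-disabilities flag
        "ELL":      rng.integers(0, 2, n),      # English learner flag
        "FRL":      rng.integers(0, 2, n),      # free/reduced lunch flag
        "Gender":   rng.integers(0, 2, n),
        "BigMatch": rng.integers(0, 2, n),      # PBL exposure (treatment) indicator
    })
    df["ap_score"] = np.clip(
        np.round(df["DepGPA"] + 0.3 * df["BigMatch"] + rng.normal(0, 1, n)), 1, 5
    )

    # Sum-to-zero contrasts keep Type III sums of squares interpretable.
    model = smf.ols(
        "ap_score ~ DepGPA + C(SWD, Sum) + C(ELL, Sum) + C(FRL, Sum)"
        " + C(Gender, Sum) + C(BigMatch, Sum)",
        data=df,
    ).fit()
    print(anova_lm(model, typ=3))  # source table analogous to Tables 31 and 32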




The Science Department. Chart 26 illustrates CBAM response data from the Science department from 
2011 to 2015. 



These data suggest teachers in the Science department have successfully adopted PBL. Like data 
from the Social Studies department, the five year trend in the Science department suggests they 1) have a 
good understanding of what PBL is (low in the “unconcerned” and “information” categories), 2) are most 
concerned with managing how they are implementing PBL moving forward (high in the “management” category), 
and 3) are looking for ways to collaborate with each other to improve how they are implementing PBL (high 
in the “collaboration” category). Qualitative responses from Science teachers support these data. A teacher 
described PBL “(when well implemented) to be very engaging and liberating for students. It [PBL] 
provides access points for students with a variety of needs and ability levels.” Although there is some 
variability of perspectives within the department, these data suggest this teacher’s perspective is consistent 
with that of the Science department overall. 

Again, these data generally align with findings in the Exploratory Study #1. Students in the 
treatment group experienced gains in Science department AP coursework exams. Mean differences are 
statistically significant when accounting for matched students using GPA, SWD, and Gender as covariates. 
The Science department’s CBAM data suggest department-wide adoption of PBL. For this analysis 620 
comparison students were matched to 954 treatment students. Students in the comparison group were 
allowed to be matched to more than one treatment student, so the weighted comparison and 
unweighted treatment groups each contained 954 students. Across all Science department 



tests (Biology, Chemistry, Environmental Science, and Physics combined), students in the treatment group 
(having taken at least one redesigned PBL course in any department) scored higher than students in the comparison 
group. As expected, departmental GPA (DepGPA) accounted for the vast majority of the variance in AP 
Test scores. While significant, PBL Exposure accounted for only a small portion of the overall 
departmental-level variance, especially as compared to Gender. One particular course (Biology) was 
responsible for the majority of the effects of PBL exposure in this department. 

Table 32. 


Dependent Variable: Advanced Placement Test Score (Science)

Source             Type III Sum of Squares   df     Mean Square   F         Sig.
Corrected Model    -                         6      131.151       122.222   .000
Intercept          20.558                    1      20.558        19.158    .000
DepGPA             486.690                   1      486.690       453.555   .000
SWD                2.544                     1      2.544         2.371     .124
ELL                9.004                     1      9.004         8.391     .004
FRL                38.716                    1      38.716        36.080    .000
Gender             94.563                    1      94.563        88.125    .000
BigMatch           13.663                    1      13.663        12.733    .000
Error              -                         1901   1.073
Total              -                         1908
Corrected Total    2826.786                  1907

a. Department = Science

b. R Squared = .278 (Adjusted R Squared = .276)




The Math Department. Chart 27 illustrates CBAM response data from the Math department from 
2011 to 2015. 



These data suggest teachers in the Math department remain hesitant to adopt PBL. The five-year
trend in their data suggests they 1) are concerned about PBL in general (a resurgence of concerns in the
"unconcerned" category), 2) are concerned with how to manage it in their classrooms (high in the
"management" category), and 3) are looking for other pedagogical models to implement (low in
"collaboration" and tailing up in the "refocusing" category). The combination of their highest concerns
reveals a "W" pattern in their data, especially in their 2015 responses. According to SEDL researchers, a
"W" pattern suggests a group of people who have become disenchanted with the intervention and who are
actively looking for alternatives they think would work better. According to one Math teacher, "Due to the
amount of new vocabulary and content needed in each course, longer in-depth, student led PBL problems
have caused some problems in the past. The real world problems that we have tried to apply are either too
oversimplified (so they aren't really real world any longer) or the realness (messiness of the data)
obfuscates the purpose." Although teacher-level responses reveal some variation in teachers' perceptions
about PBL, these data suggest this teacher's perspective is consistent with that of the Math department
overall.

Agreement is clear between the CBAM data and our findings from Exploratory Study #1. Students
in the treatment group experienced gains on Math department AP coursework exams. Mean differences are
statistically significant when accounting for matched students using GPA as a covariate. The Math
department CBAM data suggest the Math department has not fully adopted PBL. However, focus groups
conducted with the Math department suggest that individual Math teachers, notably the AP Calculus
teacher, have used the Key Elements of Problem Based Learning to inform their instruction; the AP
Calculus teacher, for example, continues to use the Key Elements to shape how students learn Calculus in
her classroom. While the qualitative data suggest the Math department continues to approach what PBL
looks like in their classrooms differently, they continue to use the Key Elements as guiding principles for
how they further refine their course offerings.

For this analysis 476 comparison students were matched to 568 treatment students. Students in the
comparison group were allowed to be matched to more than one treatment student, allowing the weighted
comparison and unweighted treatment groups to each contain 568 students. AP Test Scores across all Math
department tests (Calculus-combined, Statistics) for students in the treatment group (having taken at least
one redesigned PBL course in any department) were higher than those of the comparison group. As
expected, departmental GPA (DepGPA) accounted for the vast majority of the variance in AP Test scores.
While significant, PBL Exposure accounted for only a small portion of the overall departmental-level
variance, especially as compared to Gender and FRL. However, when controlling for these covariates PBL
Exposure was significantly associated with the outcome variable.


Table 33.

Dependent Variable: Advanced Placement Test Score (Math)

Source             Type III Sum of Squares   df     Mean Square   F         Sig.
Corrected Model    1122.736 b                6      187.123       134.309   .000
Intercept          98.922                    1      98.922        71.002    .000
DepGPA             931.677                   1      931.677       668.720   .000
SWD                .188                      1      .188          .135      .714
ELL                1.341                     1      1.341         .963      .327
FRL                9.271                     1      9.271         6.654     .010
Gender             19.670                    1      19.670        14.118    .000
BigMatch           9.304                     1      9.304         6.678     .010
Error              1572.951                  1129   1.393
Total              10150.000                 1136
Corrected Total    2695.687                  1135

a. Department = Math
b. R Squared = .416 (Adjusted R Squared = .413)




The English Department. Chart 28 illustrates CBAM response data from the English department from 
2011 to 2015. 

Chart 28. 



These data suggest teachers in the English department have become indifferent towards PBL and
have abandoned it as a guiding pedagogical model. The five-year trend in the English department suggests
they 1) remain concerned with PBL in general (a resurgence in the "unconcerned" category), 2) are relatively
unconcerned with managing PBL in their classrooms and uninterested in collaborating with colleagues to
improve their implementation of PBL (decreases in the "management" and "collaboration" categories), and
3) are looking for other pedagogical models to use in their daily instructional practice (tailing up in the
"refocusing" category). Qualitative responses from English teachers support these data. As one teacher
stated, "The important skills that students need to learn get overlooked when only working on problems
and activities. Many fundamental skills need practice in other methods." Although individual teachers'
responses revealed some diversity within the department, these data suggest this teacher's perspective is
consistent with that of the English department overall.

CBAM data showing a resistant department overall are consistent with our findings from
Exploratory Study #1. Students in the treatment group did not experience gains on the AP English
Language or AP English Literature exams. Mean differences are not statistically significant when 
accounting for GPA, SWD, and FRL as covariates. The English department CBAM data suggest the 
English department has abandoned PBL as a viable pedagogy for their content-area. Qualitative data 
suggest low PBL curriculum implementation fidelity. 



For this analysis 629 comparison students were matched to 794 treatment students. Students in the
comparison group were matched to more than one treatment student, allowing the weighted comparison
and unweighted treatment groups to each contain 629 students. AP Test Scores across all English
department tests (English Language, English Literature) for students in the treatment group (having taken
at least one redesigned PBL course in any department) were not higher than those of the comparison
group. As expected, departmental GPA (DepGPA) accounted for the vast majority of the variance in AP
Test scores. PBL exposure accounted for nearly no variance in the outcome variable.


Table 34.

Dependent Variable: Advanced Placement Test Score (English)

Source             Type III Sum of Squares   df     Mean Square   F         Sig.
Corrected Model    496.196 b                 6      82.699        87.481    .000
Intercept          39.028                    1      39.028        41.284    .000
DepGPA             289.185                   1      289.185       305.907   .000
SWD                10.012                    1      10.012        10.591    .001
ELL                12.474                    1      12.474        13.196    .000
FRL                30.016                    1      30.016        31.752    .000
Gender             1.668                     1      1.668         1.765     .184
BigMatch           1.570                     1      1.570         1.661     .198
Error              1494.576                  1581   .945
Total              12011.000                 1588
Corrected Total    1990.773                  1587

a. Department = English
b. R Squared = .249 (Adjusted R Squared = .246)


The World Language Department. Even though we do not share course-level mean AP score data from
the World Language department, primarily because of small sample sizes, our findings show that students
in the treatment group experienced gains on World Language department AP coursework exams. Mean
differences are statistically significant when accounting for matched students using GPA, SWD, and
EngNotFirst as covariates. The World Language department's CBAM data suggest they have not fully
adopted PBL. However, individual interviews conducted with World Language teachers suggest they
actively look for ways to incorporate the Key Elements of Problem Based Learning into their teaching.
While the qualitative data suggest the World Language department continues to approach what PBL looks
like in their classrooms differently, they continue to use the Key Elements as guiding principles for how
they further refine their course offerings.

Comparison of 2015 Data by Years of Experience and Department 

Two prominent findings have emerged from the CBAM data. First, the CBAM data strongly
suggest that departments and departmental membership influenced the extent to which teachers adopted
PBL pedagogy. Second, the CBAM data also suggest that after five years the staff is somewhat divided
around the central issue of PBL effectiveness.



Data shared above illustrate the diversity in how each department has, or has not, adopted the
PBL model of teaching and learning. The Math and English departments remain somewhat resistant, if not
indifferent, to adopting PBL in their courses, while the Science and Social Studies departments have
successfully adopted PBL as the primary pedagogy in their courses. In Chart 29 below, illustrating adoption
curves by years of experience, it is clear there is no statistical difference between novice, experienced, and
veteran teachers on the staff.

Chart 29. 



This finding is surprising given the conventional wisdom that more veteran teachers would
be more resistant to adopting a new pedagogical model. These data suggest there is little difference in
concerns or adoption rates between less and more experienced teachers. Compare Chart 29 with Chart
30 below, in which we illustrate how departments compare in their adoption curves.


Chart 30. 



When we compare PBL adoption rates by department, the differences are more distinct. The
most prominent difference between departments is the extent to which they perceive collaboration to be a
primary concern at this point in the PBL project. The departments that are more concerned with
collaboration, Science and Social Studies, are the departments that have fully adopted PBL pedagogy.
Overall, departments that are adopting the intervention (PBL) see collaboration as an important tool to
further refine their PBL design, thus making their ability to collaborate a major concern. In contrast, the
departments that are least concerned with collaboration, Math and English, are also the departments
whose data "tail up" in the refocusing category, suggesting they are looking for other pedagogical models
that might work better than PBL in their classrooms. These data suggest that the core content-area
departments seem to be headed in different directions, potentially endangering the extent to which the
school can sustain PBL as a guiding pedagogy across the school.

Comparison of Mean AP Scores by All Students Disaggregated by Department 

Aggregated to the department level, the AP score data show gains in students' mean AP scores
between comparison and treatment groups in all departments. Students experienced statistically significant
gains in mean AP scores in the Math, Science, and Social Studies departments. Chart 31 compares the
comparison and treatment group mean AP Scores by department.




Chart 31.

Department Level AP Test Performance (Estimated Marginal Means): comparison vs. treatment, by
department (English, Math, Science, Social Studies).


Table 35 below illustrates the statistically significant mean AP score differences between the 
comparison and treatment groups in departments. 

Table 35.

Statistically Significant Gains in Mean AP Scores by Department

Department        Mean AP Score Difference
Math              2.43 to 2.68
Science           1.93 to 2.13
Social Studies    2.37 to 2.66


Chart 31 below illustrates a similar positive trend in the percentage of students in each department 
who have passed AP tests. In the English (43% to 48%), Math (45% to 51%), Science (30% to 36%), and 
Social Studies (42% to 52%) departments, the difference in the percentage of students who passed AP 
tests was statistically significant. 


Chart 31. 



When we compare the mean AP scores in each department of students who receive free and 
reduced lunch (FRL) services, similar trends in the data emerge. While FRL students in the treatment 
group made gains in each department when compared to FRL students in the comparison group, the 
students in the treatment group made statistically significant gains in the Science and Social Studies 
departments. Chart 32 illustrates the gains made by FRL students in each department. 




Chart 32.

Department Level AP Test Performance - FRL (Estimated Marginal Means): comparison vs. treatment, by
department (English, Math, Science, Social Studies).


Table 36 below illustrates the statistically significant mean AP score differences between FRL 
students in the comparison and treatment groups in departments. 

Table 36.

Statistically Significant Gains in Mean AP Scores by FRL Students by Department

Department        Mean AP Score Difference
Science           1.36 to 1.72
Social Studies    2.02 to 2.27


Chart 33 below illustrates a similar positive trend in the percentage of FRL students in each 
department who passed AP tests. In the English (21% to 30%), Science (12% to 21%), and Social Studies 
(28% to 36%) departments, the difference in the percentage of students who passed AP tests was 
statistically significant. 


Chart 33. 



When we compare the mean AP scores in each department of students who speak a first language
other than English at home (EngNotFirst), once again similar trends in the data emerge. While
EngNotFirst students in the treatment group made gains in each department when compared to
EngNotFirst students in the comparison group, the students in the treatment group made statistically
significant gains in the Science and Social Studies departments. Chart 34 illustrates the gains made by
EngNotFirst students in each department.




Chart 34. 



Table 37 below illustrates the statistically significant mean AP score differences between 
EngNotFirst students in the comparison and treatment groups in departments. 

Table 37.

Statistically Significant Gains in Mean AP Scores by EngNotFirst Students by Department

Department        Mean AP Score Difference
Science           2.05 to 2.24
Social Studies    2.49 to 2.77


Chart 35 below illustrates the percentage of EngNotFirst students in each department who passed
AP tests. While the percentage of students who passed AP tests increased in every department except Math
(50% to 47%), in no department were those increases statistically significant.


Chart 35.

Percent of AP Test Takers Passing with a Score of 3 or Higher - EngNotFirst, by department (English,
Math, Science, Social Studies).



When we compare the mean AP scores in each department of students with disabilities (SWD),
students in the treatment group made gains in the Math, Science, and Social Studies departments when
compared to SWD students in the comparison group. SWDs in the treatment group made statistically
significant gains in the Social Studies department. Chart 36 illustrates the gains made by SWDs in each
department.


Chart 36.

Department Level AP Test Performance - SWD (Estimated Marginal Means): comparison vs. treatment, by
department (English, Math, Science, Social Studies).


Table 38 below illustrates the statistically significant mean AP score differences between SWDs in 
the comparison and treatment groups in the Social Studies department. 

Table 38.

Statistically Significant Gains in Mean AP Scores by Students with Disabilities (SWD) by Department

Department        Mean AP Score Difference
Social Studies    1.87 to 2.21


Chart 37 below illustrates the percentage of SWDs in each department who passed AP tests. While 
the percentage of students in the treatment group who passed AP tests increased in the Science and Social 
Studies departments, those students passed AP tests at an equal rate in the Math department (34%) and 
passed AP tests at a decreased rate (34% to 19%) in the English department. In no department were those 
increases or decreases statistically significant. 




Chart 37.

Percent of AP Test Takers Passing with a Score of 3 or Higher - SWD, by department (English, Math,
Science, Social Studies).



College and Career Readiness Outcomes for Students Not Participating in AP Coursework 

So far we have focused specifically on outcomes for students who enrolled in AP courses. Chart 38
includes outcomes on the Campus Ready assessment for students not taking AP coursework. The Campus
Ready assessment is an externally validated assessment meant to measure students' career and college
readiness as defined in the research (Conley, 2012). For the past four years, every Sammamish student took
the Campus Ready assessment in the spring of each academic school year.

Chart 38 compares mean Campus Ready assessment scores for students who had not taken AP 
tests with those who had taken AP tests. Within those two categories of students, we compare students 
who had no exposure to PBL coursework with those who had some exposure to PBL coursework. These 
data suggest that student outcomes on the Campus Ready assessment also improve for students who had 
not taken an AP test but who were exposed to PBL coursework. 


Chart 38. 




Findings: Exploratory Study #2: Impact of Starting Strong/Sammamish Leads on Students’ 
Career and College Readiness 

The Starting Strong exploratory outcome study focused on the impact of participation in a PBL- 
oriented summer program on college and career readiness. The context for studying this impact is Starting 
Strong, an intense seven to nine-day summer program that uses a problem-based learning instructional 
model embedded with college and career readiness workshops. Starting Strong mirrors the curricular 
approach of the whole-school PBL curriculum redesign effort and thus provided a context to study the 
extent to which PBL as a broad instructional strategy might lead to changes in students’ college and career 
readiness. The theory of change for Starting Strong is consistent with that of the larger i3 curriculum 
redesign effort. It is hypothesized that students who engage in authentic, collaborative problem-solving 
through partnerships with community and business organizations will score higher on measures of college 
and career readiness than students who do not experience this sort of curriculum. 

The Dev07 project design consists of seven critical components (see the fidelity section of this report), with
Starting Strong as the 7th component: Focusing on 1st Generation College-Bound Students and Developing a PBL
Laboratory: Starting Strong. The rationale for studying this component as a sub-study of the larger exploratory
research is that:

1. PBL is the driving framework of Starting Strong 

2. Expertise through partnerships provides the context and authentic problems for students to solve 
through collaboration 

3. Starting Strong provides an exemplary picture of what full-blown PBL looks like 

4. Starting Strong is a discrete 7 to 9-day intervention that lends itself to rigorous study 

5. The faculty who teach in the Starting Strong program are also the teachers who are involved in 
PBL design and implementation of the high school curriculum 

6. Most importantly, the central theme of Starting Strong is College and Career Readiness 

For these reasons Starting Strong presented itself as an ideal proxy for studying the impact of the 
i3 project. In this study the evaluation sought to answer the following question: 

• Do students who participate in Starting Strong achieve higher levels of college and career readiness as compared to
matched students who do not participate in Starting Strong?

Methodology: Exploratory Study #2 

This two-year study used a quasi-experimental design (QED) to study the impact of participation
in the Starting Strong program on college and career readiness as measured by an instrument called
Campus Ready. The design involved treatment students (those who participated in Starting Strong in one
summer program) and a comparison group matched on a pre-administration of the Campus Ready
instrument. Both treatment and comparison students in the study were drawn from the SHS population
and were sophomores, juniors, or seniors (Campus Ready pre-test data was not available for incoming
freshmen). The study began with the pre-measure. (We use the concept of school year to maintain clarity
about when events occurred and use a two-year notation to signify the school year, for example, 'March,
2012-13'.) Table 39 describes our data collection process for this study.




Table 39. 


Date Event 

March, 2012-13 Campus Ready Administration (pre) for pool of students eligible for 

selection as treatment or comparison group member for the Starting Strong 
summer program offered in the summer of school year 2012-13. 


Summer, 2012-13 Starting Strong Participation (first cadre) and evaluation of fidelity. 


March, 2013-14 Campus Ready Administration (pre) for pool of students eligible for 

selection as treatment or comparison group member for the Starting Strong 
summer program offered in the summer of school year 2013-14. 


March, 2013-14 Campus Ready Administration (post) for Starting Strong participants from
the previous school year (2012-13). Those students who had both pre and
post Campus Ready data at this time and who participated in the 2012-13
Starting Strong program defined cadre 1 of the treatment group. The
remaining students having both pre and post-test Campus Ready data and
not participating in any Starting Strong summer program comprised the
matching pool set 1 of comparison students.


Summer, 2013-14 Starting Strong Participation (second cadre) and evaluation of fidelity. 


March, 2014-15 Campus Ready Administration (post) for Starting Strong participants from
the previous school year (2013-14). Those students who had both pre and
post Campus Ready data at this time and who participated in the 2013-14
Starting Strong program and who did not participate in the 2012-13 Starting
Strong summer program defined cadre 2 of the treatment group. The
remaining students having both pre and post-test Campus Ready data and
not participating in any Starting Strong summer program comprised the
matching pool set 2 of comparison students.


Summer, 2015 Campus Ready data acquired for all students 

Treatment group finalized (cadres 1 and 2 combined that meet treatment 
group inclusion criteria) 

Comparison group finalized from comparison pool (sets 1 and 2 combined) 
that meet comparison group inclusion criteria. 

Data Analysis 


Using Shadish, Cook and Campbell notation, the design of the study is:

NR   O1   X   O2
NR   O1       O2

O1 = Pretest (Campus Ready) used for matching and to check baseline equivalency at study conclusion

X = Treatment (Participation in one of the 2 consecutive Starting Strong summer programs)

O2 = Posttest (Campus Ready) used to determine performance levels


Subjects 

Students eligible for the study included those that participated in Starting Strong during the 
summers following school years 2012-13 and 2013-14. For those attending the 2012-13 program their 
Campus Ready scores during that same school year (5 months before attending Starting Strong) served as 
their pretest and their 2013-14 Campus Ready scores (7 months after attending Starting Strong) served as 
their post-test. The group of students who attended the next year summer program followed the same 
pattern but a year later for pre and post tests. All pre-test values were pooled and defined the pool from 
which treatment students were selected. The treatment group, then, had students in cohorts 2010, 2011, 
and 2012. Only students with both pre and post Campus Ready data were included in the final treatment 
group. A total of 107 students were eligible for the treatment group before checking for double doses of
Starting Strong. After removing students who had attended both summers, the total number of students in
the treatment group was 95.

Comparison students were drawn from the same cohorts as the treatment group students and had 
to have pre and post Campus Ready data available but not attended Starting Strong at any time (N = 237). 

Matching 

A simple one-to-one matching strategy was utilized. Students in the treatment group were matched
to students in the comparison pool using Cohort (i.e., the school year of their freshman enrollment at
SHS) and their score on the Campus Ready pre-test. This matching strategy controlled for years of
education at SHS and exposure to PBL courses, since each cohort should have the same opportunity to
take courses at SHS.

All students with pre and post-test data were divided by cohort and then, within each of these
cohort groups, were separated into treatment and comparison participants. Campus Ready scores were then
rounded to the nearest 0.25 pooled standard deviation, and treatment students in each cohort group were
sorted by pre-test score from high to low. Similarly, comparison students in each cohort group were
randomized and then sorted on pre-test score from high to low. Comparison students were then matched
to treatment students using the rounded pre-test score. The first comparison student to match exactly was
kept and the remaining matches discarded. Once all matches were made in each cohort group, the cohort
treatment groups were combined and the cohort comparison students were combined. A dummy variable
called GroupF was created, and each comparison member was coded as zero while each treatment student
was coded as one. Students in the comparison pool were matched to all 95 treatment students, with 5
comparison students each being matched to 2 treatment students.
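A minimal sketch of this matching procedure appears below, assuming a pandas DataFrame with columns student_id, cohort, treated (0/1), and pre_score; those names, and the strict one-to-one handling of comparison students, are illustrative assumptions (the study itself permitted a small number of comparison students to serve as matches for two treatment students).

# Sketch of the one-to-one matching described above: round Campus Ready pre-test
# scores to 0.25 pooled standard deviations, then match within cohort from high
# to low. Column names and tie-breaking details are illustrative assumptions.
import numpy as np
import pandas as pd

def match_by_cohort(df: pd.DataFrame, seed: int = 0) -> pd.DataFrame:
    rng = np.random.default_rng(seed)
    pooled_sd = df["pre_score"].std()
    # Round each pre-test score to the nearest 0.25 standard deviation bin.
    df = df.assign(bin=(df["pre_score"] / (0.25 * pooled_sd)).round())
    pairs = []
    for _, cohort_df in df.groupby("cohort"):
        treat = cohort_df[cohort_df["treated"] == 1].sort_values("bin", ascending=False)
        comp = cohort_df[cohort_df["treated"] == 0].sample(
            frac=1, random_state=int(rng.integers(1 << 31)))      # randomize, then sort
        comp = comp.sort_values("bin", ascending=False, kind="mergesort")
        for _, t in treat.iterrows():
            exact = comp[comp["bin"] == t["bin"]]
            if exact.empty:
                continue                     # no comparison student left at this rounded score
            c = exact.iloc[0]                # keep the first exact match, discard the rest
            comp = comp.drop(index=c.name)   # one-to-one: do not reuse this comparison student
            pairs.append({"treatment_id": t["student_id"], "comparison_id": c["student_id"]})
    return pd.DataFrame(pairs)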

Outcome Variable

The pre-test (Campus Ready, developed and validated by EPIC) was used to match comparison
students to Starting Strong students, and the post-test was used to examine differences on the outcome
variable (college readiness) between these groups. Campus Ready was administered to SHS students each
March of the five-year project period. Campus Ready measures the following constructs related to college
and career readiness:

• Key Cognitive Strategies - patterns of intellectual behavior that lead to higher order thinking 

• Key Content Knowledge - knowledge and attitudes toward the core subjects needed for 
success 

• Academic Behaviors - attitudes and behaviors for success in college and workplace 

• Contextual Skills and Awareness (College Knowledge) - knowledge and skills necessary to 
apply and enroll in college/navigate higher education 

A total score was computed by summing weighted scores from these dimensional sub-scores.
Data were available for students at the sub-scale level (means and standard deviations have been provided).
These data are treated as continuous variables. Table 40 below lists the dimensions, aspects, and
components of this instrument.

Table 40.

Dimension (Key), with Aspects and their Components:

Key Cognitive Strategies (weight = 5)
  Problem Formulation: Hypothesize; Strategize
  Research: Identify; Collect
  Interpretation: Analyze; Evaluate
  Communication: Organize; Construct
  Precision/Accuracy: Monitor; Confirm

Key Learning Skills & Techniques (weight = 2)
  Self-Monitoring: Goal-Setting Strategies; Persistence Strategies; Self-Awareness Strategies
  Learning Strategies: Test-Taking Strategies; Note-Taking Strategies; Information Retention Strategies;
    Collaborative Learning Strategies; Time Management Strategies; Strategic Reading Strategies;
    General Study Strategies

Key Transition Knowledge & Skills (weight = 4)
  Academic Awareness: College and Career Preparation; College and Career Expectations
  College Admissions Process: College Selection; College Application
  College and Career Culture: College Awareness; Career Awareness
  Tuition and Financial Aid: Financial Aid Awareness; Tuition Awareness

Key Content Knowledge (weight = 5)
  Academic Attribution: ELA, Math, Science, Social Sciences, and Technology Attribution
  Academic Value: ELA, Math, Science, Social Sciences, and Technology Value
  Student Effort: ELA, Math, Science, Social Sciences, and Technology Student Effort
  Challenge Level: ELA, Math, Science, Social Sciences, and Technology Challenge Level
  General Key Content Knowledge: Structure of Knowledge; Experience with Technology
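To illustrate the weighted total described before Table 40, the short sketch below sums the four dimension sub-scores using the weights listed in the table; whether the developer applies any further rescaling to the weighted sum is not documented here, so the function and its dictionary keys are assumptions offered only as an illustration.

# Weighted Campus Ready total: each dimension sub-score multiplied by its Table 40
# weight and summed. Any final rescaling by the instrument developer is not shown.
CAMPUS_READY_WEIGHTS = {
    "key_cognitive_strategies": 5,
    "key_learning_skills_and_techniques": 2,
    "key_transition_knowledge_and_skills": 4,
    "key_content_knowledge": 5,
}

def campus_ready_total(sub_scores: dict) -> float:
    """sub_scores maps each dimension name above to its sub-score."""
    return sum(CAMPUS_READY_WEIGHTS[dim] * sub_scores[dim] for dim in CAMPUS_READY_WEIGHTS)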


The developers of the Campus Ready instrument used a four-part model to establish validity and
reliability: factor analysis, discriminant analysis, item reliability, and think-alouds. The factor analysis
yielded a 5-factor solution consistent with the four dimensions of the instrument. All dimensions except
the Learning Strategies discriminated at an acceptable level between high and low performing schools.
Reliability coefficients were found to be at 0.88 or higher, suggesting a high degree of internal consistency.
The think-alouds are in progress and data are not yet available. (More information on the Campus Ready
instrument is available from the developer.)

Statistical analysis of outcomes for students 

After data from the final Campus Ready assessment are available, we will check the remaining
treatment and comparison groups for baseline equivalency. If the difference between the treatment and
comparison group means on the Campus Ready pre-assessment is less than or equal to .25 pooled standard
deviations, then the matches will remain intact. If, however, this value is greater than .25 SD, then
comparison students will be rematched to treatment students.
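As a minimal sketch of that check, the function below computes the treatment-comparison difference on the Campus Ready pre-test in pooled standard deviation units and compares it to the .25 threshold; the column names (treated, pre_score) are assumptions for illustration only.

# Baseline-equivalency sketch: the standardized difference between treatment and
# comparison pre-test means should not exceed 0.25 for the matches to be kept.
import numpy as np
import pandas as pd

def baseline_equivalent(df: pd.DataFrame, threshold: float = 0.25) -> bool:
    treat = df.loc[df["treated"] == 1, "pre_score"]
    comp = df.loc[df["treated"] == 0, "pre_score"]
    pooled_sd = np.sqrt((treat.var(ddof=1) + comp.var(ddof=1)) / 2)
    effect_size = abs(treat.mean() - comp.mean()) / pooled_sd
    return effect_size <= threshold  # True: keep matches; False: rematch comparison students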

Analysis 

An analysis of covariance using the GLM function in SPSS was used to analyze data. GLM
Univariate allows one to model the value of a dependent scale variable based on its relationship to
dichotomous categorical and scale predictors. The categorical factor for prediction was the independent
variable: participation/no participation in Starting Strong. The dependent variable was individual student
scores on the EPIC Campus Ready instrument. Covariates included the pre-score on Campus Ready, FRL,
ELL, and Parent Education. No missing Campus Ready data were imputed. Statistical model:


PostCR_Score_i = β0 + β1(Condition) + β2(PreCR) + β3(ELL) + β4(ParentEd) + β5(SES) + e_i
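The sketch below expresses this model in code: the post-test Campus Ready score is regressed on the treatment condition with the pre-test score, ELL, parent education, and FRL (as the SES indicator) as covariates. The evaluation used the SPSS GLM Univariate procedure; this Python version and its column names (post_cr, condition, pre_cr, ell, parent_ed, frl) are assumptions offered only to make the model concrete.

# Illustrative fit of the ANCOVA above with statsmodels; the study itself used
# SPSS GLM Univariate. Column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

def starting_strong_ancova(df: pd.DataFrame):
    """One row per matched student; condition is 1 for Starting Strong participants."""
    results = smf.ols("post_cr ~ condition + pre_cr + ell + parent_ed + frl", data=df).fit()
    # The coefficient on `condition` is the adjusted treatment effect, analogous to
    # the difference in estimated marginal post-test means reported below.
    return results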

Results 

The pre-test scores for the comparison and treatment groups are nearly equal, as are their standard
deviations. The difference between the estimated marginal post-test means is .164, or about a third of a
standard deviation (p = .013).


Group        Pre-Test Means   N     Std. Deviation
Comparison   3.5888           95    .55321
Treatment    3.5929           95    .55288
Total        3.5908           190   .55158


Group        Estimated Post-Test Means   N     Std. Error
Comparison   3.563                       95    .046
Treatment    3.727                       95    .046


In the model, the covariate that accounts for the bulk of the variance in the outcome variable is the
pre-test score, with a small amount accounted for by the treatment variable. Chart 39 below illustrates gains
by treatment (Starting Strong students) on the Campus Ready instrument. However, their gains are not
statistically significant.

Chart 39. 



Tests of Between-Subjects Effects
Dependent Variable: Post Campus Ready Score

Source             Type III Sum of Squares   df    Mean Square   F         Sig.    Partial Eta Squared
Corrected Model    28.152                    5     5.630         27.832    .000    .431
Intercept          5.790                     1     5.790         28.623    .000    .135
Pre-Test           26.459                    1     26.459        130.791   .000    .415
ELL                .031                      1     .031          .151      .698    .001
Parent Education   .006                      1     .006          .030      .863    .000
FRL                .055                      1     .055          .271      .603    .001
Treatment          1.265                     1     1.265         6.254     .013    .033
Error              37.223                    184   .202
Total              2589.399                  190
Corrected Total    65.375                    189

a. R Squared = .431 (Adjusted R Squared = .415)
b. Computed using alpha = .05


Discussion 

Quantitative analysis of matched groups reveals gains in student performance on the Campus Ready
instrument. Observations of multiple Starting Strong/Sammamish Leads modules provide evidence of strong
student engagement, collaboration, and active problem solving. A member of the research team observed a student
focus group held on the last day of the Bill and Melinda Gates Foundation World Health challenge. In this 
module, students were challenged with identifying a specific problem within the realm of global health, 
such as maternal care and availability of vaccines in the Third World. Students described their experience 
as both interesting and empowering. Specifically, they focused on the opportunity to work with a Global 
Health specialist from the Bill and Melinda Gates Foundation throughout the module as very positive for 
their motivation to engage the problem and find a solution with peers. When the evaluation team 
interviewed the teacher, she highlighted student choice and problem authenticity as major contributors to 
students’ overall engagement and interest. 

Although the quantitative data suggest little impact of Starting Strong/Sammamish Leads on 
students’ performance on the Campus Ready instrument, those data do not fully describe the value 
students seem to see in the Starting Strong/Sammamish Leads experience. Our observations revealed high 
levels of student creative and collaborative problem solving. Students work alongside industry experts to 
think deeply about chronic problems and possible solutions to them. In most cases, students present their 
solutions to a panel of industry experts and answer tough questions from them about the practicality and 
viability of their solutions. These unique experiences should not be overlooked, especially for students 
who will be the first in their family to go to college. Our observations suggest that what students learn 
from their experience in this program augments and accentuates the PBL experience they receive in their 
coursework throughout the year. 

Lastly, Starting Strong/SHS Leads served as an important PBL laboratory for the school, especially
in how the program deepened teachers' expertise in PBL pedagogy and practice. Over time the program
proved useful not just for deepening students' facility with key 21st Century skills but also for providing
teachers with experience teaching in a PBL context with no constraints. Teachers gained valuable
experience working with experts to plan highly engaging and relevant PBL experiences and coaching
students on collaborative and problem-solving strategies, without the constraints of grading, high-stakes
testing, or district curriculum and common assessments to worry about. This proved liberating for teachers
and students alike and seemed to free teachers to take greater risks in how they planned their challenges
and freed students to fail while trying something new.



Chapter 7: Discussion and Conclusions 

Sammamish High School implemented the intervention (PBL) with high fidelity in almost every
facet of the project, despite the inevitable shifts in priorities that occur over time. Several critical
components emerged from this project that provide the background for student gains on AP tests. The
most important of these is the development of the Key Elements of Problem Based Learning framework 
that guided all curriculum redesign efforts and all professional learning experiences. The Key Elements 
framework also provided Sammamish High School teachers, school leaders, and students a common 
language to describe and define ambitious teaching and learning practices. Interviews and focus groups 
with teachers, school leaders, and students repeatedly revealed a working knowledge and understanding of
the Key Elements as a whole and of what enactment of specific Key Elements looks like in the
classroom. This emerging common language around PBL can be directly attributed to the robust
infrastructure of professional learning developed by teacher leaders and the principal working on the
Leadership Team. The Key Elements framework provided a foundation for virtually every i3 project 
associated policy implemented in the past five years, profoundly shaping a shift in how teachers taught and 
how teachers and students learned. 

The redesigned school Leadership Structure had some impact on the overall success of the project.
Leadership Team members partnered with teachers to design professional learning experiences that a vast
majority of teachers described as valuable overall and relevant to their specific classroom practice. Teacher-led
and teacher-designed SILT professional learning experiences complemented the work teachers were doing in design
teams. Design teams proved especially powerful contexts for teacher learning. While not every design team 
was equally successful in their efforts to design PBL curriculum, nearly every teacher felt the design team 
experience was one of the most valuable in their careers. In many cases, teachers’ creative and 
collaborative problem solving, focused on student learning and problems of practice, made teachers feel 
like valued professionals within the school. 

Overall, students who experienced PBL curriculum in their AP coursework improved their scores 
on AP tests when compared to students who took the same or similar AP coursework before the school 
implemented PBL across content areas. Even though it would be premature to suggest a causal 
relationship between PBL AP coursework and improved student scores on AP tests, the data suggest that 
PBL curriculum does not hurt student performance on AP exams. Rather, data from Sammamish High
School suggest that the school's PBL implementation positively impacted students' college and career
readiness outcomes overall, specifically their performance in AP coursework and their performance on the 
Campus Ready assessment. While students who speak a first language other than English at home 
(EngNotFirst) experienced gains in some mean AP scores, qualitative data suggest that ELLs continue to 
struggle with the language and pedagogical demands inherent in a PBL classroom. 

While findings indicate Sammamish High School students achieved statistically significant,
widespread gains in AP scores across many AP tests in the English, Math, Science, and Social Studies
departments, they also obscure variability in how departments interacted with the PBL initiative
throughout the duration of the grant. Quantitative and qualitative findings suggest that the academic
departments at Sammamish High School played an important, possibly central, role in the extent to which
individual teachers adopted PBL.

Results from the Concerns Based Adoption Model (CBAM) surveys show varied levels of 
adoption of PBL throughout the school. In those departments where a vast majority of teachers adopted
PBL as a guiding pedagogical model, PBL seems to have augmented practices and routines already in place.
This finding is especially pronounced in the Social Studies department where Levels of Use (LOU) 
interview data suggest several teachers had experience developing PBL curriculum and where the 
department already had a strong collaborative culture in place that aligned with the collaborative work 
teachers would do in design teams. In those departments where PBL has not gained widespread traction, 
such as the English department and to a lesser extent in the Math department, PBL seemed to present a 
significant pedagogical disruption to teaching “business as usual.” Although department level CBAM data 
suggest the English department has all but abandoned PBL as a guiding pedagogical model, both the 
teacher level CBAM data and interview data suggest that outlier English teachers remain open to the 
potential PBL provides to engage more students. While PBL has made few inroads into how the Math
department designs and implements new curriculum, qualitative data suggest they use the Key Elements to
inform how they continue to adjust and rework the existing curriculum. In almost every case, the extent to
which departments have adopted PBL closely aligns with the extent to which department leaders have
come to accept and promote PBL as a legitimate pedagogy and one that can have potentially positive
impacts on student learning outcomes. 

Course level findings illustrated above suggest design team teachers may be transferring knowledge 
gained in their design teams to courses that were not redesigned into PBL courses. Statistically significant 
gains in AP Calculus and AP Psychology are two good examples. Neither the AP Calculus nor the AP
Psychology course was formally redesigned through funds from the i3 grant. However, in the past 4-5
years, teachers with design team experience have been teaching the AP Psychology course.
Additionally, AP Calculus teachers are renowned for using highly student-centered practices, many of 
which are described in the Key Elements. Even though the AP Calculus course has not been redesigned, a 
long-standing AP Calculus teacher described to us how she uses the Key Elements to inform how she 
plans and adjusts her curriculum throughout the year. While not statistically significant, AP English 
Literature, AP English Language, and AP Statistics are three courses that have not been redesigned but 
where students have also experienced gains in mean AP scores. Teachers also benefitted from exposure to
high-quality professional learning, a vast majority of which focused on student-centered PBL pedagogy and
strategies. Over time, teacher participation in SILT and design teams seemed to deepen both their
pedagogical expertise and their pedagogical content knowledge.

Starting Strong/Sammamish Leads, although not as impactful on standardized measures, made 
qualitative impacts on how Sammamish students approached learning. Students attend Starting 
Strong/Sammamish Leads during the summer and get to choose what module they attend. Students work 
collaboratively, many times with an industry expert, to solve ill-defined, complex problems. The experience 
they gain hones their ability to think creatively and divergently, work collaboratively with peers, and gain 
valuable experience working on an authentic task in a profession, field, or discipline. This experience is 
especially valuable for students who would be the first in their family to attend college as it expands their 
career options and may provide them with purpose as they navigate the college landscape. 




References 

Albion, P., & Gibson, I. (2000). Problem-based learning as a multimedia design framework in teacher
education. Journal of Technology and Teacher Education, 8(4), 315-326.

Ball, D. L., & Cohen, D. K. (1999). Developing practice, developing practitioners: Toward a practice-based
theory of professional education. In L. Darling-Hammond & G. Sykes (Eds.), Teaching as the learning
profession (pp. 3-32). San Francisco, CA: Jossey-Bass.

Barron, B., Schwartz, D., Vye, N., Moore, A., Petrosino, A., Zech, L., Bransford, J., & The Cognition and
Technology Group at Vanderbilt University. (1998). Doing with understanding: Lessons from
research on problem- and project-based learning. The Journal of the Learning Sciences, 7(3-4), 271-311.

Barron, B., & Darling-Hammond, L. (2008). Teaching for meaningful learning: A review of research on
inquiry-based and cooperative learning. In L. Darling-Hammond, B. Barron, P. D. Pearson, A. H.
Schoenfeld, E. K. Stage, T. D. Zimmerman, G. N. Cervetti, & J. L. Tilson (Eds.), Powerful Learning:
What We Know About Teaching for Understanding. San Francisco, CA: Jossey-Bass.

Belland, B. R., Glazewski, K. D., & Ertmer, P. A. (2009). Inclusion and problem-based learning: Roles of
students in a mixed-ability group. RMLE Online: Research in Middle Level Education, 32(9), 1-19.

Benedict, A. E., Thomas, R. A., Kimerling, J., & Leko, C. (2013). Trends in teacher evaluation: What every
special education teacher should know. Teaching Exceptional Children, 45(5), 60-68.

Berliner, D. C. (2013). Effects of inequality and poverty vs. teachers and schooling on America's youth.
Teachers College Record, 115(12), 1-19.

Blumenfeld, P. C., Soloway, E., & Marx, R. W. (1991). Motivating project-based learning: Sustaining the
doing, supporting the learning. Educational Psychologist, 26, 369-398.

Boaler, J., & Staples, M. (2008). Creating mathematical futures through an equitable teaching approach:
The case of Railside School. Teachers College Record, 110(3), 608-645.

Brinkerhoff, J., & Glazewski, K. (2004). Support of expert and novice teachers within a technology
enhanced problem-based learning unit: A case study. International Journal of Learning Technology, 1(2),
219-230.

Brush, T., & Saye, J. (2000). Implementation and evaluation of a student-centered learning unit: A case
study. Educational Technology Research and Development, 48(3), 79-100. Retrieved from
http://dx.doi.org/10.1007/BF0231985

Cochran-Smith, M., & Lytle, S. (1999). Relationships of knowledge and practice: Teacher learning in
communities. Review of Research in Education, 24, 249-305.

Conley, D. T. (2010). College and career ready: Helping all students succeed beyond high school. San Francisco, CA:
Jossey-Bass.

Dewey, J. (1938). Experience and education. New York, NY: Touchstone Books.

DuFour, R. (2004). What is a professional learning community? Educational Leadership, 61, 1-6.

DuFour, R., Eaker, R., & DuFour, R. (2005). Recurring themes of professional learning communities
and the assumptions they challenge. In R. DuFour, R. Eaker, & R. DuFour (Eds.), On common ground:
The power of professional learning communities (pp. 7-29). Bloomington, IN: National Education Service.

Erickson, F. (1986). Qualitative methods in research (pp. 145-161). In M. Wittrock (Ed.), Handbook of
research on teaching (3rd ed.). New York: MacMillan.

Ertmer, P. A., Lehman, J. D., Park, S. H., Cramer, J., & Grove, K. (2003). Barriers to teachers' adoption
and use of technology-supported learner-centered pedagogies. In Proceedings of Society for Information
Technology and Teacher Education International Conference 2003 (pp. 1761-1766).

Ertmer, P. A., & Simons, K. D. (2006). Jumping the PBL implementation hurdle: Supporting the efforts of
K-12 teachers. Interdisciplinary Journal of Problem-based Learning, 1(1), 40-54.

Gallagher, S. A. (1997). Problem-based learning: Where did it come from, what does it do, and where is it
going? Journal for the Education of the Gifted, 20(4), 332-362.

Grant, M. M., & Hill, J. R. (2006). Weighing the risks with the rewards: Implementing student-centered
pedagogy within high-stakes testing. In R. Lambert & C. McCarthy (Eds.), Understanding teacher stress
in an age of accountability (pp. 19-42). Greenwich, CT: Information Age Press.

Grossman, P., Wineburg, S., & Woolworth, S. (2001). Toward a theory of teacher community. Teachers College
Record, 103(6), 942-1012.

Hall, G. E., Dirksen, D. J., & George, A. A. (2006). Measuring implementation in schools: Levels of use. Austin,
TX: SEDL, University of Texas at Austin.

Halvorsen, A., Duke, N. K., Brugar, K., Block, M., Strachan, S., Berka, M., & Brown, J. (2014). Narrowing
the achievement gap in second-grade social studies and content area literacy: The promise of a
problem-based learning approach. The Education Policy Center, Michigan State University.

Hammerness, K., Darling-Hammond, L., Bransford, J., Berliner, D., Cochran-Smith, M., McDonald, M., &
Zeichner, K. (2005). How teachers learn and develop. In L. Darling-Hammond & J. Bransford
(Eds.), Preparing teachers for a changing world. Washington, D.C.: The National Academy of
Education.

Hargreaves, A., & Fullan, M. (2012). Professional capital: Transforming teaching in every school. New York, NY:
Teachers College Press.

Horn, I., & Little, J. (2009). Attending to problems of practice: Routines and resources for professional
learning in teachers' workplace interactions. American Educational Research Journal, 47(1), 181-217.

Jencks, C., Smith, M., Acland, H., Bane, M. J., Cohen, D., Gintis, H., et al. (1972). Inequality: A reassessment
of the effect of family and schooling in America. New York: Basic Books.

Land, S. M. (2000). Cognitive requirements for learning with open-ended learning environments.
Educational Technology Research and Development, 48(3), 61-78.

Little, J. W. (1990). The persistence of privacy: Autonomy and initiative in teachers' professional relations.
Teachers College Record, 91(4), 509-536.

Marjoribanks, K. (1979). Families and their learning environments: An empirical analysis. London: Routledge and
Kegan Paul.

McLaughlin, M. W., & Talbert, J. E. (2006). Building school-based teacher learning communities. New York, NY:
Teachers College Press.

National Research Council. (2000). How people learn: Brain, mind, experience, and school. Washington, DC:
National Academies Press.

Park, S. H., Ertmer, P., & Cramer, J. (2004). Implementation of a technology-enhanced problem-based
learning curriculum: A year-long study of three teachers. In Proceedings of the 2004 Conference of the
Association for Educational Communications and Technology (pp. 19-23).

Parker, W., Mosborg, S., Bransford, J., Vye, N., Wilkerson, J., & Abbott, R. (2011). Rethinking advanced high
school coursework: Tackling the depth/breadth tension in the AP US Government and Politics
course. Journal of Curriculum Studies, 43(4), 533-559.

Parker, W. C., Lo, J., Yeo, A. J., Valencia, S. W., Nguyen, D., Abbott, R. D., Nolen, S. B., Bransford, J. D., &
Vye, N. J. (2013). Beyond breadth-speed-test: Toward deeper knowing and engagement in an
advanced placement course. American Educational Research Journal, 50(6), 1424-1459.

Ravitz, J. (2009). Introduction: Summarizing findings and looking ahead to a new generation of PBL
research. Interdisciplinary Journal of Problem-Based Learning, 3(1).

Reid, M. J., & Moore III, J. L. (2008). College readiness and academic preparation for postsecondary
education: Oral histories of first-generation urban college students. Urban Education, 43, 240-261.

Rodriguez, A., McKillip, M., & Niu, S. (2013). The earlier the better? Taking the AP in 10th grade. The College
Board.

Sawyer, R. (2013). Beyond correlations: Usefulness of high school GPA and test scores in making college
admissions decisions. Applied Measurement in Education, 26, 89-112.

Simons, K. D., Klein, J., & Brush, T. (2004). Instructional strategies utilized during the implementation of
a hypermedia, problem-based learning environment: A case study. Journal of Interactive Learning
Research, 15, 213-233.

Strauss, A. L., & Corbin, J. M. (1998). Basics of qualitative research: Techniques and procedures for developing grounded
theory. Thousand Oaks: Sage Publications.

Ward, J. D., & Lee, C. L. (2002). A review of problem-based learning. Journal of Family and Consumer Sciences
Education, 20(1), 16-26.

Wenger, E. (1998). Communities of practice: Learning, meaning, and identity. Cambridge, United Kingdom:
Cambridge University Press.




Appendix A. Implementing PBL Classroom Observation Protocol 


Date: 

Start time: End time: 

Period/Class: 

Observer: 

Teacher: 

What is the goal or objective of this lesson? 

How is this lesson contextualized within a larger unit? 

Briefly describe (3-5 sentences) what happened during this lesson. 



Observation grid (one entry per observed segment):

Page:
Time:
Teacher: Instructions | Lecture | Modeling | Answering | Coaching | Other
Individual student: Seatwork | Computer | Presenting | Mentoring | Lab work | Other
Groups of students: Partner work | Group work | Computer | Presenting | Critiquing | Lab work | Other
Notes:


Appendix B: Key Element Classroom Observation Protocol 


Date: 

Start time: End time: 

Period/Class: 

Observer: 

Teacher: 

NOTE: As you complete this table, please refer back to the Key Elements SHS document for clarification on the Key Elements. 


KEY 

ELEMENT 

FOCUS QUESTIONS 

OBSERVATION EVIDENCE 

CONTINUUM LEVEL OBSERVED 

Authentic 

problems 

*What is the 
problem/project? 

*Is the problem/project 
relevant to the 
professionals in the 
community? 

*Do students have input 
on the crafting of the 
problem/project? 

*How structured is the 
problem? 

*Are there more than one 
possible solutions and 
strategies for addressing 
the problem/project? 


N.O. 

INC 

INT 

TRAN 

EMP 

N.O.C. 

Developing expertise 

Focus questions: 
*Is knowledge or expertise openly shared across the project team (including students, the teacher, and industry professionals)? 
*Do facilitators (teachers or industry professionals) elicit student expertise? In what context? How frequently? 
*What types of expertise show up in the setting (e.g., school-related or hobby-related)? 
*How is feedback given in the setting? 

Observation evidence: 

Continuum level observed: N.O. / INC / INT / TRAN / EMP / N.O.C. 








Culturally responsive instruction 

Focus questions: 
*Does it appear that the teacher has taken time to learn about students and their interests outside of school? 
*Is the lesson connected to local issues and/or issues relevant to students’ lives? 
*Are examples and experiences from a variety of cultures and life experiences used to illustrate the power of the discipline and/or project? 
*Does the lesson capitalize on students’ native language and values? 
*Does the teacher inform students of the ways in which their native language and values may differ from the language and values of the workplace and higher education? 

Observation evidence: 

Continuum level observed: N.O. / INC / INT / TRAN / EMP / N.O.C. 

Student voice 

Focus questions: 
*Is student feedback elicited? If so, how? 
*Do all students give feedback or do some students give feedback more often? 
*Do students have a say in how the lesson progresses, how the work happens, and how it is managed? 

Observation evidence: 

Continuum level observed: N.O. / INC / INT / TRAN / EMP / N.O.C. 

Collaborative groups 

Focus questions: 
*How is group collaboration set up at the beginning of the lesson? 
*Does this dynamic continue throughout the lesson? 
*Are students collaborating within groups? How? 
*How do facilitators support collaboration? 
*How are roles defined and established in the group? 
*Does it appear that students are aware of the importance of collaboration for college and career readiness? 

Observation evidence: 

Continuum level observed: N.O. / INC / INT / TRAN / EMP / N.O.C. 








Academic discourse 

Focus questions: 
*How are academic and discipline-specific vocabulary taught or addressed in the project work? 
*Do students take up and use vocabulary specific to the discipline? How? 
*What happens when students are confused or stuck on discipline-specific vocabulary? 

Observation evidence: 

Continuum level observed: N.O. / INC / INT / TRAN / EMP / N.O.C. 

Authentic assessment 

Focus questions: 
*How is assessment present in this lesson? 
*Are students assessed in a way that is representative of a profession aligned with the discipline of the class? 

Observation evidence: 

Continuum level observed: N.O. / INC / INT / TRAN / EMP / N.O.C. 


Key: 

N.O. = Not observed 
INC = Inclusion 
INT = Integration 
TRAN = Transformation 
EMP = Empowerment 
N.O.C. = Not on continuum 
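
Ratings recorded with this protocol can be summarized across observations for each Key Element. The sketch below is a minimal, hypothetical illustration; mapping the continuum codes above to ordinal values (INC = 1 through EMP = 4) is our assumption for summary purposes only and is not prescribed by the protocol. 

```python
# Continuum codes from the key above; the ordinal values are an assumption
# made for summary reporting, not part of the protocol.
CONTINUUM = {"INC": 1, "INT": 2, "TRAN": 3, "EMP": 4}
EXCLUDED = {"N.O.", "N.O.C."}  # not observed / not on continuum: left out of averages


def mean_level(ratings):
    """Average the on-continuum ratings for one Key Element; None if none apply."""
    scored = [CONTINUUM[r] for r in ratings if r not in EXCLUDED]
    return sum(scored) / len(scored) if scored else None


# Hypothetical ratings for one Key Element across four observed lessons.
print(mean_level(["INC", "INT", "N.O.", "TRAN"]))  # -> 2.0
```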


Appendix C: Levels of Use (LOU) Teacher Interview Protocol 

Working knowledge of innovation: 

1. Describe how you think about the relationship between the Key Elements and PBL. Are they one and 
the same, or do you think of them as distinct pedagogies and/or strategies? 

2. Describe your current familiarity with the Key Elements specifically and with PBL generally. 


Using the innovation: 

3. In what ways have you incorporated the Key Elements into your teaching practice? 

• When you incorporate the Key Elements into your practice, would you say you depend heavily on 
the description of specific elements in the Key Element document or do you make changes based 
on your own understanding of what those pedagogical strategies are? 

4. What challenges have you encountered as you have worked to incorporate the Key Elements into your 
teaching practice? 

• Are there ways you think the Key Elements and/or PBL don’t work for your classroom or content 
area? How? 

5. If you were to imagine a spectrum of your own teaching over time, where the left is how you taught before 
the grant work started and the right is full-blown implementation of the Key Elements and PBL, where 
would you put your practice on that spectrum today? 

• Why? 

• Do you have a specific example, regarding assessment or student collaboration, to further illustrate 
your response? 


Coordinated use of innovation: 

6. In what ways have you worked with colleagues to further refine how you implement the Key Elements 
in your classroom? 

7. Has your thinking about teaching changed at all as a result of your collaboration with other teachers? 


Adaptation of innovation: 

8. As you continue to work with the Key Elements, have you started adapting/changing what PBL looks 
like for your specific classroom and students? 

9. Can you foresee a time in the future when you would revise, change, or abandon the Key Elements or 
PBL for something better or different? 
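
When coding interview transcripts, responses can be grouped under the four dimensions this protocol is organized around. The mapping below is a small, hypothetical convenience for qualitative analysis, not part of the protocol itself. 

```python
# Interview questions grouped by the protocol's four dimensions (see above).
LOU_DIMENSIONS = {
    "working knowledge of innovation": [1, 2],
    "using the innovation": [3, 4, 5],
    "coordinated use of innovation": [6, 7],
    "adaptation of innovation": [8, 9],
}


def dimension_for(question_number):
    """Return the Levels of Use dimension a given question number falls under."""
    for dimension, questions in LOU_DIMENSIONS.items():
        if question_number in questions:
            return dimension
    raise ValueError(f"Unknown question number: {question_number}")


print(dimension_for(5))  # -> "using the innovation"
```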

