The Secret to Success in Data Deployment Is the Data Team
The role of the principal is always evolving, and nowhere is that more evident than in how principals must analyze and disseminate achievement data, and, more specifically, in how that data is used to guide instructional practices at their schools.
School leaders today are expected to plan effectively for improved student achievement based on the array of assessments administered to students. It’s all about learning to be data-driven instructional leaders who exercise data-based decision making.
Research and literature have shown that using data at the school level can be a difficult task. With the increased pressure at both the national and state levels to improve student achievement in the educational setting, the creation of data teams in schools across the United States is on the rise. Our school created a data team, and we’ve used it to close the achievement gap.
Creating a Data Team
In the summer of 2013, I had just accepted a position as assistant principal at a middle school in central Ohio. Having already completed some surface-level research on the building data for the interview process, I was aware of the school’s report card and the areas that needed attention. During my first week on the job, I met with my principal and proposed the idea of creating a data team within our school building. I envisioned the data team consisting of as many stakeholders as possible. Eventually it would consist of school administrators, teachers, intervention specialists, gifted coordinators, and instructional coaches. The data team’s goal was to gather and organize data, and to provide as much useful data to the staff as possible in order to inform targeted interventions for all students, close achievement gaps, and shape instructional practices across the entire school building in all three grade levels (6–8). In August, I sent an email to all staff members asking if anyone was interested in serving on our data team. In the end, we had 13 committed staff members on our team to start the school year.
Analyzing the Data
Shortly after the beginning of the school year, we held our first meeting. After we agreed on our goals and vision for the team, the first step was to carefully analyze our most recent state report card. For the 2012–13 school year, our middle school had received all As and Bs in every component of the state report card except for the Gap Closing/Annual Measurable Objectives (AMO) component (this piece measures the academic performance of specific groups of students, often according to demographics). In Gap Closing/AMO, we received a D (66.7 percent). This component immediately became an area of focus because in Ohio, the Gap Closing/AMO grade indicates how well students are doing in math and reading. It answers the question, “Is every student succeeding, regardless of income, race, culture, or disability?”
Additionally, we decided to focus our efforts on our overall Achievement (Performance Index and Indicators Met) component. In the Achievement component we had received a B, but we felt this measure remained important because the grade combines two results for students who took the tests. The first result—Indicators Met—answers the question, “How many students passed the state test?” and the second result—Performance Index—answers the question, “How well did students do on the state test?” We also decided to focus on the Progress (Value-Added) component. More specifically, we focused on the lowest 20 percent in the Achievement measure, an area in which we had received a B. This component measures “the progress for students identified as the lowest 20 percent statewide in both reading and math achievement.” Because the Progress component measures how much students grow each school year, we felt it was an important element of focus.
In addition to the focus areas on our state report card, we made it a goal to promote and support the regular analysis of more recent student data, such as various benchmark reading assessments and teacher-developed formative assessments—basically any assessment data we could get our hands on. Finally, we decided that we would share as much data with staff as possible, provide detailed methods on how to use the data, and conduct professional development with both the data team and staff, when possible. The next step was to collect and disseminate the data. We used the Education Value-Added Assessment System (EVAAS) to collect student data from the state assessments.
Gathering and Sharing the Data
Using EVAAS, we came up with a variety of data sets (see Figure 1). Each data set was shared with staff throughout our school building along with a detailed description of what the data set represented, how to use the data to guide targeted interventions, and how to use the data to drive instructional practices within the classroom and grade levels. The data sets were shared as Excel workbooks to keep them organized, accessible, and easy to share.
In addition to the four data sets in Figure 1, we also collected and analyzed data from benchmark assessments via STAR (assessments that offer expanded skills-based testing), BAS (a system that links assessment to instruction along The Continuum of Literacy Learning), and teacher-developed formative assessments throughout the academic school year. We also provided several professional development opportunities outside of the school day and within staff meetings for both data team members and staff.
For example, we shared how to analyze and use STAR data to set up interventions and goals for students, how to interpret the scores, and examples of student growth versus nongrowth. We also presented how to use the student diagnostic report and instructional reading level (IRL) effectively with classroom instruction, shared how to interpret the growth proficiency chart, and showed how to use the built-in interventions within the STAR software. Finally, we emphasized the importance of discussing STAR data with students and outlined how to explain their scores to them in a way that allowed for them to clearly understand their own data, how we used it, and why it was so important for them to do their best on each assessment.
Figure 1: Data Sets Derived from EVAAS
|Data Set 1||This data set included students who were Not Likely to Be Proficient (NLTBP) in math and reading (grades 6–8) and science (grade 8) on the upcoming state assessments, based on their previous performances on state assessments.|
|Data Set 2||This data set included students in grades 6–8 who were identified in three or more subgroups (e.g., Individualized Education Plan [IEP], socioeconomic status, race, gifted, limited English proficiency, free and reduced-price lunch). In lay terms, these students could really hurt or help our school report card data based on their performance on the upcoming state assessments. Because these students fall under several different subgroups, they can be counted in multiple relevant subgroups for measures like Value-Added. (For example, a student who is identified as gifted in math and on an IEP would be included in both the Gifted Value-Added measure and the Students with Disabilities Value-Added grade. Essentially, their performance on the state assessments carries “more weight.”)|
|Data Set 3||This data set included students who were both NLTBP and in three or more subgroups (e.g., Individualized Education Plan [IEP], socioeconomic status, race, gifted, limited English proficiency, free and reduced-price lunch) in math and reading (grades 6–8) and science (grade 8).|
|Data Set 4||This data set included students who were categorized as Not Growing from year to year in math and reading in grades 6–8. Within this data set, we also included subgroups these students belonged to.|
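The four data sets above amount to simple filters and intersections over a student roster. As an illustration only, the logic could be sketched in Python (the field names, sample students, and record format here are hypothetical; the actual reports come from EVAAS):

```python
# Hypothetical student records; in practice these fields would come from
# an EVAAS export, not hand-entered dictionaries.
students = [
    {"name": "Student A", "grade": 7, "nltbp": {"math"},
     "subgroups": {"IEP", "socioeconomic status", "free/reduced lunch"},
     "growing": True},
    {"name": "Student B", "grade": 8, "nltbp": set(),
     "subgroups": {"gifted"}, "growing": False},
    {"name": "Student C", "grade": 6, "nltbp": {"math", "reading"},
     "subgroups": {"IEP", "limited English proficiency", "race"},
     "growing": False},
]

# Data Set 1: Not Likely to Be Proficient (NLTBP) in any tested subject.
data_set_1 = [s for s in students if s["nltbp"]]

# Data Set 2: identified in three or more subgroups.
data_set_2 = [s for s in students if len(s["subgroups"]) >= 3]

# Data Set 3: the intersection of Data Sets 1 and 2.
data_set_3 = [s for s in students if s["nltbp"] and len(s["subgroups"]) >= 3]

# Data Set 4: not growing year to year, with their subgroups attached.
data_set_4 = [(s["name"], s["subgroups"]) for s in students if not s["growing"]]
```

However a team builds these lists, the key design choice is the same one we made: each data set is a named list of actual students, so teachers can scan for their own rosters rather than interpret aggregate statistics.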
Putting the Data to Use
Perhaps the most important element of implementing a data team is ensuring that the data collected, analyzed, and shared is put to use by your staff. The data from the state assessments (in Figure 1) was used to create initial intervention groups. This data was also useful in creating a snapshot of our school building and each grade level as it pertains to state achievement. Our staff also found this data helpful because they were able to reference the data sets (as lists of names) to ensure students in their respective classrooms were growing. If not, they could provide intervention or enrichment as needed.
The STAR and BAS scores were used in several different formats as well. This data helped to determine instructional and independent reading levels for each student. Because the STAR and BAS assessments are administered throughout the school year, they allow the data team and staff to identify trends in student growth.
English-Language Arts teachers used the data to form “zone intervention” groups to close students’ reading gaps. Content teachers used this data to determine the reading levels and abilities of their students, which allowed them to differentiate articles and assignments. This data also allowed us to determine which seventh-grade students would benefit from taking eighth-grade Academic Connections.
Our science teachers used the data to track articles being used within their instructional units—all students receive articles covering the same content; however, articles can be “leveled” so that the content is more accessible to students. Our social studies teachers used the data to group students into ability groups for instructional units, especially for the “We the People” unit. Each group of students is given a textbook that most closely reflects their personal reading level. Our health teachers used the data to determine appropriate leveled articles about events and issues affecting middle school students across the globe.
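Matching a student or group to the closest available text level is, at bottom, a nearest-value lookup. A minimal sketch, assuming a grade-equivalent scale and a hypothetical helper name (neither is part of STAR, BAS, or any vendor software):

```python
def closest_level(student_level, available_levels):
    """Return the text level nearest to a student's measured reading level."""
    return min(available_levels, key=lambda lvl: abs(lvl - student_level))

# An article leveled at grade-equivalents 4.0, 6.0, and 8.0:
available = [4.0, 6.0, 8.0]

# A student reading at a 6.8 grade equivalent is matched to the 6.0 version.
closest_level(6.8, available)
```

In practice our teachers made these matches by hand from the shared data sets; the point is only that the decision rule is simple once each student’s reading level is known.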
Overall, by using the data to be aware of students’ reading and performance levels, our teachers are able to locate articles and books that make content more accessible to our students.
Evaluating the Results
In the end, we raised our Gap Closing/AMO grade from a D (66.7 percent) to a B (84.5 percent), the highest in our school district (of 11 sites). Additionally, we raised our Performance Index Score from 84.6 percent to 85.4 percent, the highest in the history of our school building. Finally, we raised our lowest 20 percent in Achievement score from a B to an A. Plus, we had a student in our school building score the highest in the state of Ohio on the eighth-grade math state assessment. These outcomes were certainly a source of satisfaction for all stakeholders involved, including students, staff, parents, and community members. One of the most important results was immeasurable—that is, I felt our data team and staff became better at analyzing and using data to drive instructional practices in our school building across all three grade levels (6–8).
The addition of the data team has proved to be extremely beneficial to our school building. The following school year there was even more interest from our staff to serve on the team; we added several more members, and we were even granted several professional development days by the district office to allow our data team to meet during the school day (versus before or after school). For example, our district purchased new performance-tracking software called PerformancePlus. We spent professional development time teaching the data team how to run reports (create data sets) and share them with their colleagues. We also challenged the data team members to be the “data leaders” in our school building.
There were perks to serving on the data team. For example, the district office granted all data team members special access to student data districtwide (versus only the students in their respective classrooms). In addition, the success we experienced with our data team was contagious, as several other schools within our district and surrounding school districts created data teams the following school year and modeled their teams after ours.
Sidebar: Making It Work
Use these five “Cs” to build your world-class data team:
Collaboratively create a clear vision and set goals for your data team
Collectively analyze the data, discuss it, and decide what to do with it
Check to ensure teachers are using the shared data as well as collecting their own data to effectively drive their instructional practices in the classroom and with individual students
Constantly reanalyze the most current data collected to ensure you are working toward your goals, and adjust goals as needed
Create a school climate/culture that values data
Denver J. Fowler, EdD, is an assistant professor of educational leadership in the Department of Leadership and Counselor Education within the School of Education at the University of Mississippi and is the 2015 Ohio Assistant Principal of the Year.