In K–12 education, our approach to data analysis is often like a teenager’s approach to food. A hungry teen will open the refrigerator and stare. Seeing nothing appealing, he’ll open the pantry. When he doesn’t find anything there, he’ll return to the fridge and stare some more. 

In many schools and districts, we do the same thing with data. We stare at results from academic measures trying to figure out why a student or group is struggling. When we don’t find answers, we look at data from other academic measures. But still, the answers don’t appear. 

One way to find those answers is to change our approach to data analysis. To identify students who need additional support to improve their academic performance or stay engaged in school, we have to look at different types of data in different ways. 

In Florida’s Indian River County School District, we had a wealth of data from general academic outcome measures—state tests, national tests, and district benchmark assessments—but it wasn’t always enough to give us insight into the cause of students’ struggles. It wasn’t until we began using early warning indicators (EWIs) and pairing academic data with student engagement data—such as attendance and behavior—that real answers began to emerge. With this information, we could see why some students were struggling and, more importantly, how we could help them. 

A Systematic Approach

First, we formed a committee to select our EWIs. It included school and district administrators, teachers, student support specialists, school psychologists, social workers, attendance officers, and guidance counselors, among others. Including all these different viewpoints helped to ensure that our EWIs would be broad enough to be used districtwide, but deep enough to be meaningful to each stakeholder.

In early 2014, our committee met four times to identify what indicators would be most useful. By our final meeting we had decided on the following EWIs: attendance, discipline, mobility, retentions (students who repeat grade levels), course failures, and academic measures. 

We quickly realized, however, that to identify students who exhibited these EWIs, we’d have to go to four different data systems. After looking at our systems and at how other Florida districts were managing EWIs, we chose to implement the Early Warning System by Performance Matters, the makers of our assessment and data management system. Using this customizable reporting and filtering module, we were able to set our own values and rules for our attendance, discipline, and academic measures. Now we can track all of our EWIs in one place and compare academic data side by side with student engagement data. 
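In spirit, the "values and rules" we set for our indicators work like simple thresholds applied to each student's record. The sketch below is purely illustrative (field names and cutoffs are hypothetical, not Performance Matters' actual configuration), but it shows the idea of checking one centralized record against every EWI at once:

```python
# Hypothetical EWI rules; thresholds and field names are illustrative only.
RULES = {
    "attendance_pct":  lambda v: v < 90,   # present less than 90% of days
    "discipline_refs": lambda v: v >= 3,   # three or more referrals
    "course_failures": lambda v: v >= 1,   # any failed course
    "retentions":      lambda v: v >= 2,   # two or more grade retentions
}

def flag_indicators(student: dict) -> list:
    """Return the name of every early warning indicator a student trips."""
    return [name for name, rule in RULES.items() if rule(student.get(name, 0))]

at_risk = flag_indicators({"attendance_pct": 87, "discipline_refs": 1,
                           "course_failures": 2, "retentions": 0})
```

Because every rule reads from one record, adding or tuning an indicator means changing a threshold in one place, not pulling reports from four systems.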

Conduct Beta Testing 

In spring 2014, we beta-tested our EWIs and the Early Warning System. We asked our committee members to pull data on attendance, discipline, mobility, retentions, and course failures, and to pair it with data from academic measures, such as student performance on state tests. Once we started using the EWI data, we quickly discovered what questions and issues we’d have to address. 

For example, at first we thought that one retention in a student’s academic history would be a good EWI. When we paired it with other EWIs, however, we found that two or more retentions were a more accurate indicator. Through beta testing, we were able to correct these issues before going districtwide. 

Connect EWIs to Improvement Planning

To launch our initiative in the summer of 2014, we introduced our EWIs to the Indian River Fellowship for Instructional Leaders (IRFIL), a districtwide professional development initiative that includes five to 15 representatives from every school who meet every other month to look at systemic issues.

We used the Early Warning System to provide each school with its own set of data, including FCAT 2.0 reading and math data (Florida’s former statewide assessment), attendance numbers, discipline data, and figures on course failures. We then gave each school team a protocol for reviewing their data. It was an eye-opening experience for them to see their students’ academic data alongside engagement data. For example, when these teams looked at students who scored at level 1 or 2 on the FCAT, many saw that a significant percentage of those students had chronic attendance issues. It became very clear that curriculum changes alone would not improve student performance; we first had to focus on getting students to school, and only then could we work on improving their academics.
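The side-by-side pairing those teams performed can be sketched with a few lines of code. The student IDs, achievement levels, and attendance figures below are entirely hypothetical; the point is simply that joining the two data sets on the student surfaces students that neither data set flags on its own:

```python
# Hypothetical records keyed by student ID; all values are illustrative.
scores = {"S1": 1, "S2": 4, "S3": 2}               # state-test achievement level
attendance = {"S1": 86.0, "S2": 97.5, "S3": 89.1}  # percent of days present

# Pair the data sets: low scorers (level 1 or 2) who also attend
# less than 90 percent of the time (chronic attendance issues).
flagged = [sid for sid in scores
           if scores[sid] <= 2 and attendance.get(sid, 100.0) < 90]
```

Here both low scorers also have chronic attendance, which mirrors what our school teams saw: the academic problem and the engagement problem were often the same students.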

Once we demonstrated what the Early Warning System was capable of, we asked each team to return to their school and engage in an eight-step problem-solving process to develop a school improvement goal related to a student engagement EWI, such as attendance or discipline. 

Next, we launched training sessions at each school to show staff members how to use the system in their day-to-day routines. At the outset of each session, we didn’t try to “teach” them anything. Instead we showed them data from each EWI, and then compared different EWIs side by side. As soon as they saw the relationships between the academic data and student engagement data, they grasped the value of our initiative and were ready to learn how to use the system. 

Give Schools Flexibility

As a result of our data analysis, our district launched a major initiative to improve student attendance. To make sure everyone was looking at a consistent set of EWI data and using the same vocabulary, we separated our attendance data into three levels: chronic (present less than 90 percent of the time), severe (present 90–95 percent of the time), and adequate (present more than 95 percent of the time). We also gave each school the flexibility to use this data to create programs to boost attendance based on its unique needs.
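The three attendance levels reduce to a simple calculation, sketched below. The function name and inputs are hypothetical, but the cutoffs are the district's own: chronic below 90 percent, severe from 90 to 95 percent, adequate above 95 percent.

```python
def attendance_level(days_present: int, days_enrolled: int) -> str:
    """Classify attendance using the district's three levels:
    chronic (<90% present), severe (90-95%), adequate (>95%)."""
    rate = 100 * days_present / days_enrolled
    if rate < 90:
        return "chronic"
    elif rate <= 95:
        return "severe"
    return "adequate"
```

For example, a student present 160 of 180 enrolled days (about 89 percent) falls in the chronic level, even though missing 20 days may not sound alarming to families.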

Since then, we’ve seen increases in attendance across the district. Twelve of our 20 schools saw improvements in the percentage of students exhibiting “adequate” attendance. In those schools where attendance did not increase, we were able to further engage in the eight-step problem-solving process to identify and address other issues that were keeping students out of school (e.g., identifying alternative discipline measures to reduce out-of-school suspensions).

As a result of our early warning initiative, each and every user of this data is now taking more responsibility for his or her students and their results. Because we can easily access and compare academic and student engagement data, we can ask questions we never thought to ask before. We can see patterns emerge and answers appear. Instead of staring at academic measures in isolation, we can make important connections among attendance, discipline, and academic measures—and connect students to the support they need to improve their performance and stay in school.

Sidebar: Making It Work

Helpful Ways to Delve Into Data

Look at your student data in new and different ways. Pair academic data (like assessment test results) side by side with student engagement data (like attendance numbers) to see how issues such as attendance or behavior may be impacting academic performance. 

Maintain early warning indicator (EWI) data in one central, secure location, rather than separate systems. A centralized system makes it much easier to collect, access, analyze, and act on the EWI data in a timely way.

Use EWIs with a multi-tiered system of supports to ensure students receive the support they need at the levels they need it. With EWI data, you can quickly find at-risk students, begin interventions earlier, more easily monitor their progress, and make adjustments as needed.


Brian McMahon is a performance data analyst for the Indian River County school district in Vero Beach, FL.