This is the time of year when the Bureau of Assessment & Accountability (Michigan Department of Education) sends the Student Analysis Extract Files (S.A.F.E.) data to every district. These data files provide an opportunity to do some deep digging into subgroup performance on the state assessments.

In conversations with intermediate and local district leadership, I have heard some say that the initial foray into state assessment data seems “overwhelming.” Others claim, “No state assessment results should be a surprise.” And some who have used these data to implement effective improvement strategies expect to see improvement trends appear gradually, yet steadily over two or more years.

Given the density of state assessment data, where does subgroup analysis begin? A starting point, of course, is a look at the aggregate subgroup data; digging deeper is the more complex process. In many cases, and particularly for Focus schools, the IEP subgroup is performing significantly below others. So, one beginning strategy is unpacking these data by grade and content area.

For purposes of this discussion, the example will be students with IEPs and the third-grade MEAP reading assessment.

Look at historical trends: What stands out?

Looking at subgroup data in the aggregate provides an initial glimpse of performance; looking at these data over time adds more perspective. For example, specific to the subgroup under discussion, the following questions are just some that might be asked:

  • How has this subgroup performed in reading over the last 3-5 years?
  • Have there been significant changes over this period, and if so, what factors might account for the changes?
  • Has the total percentage of students with IEPs changed significantly over this period and, if so, why?
  • Has the ELA curriculum changed?
  • Has instruction changed?
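The first of the trend questions above can be answered with a simple calculation. Below is a minimal sketch using hypothetical yearly counts for the IEP subgroup; the figures (and years) are illustrative only, and real numbers would come from the district's S.A.F.E. data:

```python
# Hypothetical MEAP reading results for the third-grade IEP subgroup,
# by year. These numbers are invented for illustration.
yearly_results = {
    2009: {"tested": 42, "proficient": 12},
    2010: {"tested": 45, "proficient": 14},
    2011: {"tested": 40, "proficient": 15},
    2012: {"tested": 44, "proficient": 19},
}

def proficiency_trend(results):
    """Return (year, percent proficient, change vs. prior year) per year."""
    trend = []
    prev = None
    for year in sorted(results):
        pct = 100.0 * results[year]["proficient"] / results[year]["tested"]
        change = None if prev is None else round(pct - prev, 1)
        trend.append((year, round(pct, 1), change))
        prev = pct
    return trend

for year, pct, change in proficiency_trend(yearly_results):
    delta = "" if change is None else f" ({change:+.1f} pts vs. prior year)"
    print(f"{year}: {pct}% proficient{delta}")
```

Pairing the percent proficient with the subgroup's size each year also flags the question asked above: a swing in proficiency that coincides with a swing in the number of students with IEPs tested may say more about the group's composition than about instruction.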

This type of questioning can lead to better understanding of how school practices and processes are impacting performance for this subgroup. Data drive questions and help us see “what is.”

Look at the faces behind the data: Who is in this subgroup?

“…you need data, but you need to generate and use it in a way that makes the child come alive in the minds and actions of teachers.” Sharratt, L. and Fullan, M., Putting FACES on the Data. Corwin, 2012.

Going deeper means knowing which students are in this subgroup. For example, who are the third-grade students in the IEP (MEAP) group?
Seeing the faces and knowing individual performance help to better assess existing challenges: are they whole-group issues or individual ones?

When looking at these students’ performance on the MEAP in reading, are there uniform areas where this subgroup of third-grade students with IEPs has demonstrated proficiency or deficiency? Or is the proficiency/deficiency scattered among the students and more individualized? Answering these preliminary questions will lead to more detailed analyses of grade-level content and instruction as well as individualized and special education instruction issues.
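One way to tell whole-group patterns from individual ones is to tabulate each student's results by reading strand. A minimal sketch, assuming hypothetical per-strand results (the strand names and scores are invented for illustration):

```python
# Hypothetical MEAP reading strand results for the third-grade IEP subgroup.
# True means the student met the strand expectation; names are illustrative.
strand_results = {
    "S01": {"word_study": True,  "comprehension": False, "fluency": True},
    "S02": {"word_study": True,  "comprehension": False, "fluency": False},
    "S03": {"word_study": False, "comprehension": False, "fluency": True},
    "S04": {"word_study": True,  "comprehension": False, "fluency": True},
}

def strand_deficiency_rates(results):
    """Percent of students NOT meeting each strand.
    A rate near 100% suggests a whole-group (curriculum/instruction) issue;
    scattered low rates suggest individualized needs."""
    strands = next(iter(results.values())).keys()
    rates = {}
    for strand in strands:
        missed = sum(1 for scores in results.values() if not scores[strand])
        rates[strand] = round(100.0 * missed / len(results), 1)
    return rates

print(strand_deficiency_rates(strand_results))
```

In this invented example every student missed the comprehension strand, pointing to a grade-level instruction question, while the other strands look individualized.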


Unpack the data: More variables to consider.

“For the most part, the data will fall into the four categories of demographics, student learning, perceptions, and school processes.”
Bernhardt, Victoria L., Data, Data Everywhere: Bringing All the Data Together for Continuous School Improvement. Eye on Education, 2009.

Once the students are identified, another data element to review is the amount of time spent in general education classroom instruction (educational setting). Special education annual reporting includes three categories of time spent in general education settings: 80% or more; 40-79%; and less than 40%. These data are available in the aggregate at the district level (under Special Education Data Portraits); they will need to be disaggregated to the building, grade, and student levels.
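Mapping each student's reported time in general education into the three reporting bands, and then looking at proficiency within each band, is straightforward once student-level records are in hand. A sketch with hypothetical student records (IDs, percentages, and proficiency flags are invented for illustration):

```python
from collections import defaultdict

# Hypothetical records: (student id, % of day in general education,
# proficient on MEAP reading). Real records come from district data systems.
students = [
    ("S01", 95, True),
    ("S02", 85, False),
    ("S03", 60, True),
    ("S04", 55, False),
    ("S05", 30, False),
    ("S06", 20, False),
]

def setting_category(pct_general_ed):
    """Map percent of time in general education to the three reporting bands."""
    if pct_general_ed >= 80:
        return ">=80%"
    if pct_general_ed >= 40:
        return "40-79%"
    return "<40%"

# Tally students and proficient students within each band.
counts = defaultdict(lambda: {"n": 0, "proficient": 0})
for sid, pct, proficient in students:
    band = setting_category(pct)
    counts[band]["n"] += 1
    counts[band]["proficient"] += proficient

for band in (">=80%", "40-79%", "<40%"):
    c = counts[band]
    print(f"{band}: {c['proficient']}/{c['n']} proficient")
```

A band-by-band table like this is one way to surface the pattern questions that follow: whether proficiency tracks with time in the general education classroom.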

Education setting is an important variable when analyzing subgroup and individual performance; it can be linked to instructional strategy, instructional time, and curricular content and alignment. This may be of particular interest if general education and special education are not well coordinated, or if special education is not integrally incorporated into the general curricular content and standards.

Thus, time in the general education classroom (education setting) is a variable that warrants attention. Who is spending what amount of time in the general education classroom?

  • Are there patterns of proficiency that are tied to time in general education? In special education classrooms?
  • Are students being pulled from core reading instruction in the general education classroom for related services (Speech, OT, PT, etc.)? What is the impact of this practice?

Diving deeper into student-level data

Continuing to dive into student-level data relative to third-grade MEAP reading, a look at IEP goals is warranted. For those students with IEPs who are not achieving grade-level proficiency:

  • Which students have goals linked to reading?
  • Are the goals aligned with appropriate standards?
  • Are the instructional strategies that have been used proven and effective?
  • How often has student progress been monitored (and strategies adjusted as necessary)?
  • Are problem-solving processes used when addressing achievement issues?

Yet another variable to review is assessment accommodations on the IEP:

  • Have the accommodations been provided consistently in the classroom, as well as for state assessments?
  • Are those that have been used appropriate, based on individual student needs?
  • Do the accommodations used effectively support the student to demonstrate learning?

These are some of the key variables that should be reviewed as part of more comprehensive and multi-faceted data analyses. Meaningful improvement strategies are best selected when a true picture of data, faces, and IEPs has been revealed.

Disaggregating performance data to the student level puts faces on the data; this “reminds us that the numbers represent real children and young people striving to make the most of themselves as they prepare for an uncertain future.” Sir Michael Barber in the Foreword to Sharratt and Fullan’s Putting FACES on the Data. Corwin, 2012.

Data, Faces & IEPs: What some districts are doing now

In recent interviews with district leaders, the following comments were made in response to inquiries about the use of state assessment and other student performance data:

Inquiry: How have you approached subgroup analyses of state assessment data?

  • “We are really just starting to dig into subgroup issues relative to state assessments.”
  • “We have learned to look at historical data (trends) first, and to consider systemic issues. But we also know that we need to dig more deeply into student level (IEP) data to assess if we are aligned across our targets, strategies and essential content standards.”
  • “At the systems level, we have found we need to constantly improve and enhance our data systems so we can access the discrete data we need.”

Inquiry: What overall data analyses have you implemented?

  • “We have spent the last three years ramping up access to data and using progress monitoring data to tell us if what we are doing is actually working.”
  • “Across our ISD, we now hold formal Data Days for district leadership to support deep analyses and to train others how to lead teams in digging deeper into student performance data.”
  • “We are targeting not only state assessment data, but common assessments, benchmarking and universal screening data to inform our instruction and improvement strategies.”

Inquiry: What are the outcomes of good subgroup data analyses?

  • “Good data analysis helps us determine actionable targets for improvement. Without the analysis, we were just guessing.”
  • “Looking at subgroup performance helped us to identify some systemic issues; we totally revamped our math instruction for special education center programs.”
  • “We discovered that our progress monitoring reports did not provide disaggregation of students with IEPs, so we are adjusting the data input so we can take a look at this subgroup – progress monitoring is essential for students with IEPs.”
  • “We use assessment data to plan ‘rate of gain’ that needs to be targeted and to point us toward specific learning that needs to be addressed.”
  • “We have added specific talent to our team in order to do some statistical analysis and correlational studies. We are seeing that our AIMSweb progress monitoring data are predictive of MEAP results.”

State assessment data constitute only one data set, but an important one within the accountability framework under which Focus schools operate. Disaggregating performance data to the subgroup, grade, content, and student levels is a necessary function for continuous improvement. Examining the variables linked to school practices, instruction, and ultimately the performance of students with IEPs is a big task, but one that is critical to student outcomes as well as professional practice.

“Decades of research has shown us that the school factor that has the greatest impact on student achievement is classroom instruction. What happens between teachers and students in our nation’s classrooms has the greatest impact on how those students learn. This applies almost equally to students with and without disabilities. In fact, it is even more important for our students with disabilities.”
John O’Connor. Students with disabilities can meet accountability standards: A roadmap for school leaders. AASA, 2010.



For those who are interested, a DRAFT guide for getting started with IEP subgroup analysis (beginning with MEAP results) is available for review, comment and input.



