By AMISA

A Six-Step Student Data Protocol for Analyzing Assessment Data

This article was originally published in BlogLogic.

Materials prepared by Dr. Kristy L. Beam, Faculty at University of North Georgia, Former Principal at American School of Valencia, Administrator and Independent Consultant

Assessment data provides a quantitative measure of our students’ performance. Digging deeper, however, assessment data also provides a wealth of information that becomes a road map for leading school improvement.

Dr. Beam, faculty at University of North Georgia as well as a former principal and assistant principal, shared with us her process for using data effectively in schools to initiate change, create a shared vision among faculty, and guide an improvement project, specifically focusing on mathematics.

Why Assessment Data?

Schools and districts often have limited resources, whether time, money, or personnel, and therefore must use those resources effectively. Student data plays an integral role in finding opportunities for meaningful, large-scale change, while also making the process manageable by providing focus.

To use data effectively, schools need a process in place for analyzing it. As we know, interpreting assessment data is cumbersome when it spans multiple assessments, subgroups, grades, subjects, and schools.

How do we translate data into a meaningful visual? How do we pick a point of focus? How do we make inferences from the numbers? How do we apply these findings towards an initiative? How do we measure the efficacy of the initiative?

The questions can go on and on. As Dr. Beam explains: “As school leaders, we have to create that vision for what we’re going to achieve, and we have to create that plan for how we’re going to achieve it and relay that vision, so everyone is on board…We have to have the evidence to support it.” In order to use data as evidence and communicate findings with staff, Dr. Beam recommends using student data protocols.

Analyzing Student Data with Student Data Protocols

Student data protocols are a series of steps for analyzing student data. They help us talk about data with teachers, which is important because data has the propensity to increase anxiety.

“The goal is to get to where you have open, transparent conversations…You want data to become a collective ‘This is our data. This represents us.’”

Dr. Beam joined the staff as Assistant Principal at a low performing urban public school with the goal of improving school performance. She relied on assessment data, as well as surveys and observations, to drive her improvement project.

Standardized testing is a waste of time and money if you are not using the data.

As we review the protocol shared by Dr. Beam, we also need to consider how student data protocols can help us not only engage with data but also foster school- or district-wide conversations in which data is a comfortable, self-reflective topic.

Dr. Beam’s process for analyzing student data and using data effectively in schools follows:

1. Choose a point of focus.

When she arrived at the school, Dr. Beam kept hearing about the fifth-grade class, which was significantly underperforming in math. She chose to focus on fifth grade because its performance was indicative of other factors, both internal and external; supporting those students through interventions would in turn impact the experience of students outside that grade.

2. Pull relevant data and state observations.

There are two keywords here: relevant and observations. Assessment data solely about fifth-grade math performance on a standardized test doesn’t paint a whole picture. Relevant student data includes a much larger set: a history of the assessment data; data from the curriculum, such as assessment methods used, depth of knowledge (DOK), and learning activities; and teacher surveys. Second, observations refer to statements like “I observe…”. Before jumping to conclusions about data, we must state the ‘what’. What are we seeing? Dr. Beam did both of these.

  1. Assessment data dating back to first grade showed that 15% of students had not met grade-level standards since then; in fourth grade, the number of students who didn’t meet grade-level standards doubled.

  2. Teacher surveys asked why teachers thought performance was so low. The responses largely blamed a lack of parent support, a lack of prior preparation, and a lack of student discipline. The focus was mostly on external factors.

  3. Discipline data revealed that boys were three times more likely to receive a discipline referral and three times more likely to have classroom interactions with teachers, yet they also had lower GPAs.

  4. Curriculum data exposed that assessment was largely summative, pencil-and-paper work.

Together, these findings pointed to the factors behind the fifth-grade class’s low performance.

3. Interpret the data.

After pulling the data and stating observations, Dr. Beam then moved to analyzing student data, asking “What does this data suggest?”.

  1. Assessment data: Poor performance is not isolated to the current fifth grade’s prior end-of-year test, because a high percentage of students have not been meeting grade-level expectations since first grade.

  2. Teacher surveys: Teachers were mostly focused on external factors, and while their answers may be true, it’s important to shift focus to what they, as a school, can control.

  3. Discipline data: There is a lack of equity in the school: boys are receiving more referrals, and girls are not called on as often in class.

  4. Curriculum data: There were no formative check-ins with students to gauge their comprehension. Students were also not engaging with mathematical concepts in ways that include problem-solving and critical thinking.

4. Determine the implications.

After interpreting the data, the next step is identifying the implications. “How will this impact our instructional strategies?”

Dr. Beam and her team determined that, to best use their resources, they would implement guided math to give students hands-on interactions in a student-centered learning environment. They would also work with teachers to diversify classroom assessments, so students could engage with mathematical concepts in rigorous ways, as well as improve equity in the classrooms.

5. Create a framework to initiate this change.

As a school, they recognized the need to train math teachers in guided math. They also recognized that they couldn’t simply ask teachers to develop entirely new curriculum and lessons, so they provided resources such as articulated lesson plans and supplies to implement guided math.

During implementation, Dr. Beam’s team conducted classroom observations and tallied the visible learning they saw, along with increases in rigor as measured by DOK. They also focused on equity, studying how teachers interacted with students, and had teachers report on the assessments they gave.

In addition, Dr. Beam’s team created data notebooks for students. In them, students set goals and reviewed the Big 20, the top 20 math skills. The notebooks personalized student learning and growth. Building on the notebooks, the school also initiated student-led conferences, where students reviewed the data notebook with their parents and shared their goals. In doing so, all stakeholders were involved in the process.

Student and school performance gains are achieved through regular reviews of results (achievement data and student work) followed by targeted adjustments to curriculum and instruction. Teachers become most effective when they seek feedback from students and their peers and use that feedback to adjust their approaches to design and teaching.

6. Track growth.

This step is often not included in traditional student data protocols, but we’d be remiss to ignore it. As the changes were implemented, Dr. Beam tracked the data. Teachers became more aware of how they interacted with students. The use of tools in math classrooms increased, and instruction shifted toward reasoning, problem-solving, and hands-on learning. The diversity of assessments also grew as teachers implemented more oral questions, exit tickets, and projects.

Dr. Beam then brought this observation data to professional learning communities (PLCs) and used it as fodder for conversations with grade- and subject-level teams as well as individual teachers. These conversations focused on what was happening in the classroom and served as an opportunity to review progress, so they could work collectively as a school to meet their goals. Data became about transparency and collaboration.

These feats were the building blocks of much larger change towards student-centered learning, collaborative teaching, and transparent school initiatives – all of which are examples of using data effectively in schools with student data protocols.

Bio: Kristy Beam has a diverse background in education as a teacher and administrator. After 15 years in a large public-school system in the US, she spent the last four as principal of a private, international school in Europe. Currently, she is working in curriculum development and has taken on a greater role with the University of North Georgia, where she has taught in the graduate education program for the last four years.
