Florence 1 Schools is a mixed suburban and rural district. We have a 70% poverty rate, and all of our schools receive Title I funding. We face many of the challenges that high-poverty districts around the country face, and some of those challenges are compounded by the fact that schools in South Carolina are poorly funded and supported, with language in our state constitution affording students a “minimally adequate” education.
Like many schools, we believe in using data to inform instruction. Recently, we began using a new kind of data—cognitive data—that has helped change our approach to data-informed instruction more generally. Here’s how it’s working.
Moving from ‘data-rich’ to ‘data-literate’
Like many school districts across the country, ours was data-rich, but we did not feel we were synthesizing all of our data sources to make the best decisions about instruction, curriculum, and professional development. We also recognized that there were additional sources of data about the learner that were not traditionally considered in the decision-making process.
In the past, most of our professional development had focused on student achievement data, in large part because there is so much pressure on educators to raise student achievement. We would use benchmarks, unit assessments or any other formative assessment tools we had, along with analyses of state assessments. We would look at trends by grade, class, teacher or a variety of other demographics. We would try to individualize training for specific curricular needs or instructional strategies if it was warranted by the trends we saw, but it was generally driven by achievement data and little else.
Recently, we partnered with Mindprint Learning to collect cognitive data on each student at two of our middle schools so we could create a learner profile for each of them. Generated by an hour-long assessment, that data focuses on how students learn rather than what they have learned. Together with achievement data and social and behavioral data that focus on things such as self-awareness, self-management, engagement, and motivation, this new data source allows us to make better-informed decisions to improve student outcomes, including achievement.
This data gave us an individualized look at each learner and how they learn. A learner profile includes an analysis of the learner’s cognitive skills, such as visual motor speed, visual memory, attention, verbal reasoning, abstract reasoning, verbal memory, flexible thinking, working memory, and spatial perception. For the cost of a single hour of one school day to administer the assessment, the comprehensive cognitive data we collected and analyzed yielded remarkably insightful information about our learners. Where we were once data-rich, we’re now data-literate.
We have not mastered it yet, but this shift is driving new practices in our professional learning communities. Instead of working to personalize learning based on achievement, we are working to personalize it based on learners. When we talk about differentiation strategies, it isn’t just a buzzword, because we can truly differentiate based on how students cognitively approach learning. Sometimes our PLCs will break into smaller groups of two to four teachers who all have groups of students struggling with a particular cognitive skill.
Cognitive data in practice
When we began looking at cognitive data, one of the first things we noticed was that, across the classes involved, a high percentage of students needed support in the areas of verbal memory and visual memory. The finding was glaring, but it also explained why so many teachers (and even parents) were frustrated: they spent so much time explaining an activity, only to have students blurt out, “What are we supposed to do?” when it came time to get to work.
Explaining an activity may be a common practice, but that doesn’t mean it will be effective if 60% of students struggle with verbal memory. Rather than avoid classroom practices that require students to use their verbal and visual memory, our teachers learned how to help students develop those skills.
We also found that, as a whole, the students we assessed had strong abstract reasoning skills, but they struggled with retention. In response, we built some universal classroom routines to better support retention, including a daily spiral review. Previously, teachers might have moved from module one to module two to module three. This meant that at the end of the year, they would give a summative assessment where 30% of what students were tested on was from the first module, which they hadn’t thought about in months.
Without knowing that our students struggled with retention, we were setting them up to lose their math knowledge almost as soon as they had it. Once we were aware of the challenge, a spiral review was an easy enough solution to help students maintain and build upon their math knowledge day by day and throughout the year.
Another learning skill that we found many of our students struggled with was executive function. To help them develop these skills, teachers introduced paper planners for students to use and began devoting part of the day to working on skills like note-taking, planning, prioritizing and keeping track of things on a calendar.
Results and next steps
Feedback from building leadership has been encouraging. Our principals are telling us that they feel they suddenly have access to data that is truly informative and actionable. Now they understand why a student is frustrated and what kinds of accomplishments to celebrate.
A few teachers reached out directly to district leadership to thank us for providing these new data tools. For a teacher, it can be deeply discouraging to be held accountable for student outcomes with no information about how to actually improve those outcomes. Once teachers had some insight into what makes a student successful and how to better support each student’s personal approach to learning, they became much more energized.
To truly measure success, we’ll be looking at a range of data. We’ll look at the easiest measures, such as unit assessments, module assessments, formative assessments, benchmarks, and the end-of-year state assessments. We also plan to look at other measures such as student engagement. Are there any students who were previously checked out but are now having light bulb moments in class? Are classroom disruptions and discipline going down? During classroom observations, do we see teachers acting more as facilitators? Are they working to address students’ skills and help them approach the content in ways that account for their individual strengths and challenges? Is teacher satisfaction improving?
These are all significant indicators of school culture, and we anticipate they will improve if cognitive data is used faithfully and appropriately.