Listening In

Experts discuss the world of Data-Driven Decision Making.

In a recent Web seminar about Data-Driven Decision Making, three experts in the field - Douglas Reeves, CEO and founder of the Center for Performance Assessment; Stephen C. Jones, superintendent of Norfolk (Va.) Public Schools; and Howard Woodard, the chief information officer for the Georgia Department of Education - answered your questions on this topic. Here's a sample of what was covered. (To view the entire presentation, visit www.districtadministration.com/webinars.)

What sort of data does Norfolk gather on a monthly basis and what are the sources of this data?

Stephen Jones: We've aligned our curriculum in such a way that it reflects the standards associated with the assessment component here in Virginia. So with that alignment, those quarterly assessments are designed to replicate what those state assessments are discerning on the part of students when they take those assessments.

For example, we try to have those monthly assessments resemble the standardized tests in format as well as in the complexity of the questions. We try to build in distracters like those associated with the state tests, so that they're close to the correct answers, because we know that those test-taking strategies have to be incorporated into the assessments.

The monthly assessments are most effective when teachers are able to get together as a team to analyze the assessments and the results of the assessments to determine the strengths and the weaknesses of the students.

There is debate about whether teachers should be able to run their own reports or if that task better falls to a data manager. Is there a black-and-white answer?

Stephen Jones: I think with the type of training that we've given our principals and then the training that our teachers have received, and the fact that the technology allows for an immediate turnaround, it should not be a laborious process for teachers. It's a matter of getting the data and then being able to sit and analyze the data in a disaggregated fashion.

Doug Reeves: I really believe that Dr. Jones is on to something important here. Creating all this capacity but then only putting it in the hands of the datameister is never going to get ownership at the teacher level.

How have you found the time and training for teachers to learn about data analysis?

Stephen Jones: Well, that's a relatively easy question. I mean, time is the one variable that for some reason we try to hold constant when we talk about student performance, but for the adults, in terms of staff development and professional development, we try to be as flexible as we can. And we obviously have not been able to spend all of the money that we would like on professional development.

But we've recognized, and our Board of Education has recognized, the importance of this and that's one of the reasons we're utilizing the services of Dr. Reeves and his colleagues. We're very, very liberal in how we allow our teachers and administrators to attend professional conferences that are tied to these kinds of pedagogical questions that we're talking about. I was just reminded that we have early release days that are built into our calendar for professional and staff development. So we find that this is a real investment that pays tremendous dividends.

What should educators be telling their boards in public about data-driven decision making?

Doug Reeves: Norfolk is a good example of leadership in that area. Here's the ethical principle involved. No child you serve is going to be more accountable than the adults. No teacher is going to be more accountable than the administrators. And no administrator is going to be more accountable than the board is.

So, when you look at accountability indicators, in this district, for example, it starts with the Board of Education having established accountability indicators for themselves and the accountability system doesn't just include a list of test scores.

Howard Woodard: We also do this at the state board level. One of the things we are beginning to implement now is the superintendent's dashboard for strategic planning, which we as managers use on a weekly basis in cabinet. We've basically made that available to our state Board of Education members, and they look at certain key ratios. Then periodically, every 60 or 90 days, we have sessions where we all sort of go out to dinner and talk about certain pieces of information and certain key aspects of what we're trying to do policy-wise.

Are these reports that districts create available and given to students and their parents?

Stephen Jones: It's very important that we market and put into a proper context the work that we're doing in schools, given that Norfolk, like most urban districts across the country, is a dependent district in terms of funding. It's critically important for us to constantly make our case, to show that education is a wonderful investment and not just an expenditure.

So we explain the data-driven decision making to our public and we shy away from all of the terminology. We explain what the achievement gap is and why it's so important for us to close those gaps and we've got the numbers that show we're doing it.

We try to stay on message by including our goals, our desires and our work in all speeches, newsletters and PTA meetings.

How can administrators collect classroom data in a way that is not threatening to teachers but informative about powerful instructional strategies happening, or not happening, in a school?

Doug Reeves: You have got to start with acknowledging that data doesn't have an emotional valence to it. It is not positive or negative. It is simply data. It's kind of like gravity. It's just out there. And so the way that you take the emotionality out of it is to be utterly objective.

Now, the problem is that a lot of observations of classroom practice are not objective. And if you don't think that I'm right about this, I would challenge you at your next faculty meeting (or next administrator meeting) to simply ask the question: What's good reading instruction? Have people anonymously write down characteristics of good reading instruction on a piece of paper, collect them, and see if they're consistent or wildly different.

If the leadership in your district isn't consistent about something as basic and essential as what's good reading instruction, don't be surprised if teachers feel that they're getting mixed messages.

Conversely, we can be very objective in saying, you know what? Writing in social studies works. So I'm not going to sucker punch you. There's not going to be any gotchas. When I walk into a social studies class, one of the things I'm going to do is look in the assessment file and ask what percentage of assessments included student writing. It's utterly objective. Ten people can ask the same question. All 10 get the same answer. That's the kind of thing that we need to do.

I can't say that it won't be threatening, since frankly asking anybody to change professional practice does imply some risk taking. But what I can tell you is that if you're on a treasure hunt, not a witch hunt, then you're going to be out there finding great practice, and your mission as leaders is to catch teachers doing something right, and then document and replicate best practices.
