Smarter tech spending for schools

Measuring technology's ROI requires vision, strategic planning, and even some software

School district administrators hate it when they stumble across closets full of unused computing devices or visit classrooms where interactive whiteboards are being used as very expensive chalkboards.

Technology leaders at Dysart USD in the Phoenix suburbs had real-world experience with this phenomenon and were determined to make changes, says Diana Hawari, chief information officer. “Buying 1,000 laptops and dividing them between so many schools might not be the best idea if we don’t have a great idea how to use them,” Hawari says.

In the 2013-14 school year, Hawari and Michelle Benham, director of innovation in teaching and learning, crafted a technology innovation questionnaire tied to each school’s continuous improvement plan.

Teachers and administrators must describe how a purchase will enhance the larger strategic goals of personalized learning, digital safety, and communication and empowerment for students and staff.

Every technology purchase goes through this process and must be approved by Hawari and Benham. “Principals and teachers now have to take a step back and say how this purchase aligns to the larger goals that we set for ourselves each year,” Benham says.

Initially, the process was not popular because principals felt it slowed them down, she admits. “They do recognize, however, that they are better able to implement things, and are guided toward better purchasing decisions.”

Edtech purchasing tool

Dysart USD’s Technology Innovation Plan asks teachers and principals some open-ended questions to get them thinking about their technology purchase requests, including:

How do you plan to use technology in support of the strategic plan and your continuous improvement plan?

Please describe the professional development required to implement this plan.

How will technology resources be secured, inventoried and managed?

How will technology resources be distributed to ensure equity and access for learning and assessment?

How will you evaluate the implementation of your technology plan and design next steps?

So, who’s using the technology?

Administrators now strive to align strong technology plans with district strategic goals. Avoca School District 37, just north of Chicago, wants to support high-performing professional learning communities.

As a first step, one school is using a Google+ Community to share tech implementation ideas teachers are trying, Superintendent Kevin Jauch says. Meanwhile, the district is implementing a technology innovation plan with several teams to monitor investments and deployments.

For example, the district has been assigning MacBook Airs to students in grades 6 through 8. Now, a Research and Device team assesses the refresh strategy.

“We are evaluating if we need such a robust machine or whether we can continue to excel with a less-expensive device such as a Chromebook or a tablet,” Jauch says. “If we can realize savings by refreshing with a less-expensive device, can we reinvest some of that savings into devices at our lower grades or into peripherals?”

A newly created Technology Innovation Leadership Team will develop a system for identifying innovative initiatives. The team will conduct pilot programs to evaluate whether new ideas and products can be scaled across schools.

Members of the technology teams have been heavily involved in recent curriculum adoptions, including identifying appropriate online resources for mathematics and differentiation.

“The curriculum development team identifies the need and perhaps the resource, and the technology team checks to see how the proposed resource fits within our technology environment,” Jauch says.

In 2017, West Morris Regional High School District in New Jersey turned to a technology solution to measure ROI. A software product called Paperbasket tracks active usage of applications on student devices and gives administrators dashboard views of the results.

The software revealed, for instance, that students were not regularly using the plethora of research databases the district had made available.

“We can go back and look at whether we should be subscribed to eight different databases, or narrow that down,” says Sean Beavers, district technology coordinator. “We have implemented a lot of technology tools, and while we have a clear picture about usage for some of those tools, we don’t for others.”

The district can also set usage parameters when, for example, teachers are offered three tools for formative assessment and three more for media creation. “We can say we are paying $27,000 for this application, we have this many licenses, and here is how much usage we would like to see on a weekly, monthly or yearly basis,” he says.
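The license-versus-usage review described above can be sketched in a few lines of code: compare each subscription's active users against the district's target and compute cost per active user. The app names, dollar figures, and thresholds below are invented placeholders for illustration, not district data or output from the Paperbasket product:

```python
# Hypothetical sketch: flag subscriptions whose weekly active usage
# falls below the district's target, and compute cost per active user.
# All names and numbers are illustrative.

APPS = [
    # (name, annual_cost, licenses, weekly_active_users, target_weekly_users)
    ("Formative Tool A", 27000, 3000, 2400, 2250),
    ("Media Creator B",  12000, 1500,  400, 1125),
    ("Database C",        8000, 3000,  150,  600),
]

def review(apps):
    """Return (name, active, target, cost_per_active_user) for underused apps."""
    flagged = []
    for name, cost, licenses, active, target in apps:
        cost_per_active = cost / active if active else float("inf")
        if active < target:
            flagged.append((name, active, target, round(cost_per_active, 2)))
    return flagged

for name, active, target, cpa in review(APPS):
    print(f"{name}: {active} weekly users vs. target {target} (${cpa} per active user)")
```

A report like this makes the renewal conversation concrete: an app that misses its usage target for a semester becomes a candidate for consolidation or non-renewal.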

Getting more specific data

The Toms River Regional School District, also in New Jersey, has sought grant programs that have a data analysis component, Assistant Superintendent Marc Natanagara says.

“We don’t often have an incentive to create and analyze data we are not already collecting,” Natanagara says. “The data collected tends to be the high-stakes tests and grades, but never about a particular learning practice, piece of technology or initiative. That is what we are trying to do more of.”

In one grant project, furniture vendor Steelcase sponsored a literacy makerspace, paying to equip a room with furniture, technology and lighting. It is a flexible space that students can rearrange.

“For the next few years, we have two teachers committed to gathering qualitative and quantitative data on how student practices are changing based on the arrangement and use of this room,” Natanagara says.

Toms River also won a $100,000 state grant to reduce summer learning loss in its Title I middle schools. It developed a maker camp, with a different theme each week and a lot of integrated technology.

The stated goal was to establish a baseline of students' understanding of concepts at the end of one school year, and to have them retain that level when tested at the start of the next.

“In most cases we actually beat that,” he says. “Not only were we able to stem learning loss, but in some cases the students made other advances.”

Skills that are hard to measure

For some administrators, measuring impact remains a tough sell, says Keith Krueger, CEO of CoSN, the Consortium for School Networking.

“I am not sure people always see the upside of being measured,” Krueger says. “Some superintendents on our advisory board tell us that their community has already bought in to the idea of investing in technology, so they don’t feel a strong need to demonstrate value.”

CoSN has developed SmartIT tools administrators can use to measure cost of ownership and impact. A workbook leads administrators through the process of assigning relative importance to projects, stating anticipated benefits in measurable terms, and deriving a total qualitative benefit score.
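The scoring step in such a workbook can be illustrated with a minimal sketch: each anticipated benefit gets an importance weight and a rating, and the total is the weighted sum. The benefits, weights, and ratings below are invented for illustration and are not taken from CoSN's SmartIT materials:

```python
# Minimal sketch of a weighted qualitative benefit score.
# Each anticipated benefit gets an importance weight and a 1-5 rating;
# the project's total is the weighted sum. All entries are hypothetical.

def benefit_score(items):
    """items: list of (benefit_description, importance_weight, rating_1_to_5)."""
    return sum(weight * rating for _, weight, rating in items)

project = [
    ("Improves reading fluency",     5, 4),
    ("Increases student engagement", 3, 3),
    ("Reduces teacher prep time",    2, 5),
]

print(benefit_score(project))  # 5*4 + 3*3 + 2*5 = 39
```

Scoring every proposed project on the same scale lets administrators rank competing requests against one another rather than approving each in isolation.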

The effort can be challenging because of the intangibility of some learning benefits. For instance, many technology projects are framed around student engagement, but how do you measure that? Krueger suggests that administrators could look at student attendance or do a satisfaction survey of parents.

“It is easy to measure test skills or graduation rates, but some of the things we are trying to do involve new skills, such as helping students become more collaborative or critical thinkers,” he says.

Still, the strategic tech planning appears to be worth the effort. Benham, the director of innovation in teaching and learning, says Dysart USD’s Technology Innovation Plan has produced noticeable improvements, including a sharper focus on learning goals rather than on the technology itself.
