AI in schools: Learning to love this transformative technology

"For many in the education world, artificial intelligence is a demon unleashed ... for others, it’s a panacea for education’s myriad challenges," FutureEd think tank writes.

The hype, the hand-wringing and the excitement over AI in schools are overwhelming many district leaders and their teams as they try to figure out how the technology fits into education. Among the latest group of experts to offer some actionable guidance is the FutureEd think tank at Georgetown University.

“For many in the education world, artificial intelligence is a demon unleashed, one that will allow students to cheat with impunity and potentially replace the jobs of educators,” writes FutureEd editor Alina Tugend in the report, “Navigating the Artificial Intelligence Revolution in Schools.” “For others, it’s a panacea for education’s myriad challenges.”

One thing is clear: AI’s impact on schools and learning is only going to grow as learning platforms become more adaptive, teachers work to further personalize instruction and leaders seek more administrative efficiencies in running their districts.

Tugend offers this snapshot of how fast the technology is advancing: AIs of the past consistently failed the SAT, Advanced Placement exams and other standardized tests. But ChatGPT has steadily improved its performance on college GREs and LSATs, according to researchers at The Wharton School of the University of Pennsylvania.

AI in schools: Guidance for leaders

Most educators are now focusing on further integrating ChatGPT and other bots and AI systems as last year’s flood of K12 bans recedes, Tugend notes. She suggests that district and school leaders ask the following questions:

  • Should AI be a required skill for students?
  • How do we choose what companies or programs to invest in?
  • What can schools do about students cheating with AI?
  • What are the privacy risks?

To find the answers—and to help school communities get the most out of AI—Tugend encourages superintendents and their teams to tackle three big concepts:

1. Try out the technology

It sounds obvious—most educators will need to test out generative AI before they can fully embrace it. Administrators should give teachers and other staff time and space to play with ChatGPT and other tools without fear of making mistakes, Tugend advises. She recommends test-driving a tool such as Stable Diffusion, which can create images based on students’ prompts.


CIOs and other district technology leaders can also make themselves available to help teachers experiment with AI, Tugend adds.

2. Does our district need a cheating policy?

Cheating is, of course, one of the big negatives around AI in schools. Tugend urges teachers to change their perspectives on cheating. One district tech leader says teachers are now asking students who use ChatGPT to share their whole conversation with the bot to shed light on the student’s thought process.

Teachers elsewhere are now assigning in class the work that used to be sent home, which keeps students from leaning on AI to complete it.

To formalize how students and teachers use AI—and to address privacy concerns—district leaders should hire a chief technology officer (or task their current one) to coordinate any new policies and procedures.

3. Ensuring equity, eliminating bias

“Some hope that AI will help bridge the equity gap in K12 education; others fear it will widen it,” Tugend notes. Free tools, such as the original ChatGPT, make artificial intelligence more accessible.

But the more powerful version of ChatGPT costs $20 per month, and more affluent districts will have more resources to help teachers and students use AI productively, she adds. Experts see a new digital divide opening between students who are taught to use AI and those who aren’t.

This “readiness checklist” created by the Council of the Great City Schools and The Consortium for School Networking will help districts develop policies that prevent AI from perpetuating biases. The guide also counsels schools to require that AI vendors have discrimination protections in place.

Matt Zalaznick
Matt Zalaznick is a life-long journalist. Prior to writing for District Administration he worked in daily news all over the country, from the NYC suburbs to the Rocky Mountains, Silicon Valley and the U.S. Virgin Islands. He's also in a band.