How to embrace AI without gambling on student outcomes

It’s clear the education system is struggling to implement AI in a meaningful way. Recent data from Gallup shows low satisfaction in public education and the 2024 Nation’s Report Card shows increasing concerns about student performance.

Eager for solutions, teachers are increasingly turning to generative AI in the classroom, and this strategy will surely bring new challenges of its own. Districts nationwide can no longer wait for federal direction on responsible AI use.

School leaders must now determine how to embrace AI while ensuring its use in schools is safe and fosters equitable outcomes. For guidance, they can reference the National Education Association, the Education Department’s recommendations, and guidelines or policies released by several states and other organizations.


The question remains: On a practical level, how can schools smartly integrate AI into classrooms without sacrificing student outcomes?

Schools need a quicker response

As a licensed attorney, I’ve spent 30 years working at the intersection of legal accuracy, predictive modeling and product development. Since 2019, I’ve applied this knowledge in education technology by considering policy and practical applications for safe, wise AI implementation that helps, not harms, educators and students.

The stakes are high. School leaders and educators already see the impact of generative AI in ChatGPT-written essays, presenting new challenges in assessing student knowledge. Given the rapid rise of AI, some argue that schools should pause all AI applications until a formal policy exists. While I can understand their concerns, formal policy creation takes time and schools need a quicker response.

Innovations will continue, with or without the active involvement of schools. Tech companies will keep expanding the use of AI in their new products with little oversight and few regulations. Because of this, they may prioritize profit margins over education standards and student outcomes.

Instead of pausing AI implementation in the classroom, schools can take cues from New York City Public Schools and directly engage with technology creation. By being directly involved, school districts can guide AI software innovations to meet their needs and desired student outcomes, all while protecting student privacy.

A well-crafted school policy can demonstrably influence edtech providers. While some industries have incentives that prioritize sales or stock prices over customer interests, the edtech industry’s incentives align with positive student outcomes.

This is because edtech providers must demonstrate high tiers of evidence under the Every Student Succeeds Act to appeal to the schools they serve. As a result, edtech companies are motivated to create excellent products that align with globally accepted best practices and lead to strong student learning outcomes.

In short, edtech companies stay competitive by striving to achieve high standards. By developing sound, responsible AI policy, schools can influence edtech innovation to directly benefit classroom learning.

4 benchmarks for AI in education

Several reputable resources are available to guide schools in drafting AI policy. In 2020, a global coalition of education stakeholders, myself included, formed the EDSAFE AI Alliance and created the SAFE Benchmarks Framework.

There are four essential benchmarks within this framework to achieve equitable outcomes for students:

  • The first policy consideration is safety for students and teachers. This benchmark addresses protecting user data and privacy by managing cybersecurity risk while allowing responsible innovation and learning.
  • Accountability protects student-teacher interactions by creating and updating standards that keep edtech companies accountable as their products evolve.
  • AI implementations must be fair, and tools must be accessible and transparent. Edtech providers must transparently communicate how they evaluate fairness and maintain a system to scrutinize the fairness of their data and the accessibility of their tools.
  • AI policy should also require companies to demonstrate efficacy, giving schools concrete evidence that the AI tools they purchase achieve their intended outcomes. Efficacy also considers whether the results are good, fair across all user demographics and equitably communicated. Because data tracking has limited ability to capture protected characteristics, and because algorithmic bias can be hard to detect without rigorous testing, demonstrating efficacy must include showing that products are fair and effective for all demographics.

In addition to these considerations, an effective policy will be supported by one or more certifying bodies that evaluate acceptable tools with transparent criteria. These bodies ensure edtech solutions comply with laws, regulations and best practices so schools can more easily make purchasing decisions.

Steering AI development

Beyond policy, school districts can foster positive outcomes for students by insisting on product validation. Schools should require vendors to provide evidence that their products function as claimed.

Overpromising and underdelivering product capabilities is a significant concern in the education industry. When schools rely on a student resource that falls short, the consequences directly impact learners. Thankfully, schools can mitigate these risks through AI policy and vendor certification lists.

AI tools are and will continue to be developed. As stakeholders, schools should influence how edtech tools take shape and impact our nation’s children.

Given the motivations of edtech providers, school districts can steer AI use within edtech software development by creating and adopting smart usage policies. School districts can further influence AI development by ensuring that any edtech products used in the district are proven effective in benefiting students and teachers.

By following these recommendations, school districts can leverage AI in education to accelerate learning in the classroom without gambling on student outcomes.

Jon Medin
Jon Medin is a licensed attorney who leads the research, psychometrics, education sciences, platform and innovation team at Renaissance, a global provider of edtech learning solutions and resources.
