How should we teach with AI? The feds have 7 fresh edtech ideas

"We must harness AI’s ability to sense and build upon learner strengths," Department of Education asserts.

Keeping humans at the center of edtech is the top suggestion in the federal government’s first stab at helping schools determine how they should teach with AI. With technology like ChatGPT advancing at lightning speed, the Department of Education is sharing ideas on the opportunities and risks of AI in teaching, learning, research, and assessment.

Enabling new forms of interaction between educators and students, and more effectively personalizing learning are among the potential benefits of AI, the agency says in its new report, “Artificial Intelligence and the Future of Teaching and Learning: Insights and Recommendations.” But the risks include a range of safety and privacy concerns and algorithmic bias.

Educators and policymakers should collaborate on the following principles:

  1. Emphasize humans-in-the-loop: Educators and students can remain firmly at the center of AI if users treat edtech like an electric bike rather than a robot vacuum. On an electric bike, humans are fully aware and fully in control, and their efforts are multiplied by technological enhancement. Robot vacuums complete their tasks with little human involvement or oversight beyond activating the device.
  2. Align AI models to a shared vision for education: The educational needs of students should be at the forefront of AI policies. “We especially call upon leaders to avoid romancing the magic of AI or only focusing on promising applications or outcomes, but instead to interrogate with a critical eye how AI-enabled systems and tools function in the educational environment,” the Department of Education says.
  3. Design AI using modern learning principles: The first wave of adaptive edtech incorporated important principles such as sequencing instruction and giving students feedback. However, these systems were often deficit-based, focusing on the student’s weakest areas. “We must harness AI’s ability to sense and build upon learner strengths,” the Department of Education asserts.
  4. Prioritize strengthening trust: There are concerns that AI will replace—rather than assist—teachers. Educators, students, and their families need to be supported as they build trust in edtech. Otherwise, lingering distrust of AI could distract from innovation in tech-enabled teaching and learning.
  5. Inform and involve educators: Another concern is that AI will lead to a loss of respect for educators and their skills just as the nation is experiencing teacher shortages and declining interest in the profession. To show teachers they are valued, districts must involve them in designing, developing, testing, improving, adopting, and managing AI-enabled edtech.
  6. Focus R&D on addressing context and enhancing trust and safety: Edtech developers should focus design efforts on “the long tail of learning variability” to ensure large populations of students will benefit from AI’s ability to customize learning.
  7. Develop education-specific guidelines and guardrails: Data privacy laws such as the Family Educational Rights and Privacy Act (FERPA), the Children’s Internet Protection Act (CIPA), and the Children’s Online Privacy Protection Act (COPPA) should be reviewed and updated in the context of advancing educational technology. The Individuals with Disabilities Education Act (IDEA) could also be reevaluated as new accessibility technologies emerge.



Matt Zalaznick
Matt Zalaznick is a lifelong journalist. Prior to writing for District Administration, he worked in daily news all over the country, from the NYC suburbs to the Rocky Mountains, Silicon Valley and the U.S. Virgin Islands. He's also in a band.
