7 ways edtech providers can make AI in education safer and more efficient for students

New guidance specific to edtech vendors identifies exactly how they should incorporate AI technology into their products in a way that's ethical, efficient, and, most importantly, safe for students.

Artificial intelligence, a popular buzzword in the K12 space, has faced continuous public scrutiny since it burst into the mainstream with ChatGPT nearly a year ago. Now, companies leveraging the technology, including those in the edtech sector, are being asked to vet their own products for the safety of their users.

Just yesterday, the Biden Administration issued an executive order outlining new rules and safeguards for companies working with AI. For similar reasons, organizations are calling on edtech providers to ensure their products are created with the well-being of students and district communities in mind.

Last week, the Software and Information Industry Association released its seven “Principles for AI in Education,” a guide for edtech vendors on how they should incorporate AI technology into their products in a way that’s ethical and efficient.

“From tutoring and test preparation to assessing learner performance to relatively simple tasks like checking the spelling and grammar of a document, AI technologies are and can have great impact on teaching and learning,” the guide reads. “Because of this and in order to realize AI’s promise, stakeholders must address and mitigate risks attendant to these technologies.”


SIIA advises edtech companies to adhere to the following seven principles:

  • AI in education should address the needs of learners, educators and families.
  • AI technology must consider educational equity, inclusion and civil rights as important components of successful learning environments.
  • The technology must protect student privacy and data.
  • Products should aim for transparency so that schools and communities can effectively understand and engage with these tools.
  • Edtech companies should work with educational institutions and key stakeholders to outline the opportunities and risks of new AI technologies.
  • Developers should adhere to best practices for accountability, assurance and ethics in order to mitigate risks and achieve the goals outlined in these principles.
  • The edtech sector should engage with the greater education community to identify ways to support and communicate AI literacy for those who use these tools.

“With AI being used by many teachers and educational institutions, we determined it was critical to work with the education technology industry to develop a set of principles to guide the future development and deployment of these innovative technologies,” SIIA President Chris Mohr said in a statement.
