District leaders are turning to artificial intelligence to fill gaps in mental health care and other student-support services. But educators should know not all AI is built for those tasks, says one provider.
In August, District Administration reported on research finding that ChatGPT is likely to assist users in harming themselves. That’s because large language models like ChatGPT don’t serve one specific purpose. Instead, they aim to answer nearly any question a user may have, within certain limits.
Sean Wheelock, co-founder and CEO of Inner Peak AI, says his tool is purpose-built to help students develop well-rounded social-emotional skills.
“We have fun gamifications, exercises and activities related to specific goals that students set for themselves.”
Addressing gaps
Young people have taken their own lives after confiding emotional distress to what Wheelock calls “non-purpose-built AI.” K12 leaders should be aware that these generic chatbots may not address the problems their districts are trying to solve.
“We have created a safe space where we don’t actually allow the AI to give any advice at all,” he says, reflecting on those tragic cases. “The AI is just there as a validating, supportive, active listener that asks questions.”
If a student shows signs of suicidal ideation while interacting with Inner Peak AI, a notification is sent to the school and the student’s emergency contact. The tool also restricts certain topics, such as violence, Wheelock says.
K12 students most commonly ask the AI for relationship advice, both romantic and non-romantic. Kids are struggling to understand how to have difficult conversations with people, he notes.
“We see students coming to the app, saying, ‘I’m in a fight with a friend,’ or ‘I’m unhappy with a romantic partner,’ or ‘I’m in an argument with my parents,’” he explains.
Kids are also showing signs of general stress, including school-related issues like test anxiety.
Inner Peak AI provides insights about the general health and wellness of students but guards data privacy. The only information schools receive is aggregated and anonymized data about their student population, which helps leaders understand the “pulse” of their student body, he adds.
“We never sell or share any student data with third parties,” he says. “We also don’t share individual-level information, even with the school, unless it is a crisis situation.”
AI is not a replacement for human connection
Since the introduction of ChatGPT in November 2022, experts have repeatedly assured educators that AI will never replace the human connection that students need during instruction. That same connection is critical for mental health counseling, Wheelock says.
“First of all, if a student is in a crisis situation, they should not be talking to an AI,” he explains. “More broadly, we see that technology, and AI in particular, has an exciting role to play in making health support more personalized, accessible and affordable to people.”
Meanwhile, some school districts haven’t enacted AI policies, according to the latest research. Wheelock says he hopes educators will trust purpose-built tools that serve leaders’ missions to develop the “whole child.”
“There are groups that are very fearful and groups that are very optimistic,” he says. “I think the right place to be is somewhere in the middle. I hope that the people who are afraid are still open enough to the possibilities of how AI can be used safely, because there are a lot of easy things that you can implement to make AI a whole lot safer.”