ChatGPT has heightened concerns about cheating in K12, and now the question of whether the AI tool has harmed its own customers appears to be at the center of a federal investigation launched just this week.
OpenAI, the creator of ChatGPT, is being investigated by the Federal Trade Commission over its data collection practices and “publication of false information about individuals,” The New York Times reported. The Washington Post called the probe “the most potent regulatory threat” to the use of the chatbot and the company’s operation in the U.S.
As most administrators know, the emergence and rapid advancement of ChatGPT have educators taking ever more aggressive steps to determine whether students are using the AI to complete essays and other assignments. And there’s little question about its popularity: nearly 60% of students aged 12-18 have used ChatGPT, according to a poll conducted at the end of the school year. About two-thirds of parents want their school district to invest in AI to improve learning, that survey also found.
Some four in 10 teachers surveyed said they use ChatGPT at least once a week. And in yet another survey, nine in 10 K12 and college students said they’d rather use ChatGPT than a tutor when they need help studying.
Districts are increasingly banning ChatGPT and other AI platforms, while some school systems have responded with less certainty. For instance, New York City schools this spring allowed educators to use ChatGPT just a few months after district leaders had initially banned the software.
In a letter to the company obtained by The Washington Post, the FTC has asked OpenAI to detail its strategies for preventing risk and any complaints it has received about its AI tools generating “false, misleading, disparaging or harmful” statements about people.
Meanwhile, a growing number of edtech companies are leaning into the technology. Educational support service provider Chegg just released CheggMate, a study aide powered by GPT-4, the most up-to-date version of OpenAI’s software.