It’s early days for AI. Here’s what we’ve learned

Teachers and students need time to dive into AI tools and start trying them out, with some guidance to encourage persistence.
Kristen DiCerbo
Kristen DiCerbo is the chief learning officer at Khan Academy, a nonprofit educational organization, where she sets the teaching and learning strategy for the company and ensures a consistent research-based pedagogical approach across its offerings. Kristen leads the content, product management, design, and customer support teams, and most recently led the launch of Khanmigo, an AI-based tutor and teacher assistant pilot. Kristen was included in Time Magazine's AI 100 2024 list of people shaping the future of AI and was named one of 10 innovative executive women in Chief's New Era of Leadership awards.

Do you remember the first time you heard the phrase “ChatGPT”? November marks two years since its launch, when the conversation about AI in education started ramping up.

At Khan Academy, we’ve been piloting Khanmigo, an AI tutor and teaching assistant. Our work includes classroom observations, interviews with thousands of administrators, teachers and students, and early data analysis. We are optimistic about our early findings and committed to the important work ahead as we explore this new technology. Here are some of our learnings.

If bringing AI into the classroom is a marathon, we’re 250 yards in

Soon after the launch of ChatGPT, many districts banned its use while they took time to evaluate the new technology, largely because of concerns about cheating. Most of those districts have since rescinded the bans and are working to define AI policies. As the possibility of using AI tools opens up, the most common questions are:

  • What is AI?
  • What can AI tools do for me?
  • Will AI take over the jobs of teachers?
  • How do we address student cheating?

The questions that educators ask are broad and correctly indicate we are in the very early days of the AI era. If a district or its teachers haven’t started using AI yet, they should not feel behind. A helpful resource for district leaders is the AI Guidance for Schools Toolkit from the nonprofit TeachAI.org. The toolkit contains resources to educate staff and stakeholders on AI, suggestions for how to revise existing policies, and letters for parents, staff and students.

Interacting with AI doesn’t just “come naturally”

Often, the first time someone asks an AI tool to do something, the response is not what the person wants. Understandably, their first reaction may be to say, “Well, AI can’t help.” But when AI is carefully adapted to a classroom environment, we think it has enormous potential to help.

Both teachers and students need time to dive into AI tools and start trying them out, with some guidance that offers suggestions and encourages persistence so they can see the benefits.


Transcripts of student chats reveal some terrific tutoring interactions. But there are also many cases where students give one- and two-word responses or just type “idk,” shorthand for “I don’t know.” They are not yet interacting with the AI in a meaningful way. There are two potential explanations: 1) students are not good at formulating questions or articulating what they don’t understand, or 2) students are taking the easy way out and need more motivation to engage.

When we talk to teachers about this, they suggest that both explanations are probably true. As a result, we launched a feature that suggests responses to students to model what a good response looks like. We find that some students love this and some do not. We need a more personal approach to supporting students in having better interactions, one that depends on their skills and motivation.

Understand how to get students unstuck

The biggest criticism of ChatGPT when it was released was that it was a cheating tool. We knew creating an effective AI tool for the classroom meant it couldn’t just give students the answer—the tool had to help students get unstuck. To be clear, being stuck isn’t a bad thing—students struggle when they’re working on material that’s at the edge of what they know. AI tutoring tools must provide just enough support to help them be successful.

After each response our AI tool gives, users can give the response a thumbs up or thumbs down and also leave written feedback. We expected this mechanism would mostly collect negative feedback, since people are more likely to complain when something goes wrong. Instead, more than two-thirds of the feedback is positive. In addition, our analysis of chat transcripts reveals that, overall, the AI can make tutoring moves, such as setting goal statements and making course corrections, that previous research tells us good tutors make.

We also sought to find the right balance between asking students questions and giving them hints and support. Some students were frustrated that our AI tool kept asking questions they didn’t know how to answer. If AI is to meet the promise of personalization, the technology needs to be aware of what the student currently knows and what they are struggling with so it can adjust the amount and type of support it provides.

A surprising number of people value AI’s language capabilities

Many students in historically under-resourced communities come from families where English is not the first language. In interviews, students and their teachers have told us how helpful it is to have a resource where they can ask questions in Spanish, for example, to get clarification on something the teacher said or a problem they don’t understand.

This wasn’t one of the primary uses we had considered at the outset of our pilot, but students have shown us how valuable it is. In response, we now offer support in Spanish and Portuguese and plan to add more languages soon.

Safety and security need continual attention

Our AI tool launched with a number of safety and security features. Student conversations are sent through a moderation system that gives the interaction a score for violence, hate, self-harm, inappropriate content and other concerning issues. If the score exceeds a threshold, the teacher is notified and can review the conversation.

As with any automated system, this one needs tuning so that it identifies egregious issues without needlessly flagging innocuous conversations. There are challenges when students wish to discuss, for example, the ending of Romeo and Juliet or debate the death penalty. These topics get flagged, but teachers view them as acceptable uses of the system.
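For readers who want a concrete picture of how this kind of threshold-based flagging can work, here is a minimal, hypothetical sketch in Python. The category names, scores, threshold value, and notify_teacher helper are illustrative assumptions for this example only, not Khanmigo’s actual implementation.

```python
# Hypothetical sketch of threshold-based moderation flagging.
# Categories, scores, threshold, and notify_teacher are illustrative assumptions.

MODERATION_THRESHOLD = 0.8  # assumed cutoff; real systems tune this per category


def review_message(category_scores: dict[str, float], notify_teacher) -> bool:
    """Flag a student message if any moderation category score exceeds the threshold."""
    flagged = {cat: score for cat, score in category_scores.items()
               if score > MODERATION_THRESHOLD}
    if flagged:
        notify_teacher(flagged)  # the teacher can then review the full conversation
        return True
    return False


# Example: a discussion of the ending of Romeo and Juliet might score high on
# "violence" even though teachers consider it an acceptable use, which is why
# the threshold and per-category tuning need ongoing attention.
review_message({"violence": 0.9, "hate": 0.1, "self-harm": 0.6},
               notify_teacher=lambda flags: print("Teacher notified:", flags))
```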

One tip for district administrators

District leaders have jam-packed days. So it makes perfect sense that some leaders have shared with us, “My teachers and students are ahead of me in trying AI tools. I’ve tried ChatGPT a little, but I don’t know what else is out there.” This is understandable, and you are not alone.

To understand the power of AI and envision what it might or might not do, we recommend establishing a working group. The group can play with AI tools—especially those designed by educators for educators—and make recommendations.
