FETC Voices in Technology: Monitoring for signs of harm

Charlie Jimmerson is director of technology at Marshall County Public Schools in Alabama.

A student’s suicide last year prompted the leaders of Marshall County Public Schools in Alabama to ask what they could do differently to identify warning signs and prevent future tragedies.

“As an educator and as a dad, this hit me,” says Charlie Jimmerson, the district’s director of technology. “Every life is precious, and this one was taken away from us.”

Jimmerson researched technology that could monitor students’ emails and internet searches for warning signs.

“We thought, ‘There’s got to be a way that we, as a school system, can use technology to our advantage to get the right people in front of students who are reaching out and who may be researching information about self-harm or bullying,’” he says.

The district now filters online traffic and monitors student communications using a combination of artificial intelligence and human reviewers.

Jimmerson is a Future of Education Technology Conference presenter.

How does this technology work?

We have one product that uses artificial intelligence to pick up key phrases in students’ emails and searches. Another product has real people monitoring 24/7. If there’s a substantial alert and they can’t get hold of me or the local principal, they will contact the sheriff to do a welfare check. We also have a parent portal that gives parents a snapshot of the online searches their kids are doing at school.
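
The interview describes the system only at a high level: key phrases are flagged in student email and searches, and humans review the alerts. As a rough illustration of that idea only, here is a minimal Python sketch of key-phrase flagging. The phrase list, names and routing below are hypothetical, and the actual products described use AI models and 24/7 human monitoring rather than simple string matching.

```python
# Hypothetical sketch of key-phrase flagging; not the vendor's actual product.
import re
from dataclasses import dataclass

# Example watch phrases; a real system would use a much larger, vetted list
# (and, per the interview, AI models rather than plain matching).
WATCH_PHRASES = [
    "hurt myself",
    "want to die",
    "kill myself",
    "being bullied",
]

@dataclass
class Alert:
    student_id: str
    phrase: str
    excerpt: str

def scan_message(student_id: str, text: str) -> list[Alert]:
    """Return an alert for each watch phrase found in a student message."""
    alerts = []
    lowered = text.lower()
    for phrase in WATCH_PHRASES:
        for match in re.finditer(re.escape(phrase), lowered):
            # Keep a short excerpt around the match for the human reviewer.
            start = max(match.start() - 30, 0)
            end = min(match.end() + 30, len(text))
            alerts.append(Alert(student_id, phrase, text[start:end]))
    return alerts

if __name__ == "__main__":
    # Any alert would be routed to a counselor or principal for human review,
    # mirroring the workflow described in the interview.
    for alert in scan_message("student-123", "I don't know who to talk to, I want to die."):
        print(f"ALERT [{alert.student_id}] phrase='{alert.phrase}' excerpt='{alert.excerpt}'")
```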

What did you need to successfully implement this technology?

The technology is wonderful, but you have to have the human interaction to be successful with it. I knew that if we were going to identify threats, we needed a plan in place for what to do next. So I reached out to our student services department and our director, Janna Bonds. Now, our principals and counselors get these alerts, and they take the time to investigate and meet with the kids to find out what’s going on.

Did you have any legal or privacy concerns?

I know there are other district leaders who looked at this and chose not to do it because of the liability. If we know something is wrong now and we don’t act on it, are we liable? That’s why the buy-in piece from our principals and counselors is so important. If we save one kid, it’s worth it, and I feel it has already saved a life.

Did you need to make changes to your tech policies?

Our acceptable use and technology policies already cover this. These are school-owned systems, so students are aware that anything they do in our software and our email is information the district holds. They know we’re not reading their emails. We’re just hitting on those key phrases that raise red flags.

Is school safety a growing part of your role in IT?

IT is really getting involved in the safety side of things. It makes a lot of sense. If a student is considering self-harm, they usually look up how to do it first. We didn’t have to do this, but I think we chose to because our kids matter and their safety matters. A month after we started this, we had counselors at two different places get involved, and it made a difference in the students’ lives. We immediately saw the impact of what we’re trying to do.


Jennifer Herseim is an editor for LRP Media Group and program chair for Inclusion and Special Education at DA’s Future of Education Technology Conference.
