4 principles school leaders can follow when balancing student safety and privacy
Schools must balance two complex issues—safety and privacy—as they look to technology to help them identify students who may be experiencing a mental health crisis.
While the pandemic upended much of what we recognize as traditional schooling, it also reaffirmed the critical role that schools play in supporting students’ mental health needs. Under normal school operating conditions, 70-80% of students who receive mental health services receive those services through their schools, according to The George Washington University’s Center for Health and Health Care in Schools.
But we are not in normal operating conditions. As COVID-19 rates fluctuate in response to vaccinations and variants, and schools continue to grapple with how to open responsibly, students will yet again face challenges adjusting to the new health protocols, instructional approaches, and school environments.
And, unfortunately, we already see signs of the pandemic’s impact on students’ mental health: the American Academy of Pediatrics found that rates of suicidal ideation were 1.60 and 1.45 times higher in March and July 2020, respectively, than in the same months one year earlier, and that suicide attempts were 2.34 and 1.77 times higher over those same months.
Students spend much of their time working, communicating, and exploring online, and among youth, internet searches are correlated with self-injury and suicide, according to data published in the Journal of Affective Disorders.
So it makes sense that schools are using technology to help identify students who may be at risk. Youth suicide, however, is not the only concern.
Protecting students’ privacy is also an issue that parents and caregivers, schools, and policymakers all care deeply about. In the last eight years, states across the country have passed more than 100 laws regulating how school systems, state education agencies, and technology providers handle student data.
In an effort to show their commitment to protecting students’ privacy rights, more than 300 companies have signed the legally binding Student Privacy Pledge.
In deploying technology, then, school leaders face a difficult question: How can schools balance their interest in protecting students’ safety with their obligation to protect students’ data privacy?
As a former educator and long-time student privacy advocate, I believe that achieving this balance takes T.A.C.T:
Transparency

Transparency builds trust between a school and its community of students and families. In the context of deploying a self-harm alerting technology, transparency means that the school system publicly shares:
- who the vendor is;
- what types of data privacy protections both the school system and the vendor have in place;
- when the technology will be active (e.g., time of day, nights, weekends, and vacations);
- where the technology will be active (e.g., school-managed devices that students can take home, school-based computers, and school-managed student accounts);
- how the technology generates alerts;
- why the school believes self-harm alerting technology is a critical part of its student safety and support program.
Access

By the nature of the technology, self-harm alerting tools will most likely contain sensitive student information—especially if the technology identifies students who may be actively planning to harm themselves or die by suicide. Because of this sensitive information, schools must be especially thoughtful in deciding who has access to the dashboards and notifications.
School system leadership should work closely with their technology team to ensure that the only people who can access the alerts are qualified school staff and authorized school partners (such as a local mobile crisis team) who are trained in handling sensitive information. In addition, if a school partner will be part of the notification process, then the school system should have an agreement in place with the partner that identifies both parties’ responsibilities for handling sensitive student information in accordance with FERPA.
Communication

Ongoing communication between the school system and its community of students and parents is an essential part of any successful edtech implementation, but it’s especially critical when implementing self-harm alerting technology.
Before deploying the technology, schools should provide information on what parents can expect if their student’s activity generates an alert. Schools should also consider providing resources for parents to learn more about warning signs for potential suicide or self-harm risk and how they can access help for their child.
Additionally, schools should take into consideration cultural differences regarding mental health and work with their communities to foster ongoing dialogue with parents via multiple methods of communication (email, newsletters, town halls, webinars, etc.) in as many languages as relevant to the school community.
Technology

While technology can play a powerful role in helping schools identify students who may be experiencing a mental health crisis, it’s important that schools integrate the technology into a broader student support program. Ideally, schools should have two things in place as they begin to deploy the self-harm alerting technology: (1) clearly articulated protocols for how to handle a notification that a student may be actively planning an act of self-harm or suicide (e.g., AFSP and The Trevor Project’s Model School Policy), and (2) a team of mental health and counseling professionals trained both on the software and on how to respond to alerts.
If schools do not have these in place already, then the onboarding and implementation of this technology is a valuable opportunity for schools to create a support team and develop their response protocols.
Striking the balance requires work
Navigating the complexities of student safety and student privacy is no longer optional — the increase in student mental health struggles and the proliferation of digital data demand that both be prioritized. Schools are uniquely positioned to support student mental health and connect families with resources, but doing so requires a system-wide dedication to protecting student privacy in the process.
Finding the right balance takes intentional work, but it’s work that is worthwhile—and potentially life-saving.
Teddy Hartman is head of privacy for GoGuardian and was previously an English teacher and director of strategy and data privacy for a large school system in Maryland.