Ken Yeh is the director of technology at Ontario Christian Schools, a private K-12 school near Los Angeles with about 100 children per grade. Three years ago, the school began buying Google Chromebook laptops for every student in middle and high school.
The students would be allowed to take them home. Yeh says parents "were concerned" about what the laptops might be used for, especially outside of school.
Google software, like that of other companies, comes with virus protection and the ability to filter search results and block certain websites, but Ontario Christian Schools turned to a third party to provide an additional layer of security: a startup called GoGuardian.
GoGuardian helped Yeh and school leaders create a list of off-limits websites: porn, hacking-related sites and "timewasters" like online games, TV and movie streaming. The software also tracks students' browsing and searches whenever they are using the computer, at home or at school. And that's how Yeh was alerted that a student appeared to be in severe emotional distress.
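GoGuardian doesn't publish the details of its flagging logic, but the basic mechanism Yeh describes, a school-chosen list of terms checked against each search with an alert sent to staff on a match, can be sketched in a few lines of Python. This is a minimal illustration under that assumption only; the term list and the alert_staff notification hook are hypothetical stand-ins, not GoGuardian's actual code.

    # Illustrative sketch only; GoGuardian's real implementation is not public.
    # FLAGGED_TERMS and alert_staff are hypothetical names for this example.
    FLAGGED_TERMS = {"suicide", "self-harm"}  # each school configures its own list

    def alert_staff(student_id: str, query: str) -> None:
        # Placeholder: a real system might notify a counselor or log to a dashboard.
        print(f"ALERT: student {student_id} searched for {query!r}")

    def check_search(student_id: str, query: str) -> None:
        # Flag the search if it contains any term on the school's list.
        lowered = query.lower()
        if any(term in lowered for term in FLAGGED_TERMS):
            alert_staff(student_id, query)

    check_search("s1024", "photosynthesis homework help")  # no flagged term, no alert
    check_search("s1024", "suicide")                       # matches, triggers an alert

Note how crude such a match is: the same check fires whether a student is in crisis or researching a poet's biography, which is exactly the ambiguity that privacy researchers raise below.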
"When I came to work," he recalls, he received an indicator that a student had searched for suicide and several related terms. "I then went in to view the student's browsing history around this time period."
The more he saw, the more Yeh was convinced that this wasn't an idle or isolated query.
Other searches by the student seemed to relate to specific methods of self-harm and "terms that strongly suggested that the student was struggling with certain issues," he says. "When I looked at that history I alerted our principals and guidance counselor to see if they could follow up and see what was going on."
The student was brought in to speak with a guidance counselor — a conversation, Yeh says, that led to positive interventions. "It was a little unexpected. We weren't thinking about this as a usage for GoGuardian."
Yet in the three years that GoGuardian has been in use at this school, this type of incident has happened three separate times, he says. And GoGuardian says that across the 2,000 districts where its software is in use, it has heard similar anecdotes dozens of times.
Rodney Griffin, the Chromebook coordinator for the Neosho School District in southwest Missouri, says it happens there an average of once a semester.
"Any time, day and night, I alert a school counselor or administrators," he says. "I've had it happen when they contacted home at like 10 p.m. and said, 'I think you need to check on your child.' "
Suicide is the third leading cause of death among youth aged 10 to 24.
And yet identifying young people at risk remains a complicated challenge. Last month, the U.S. Preventive Services Task Force released a recommendation aimed at doctors that adolescents be screened for depression. But the task force has also stated that, "Unfortunately ... it is not clear how primary care clinicians can effectively identify and help people who are not already known to be at increased risk for suicide."
In the cases mentioned, software programs do seem to be able to assist in that identification process. But they do so by effectively thrusting school IT directors, such as Yeh, into the role of eavesdroppers.
And that can be problematic, says Elana Zeide, a research fellow at NYU's Information Law Institute and an expert on student privacy and data. "This is a growing trend where schools are monitoring students more and more for safety reasons," she says. "I think student safety and saving lives is obviously important, and I don't want to discount that. But I also think there's a real possibility that this well-meaning attempt to protect students from themselves will result in overreach."
A student who types "suicide" into a search box could be researching Sylvia Plath, Socrates or terrorist movements, Zeide points out. And there could be legitimate personal or educational reasons for students to search other flagged terms, from sexual anatomy to sexually transmitted diseases or drugs, without "sending immediate alerts to the powers that be."
She points out that low-income students may be disproportionately subject to surveillance, as school-owned devices are more likely to provide their only access to the Internet. And she worries about the broader message: "Are we conditioning children to accept constant monitoring as the normal state of affairs in everyday life?"
This type of dilemma is almost certainly going to become more common as school-owned laptops and tablets proliferate. In 2015 alone, according to a report released this month, U.S. K-12 districts bought 10.5 million such devices, a 17.5 percent increase over the year before.
Carolyn Stone, ethics chair of the American School Counselor Association, says that she was "taken aback" to hear that student Web searches done at home were triggering interventions by school staff. "It's so intrusive," she says.
On the one hand, she says, the issue of students thinking about suicide needs to be taken very seriously and treated differently from other types of disclosures. When school guidance counselors hear anything about potential self-harm, even secondhand, she says, "We're on it. We're calling home. Privacy and confidentiality go out the window."
On the other hand, she says, she worries about school staffers without mental health training having access to what are, essentially, students' private thoughts.
"On the surface, it sounds like a very good idea to err on the side of caution when it comes to student suicide," Stone says. "But this is something that sounds like it could spin out of control. ... It's a slippery slope."
Cody Rice, the technical product manager at GoGuardian, says that schools are given control over what search terms are flagged and what to do about them, and no client to date has raised privacy concerns.
"Schools and parents are the primary protectors of the students, and GoGuardian provides another tool to help them in their endeavors, but does not make decisions on which types of online activity may lead to alerts to the administration for the benefit of the student."
Yeh says that the parents at his school have never complained about privacy violations. He adds that they've raised complaints only when the filtering has malfunctioned, allowing students temporary access to off-limits sites.
As for being asked, with no mental health training, to serve as a de facto early warning system for the school community, he seems to accept it as a new part of his job.
"It is a way for us to proactively intervene when they are looking for help. And so we feel a good sense of responsibility in trying to look out for the welfare of our students."