Here you’ll find some insights we gained about Internet helplines in general and, below them, specific takeaways from our US-based school helpline “lab.”
A referral service too. Importantly, a social media helpline is an information clearinghouse as well as a call-in service. Internet helplines are the new middle layer of help for social media users – the layer between the platforms in the cloud and the users (and traditional help services) on the ground. So helpline workers need access to a database of specialized help to which they can refer a caller (including your country’s equivalent of the US’s 9-1-1, when the content represents imminent harm), as well as to platform content moderation decision makers. Helplines also need to know about specialized law enforcement services, such as the Internet Crimes Against Children Task Forces in every US state and the National Center for Missing & Exploited Children and its protocols for reporting criminal content.
Highly contextual content: Part of the “middle layer” work is the context Internet helplines can provide the platforms about what’s happening on the ground – because the helplines talk directly with the people who can provide it. Youth content is highly contextual. Most of the abuse reports the platforms get are “false positives” – not actionable, because the moderators don’t have context and often can’t tell if, for example, a comment is true hate speech or harassment, parody, an inside joke, etc. Helplines help the platforms take action and – as the middle layer gets built out around the world – ultimately will reduce the number of non-actionable reports platforms get, screening out a lot of content that doesn’t violate Terms of Service. A number of social media companies have long said that algorithms will “soon” be able to detect all content that violates their Terms of Service proactively, as it’s posted. That may be true for more predictable content violations (speech and behavior that have longstanding patterns that can be “fed” to machine-learning algorithms), but it will be a long time before algorithms catch up to the ever-new expression and sociality of youth.
Internet helplines should have industry funding. Because social media helplines help the platforms, as mentioned just above, as well as their users, the companies should together fund the helplines as an industry, at least partially (one per country, or at least per region of the world), as well as help raise awareness of their services. This too would be helpful to the platforms, and not just in the form of public goodwill: In doing so, the industry could help establish best practices for helplines worldwide, creating a uniformity that helps content moderation teams. As for other funding, what we found in talking with philanthropists, social impact investors and venture capitalists was a reluctance to be associated with social media at a time when public opinion about the industry had turned negative. At the very least, the platforms should fund an international professional association of Internet helplines with the aim that it would ensure best practices worldwide, train helpline workers and conduct research on their work and independent (off-platform) user care.
The operational model. The social media helpline we piloted served US schools, as does its model helpline in the UK: Professionals Online Safety Helpline. Another model would be a helpline that serves either everyone or youth, as in some European countries, Australia and New Zealand. We chose the UK model partly for reasons of scale – no country the size of the United States has an Internet helpline for everyone – and partly for the UK operators’ expertise. It would be optimal for the US to have a helpline service that’s free and available to everyone, at least on weekdays, but that would require consistent funding sustained over the long term – worth exploring.
The youth peer-to-peer model. One model helplines might consider is the peer-to-peer one, with teens as either employees or volunteers receiving community service credit for their time at the call desk. This could be part of a youth-focused nonprofit organization that already exists, such as Oregon’s peer-to-peer YouthLine or California-based Community Helpline and LGBT National Youth Talkline.
Partner with specialists. An Internet or social media helpline could have several types of partners: law enforcement, specialty hotlines (mental healthcare, dating abuse, LGBTQ support, etc.) and school safety tiplines. That last group generally sends Internet-based harassment and bullying cases back to the schools where the tips came from – going full circle – so schools aren’t generally getting help with online harm from school safety tiplines in the states that even have them (see this page for more on tiplines). When Internet expertise is needed, tiplines could outsource to an Internet helpline for help with online content.
Marketing always needed. An Internet helpline, like any such service, can only help to the degree people know it’s there for them. So local and national governments, Internet companies and NGOs are all needed to help spread awareness of the helpline.
Privacy law compliance. This is a given but must be mentioned here in any case: As a digital service provider, an Internet helpline must comply with the data privacy laws concerning minors in all jurisdictions in which it operates.
Takeaways specific to serving US schools
Helplines & student leadership. A helpline for schools supports student leadership, positive school climates and restorative above punitive approaches to student discipline. The school personnel who called us most were assistant principals, whose job included student discipline. They taught us that they had the best outcomes when they saw students as part of the solution, not just the problem, and enlisted students’ help – from student leaders to, sometimes, simply the peers of a perpetrator or target – in resolving relational problems that developed at school. Students also helped them report abusive content, which always needs to be the first step in attempting to get harmful content removed. We encouraged administrators who called us to work with students to report the content and gather evidence. In the process, they saw the value of students and administrators blending skill sets and working as a team. This approach promotes student voice and leadership.
Where the problem really lies. What adults, including school personnel, don’t always understand is that the real context of problems online is school life – what’s going on with students, relationally, in and out of school. The problem isn’t the location (e.g., a social media platform); research has shown that, in very many cases, a problem shows up in multiple locations, online and offline. If problematic content violates Terms of Service, it can be deleted. The problems themselves are rarely “deleted” when the content is, but some can be defused when the visible evidence is deleted. There is often more work to be done in the school context to address the underlying problem, just as there was in the analog days of telephones.
Obtaining evidence challenging. Obtaining the evidence of a problem (screenshots or links to the page, comment, image, video, tweet, etc.) is a necessary best practice for helplines because, as an intermediary service, a helpline needs to determine if the problem appears to violate the Terms of Service of the platform in question. It’s also the hardest part of each case, for a couple of reasons. For one, school personnel are busy, and it’s not always easy for them to reach back out to students to arrange a meeting for obtaining the evidence. Another is that adults often aren’t experienced social media users themselves. Justifiably, callers would just like the problem to “go away.” [Caveat: If a case involves nude photos of minors, evidence must never be emailed; only URLs/links should be requested from callers.]
Disbelief that this type of service exists. School personnel seemed conditioned to feel they were on their own with social media-related problems. They also seemed to think in terms of programs rather than services – programs that have to be adopted by each school and that require training. Our helpline frequently needed to reassure school personnel that the service was a great deal simpler: nothing to adopt, just pick up the phone and call.
The legal barrier to total customer satisfaction. The most common desired outcome that can’t be provided to school staff is the identity of the student causing the problem they’re calling about. It’s a common ask because schools often want to discipline the student, sometimes because of pressure from the target’s parents. Since social media companies are barred by federal privacy law from revealing the identity of a user without a subpoena, a US helpline can’t obtain or provide the offending student’s identity. Another desired outcome, of course, is content deletion. Though helplines help screen complaints for Terms violations, they obviously can’t be the final decision-makers on whether problematic content violates Terms and comes down.
Help with intervention a new concept. School administrators were far more accustomed to hearing about solutions on the prevention side (digital safety and citizenship education), rather than the intervention side of school discipline and safety. There are plenty of assemblies, talks, curricula, trainings and programs being offered to schools for prevention education. The helpline is designed to support restorative intervention – to be part of schools’ incident response matrix. Help on demand, when incidents happen, seemed unheard of on the school scene.
Work with NCMEC & ICACs. For handling cases involving “sexting,” or nude photos of minors, it was important that we were registered as an Electronic Service Provider with the National Center for Missing & Exploited Children. This registration protects minors, the helpline and the schools it works with. In cases involving nude photos of students, school personnel are instructed not to email photos/screenshots but rather to send a URL as evidence. Helpline staff made sure the school was working with law enforcement and asked whether the caller knew about the nearest ICAC (Internet Crimes Against Children Task Force – law enforcement trained in dealing with such cases), in case it would be advisable to involve it.
Clarity of mission. A helpline for schools is designed to address online harm to minors. Schools sometimes experience online pranks or attacks against school personnel or school pages in social media and would call the helpline to get help with getting that kind of content deleted. Our remit, however, was to help reduce online harm to students, not “brand protection,” and content concerning minors was the only kind with which we received help from the social media platforms.
Helping social media platforms help schools. The helpline provides help in two directions. It provides schools with research-grounded perspective and help for informed incident response, on-demand. In the other direction, it provides social media companies with the actionable context they need to help users more effectively. A helpline’s (or “trusted reporter’s”) content escalations are based on the evidence provided them, so the helpline’s on-the-ground understanding of both the problem and social media terms of service reduces the high number of “false positives” (inaccurate) abuse reports that flow through social media reporting systems and increases the systems’ accuracy. A helpline increases companies’ ability to help end users, in our case schools.
“Free” may not be ideal. That the service was free seemed hard for school personnel to comprehend (though when we changed the model and began charging, the opposite was true). Many school personnel were surprised during presentations and meetings with administrators, counselors, superintendents and school board members that such a service could be free. One administrator asked us if there was a catch or a “hidden cost.” In talking with for-profit school-serving companies and board members in more than 20 states at the National School Boards Association’s 2016 convention, we heard that schools are more likely to use paid services, and the vast majority of school board members we spoke with said they’d expect this to be a paid service.
Challenges posed by school phone systems. Email is a surprisingly important tool for school-focused helplines, because it’s difficult to get through institutional phone systems when we call people back. If a caller does leave the school phone number, sometimes it’s without an extension, increasing the time involved in finding and getting through to them.
Safety is an ecosystem. Helplines are just one of the stakeholder groups that need to be in the discussion about the online part of youth mental health and wellbeing. Another is the state-level school safety tiplines run by law enforcement (though Utah’s is run by mental healthcare specialists). There are also independent commercial and nonprofit apps and tiplines that support schools (e.g., Sprigeo and Sandy Hook Promise’s Say Something reporting system, respectively). Another essential stakeholder group is researchers. In Europe, there are researchers who have specifically looked at the work of Internet helplines, their operations and best practices (helplines can promote best practices in schools, such as media literacy education, social-emotional learning and restorative approaches to school discipline). A fourth stakeholder group is practitioners of social-emotional learning, restorative practices and bullying prevention, along with sexual health educators. A fifth is school resource officers, who work closely with students, school counselors, social workers and administrators. A sixth is the social media platforms themselves. Societies need all these stakeholders in the discussion about how best to support students online and offline, on both the prevention and intervention sides of the safety equation.
To see how the helpline we piloted fit into the bigger picture of social media user care, see this article by our founder at Medium.com.