So now, looking ahead. We’ve learned a great deal over the past eight years from serving schools and from talking and working with every stakeholder group: the people we’ve helped, experts at other helplines, school leaders, students, safety professionals at multiple social media platforms, fellow youth-serving NGOs, program evaluators, funders and tech companies serving schools. Based on all of that, here are our recommendations…
- Teens helping teens. Train and employ teens who use social media, including videogames, to be the helpline workers. If they’re in a state where they can receive public-service credit for supporting their peers, ensure that they receive it. Examples include Oregon-based YouthLine and California-based Teen Line. Peers can provide a special kind of understanding and empathy that adults can’t, both by definition and because adults use tech and media differently from youth. This approach also models what is ideally a key principle of any youth service: fostering youth participation in matters concerning youth, in accordance with the UN Convention on the Rights of the Child.
- All social media. The helpline doesn’t work only with the well-known platforms, but with gaming, game chat, vlogging and livestreaming services as well. They are social too. Wherever media is social, young media users – or the adults who parent or work with them – deserve access to helpline support.
- Mental healthcare support: Because we know from the research that 1) depression, suicidal ideation and other mental health conditions are on the rise, 2) online risk and harm are largely psychosocial in nature and 3) youth “seem particularly affected by the pandemic,” the helpline should operate, and train its teen and adult staff, with ongoing guidance from mental healthcare professionals. Some Internet helplines in other countries are called “online safety” helplines, but the US’s would be more current, and better aligned with the research, if it were associated more with mental healthcare, because the Internet is just one of many factors in mental health and wellbeing.
- Cross-industry support, both financial and operational. Ideally, the Internet industry as a whole provides both financial and operational support. It should create an industry-wide fund similar to that of the WePROTECT Global Alliance but aimed at youth peer-to-peer support for non-criminal, or psychosocial, online harm. Even if such a fund didn’t support the US’s or other countries’ individual helplines, it should at least support an international association of helplines, which…
- Establishes quality standards
- Requires uniform best practices
- Provides a forum for helpline workers to compare notes and support one another
- Disseminates research on the helpline work
- Convenes annual gatherings of helpline workers
- Serves as a single point of contact for the industry.
- Government support: Local, state and federal governments, including law enforcement and its tiplines, should know about, refer to and work with Internet helplines, which can support their work where online content is concerned. When school safety tiplines receive reports of online harassment, bullying or other online harm that isn’t technically illegal, they could refer those cases to the Internet helpline’s expertise rather than sending the tips back to schools to address – since students and school staff are where such tips typically originate. An Internet helpline must work with and support law enforcement, including the states’ Internet Crimes Against Children (ICAC) Task Forces, in cases where illegal (child sexual exploitation) content is involved, deferring to their expertise and protocols, and should be registered with the National Center for Missing & Exploited Children as an online service provider.
- Hotline partnerships. Support and refer to traditional hotlines, including those serving adults, such as the Cyber Civil Rights Initiative, as well as specialized hotlines such as LoveIsRespect.org for dating abuse and TheTrevorProject.org for LGBTQ youth.
- Independent and apolitical. The helpline is ideally an independent nonprofit or public benefit corporation – or perhaps the Internet arm of an existing youth-serving organization. If independent, consider setting it up as a non-charitable purpose trust, like the independent entity Facebook set up to administer the work of the Oversight Board for content moderation appeals, but one that works with all platforms (whether the Oversight Board itself will eventually serve many platforms is under discussion). An international association of helplines should receive the same level of support as an international “Oversight Board” set up for after-the-fact appeals of content decisions. In the US, it might also be worth exploring setting up a helpline as part of SAMHSA, one of the Federal Partners in Bullying [and cyberbullying] Prevention, or StopBullying.gov, but neither provides direct help services for in-person bullying, so it’s unlikely they would do so for the online kind.
- Fill in the missing piece, the Internet piece. All vulnerable groups deserve help with online harm, of course; youth were just the first, historically. Children and young people were our focus and that of all of the world’s earliest Internet helplines – for the most part they still are, though some now help people of all ages (here’s one example). There are excellent crisis hotlines and school safety tiplines in the US, but in most cases Internet-based harm and the ability to address it are not included in the help they provide. An Internet helpline is the “missing piece”: it can help these services with the online part of cases and refer callers to these specialized services when their expertise is needed. “Deletion centers,” such as the one in Germany described here, are valuable because they serve people of all ages, but they’re about content deletion only, while helplines provide help at the human level too.
- Grow the middle layer of support: the layer between users on the ground and the platforms in the cloud. Youth in all countries deserve timely, trained help with harmful online content, yet only a limited number of Internet helplines exist in the world so far. If Internet companies say anything about this, it’s typically that their own internal systems – algorithms and human content moderators (in-house or outsourced) – handle the problem. But algorithms don’t do well with the highly contextual, fast-changing content of youth; distant human moderators rarely have enough context in abuse reports to take appropriate action; and content moderation systems are inconsistent in both response times and decisions. The Oversight Board Facebook created is in a whole different category of response to users’ complaints – long after the fact. Its decision-making process, much like that of an appeals court, will be far too slow to count as timely intervention; it will address only a tiny fraction of the requests Facebook receives; and – for now – it is only for content on Facebook and its other platforms. Users need a way simply to “pick up the phone” (email, text, chat, etc.) and reach someone who can help, refer and – where appropriate – escalate harmful content in timely fashion, which the platforms simply can’t do.
For more on the big picture of social media user care, see this mid-2020 post on the many moving parts and this pre-pandemic post on young people’s interests.