Almost 25 years ago, the World Health Organization initiated a global suicide prevention effort, offering recommendations and guidelines for the responsible coverage of suicide by traditional media. However, with the rise of social media, individuals now have a powerful platform on which they are not obligated to adhere to reporting guidelines, even as they exert significant influence over their peers and public opinion.

In a commentary published in The Lancet Psychiatry, Johns Hopkins researchers both call for the establishment of guidelines to address this gap and recommend how modern social media platforms should share pro-social, life-saving education and remove harmful content. Authors of the commentary are Katherine Hoops, an assistant professor of anesthesiology and critical care medicine; Paul S. Nestadt, an associate professor of psychiatry and behavioral sciences and the clinical director of the Johns Hopkins Anxiety Disorders Program; and Mark Dredze, the John C. Malone Professor of Computer Science.

“For months during the pandemic, I admitted far more young people to the pediatric intensive care unit after serious suicide attempts than ever before,” Hoops says. “And my experience wasn’t unique. Clinicians caring for young people across the globe have all seen firsthand the increases in suicidality among youth. We also see at an individual level the role that social media plays.”

Inspired by the effectiveness of the WHO’s guidelines for responsible reporting on suicide, Hoops partnered with colleagues Nestadt and Dredze, in their capacities as experts on suicide prevention and social media, respectively, to devise specific recommendations for extending these guidelines to modern social media platforms.

The authors recommend that social media sites take the following actions:

  • Periodically share suicide prevention content and promote access to resources and helplines in an effort to normalize discussions about mental health;
  • Aided by artificial intelligence, identify and moderate content that increases the likelihood of suicide—such as how-to guides, access to lethal means, and suicide pacts;
  • Actively identify instances of cyberbullying and impose commensurate consequences upon the offenders; and
  • Require hosted “influencers” to follow evidence-based standards for suicide reporting and to promote discussions on safety and intervention.

“We hope that this commentary will contribute to the ongoing conversations about some of the potential dangers of social media,” says Hoops. “We hope that clinicians and researchers will be aware of the potential issues and not only proactively counsel patients and families about online safety, but also contribute to advocacy efforts in this space.”