Facebook's high-stakes dilemma over suicide videos

Amanda Hebert felt powerless as she watched a Facebook video of her 32-year-old friend taking her own life.

In the 12-minute video, a police officer begged Hebert's friend to think of her two daughters and to let him help. Hebert called her friend, who was streaming the suicide online, only to see in the video that her calls were being ignored.

For Hebert, the pain from her friend's death didn't end there. Despite reports to Menlo Park-based Facebook by friends and the Anne Arundel County police in Maryland, she said the tech firm took at least six hours to pull down the video. It made its way to another website, where it now has hundreds of thousands of views.

"It's literally still out there haunting her friends and family," Hebert said.

Social media services such as Facebook Live and Twitter's Periscope have made it simpler for people to share video online, but these companies are now in a race against time to respond quickly to posts depicting self-harm before they go viral.

Balancing the risks of suicide contagion with free speech, newsworthiness and other factors, these companies' complex decisions to leave a video up or pull it down can mean the difference between life and death for people attempting suicide.

Sometimes, leaving a video up can allow family and friends to reach out to the person or call law enforcement for help.

"It's a hard place for these companies to be, to make decisions about what they're going to allow and what they're not going to allow, because it becomes a slippery slope quickly," said Daniel Reidenberg, executive director of Suicide Awareness Voices of Education.

Suicide is the 10th leading cause of death nationwide, according to the Centers for Disease Control and Prevention, and as more people share their lives - and in some cases, their deaths - online, tech firms are playing a larger role in efforts to prevent self-harm.

Facebook, the world's largest social network with nearly 2 billion users, rolled out suicide-prevention tools to provide help-line information and resources to those in distress.

Occasionally, Facebook and its users have successfully intervened.

In May, a Georgia teenager who attempted suicide on Facebook Live survived after a friend and Facebook itself reported the video to law enforcement. Sheriff's deputies scrambled to find the right address for the teen and to confirm that the video, which could only be viewed by the teenager's Facebook friends, wasn't a prank.

"The paramedics were able to render aid to her, and she was transported to the hospital," said Bibb County sheriff's Sgt. Linda Howard. "It happened so fast, but it took us 42 minutes to find out where she was and get to her location."

Yet when someone dies in a suicide streamed on Facebook, some say the company needs to pull down those videos faster.

After her friend's suicide, Hebert said internet trolls posted hurtful comments on the dead woman's Facebook page, which is no longer online. And at least two users posted the video on another website, where it now has more than 202,100 views and has been shared more than 650 times.

Helen Alexander, an acquaintance of the victim, said she reported the video to Facebook, but that it already had 10,000 views on the site by the time the tech firm removed the footage. Facebook said it removed a video of the woman's suicide that another user had posted from another source, but that it has no record of the woman livestreaming her own death.

"You can't stop people from doing whatever they're going to do with livestreaming, but as a platform, you can govern what happens to that stream or video once it's been reported to you," Alexander said.

She and Hebert shared an online petition that called on the White House to make it illegal to share live videos of suicides.

While Facebook may leave up videos if they provide a lifeline to the person in distress, the company often takes them down afterward amid concerns about the impact on survivors and copycat suicides.

Facebook has online rules against promoting or encouraging suicide or self-injury, but the tech firm has also started allowing more content that people find newsworthy, even if it violates the company's standards. For example, the social media giant said it left up a video of an Egyptian man who set himself on fire to protest rising government prices because it believed his act was newsworthy.

"It's hard to judge the intent behind one post, or the risk implied in another," wrote Monika Bickert, Facebook's head of global policy management, in a recent blog post. "Someone posts a graphic video of a terrorist attack. Will it inspire people to emulate the violence, or speak out against it? Someone posts a joke about suicide. Are they just being themselves, or is it a cry for help?"

But suicide-prevention experts say there are best practices social networks should follow to minimize harm.

"There are things that you don't want to have happen, and those include glorifying suicide, graphically showing how people take their lives and providing people with a blueprint with how they might take their life," said Vic Ojakian, the National Alliance on Mental Illness Santa Clara County board president, whose son died by suicide.

Facebook and other tech firms haven't said how many suicides or attempts have been broadcast through their live-video tools, but experts believe it's a small fraction of suicides worldwide.

Nearly 800,000 people worldwide take their own lives every year, and suicide was the second leading cause of death among 15- to 29-year-olds in 2015, according to the World Health Organization.

Facebook recently announced it was hiring 3,000 more workers to help review posts that are flagged for violating its online rules, including videos that promote suicide and violence.

In this increasingly digital world, experts say that anyone - not just mental health officials and crisis hotlines - can help save someone's life.

"Ultimately in my mind, suicide happens when pain outweighs hope," said Stan Collins, a suicide-prevention specialist for Know the Signs, a suicide-prevention marketing campaign by the California Mental Health Services Authority. "So the solution is how can we continue to keep hope and help convince people the reasons for living?"

More information: Suicide Prevention Resources

If you or someone you know is in immediate danger, call 911.

National Suicide Prevention Lifeline: Call 800-273-8255 or you can chat via Facebook Messenger by visiting www.facebook.com/800273talk/.

The Trevor Project (for LGBT youth): 866-488-7386

Know the Signs: www.suicideispreventable.org

Facebook: www.facebook.com/safety/suicideprevention/

Twitter: support.twitter.com/articles/20170313