Facebook and mental health - your questions answered

Teens on tech

There's no denying the power of social media, especially for children and teens. With worries surrounding its effects on mental health and wellbeing, Facebook is tackling these issues head-on to help young people navigate the online world and create positive experiences.

Facebook asked Mumsnet users to share their questions about the platform and its effect on young people's mental health and wellbeing. Read on to learn about the policies and procedures they have put in place to keep children as safe as possible online.

Supporting mental health

Which mental health organisations are working alongside Facebook to offer help to those who need it?

When it comes to tackling topics as important as mental health and wellbeing, collaboration is vital. We partner with over 50 expert organisations around the world to create initiatives and resources that help anyone affected.

For example, in the UK we're working with Childnet International and The Diana Award to train thousands of young people as Digital Safety Ambassadors, helping their peers have a safe and positive experience online.

Other examples include our partnerships with Papyrus and Samaritans, which helped us develop our Suicide Prevention Hub, connecting people in distress with those who can support them. We also partnered with the Yale Center for Emotional Intelligence, alongside other organisations, to create our Bullying Prevention Hub, which offers help and resources to anyone facing bullying, as well as advice for those who may be worried about someone else.

How are you being proactive about supporting young people's mental health in terms of their social media use? How are you addressing concerns about the amount of time young people spend on social media?

We employ social psychologists, social scientists and sociologists, and collaborate with experts to better understand wellbeing. This allows us to improve Facebook in ways that make a positive impact. Here are a few examples:

  • News Feed: We made several changes to enable more meaningful interactions and reduce passive scrolling of low-quality content. We also redesigned the comments feature to encourage better conversations.
  • Snooze: People often tell us they want more say over what they see in their News Feed. Snooze gives people the option to hide a person, page or group for 30 days, without having to permanently unfollow or unfriend them.
  • Take a Break: Research on breakups suggests that offline and online contact with an ex can make emotional recovery more difficult. Take a Break lets people see less content from anyone they choose, limits what someone else can see about them, and lets them decide who can see their past posts.
  • Your time on Facebook: We created a dashboard that lets people see and manage their time on Facebook.
  • Daily reminders: People can now set reminders to limit the time they spend on Facebook.

Is there a way to detect trolls within support groups on Facebook for mental health issues?

If you come across content you believe is intended to troll others, you can report it anonymously – we’ll always prioritise cases where there is a threat to safety.

We have a safety and security team that’s 30,000 people strong and collectively works 24 hours a day, seven days a week. Roughly half of them monitor, identify and act on any content that violates our content policies, particularly content intended to harm or harass. This team is supported by artificial intelligence that is always searching for content that could be harmful. Of course, context is key when it comes to understanding posts, and that’s why we use human reviewers as well.

Our most effective way of detecting troll content is still the Facebook community itself.

Has Facebook worked with any professional organisations to do research and come up with action plans?

We do independent research and look carefully at third-party research that explores how social media affects wellbeing. This provides us with useful insights that help us improve Facebook. For example, insights into the benefits of active participation versus passive scrolling helped us change our News Feed algorithm to prioritise content from friends and family over content that people passively scroll through.

We work with Internet Matters, which creates tools to guide parents through the many issues teens can experience online. We also partnered with The Diana Award and Childnet International, with the help of the Yale Center for Emotional Intelligence, to create the Bullying Prevention Hub.

How is Facebook working to promote positive mental health for teens and young people?

We want Facebook to be a place of positivity, especially for teens and young people. And we've spent years researching the link between mental health and social media, learning a lot in the process. For example, engaging with close friends and groups tends to have a positive effect on wellbeing, while passively consuming content can have a more negative one. We use these learnings, and many more like them, to improve Facebook in ways that help promote those positive emotions and minimise negative ones.

In collaboration with mental health experts, we've also introduced new tools, such as an activity dashboard that lets people manage their time on the platform, and a "do not disturb" setting on Facebook Messenger. And we've pledged $1 million to fund further research into social media's impact on wellbeing.

Bullying and harassment on Facebook

What help is provided to those facing bullying or harassment on Facebook? Do you provide the victim with links to support services should they wish to access them?

We want to ensure no one who encounters bullying or harassment on Facebook is without help and resources. That’s why we created the Bullying Prevention Hub in partnership with the Yale Center for Emotional Intelligence. It’s a resource for teens, parents and educators that offers step-by-step plans, including guidance on how to start important conversations with someone being bullied. People can also use our anonymous reporting tools to alert us when they see content that makes them concerned for the safety of a friend.

At what point do you make a referral to, say, the police or other bodies? Or do you just remove and block posts and leave it at that?

We alert law enforcement when we believe there is a genuine risk of physical harm or a direct threat to public safety. Of course, this isn't always easy: people often express themselves in ways that are hostile but facetious rather than serious. That's why we consider the language, context and details in order to distinguish casual statements from content that presents a real threat to public or personal safety.

If you'd like to know more about our community standards, visit the Facebook Community Standards page.

Visit your.fb.com to find out more about social media and mental health, safe use of the internet, content governance, and privacy.