YouTube is shoring up its mental health resources for teen viewers through a series of updates and partnerships with youth, parenting, and mental health experts. The changes aim to provide “guardrails” for YouTube’s younger viewers without stifling teens’ desire to explore.
“It’s healthy for teens to choose what they watch because they are exploring their interests and seeing the world from different perspectives. This helps teens develop the capacity to take initiative and lead change for themselves and their communities,” Yalda T. Uhls, founding director of the Center for Scholars and Storytellers and member of YouTube’s advisory committee, said in YouTube’s blog post.
A majority of teens today have only known a digital life shaped by smartphones and social media. Unlike past generations, today's parents and guardians face the mammoth task of teaching kids digital literacy while protecting them from online dangers, a burden made heavier by the ever-evolving tech landscape.
The Google-owned video-sharing app is expanding its crisis resource panels to a full-page view. These resources appear when a user searches for sensitive and potentially harmful topics related to suicide, self-harm, and eating disorders. Beyond the current display of crisis hotlines and chats alongside a carousel of videos from health sources, YouTube aims to "help viewers slow down in moments of acute distress" and redirect them to helpful resources. Crisis hotline contact information will still appear in the panel.
YouTube said it's also revamping its Take a Break and Bedtime reminders, which first launched in 2018. The reminders will now appear more frequently and more visibly, whether you're watching Shorts or full-length videos. When I set up the feature on my phone, the video paused and asked if I wanted to take a break. Reminders can be adjusted in settings, so anyone can use them, but the feature is automatically enabled for viewers under 18 and pops up every 60 minutes by default.
With the help of its Youth and Families Advisory Committee, YouTube said it’s identified content categories that could be problematic for teens to watch repeatedly. The app cites content that “compares physical features and idealizes some types over others, idealizes specific fitness levels or body weights, or displays social aggression in the form of non-contact fights and intimidation.”
“A higher frequency of content that idealizes unhealthy standards or behaviors can emphasize potentially problematic messages—and those messages can impact how some teens see themselves. Guardrails can help teens maintain healthy patterns as they naturally compare themselves to others and size up how they want to show up in the world,” Allison Briscoe-Smith, a clinician and researcher on the Youth and Families Advisory Committee, said in YouTube’s blog post.
YouTube said it’s limiting repeated recommendations related to these topics for teens in the U.S. and plans to add more countries over the next year.
Along with YouTube, other social media apps have put mental health safeguards in place over the years. Instagram, Facebook, X (formerly Twitter), Tumblr, Pinterest, and TikTok all surface crisis resources when you search certain terms related to harmful behaviors.
Image credit: YouTube