They present a nearly intractable problem for social media companies under pressure to do something about material on their services that many people believe is causing harm, particularly to teenagers.
Those concerns came into sharp focus in recent weeks in a pair of Senate subcommittee hearings, the first featuring a Facebook executive defending her company and the second featuring a former Facebook employee turned whistleblower who bluntly argued that her former employer’s products drove some young people toward eating disorders.
The hearings were prompted in part by a Wall Street Journal article that detailed how internal Facebook research showed Instagram, which is owned by Facebook, can make body image issues worse for some young people.
On Tuesday, executives from YouTube, TikTok and Snapchat are scheduled to testify before a Senate subcommittee about the effects of their products on children. They are expected to face questions about how they moderate content that might encourage disordered eating and how their algorithms might promote such content.
“Social media in general does not cause an eating disorder. However, it can contribute to an eating disorder,” said Chelsea Kronengold, a spokesperson for the National Eating Disorders Association. “There are certain posts and certain content that may trigger one person and not another person. From the social media platform’s perspective, how do you moderate that gray area content?”
The association advises social media companies to remove content that explicitly promotes eating disorders and to offer help to users who seek it out.
But young people have formed online communities where they discuss eating disorders and swap tips for the best ways to lose weight and look skinny. Using creative hashtags and abbreviations to get around filters, they share threads of emaciated models on Twitter as inspiration, create YouTube videos compiling low-calorie diets and form group chats on Discord and Snapchat to share how much they weigh and encourage others to fast.
Influencers in fashion, beauty and fitness have been accused of promoting eating disorders. Experts say that fitness influencers in particular can often serve as a funnel to draw young people into extreme online eating disorder communities.
YouTube, Snapchat, TikTok and Twitter have policies prohibiting content that encourages eating disorders. But the companies should improve the recommendation algorithms that can still surface such content, Kronengold said.
“It becomes an issue, especially when people are coming across this content who can be harmed by it or don’t want to see it,” she said.
Like many other popular YouTube creators, Eugenia Cooney, 27, makes videos that share her favorite fashion and makeup items with her more than 2 million followers. But for years, her viewers have not focused on the topics of Cooney’s videos. Instead, they flood her comments with concerns about her health.
Although she spoke in 2019 about her struggles with an eating disorder in interviews with other YouTubers, Cooney rarely addresses her audience’s concerns. While some viewers flock to her social media profiles on YouTube, Twitter, Instagram and the streaming service Twitch to beg her to seek treatment, others have accused her of using her platform to promote eating disorders to young people.
They say her videos are examples of “body checking,” a habitual behavior of reviewing the appearance of one’s body that is often associated with eating disorders. More than 53,000 people signed a petition in January asking social media companies to remove her content.
“I just kind of feel like everybody has the right to make videos and to post a photo of themselves,” Cooney said in an August video. “With me, people will always be trying to turn that into such a bad thing.” She did not respond to requests for comment from The New York Times. YouTube said Cooney’s content did not violate its rules.
“We work hard to strike a balance between removing harmful videos about eating disorders and allowing space for creators and viewers to talk about personal experiences, seek help, and raise awareness,” said Elena Hernandez, a YouTube spokesperson. “We reduce the spread of borderline content about eating disorders that comes close to violating our policies but doesn’t quite cross the line.”
YouTube does not prevent users from searching for eating disorder content, although it does include an eating disorder help line at the top of its search results for some common terms related to the topic.
Mishel Levina, a 19-year-old college student in Israel with 21,000 followers on TikTok, encourages her viewers to “block if you’re sensitive.” Her videos show off her waist and stomach, feature song snippets about being skinny and include text about losing weight.
Levina acknowledged that some of her behavior was unhealthy but said she was just sharing her life and was not urging other people to starve themselves.
“I’m being called out for promoting bad eating habits — I’m not promoting them,” she said in an interview. “I’m just making a joke out of them — it’s all a joke. It’s social media. I’m not pushing this on you. I’m sharing information. It’s your decision to take it and use it or to leave it aside and just skip it.”
Last year, TikTok began cracking down on content that explicitly encourages eating disorders and blocking some hashtags that promote disordered eating. But it has allowed creators to continue to share videos that discuss recovery or crack subtle jokes about eating disorders.
“We aim to foster a supportive environment for people who share their recovery journey on TikTok while also safeguarding our community by removing content that normalizes or glorifies eating disorders,” Tara Wadhwa, TikTok’s director of U.S. policy, said in a statement.
But many popular TikTok trends that do not explicitly promote eating disorders still highlight thin bodies, implicitly advocating thinness as the ideal.
“My first trend, ironically, is something that makes me feel awful,” said McKenzie Ellis, 26, whose music was featured in a recent “hip walking” trend where creators filmed close-ups of their waist while walking.
On Twitter, creators routinely share advice for crash diets and encourage disordered eating, and some amass tens of thousands of followers in the process. Twitter’s algorithms automatically suggest related accounts and topics for users to follow, based on the accounts they view. When a Twitter user views accounts that promote eating disorders, Twitter recommends topics like “fashion models,” “fitness apps & trackers,” “mindful eating” and “workout videos.”
Twitter said that its policies prohibit content that promotes eating disorders or provides instructions or strategies for maintaining them, and that the company primarily relies on users to report violative content. A spokesperson for the company said that its topic recommendations differed by account.
“While we remove content that violates our policies on suicide and self harm, we also allow people to share their struggles or seek help,” the spokesperson said.
On Snapchat, users often form group chats dedicated to privately encouraging one another to pursue eating disorders. Some of the chats are focused on providing negative feedback, essentially bullying the participants about not fulfilling their diet goals. Others provide positive feedback.
After an inquiry from The New York Times, Snapchat said it would ban terms related to the group chats from being used in users’ display names, group chat names and search. The company previously blocked a number of common terms associated with eating disorders and provides suggestions for resources, a spokeswoman said.
If you or a loved one is struggling with an eating disorder, contact the National Eating Disorders Association help line for support, resources, and treatment options at 800-931-2237. You can find additional resources at NationalEatingDisorders.org.