Takeaways from Frances Haugen's Testimony and the Path to Improving Social Media

While Facebook’s role in the spread of misinformation and its effects on teen mental health had been the subject of speculation for several years, Frances Haugen’s testimony to Congress this past week provided hard evidence and made these issues a national conversation. Her testimony was particularly powerful because of her deep understanding and clear explanation of Facebook’s practices, grounded in the thousands of pages of internal research she gathered while working there.

Key takeaways from her testimony:
  • Facebook knew the harm that apps such as Instagram can cause for teens
  • Facebook’s algorithm amplifies problematic content including misinformation
  • Facebook’s algorithm is commonly exploited by foreign entities 
  • Facebook consistently chooses profits at the expense of societal issues
Haugen’s testimony clearly identified two primary issues underlying these takeaways:
  • Facebook’s algorithms optimize for engagement and, as a result, amplify hate, misinformation, and political unrest. Unfortunately, research shows that highly controversial content (misinformation, negative news, or politically charged posts) tends to be much more engaging. Put simply, people like to click on things that stoke their most powerful emotions. In Haugen’s words, “Facebook makes money when you consume more content.” As a result, Facebook has little incentive to remove hate and misinformation. (A simplified sketch of how this kind of ranking works follows this list.)
  • Instagram harms teenagers' mental health, especially young women. An internal Facebook study of ~1,300 participants shows that 1 in 5 teens say Instagram makes them feel worse about themselves. Another study found that 32% of teen girls said that when they felt bad about their bodies Instagram made them feel worse. According to Haugen, Instagram creates a “feedback cycle where kids are using Instagram to self-soothe but then are exposed to more and more content that makes them hate themselves.”
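To make the ranking issue concrete, here is a highly simplified sketch, in Python, of what an engagement-based ranking system can look like. This is not Facebook’s actual code; the Post fields, score weights, and function names are invented for illustration. The point is only that when a feed is sorted by predicted engagement, the most emotionally charged content rises to the top.

```python
# Hypothetical sketch of engagement-based ranking (not Facebook's real system).
from dataclasses import dataclass


@dataclass
class Post:
    author: str
    text: str
    predicted_clicks: float      # assumed model outputs, for illustration only
    predicted_reactions: float
    predicted_comments: float


def engagement_score(post: Post) -> float:
    # Weighted sum of predicted interactions; the weights here are made up.
    return (1.0 * post.predicted_clicks
            + 2.0 * post.predicted_reactions
            + 5.0 * post.predicted_comments)


def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest predicted engagement first: posts that provoke the strongest
    # reactions end up amplified at the top of the feed.
    return sorted(posts, key=engagement_score, reverse=True)
```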
Given these issues, what are some tangible ways to improve social media?
1. Halt the use of algorithms optimized for engagement 
  • Proposal: Given that both issues arise from the use of algorithms optimized for engagement (also called engagement-based ranking systems), it seems clear that one way to improve social media is to create algorithms that don’t optimize solely for engagement. As we wrote in Why We’re Building Privee, platforms could try to optimize for something other than engagement - say, happy, fun, or inspirational content. Haugen suggested one improvement: Facebook and Instagram could show content purely in chronological order. Instagram worked this way until 2017 and was still wildly popular. (See the sketch after this proposal’s bullet points.)
  • Rationale: Optimizing for something other than engagement would likely lead to less amplification of hate and misinformation on the platform since those types of content are the most engaging.
  • Potential weak points: Facebook doesn’t want to make less money. Unfortunately, maximizing purely for engagement is profit-maximizing for Facebook, and any shift away from it would almost certainly reduce revenue in the short run. Facebook executives could also be concerned that moving away from a purely engagement-maximizing approach would push users toward competitors (such as TikTok) that still employ engagement-based ranking.
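For contrast, the chronological feed Haugen mentioned is simple to express. The sketch below is again hypothetical (the field names are made up): it just sorts posts from followed accounts by timestamp, so nothing is amplified beyond how often its author actually posts.

```python
# Hypothetical sketch of a purely chronological feed.
from dataclasses import dataclass
from datetime import datetime


@dataclass
class Post:
    author: str
    text: str
    created_at: datetime


def chronological_feed(posts: list[Post]) -> list[Post]:
    # Newest first; no engagement prediction is involved, so emotionally
    # charged content gets no extra boost.
    return sorted(posts, key=lambda p: p.created_at, reverse=True)
```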
2. Increase the age limit for using social media
  • Proposal: Increase the age limit for social media from 13 to 16 or 18.
  • Rationale: This would decrease the number of young people in their formative years who are exposed to the potentially harmful effects of social media. Facebook’s own research showed that Instagram use can significantly harm the mental health of teens (e.g. teens who already struggle with mental health say it makes them feel worse).
  • Potential weak points: While it could be effective, this method doesn’t mitigate the systemic issues of how social media platforms work. Additionally, social media is already ingrained in the lives of 13- to 18-year-olds; it will be difficult to simply take it away.
3. Enforce account verification
  • Proposal: Enforce account verification for all users.
  • Rationale: Remove the veil of anonymity that makes social media the perfect platform for online bullying and the spread of misinformation. An August 2020 report by advocacy group Avaaz found that the top 10 producers of “health misinformation” got 4 times as many views on Facebook as the top 10 sources of authoritative information. Another study found that out of nearly 150,000 posters in Facebook Groups disabled for Covid misinformation, just 5% were producing 50% of all posts. More verified accounts could also increase user trust on the platform since people will know that larger accounts are who they say they are. 
  • Note - the proliferation of bots was not well covered in Haugen’s testimony. They create major issues on social media with regards to the spread of misinformation. A 2021 study found that 33% of the top sharers of content from low-credibility sources were likely to be bots. The same study found that in the lead up to the 2016 presidential election, 20% of all political tweets originated from accounts likely to be bots. Greater efforts by platforms to detect and remove bot accounts would likely reduce misinformation. Account verification is one avenue to reduce them. 
  • Potential weak points: This solution only works if Facebook doesn’t whitelist popular accounts and give them a pass, as The Wall Street Journal reported it has done. Some users could also be wary of trusting Facebook with their identity. Lastly, in some places anonymity protects journalists working in dangerous situations, and verification could put them at risk.
4. Provide greater transparency into how algorithms work
  • Proposal: Social media platforms outline how their engagement based ranking systems work and give academic researchers the ability to investigate.
  • Rationale: An increased understanding of the inner workings of these systems would enable lawmakers to create more sound regulations and keep up with the innovations of technology companies. It would also allow researchers to better understand the societal effects of various algorithms.
  • Potential weak points: Companies will likely be resistant to providing such transparency given their track record.
5. Government regulations
  • Proposal: Carve out a targeted exemption in Section 230 for algorithmic ranking that would make platforms that use engagement-based ranking systems liable for the content that spreads on their platforms. We will cover this more next week with a deep dive into Section 230.
  • Rationale: Make platforms liable for the negative externalities their algorithms can create. “If we had appropriate oversight, or if we reformed [Section] 230 to make Facebook responsible for the consequences of their intentional ranking decisions, I think they would get rid of engagement-based ranking,” Haugen said. “Because it is causing teenagers to be exposed to more anorexia content, it is pulling families apart, and in places like Ethiopia, it’s literally fanning ethnic violence.”
  • Potential weak points: If regulation isn’t well designed, it would only further entrench Facebook and the other largest social media platforms as industry leaders.

It seems that Facebook just had its “big tobacco moment.” Over the next several months we expect leaders to push for improving social media through regulatory reform.

What do you think is the best way to improve social media? As always feel free to drop us a line to chat.

Join waitlist to become an early Privee member
