Amid a new storm of controversy following the publication of internal research documents underlining the negative impacts its apps can have on users, Facebook says it will introduce new measures to address some key areas of concern, including harmful impacts on teens and the amplification of divisive content. Among the planned changes, Facebook will:
- Reduce the presence of politics on people’s feeds, in line with user feedback that they want “more friends, less politics”
- Add a new prompt on Instagram which will detect when teen users are repeatedly engaging with content that may not be conducive to their well-being, then ‘nudge’ them to look at other content instead
- Add a new ‘take a break’ feature on Instagram, which will prompt teens to take time away from the app
Facebook will be hoping that these new measures, which don’t yet have a launch timeframe, help to lessen the growing pushback against the app. That pushback intensified after former Facebook product manager Frances Haugen went public with a range of internal research reports which appear to suggest that Facebook is aware of various harms it’s causing to users, and to society more broadly.
Haugen’s view is that Facebook has willfully ignored at least some of these findings, as addressing them would likely have impacts on Facebook usage rates.
Facebook has strongly denied these claims, with CEO Mark Zuckerberg noting that it conducts such research with the express intent of understanding how it can improve.
As per Zuckerberg:
“If we wanted to ignore research, why would we create an industry-leading research program to understand these important issues in the first place? If we didn’t care about fighting harmful content, then why would we employ so many more people dedicated to this than any other company in our space – even ones larger than us? If we wanted to hide our results, why would we have established an industry-leading standard for transparency and reporting on what we’re doing?”
Zuckerberg also specifically addressed the idea that the company prioritizes profit over safety, noting that:
“We make money from ads, and advertisers consistently tell us they don’t want their ads next to harmful or angry content. And I don’t know any tech company that sets out to build products that make people angry or depressed. The moral, business and product incentives all point in the opposite direction.”
The complexity of the various elements at play makes it hard to know what, exactly, Facebook’s true motivation is in this respect. But Haugen’s contention is that Facebook’s News Feed algorithm, in particular, is a key source of the platform’s issues, with the system showing users more divisive, more angst-inducing content in order to spur engagement, and therefore drive more Facebook usage.
Haugen’s proposed solution is to force Facebook to remove engagement-based algorithms entirely, via reforms to Section 230 laws in the US, a move that Facebook will likely oppose, given the benefits that it does derive from algorithm-defined engagement.
Instead, Facebook has sought to provide alternative solutions, such as reducing algorithmic influence in the amplification of certain content. But the real question Facebook needs to answer is whether its algorithms do, in fact, amplify divisive, misleading and harmful content. If that’s the case, then some level of reform does seem necessary, with the specifics potentially built into law, as per Haugen’s suggestion.
On this front, Facebook VP of Global Affairs Nick Clegg says that Facebook would be open to letting regulators access the algorithms it uses to amplify content. That could be a big step – and in a broader sense, Facebook does appear to be open to legal reforms that would apply to all social platforms, and would take some of these decisions out of its hands.
From a PR perspective, the look here is not great, with Facebook portrayed as a source of underlying evil that’s willfully amplifying harmful movements for its own gain. But in actuality, it does seem as though Facebook is on board with calls for increased movement on this front, even if these revelations have not come about in the way it would have preferred.
Facebook hasn’t helped itself with its combative, defensive approach to the latest revelations, but Haugen’s efforts may indeed be a critical step in improving broader understanding of the impacts of social networks, and the ways in which we can improve such systems to address these key areas of concern.
It’s early days yet, but Facebook now appears to be saying the right things. And if the company does indeed allow more access to its internal systems, that could be a major step in the right direction for addressing these concerns.