It is no secret that Facebook, now known as Meta, runs a problematic enterprise. From the 2016 investigation into Russian influence on the platform to the facilitating role it more recently played in the January 6th riot, Facebook's current mode of operation has created many real, tangible, negative consequences. The Facebook Files, a series of articles compiled by the Wall Street Journal from leaks of Facebook's internal communications by the whistleblower Frances Haugen, offers an analysis of how that came to be. Facebook's fundamental problem emerges through those articles: its business model makes meaningful change risky. Yet even as Facebook attempts - or at least pretends to attempt - to remedy its shortfalls, the real, tangible consequences it has already created raise concern for democracy in the United States.
Facebook's business model relies heavily on user engagement. The company itself is composed of two parts: one that focuses on apps, like Facebook and Instagram, and the other on hardware, such as Oculus, which produces virtual reality equipment, and Portal, a device for video conferencing. Facebook's quarterly reports reveal that the company relies overwhelmingly on its apps, which accounted for 97.4% of all revenue in the last quarter of 2021. That app revenue, in turn, consists almost entirely of advertising, at 99.5%. Facebook presents advertisements to users as they browse content, so the heavy dependency on advertising revenue translates directly into a heavy reliance on user attention and engagement.
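To make that dependency concrete, the two figures above can be combined. The arithmetic below is a minimal sketch using only the percentages quoted in this essay; it implies that roughly 97% of total revenue in that quarter depended directly on advertising, and therefore on user attention.

```python
# Rough arithmetic on the revenue figures quoted above (Q4 2021).
# The percentages come from the essay; everything else is illustrative.
app_share_of_total = 0.974   # apps as a share of all revenue
ad_share_of_apps = 0.995     # advertising as a share of app revenue

ad_share_of_total = app_share_of_total * ad_share_of_apps
print(f"Advertising as a share of total revenue: {ad_share_of_total:.1%}")
# -> 96.9%: nearly all revenue rests on showing ads to engaged users
```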
User engagement propelled the company's algorithm change in 2018 and was cited by company executives after the change as a reason against drastic alterations to the platform. After the rate of comments and shares declined in 2017, Facebook changed its ranking algorithm to promote "meaningful social interactions," boosting posts that drew responses from users, such as likes, comments, and reshares. The change succeeded in increasing user engagement. At the same time, however, it unintentionally promoted politically divisive, incendiary content and pushed political parties to deliver more negative messages to remain competitive on the platform. A remedy was identified: dialing back the predictive model that boosts posts people are likely to comment on or reshare. However, according to the Journal's reporting, "Mr. Zuckerberg said he didn't want to pursue it if it reduced user engagement."
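The mechanism at issue can be pictured as an engagement-weighted ranking score. The sketch below is not Facebook's actual code: the weights, field names, and the dial-back knob are hypothetical, intended only to illustrate how boosting predicted comments and reshares, and later reducing that boost, would work in principle.

```python
from dataclasses import dataclass

@dataclass
class PostSignals:
    """Predicted probabilities that a viewer interacts with a post (hypothetical signals)."""
    p_like: float
    p_comment: float
    p_reshare: float

# Hypothetical weights: comments and reshares count far more than likes,
# mirroring the idea behind the 2018 "meaningful social interactions" change.
WEIGHTS = {"like": 1.0, "comment": 15.0, "reshare": 30.0}

def rank_score(post: PostSignals, msi_dial: float = 1.0) -> float:
    """Score a post for feed ranking.

    msi_dial < 1.0 models the proposed remedy of dialing back the boost
    given to posts people are likely to comment on or reshare.
    """
    base = WEIGHTS["like"] * post.p_like
    msi = WEIGHTS["comment"] * post.p_comment + WEIGHTS["reshare"] * post.p_reshare
    return base + msi_dial * msi

# A divisive post with high predicted reshares far outranks a benign one...
divisive = PostSignals(p_like=0.10, p_comment=0.08, p_reshare=0.06)
benign = PostSignals(p_like=0.30, p_comment=0.02, p_reshare=0.01)
print(rank_score(divisive), rank_score(benign))            # 3.10 vs 0.90
# ...and the gap shrinks when the predictive boost is dialed back.
print(rank_score(divisive, 0.3), rank_score(benign, 0.3))  # 1.00 vs 0.48
```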
Facebook's concern over public relations is also a symptom of its dependency on user engagement. Bad PR has historically prompted direct responses against Facebook, such as the Delete Facebook movement. In addition, the company faces pressure from Republicans and Democrats alike: the former accuse it of discriminating against conservatives by algorithmically disfavoring right-wing content, while the latter criticize it for allowing far-right publishers on its site. After the 2016 election and the backlash over Russian misinformation campaigns, Facebook instituted two tools: "Sparing Sharing," which reduced the reach of "hyperposters," and "Informed Engagement," which "reduced the reach of posts that people were more likely to share if they hadn't read them." An internal report in 2019 found that the two mechanisms suppressed major far-right publishers, even though that had not been the intent; following the report, "Informed Engagement" was shut down.
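Both tools can be thought of as content-agnostic dampers layered on top of the ranking score. The sketch below is again an assumption: the thresholds, multipliers, and signal names are invented, and serve only to show how the reach of "hyperposters" or of likely-unread reshares could be reduced without inspecting the content itself.

```python
def reach_multiplier(posts_per_day: float,
                     p_share_without_reading: float,
                     hyperposter_threshold: float = 30.0,
                     damping: float = 0.5) -> float:
    """Content-agnostic damping factor in (0, 1] applied to a post's reach.

    All thresholds and signal names are hypothetical illustrations of
    "Sparing Sharing" (down-weighting hyperposters) and "Informed Engagement"
    (down-weighting posts likely to be shared without being read).
    """
    multiplier = 1.0
    if posts_per_day > hyperposter_threshold:
        multiplier *= damping                              # "Sparing Sharing"
    multiplier *= 1.0 - damping * p_share_without_reading  # "Informed Engagement"
    return multiplier

# A hyperposter whose links tend to be shared unread loses most of its reach,
# regardless of what the content says; an ordinary poster is barely affected.
print(reach_multiplier(posts_per_day=80, p_share_without_reading=0.9))  # 0.275
print(reach_multiplier(posts_per_day=2,  p_share_without_reading=0.1))  # 0.95
```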
In the months leading up to and following the 2020 election, internal discussions about Breitbart and the platform's impact on politics made clear that no solution could untangle the web of conflicting interests among the demands of political neutrality, public relations, and user engagement. Keeping Breitbart on Facebook allows the company to avoid Republican criticism and retain user engagement, since right-wing content is among the "best performing" on the platform. However, the same decision draws blowback from the opposite direction, from those who argue that the site does not belong on the platform. Again, no content-agnostic, general solution was found to resolve this conflict.
Instead, on politically sensitive matters, Facebook pursued direct intervention: it artificially reduced the reach of messages produced by groups associated with real-world violence and prevented those groups from recruiting new members. The squelching of the Patriot Party was Facebook's "test of concept."
The Patriot Party (unrelated to the Patriot Party of the 1970s) formed as a collection of Facebook groups after the January 6th insurrection, growing out of the "Stop the Steal" movement and former president Donald Trump's false claims of electoral fraud. Conversations in those groups were "disproportionally heavy on hate speech and incitement of violence," and Facebook observed their connections to armed movements.
Facebook "made it harder for organizers to share Patriot Party content, restricted the visibility of groups connected to the movement and limited 'super-inviters' from recruiting new adherents," so that the group was eventually forced into dormancy. A similar process was applied to Querdenken, a German conspiracy movement. In April, as an experiment "to see if they could suppress Querdenken by depriving it of recruits and minimizing connections between its existing members."
Such methods, although effective, raise questions about fairness in particular and the role of social media in politics in general. Facebook's internal rules for initiating such a response remain opaque to those outside the company, including the platform's own users. As a result, appealing the platform's decisions is usually difficult unless one is on Facebook's VIP list. Without clearly defined procedures governing such responses, it is difficult to see how Facebook fits into the legal framework of freedom of speech. Although Facebook currently acts only against entities it deems harmful, it nevertheless makes mistakes, and when it does, it is difficult for an average user to prompt the company to correct itself.
In a broader sense, Facebook's capacity to hinder and strangle groups indicates its direct power over civil society. Its ability to prevent ideas from gaining traction marks a qualitative change in how public opinion forms and shifts: although any idea can be freely expressed and rapidly shared with a broad audience, an entity that is, for the time being, opaque and extralegal, and ultimately not accountable to the democratic process, now has the power to annul that liberty.
The story of Facebook is a story of great complexity. A simple dichotomy between social duty and corporate profitability does not do justice to the real problem. Through the Facebook Files, one sees a bitter struggle within the company to understand itself, to stay profitable, and to act responsibly. For an individual, such struggles are simply a fact of life; but for a company that holds, at its whim, the power to steer people in directions independent of their own desires, the room for error becomes minuscule. What is most vital, then, is to devise an acceptable means for society to operate in concert with the company, ensuring that people control their voices and that voices do not control the people.