It has been a difficult week for Facebook. On Monday, its family of companies – including Facebook, WhatsApp, Instagram, Messenger and Oculus – suffered their largest service interruption to date. For six hours all of the platforms were offline because a routine maintenance process went awry. Billions of users were unable to access their services while company staffers were virtually and physically locked out of the systems they needed to fix the issue.
Then on Tuesday, former Facebook employee and whistle-blower Frances Haugen testified before the United States Congress that the company deliberately puts profit over protecting people by allowing harm to children in particular, and to democracy more broadly. Despite Facebook’s efforts to counter Haugen’s testimony in various outlets, her account was devastating for the company and came as Congress deliberates the possibility of legal or regulatory action against it.
The day before Haugen testified, the world had received an inadvertent reminder of why this is an urgent issue.
If these two things seem disconnected, it is because you have not been paying attention to Facebook’s growing market dominance as a social networking platform and as a communications provider. Today, an estimated two billion people in more than 180 countries use the WhatsApp messaging platform, while at least 3.5 billion people use Facebook. Instagram, while not as popular as these two, is increasingly important for small businesses in several countries, which use it to build and manage their client bases in lieu of building their own websites.
These platforms are unambiguously important to the global digital society because of their sheer size, and that means that small internal decisions to look the other way when people misuse them are significantly amplified, and easily transmitted across international borders. Positive nudges on Facebook drive people to the polls, but misinformation on the same platform drives people to drink horse medicine.
Devastating revelations about how the company thinks about its responsibility towards users coming on the heels of a service failure of this scale raise a simple yet fundamental question: Is Facebook ready for the future it is building and are we prepared to live in it?
From the way Facebook has handled Haugen’s testimony, as well as the service interruption, it is evident that it does not fully understand the behemoth it has constructed. In lay terms, the service interruption happened because a software update essentially locked Facebook out of the backend systems that govern not only how each of its platforms functions, but also how the company itself runs.
If, between Facebook and WhatsApp alone, there are at least five billion individual accounts, you have to wonder why anyone thought it was a good idea to centralise all of that information in such an elementary way. It is the kind of over-centralisation that gives competition lawyers heartburn and compels governments to intervene to stop companies from getting too big.
If Facebook were merely a large company that people depended on to communicate, that would have been bad enough. But it is a large company that people depend on to communicate which also collects, monetises and transforms the personal data that people provide to it for this communication, and then holds it in opaque systems that are always two steps behind critical political developments. This perhaps explains the simple question that Congress put to Haugen: Is it time to break up Facebook?
The pure economic argument is that as long as the company is growing, it should be allowed to keep growing; after all, it is creating jobs and growing economies. But jobs and economies do not exist outside social and political contexts and will mean nothing if societies collapse. The justification for allowing indefinite growth is feeble, particularly when the evidence that Haugen provided suggests that the company is unwilling to change course even when confronted with proof that it harms societies.
The company’s policies on dealing with the sociology and moral economy created by the unprecedented concentration of data in its hands are wanting. It is seemingly unable or unwilling to understand that making communication easier means that people of bad intent will also find it easier to communicate.
There are fundamental questions of society that strike at the heart of Facebook’s business model and that need more rigorous analysis than a couple of one-off company statements. Should a company be able to monetise information that people provide for free in order to maintain their social networks? Should political information be treated differently from commercial information, and if so, how? Is advertising the only model for funding social networks? What obligations do these companies have to societies or markets where they are not registered but from which they still want to profit? These are philosophical questions about the nature of society in the digital era that cannot be papered over by empty rhetoric about economic growth.
Indeed, history is replete with examples of corporations that grew too large and wielded too much influence, and of the knock-on effects this had on societies, particularly when those corporations collapsed. But Facebook’s own history is full of warning signs that the company’s financial growth has outpaced its comprehension of its social responsibilities.
Early in 2021, there was a widespread backlash when the company made it easier for information to be shared between WhatsApp and Facebook. Users resisted the change by migrating from the platform to competitors like Signal and Telegram, forcing the company to backtrack on the threat that anyone who rejected the new terms of service would lose functionality on the application.
Consider that many of Facebook’s users are in the West, but WhatsApp is only now growing in popularity there; conversely, millions of users in the developing world use WhatsApp, but do not have Facebook accounts. This suggests that a large number of the people who are using WhatsApp primarily as a messaging platform are not interested in having it integrated into their Facebook profiles, if they have any.
It was an ill-advised data grab that underscored how out of step the tech giant’s growth strategy is with what people actually want from its platforms.
Haugen said that she did not believe the company needed to be broken up, but European regulators disagree. In 2020, the European Commission proposed a set of content policies designed to make Big Tech companies more accountable for the harms incubated on their platforms, promising fines of up to six percent of global revenues, expanded anti-trust fines of up to 10 percent of future revenues, and forced sales of parts of their business if they continue to violate the rules.
The companies – including Amazon, Twitter, Google and Facebook – resisted the policy proposals although they did offer to work with regulators to find alternatives. The alternatives have been slow to come.
If asked, most people probably would not want all their information centralised, monetised and transformed the way Facebook and other social networking sites are doing now. They offer the information up to connect with family and friends, not for it to be bundled and converted into advertising or information products that are sold to the highest bidder regardless of their intent.
But the platforms’ algorithms and backend architecture are deliberately shrouded in so much secrecy that, evidently, sometimes even their own staffers do not fully know how to fix them. And this is where Haugen’s testimony on how Instagram harms minors and how Facebook harms democracy comes together: the company knows it is happening, does not seem to fully understand why, and is unwilling to take the measures needed to stop it if doing so would hurt profitability.
One of Haugen’s most powerful observations during the hearing was that “until incentives change at Facebook, we should not expect Facebook to change”. The incentives that drive the company – like any other – are based on the perverse logic of neoliberal economics: that companies must grow indefinitely and that all growth at any cost is good.
But the service outage and the revelations are an invitation to reconsider this economic model, to remind ourselves that there is such a thing as too much concentration of power, and to sincerely engage with the question of what role social networking sites should play in the future we want to live in. And taking up this invitation is a matter of urgency.
Nanjala Nyabola is a writer and political analyst based in Nairobi, Kenya. She is the author of “Digital Democracy, Analogue Politics”, a book on the impact of the internet on Kenyan politics (Zed Books, 2018)