Ever since Cambridge Analytica, Facebook’s story has played out like a series of PR nightmare episodes, one leading to another, but none quite so damaging as the latest disclosures made by former employee and whistleblower Frances Haugen. Haugen’s disclosures paint Facebook as a multi-headed beast, the poster child for almost everything we fear about the Internet: from preferential treatment for elites to harmful mental health impacts on teenage girls, from ignoring hate speech out of political expediency to misleading the most high-profile accountability mechanisms it has itself set up.
When social media emerged in the 2000s, it was widely seen as a significant shift towards the greater democratisation of information. The mounting loss of faith in mainstream media led many to believe that the new medium would limit the ability of editors, compromised by economic and political compulsions, to play the role of gatekeepers, and that public accountability would emerge from the networked nature of the new media. This vision of social media as a democratising actor rested on the ideal that it would be open, neutral and egalitarian, and enable genuine public-driven engagement.
Over time, it has become clear that, far from being open, neutral or egalitarian, social media platforms impose their own parameters on how information is accessed, which only amplifies the issues plaguing mainstream media.
Social Media as a 'Personal Chef'
The popular metaphor used to describe social media platforms is a “curated information diet”. Prominent sociologist Danah Boyd used it effectively, describing social media as a personal chef who prepares meals daily based on your cravings but pays no heed to what your body needs to stay healthy. Algorithms play on our appetite for sensational content, pushing us towards the more polarised ends of our political persuasions. Facebook has led the way in building successful products that prioritise user engagement, and the advertising revenue that follows from it, over everything else.
In 2010, at its annual F8 conference, Facebook launched Open Graph, a developer-level interface, with much fanfare. At the conference, Zuckerberg described it as a stepping stone towards “building a web where the default is social”.
The new API enabled developers to see not just the connections between people, but also more granular links between their likes and interests. An important part of this strategy was the introduction of the “like” button, which could be used anywhere on the Internet.
This meant that any website could add a like button to any piece of content on its site. Later, in 2012, the “share” button was introduced, and developers were allowed to create apps and buttons that let users perform custom actions on any web object.
This data economy is supported by the actions of several actors, but not all of them get full access to the data, sometimes even when they produce it themselves. Third-party websites that integrate social buttons on their pages can see only aggregate engagement numbers and other high-level insights about their content; they have no systematic view of how that content is being discussed inside the platform. This lets Facebook maximise its data-mining activities while keeping control over the key entities of exchange: data, connections and traffic.
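To make that asymmetry concrete, the sketch below shows roughly what this access looks like in practice. It uses Python and the Graph API’s “URL object” endpoint; the API version, access token and article URL shown here are placeholders, not a definitive integration. A publisher can retrieve aggregate counts of reactions, comments and shares on its own article, and nothing more: not who engaged, and not what they said.

```python
import requests

# Illustrative sketch (placeholder token, version and URL): a publisher asks
# Facebook's Graph API what engagement its own article has received.
ACCESS_TOKEN = "APP_ID|APP_SECRET"  # placeholder app access token

response = requests.get(
    "https://graph.facebook.com/v12.0/",
    params={
        "id": "https://example.com/our-article",  # the publisher's own URL
        "fields": "engagement",
        "access_token": ACCESS_TOKEN,
    },
)

# The URL object's "engagement" field returns aggregate counts only, e.g.
# reaction_count, comment_count, share_count, comment_plugin_count.
# There is no view of the posts, comments or users behind these numbers.
print(response.json().get("engagement", {}))
```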
Failure to Share Data With Researchers
Haugen’s revelations show that Facebook’s own internal research confirms the findings about the harmful effects of its platforms: Facebook was aware of those harms but did little to address them. It is this knowledge, combined with the failure to act on it, that makes the episode play out like a PR nightmare. One likely response from companies like Facebook is to simply stop studying critical questions about the impact of their platforms. Facebook’s limited efforts to share data with outside researchers have already been riddled with controversy, most notably when it failed to share complete data with misinformation researchers, as a New York Times report revealed last month.
Nate Persily was one of the leads of Social Science One, an academic partnership with Facebook meant to allow outside academics to study internal platform data. Persily, who resigned in 2020, has argued that in the absence of legislation compelling the sharing of critical data, companies like Facebook have no incentive to do so.
Persily argues that we face an unprecedented situation where “almost all of the human experience is now taking place on these platforms, which control intimate communications between individuals and possess voluminous information about what users read, forward, ‘like’ and purchase”.
He recommends a complex data-sharing arrangement in which an independent regulator specifies the thresholds and qualification criteria under which platforms must share data. He also recognises the manifold surveillance dangers of governments gaining access to that data and recommends expressly prohibiting such access.
Knee-Jerk Reactions from Governments
What we are witnessing in emerging regulatory efforts across many countries are measures that would take some power away from the platforms and hand it wholesale to the government. This is the biggest red flag as we re-orient our approach to the regulation of Big Tech companies.
It is clear from Haugen’s revelations that the research and academic community needs some access to the data collected by platforms such as Facebook. By allowing profit motives to singularly drive their strategies, Big Tech companies have helped create a situation where the harmful effects of platforms have reached a critical mass, inviting knee-jerk reactions from governments. They may yet have the opportunity to do better: to begin sharing their data with researchers and to rely on independent research to address the harmful impact of their platforms.
(Amber Sinha is a lawyer and the Executive Director of the Centre for Internet and Society in India. The views expressed in this opinion piece are personal.
Disclaimer: Facebook has funded research projects at CIS in the past.
This is an opinion piece and the views expressed above are the author’s own. The Quint neither endorses nor is responsible for the same.)