Regulating the Digital Public Sphere

As the damaging impact of disinformation spread on social media becomes more evident, authoritative voices are calling for regulatory reform. Jack M. Balkin has proposed policy measures that could achieve this effectively.


As Manjoo (2017) wrote in The New York Times, Facebook "has become the largest and most influential entity in the news business, commanding an audience greater than that of any American or European television news network, any newspaper or magazine in the Western world and any online news outlet"[i].

In the early years of social media, there was ambivalence about its potential to serve as the instrument of a democratically sound digital public sphere. The increased spread of disinformation on these platforms, however, has tarnished that perceived potential.

The UK’s Disinformation and ‘fake news’ Report, drawn up in 2019 by the Digital, Culture, Media and Sport Committee (DCMSC) of the House of Commons, defines disinformation as “the deliberate creation and sharing of false and/or manipulated information that is intended to deceive and mislead audiences, either for the purposes of causing harm, or for political, personal or financial gain” [ii].

Bennett and Livingston (2018) sharpen the definition by characterising this deliberate activity as a “malicious” act in which actors plant or seed “strategic deceptions that may appear very credible to those consuming them”[iii].

The real-world impact of disinformation became evident during the 2016 U.S. presidential election, when agents of disinformation sought to influence the result by publishing false information to discredit political candidates. One such operation was traced to Veles, a small town in central North Macedonia [iv]. Its orchestrator, the entrepreneur Mirko Ceselkoski, trained local residents in clickbait techniques targeting a U.S. presidential candidate.

Even former U.S. president Donald Trump deployed disinformation through his repeated use of the slogan “fake news”, encouraging his supporters to distrust the media and discrediting it as a reliable source of information [v].

Ruiz and Nilsson (2022) note that policymakers such as the European Commission, the North Atlantic Treaty Organization, and the World Health Organization, as well as tech firms, have all identified disinformation as “a growing threat for which we lack effective countermeasures” [vi].

Three recent attempts to investigate effective regulatory policy targeting the sharing of disinformation on social media are the DCMSC’s Disinformation and ‘fake news’ Report, the Cairncross Review, and the European Commission’s Report on Disinformation [vii].

The DCMSC characterised Meta-Facebook as ‘digital gangsters’ because of its track record of apathy when confronted by authoritative voices urging it to remedy the spread of disinformation on the platform.

In 2018, Dana Priest, James Jacoby and Anya Bourg published an article in The Washington Post revealing that activists, civil society organisations, and journalists had warned Facebook about Russian disinformation targeting Ukraine. Meta-Facebook apologised for its failure to resolve the issue swiftly, and it emerged that several of the social media giant’s employees had known it was happening [viii].

Rosa (2022) states that with the rise of social media and rampant disinformation campaigns, we are “forcibly presented with the question as to how the idea of public opinion and the public sphere can be understood at all under conditions of the present” [ix]. The election of Donald Trump, for example, took place despite the opinions and recommendations of most of the intellectual elite. Social media has clearly changed the dynamics of public communication and opinion formation, with pronounced effects on politics and on existing theories of democracy and the public sphere.

Staab and Thiel (2022) point out that the twenty-first-century digital public sphere of social media is compromised by the market-driven practice of surveillance capitalism through data collection. On platforms such as Meta-Facebook, data is the by-product of digital communication: whenever we communicate, we leave digital traces. Unlike twentieth-century media, social media is not unidirectional but multidirectional; the recipients of information are also its senders. This communication generates data that a third party, the social media platform, sells to advertisers as raw material. That creates incentives to structure the communication itself, corrupting the bidirectional exchange that democracy promises [x].

Jack M. Balkin, Knight Professor of Constitutional Law and the First Amendment at Yale Law School, delivered a 2019 keynote address at the Association for Computing Machinery Symposium on Computer Science and Law in New York, in which he discussed social media as a digital public sphere and how he believes it should be regulated.

Balkin identifies three primary functions of social media. First, it facilitates the public’s participation in art, politics, and culture. Second, it provides channels for public communication. Third, it showcases public opinion, which is regulated, in a sense, through community standards and terms of service that often dictate how far and how fast content propagates [xi].

He contrasts the twentieth-century public sphere with today’s digital one. Social media does not only publish material scrutinised by legally accountable bodies; it publishes everyone’s content. Platforms can intercept and remove content that violates community standards even when the law recognises that content as legal, which creates problems for the democratic pillar of free speech in this public sphere [xii].

According to Balkin, for social media to be recognised as an institutional public sphere, it needs, like any other institution, to be regulated by professional and public-regarding norms. This would incentivise measures that reinforce public perception of these organisations as trustworthy sources of information. But these professional norms and practices must be implemented in a way that does not threaten the principle of free speech [xiii].

Policy reform targeting the proliferation of disinformation would mean holding social media giants accountable for the spread of harmful information, which would require these organisations to take bigger steps in self-regulation.

In his address, Balkin proposes a few measures that could stem the flow of disinformation into users’ feeds. Regulation, he argues, should not target basic internet services, as some governments and activist groups propose when combating spreaders of disinformation; it should instead begin with social media platforms and search engines [xiv].


A common criticism of the self-regulation conducted by social media companies is their overreliance on artificial intelligence (AI). Meta-Facebook and other platforms keep tweaking AI technologies to identify and take down bad content, but Scharre (2019) notes that such measures fail when the data a system is trained on does not align with its operating environment [xv].

To help with the moderation workload, platforms also rely on users, civil society organisations, and government bodies to report concerning posts.

Balkin argues that these companies hesitate to make content moderation and regulation a priority because it is not in their financial interest. They began as technology companies pursuing growth through digital surveillance and behaviourally targeted advertising, and incendiary content is good for business [xvi].

Social media companies recognise that their success depends on public relations, since their business requires the public’s trust. This obliges them to protect end-user autonomy and to maintain platforms that facilitate democracy and promote free speech, though policy that actually achieves this has yet to materialise.

Balkin, however, does not see these companies changing their business model of data collection and behavioural advertising.

Meta-Facebook has taken some positive steps, such as establishing the Oversight Board for Content Decisions. The board’s purpose is to protect freedom of expression by making independent, principled decisions about content on Facebook and Instagram. It consists of 40 members from multiple countries, who select the problematic content cases they will review [xvii].

While Balkin recognises this as a step in the right direction, he doubts the board’s effectiveness, since it can review only a fraction of the disinformation posted on the platform [xviii].

Balkin offers options for reforming the regulation of social media content. The state, he believes, must respect the editorial rights of social media companies; reform should instead incentivise them to promote the public good and maintain a healthy digital public sphere.

He describes three ‘policy levers’ that could give social media companies incentives to maintain a healthy digital public sphere: first, antitrust and competition law; second, privacy and consumer protection law; third, a balance of intermediary liability and intermediary immunity [xix]. Implemented correctly, these levers should not impede users’ free speech.

Balkin stresses that these levers work symbiotically: each is needed to facilitate the others.

Under the first lever, competition policy should aim to generate a larger number of smaller companies with differing applications and norms. Start-ups should not be bought up by larger companies in order to shelve innovations that could compete with established platforms. Competition policy should also separate control over advertising brokering, ad delivery, content delivery, and content moderation.

Currently, all of these tasks are performed within a single company; Balkin believes delegating them to intermediary companies could streamline their fulfilment. A relaxation of antitrust law would also allow media companies to bargain with social media organisations over advertising.

For the second lever, privacy and consumer protection, Balkin proposes a fiduciary model for regulating both social media companies and internet service providers. This model rests on a relationship of trust between the platform and its users: the company must acknowledge a duty not to use users’ data in ways to which they have not consented.

Under this model, social media companies owe their users a duty of care, a duty of confidentiality, and a duty of loyalty. It requires digital media companies to alter the dynamics of their current relationship with users, which treats them as a product of interest to advertisers [xx].

This would have consequences for the methods of surveillance capitalism these companies employ. The model is flexible: it can be implemented through statute, administrative regulation, or judicial doctrine.

The final lever balances intermediary liability against intermediary immunity. Balkin believes intermediary immunity is the only viable way to implement this lever so as to encourage social media companies to prioritise behaviour that serves the public.

Balkin notes that social media companies currently lack incentives to invest in content moderation services. Under this lever, the state could enforce a minimum level of moderation through employment and labour law, enticing companies to hire more human moderators, and could audit companies to ensure sufficient steps have been taken to satisfy moderation requirements.

If intermediary immunity proves unviable, government may implement distributor liability instead: a company remains immune until it is notified that it is distributing unlawful content. Distributor liability could also extend to paid advertisements.

Together, these three policy levers proposed by Balkin could push social media companies to ramp up self-regulation and change their business models, facilitating a more democratically sound digital public sphere.


[i] Manjoo, F. (2017) ‘Can Facebook Fix Its Own Worst Bug?’. The New York Times Magazine, 25 April. Available at: https://www.nytimes.com/2017/04/25/magazine/can-facebook-fix-its-own-worst-bug.html (Accessed: 12/01/2023).

[ii] European Commission (2018) A Multi-Dimensional Approach to Disinformation: Report of the Independent High Level Group on Fake News and Online Disinformation. Available at: https://ec.europa.eu/newsroom/dae/document.cfm?doc_id=50271 (Accessed: 12/01/2023).

[iii] Bennett, W. L. and Livingston, S. (2018) ‘The Disinformation Order: Disruptive Communication and the Decline of Democratic Institutions’. European Journal of Communication, 33(2), pp. 122–139. DOI: 10.1177/0267323118760317.

[iv] Hughes, H. C. and Waismel-Manor, I. (2021) ‘The Macedonian Fake News Industry and the 2016 U.S. Election’. PS: Political Science and Politics, 54(1), pp. 19–23. DOI: 10.1017/S1049096520000992.

[v] Mould, T. (2018) ‘Introduction to the Special Issue on Fake News: Definitions and Approaches’. Journal of American Folklore, 131(522), pp. 371–378. DOI: 10.5406/jamerfolk.131.522.0371.

[vi] Ruiz, C. R. and Nilsson, T. (2022) ‘Disinformation and Echo Chambers: How Disinformation Circulates on Social Media Through Identity-Driven Controversies’. Journal of Public Policy & Marketing, 42(1), pp. 18–35. DOI: 10.1177/07439156221103852.

[vii] Iosifidis, P. (2020) ‘The Battle to End Fake News: A Qualitative Content Analysis of Facebook Announcements on How It Combats Disinformation’. International Communication Gazette, 82(1), pp. 60–81. DOI: 10.1177/1748048519880729.

[viii] Priest, D., Jacoby, J. and Bourg, A. (2018) ‘Russian disinformation on Facebook targeted Ukraine well before the 2016 U.S. election’. The Washington Post, 28 October. Available at: https://www.washingtonpost.com/business/economy/russian-disinformation-on-facebook-targeted-ukraine-well-before-the-2016-us-election/2018/10/28/cc38079a-d8aa-11e8-a10f-b51546b10756_story.html (Accessed: 12/01/2023).

[ix] Rosa, H. (2022) ‘Social Media Filters and Resonances: Democracy and the Contemporary Public Sphere’. Theory, Culture & Society, 39(4), pp. 17–35. DOI: 10.1177/02632764221103520.

[x] Staab, P. and Thiel, T. (2022) ‘Social Media and the Digital Structural Transformation of the Public Sphere’. Theory, Culture & Society, 39(4), pp. 129–143. DOI: 10.1177/02632764221103527.

[xi] Balkin, J. M. (2020) How to Regulate (and Not Regulate) Social Media. Available at: https://s3.amazonaws.com/kfai-documents/documents/555c172896/3.25.2020_-Balkin-New-Layout-p5d3–4-.pdf.

[xii] Balkin, J. M. (2020) How to Regulate (and Not Regulate) Social Media. Available at: https://s3.amazonaws.com/kfai-documents/documents/555c172896/3.25.2020_-Balkin-New-Layout-p5d3–4-.pdf.

[xiii] Balkin, J. M. (2020) How to Regulate (and Not Regulate) Social Media. Available at: https://s3.amazonaws.com/kfai-documents/documents/555c172896/3.25.2020_-Balkin-New-Layout-p5d3–4-.pdf.

[xiv] Balkin, J. M. (2020) How to Regulate (and Not Regulate) Social Media. Available at: https://s3.amazonaws.com/kfai-documents/documents/555c172896/3.25.2020_-Balkin-New-Layout-p5d3–4-.pdf.

[xv] Scharre, P. (2019) ‘Killer Apps: The Real Danger of an AI Arms Race’. Foreign Affairs, May/June. Available at: https://www.foreignaffairs.com/articles/2019-04-16/killer-apps (Accessed: 12/01/2023).

[xvi] Balkin, J. M. (2020) How to Regulate (and Not Regulate) Social Media. Available at: https://s3.amazonaws.com/kfai-documents/documents/555c172896/3.25.2020_-Balkin-New-Layout-p5d3–4-.pdf.

[xvii] Oversight Board (2022) Ensuring Respect for Free Expression, Through Independent Judgment. Available at: https://www.oversightboard.com/ (Accessed: 09/01/2022).

[xviii] Balkin, J. M. (2020) How to Regulate (and Not Regulate) Social Media. Available at: https://s3.amazonaws.com/kfai-documents/documents/555c172896/3.25.2020_-Balkin-New-Layout-p5d3–4-.pdf.

[xix] Balkin, J. M. (2020) How to Regulate (and Not Regulate) Social Media. Available at: https://s3.amazonaws.com/kfai-documents/documents/555c172896/3.25.2020_-Balkin-New-Layout-p5d3–4-.pdf.

[xx] Balkin, J. M. (2020) How to Regulate (and Not Regulate) Social Media. Available at: https://s3.amazonaws.com/kfai-documents/documents/555c172896/3.25.2020_-Balkin-New-Layout-p5d3–4-.pdf.
