    CO24147 | Assumptions About Censorship in the Digital Domain Are Not Always What They Seem
    Sean Tan, Tan E-Reng

    07 October 2024


    SYNOPSIS

    Despite their public reputation as libertarian bastions of free speech, large private online communication platforms do not necessarily uphold the principles that underpin the freedom of information, particularly where the public interest is concerned.

    COMMENTARY

    While voices against online content moderation have been chiefly confined to a minority of supposed “free speech absolutists”, the recent arrest of Telegram chief Pavel Durov raised a few more public eyebrows. Unease at Durov’s detention by French prosecutors arose not only from high-profile anti-censorship proponents but also from within communities reliant on Telegram for vital, unfiltered information. A similar situation is seen in Russia, where the messaging app is widely used both by the government and its rivals.

    Denying any political motivations, French authorities have emphasised Telegram’s lack of appropriate moderation and the resultant complicity in cybercrime, including child sexual exploitation, drug-related offences, and other illegal content encrypted on the app. Durov’s arrest evokes familiar clichés, such as “state suppression of speech”, “infringement of the private sphere by public entities”, and debates about the exercise of regulatory oversight by governments.

    However, this framing insinuates that censorship is the sole preserve of states and governments. Inclined as they are to focus on the act of restricting online content, many impassioned defenders of free speech often overlook one of the largest potential curtailers of private (and public) discourse: the private sector itself.

    Who wields the power to censor?

    In the digital realm, content moderation and censorship are sometimes distinguished by intent, and the two are often conflated. The crux of the matter, however, may instead be one of relative reach and influence.

    The notion of censorship imposed by a higher authority is often associated with powerful state organs that retain a “monopoly on control” (echoing Weberian concepts of the state’s monopoly on violence). Newer theories of structural power in political economy, however, emphasise the immense – and still growing – role of large corporations and, consequently, the direct contestation between private enterprises and states across multiple domains. This includes the communications sector, where the sway held by large media companies over information flows and the dissemination of narratives can easily rival (or even surpass) that of states. This outsized influence arguably gives such corporations the power to censor.

    The evolution of large online platforms from the late 20th century to the present day mirrors their similarly transforming influence. On the one hand, these platforms have long adopted common legal measures (such as terms of service and privacy policies), which initially maintained oversight over how users interacted with their content while upholding “open” internet access principles. On the other hand, platform owners can now increasingly exploit user content, including by suppressing material they deem undesirable.

    Who wants the power to censor?

    Durov’s arrest sets a precedent: it establishes criminal accountability for online platform owners over how their platforms are used (and abused). To date, few (if any) owners of large online communication or social media platforms have been held criminally liable for the user-generated activity and content that their platforms host.

    On the surface, Durov’s anti-establishment ethos, his public commitment to user privacy and encryption, and the app’s popularity among opposition movements in authoritarian states imply that Telegram’s owners neither engage in censorship nor intend to do so. Indeed, Telegram’s laissez-faire content moderation philosophy has even earned it a reputation as a “go-to app for troublemakers”.

    Yet, responses to Durov’s arrest from his most ardent supporters have been revealing, sometimes even contradictory. X owner Elon Musk, who was quick to reiterate his description of content moderation as “propaganda” for censorship, has nonetheless proven a willing (and prolific) remover of content on his own platform. Decisions to obscure content are often shaped by Musk’s personal views, and some evidence even suggests that the platform deliberately limits users’ access to politically opposed sources.

    Conservative American political commentator Tucker Carlson also labelled Durov “a living warning to any platform owner who refuses to censor the truth at the behest of governments”. Speaking with Carlson last April, Durov had emphasised his reluctance to comply with government directions to restrict access to certain forms of content. He stated that he would not consider requests deemed to impinge upon Telegram’s values of free speech and privacy.

    However, Durov’s stance can also be interpreted as a subjective and discriminating judgement on appropriate content, grounded in a platform owner’s personal beliefs. Moreover, even assuming a hands-off approach to content regulation by default, Telegram, contrary to its pro-privacy stance, does not automatically enable end-to-end encryption on most user conversations, which suggests that it retains far greater visibility into private communications than it publicly discloses.

    Where else might we identify “corporate censorship”?

    The above examples illustrate large online platforms’ readiness to scrutinise users’ speech and expression according to their owners’ interests. However, personal objectives can also be strongly intertwined with commercial priorities. Examples include Google’s internal content moderation policies regarding the Israel-Hamas conflict, which are linked to its contract to provide Israel with cloud computing services, and moderation on X prompted by the potential loss of lucrative advertising revenue.

    Having said this, corporate censorship does not necessarily reflect commercial or private interests in competition with state or public agendas. In China, where private sector companies have significant state ties, the state may use influential private firms as tools to achieve its political aims. This has recently led to speculation about politically motivated censorship on Chinese private gaming platforms, such as instructions for players to avoid discussing specific political issues and COVID-19-related content on the newly released game Black Myth: Wukong.

    Final Takeaways

    Despite regulatory efforts by authorities to rein in the outsized influence of large private corporations and their online platforms, some members of the public – particularly those who favour certain apps and platforms for personal or political reasons – may nonetheless remain suspicious about state intentions.

    To improve transparency and increase public trust, governments can augment ongoing regulatory efforts with initiatives that broaden the public’s involvement in the regulatory process. For example, the European Union’s Digital Services Act includes mechanisms to provide public interest researchers with access to internal platform data, to appoint trusted independent flaggers to detect illegal content, and to enable whistleblowers with insider knowledge about platforms to notify authorities of any potentially unlawful activity.

    In addition, existing laws that ostensibly protect the interests of large online platforms may be reinterpreted in ways that curb their outsized influence. A current example concerns Section 230 of the Communications Decency Act in the United States, which typically exempts large online platforms from liability for user-generated content and activity. As the law also exempts platforms from liability for taking down objectionable content, some lawmakers are considering whether Section 230 could be read to empower users to remove or customise content on online platforms at their own discretion rather than that of powerful corporations.

    Most importantly, the multifaceted and complicated nature of corporate censorship underscores not only the seriousness of the problem but also the need to avoid well-worn assumptions and clichés about online content moderation.

    About the Authors

    Sean Tan is a Senior Analyst and Tan E-Reng is a Research Analyst in the Centre of Excellence for National Security (CENS) at the S. Rajaratnam School of International Studies (RSIS), Nanyang Technological University (NTU), Singapore.

    Categories: RSIS Commentary Series / Country and Region Studies / East Asia and Asia Pacific / South Asia / Southeast Asia and ASEAN / Global