    CO18200 | Elections Integrity in Fake News Era: Who Protects, and How?
    Shashi Jayakumar

    29 November 2018

    Synopsis

    How are disinformation and fake news threats evolving? What are the key pressure points in democratic societies, and what can be done to protect against these threats? How should responsibility be apportioned?

    Commentary

    AN INTRIGUING, but all too brief, section in the Singapore Parliamentary Select Committee report on Deliberate Online Falsehoods deals with how to protect elections in Singapore against potential foreign interference. The Committee notes that it did not receive detailed analysis as to whether Singapore’s electoral laws are sufficiently comprehensive and modernised to combat “the sophisticated methods employed by malicious actors today to undermine elections”.

    The dangers are clear enough. State-sponsored disinformation and subversion can undermine the workings of democratic society, weaken the trust between government and people, and erode pluralism. While it might be difficult for an influence campaign to ensure a particular outcome in an election, it is perfectly feasible, as former Facebook Chief Security Officer Alex Stamos recently observed, for such a campaign to throw any election into chaos.

    Foreign Disinformation in Singapore & Region

    The litany of recent (and indeed ongoing) cases is plain to see. Russian activities during the 2016 US presidential election received extensive coverage in the Singapore Parliamentary Select Committee report.

    The temptation might be to take the report as purely an academic exercise simply because the Russians might seem far away. This would be a mistake. The Select Committee received a confidential briefing by a security agency in Singapore, detailing how Singapore has been the subject of foreign disinformation operations by various states.

    There are also other examples close to home. Consider the baffling and still-unexplained rise of a Twitter automated account (“bot”) army across Southeast Asia earlier this year. Its appearance may or may not have been tied to the Malaysian general election, which saw its Twittersphere flooded with pro-government and anti-Opposition messages from bots of unknown origin. The threat is therefore real.

    The Singapore Context

    Singapore’s Parliamentary Elections Act contains clauses prohibiting certain classes of people, such as foreigners, from taking part in any election activity. Of course, this is not a silver bullet. Besides direct election interference by states, there is the issue of guns for hire in the international corporate sector that offer similar social media manipulation toolkits to the highest bidder.

    The most notorious example was Cambridge Analytica and its parent, SCL Group (formerly Strategic Communication Laboratories). Entities like these will likely continue to operate partly in the shadows.

    If the Singapore authorities have reasonable grounds to believe that the methods such entities employ on clients’ behalf might have a deleterious impact on Singapore, or might influence an electoral outcome in Singapore, then it would be logical for the government to attempt to hold these companies, and certain individuals within them, accountable.

    This might seem far-fetched at present, but it would be consistent with the approach taken for Singapore’s Transboundary Haze Pollution Act, where individuals within companies thought to be responsible for forest fires elsewhere (that in turn have a negative impact on Singapore) can be held accountable.

    This could come through enacting new legislation specifically to curb foreign interference of this type, or through amendments to existing legislation.

    Social Media Companies

    There is also a need to work with social media companies, which are showing belated signs of stepping up to the plate. Facebook set up a “war room” specifically to deal with challenges posed by fake news during the recent elections in Brazil and elsewhere. Despite its efforts in Brazil, false news proliferated across Facebook and WhatsApp during the polarising campaign.

    Observers and advocacy groups were concerned in particular with an explosion in the number of well-organised propaganda campaigns (including hoaxes and misleading news) on WhatsApp, seemingly orchestrated by supporters of the eventual far-right victor Jair Bolsonaro. There were calls – not heeded by Facebook, which owns WhatsApp – for WhatsApp to lower its forwarding limit in Brazil from 20 recipients to five (as it has done in India) in order to reduce the impact of these campaigns.
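
    To see why advocates focused on the forwarding cap, a rough back-of-the-envelope bound helps: if every recipient in turn forwarded a message to the maximum number of chats allowed, the cap would set an exponential ceiling on how far a falsehood could cascade within a few hops. The sketch below is purely illustrative, not WhatsApp’s implementation, and its worst-case assumption (every recipient forwards onward, each chat counted as one person) overstates real-world spread; it simply compares the ceilings implied by a cap of 20 versus a cap of five.

        # Hypothetical illustration (not WhatsApp's actual code): upper bound on how many
        # users a forwarded message can reach after a given number of forwarding "hops",
        # assuming every recipient forwards it to the maximum number of chats allowed
        # and each chat is a single new user.
        def max_reach(forward_cap: int, hops: int) -> int:
            total = 0
            senders = 1  # the original poster
            for _ in range(hops):
                new_recipients = senders * forward_cap
                total += new_recipients
                senders = new_recipients
            return total

        print(max_reach(20, 3))  # cap of 20 (Brazil at the time): up to 8,420 users within three hops
        print(max_reach(5, 3))   # cap of 5 (as in India): up to 155 users within three hops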

    It is worth pondering, firstly, whether laws are needed that can compel WhatsApp (which is part of Facebook) to do what advocates in Brazil failed to achieve by suasion, especially in the heat of electoral battle. Secondly, and perhaps more importantly, will companies like Facebook be prepared to open up the inner workings of their war rooms to governments in future elections to allow for real-time input, verification and investigation? If not, should legal means be employed?

    Real-world cooperation between the social media platforms and governments will be key. Having these two actors operate in a clearinghouse fashion to settle issues as they come up in the heat of election campaigning is preferable to increasing the burden on the judicial system.

    Consider, for example, the proposed French law on information manipulation. The law, recently passed by the National Assembly, will allow political parties or candidates to complain about widely spread assertions deemed to be false or “implausible” during the run-up to elections. The general premise has come under a great deal of criticism from free speech advocates, as have the specific provisions, which require a judge to decide within 48 hours whether the allegedly false information could alter the course of an election.

    If these conditions are met, the judge can order a block on publication. The difficulty is that this places a burden on the judicial system: potentially intractable questions of interpretation may arise, and hasty decisions may be made while facts are still emerging.

    What Lies Ahead

    Aggressors are continually honing their methods. Those who set up fake accounts aimed at influencing the United States mid-term elections took far greater pains to hide their identities than the Kremlin-linked Internet Research Agency did when it interfered in the 2016 US presidential election.

    Technology (think Artificial Intelligence, and the use of “Deep Fakes”, which can synthesise video and audio in a manner indistinguishable from the real thing) will increasingly feature in the arsenal of subversive actors.

    Those playing defence are kept, for the most part, on the back foot. What this means is that any new laws, or amendments to existing ones, aimed at combatting fake news and disinformation in or out of an election period will have to be, as far as possible, future-proof, taking these evolutions into account.

    New legal provisions focusing solely on containing the present will quickly become anachronisms.

    About the Author

    Shashi Jayakumar is Head, Centre of Excellence for National Security (CENS) and Executive Coordinator, Future Issues and Technology at the S. Rajaratnam School of International Studies (RSIS), Nanyang Technological University (NTU), Singapore.

    Categories: RSIS Commentary Series / Country and Region Studies / Cybersecurity, Biosecurity and Nuclear Safety / Non-Traditional Security / East Asia and Asia Pacific / South Asia / Southeast Asia and ASEAN / Global