
Defending democracy


Originally published on Luminate.com

Across the world, the public debates that shape our politics are going digital. The online political culture has opened the doors to a diversity of views and expanded free expression. We celebrate that inclusivity.  But these changes have also created major new problems for the integrity of our public sphere – weakening rationality, integrity, civility, and authenticity as pillars of democratic deliberation. Beneath the now-ubiquitous trope of “fake news” as the scourge of rational public debate are deep structural challenges for our democracies.

Anyone with a social media account can watch it happening. The Internet is awash in hate, spam, and disinformation. In part, it is a digital version of the partisan media, political propaganda, and good old-fashioned nonsense that has always been present in public debates. But there is something new here that leverages the power of new technology. There are low-cost, high-impact, state-sponsored attempts to sow hatred and division in other nations. There are organized efforts to drown out honest debate with toxic rhetoric, amplified by armies of automated social media accounts. And at the center are digital platform companies – some of the wealthiest companies in history – whose business model is rooted in tracking the behavior of billions of people and using that data to deliver information and media that keeps them clicking. This is not a recipe for civic virtue.

Painfully, our democracies are being pulled apart by the abuse of information technologies whose powers of enlightenment and education have been bent to the ends of manipulation and conspiracy. We must meet this challenge with public policies to steer technology markets back towards serving the interests of democracy and social welfare.

This work begins with an international alignment around a set of principles – the rights and duties of a Digital Democracy Charter. And it makes change through the implementation of an Action Plan – national strategies that establish rights and responsibilities for the digital future through policy change. This is how we defend democracy from the forces of tribalism, disinformation, hate and intolerance.  This is how we restore a vision of technology as a revolutionary force for the public good.

At the Paris Peace Forum this upcoming week, we will offer a draft Digital Democracy Charter that we invite public, private, and civic sector leaders to use as a foundation to build the future they want to see. The Charter is organized around these principles:

**Principle 1: REMOVE –** We have a right to be protected from illegal content online.

**Principle 2: REDUCE –** We have a duty to shield the public from fraudulent communication.

**Principle 3: SIGNAL –** We have the right to know who is trying to influence our political views and how they are doing it.

**Principle 4: AUDIT –** We have a right to public oversight of the social impact of technologies that automate control over mass information markets.

**Principle 5: PRIVACY –** We have a right to data privacy.

**Principle 6: COMPETE –** We have a duty to protect the public against the exploitation of concentrated market power.

**Principle 7: SECURE –** We have a duty to protect the integrity of our democratic institutions.

**Principle 8: EDUCATE –** We have a duty to educate the public about the social and political impact of new technologies.

**Principle 9: INFORM –** We have a duty to foster a robust public sphere and an informed electorate.

The Charter is a starting point for international collaboration. But it will only matter if we translate it into national action plans that make change. Priorities and implementation strategies will differ among countries – as each nation wrestles with its own particular challenges. And we can learn from each other’s experiences, share ideas, and join forces where it makes sense. This “agile multilateralism” is international relations for the fast-moving age of technology revolutions. But make no mistake: principles mean little without the Action Plans. We hope discussions at the Peace Forum will spark concrete next steps in countries around the world.

We are eager to support organizations and governments working on affirmative policy change. In addition to the draft Charter, we are also presenting a template for an Action Plan – a set of policy proposals that will be relevant across every democracy.

There is no single solution that can meaningfully change outcomes. Only a combination of policies – all of which are necessary and none of which are sufficient by themselves – will begin to show results over time. We can start immediately by treating the worst of the symptoms. That means increasing cyber-security against foreign interference, combating online hate speech and fraud, and increasing transparency in political advertising. Dealing with these urgent issues will lead us to root causes, the structural questions at the center of the data economy and the digital information marketplace. We must establish new data rights, modernize competition policy, and make long-term investments in digital literacy and public service journalism.

We cannot predict the path that will restore the integrity of the democratic public sphere in a digital age. But we can begin with a reform agenda that will help us firmly harness the power and potential of technology to the interests of the common good.

Digital Democracy Charter

Countering Online Disinformation, Strengthening our Democracies

Our democracies are awash in digital disinformation. It manipulates political viewpoints, distorts electoral outcomes, and threatens the integrity of self-government rooted in rational public debate. This phenomenon accompanies a paradigm shift in information markets as the Internet displaces traditional media as the primary distribution channel for news. In many ways, we have benefited profoundly from the decentralization of access to knowledge and communications. But we have also undermined the traditional market for public service journalism and the foundations of democracy that rely upon it. There are fewer professional reporters, and the credibility of traditional newsrooms is declining in the public sphere.

At the center of this emerging crisis is the power of platform monopolies whose business model is to track, target, and segment people into audiences that are highly susceptible to particular kinds of content. It is a highly profitable business model for commercial advertising and personalized information. But it is also an ideal vector for manipulative publishers and advertisers (including covert agents of foreign powers) to find audiences for every type of prejudice and nonsense. Over time, these target audiences become trapped in “filter bubbles” – groups of people categorized by common predispositions and fed a steady diet of similar content that reaffirms preexisting beliefs. To hold the attention of these groups (so they can be shown more ads), platform company algorithms raise the level of outrage and sensationalism, normalizing what were once extreme views. Fragmentation, polarization, propaganda, and manipulation in the news media are externalities of the digital economy.

We now stand at an inflection point. Our democracies are being pulled apart by the abuse of technologies that were once heralded as liberatory. We must design a public policy response to steer technological development back towards serving the wellbeing of democratic society. There is no single solution that can meaningfully change outcomes. Only a combination of policies – all of which are necessary and none of which are sufficient by themselves – will begin to show results over time. We cannot predict the path that will restore the integrity of the democratic public sphere in a digital age. But we can begin with a reform agenda that will take us in the right direction.

**DIGITAL DEMOCRACY CHARTER**

**Principle 1: REMOVE** We have a right to be protected from illegal content. The limited types of content that are already illegal in democratic societies – such as hate speech, defamation, and incitement to violence – should be removed from the Internet. The duty to remove it rapidly should be assigned to major platform companies (who have the necessary resources, technologies and responsibility) with the close supervision of regular judicial review and a transparent process, including a fast-track appeals process. Because of the risk of overreach and infringements on legitimate speech, this practice should be strictly limited.

**Principle 2: REDUCE** We have a duty to shield the public from fraudulent media. Large digital media platforms should maintain responsive channels to receive input from users, civil society organizations, news organizations and commercial partners. In this way, fraudulent media channels, inauthentic accounts, and malicious disinformation can be flagged for review and down-ranked in algorithmic curation before they can go viral.

**Principle 3: SIGNAL** We have the right to know who is trying to influence our political views and how they are doing it. The purveyors of disinformation spread false narratives through the opaque channels of targeted digital advertising and the amplification of bot networks. New regulations should mandate that all automated accounts are clearly labelled. And we should require that the source of an ad, the funding behind it, and the scope of its reach are explicit to the end-user.

**Principle 4: AUDIT** We have a right to public oversight of the social impact of technologies that automate decisions in information markets that influence daily life. The technologies that mine large data sets to make predictive judgements, target advertising, and curate digital media feeds are increasingly sophisticated forms of artificial intelligence. These technologies have the potential for enormous social impact – positive and negative – and should be subject to government review, including assessments of training data, design bias and discriminatory outcomes. These audits should mirror in form and function the health and safety inspections of conventional industries.

**Principle 5: PRIVACY** We have a right to data privacy. Mass collection of personal data feeds the algorithms that determine what kind of media content we will see and how often, facilitating the creation of filter bubbles that fracture our political cultures. Individuals have a right to control how data is used to shape their experiences. To counteract this phenomenon, we must tighten and enforce laws that give users more control over how data is collected, used, and monetized. In principle, the less data we provide, the less precisely we will be targeted, and the less likely we are to be shunted by algorithms into media communities that reinforce false beliefs.

**Principle 6: COMPETE** We have a duty to protect the public against the exploitation of concentrated market power. In the realm of digital media, this means we must seek to ensure that consumers have meaningful options to find, send and receive information over digital media. The rise of platform monopolies underscores the need to open markets to new competitors and products with policies such as data portability, restrictions on mergers, and access to essential services.

**Principle 7: SECURE** We have a duty to protect the integrity of our democracy from outside intervention. The recent attempts by foreign powers to use a combination of digital disinformation and cyber-attacks to influence electoral outcomes must be treated as a direct threat to democratic government. Political institutions – such as parties, campaigns and election administration – should be treated as critical infrastructure and afforded the same degree of cyber-security protection as the electrical grid and the water system.

**Principle 8: EDUCATE** We have a duty to educate the public about the social and political impact of new technologies. We are in the early stages of digital media’s rise to dominance of global information systems. The traditional standards and signals of source credibility have deteriorated along with the fragmentation of the market. As a society, we need to establish digital media literacy skills in our educational curricula. And we need to work with civil society groups and public service news organizations to generate broad public awareness about the problem of disinformation.

**Principle 9: INFORM** We have a duty to foster a robust public sphere and an informed electorate. The rise of disinformation as a disruptive phenomenon in democracy coincides with the declining commercial viability of public service journalism, even as the public’s need for it grows. We need public policies designed to reinvigorate journalism. These may include support for the modernization of public media channels or tax benefits for newsrooms that satisfy basic professional requirements.

Action Plan

Protecting Democracy from Digital Threats

The Digital Democracy Charter is a starting point to align public, private and civic sector stakeholders around shared principles of reform. But that will only matter if we translate it into actions that make change. Because every country faces its own unique combination of challenges with technology and democracy, we must focus on national action plans to craft and implement new policies that harness the power of technology to democratic goals. Plans, insights, and learnings can be coordinated across borders in an “agile multilateralism” even as nations forge ahead with domestic reform agendas.

There is no single solution that can meaningfully address digital threats to democracy. Only a combination of policies – all of which are necessary and none of which are sufficient by themselves – will begin to show results over time. We can start immediately with a first phase of actions by treating the worst of the symptoms. That means increasing cyber-security against foreign interference, increasing transparency in political advertising, and combating online hate speech and fraud. Dealing with these urgent issues will lead us to a second phase – addressing root causes. These are the structural tensions between technology and democracy at the center of the data economy and the digital information marketplace. We must establish a data “bill of rights”, modernize competition policy, and make long-term investments in digital literacy and public service journalism.

Phase 1 – Rapid Response: Digital disinformation in our democracy manipulates political viewpoints, divides our society, weakens the integrity of public debate, normalizes extremism and distorts electoral outcomes. The top priority of the Action Plan is to address immediate symptoms of the exploitation of information markets. First, it is a national security priority to shut down organized disinformation operations that come from foreign agents. Second, it is a clear matter of democratic integrity to mandate transparency for all paid political advertising. And third, it is an imperative of public safety and consumer protection to give citizens confidence that illegal activity online will not be tolerated – from cyber-attacks and data breaches to hate speech, harassment, and fraud.

SECURITY:  We will protect the integrity of our elections from the cyber-attacks and disinformation campaigns of foreign actors and criminal enterprise.

  • Cyber-security for Democracy: Electoral institutions – such as parties, campaigns and election administration – will meet the same standard of cyber-security protection that we use for critical infrastructure like the electric grid and the water system.
  • Research: Resources will be applied to transparent programs at security services and universities to monitor, track and expose organized disinformation operations.
  • Market Regulation: Companies will be required to take all reasonable measures to protect sensitive data and prevent the abuse of digital media by foreign actors.

TRANSPARENCY: We will protect the right of citizens to know who is trying to influence their political views and how they are doing it. New regulations will curb the amplification of false narratives through the opaque channels of targeted digital advertising, organized political spam, and automated networks of social media accounts.
  • Political Ad Disclosure: All online political ads must be made available in a searchable database, and all political advertisers must be verified as legal. Each ad must disclose in real time to the consumer the source of the ad, the true source of the funding behind it, and all of the targeting criteria that brought the ad to a specific individual.
  • Countering Political Spam: All digital media accounts that exhibit behaviors of automation or high-frequency spam should be clearly labelled as a default setting.

CONSUMER PROTECTION: We will protect the public from illegal content. We will apply adaptive, transparent regulations to remove types of content that are already illegal in our democracy – such as hate speech and incitement to violence. We will develop new systems that leverage corporate technologies to find and remove illegal content with the supervision of regular judicial review and a transparent process, including a fast-track appeals process. Because of the risk of infringements on legitimate speech, this practice will be strictly limited.

Phase 2 – Confronting Root Causes:  At the root of the digital democracy crisis is the powerful business model of platform monopolies. First, they leverage the vast quantity of behavioral data they collect through surveillance of billions of Internet users to rent advertisers access to the attention of highly targeted audiences. Then they use sophisticated AI to customize content on digital media to maximize the amount of time people are online and available to see ads.  Of course, in exchange for watching ads, they offer people popular products and services that have contributed great value to social life. But they have been blind to the fact that this business model serves as an ideal vector for manipulative publishers and advertisers (including covert agents of foreign powers) to find audiences for every type of prejudice and nonsense. Fragmentation, polarization, propaganda, and manipulation in the news media are accelerated by the logic of modern information markets. So too is the decline of traditional newsrooms and the erosion of credible public service journalism.

We now stand at an inflection point.  We must design an ambitious public policy response to steer technological development back towards serving the wellbeing of democratic society. We cannot allow our society to be held hostage to a marketplace that undermines the integrity of our democracy. We must address the root causes with a structural policy agenda that focuses on data rights, competition, education, and public service.

**DATA RIGHTS:** We will establish and enforce rules that give individuals control over how data about them is collected, used, and monetized. The rules must be flexible enough to adapt to technological change and must directly address the connection between data profiling, content targeting, and the polarization of media audiences. These targeted data policies addressing disinformation fit within a broader agenda of data rights that are foundational for the modern economy.

  • Restrictions on Sensitive Data: To constrain the profiling of political viewpoints, users should be provided additional protections (collection and use restrictions) for any sensitive data, including data that may be used to reveal a political preference or to manipulate a political viewpoint.
  • Consent and Control: To prevent all-or-nothing privacy policies that deny service to any user who declines to opt in to certain types of data collection or use, platforms should be prohibited from discriminating against users who choose privacy. For those who do consent, the consequences of these agreements must be meaningful and understandable to any user.
  • Child Online Safety: Additional restrictions on data collection, data use, and certain forms of targeted communication should be applied for vulnerable user groups, especially children under the age of 18.
  • Data Portability & Interoperability: In a market that offers limited consumer choice and high barriers to competitive entry, we need policies enabling portability and interoperability of data across services.

COMPETITION: We will protect the public against the exploitation of concentrated market power. Especially in information markets that sustain our democracy, consumers should have meaningful choices to find, send and receive information over digital media.
  • Modernization of Antitrust: We need new forms of antitrust oversight for the digital economy that look not just at price increases to judge market power but also at control over data, constraints on innovation, and reduction in consumer welfare.
  • Concentration Restrictions: The rapid concentration of power in the digital market is driven by mergers between large companies and acquisitions of upstart competitors. Oversight of commercial mergers should consider not only horizontal market power but also the acquisition of data and patents that enable competitive advantage.

ALGORITHMIC ACCOUNTABILITY: We will develop new forms of public oversight that apply a duty of care and regular auditing to the technologies that control information markets. The algorithms that mine large data sets, target advertising, and curate digital media feeds are increasingly forms of AI. These technologies have the potential for enormous social impact – positive and negative – and should be subject to government oversight, including a review of training data, design bias and discriminatory outcomes. These audits should mirror health and safety inspections of traditional industries.

PUBLIC SERVICE JOURNALISM: We will restore and strengthen public service journalism as a cornerstone of democracy. The rise of disinformation as a disruptive phenomenon coincides with the decline in commercial viability for public service journalism in the Internet age. The accumulation of market power over content aggregation and digital advertising in search and social media has undermined the century-old business model of newsroom journalism. But even as the number of professional journalists drops, the public’s need for their services has surged. What the market fails to provide, society must build for itself with public policy. Such policies might include support for the modernization of public media channels, wage-tax credits for professional journalists that are technology and viewpoint neutral, investments in student journalism to build a career pipeline, or a program of citizen vouchers to put the power to restore journalism in the hands of the people.

EDUCATION: We will support and strengthen digital media literacy.  The fight against disinformation in democracies will be won by changing public attitudes and the ways people consume digital media. The rise of digital media giants has weakened traditional markers of source credibility by compressing every news headline into a single stream and eroding a shared public narrative of facts in pursuit of greater ad sales. As a society, we need to establish digital media literacy skills in our educational curricula.  We will begin by working with civil society groups to generate broad public awareness about the problem of disinformation. And we will fund programs to deliver digital literacy in our schools to the next generation of voters.
