The League of Women Voters of the United States joined a letter with more than 120 other groups addressed to social media platform leaders urging them to control the spread of disinformation online, especially as we head into midterm elections this year.
May 12, 2022
Dear Mr. Zuckerberg, Mr. Pichai, Ms. Wojcicki, Mr. Agrawal, Mr. Chew, Mr. Spiegel, and Mr. Mosseri:
With the 2022 midterm elections fast approaching, online disinformation continues to confuse, intimidate, and harass voters; suppress the right to vote; and otherwise disrupt our democracy. As discussed in greater detail below, we call on your platforms to take several affirmative actions well in advance of the midterm elections to combat election disinformation. These actions must include introducing friction to reduce the spread and amplification of disinformation, consistently enforcing robust civic integrity policies, and providing greater transparency into the business models that allow disinformation to spread.
The deadly January 6, 2021, attack on the US Capitol by far-right extremists attempting to overturn the free, fair, and secure 2020 presidential election was a catastrophic reminder of the fragility of our democracy. This violent insurrection did not happen in a vacuum. It followed the numerous hurdles that voters faced during the 2020 election cycle amid a pandemic and was exacerbated by relentless efforts by former President Trump and his allies to spread disinformation on social media platforms to threaten civil rights, escalate hate speech, undermine election integrity, impose barriers to the ballot box, and discount the votes of communities of color.
Less than two years ago, the large social media and technology companies you run implemented several measures to attempt to limit the spread of disinformation related to the 2020 election, including enhanced content moderation, labeling of disputed or inaccurate information, and highlighting of authoritative information and news sources. Platforms' follow-through on these commitments was inconsistent and insufficient to counter the deluge of disinformation. Yet while those steps were far from perfect, they nevertheless had some material effect on slowing the spread of dangerous lies, conspiracy theories, and attempts to deceive voters, showing that even incomplete measures can have a substantial impact.
Your platforms have long struggled to combat election disinformation, and research and investigative reporting revealed that they backed away from enforcing their own policies and practices soon after the 2020 election. In fact, the surge of disinformation and violent content that spread across platforms after election day but before the inauguration of Joe Biden helped fuel the January 6th insurrection.
Disinformation related to the 2020 election has not gone away but has only continued to proliferate. In fact, according to recent polls, more than 40 percent of Americans still do not believe President Biden legitimately won the 2020 presidential election. Further, fewer Americans have confidence in elections today than they did in the immediate aftermath of the January 6th insurrection.
High-profile disinformation spreaders and other bad actors are continuing to use social media platforms to disseminate messages that undermine trust in elections. We have already seen disinformation attacking voter accessibility measures and pre-emptive claims of voter fraud, putting local election officials at risk and making it much tougher for localities to recruit people to help run their elections.
The upcoming November 8th midterm election will be the first national election day since the January 6th insurrection, making it extremely important that your platforms take appropriate action to combat disinformation. To protect the integrity of the 2022 midterm elections and the public's confidence in American democracy, we ask that you implement the following measures immediately, such that they are firmly in place for the lead-up to the midterm elections:
● Introduce friction to reduce the distribution of content containing electoral disinformation. While misleading claims should be appropriately labeled to provide context, a growing body of research shows that information-only labels are largely ineffective at halting the spread of disinformation. To reduce the distribution of electoral disinformation, platforms should focus on implementing front- and back-end friction in user interfaces, algorithms, and product design to proactively reduce mis- and disinformation. This may include modifications that demote or downrank this content and limit users' ability to engage with it. For example, viral circuit breakers can be utilized to limit the spread of potential disinformation. Platforms should also conduct, and make public, regular impact assessments and independent audits of algorithmic tools that lead to the spread of online voter suppression.
● Focus on preventing disinformation targeting non-English-speaking communities. Non-English-language disinformation has continued to spread beyond the 2020 election. The language gap between content moderators and content has created enforcement disparities, leaving non-English-speaking communities vulnerable to false claims and disinformation. Platforms must provide adequate resources for non-English content moderation in order to prevent the further spread of disinformation. Platforms should also disclose metrics that quantify the resources invested in combating non-English disinformation.
● Consistently enforce civic integrity policies during both election and non-election cycles. Platforms have several civic integrity policies in place to combat the spread of election disinformation, but these policies are not consistently enforced and contain massive loopholes, like Facebook's Cross Check program, that enable a small number of individuals and organizations with significant reach to repeatedly spread huge amounts of disinformation. Enforcement has also become lax during non-election cycles. Platforms must commit to upholding their own civic integrity policies and enforce them consistently and even-handedly against politicians and non-politicians alike, throughout election and non-election cycles. This includes establishing civic integrity teams that enforce policies 365 days a year. These policies must address content that calls for political violence, content that could inspire violence such as doxing and attacks on election workers, and content that attempts to delegitimize any past or future U.S. election. There should be a particular focus on enforcement against users with large followings who often produce election disinformation, since their content is disseminated widely across the platforms. Finally, platforms should increase enforcement, staffing, and resources between election day and the day that new members of Congress take office in 2023 to ensure a peaceful transition and to ensure that any conspiracy theories or calls for violence following the results of the election can be shut down.
● Prioritize enforcement to combat the 'Big Lie.' Today, candidates are using the Big Lie as a platform plank, preemptively declaring voter fraud in order to dispute the results of the 2022 election. This is damaging American democracy by undermining faith in the integrity of our elections. Last year's statewide elections and this year's primaries so far have shown that bad actors are recycling, or are inspired by, disinformation from the 2020 presidential election. Platforms must remove disinformation that spreads and amplifies the Big Lie. This includes content that glorifies the January 6th insurrection, particularly from political candidates and in fundraising advertisements.
● Consistently apply civic integrity policies to all live content as a means of combating election disinformation. During the 2020 election cycle, platforms changed or modified their civic integrity policies on an almost weekly basis but failed to apply those policies to live content posted days or months before the new policies went into effect. This allowed disinformation to spread and continue receiving high engagement, rendering the new policies ineffective. Platforms must apply any new policies to disinformation that was already spreading before those policies took effect. For example, advertisements that violate new policies should be prevented from running again. If violative content is live and receiving engagement on the platform, it should be taken down, regardless of when the content was first posted.
● Prioritize fact-checking of electoral content, including political advertisements and posts from public officials. Platforms should intensify their efforts to address voter interference and fraud and rapidly fact-check content relating to the 2022 midterm elections, especially when that information comes from people wielding power or influence. No one should get a free pass to spread disinformation, and you should not profit from enabling it. For electoral content from high-reach accounts with a history of violating platform policies, platforms should implement 'holding areas' where human reviewers can evaluate content against platform policies before it is made public. Platforms should also apply third-party fact-checking to political advertisements and remove exemptions that allow public officials to spread disinformation with impunity. Fact-checking electoral content without closing these loopholes will allow disinformation to spread at scale and diminish platforms' ability to meaningfully reduce harm.
● Provide real-time access to social media data for external researchers and watchdogs. Researchers and watchdogs can play a key role in preventing, identifying, and addressing the harms of electoral mis- and disinformation. But to do that, they need reliable access to data that platforms have been reluctant to provide. Platforms should provide free third-party access to tools such as CrowdTangle and the Firehose, which contain important data for researchers studying and tracking the spread of disinformation. Additionally, platforms should not take retaliatory action against good-faith research and journalistic efforts seeking to provide greater transparency to the public. By allowing greater access to social media data, platforms can improve overall transparency while increasing the safety of elections.
● Provide greater transparency into political advertisements, enforcement practices, and algorithmic models. Those seeking to undermine our elections often operate in the shadows, bankrolling disinformation through online political ads, taking advantage of lax enforcement practices, and exploiting algorithms that boost divisive content. Platforms must provide greater transparency in political advertising by creating a publicly available online database of all ads run on the platform in categories related to elections and social and political issues. The database should be machine-readable and should include the targeting parameters used and the categories of users who received each ad. Platforms should also publish quarterly transparency reports on the efficacy of their enforcement practices and provide insight into the algorithms that drive their business.
With the proper oversight and protections, your platforms can be helpful tools to promote a strong democracy. At the same time, if you allow disinformation about elections to spread largely unchecked, your platforms will become known as the dominant threat to a thriving democratic process. As the 2022 midterm elections approach, we urge you to take this opportunity to demonstrate that your companies are committed to playing a productive role in the democratic process.
Sincerely,
See attached for full list of signers