This story was co-published with The Washington Post.
Facebook groups swelled with at least 650,000 posts attacking the legitimacy of Joe Biden’s victory between Election Day and the Jan. 6 siege of the U.S. Capitol, with many calling for executions or other political violence, an investigation by ProPublica and The Washington Post has found.
The barrage — averaging at least 10,000 posts a day, a scale not reported previously — turned the groups into incubators for the baseless claims supporters of then-President Donald Trump voiced as they stormed the Capitol, demanding he get a second term. Many posts portrayed Biden’s election as the result of widespread fraud that required extraordinary action — including the use of force — to prevent the nation from falling into the hands of traitors.
“LOOKS LIKE CIVIL WAR is BECOMING INEVITABLE !!!” read a post a month before the Capitol assault. “WE CANNOT ALLOW FRAUDULENT ELECTIONS TO STAND ! SILENT NO MORE MAJORITY MUST RISE UP NOW AND DEMAND BATTLEGROUND STATES NOT TO CERTIFY FRAUDULENT ELECTIONS NOW !”
Another post, made 10 days after the election, bore the avatar of a smiling woman with her arms raised in apparent triumph and read, “WE ARE AMERICANS!!! WE FOUGHT AND DIED TO START OUR COUNTRY! WE ARE GOING TO FIGHT…FIGHT LIKE HELL. WE WILL SAVE HER❤ THEN WERE GOING TO SHOOT THE TRAITORS!!!!!!!!!!!”
One post showed a Civil War-era picture of a gallows with more than two dozen nooses and hooded figures waiting to be hanged. Other posts called for arrests and executions of specific public figures — both Democrats and Republicans — depicted as betraying the nation by denying Trump a second term.
“BILL BARR WE WILL BE COMING FOR YOU,” wrote a group member after Barr announced the Justice Department had found little evidence to support Trump’s claims of widespread vote rigging. “WE WILL HAVE CIVIL WAR IN THE STREETS BEFORE BIDEN WILL BE PRES.”
Facebook executives have downplayed the company’s role in the Jan. 6 attack and have resisted calls, including from its own Oversight Board, for a comprehensive internal investigation. The company also has yet to turn over all the information requested by the congressional committee studying the Jan. 6 attack. Facebook said it is continuing to negotiate with the committee.
The ProPublica/Post investigation, which analyzed millions of posts between Election Day and Jan. 6 and drew on internal company documents and interviews with former employees, provides the clearest evidence yet that Facebook played a critical role in the spread of false narratives that fomented the violence of Jan. 6.
Its efforts to police such content, the investigation also found, were ineffective and started too late to quell the surge of angry, hateful misinformation coursing through Facebook groups — some of it explicitly calling for violent confrontation with government officials, a theme that foreshadowed the storming of the Capitol that day amid clashes that left five people dead.
Drew Pusateri, a spokesperson for Meta, Facebook’s newly renamed parent company, said that it was not responsible for the violence on Jan. 6. He pointed instead to Trump and others who voiced the lies that sparked the siege on the Capitol.
“The notion that the January 6 insurrection would not have happened but for Facebook is absurd,” Pusateri said. “The former President of the United States pushed a narrative that the election was stolen, including in-person a short distance from the Capitol building that day. The responsibility for the violence that occurred on January 6 lies with those who attacked our Capitol and those who encouraged them.”
To determine the extent of posts attacking Biden’s victory, The Post and ProPublica obtained a unique dataset of 100,000 groups and their posts, along with metadata and images, compiled by CounterAction, a firm that studies online disinformation. The Post and ProPublica used machine learning to narrow that list to 27,000 public groups that showed clear markers of focusing on U.S. politics. Out of the more than 18 million posts in those groups between Election Day and Jan. 6, the analysis searched for words and phrases to identify attacks on the election’s integrity.
The count of more than 650,000 posts attacking the election — and the 10,000-per-day average — is almost certainly an undercount. The ProPublica/Washington Post analysis examined only posts in a portion of all public groups, and did not include comments, posts in private groups or posts on individuals’ profiles. Only Facebook has access to all the data needed to calculate the true total — and it hasn’t done so publicly.
Facebook has heavily promoted groups since CEO Mark Zuckerberg made them a strategic priority in 2017. But the ones focused on U.S. politics have become so toxic, say former Facebook employees, that the company established a task force, whose existence has not been previously reported, specifically to police them ahead of Election Day 2020.
The task force removed hundreds of groups with violent or hateful content in the months before Nov. 3, according to the ProPublica/Post investigation.
Yet shortly after the vote, Facebook dissolved the task force and rolled back other intensive enforcement measures. The results of that decision were clear in the data ProPublica and The Post examined: During the nine increasingly tense weeks that led up to Jan. 6, the groups were inundated with posts attacking the legitimacy of Biden’s election while the pace of removals noticeably slowed. Removals did not pick up again until the week of Jan. 6, but even then many of the groups and their posts remained on the site for months after, as Trump supporters continued to falsely claim election fraud and press for states to conduct audits of the vote or to impose new voting restrictions.
Fewer Political Groups Were Removed From Facebook Between Election Day and Jan. 6
Removal dates for about 2,000 public U.S. political groups between August 2020 and March 2021
“Facebook took its eye off the ball in the interim time between Election Day and Jan. 6,” said a former integrity team employee who worked on the groups task force and, like others, spoke on the condition of anonymity to discuss sensitive internal matters. “There was a lot of violating content that did appear on the platform that wouldn’t otherwise have.”
Pusateri denied that the company had pulled back on efforts to combat violent and false postings about the election after the vote. He did not comment on the quantitative findings of the ProPublica/Post investigation.
“The idea that we deprioritized our Civic Integrity work in any way is simply not true,” he said. “We integrated it into a larger Central Integrity team to allow us to apply the work that this team pioneered for elections to other challenges like health-related issues for example. Their work continues to this day.”
The investigation also reveals a problem with the way Facebook polices its groups. Former employees say groups are essential to the company’s ability to keep a stagnant American user base as engaged as possible and boost its revenue, which reached nearly $86 billion in 2020.
But they say that as groups have grown more central to Meta’s bottom line, the company’s enforcement efforts have been weak and inconsistent, relying heavily on unpaid group administrators to do the labor-intensive work of reviewing posts and removing the ones that violate company policies. Many groups have hundreds of thousands or even millions of members, dramatically escalating the challenges of policing posts.
With the administrators themselves steeped in conspiracy theories about the election or, for example, the safety of COVID-19 vaccines, reliable enforcement rarely takes place, say former employees. They say automated tools — which search for particular terms indicating policy violations — are ineffective and easily evaded by users simply misspelling key words.
“Groups are a disaster,” said Frances Haugen, a former member of Facebook’s Civic Integrity team who filed a whistleblower complaint against the company and testified before Congress warning about the damaging effects of the company on democracy worldwide, as well as other problems.
Many of the group posts identified in the analysis fell into what a March internal Facebook report, first published by Politico, defined as “harmful non-violating narratives.” This refers to content that does not break Facebook’s rules, but whose prevalence can cause people to “act in ways which are harmful to themselves, others, or society at large.”
The report warned that such harmful narratives could have had “substantial negative impacts including contributing materially to the Capitol riot and potentially reducing collective civic engagement and social cohesion in the years to come.”
Pusateri declined to comment on specific posts but said the company does not have a policy forbidding posts or comments that attack the legitimacy of the election. He said the company has a dedicated groups integrity team and an ongoing initiative to protect people who use groups from harm.
Facebook officials have noted that more extreme content flowed through smaller social media platforms in the buildup to the Capitol attack, including detailed planning on bringing guns or building gallows that day. But Trump also used Facebook as a key platform for his lies about the election right up until he was banned on Jan. 6. And Facebook’s reliance on groups to drive engagement gave those lies unequaled reach. This combined with the sag in post-election enforcement to make Facebook a key vector for pushing the ideas that fueled violence on Jan. 6.
Critics and former employees say this also underscores a recurring issue with the platform since its founding in Zuckerberg’s Harvard University dorm room in 2004: The company recognizes the need for enforcement only after a problem has caused serious damage, often in the form of real-world mayhem and violence.
Facebook didn’t discover the campaign by the Russia-based Internet Research Agency to spread hyperpartisan content and disinformation during the 2016 presidential election until months after Americans had voted. The company’s actions were late as well when Myanmar’s military leaders used Facebook to foment rapes, murders and forced migrations of minority Rohingya people. Facebook has apologized for failings in both cases.
The response to attacks on the legitimacy of the 2020 U.S. presidential election was similarly slow, as company officials debated among themselves whether and how to block the rapidly metastasizing lies about the election. The data shows they acted aggressively and comprehensively only after Trump supporters had battered their way into the Capitol, sending lawmakers fleeing for their lives.
The ProPublica/Post investigation “is a new and very important illustration of the company’s unfortunate tendency to deal with safety problems on its platform in a reactive way,” said Paul Barrett, the deputy director of the Center for Business and Human Rights at New York University’s Stern School of Business. “And that almost by definition means that the company will be less effective, because it will not be looking out into the future and preventing problems before they happen.”
The trouble with policing groups
Facebook’s newly vigorous enforcement actions the week of Jan. 6 — which resulted in Trump himself being banned from the platform — marked such a stark contrast from the company’s previous approach that some Trump supporters took to Facebook to complain about the reversal.
“Facebook is Getting Real Brave and Vicious Now,” Jerry Smith, a retired police officer from Missouri who created and ran a group called United Conservatives for America, wrote the day after the Capitol attack. “They Are Removing Tons of Posts From My Groups!”
In a recent interview at his home, Smith said he could not remember writing that message or which deletions prompted his response. He said he opposed political violence and posts that called for it. But he acknowledged it was difficult for him to remove such content as United Conservatives for America’s membership swelled to more than 11,000, with the number of posts surpassing what one person could monitor. The typical group in the ProPublica/Post analysis had more than 1,000 members.
Smith, who showed a reporter that his Facebook account had received 116 violations for breaking company rules, said he found some of Facebook’s policies reasonable but disagreed on how they should be enforced. He posted in United Conservatives for America and other groups at a frenetic pace long before Election Day. As early as the summer of 2020, he warned about alleged Democratic Party plans to steal the election and also shared false information about the pandemic, including a video from a conspiracy theorist about the origins of the virus.
“And DEMS Are Pushing For Vote By Mail. Another Way For Them To Steal The Election,” he wrote in August 2020.
In the interview, Smith said he believes that American elections often are rigged and worries that COVID-19 vaccines may be tainted. He has used Facebook groups to share these beliefs with tens of thousands of people — and thinks Facebook’s enforcement of its policies is overly aggressive and a result of political bias against conservatives.
“Are you going to do away with their free speech?” said Smith. “If someone thinks it’s not a fair election … why can’t they have their opinion on whether it’s a fair election or not?”
Facebook cracked down before the election
Facebook’s problems with groups had long been obvious to company employees, who gathered on a remote video conference in early September 2020 to figure out how to stop groups from spreading hate, violent threats and misinformation as Election Day approached, according to former employees.
Known as the Group Task Force, the new unit they formed consisted of members of Facebook’s Civic Integrity team, the specialized unit charged with protecting elections on the platform, as well as employees from engineering and operations teams who help oversee the contract moderators who review posts flagged by users or by automated systems, former employees said. The goal of the task force was to identify political groups with large numbers of posts and comments that violated the social giant’s rules against hate speech and calls for violence. Former employees involved in the effort said they wanted to apply the platform’s rules while respecting political debate and dialogue.
At the same time, Facebook’s Dangerous Individuals and Organizations team was identifying and removing QAnon groups ahead of the election. The results of the two teams’ actions were striking. All of the more than 300 QAnon groups identified by ProPublica and The Post had been removed by October 2020, when Facebook announced a total ban on the movement, the analysis found.
Facebook Can Be Effective When It Chooses
The number of U.S. QAnon groups on Facebook increased in 2020, before the company cracked down
In the end, the Group Task Force removed nearly 400 groups whose posts had been seen nearly 1 billion times before Election Day, according to a post on Workplace, Facebook’s internal discussion tool. The document later was included in the Facebook Papers disclosed by Haugen to Congress and the Securities and Exchange Commission. Still, members of the task force told ProPublica and The Post that the existence of such a team was an indictment of Facebook’s failure to police groups as part of its normal operations.
“The whole thing of the civic team needing to come in and do the takedowns was not a good state of affairs,” said one employee involved in the task force. “You could make a good argument that this should have already been done.”
On Nov. 5, Facebook banned “Stop the Steal,” a hugely viral group created on Election Day itself that quickly attracted over 300,000 members around a message rooted in attacking the legitimacy of the election. The company cited the prevalence of posts calling for violence and using hate speech in banning the group and all other groups using a similar name.
The next day, Nov. 6, the Group Task Force gathered virtually to celebrate its efforts, former employees said. Days later, a task force member published a Workplace post titled “Some Reflections on US2020” to bring attention to its work.
“Along with heroic efforts from other teams across the company, I truly believe the Group Task Force made the election safer and prevented possible instances of real world violence,” said the post.
But the focus on U.S. political groups and content undermining the election wouldn’t last.
A noticeable drop in enforcement
On Dec. 2, Facebook executives disbanded the Civic Integrity team and scattered its members to other parts of Facebook’s overall integrity team, reducing their influence. That resulted in the demise of the Group Task Force. The company also rolled back several emergency measures that had been put in place leading up to Election Day to control misbehavior in Facebook groups.
The Post/ProPublica investigation reveals the result: During the lull in enforcement, hundreds of thousands of posts questioned the legitimacy of Biden’s victory, spread lies about voter fraud and at times called for violence. Meanwhile, the company’s pace of group removals slowed to a crawl, the data analysis shows.
Among the content spreading in groups were videos in which former Trump National Security Adviser Michael Flynn spread false claims of electoral fraud and called for martial law. (Through a spokesperson, Flynn declined to comment.) Another frequent post was a cartoon showing Trump chasing a masked Biden, who carried a bag labeled “election theft” with swing states depicted inside. It was posted more than 350 times in the political groups analyzed by ProPublica and The Post, attracting over 2,500 total likes.
One meme featured a photo of former Rep. Trey Gowdy, R-S.C., who rose to fame in right-wing circles by leading a congressional committee’s investigation into the deadly 2012 attack on the American diplomatic compound in Benghazi, Libya, accompanied by the text “If you are ok with rigging an election to win, I am ok with martial law to stop you…” That was posted in groups at least 97 times, garnering over 3,500 total likes. Gowdy has denied saying the phrase.
Another meme showed a photo of Trump winking, with the text “Not Only Can Martial Law Guarantee a Trump Victory, It Also Allows Trump To Arrest Anyone He Wants!” It was posted at least 70 times, generating more than 2,400 total likes. The images and their spread in groups were identified using a CounterAction image analysis tool.
“Everyone needs to make a show of FORCE in DC on the 6th and any congress who doesnt follow the constitution or who doesnt stand up for our president (Pence included) needs to be ’corrected’ by WE the PEOPLE – on the front steps of the state house – for all the world to see!!! THIS IS HOW THE US DEALS WITH HER TRAITORS!!!” read one post from Dec. 27, 2020.
Ten days later, as rioters stormed the Capitol, the ProPublica/Post analysis shows, Facebook began taking down groups at a rate not seen since before the election. An internal Facebook spreadsheet from Jan. 6, which was included in Haugen’s disclosures, contains a section called “Action Items.” The top bullet point was a direction to conduct a “Sweep of Groups with V&I risk” — a term referring to violence and incitement. It had been 35 days since the Civic Integrity team, and with it the Group Task Force, had been disbanded.
Groups still active long after Jan. 6, 2021
Months after the Capitol was breached, Facebook still was working to remove hundreds of political groups that violated company policies.
One of those was Smith’s United Conservatives for America, which continued to carry posts attacking the legitimacy of Biden’s election until Facebook removed it in May.
When Smith met with a reporter in his home in early December, he’d just finished a 30-day posting ban on Facebook. In spite of his account’s history of violations, he was still managing at least one Facebook group — also called United Conservatives for America.
Like its predecessor, the new United Conservatives for America group was racking up strikes for violations of Facebook’s rules, according to a post Smith made to the group in September.
That post included a screenshot of an automated message from Facebook informing him that eight recent posts in the new United Conservatives for America group had been flagged by fact-checkers. As a result, the distribution of his group’s posts was being limited.
Smith remained defiant.
“I’m Not Blaming Our Members,” Smith wrote. “I’m Blaming FakeBook!”
In late December, after being asked about Smith’s account and group, Facebook said it banned his profile and removed United Conservatives for America, citing unspecified violations of its community standards.
Methodology
Data analyzed for this article included posts and other public activity collected from over 100,000 public Facebook groups tracked between January 2020 and June 2021 by CounterAction, a firm that studies online disinformation. The data was obtained by ProPublica and The Washington Post.
CounterAction marked Facebook groups for tracking if group members had posted links to U.S. political websites. Additional Facebook groups were then marked for monitoring if they had any members in common with groups already under observation. This process was repeated over the tracking period to identify newly created groups and add them to the dataset.
Many of these groups disappeared from public view during the period of our analysis. To determine when groups focused on U.S. politics within our dataset went offline, we analyzed the more than 5,000 groups that had meaningful activity (more than 10 posts tracked) but that were no longer online as of Aug. 30, 2021. We hand-labeled each group as political if its name and description showed that it was created to represent or support a U.S. political interest or group, to be a forum for U.S. political speech, or to represent or discuss a social or cultural movement with a strong connection to U.S. politics (whether national or local). We ultimately found more than 2,500 such groups, including those for and against various parties, candidates and issues across the political spectrum, groups for various kinds of political memes and discussions, and groups for movements such as the QAnon conspiracy theory, militia groups and Stop the Steal.
We then estimated the time of disappearance for each of these 2,500+ offline U.S. political groups using the latest date seen on their posts and other group activity. Based on our reporting and the timing of spikes in group disappearances, which often coincided with Facebook’s announcements of group suspensions, we believe the majority of them were removed by Facebook. However, some may have been deleted or removed from public view by their own administrators. We shared the list of more than 2,500 groups with Facebook and asked the company to clarify whether the groups had been removed by Facebook or taken offline by their own administrators. Facebook did not respond to our questions about these groups or any other of our quantitative findings.
We used these labeled offline groups to predict which of the still-online groups within our sample were also U.S. political groups. We used posts from the offline groups to train a text classification model to predict whether a post was from a U.S. political group and ran it against all the posts from each group in our dataset. We labeled a group as a likely U.S. political Facebook group when the mean prediction for its posts was over 0.5 (1.0 indicates that the model predicts with maximum probability that the post is from a U.S. political group). We used this labeling method to identify over 27,000 likely U.S. political groups with posts between Election Day and Jan. 6. We hand-checked a sample of the groups to calculate an estimated proportion of groups that were actually U.S. political groups, and got a precision rate of about 79%.
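The group-labeling rule above — flag a group when the mean per-post prediction exceeds 0.5 — can be sketched in a few lines. This is a minimal illustration, not the actual analysis code: the real study used a trained text classification model, and `score_post` here is a hypothetical keyword-based stand-in for that model’s output.

```python
# Sketch of the mean-prediction group-labeling step. `score_post` is a
# hypothetical stand-in for the trained classifier; a real model would
# return a calibrated probability for each post.
from statistics import mean

# Illustrative term list, not the terms used in the actual analysis.
POLITICAL_TERMS = {"election", "ballot", "vote", "fraud", "biden", "trump"}

def score_post(text: str) -> float:
    """Toy scorer: fraction of known political terms present in the post."""
    words = set(text.lower().split())
    return len(words & POLITICAL_TERMS) / len(POLITICAL_TERMS)

def is_likely_political_group(posts: list[str], threshold: float = 0.5) -> bool:
    """Flag a group when the mean per-post prediction exceeds the threshold,
    mirroring the labeling rule described in the methodology."""
    return mean(score_post(p) for p in posts) > threshold
```

Averaging per-post scores, rather than classifying the group directly, lets a handful of off-topic posts in an otherwise political group (or vice versa) wash out rather than flip the label.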
To count the number of posts that specifically sought to delegitimize the election results, we examined 18.7 million posts from Election Day through Jan. 6 within the likely U.S. political Facebook groups. We separated out posts from groups with “Stop the Steal” in their name and calculated which keywords and phrases were disproportionately common in posts from those groups using a text-analysis technique called TF-IDF. Then, we handpicked the terms and keywords that were meaningfully linked to election delegitimization theories (e.g., “stop the steal,” “steal the election,” “every legal vote”). We had about 60 terms that indicated delegitimization on their own, plus 86 more in two buckets that, if terms from both buckets were present, indicated delegitimization (e.g., a reference to absentee ballots on its own did not indicate delegitimization, but a reference to “absentee ballots” and “fraud” did). We identified around 1.03 million posts that likely referenced delegitimization. Finally, we hand-checked a sample of these posts to estimate the proportion that actually sought to delegitimize the election, and got a precision rate of about 64%. (False positives included mainstream news articles, debunks of fraud claims and references to other countries’ elections.) We arrived at our final estimate by multiplying the roughly 1.03 million flagged posts by the 64% precision rate, yielding a bit more than 655,000 delegitimizing posts.
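The two-stage keyword matching and the final precision adjustment described above can be sketched as follows. The term lists here are illustrative placeholders — the actual analysis used roughly 60 standalone terms and 86 bucketed terms chosen via TF-IDF — and the function is a simplified stand-in, not the production matching code.

```python
# Sketch of the two-stage keyword matching. A post counts as likely
# delegitimizing if it contains a standalone term, OR a term from each
# of the two buckets. Term lists are hypothetical examples only.

STANDALONE_TERMS = ["stop the steal", "steal the election", "every legal vote"]
BUCKET_A = ["absentee ballot", "mail-in ballot"]  # hypothetical examples
BUCKET_B = ["fraud", "rigged"]                    # hypothetical examples

def likely_delegitimizing(text: str) -> bool:
    t = text.lower()
    if any(term in t for term in STANDALONE_TERMS):
        return True
    # Bucketed terms only count when both buckets are present, so a
    # neutral mention of absentee ballots is not flagged on its own.
    return any(a in t for a in BUCKET_A) and any(b in t for b in BUCKET_B)

# Final estimate: flagged posts scaled down by the hand-checked precision
# rate, matching the figures reported in the methodology.
flagged_posts = 1_030_000
precision = 0.64
estimate = flagged_posts * precision  # about 659,000 — "a bit more than 655,000"
```

Scaling the raw match count by a hand-checked precision rate is a common way to correct a keyword filter’s false positives without hand-labeling every post.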
Due to CounterAction’s sampling method, the groups we analyzed likely contain a greater proportion of right-wing groups than the platform as a whole. The activity of the right-wing groups we analyzed is consistent with the findings of our reporting, and group activity in our sample coincided with Facebook’s public announcements about group removals. However, we would need additional outside data to analyze whether groups in our sample are representative of the broader platform. We sampled and checked precision rates in our analysis based on a 5% margin of error and 95% confidence level.
Tom Hamburger of The Washington Post contributed reporting. Irfan Uraizee of The Washington Post re-created the archival Facebook posts.
Craig Silverman is a national reporter for ProPublica covering voting, platforms, disinformation, and online manipulation.
Jeff Kao is a computational journalist at ProPublica who uses data science to cover technology.
Jeremy B. Merrill is a former news apps developer at ProPublica, concentrating mostly on Congress data and our Represent app.