SAN FRANCISCO – Over the past few weeks, Mark Zuckerberg, Facebook’s chief executive, and his lieutenants have watched the presidential race with an increasing sense of alarm.
Executives have held meetings to discuss President Donald Trump’s evasive comments about whether he would accept a peaceful transfer of power if he lost the election. They watched Trump tell the Proud Boys, a far-right group that has endorsed violence, to “stand back and stand by.” And they have had conversations with civil rights groups, who have privately told them that the company needs to do more because Election Day could erupt into chaos, Facebook employees said.
That has resulted in new actions. On Wednesday, Facebook said it would take more preventive measures to keep political candidates from using it to manipulate the election’s outcome and its aftermath. The company now plans to prohibit all political and issue-based advertising after the polls close on Nov. 3 for an undetermined length of time. And it said it would place notices at the top of the News Feed telling people that no winner had been decided until news outlets declared a victor.
“This is shaping up to be a very unique election,” Guy Rosen, vice president for integrity at Facebook, said in a call with reporters on Wednesday.
Facebook is doing more to safeguard its platform after introducing measures to reduce election misinformation and interference on its site just last month. At the time, Facebook said it planned to ban new political ads for a contained period — the week before Election Day — and would act swiftly against posts that tried to dissuade people from voting. Zuckerberg also said Facebook would not make any other changes until there was an official election result.
But the additional moves underscore the sense of emergency about the election, as the level of contentiousness has risen between Trump and his opponent, Joe Biden. On Tuesday, to help blunt further political turmoil, Facebook also said it would remove any group, page or Instagram account that openly identified with QAnon, the pro-Trump conspiracy movement.
For years, Facebook has been striving to avoid another 2016 election fiasco, when it was used by Russian operatives to spread disinformation and to destabilize the American electorate. Zuckerberg has since spent billions of dollars to hire new employees for the company’s “integrity” and security divisions, who identify and clamp down on interference. He has said the amount of money spent on securing Facebook exceeded its entire revenue of roughly $5.1 billion during its first year as a public company in 2012.
“We believe that we have done more than any other company over the past four years to help secure the integrity of elections,” Rosen said.
Yet how successful those efforts have been is questionable. The company continues to find and take down foreign interference campaigns, including three Russian disinformation networks as recently as two weeks ago.
Domestic misinformation has also mushroomed, as Facebook has said it will not police speech from politicians and other leading figures for truthfulness. Zuckerberg, who supports unfettered speech, has not wavered from that position as Trump has posted falsehoods and misleading comments on the site.
For next month’s election, Facebook has gamed out almost 80 scenarios — what technology and security workers call “red teaming” exercises — to figure out what could go wrong and to protect against the situations. It also updated its policies to outlaw certain types of statements and threats from elected officials, capped by last month’s sweeping set of changes.
But after weeks of Trump declining to say he would accept the election’s outcome, while also directing his supporters to “watch” the polls, Facebook decided to ramp up protective measures.
Asked why the company was acting now, Facebook executives said they were “continuing to evaluate and plan for different scenarios” with the election.
The open-ended ban on political advertising is significant, coming after Facebook resisted calls to remove such ads for months. Last month, the company had said it would stop accepting only new political ads in the week before Election Day, so existing political ads would continue circulating, and new political ads could have resumed running after Election Day. Zuckerberg has said that ads give less well-known politicians the ability to promote themselves, and that eliminating those ads could hurt their chances of broadening their support base online.
Facebook also said it would rely on a mix of news outlets, including Reuters and The Associated Press, to determine whether a candidate had secured the presidency. Until those news organizations called the race, Facebook said, it would place notifications in the News Feed to say no candidate had won. That buttresses what the company had said it would do last month, when it announced that it would attach labels to posts redirecting users to Reuters if Trump or his supporters falsely claimed an early victory.
To tamp down potential intimidation at ballot boxes, Facebook also plans to remove posts that call for people to engage in poll watching “when those calls use militarized language or suggest that the goal is to intimidate, exert control, or display power over election officials or voters.”
Trump and others have talked about watching polls in recent weeks. In a debate with Biden last week, Trump urged his supporters to “go into the polls and watch very carefully” on Election Day.
Facebook, which has been criticized for unevenly removing posts and inconsistently enforcing its policies against toxic content, said it had already taken down many posts where people were trying to interfere with the vote. Between March and September, it removed more than 120,000 posts from Facebook and Instagram in the United States because the messages violated its voter interference policies.
The company said it would not shy away from eliminating more posts as the election approaches. On Tuesday, it took down a post from Trump in which he falsely claimed the flu was more deadly than the coronavirus.
“I want to underscore that we remove this content regardless of who posts it,” said Monika Bickert, head of global policy management at Facebook. “That includes the president.”
© 2020 The New York Times Company