SAN FRANCISCO — Over the past few weeks, Mark Zuckerberg, Facebook’s chief executive, and his lieutenants have watched the presidential race with an increasing sense of alarm.
Executives have held meetings to discuss President Trump’s evasive comments about whether he would accept a peaceful transfer of power if he lost the election. They watched Mr. Trump tell the Proud Boys, a far-right group that has endorsed violence, to “stand back and stand by.” And they have had conversations with civil rights groups, who have privately told them that the company needs to do more because Election Day could erupt into chaos, Facebook employees said.
That has resulted in new actions. On Wednesday, Facebook said it would take more preventive measures to keep political candidates from using it to manipulate the election’s outcome and its aftermath. The company now plans to prohibit all political and issue-based advertising after the polls close on Nov. 3 for an undetermined length of time. And it said it would place notices at the top of the News Feed informing people that no winner had been decided until a victor was declared by news outlets.
“This is shaping up to be a very unique election,” Guy Rosen, vice president for integrity at Facebook, said in a call with reporters on Wednesday.
Facebook is doing more to safeguard its platform after introducing measures to reduce election misinformation and interference on its site just last month. At the time, Facebook said it planned to ban new political ads for a contained period — the week before Election Day — and would act swiftly against posts that tried to dissuade people from voting. Mr. Zuckerberg also said Facebook would not make any other changes until there was an official election result.
But the additional moves underscore the sense of emergency about the election, as the level of contentiousness has risen between Mr. Trump and his opponent, Joseph R. Biden Jr. On Tuesday, to help blunt further political turmoil, Facebook also said it would remove any group, page or Instagram account that openly identified with QAnon, the pro-Trump conspiracy movement.
For years, Facebook has been striving to avoid another 2016 election fiasco, when it was used by Russian operatives to spread disinformation and to destabilize the American electorate. Mr. Zuckerberg has since spent billions of dollars to hire new employees for the company’s “integrity” and security divisions, who identify and clamp down on interference. He has said the amount of money spent on securing Facebook exceeded its entire revenue of roughly $5.1 billion during its first year as a public company in 2012.
“We believe that we have done more than any other company over the past four years to help secure the integrity of elections,” Mr. Rosen said.
Yet how successful the efforts have been is questionable. The company continues to find and take down foreign interference campaigns, including three Russian disinformation networks as recently as two weeks ago.
Domestic misinformation has also mushroomed, as Facebook has said it will not police speech from politicians and other leading figures for truthfulness. Mr. Zuckerberg, who supports unfettered speech, has not wavered from that position as Mr. Trump has posted falsehoods and misleading comments on the site.
For next month’s election, Facebook has gamed out almost 80 scenarios — what technology and security workers call “red teaming” exercises — to figure out what could go wrong and to protect against the situations. It also updated its policies to outlaw certain types of statements and threats from elected officials, capped by last month’s sweeping set of changes.
But after weeks of Mr. Trump declining to say he would accept the election’s outcome, while also directing his supporters to “watch” the polls, Facebook decided to ramp up protective measures.
Asked why the company was acting now, Facebook executives said they were “continuing to evaluate and plan for different scenarios” with the election.
Representatives from the Trump and Biden campaigns did not immediately respond to requests for comment.
Vanita Gupta, president and chief executive of the Leadership Conference on Civil and Human Rights, said Facebook’s moves were “important steps” to “combat disinformation and the premature calling of election results before every vote is counted.”
The open-ended ban on political advertising is especially significant, after Facebook resisted calls to remove the ads for months. Last month, the company had said it would stop accepting only new political ads in the week before Election Day, so existing political ads would continue circulating. New political ads could have resumed running after Election Day.
But Facebook lags other social media companies in banning political ads. Jack Dorsey, Twitter’s chief executive, banned all political ads from the service a year ago because, he said, they could rapidly spread misinformation and had “significant ramifications that today’s democratic infrastructure may not be prepared to handle.” Last month, Google said it, too, would ban all political and issue ads after Election Day.
Mr. Zuckerberg has said that ads give less well-known politicians the ability to promote themselves, and that eliminating those ads could hurt their chances at broadening their support base online.
Facebook also said it would rely on a mix of news outlets, including Reuters and The Associated Press, to determine whether a candidate had secured the presidency. Until those news organizations called the race, Facebook said, it would place notifications in the News Feed to say no candidate had won. That buttresses what the company had said it would do last month, when it announced that it would attach labels to posts redirecting users to Reuters if Mr. Trump or his supporters falsely claimed an early victory.
To tamp down potential intimidation at ballot boxes, Facebook also plans to remove posts that call for people to engage in poll watching “when those calls use militarized language or suggest that the goal is to intimidate, exert control, or display power over election officials or voters.”
Mr. Trump and others have talked about watching polls in recent weeks. In a debate with Mr. Biden last week, Mr. Trump urged his supporters to “go into the polls and watch very carefully” on Election Day. His son, Donald Trump Jr., said he wanted to see an “army for Trump” swarming the polls, raising concerns about the threat of violence at the ballot box.
Facebook, which has been criticized for unevenly removing posts and inconsistently enforcing its policies against toxic content, said it had already taken down many posts where people were trying to interfere with the vote. Between March and September, it removed more than 120,000 posts from Facebook and Instagram in the United States because the messages violated its voter interference policies.
Some researchers said Facebook was still not going far enough.
“If we are to believe that Facebook will faithfully enforce its own new policies, then they should take down the posts of the powerful users — including the president’s son — who have already called for violent intimidation around voting and on Election Day,” said Shannon McGregor, a senior researcher with the Center for Information, Technology, and Public Life at the University of North Carolina, Chapel Hill.
The company said that it would not shy away from eliminating more posts as the election approaches. On Tuesday, it took down a post from Mr. Trump in which he falsely claimed the flu was more deadly than the coronavirus.
“I want to underscore that we remove this content regardless of who posts it,” said Monica Bickert, head of global policy management at Facebook. “That includes the president.”