Facebook will “indefinitely” block President Donald Trump from its platforms, saying the president’s posts are too risky in the wake of a harrowing attack by his supporters on the US Capitol. Facebook CEO Mark Zuckerberg announced the unprecedented move on Thursday, a day after rioters stormed the Capitol as Congress met inside to certify Joe Biden as the next US president.
“We believe the risks of allowing the President to continue to use our service during this period are simply too great,” Zuckerberg said in a Facebook post. “Therefore, we are extending the block we have placed on his Facebook and Instagram accounts indefinitely and for at least the next two weeks until the peaceful transition of power is complete.”
Biden's inauguration will take place on Jan. 20.
The ban, which followed earlier restrictions on the president's posts, represents Facebook's strongest action against Trump's use of social media to spread misinformation, stir grievances and incite violence. The social media giant, which owns photo-sharing app Instagram, has had a mostly hands-off approach to political speech, exempting politicians from fact-checking. Instead, Facebook has kept up some of Trump's controversial posts or added labels to the president's baseless claims of election fraud.
“We did this because we believe that the public has a right to the broadest possible access to political speech, even controversial speech. The current context is now fundamentally different, involving use of our platform to incite violent insurrection against a democratically elected government,” Zuckerberg said.
The violence that broke out on Capitol Hill on Wednesday marked a turning point for Facebook and other social networks, which had been reluctant to silence Trump, citing the public interest. The Metropolitan Police Department said Wednesday night that four people died when a mob stormed the US Capitol. The actions by social networks also highlighted some of the differences in how they handle political content.
Meanwhile, the companies are also facing more pressure from civil rights activists, politicians and others to do more. Sen. Mark Warner, a Virginia Democrat, said Thursday the steps taken by Facebook and Twitter — as well as by YouTube — were “too late and not nearly enough” to curb the problem. Former First Lady Michelle Obama called on Silicon Valley companies to “stop enabling this monstrous behavior” and permanently ban Trump and create policies to prevent technology from “being used by the nation’s leader to fuel insurrection.”
As social media companies clamped down on Trump, some analysts praised the moves but said they were overdue. “Anyone who was following the disinformation on these platforms knew this was probably, unfortunately, inevitable,” said Bob O’Donnell, chief analyst at Technalysis Research. “And what happened on the platforms unquestionably led to these events.”
The White House didn't immediately respond to a request for comment. On Thursday, Trump issued a statement via White House Social Media Director Dan Scavino saying that an "orderly transition" of power would occur on Jan. 20. However, Trump also used the statement to note that he "totally" disagrees with the election's outcome. He has yet to concede that he lost the election two months ago.
Stronger action against social networks could also push users to alternatives such as Parler and Gab. Here’s how other social networks are handling Trump and content that could incite violence:
Trump has more than 88 million followers on Twitter, allowing him to reach a massive audience online.
On Wednesday, Twitter locked Trump's account for the first time because three of his tweets violated the company's rules against interfering in elections or other civic processes. Trump has deleted the tweets, but his account will still be locked for 12 hours.
The move came after University of Virginia law professor Danielle Citron, journalist Kara Swisher, Obama Foundation CTO Leslie Miley and Anti-Defamation League CEO Jonathan Greenblatt were among high-profile figures urging Twitter to boot Trump from the platform.
Twitter in the past has placed a public interest notice over Trump's tweets for glorifying violence, which limited how widely those tweets could be spread.
“Our public interest policy — which has guided our enforcement action in this area for years — ends where we believe the risk of harm is higher and/or more severe,” Twitter said in a tweet.
The company said it will continue to “evaluate the situation in real time.”
On Thursday, Google-owned YouTube tightened a new policy that Trump’s channel violated a day earlier. This intensification of enforcement could accelerate his account’s termination if the channel continues to run afoul of the rule.
Last month, YouTube instituted a policy to remove any new videos alleging that fraud altered the outcome of the 2020 US presidential election. On Wednesday, Trump’s channel posted a video that did just that. His video message urged supporters to “go home now” but also repeated false claims about election fraud. YouTube removed the video under its policy. But the policy, implemented last month, had a grace period lasting until Jan. 20, Inauguration Day. With the grace period, channels breaking the rule would have the offending video removed but faced no other penalties.
YouTube said it has now ended the grace period, rather than waiting until Inauguration Day. Now, videos that violate that policy will be issued a "strike." Channels are temporarily suspended from posting or livestreaming when they receive a strike, and YouTube's three-strikes system permanently bans channels that accrue three violations in a 90-day period.
“We apply our policies and penalties consistently, regardless of who uploads it,” YouTube tweeted.
A YouTube spokesman said the company didn't feel it needed to unilaterally punish Trump because it had already laid out its three-strikes policy for barring creators from posting content. The spokesman also said false claims may come not only from Trump himself but from others within the president's orbit, and the policy would apply to them as well.
Snapchat also locked Trump's account for the first time on Wednesday, though it's not the first time the disappearing-messaging app has taken action against the president's content.
In June, Snapchat said it would no longer promote Trump's account on a page of curated content called Discover because it doesn't want to "amplify voices who incite racial violence and injustice." The move came after racial justice protests broke out in the aftermath of the death of George Floyd.
The company told The New York Times it made the decision to not promote Trump’s account after he tweeted that if protesters outside the White House breached the fence, they’d be “greeted with the most vicious dogs, and most ominous weapons.”
Twitch, owned by Amazon, unplugged Trump’s account as well. “In light of yesterday’s shocking attack on the Capitol, we have disabled President Trump’s Twitch channel,” a spokeswoman said in a statement. “Given the current extraordinary circumstances and the President’s incendiary rhetoric, we believe this is a necessary step to protect our community and prevent Twitch from being used to incite further violence.”
Gab said in a blog post it’s in the process of connecting with Trump’s team about joining the platform. The company set up an account for Trump with more than 448,100 followers. Gab CEO Andrew Torba said in an email that the Trump video other social networks removed “explicitly called for peace.”
Gab, which says it champions free speech, has been used by extremists such as neo-Nazis and white supremacists who have been booted from other social networks. On Wednesday, some Gab users documented going into the offices of Congress members and called for people inside the building to hunt down Vice President Mike Pence, who Trump had criticized earlier in the day, The New York Times reported.
In a blog post, Gab said it works with law enforcement to promote public safety. “We proactively report when our moderation team discovers content which we believe poses an imminent threat to life and respond rapidly when law enforcement identifies any such threat.”
Parler, which has a similar feel to Twitter, is another social network that conservatives have been flocking to after social networks intensified their crackdown against far-right groups such as the Proud Boys.
The company, though, has fewer rules than Facebook, Twitter and other major social networks. “We prefer that removing community members or member-provided content be kept to the absolute minimum,” Parler’s rules state. The platform, for example, can’t knowingly be used “for crime, civil torts, or other unlawful acts.”
In an interview with The New York Times published on Thursday, Parler CEO John Matze said that the company would get involved if its users were breaking the law but that it’s up to a community jury to decide what is illegal or against the company’s rules.
“Look, if it was illegally organized and against the law and what they were doing, they would have gotten it taken down. But I don’t feel responsible for any of this and neither should the platform, considering we’re a neutral town square that just adheres to the law,” he said in the interview.
TikTok said that Trump doesn’t have an account on the short-form video platform that it’s aware of.
“We expect everyone on our platform to follow our Community Guidelines, and content and accounts that violate our policies are removed,” the company said in a statement.
Content that seeks to glorify or promote violence would violate those rules and be removed, the company said.