On the morning of January 7, a group of Facebook sales reps were busily firing off notes to some of their most important clients.
Moments earlier, Mark Zuckerberg had done the unthinkable and locked US President Donald Trump out of his Facebook account until the end of his term. The move represented the most serious punishment any tech company had dared mete out to Trump, and the potential blowback was unknown and frightening.
In the notes to their clients, the Facebook staffers sought to reassure partners who might be nervous about the business implications of the move, and to explain the basis for Facebook’s decision, according to two sources who received the notes.
Unlike with previous major decisions, Facebook had not sent out an official company memo to advertisers and agencies, sources said. There may not have been time.
Facebook’s decision to ban Trump was announced publicly by Zuckerberg less than 24 hours after a mob of Trump supporters, egged on by the President, stormed the US Capitol in Washington DC. After years of grappling with how to deal with Trump’s provocative behavior on its platform, Facebook’s move that Thursday was surprisingly swift and decisive.
It set in motion an extraordinary sequence of events that left Trump banished from all the major social media platforms on which he’d built a formidable political movement, as the key powers of the tech industry — from Twitter to Google to Amazon — launched a sweeping crackdown that targeted the president as well as a variety of far-right organizations and services.
The actions taken by the internet companies in the days that followed the Capitol siege are sure to be studied and analyzed for years to come. And the justifications of the crackdown are already provoking a heated debate about the concentration of power in the tech industry and the role of social media as a platform for public discourse.
Insider spoke to tech employees, executives, industry insiders, and activists, and reviewed media reports and social media activity, to piece together the frenetic week that changed tech forever. Among the questions still unanswered is the extent of any coordination between the tech firms. But a clear picture that emerges shows how a mix of motives, pressure, and spontaneous events — inside and outside the companies — suddenly and inexorably converged to bring about a long-overdue reckoning.
“There’s a real aversion to removing people from the platform,” a source close to Twitter said. “The company never really had to wrestle with this until the last four years.”
A current Facebook employee was more blunt: “It’s been bat-shit,” they said. “We are all in uncharted territory.”
“You have all failed the American people today”
The atmosphere inside Facebook had reached a boiling point on Wednesday January 6, as Trump’s “stop the steal” rally in Washington dispersed and throngs of angry supporters broke into the Capitol building in a frenzy that shocked viewers around the world and would leave five people dead.
Employees at Facebook flooded the company’s internal messageboard, calling for their employer to finally take stronger action against Trump. “Please ban him, I feel embarrassed and heartbroken in our company’s inactions in perpetuating his lunacy,” one employee wrote, according to The Information.
Outside advocacy groups contacted leaders at Facebook and turned up the heat. “We were in touch with Sheryl [Sandberg] last week during this whole crisis,” said Jim Steyer, the CEO and founder of Common Sense Media.
Steyer said the leadership of the group Stop Hate for Profit told Facebook’s Chief Operating Officer that the time had come to finally muzzle Trump and to forcefully clamp down on extremism. “You’re promoting lies from Donald Trump, you should take them down; and you are allowing FB Groups to allow white supremacists and domestic terrorists to organize their plans on Facebook Groups,” Steyer recounted in an interview with Insider.
The situation for Twitter was much the same.
“You have all failed the American people today,” musician Selena Gomez tweeted to Twitter CEO Jack Dorsey and the top executives at Facebook and YouTube.
When Trump posted a video later on Wednesday urging the rioters to go home while also praising them as “very special” people, Twitter and Facebook were finally spurred to act. Twitter took down the video, and implemented a 12-hour block on Trump’s account; Facebook followed an hour later, yanking the video and putting Trump in the penalty box for 24 hours. Snapchat put an indefinite hold on Trump’s account.
It was the harshest sanction Trump had ever faced from any social media company. And it was about to get much worse.
Taking the nuclear option in French Polynesia
While the crisis erupted, Twitter CEO Jack Dorsey found himself more than 4,000 miles away from the action, and from a trusted lieutenant.
Dorsey was vacationing in French Polynesia while Twitter’s top lawyer and policy lead, Vijaya Gadde, was on the West Coast, as The New York Times was first to report. The two executives urgently conferred by phone about how to deal with Trump, a source told Insider. Decisions of this magnitude at Twitter “are made entirely by Jack Dorsey and Vijaya Gadde,” according to another source with direct knowledge of the matter.
On Thursday, Facebook had escalated its 24-hour Trump ban and announced that the President would be locked out of the social network until at least January 20, when he leaves the White House. Zuckerberg, who has complete control over Facebook thanks to its dual-class share structure, had come to the decision after consulting Chief Operating Officer Sheryl Sandberg and various other top lieutenants, including global affairs boss Nick Clegg, policy chief Joel Kaplan, and diversity chief Maxine Williams, as first reported by NBC News.
The example set by Facebook raised the pressure on Twitter. A Twitter spokesperson told Insider that policy enforcement recommendations are made by its trust and safety team, which reports into Gadde. She then took that recommendation to Dorsey, the spokesperson said.
When Dorsey and Gadde spoke by phone on Friday, they agreed to go with the nuclear option: Twitter would ban Trump — one of the most popular users on its service — forever.
“I do not celebrate or feel pride in our having to ban @realdonaldtrump from Twitter,” Dorsey said a few days later in a string of tweets explaining the decision. “We made a decision with the best information we had based on threats to physical safety both on and off Twitter,” he noted.
After Dorsey and Gadde made the momentous decision and communicated it to staff, three board members reached out to discuss it in more detail with Dorsey. Twitter’s stock sank roughly 12% on the news and the move instantly provoked fury among not just Trump loyalists, but even some Trump critics, who worried that Twitter had overstepped by permanently banning the President of the US.
Roger McNamee, an early critic of Facebook, said he believes concerns about legal liability played a part in the rapidly spreading action by multiple tech companies. “I think the lawyers at Facebook, at Google, at Twitter, looked at this thing and went OMG, we enabled this,” he told Insider.
“They realize their legal jeopardy is rising, not falling. And I would be willing to bet anything that this was not the result of these guys suddenly having some sense of civic duty, but rather that they have genuine legal jeopardy here.”
No sanctuary on Parler
The tech industry’s crackdown spread from Trump’s personal accounts to the broader far-right ecosystem. Reddit banned r/DonaldTrump, a forum dedicated to supporting the US President, on the grounds that its rules “prohibit content that promotes hate, or encourages, glorifies, incites, or calls for violence against groups of people or individuals.” YouTube banned the channel of former Trump aide Steve Bannon’s podcast.
And Parler, a social network whose lax approach to content moderation made it a favorite of far-right extremists and a likely refuge for Trump, suddenly found itself in the crosshairs of Google, Apple and Amazon.
Parler had been a concern for Google for months. The app hosted what Google defines as ‘User Generated Content’, meaning it was required to have tools to moderate content such as hate speech, sexual profanity, and anything that promoted violence. The Play Team in charge of policy had sent Parler multiple reminders about these rules, but wasn’t seeing any changes. Meanwhile, Parler CEO John Matze was publicizing his app as a haven for free speech, which only drew more attention to the problem.
When the Capitol Hill riots happened, the content on Parler suddenly got a lot more graphic and violent. In an email to Business Insider, Google cited several posts as examples of why it took action, including one which read: “How do we take back our country? About 20 or so coordinated hits will turn everything around. The tree of liberty needs watering.”
Google’s “Merchandising” team, which is responsible for featuring apps on Google’s Play storefront, had already worried that promoting Parler meant promoting a tool for further violence. The team planned to hold a meeting that Friday — but before they could convene, the policy team informed them that the company was pulling the app entirely, according to a source familiar with those discussions.
The decision also meant Google was getting ahead of Apple, which had issued a 24-hour warning for Parler to fix its moderation policies. Google’s ban was immediate, though a Google spokesperson told Business Insider that Parler would be allowed to return to the Play Store if it brought its moderation policies in line with the Play guidelines.
A surprise move by Amazon
The moves by Google and Apple threatened to eliminate vital distribution channels for Parler’s app, a significant but not fatal blow to the social network. On Saturday, Amazon joined the fray and went for Parler’s jugular.
Amazon Web Services, the ecommerce giant’s cloud hosting business, sent Parler an email that day saying it was kicking it off the Amazon cloud — effectively taking Parler offline.
Amazon “cannot provide services to a customer that is unable to effectively identify and remove content that encourages or incites violence against others,” read the email, which was obtained by BuzzFeed News.
Amazon had already privately raised concerns with the social network over content on its platform, according to a company spokesperson. But according to Parler adviser Jeffrey Wernick, Amazon’s notice came out of the blue.
“If we had any inkling there was a problem with the relationship, we’d already be making plans to migrate off,” Wernick told Insider in an interview.
“They sent us some stuff in December that they thought we should take down,” Wernick said. “We acted very quickly to take it down. We agreed with their opinion. Then on January 6 they sent us correspondence that had no attachment to it. Their response was something like ‘Don’t worry, everything is resolved.’”
Some AWS employees speculate that the decision was driven by a combination of employee unrest and major customers complaining to company leadership. “Generally we get involved with as little as possible unless our hand is forced,” one AWS employee told Insider. “The largest customers have executive sponsors who they can call directly, in many cases, Andy [Jassy, CEO of AWS] himself.”
One former senior AWS employee said the decision to boot Parler “broke new ground” in terms of the company’s enforcement actions. Amazon’s cloud business is “religious about not discontinuing services,” added another former senior-level AWS employee. Because Amazon oversees a critical aspect of its customers’ operations, the last thing it wants is customers thinking “if it can happen to Parler, it can happen to me,” the former employee explained.
Tim Bray, a senior Amazon engineer who previously resigned over Amazon’s firing of protesting workers, said he expected that questions of legal liability factored into the decision. “If I were Andy Jassy, I would have been talking to the lawyers to say, ‘Hey, are we liable for these guys?’ I suspect the lawyers said, ‘Maybe,’” he said.
“I’m sure the PR factor was there, and the Amazon employees [calling for action] – it would be foolish to say any of those had no effect, but I would be willing to bet it was [a legal decision].”
Apple also followed Google’s lead and brought down the hammer on Saturday, suspending Parler from its iOS App Store. And over the next 48 hours, enterprise tech firms like Okta and Twilio announced they were cutting ties with Parler, depriving it of important back-end tools that powered its service.
Brace for blowback
By Tuesday, TikTok and YouTube were the only major tech platforms left open to Donald Trump.
YouTube’s policy team had thus far believed that Trump was less likely to use YouTube in the way he used Twitter and Facebook, and so it chose to keep the account active.
But they were soon proven wrong, with Trump uploading content on Tuesday that YouTube determined violated its policy on inciting violence. The content was removed, and Trump’s channel was slapped with a suspension for at least one week. YouTube also implemented an indefinite ban on comments on all of Trump’s videos. (YouTube declined to share details on the nature of the content that was removed.)
The crackdown that began as a short-term restriction on Trump’s social media accounts had become a full-fledged tech industry purge of extremist content. Airbnb announced it would ban anyone who was identified as being part of the storming of the Capitol, as well as members of the extremist group Proud Boys. And Salesforce announced that it would block the Republican National Committee from sending emails that incite violence.
“I do not believe this was coordinated,” Dorsey said on Twitter, addressing the fact that all of the tech companies seemed to spring into action around the same time. “More likely,” Dorsey continued, “companies came to their own conclusions or were emboldened by the actions of others.”
Still, tech industry insiders are bracing for how their sudden and extraordinary clampdown will be perceived and what counterreaction it may provoke. “Do I think, fundamentally, there’s going to be blowback? Unambiguously,” a former Facebook employee said. “In terms of continuing to foment distrust in how the platform is policed, how much power certain platforms have. This has given life to the concerns.”
Parler registered its domain on Monday with Epik, a registrar and web hosting service known for hosting Gab, the social network popular with white supremacists. However, Epik insists it has not entered into any talks about hosting Parler, the company told Fox Business. Parler has also sued Amazon, claiming that the ecommerce and web hosting giant violated antitrust law.
“Parler has been completely mischaracterized and the truth will be revealed through the litigation process,” Parler’s Wernick said. “We hope we’ll be able to see all the communications inside Google, inside Apple, inside Facebook, inside Amazon,” he continued.
With the Biden administration set to take over in less than a week, many critics have been quick to characterize the tech industry’s actions as efforts to curry political favor.
Color of Change, a civil rights group that has long pushed Facebook for stronger hate speech moderation, attributes this week’s tidal wave of action in part to a sense of momentum and pressure from other companies that reached a point at which inaction was no longer possible.
“I don’t think anyone wants to be looked at as the least progressive out of the tech companies,” said senior campaigns director Jade Magnus Ogunnaike. “Everyone didn’t want to make waves, they just wanted to continue to make money off Trump and Trumpism, no matter how violent it was, and so that’s the problem.”
For all the drastic actions the tech companies took, many of the underlying incentives that led to the crisis remain, she said.
“When a younger, more charismatic, more strategic, more dangerous proponent of Trumpism comes up, will we have to go through this entire process again? I believe so.”
Additional reporting by: Hugh Langley, Lara O’Reilly, Matt Drange, Candy Cheng, Eugene Kim, Meghan Morris.