A year after Trump purge, ‘alt-tech’ offers far-right refuge
Philip Anderson is no fan of online content moderation. His conservative posts have gotten him kicked off Facebook, Twitter and YouTube. Two years ago, Anderson organized a “free speech” protest against the big tech companies. A counterprotester knocked his teeth out.
But even Anderson was repulsed by some of the stuff he saw on Gab, a social media platform that’s become popular with supporters of former President Donald Trump. It included Nazi imagery, racist slurs and other extreme content that goes way beyond anything allowed on major social media platforms.
“If you want Gab to succeed then something has to be done,” Anderson, who is Black, wrote in a recent Gab post. “They are destroying Gab and scaring away all the influential people who would make the platform grow.”
The responses were predictable — more Nazi imagery and crude racial slurs. “Go back to Africa,” wrote one woman with a swastika in her profile.
A year after Trump was banned by Facebook, Twitter and YouTube, a rowdy assortment of newer platforms has lured conservatives with promises of a safe haven free from perceived censorship. While these budding platforms offer some ideological competition to their dominant counterparts, they have also become havens for misinformation and hate. Some experts are concerned that they'll fuel extremism and calls for violence even if they never replicate the success of the mainstream sites.
App analytics firm Sensor Tower estimates that Parler’s app has seen about 11.3 million downloads globally on the Google and Apple app stores, while Gettr has reached roughly 6.5 million. That growth has been uneven. Parler launched in August 2018, but it didn’t start gaining traction until 2020. It saw its most monthly installs in November 2020, when it hit 5.6 million.
While new platforms may be good for consumer choice, they pose problems if they spread harmful misinformation or hate speech, said Alexandra Cirone, a Cornell University professor who studies the effect of misinformation on government.
“If far-right platforms are becoming a venue to coordinate illegal activity — for example, the Capitol insurrection — this is a significant problem,” she said.
Falsehoods about the 2020 election fueled the deadly attack on the U.S. Capitol last year, while research shows far-right groups are harnessing COVID-19 conspiracy theories to expand their audience.
While Facebook and Twitter serve a diverse general audience, the far-right platforms cater to a smaller slice of the population. The loose-to-nonexistent moderation they advertise can also create hothouse environments where participants ramp each other up, and where spam, hate speech and harmful misinformation bloom.
Gab launched in 2016 and now claims to have 15 million monthly visitors, though that number could not be independently verified. The service says it saw a huge jump in signups following the Jan. 6, 2021, riot, which prompted Facebook, Twitter and YouTube to crack down on Trump and others who they said had incited violence.
By comparison, Facebook has 2.9 billion monthly users and 211 million people use Twitter daily.
“We tolerate ‘offensive’ but legal speech,” site creator Andrew Torba wrote in an email to Gab subscribers recently. “We believe that a moderation policy which adheres to the First Amendment, thereby permitting offensive content to rise to the surface, is a valuable and necessary utility to society.”
Offensive content is easy to find on Gab. A search turns up user names featuring racial epithets, as well as antisemitic screeds, neo-Nazi fantasies and homophobic rants.
Members of far-right groups like the Proud Boys? They’re on Gab. So is the Georgia congresswoman kicked off Twitter for spreading COVID-19 misinformation. Steve Bannon, banned from Twitter for suggesting the beheading of Dr. Anthony Fauci, has 72,000 followers on Gab.
Torba wrote in an email to the AP that he envisions Gab someday being “the backbone of the consumer free speech Internet” and rivaling Facebook and Google.
Gettr, a more recent arrival, is aiming for a slightly more moderate product. Helmed by former Trump senior adviser Jason Miller, Gettr launched in July 2021 and now has 4.5 million users. While the site is dominated by conservative voices for now, Miller said he welcomes all viewpoints.
The site bans racial and religious epithets and violent threats. Nonetheless, a quick search turns up pro-Nazi content, as well as a user whose name includes the N-word.
“Hitler had some damn good points,” reads one post.
Gettr’s growing user base in Brazil includes President Jair Bolsonaro, who has been cited by Facebook for breaking rules regarding COVID-19 misinformation and the use of fake accounts.
“I think there’s plenty of room for all of our platforms,” Miller said when asked about competition with other new sites. “It’s much more about us taking away market share from Facebook and Twitter than competing amongst ourselves.”
Telegram, a more mainstream messaging platform with a broad global user base, has also become popular with Trump supporters. Trump has said he plans to launch his own social media platform.
There is no indication that far-right users have left Facebook or Twitter in droves. Users can keep their old Facebook accounts to stay connected with friends while using Telegram or Parler for unmoderated content.
“So now social media companies are effectively vying for screen time across users,” said Cirone, the Cornell professor.
Anderson, a Trump supporter from Texas, said he doesn’t know why he was kicked off Facebook and Twitter. He was outside the Capitol during the Jan. 6, 2021, attack and has supported the Proud Boys. Twitter declined to comment publicly on Anderson; Facebook did not respond to messages seeking comment.
While Facebook, YouTube and Twitter have taken steps to remove extremist material, Cirone said some groups are still evading moderation. And as Facebook whistleblower Frances Haugen revealed in leaked internal documents last year, the company has struggled to moderate non-English language content.
There are also limits to content moderation.
“Content will travel, and ideas will evolve. Content moderation has political consequences,” said Wayne Weiai Xu, an expert on disinformation and social media at the University of Massachusetts Amherst. “It plays right into the far-right talking point that big tech is censoring speech and that the liberal elite is forcing the so-called ‘cancel culture’ onto everyone.”