A young person lighting a candle at the scene of the mass shooting in Buffalo, New York.
Photograph: Kent Nishimura/Getty Images

A New Lawsuit Puts the Online White Supremacy Pipeline on Trial

The families of victims of a mass shooting in Buffalo are challenging the platforms they believe led the attacker to carry out a racist massacre.

The families of four people killed in a mass shooting at a Buffalo, New York, supermarket have filed a sweeping lawsuit against a slew of major internet companies, weapons vendors, the family of the perpetrator, and a Japanese toy company.

In a lawsuit filed Friday, the families name internet giants Meta, Amazon, and Alphabet, including the social media platforms they own; smaller platforms like Reddit and Snapchat; the image board 4chan and its Japanese partner, the Good Smile Company; three firearms companies; and the parents of the shooter, Payton Gendron.

While the suit does not set specific dollar amounts—the complaint says it will do so at trial—it attempts to hold this wide array of companies liable for the losses suffered during the May 14, 2022, massacre. It also seeks an order from the court requiring the social media companies “to stop the harmful conduct … [and] remedy the unreasonably dangerous recommendation technologies in their social media products.”

Ten people were killed in the shooting. Gendron pleaded guilty to 10 counts of first-degree murder as well as weapons and hate crime charges. In a screed rife with white supremacist ideology and racist memes gleaned from 4chan, he wrote that he selected the Tops supermarket specifically because it was in a predominantly Black neighborhood.

The suit further seeks an order from the court requiring that social media companies carry warning messages for minors and their parents, stating that social media platforms are “addictive to minor users and pose a clear and present danger of radicalization and violence to the public.”

John Elmore, the Buffalo lawyer representing the families, says this case is personal. “Some of the victims were people I knew,” he says. “They came to my office for help.”

Elmore is joined by the Social Media Victims Law Center, a law firm that aims “to hold social media companies legally accountable for the harm they inflict on vulnerable users,” per its website, and the Giffords Law Center to Prevent Gun Violence, a nonprofit led by former US congresswoman Gabrielle Giffords, who survived an assassination attempt in 2011.

Elmore says his team has also consulted with lawyers who successfully won a $1.5 billion judgment against Alex Jones and his conspiracy website Infowars.

This lawsuit, should it go forward, joins a chorus of civil actions attempting to pin liability on social media platforms. The families of those killed in a racially motivated attack on the Mother Emanuel church in Charleston, South Carolina, have filed a similar action against Meta and Alphabet. (The two California-based companies have yet to respond to that lawsuit.)

Asked about his decision to sue such a wide array of actors—from the social media platforms that, the lawsuit alleges, helped radicalize Gendron to the platforms that helped him stream his crime and the gun manufacturers that enabled him to do so much damage—Elmore says, “That’s where the evidence led us.”

In particular, Elmore points to the plea allocution in which Gendron’s lawyers admitted that “the racist hate that motivated this crime was spread through online platforms, and the violence that was made possible was due to the easy access of assault weapons.”

Elmore says the goal is to force reform.

“We can’t bring the victims of this lawsuit back, but we can make sure that no other families have to file this kind of lawsuit,” he says. No family, he adds, deserves membership in that unenviable club.

The lawsuit essentially takes aim at the full journey that took Gendron from regular American teenager to violent white supremacist—one equipped with the means and the intention to massacre as many Black people as possible. The families point to platforms like Facebook and Snapchat as the first part of that process.

“Gendron’s radicalization on social media was neither a coincidence nor an accident,” the complaint alleges. “It was the foreseeable consequence of the defendant social media companies’ conscious decision to design, program, and operate platforms and tools that maximize user engagement (and corresponding advertising revenue) at the expense of public safety.”

The lawsuit claims that the white supremacist ideology that captured Gendron, particularly the “great replacement theory”—which imagines an international plot to weaken the political power of white people—is a “product of social media.” While it may have been conjured up by a French author and promoted by hardened neo-Nazis, the lawsuit claims that “replacement theory proponents rely heavily on social media—and the tools and features the Social Media Defendants utilize to increase their own engagement—to promote racist ideology to young and impressionable adherents.”

Exposure to this kind of hate propaganda as a teenager, mixed with the addictive nature of social media, fundamentally altered Gendron’s brain chemistry, Elmore argues in his filings.

Social media platforms maximized user engagement “not by showing them content they request or want to see, but rather, by showing them and otherwise recommending content from which they cannot look away,” the complaint continues. “Taking full advantage of the incomplete development of Gendron’s frontal lobe, Instagram, YouTube, and Snapchat maintained his product engagement by targeting him with increasingly extreme and violent content and connections which, upon information and belief, promoted racism, antisemitism, and gun violence.”

This is not a bug, Elmore argues. “These products were functioning as designed and intended.”

These platforms pointed Gendron to the next step in his radicalization: 4chan.

While there is no recommendation algorithm on the notorious image board, there was a waiting “community of fellow racists urging him to move forward,” the lawsuit alleges. What’s more, Gendron was a frequent user of /k/, the site’s weapons board. That community, and similar ones on Discord, helped him prepare for the attack and improve his chances of succeeding.

The lawsuit singles out 4chan financial backer Good Smile, a major Japanese toy company that in 2015 invested $2.4 million for a 30 percent share in the site, according to documents WIRED obtained. Pointing to reporting from WIRED and a lawsuit filed by former employees of the company, the families allege that Good Smile’s role in 4chan “is not that of a passive investor but is actively involved in the management of the social media site.”

In a statement from April, Good Smile denied WIRED’s reporting, insisting, “We do not have a partnership with 4chan, never had influence over the management and/or control of 4chan.” In the same statement, however, Good Smile also says, “We severed any limited relationship we previously had with 4chan in June of 2022. Since then, we have not had any relationship with 4chan.” The company has cited “confidentiality obligations” preventing it from commenting further and has since ignored multiple requests for comment.

The documents regarding 4chan’s financial backers were obtained by the New York Attorney General’s office as part of its investigation into the Buffalo attack. That office ultimately declined to pursue a case against 4chan and other online platforms but did make a slew of recommendations—including making it easier to sue internet companies that host livestreams of terror attacks.

More than just publicizing these acts of mass murder, the lawsuit alleges, streaming these attacks actually contributes to their frequency. It points to passages in Gendron’s own writings, which claim that livestreaming served as a kind of insurance policy to make sure he went ahead with the attack. The lawsuit also cites five other mass shootings that were streamed online, arguing that each one spawned copycats.

The livestream and recordings of Gendron’s attack were seen at least 3 million times across various platforms, according to the complaint. The families allege that Alphabet, Reddit, and 4chan all earned advertising revenue from the video.

On top of using Discord to stream his attack, the lawsuit alleges, Gendron “also put his plan to murder Black people in writing on his Discord account, including an in-depth analysis of the weapon and other equipment he would use for the attack.”

WIRED reached out to the social media companies named in the lawsuit. Google spokesperson Jose Castaneda emphasizes YouTube’s efforts to limit “extremist content” on its platform. “We have the deepest sympathies for the victims and families of the horrific attack at Tops grocery store in Buffalo last year,” Castaneda says in a statement. “Through the years, YouTube has invested in technology, teams, and policies to identify and remove extremist content. We regularly work with law enforcement, other platforms, and civil society to share intelligence and best practices.”

A spokesperson for Snapchat declined to comment on the lawsuit but sent a statement saying that the company’s platform does not “allow unvetted content to go viral or be algorithmically promoted. Instead, we vet all content before it can reach a large audience, which helps protect against the discovery of potentially harmful or dangerous content.” The other companies have yet to respond.

This lawsuit faces long odds. US courts have rejected similar civil claims against social media companies for their role in hosting Islamic State propaganda, which radicalized the perpetrators of a 2016 attack on the Pulse nightclub in Orlando and a 2015 terror attack in San Bernardino, California.

The Sixth Circuit Court of Appeals ruled in 2019 that, while there could be a circumstance where social media companies are held liable for acts of terrorism, the claim put forward by the families of the victims killed at Pulse would make these companies “liable for seemingly endless acts of modern violence simply because the individual viewed relevant social media content before deciding to commit the violence.”

Section 230 of the US Communications Decency Act generally shields internet platforms from liability for the actions of their users. The Supreme Court is set to test the limits of that section when it hands down a ruling in Gonzalez v. Google. In oral arguments, heard in February, lawyers for the family of Nohemi Gonzalez, an American exchange student killed in an Islamic State–inspired attack in Paris in 2015, argued that Google should not be shielded from legal consequences. YouTube, they argued, aided and abetted the terror group by recommending propaganda to unwitting users.

It is unlikely, but not impossible, that the top court will devise an exception to Section 230, allowing for these kinds of civil complaints, when it hands down its decision in the coming weeks. Elmore is undeterred by the steep hill ahead. 

“We’re going to do the best we can.”

Update 6:35 pm ET, Monday, May 15, 2023: Added comment from a Google spokesperson.