The alleged Buffalo shooter livestreamed the attack. How sites can stop such videos

The logo of live streaming video platform Twitch. (Christophe Ena / AP)

The alleged perpetrator of Saturday's mass shooting at a Buffalo supermarket livestreamed the racist attack online. Using a GoPro camera attached to a military-style helmet, the shooter streamed live on the site Twitch for around two minutes before the site took the livestream down. Since then, the video has been posted elsewhere on the internet.

Experts say platforms could be doing more to prevent livestreams of atrocities from gaining an audience online.

White supremacists have used social media platforms to publicize attacks in the past

Other white supremacists have used social media to publicize gruesome attacks, including the perpetrator of the 2019 mass shooting in Christchurch, New Zealand.

Since the Christchurch shooting, social media companies have gotten better in some ways at combating the spread of videos of atrocities online, including shutting down livestreams of attacks faster.

But violent videos like those of mass shootings are saved by some users and then reappear across the internet on Facebook, Instagram, Twitter, TikTok, and other platforms. Those reuploaded videos are harder for companies to take down, says NPR's Bobby Allyn.

On the site Streamable, the video of the Buffalo shooting was viewed more than 3 million times before it was removed, says Allyn.

New York Gov. Kathy Hochul said social media companies bear some responsibility when crimes like the Buffalo shooting happen.

"The social media platforms that profit from their existence need to be responsible for monitoring and having surveillance, knowing that they can be, in a sense, an accomplice to a crime like this, perhaps not legally but morally," Hochul said.

Allyn reports that social media companies generally are not held legally liable for content they fail to police on their sites. Listen to his discussion on Morning Edition.

Experts say social media companies could do more

Social media companies used to take a mostly hands-off approach to moderating content, but now more than ever they are trying to manage the societal problems their platforms create, reports Allyn. Facebook, Twitter and other sites like them have teams of thousands working to moderate content and block violent media from reaching people.

For example, Twitch, the site the alleged Buffalo shooter livestreamed on, could make it harder for people to open accounts and instantly broadcast live video. Other video-streaming sites, like TikTok and YouTube, require users to have a certain number of followers before they're able to stream live, reports Allyn.
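As a rough sketch of what such a gate could look like in code (the thresholds, field names and function below are invented for illustration and are not any platform's actual rules), a service might check a few account signals before enabling live broadcasting:

    from dataclasses import dataclass
    from datetime import datetime, timedelta, timezone
    from typing import Optional

    # Invented thresholds for illustration only; real platforms set their own criteria.
    MIN_FOLLOWERS = 50
    MIN_ACCOUNT_AGE = timedelta(days=30)

    @dataclass
    class Account:
        followers: int          # follower count at the time of the request
        created_at: datetime    # when the account was opened (UTC)
        verified_email: bool    # whether the account confirmed an email address

    def can_go_live(account: Account, now: Optional[datetime] = None) -> bool:
        """Return True only if the account clears basic checks before live streaming is enabled."""
        now = now or datetime.now(timezone.utc)
        old_enough = (now - account.created_at) >= MIN_ACCOUNT_AGE
        return (account.verified_email
                and old_enough
                and account.followers >= MIN_FOLLOWERS)

    # A brand-new, unverified account with no followers would be blocked from streaming.
    new_account = Account(followers=0,
                          created_at=datetime.now(timezone.utc),
                          verified_email=False)
    print(can_go_live(new_account))  # False

The point is not the particular numbers but the friction: requiring an account to build some history before it can broadcast live gives moderators more signals to work with before a stream ever reaches an audience.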


This story originally appeared on the Morning Edition live blog.

Copyright 2022 NPR. To see more, visit https://www.npr.org.

Nell Clark
Nell Clark is an editor at Morning Edition and a writer for NPR's Live Blog. She pitches stories, edits interviews and reports breaking news. She started in radio at campus station WVFS at Florida State University, then covered climate change and the aftermath of Hurricane Michael for WFSU in Tallahassee, Fla. She joined NPR in 2019 as an intern at Weekend All Things Considered. She is proud to be a member of NPR's Peer-to-Peer Trauma Support Team, a network of staff trained to support colleagues dealing with trauma at work. Before NPR, she worked as a counselor at a sailing summer camp and as a researcher in a deep-sea genetics lab.