
Why would a gunman livestream a shooting?

(NewsNation) — The gunman who opened fire Monday in a Louisville, Kentucky, bank and livestreamed the attack on Instagram was not the first shooter to take to social media to broadcast their crimes.

Five people were killed and several others, including responding police officers, were injured in Monday’s shooting.


Broadcasting a criminal offense to the world from a personal social media account might seem counterintuitive, but aggressors are often driven by power, notoriety and self-hatred, experts say, and social media gives them a unique ability to reach a large audience at any given moment.

While the instances are rare, they’ve accompanied some of the worst mass casualty events in recent memory.

Last year, a gunman went live on the streaming platform Twitch as he opened fire in a Buffalo, New York, supermarket, killing 10 people.

In 2019, a gunman livestreamed on Facebook during a rampage at two Christchurch mosques in New Zealand.

Back in the U.S. in 2017, a group of people streamed themselves torturing an 18-year-old man on Chicago’s West Side.

“Rationally, it would seem incriminating,” said David Carter, a professor in the School of Criminal Justice at Michigan State University. “But that’s not the mindset.”

Oftentimes, crime is about power, media psychologist Pamela Rutledge said. When it comes to livestreamed crime, however, there’s a distinction between large-scale violence like the Louisville bank shooting and lower-level offenses that might be tied to things like gang activity.

“It’s a way of trying to make oneself part of a community, or part of a society, by saying, ‘See, I matter. See, I’m here,’” Rutledge said. “Obviously, it’s not an effective way, but it just really speaks to the level of despair and isolation and self-hate that that person is going through.”

Separately, some criminals may choose to document their crimes in an attempt to look “tough,” such as in a gang initiation, Rutledge said.

Carter pointed out an additional motivation: violence driven by ideological beliefs. In those cases, aggressors tend to be more interested in communicating with other believers or dying for their cause — eliminating any concerns about potential consequences, he said.

“(In New Zealand) it was more important to deliver the message,” Carter said. “That’s what we find with particularly the ideological extremists.”

Motivations vary, but often it comes down to a combination of wanting to brag and garner recognition, said Thaddeus Johnson, a senior fellow at the Council on Criminal Justice.

That holds true even if the sought-after recognition comes at the cost of being caught, he said.

“Think about serial killers in the past,” Johnson said. “They actually left cops and detectives clues. It’s almost like they want to be caught — but they want to be caught so they can have validation of who they identify as — everybody wants to be famous.”

To a rational person, it might seem counterintuitive, but people who carry out acts of violence like the Louisville bank shooting aren’t thinking rationally, Rutledge said.

“It’s very emotion-driven,” she said. “Very likely, they have been pushed to a point of what we might consider sort of mentally unstable.”

Can social media companies prevent it?

The immediate and random nature of these livestreams poses unique challenges for social media platforms with massive audiences posting a nonstop feed of content at any given time.

“It’s hard when you think about the billions of users on many of these platforms worldwide. … We’re always reacting and playing catch-up,” Johnson said.

Both Twitch and Meta, the parent company of Facebook and Instagram, have removed livestreamed criminal violence in the past. Neither responded to NewsNation’s requests for comment, but both issued statements at the time the material was removed.

Following Monday’s shooting in Louisville, Meta announced it “quickly removed the livestream of this tragic incident this morning.”

Twitch confronted a similar situation in May 2022, following the deadly shooting in Buffalo. The company identified and removed the stream in less than two minutes and permanently banned the user.

“Live content moderation presents unique challenges, and we are continuously evaluating our policies, processes, and products to keep our community safe,” the company wrote in an official statement at the time.

Short of training artificial intelligence to detect and censor potentially violent content, solutions are bound to focus on reaction rather than prevention, Carter said.

“I can’t emphasize enough the global dynamics and billions of people potentially … who are on a social media platform at any given time,” he said. “Trying to monitor that — I mean, that’s nearly impossible.”

The policing of content, then, falls largely to social media users themselves.

“I think a big part of it is relying on the public to report and not just share offensive materials, and maybe (tech companies) can incentivize users better to report,” Johnson said.