Facebook, Google Spread Misinformation About Las Vegas Shooting. What Went Wrong?

Police form a perimeter around the road leading to the Mandalay Bay Resort and Casino after a gunman killed 59 people and injured more than 500 others when he opened fire Sunday night on a country music concert in Las Vegas. (Mark Ralston / AFP/Getty Images)

In the hours just after the massacre in Las Vegas, fake news started showing up on Google and Facebook. A man was falsely accused of being the shooter. His name bubbled up on Facebook's emergency pages, and when you searched it on Google, links to sites connecting him with the shooting topped the first page of results.

It appears to be another case of automation working so fast that humans can't keep pace. Unfortunately, these powerful tech companies continue to be a main destination for news, and it's not clear how they can solve the problem.

In this particular case, the man's name first appeared on a message board on a site called 4chan. It's known as a gathering spot for underground hackers and the alt-right. Everyone who posts is anonymous. And we're not publishing the man's name because he's been through enough.

Shortly after the shooting, police announced that a woman named Marilou Danley was a person of interest. She had been living with the shooter in his Nevada home.

On a message board called /pol/, short for Politically Incorrect, someone said her ex-husband was the shooter. His Facebook page indicated he was a liberal, and the far-right trolls on /pol/ went to work to spread the word.

Even after police identified the shooter, the wrong man's name appeared for hours in tweets. On Facebook, it appeared on an official "safety check" page for the Las Vegas shooting, which displayed a post from a site called Alt-Right News. And on Google, top searches linked to sites that said he was the shooter. When you searched his name, a 4chan thread about him was promoted as a top story.

So, why did parts of these hugely powerful companies continue to point to an innocent man?

Bill Hartzer, an expert on search, says Google is constantly crawling the Web and picking up new information as it appears. The innocent man went from having hardly anything online to having a whole bunch of stuff.

"Google has not had the time to really vet the search results yet," Hartzer says. "So what they'll do is they will show what they know about this particular name or this particular keyword."

In a statement, Google said the results should not have appeared, but the company will "continue to make algorithmic improvements to prevent this from happening in the future."

One improvement Greg Sterling thinks Google should make is putting less weight on certain websites, like 4chan. "In this particular context, had they weighted sites that were deemed credible more heavily, you might not have seen that," says Sterling, a contributing editor at Search Engine Land. Put another way, he says, "if news sites ... were given some sort of preference in this context you might not have seen that."
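To make the idea concrete, here is a minimal, hypothetical sketch of the kind of credibility weighting Sterling describes. It is not Google's actual ranking algorithm; the domain lists, weights, and relevance scores are invented for illustration only.

```python
# Hypothetical sketch: boost results from a curated list of credible sources.
# The domain lists, weights, and relevance scores below are invented for
# illustration; this is not how Google's ranking actually works.

CREDIBLE_DOMAINS = {"apnews.com": 2.0, "npr.org": 2.0, "reuters.com": 2.0}
LOW_TRUST_DOMAINS = {"4chan.org": 0.2}

def weighted_score(result):
    """Combine raw relevance with a crude source-credibility multiplier."""
    base = result["relevance"]  # e.g., freshness plus keyword match
    domain = result["domain"]
    weight = CREDIBLE_DOMAINS.get(domain, LOW_TRUST_DOMAINS.get(domain, 1.0))
    return base * weight

def rank(results):
    """Order search results so credible sources rise during breaking news."""
    return sorted(results, key=weighted_score, reverse=True)

if __name__ == "__main__":
    results = [
        {"domain": "4chan.org", "relevance": 0.9},   # fresh but unvetted
        {"domain": "apnews.com", "relevance": 0.6},  # slower but credible
    ]
    for r in rank(results):
        print(r["domain"], round(weighted_score(r), 2))
```

Under this toy weighting, the wire-service result outranks the 4chan thread even though the thread scores higher on raw relevance, which is the trade-off Sterling is pointing at.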

Unfortunately, it seemed like Facebook was giving those same sites credibility. In a statement, Facebook said it was working on a way to fix the issue that caused the fake news to appear. (Disclosure: Facebook pays NPR and other leading news organizations to produce live video streams that run on the site.)

But Sterling says part of the issue with having these companies determine what's news is that they're run by engineers. "For the most part the engineers and the people who are running Google search don't think like journalists," he says. "They think like engineers running a product that's very important."

And then there is the scale of what Google and Facebook do. They are huge. And that's only possible because computers do a lot of the work. Yochai Benkler, a law professor at Harvard, says that at such massive scale, even if there were humans helping out, there would be mistakes.

Benkler says that even if Facebook and Google blocked sites like 4chan, it wouldn't solve the problem. "Tomorrow in another situation like this someone will find some other workaround," Benkler says. "It's not realistic to imagine perfect filtering in real time in moments of such crisis."

But, for the man who spent hours being accused of mass murder, the technical problems at Google and Facebook probably aren't much comfort. And they won't be much comfort to the next person who lands in the crosshairs of fake news.

Copyright 2024 NPR

Laura Sydell fell in love with the intimate storytelling qualities of radio, which combined her passion for theatre and writing with her addiction to news. Over her career she has covered politics, arts, media, religion, and entrepreneurship. Currently Sydell is the Digital Culture Correspondent for NPR's All Things Considered, Morning Edition, Weekend Edition, and NPR.org.