House lawmakers spent three and a half hours on Tuesday going after Google CEO Sundar Pichai on a range of issues, but barely addressed one of the tech giant’s biggest problems: its role in spreading misinformation and radicalizing users.
Instead, the Republicans who control the Judiciary Committee mostly opted to question Pichai, during his first Congressional hearing, on alleged anti-conservative bias, largely citing the same set of dubious studies.
Numerous stories have detailed how YouTube, which is owned by Google, deploys algorithms that promote videos featuring polemics and hoax theories in a manner that white nationalists and fascist activists have acknowledged as helpful to recruitment. The platform has become famous for recommending videos about flat earth conspiracies, Sandy Hook hoaxes, and how the recent California fires were the result of “direct energy beams.”
“PJ Media found that 96 percent of search results for Trump were from liberal media outlets. In fact, not a single right-leaning site appeared on the first page of the search results,” Rep. Lamar Smith (R-Texas) said during the hearing, referencing a heavily flawed and admittedly unscientific survey conducted by the conservative outlet.
But the platform is also being used as a weapon by the far right. On Monday, the Washington Post published findings from Data & Society and the Network Contagion Research Institute, which tracks online hate speech, showing how prevalent far-right conspiracy videos from YouTube are on digital extremist hotbeds like 4Chan and Gab.ai. Twenty-two percent of users on Gab link to videos on YouTube, according to Data & Society, often using clips from the platform to push racist and anti-Semitic views.
Outlets like Bellingcat have documented how YouTube comment sections on InfoWars and other fringe videos have served as a stepping stone on the path to radicalizing users toward fascism. (InfoWars was banned from the platform in August, but videos from its affiliate NewsWars and its contributors, like Paul Joseph Watson, are still easily accessible.) Google’s social network, Google Plus, also went unaddressed at the hearing. While Plus has a smaller user base than YouTube and is set to be shut down in 2019, it has served as a home for radical content from ISIS sympathizers and white supremacists.
While committee members did ask Pichai some questions on important issues—like a search engine the company is considering deploying that would comply with Chinese censorship laws, its extensive and concerning data-collection practices, and its lack of diversity—the only mention of Google’s role in enabling the spread of hoaxes and radical, far-right ideologies came briefly from two Democratic representatives: Jamin Raskin of Maryland and Pramila Jayapal of Washington.
“I think the point at which it becomes a matter of serious public interest is when your communications vehicle is being used to promote propaganda that leads to violent events, like the guy showing up in the Pizzagate conspiracy case,” Raskin said, referencing the incident in December 2016 when a man motivated by an internet conspiracy theory fired a rifle inside a Washington, D.C., restaurant.
At least one Republican on the committee has himself spread hoaxes and conspiracy theories online. As the migrant caravan crossing through Mexico was gaining media attention, Rep. Matt Gaetz (R-Fla.) accused liberal megadonor George Soros, without basis, of potentially funding the caravan. Though Gaetz’s accusation came on Twitter, the conspiracy theory had spread on YouTube as well.