WASHINGTON -- A congressional hearing on online hate turned into a vivid demonstration of the problem Tuesday when a YouTube livestream of the proceedings was bombarded with racist and anti-Semitic comments from internet users.
YouTube disabled the live chat section of the streaming video about 30 minutes into the hearing because of what it called "hateful comments."
The incident came as executives from Google and Facebook appeared before the House Judiciary Committee to answer questions about the companies' role in the spread of hate crimes and the rise of white nationalism in the U.S. They were joined by leaders of such human rights organizations as the Anti-Defamation League and the Equal Justice Society, along with conservative commentator Candace Owens.
Neil Potts, Facebook director of public policy, and Alexandria Walden, counsel for free expression and human rights at Google, defended policies at the two companies that prohibit material inciting violence or hate. Google owns YouTube.
"There is no place for terrorism or hate on Facebook," Potts testified. "We remove any content that incites violence."
The hearing broke down into partisan disagreement among the lawmakers and some of the witnesses, with Republican members of Congress denouncing Democratic Rep. Ilhan Omar's criticism of American supporters of Israel as hate speech.
As the bickering went on, committee chairman Rep. Jerrold Nadler, D-N.Y., was handed a news report including the hateful comments about the hearing on YouTube. He read them aloud, along with the users' screen names, as the room quieted.
"This just illustrates part of the problem we're dealing with," Nadler said.
Democratic Rep. David Cicilline of Rhode Island grilled the Facebook and Google executives about their companies' responsibility for the spread of white supremacist views, pushing them to acknowledge they have played a role, even if it was unintentional. Potts and Walden conceded the companies have a duty to try to curb hate.
But the challenges became clear as Cicilline pushed Potts to answer why Facebook did not immediately remove far-right commentator Faith Goldy last week, after announcing a ban on white nationalism on the social network.
Goldy, who has asked her viewers to help "stop the white race from vanishing," was not removed until Monday.
"What specific proactive steps is Facebook taking to identify other leaders like Faith Goldy and preemptively remove them from the platform?" Cicilline asked.
Potts reiterated that the company works to identify people with links to hate and violence and banishes them from Facebook.