Meta, X, TikTok, Snapchat, and Discord faced sharp questions and criticism from lawmakers on both sides of the aisle during a Senate hearing on Wednesday. The hearing focused on the impact of social media platforms on children’s well-being and safety, as well as the companies’ content moderation policies and practices.
Sen. Thom Tillis (R-N.C.) accused the tech executives of failing to protect children from sexual exploitation and harmful content on their platforms.
“I’m here today to tell you that we could regulate you out of business if we wanted to,” Tillis said.
The hearing was held in a packed committee room, where several parents of children who died by suicide or were targeted by online predators were present. Some of them held up photos of their deceased children and demanded accountability from the social media companies.
Sen. Josh Hawley (R-Mo.) asked Meta CEO Mark Zuckerberg to stand up and apologize directly to the parents. Zuckerberg complied and expressed his condolences to the families.
“I’m sorry for what you’re going through, and I know that this is incredibly hard,” Zuckerberg said. “No one should have to go through the things that your families have suffered.”
However, Zuckerberg and the other tech executives also defended their platforms and their efforts to improve child safety and combat misinformation. They argued that their platforms provided real benefits for millions of users, especially during the pandemic, and that they had invested heavily in developing tools and policies to protect their users. TikTok CEO Shou Zi Chew highlighted the company’s age verification system, parental controls, and educational resources.
“We take our responsibility to protect young people on our platform very seriously,” Chew said.
The tech executives also pushed back against the state laws passed by Florida and Texas that would limit their ability to moderate and curate content on their platforms. They claimed that the laws violate their First Amendment rights and would undermine their efforts to create a safe and engaging online environment for their users. X CEO Linda Yaccarino testified that the company has removed more than 1.6 billion accounts for violating its rules since 2015.
“These laws are unconstitutional and would harm the very people they purport to protect,” Yaccarino said.
The hearing was part of a broader congressional push for tighter regulation of social media companies, especially after the Jan. 6 attack on the U.S. Capitol, which was partly fueled by online misinformation and extremism. Several lawmakers have introduced bills that would address various aspects of social media regulation, such as the Stop CSAM Act and the Kids Online Safety Act, or KOSA. Some lawmakers favor repealing or reforming Section 230 of the Communications Decency Act, which shields social media companies from liability for user-generated content. Others prefer more targeted measures focused on specific issues, such as transparency, accountability, and data privacy.