Tech leaders faced a grilling in the Senate, and one offered an apology. But skeptics fear little will change this time.
A lot of heat, but will there be regulation?
Five technology C.E.O.s endured hours of grilling by senators on both sides of the aisle about their apparent failures to make their platforms safer for children, with some lawmakers accusing them of having “blood” on their hands.
But for all of the drama, including Mark Zuckerberg of Meta apologizing to relatives of online child sex abuse victims, few observers believe that there’s much chance of concrete action.
“Your product is killing people,” Senator Josh Hawley, Republican of Missouri, flatly told Zuckerberg at Wednesday’s hearing. Over 3.5 hours, members of the Senate Judiciary Committee laid into the Meta chief and the heads of Discord, Snap, TikTok and X over their policies. (Before the hearing began, senators released internal Meta documents that showed that executives had rejected efforts to devote more resources to safeguard children.)
But tech C.E.O.s offered only qualified support for legislative efforts. Those include the Kids Online Safety Act, or KOSA, which would require tech platforms to take “reasonable measures” to prevent harm, and STOP CSAM and EARN IT, two bills that would curtail some of the liability shield given to those companies by Section 230 of the Communications Decency Act.
Both Evan Spiegel of Snap and Linda Yaccarino of X backed KOSA, and Yaccarino also became the first tech C.E.O. to back the STOP CSAM Act. But neither endorsed EARN IT.
Zuckerberg called for legislation to force Apple and Google — neither of which was asked to testify — to be held responsible for verifying app users’ ages. But he otherwise emphasized that Meta had already offered resources to keep children safe.
Shou Chew of TikTok noted only that his company expected to invest over $2 billion in trust and safety measures this year.
Jason Citron of Discord allowed that Section 230 “needs to be updated,” and his company later said that it supports “elements” of STOP CSAM.
Experts worry that we’ve seen this play out before. Tech companies have zealously defended Section 230, which protects them from liability for content users post on their platforms. Some lawmakers say altering it is crucial to holding online platforms to account.
Meanwhile, tech groups have fought state-level efforts to restrict children’s use of their services, arguing that such laws would create a patchwork of regulations better addressed by Congress.
But Congress has failed to move meaningfully on such legislation. Absent a sea change in congressional will, Wednesday’s drama may have been just that: drama.