
VIDEO: Hirono Presses Big Tech Executives to Protect Children Online

Sen. Hirono: “If you are going to continue to attract kids to your platforms, you have an obligation to ensure they are safe on those platforms.”

WASHINGTON, DC – Today, U.S. Senator Mazie K. Hirono (D-HI), a member of the Senate Judiciary Committee, pressed the CEOs of five major social media platforms—Meta, X Corp., TikTok Inc., Snap Inc., and Discord Inc.—about the dangers that underage users face on these platforms. In the hearing, Senator Hirono called out the companies for attracting children to their platforms and then failing to protect them from harm on those same platforms.

“This hearing is about how to keep children safe online. We listened to all of your testimonies to seemingly impressive safeguards for young users,” Senator Hirono said in her opening remarks. “You try to limit the time that they spend, you require parental consent, you have all of these tools, yet trafficking and exploitation of minors online—and on your platforms—continues to be rampant.”

Senator Hirono also highlighted how these social media platforms facilitate the creation and distribution of child sexual abuse material (CSAM) and sex trafficking—a prevalent issue in Hawaii that she has consistently worked to combat.

“Sex trafficking is a serious problem in my home state of Hawaii, especially for Native Hawaiian victims,” continued Senator Hirono. “That social media platforms are being used to facilitate this trafficking, as well as the creation and distribution of CSAM, is deeply concerning. But it’s happening.”

“If you are going to continue to attract kids to your platforms, you have an obligation to ensure they are safe on those platforms,” concluded Senator Hirono. “Your companies cannot continue to profit off young users, only to look the other way when those users, children, are harmed online.”

A transcript of Senator Hirono’s opening remarks is below, and video is available for download here.

As we’ve heard, children face all sorts of danger when they use social media, from mental health harms to sexual exploitation, even trafficking. Sex trafficking is a serious problem in my home state of Hawaii, especially for Native Hawaiian victims. That social media platforms are being used to facilitate this trafficking, as well as the creation and distribution of CSAM, is deeply concerning. But it’s happening.

For example, several years ago, a military police officer stationed in Hawaii was sentenced to fifteen years in prison for producing CSAM as part of his online exploitation of a minor female. He began communicating with this twelve-year-old girl through Instagram. He then used Snapchat to send her sexually explicit photos and to solicit such photos from her. He later used these photos to blackmail her. And just last month, the FBI arrested a neo-Nazi cult leader in Hawaii who lured victims to his Discord server. He used that server to share images of extremely disturbing child sexual abuse material—interspersed with neo-Nazi imagery. Members of his child exploitation and hate group are also present on Instagram, Snapchat, X, and TikTok, all of which they have used to recruit potential members and victims.

In many cases, including the ones I just mentioned, your companies played a role in helping law enforcement investigate these offenders. But by the time of the investigation, so much damage had already been done. This hearing is about how to keep children safe online. We listened to all of your testimonies to seemingly impressive safeguards for young users. You try to limit the time that they spend, you require parental consent, you have all of these tools, yet trafficking and exploitation of minors online—and on your platforms—continues to be rampant.

Nearly all of your companies make their money through advertising—specifically, by selling the attention of your users. Your product is your users. As a Meta product designer wrote in an email, the “young ones are the best ones . . . you want to bring people to your service young and early.” In other words, hook them early. Research published last month by Harvard’s School of Public Health estimates that Snap makes an astounding 41% of its revenue by advertising to users under 18, with TikTok at 35%. Seven of the ten largest Discord servers—attracting many paying users—are for games used primarily by children.

All this is to say that social media companies—yours and others—make money by attracting kids to your platforms. But ensuring safety doesn’t make money; it costs money. If you are going to continue to attract kids to your platforms, you have an obligation to ensure they are safe on those platforms. The current situation is untenable, and that is why we are having this hearing. But ensuring safety for our children costs money. Your companies cannot continue to profit off young users, only to look the other way when those users, children, are harmed online.

###