During a Senate Judiciary Committee hearing last week, Sen. Josh Hawley (R-MO) reacted to big tech firms saying that moderating child exploitation content would be too expensive.

Transcript

00:00 Thank you very much, Mr. Pizarro. Thank you again to all of our witnesses. Let's start the questioning. And Mr. Tanago, I want to start with something that you said about these companies and your experience. You said that they often complain that it's just too expensive to take any sort of enforcement action or deterrence action. Could you just elaborate on that? I mean, what's been your experience when working with these major platforms that often are hosting the streaming that you were talking about, for example?

00:28 Well, the quote that I referenced comes from Julie Inman Grant, Australia's eSafety Commissioner. And thankfully, Australia has had a robust Online Safety Act and regulatory powers for several years now. And so eSafety has been engaging with tech companies to understand what they are doing to address CSAM and what they are not doing. I think one of the most powerful things was in 2022, when eSafety published summaries of transparency notice responses, and specific to the question of what companies were doing to address live-streamed child sexual abuse, almost all of them said nothing. They weren't detecting it, they weren't disrupting it, they weren't reporting it. And I think that quote comes from her continual engagement with these companies to try to understand why. Why would you ever look the other way when there's an active crime scene happening on your platform, through your services, and you have the technology to detect it in real time? And so apparently what she has been told is it's too expensive, which tells me that their priorities are absolutely backwards.

01:40 What a great way to put it. They are absolutely backwards. And I want to just highlight what you said there, that the companies complain frequently that, oh, it's just too expensive for us. They're not disrupting, they're not detecting, they're not deterring, and yet in 2023 Meta had a profit, now profit, not revenues, profits, of 23 billion dollars. Google, 60 billion dollars in profits in one year. Apple, 97 billion dollars in profits. Amazon, 27 billion dollars in profits. So don't tell me that they don't have the wherewithal financially to disrupt, detect, and report. They absolutely do. The truth is they don't want to do it. And right now they're really not compelled to do it.

I want to show everybody a picture of a young girl. This is a profile of a young woman who is clearly underage. You can just hold it up, guys. Let's have a look at this.

So here is what appears to be the picture of a young girl whose birthday, by the way, indicated that she was a minor. Now, the thing about this girl is, it's fake. The New Mexico Attorney General created this profile as part of that office's investigation into the behavior of these online platforms. So they create a profile with a date of birth that clearly indicates she is a minor, she is underage. And what does she get from this Facebook profile? What does she get? She was immediately barraged with friend requests from men between the ages of 18 and 40. Request after request used sexually aggressive language. This account received numerous sexually explicit chat messages each week, often with images, sexually explicit images, from the men who were messaging her. The account received an offer to be a sugar baby for $5,000 a week. The account was added to a chat group focused on cultivating young girls between the ages of 12 and 16 for sexual exploitation.

And get this: this profile was also served up Instagram video content featuring other young girls between the ages of 13 and 16. Meaning what? Meaning that Facebook's own algorithm knew she was underage, knew that there was sexual exploitation and sexually suggestive messages happening back and forth. The algorithm knew that and served up additional sexually exploitative content. The girl was also served an Instagram advertisement for a law firm representing trafficking survivors. The ultimate irony.

Now of course, if this girl had really existed, could she have brought suit against Facebook? Not a chance. The law currently prevents it. And yet Facebook's own algorithm, they own Instagram, Facebook's own algorithm recognizes her as underage, recognizes the child sexual abuse content going on, recognizes that she may be a victim of child sexual abuse, and therefore sends her referrals for attorneys who work in this space. What should that tell us? Their algorithms are absolutely capable of identifying potentially sexually abusive material, totally capable of identifying when there's potential grooming going on, and yet they say, Mr. Tanago, as you said, they say all the time: oh, we just don't have the resources to do it. In fact, what they want to do is monetize people like this. They benefit, currently, financially from sexual exploitation.

And this is why it is so important, it is so important, to allow victims, actual victims, thank goodness this girl doesn't exist, but there are literally millions like her, as you know from your own experience, Ms. Sines, millions like her who do, who are being exploited, sadly, as we sit here today in this committee room, who feel as if they have no recourse. And sadly, under our current law, they have very little recourse. It's time to give victims like this the right to get into court.

Ms. Dallon, can I just ask you, you have so much experience in this area. In your view, what is the role that the technology platforms, the online platforms, what role do they have in deterring and disrupting and reporting this kind of abuse, and why is it important to require them to do more than they currently are?

Thank you. It's interesting, I think, as you were talking also about what we learned from the Australian eSafety Commissioner, it goes also to a provision within the Stop CSAM Act about transparency reports. We're spending a lot of time trying to identify where there's a decrease in reports and what happened. So much of what we have learned from what they are telling us from Australia is something that we would be able to get out of the transparency: what are these companies doing that is allowing this to happen? Are they detecting? Are they reporting? Where are they getting their information? How long are they holding? What types of reports are they getting from victims, from the public? At this point, we don't know. It's actually remarkable how much we do know when you consider this is entirely voluntary. A known violation of apparent child pornography, sex trafficking, or online enticement triggers a report; other than that, this is all just based on relationships and meeting the right person and finding out what they're doing. It's not going to work any longer. The voluntary structure, where the companies are making decisions without any sort of transparency so that we understand what they're doing and what they're not doing, won't last any longer. We're seeing the drops, we're talking about the drop in reports, and at this point more companies can simply make different business choices. The reports will go away. The incidents are not going away. We're just being blinded to them.

And that's why it's so important that we strengthen these reporting requirements.

Absolutely, absolutely. You know, one of the examples that I have in the testimony was regarding two offenders who were ganging up on a young girl, forcing her to produce awful, horrific child sexual abuse content. We sent that report to law enforcement. We called out this weekend to find out what happened. You had two law enforcement agencies in two states serving multiple subpoenas, exigent subpoenas, trying to find that victim. They can't find the victim and don't know what happened, because enough information was not provided at the moment they made that report. We need more details. If the report has been triggered, we need more details so law enforcement can actually protect children.

08:19 Thank you so much. Senator Blackburn.

Thank you, Mr. Chairman. Thank you to each of you for taking the time to be here.
