During a House Energy Committee hearing last week examining online harms, Rep. Cliff Bentz (R-OR) questioned whether the 'Take It Down Act' would survive free-speech scrutiny.
Fuel your success with Forbes. Gain unlimited access to premium journalism, including breaking news, groundbreaking in-depth reported stories, daily digests and more. Plus, members get a front-row seat at members-only events with leading thinkers and doers, access to premium video that can help you get ahead, an ad-light experience, early access to select products including NFT drops and more:
https://account.forbes.com/membership/?utm_source=youtube&utm_medium=display&utm_campaign=growth_non-sub_paid_subscribe_ytdescript
Stay Connected
Forbes on Facebook: http://fb.com/forbes
Forbes Video on Twitter: http://www.twitter.com/forbes
Forbes Video on Instagram: http://instagram.com/forbes
More From Forbes: http://forbes.com
Category: 🗞 News

Transcript
00:00 The gentlelady yields back. Now I'll recognize Mr. Bentz for his five minutes of questioning.
00:06 Thank you, Mr. Chair, and thanks to all the witnesses for being here today.
00:11 In preparation for this hearing, I looked at the title, and it said the World Wide Web:
00:15 Examining Online Harms. And, of course, I'm totally supportive of the fact that we are
00:20 prioritizing kids, but I decided I would go back and take a quick look at just how much
00:25 online time people were spending. I'm sure that all of you are aware of the fact
00:30 that Americans now spend seven hours and three minutes a day, each one of us, apparently, online.
00:37 I also looked at the ages. I see here: from zero to eight, 2.5 hours; from eight
00:43 to 10 years old, six hours; 11 to 14, nine hours; 15 to 18, 7.5 hours a day. So, when
00:50 we talk about what we're going to do about this problem, boy, is it a big problem. A
00:55 huge, almost insurmountable problem. And as I watch my kids and others, I note the
01:03 diminishment of critical thinking, the diminishment of skills, the diminishment of self-sufficiency.
01:09 There's no lack of problems created online. But I'm happy today that we're talking about
01:15children. I think that's obviously what we need to be doing. Ms. Morrell, your attorneys
must have looked at this bill. Is it going to pass free speech scrutiny? Could you specify
01:27 which bill? Well, when I say this bill, I mean the Protect Our Kids bill. The Protect Our
01:33 Kids bill? Yeah, the one we've been talking about the entire time today. Oh, Take It Down,
01:38 or? Take It Down. Oh, Take It Down. Yeah, Take It Down, forgive me. To pass free speech
01:42 concerns, I believe it would. This is just trying to say that platforms... Because in Section
01:51 230, there's a Good Samaritan provision that incentivizes platforms, saying you will be
01:55 protected from liability for removing content that is obscene, lewd, lascivious, or otherwise
02:01 objectionable. But there has been no accompanying stick to that carrot. And what Take It Down
02:07 would say is that platforms cannot knowingly host content that is non-
02:14 consensual imagery; once they are contacted, they need to remove it. And so I think it is well within the
02:18 authority of the government to do that. So in anticipation of this hearing, I went and
02:24 of course researched the Take It Down bill and looked at some of the critical comments
02:28 on it. It appears that questions have been raised by various commentators
02:32 about it. And I think we heard Congressman Obernolte reference the fact that it's not
02:37 a perfect bill. We want this bill to survive. So I hope you've reached out, and whatever
02:44 suggestions you have, or anybody on the panel has, for how we should change this bill, please tell
02:48 us. Because in looking at the history of everyone trying to get this done, obviously, First
02:53 Amendment issues are going to come up. That's why I'm asking you questions. Your testimony touches
02:58 on the limits of some content filtering tools available. I take it these things are not
03:04 effective. Tell me why. Yes. Content filters have not been effective in the smartphone
03:10 app-based ecosystem because filters often do not have access to the material inside
03:15 apps. And so pornography today is not restricted to pornography websites. It's on social media
03:21 itself. And as I mentioned in my opening, often social media is the first entry point
03:25 for kids. They're not meaning to access it, but they stumble upon a link
03:29 and they click on it. And because there's no age gating on these porn sites, they're
03:33 immediately able to get through to Pornhub. And so the filters are not working. It's very
03:38 difficult. The filters often don't cover the in-app browsers; each individual application
03:43 actually has its own portal to the Internet. And so it's been an incredibly difficult problem
03:48 for parents on their own. And so that's why I've been supportive of solutions like age
03:52 verification, which would actually require the porn sites to verify that a person is an
03:56 adult before letting them through, because filters are just not working for parents.
04:01 And the technology has changed over the last 20 years in ways that have made it extremely difficult
04:06 for filters to work. I want to move back to a much more difficult issue. That's Section
04:11 230. So I was on Judiciary until I had the good fortune of being selected for this particular
04:18 committee. And I must say, in the days and weeks I spent studying Section 230, it appeared
04:23 to me hugely challenging to change it. Do you have some suggestions on how we might
04:28 actually do that? A couple of quick suggestions. I would say one
04:32 critical thing would just be to clarify that Section 230 does not apply to product design,
04:41 to the company's own wrongdoing, or to their own algorithms as their product. I think that
04:46 has been an issue we've seen, of lawsuits getting thrown out in court claiming Section
04:50 230 immunity when it was not the third-party speech the platform was hosting, but
04:55 rather the product design of those platforms: algorithms connecting sex traffickers
05:00 to their victims, or promoting blackout challenges into the feed of a 10-year-old girl who then
05:05 took her life by accident. And so I think it's really important to clarify that
05:08 Section 230 was not meant to cover product design. I would also say a bad Samaritan carve-
05:14 out, to say that if you are knowingly hosting criminal content or criminal behavior on your
05:19 website, you do not get to hide behind Section 230's immunity. To accompany the Good Samaritan
05:24 provision, we need a commensurate bad Samaritan carve-out for knowingly hosting
05:30 criminal material. Thank you. I yield back.