Washington, D.C. — During a Senate Judiciary Committee hearing on privacy rights, children’s safety online, and strengthening legislation to address these concerns, U.S. Senator Adam Schiff (D-Calif.) questioned Mary Graw Leary, Professor of Law at the Catholic University of America’s Columbus School of Law, and Carrie Goldberg, a victims’ rights attorney.

Watch the full conversation here.
Key Excerpts:
On strengthening protections for children in federal law:
Schiff: […] We established Section 230 for the reasons I think you implied, which is, it was a nascent industry. They urged us to do so, so that we would not stifle innovation. They also made the argument that without 230 they would not moderate content, because they would be sued if they did, and this would encourage them to moderate content. Well, there may have been a time when they moderated content, but those days seem to be over. It certainly wasn’t enacted because it was believed necessary for the First Amendment. The First Amendment stands on its own two feet. In the absence of 230, companies could still plead a First Amendment defense to any case. What is your preferred approach? That is, is it a repeal of 230? Is it changing it from an immunity to some form of defense? Is it to cabin 230 in some way by narrowing the scope? What are the merits of the various approaches?
Mary Graw Leary: Thank you, Senator. I would say that it’s important to — a couple of things. I would say that there was discussion in the deep background about this free internet, nascent industry. And when we look at the policies and the findings at the beginning of Section 230 of the Communications Decency Act, there is language to that effect. But I would repeat, the overwhelming background and discussion was about the child protection piece. And therefore I think that, better than repealing the entire thing, is to keep the (c)(2) language, which gives a cover, gives protection to a platform if they do — and specifically, if they remove anything they consider to be obscene, lewd, lascivious, filthy, excessively violent, harassing or otherwise objectionable — that will protect them. That’s all they need. They do not need (c)(1), which is what has been turned into this de facto, near-absolute immunity. I think also adding the states’ ability to proceed is an important thing. I think, outside of 230, holding them liable when they host this material, as has been discussed at length, is important as well. So I think that those and some of the other things that I’ve listed — but I don’t want to use up too much of your time — all work together to respond to the complex crime.
On reducing the burden of proof on victims:
Schiff: […] What do you believe would be most helpful in terms of making sure that you can get the discovery you need? And that we have established the right protections and the right burdens in terms of the platforms and the pipelines?
Carrie Goldberg: Thank you. I agree we are long overdue to just abolish Section 230, but what’s important is that clients need to get into discovery so that they can actually know, you know, the extent of the problem. And the only way we can do that is if the standard is reasonable for parents to plead. If we have to show that the company knew about that picture, that exact victim, that exact perpetrator, there’s no way a client’s going to be able to overcome a motion to dismiss and get into discovery. So we need to actually have standards like negligence, which the law already affords in almost all causes of action.
###