Center for Democracy and Technology’s Kate Ruane on the Kids Online Safety Act

A boy in a white long sleeve shirt playing a computer game. Photo by Cottonbro Studio via Pexels.

Yesterday the Senate passed the Kids Online Safety Act (KOSA), a bipartisan bill that seeks to require social media platforms and online services to moderate content considered harmful to minors, such as content that discusses eating disorders or suicide, among other topics.

The bill passed 91-3 and would require online platforms that are often used by minors to exercise a “duty of care” by building safety mechanisms into their services’ design, such as limiting addictive features like video autoplay, allowing minors to opt out of personalized algorithmic recommendations, and providing options for increased personal information protection.

KOSA was passed alongside the Children’s Online Privacy Protection Act (COPPA) 2.0, which similarly aims to regulate online services to protect minors from harmful online content.

Democratic Oregon Sen. Ron Wyden, Republican Kentucky Sen. Rand Paul and Republican Utah Sen. Mike Lee voted against both bills.

Sen. Paul had voiced his opposition in a July 25 letter, in which he urged his colleagues to reject the bill due to free speech concerns.

“While proponents of the bill claim that it is not designed to regulate content, imposing a ‘duty of care’ on online platforms to mitigate harms associated with mental health can only lead to one outcome: the stifling of First Amendment protected speech,” he wrote.

Sen. Wyden, despite voting “yes” in a cloture vote on July 25, posted on X that revisions made to KOSA were “constructive” but “remain insufficient.”

“I fear KOSA could be used to sue services that offer privacy technologies like encryption or anonymity features that kids rely on to communicate securely and privately without being spied on by predators online,” he wrote.

The American Civil Liberties Union, the Electronic Frontier Foundation and other free expression advocacy groups have raised concerns that the legislation violates the First Amendment. Despite revisions made since the bill’s inception, opponents say it could harm vulnerable groups of children who seek information about the LGBTQ+ community or about reproductive care, among other topics. The Electronic Frontier Foundation, a nonprofit that defends civil liberties online, has described KOSA as a “dangerous and unconstitutional censorship bill that would empower state officials to target services and online content they do not like.”

Headshot of Kate Ruane. Photo courtesy of Kate Ruane.

In an interview with First Amendment Watch before the Senate vote, Kate Ruane, director of the Free Expression Project at the Center for Democracy and Technology, a nonprofit organization that has expressed its opposition to the bill, discussed KOSA and its First Amendment implications. Ruane explained minors’ First Amendment rights and expressed concern over the legislation’s impact on children’s access to information about certain topics. She said she would recommend comprehensive consumer privacy legislation in place of KOSA to protect children online without running afoul of the First Amendment.

Editor’s note: This interview has been edited and condensed for length and clarity.

FAW: What is the Kids Online Safety Act and what free expression issues does it raise?

KR: The Kids Online Safety Act is a bill that was drafted and introduced by Sens. [Richard] Blumenthal and [Marsha] Blackburn — a bipartisan effort — with the goal of protecting kids online. The sponsors had heard from numerous parents and from children who were talking about the ways in which engagement with social media can contribute to mental health disorders, can contribute to, or exacerbate, bullying and other negative things that can happen in kids’ lives. And so they set out to try to help fix that problem. And what they came up with has some really good ideas in it. But the biggest piece of it is that the Kids Online Safety Act, as currently drafted, would require all platforms that are covered by the bill to take reasonable measures to prevent and mitigate, in the creation of their design features, a list of specific harms that the bill describes. One of them is some subset of mental health disorders that are defined in the DSM-IV, including depression and anxiety and eating disorders and suicidality. I think there are a couple more, but those are part of the list. [It] also [seeks to] prevent and mitigate harassment and online bullying.

At first blush, this sounds like a really good and well-intentioned idea. The concerns for free expression come in at lots of points, because essentially what we’re doing here is requiring platforms or online services to stop serving certain content to kids, content that whatever government officials might be in charge at the time think could cause these harms. The categories of harms defined are very vague and we don’t have specific definitions for them. It is these things as defined by the DSM going into the future. So we know what it is now, but we actually don’t know what it’s going to be in five years, because the actual scope of these things is going to change over time in ways that we cannot predict right now, in ways that the platforms cannot predict right now. And the concern is that because we are empowering government officials to tell platforms what kinds of speech do and don’t harm kids, you are going to get a fair amount of differing interpretations depending upon who is in power. So if you have a Republican in charge of the Federal Trade Commission, perhaps speech about gender dysphoria, about coming out as a trans person, about LGBTQ lives, about reproductive health, about climate change, is interpreted to cause anxiety and depression in kids, and therefore platforms should not be delivering that content to children. On the other side of the coin, if a Democrat is in charge, maybe speech about automatic weapons, about school shootings, about religious beliefs that preach that LGBTQ people living their lives are contrary to God, maybe that speech gets censored online. So you see the concerns from a broad range of people regarding how this law could be enforced to political ends, rather than the end it is intended to serve, which is to protect kids.

The other big free expression concern is that this law incentivizes platforms to figure out who using their platform is or isn’t a child, and that could require additional data gathering or additional data analysis in an attempt to guess who is and who isn’t a kid. The guessing is going to be, in some measure, inaccurate. And even if that measure is, by percentage, very small, that’s still millions of people who are going to be misjudged: either adults misjudged to be children, who then lose access to legal content to which they should have unfettered access, or children misjudged to be adults, who will not receive the “protection” that the bill supposedly provides. That concern has constitutional implications as well, because the Supreme Court has said you cannot burden speech online in a way that essentially turns the internet into a place that is only suitable for children. So in addition to the content-based concerns I raised earlier, the concern about incentivizing age assurance or age verification runs into previous Supreme Court precedents saying you actually can’t burden speech in a way that tries to sanitize the internet to make it suitable only for children.

FAW: Do minors’ First Amendment protections differ from those of adults?

Mary Beth Tinker holding her original detention slip after she wore a black armband to school to protest the Vietnam War (with a replica on her left arm) during a speech at Textor Hall, Ithaca College, Sept. 19, 2017. Photo by Amalex5 via Wikimedia Commons, CC BY-SA 4.0

KR: Children do have First Amendment rights. The government does have greater authority to regulate the speech of children, and in certain circumstances, to regulate the speech that reaches children. But it’s not some new standard where, if you’re a child, you have completely different First Amendment rights, or your constitutional rights are severely attenuated. That’s simply not the case. What is true is that minors in school settings can experience greater speech regulation in service of the school being able to function as a school. And so there’s plenty of case law, like Tinker v. Des Moines and that line of cases, which says children do not surrender their First Amendment rights at the schoolhouse gate. School administrators do have the ability to regulate their speech beyond what would be possible in a public park, [for example], because we are trying to educate children, and so to the extent that speech is interfering with the ability to run a school, the school administration has some authority to restrict speech that otherwise would be protected. But if you were doing something like in Tinker v. Des Moines, where Mary Beth Tinker was wearing an armband in protest of the Vietnam War, the Court said you actually can’t stop her from doing that. She has a First Amendment right to do that. She wasn’t disrupting the school. She gets to continue to wear that armband because she has the right to do it. So kids have First Amendment rights. They may be attenuated in the school context. They may also be attenuated to the extent that people are attempting to distribute, directly to minors, content that is obscene as to minors. That can be sexually explicit content. But the caveat is that where the attempt to burden the distribution of speech that is obscene as to minors also impacts the speech rights of adults, then the highest standard of scrutiny applies to that restriction.

To the extent that the government has tried to expand the idea that it has broader authority to restrict the delivery of certain content to minors because it is “harmful,” so far the Supreme Court has said no. California passed a law that prohibited the sale, without parental consent, of violent video games to children. The Supreme Court struck down that law, saying that violent speech has never been treated as a category of speech that is outside of constitutional protection the way that obscenity has, and we are not going to change that now. And so if you want to restrict the distribution of violent speech to anybody, you have to meet strict scrutiny, and this law doesn’t do that. And that was even though the law applied only to the sale of violent video games to minors; the Court still struck it down. So to me, taken together, that tells us that there are certain contexts where the government does have the ability to suppress or restrict the speech of minors in service of particular government goals that are related to their health and safety, for example, providing them an education. But children have First Amendment rights, and we cannot just broadly restrict their rights to receive information and to speak online simply because some of the speech they may be receiving or interacting with makes adults uncomfortable.

FAW: Do you believe that content centered around “politically divisive” topics, such as reproductive rights or the war in Gaza, is important for minors to have the opportunity to see? Why or why not? How might this regulation of content impact children?

KR: I think that there are certain kinds of content that have been singled out as particularly objectionable by people who are concerned about what children are able to see online: content related to eating disorders, content related to the commission of suicide. These are very disturbing topics. But I would also note that these are topics that are covered in books that children have access to. And if you attempt to suppress this content, the content that advocates for eating disorders or the content that describes particular methods of suicide, you also wind up, at the same time, restricting content that is attempting to help people who are experiencing that kind of harm. KOSA, as it currently stands, is so broadly worded that a platform attempting to comply with it will essentially have no choice but to filter all content related to a particularly controversial topic. Those topics are not just eating disorders and suicide, both of which also encompass speech from people attempting to help kids experiencing those harms get out of that situation. It will also apply to reproductive health. It will apply to LGBTQ issues. It will apply to climate change. It will apply to the wars in Ukraine and Gaza. These are all things that cause anxiety, and platforms, moderating content at such a massive scale, aren’t going to know one way or the other. The concern is that, in order to comply, they will suppress far more of the speech that we would actually want kids to be able to have access to.

FAW: Is there a way to do this without restricting minors’ access to content that, as you’ve mentioned, may help them work through certain issues? Where is the line drawn between unprotected speech and educational materials, or are you saying it’s difficult to define?

KR: I’m saying that that is very difficult to define, and I haven’t seen it done properly yet. But what I would recommend that policymakers pursue is comprehensive consumer privacy legislation that provides significant protection to kids and that requires high privacy settings restricting the collection of data about minors to a high degree. My reasoning is that one of the big incentives of existing business models for online services is to collect data in order to sell ads, and to deliver content in a way that keeps attention on the platform so that people see more ads. If we can legally restrict the ability to collect unnecessary data and also restrict the ability to use that data to market to kids, those are two very meaningful things that could significantly reduce some of the concerns motivating bills like the Kids Online Safety Act, while not creating all of the content concerns that we have, and while still allowing young users to interact with the services that they like and enjoy, and to have potentially even more control over what their experience on those services would be.

The second thing I would recommend is helping kids have more control over their experiences online. What are the tools that we can give them that help them control what they can and cannot see, that help them have their feedback to the platforms be heard and responded to properly? What are the basic tools platforms can give them? Sometimes they just need a block button, so that when somebody they don’t like is delivering content that is upsetting to them, they can stop that from coming at them. [It also means] actually talking to a lot of the youth advocates about this and saying, “All right, what are the things that you need on the platforms that you use that would be helpful to you and that would help you engage and employ the strategies for receiving the content you want while also avoiding the stuff that you don’t?”

In this Jan. 26, 2022 file photo, the U.S. Supreme Court is seen after it was reported Supreme Court Justice Stephen Breyer would retire at the end of this term. (Reuters/Joshua Roberts)

FAW: Does KOSA run into the same issues that arose in Moody v. NetChoice, in which the Court said that it often “has barred the government from forcing a private speaker to present views it wished to spurn in order to rejigger the expressive realm”? Wouldn’t KOSA regulate protected content? 

KR: Overall yes, that is exactly what KOSA does. KOSA does regulate the delivery of constitutionally protected content.

FAW: Does that run into the same issues that were of concern in Moody v. NetChoice?

KR: I believe it does. I think that the Court in Moody v. NetChoice was very careful not to signal how it would interpret the constitutionality of a law like KOSA. It analyzed, in dicta, the likely constitutionality of Texas’ content moderation law as applied to services like YouTube’s homepage and Facebook’s newsfeed. What the Court did absolutely tell us is that the process of engaging in content moderation, the process that a platform goes through in deciding what content to publish, in what order to publish it, and what content not to publish, is indeed an editorial process that receives First Amendment protection, and attempts to burden it must survive a very high standard of constitutional scrutiny, and that is true as applied to the Kids Online Safety Act. That being said, I’m not sure that Moody v. NetChoice actually signals to us how the Court will analyze KOSA, in that the Court, in analyzing the Texas statute, essentially said that the Texas government’s motivation, its very reasoning for enacting this particular provision, which was to level the speech playing field online, was entirely illegitimate, and it’s just not something that government can do. It essentially didn’t even survive the lower standard of scrutiny that the Court applied to it. So the Court didn’t decide what standard of scrutiny to apply, because the Texas law was so outside of what could ever possibly be constitutional that it was an easier case to decide. The motivation behind the Kids Online Safety Act is obviously very different. It is not about leveling the speech playing field. It’s not about making sure that conservatives have the same speech rights online as liberals, or whatever [Texas’] motivation was. It is about trying to figure out how to protect kids online. And that is a very different government interest. That is a legitimate government interest; the Court has repeatedly held so. So I think the Court will look more at the tailoring of the law, to determine whether it burdens the speech of adults more than necessary, and whether it burdens the speech rights of children more than necessary in order to achieve its goals.

FAW: According to The Associated Press, there have been some concerns over KOSA’s “duty of care” provision. Reportedly, revisions of the act have stripped state attorneys general of the power to enforce the provision. Why is the duty of care provision so controversial? Is it accurate that revisions have changed who enforces this provision?

KR: Everything that I was talking about, the duty to prevent and mitigate, in design features, particular mental health harms and online bullying and harassment, that’s the duty of care. It’s Section 3 of the Kids Online Safety Act. The core constitutional concerns are about the duty of care. That being said, The Associated Press is correct that previous versions of the Kids Online Safety Act granted enforcement authority over the duty of care to state attorneys general and the Federal Trade Commission. The current version of the Kids Online Safety Act, the one that was voted on July 25 and will go to the floor for final passage [on July 30], does not permit state attorneys general to enforce the duty of care. This is a significant reduction in concern about differential enforcement: instead of getting differential enforcement of the law in all 50 states, you run the risk of differential enforcement every four years as the presidential administration changes, or every eight years, depending on how the political winds blow. So it is a reduction in concern, but it does not eliminate the concerns.

And there’s another piece of this that I want to bring up: nothing about the Kids Online Safety Act would prevent states from enacting the exact same law. And I could absolutely see states with unified governments — we have more than 26 of them right now — that want to use the duty of care as a tool, but don’t have it available because the federal government’s version, assuming it passes into law, doesn’t permit them to enforce it, simply enacting their own. That’s a pretty big concern, because it would reintroduce the problems with respect to differential enforcement in the various states that enact the law.
